NASA Astrophysics Data System (ADS)
Koch, Melissa; Gorges, Torie
2016-10-01
Underrepresented populations such as women, African-Americans, and Latinos/as often come to STEM (science, technology, engineering, and mathematics) careers by less traditional paths than White and Asian males. To better understand how and why women might shift toward STEM careers, particularly computer science, we investigated the education and career direction of afterschool facilitators, primarily women of color in their twenties and thirties, who taught Build IT, an afterschool computer science curriculum for middle school girls. Many of these women indicated that implementing Build IT had influenced their own interest in technology and computer science and in some cases had resulted in their intent to pursue technology and computer science education. We wanted to explore the role that teaching Build IT may have played in activating or reactivating interest in careers in computer science and to see whether, in the years following implementation of Build IT, these women pursued STEM education and/or careers. We reached nine facilitators who implemented the program in 2011-12 or shortly after. Many indicated that while facilitating Build IT, they learned along with the participants, increasing their interest in and confidence with technology and computer science. Seven of the nine participants pursued further STEM or computer science learning or modified their career paths to include more of a STEM or computer science focus. Through interviews, we explored what aspects of Build IT influenced these facilitators' interest and confidence in STEM and, when relevant, their pursuit of technology and computer science education and careers.
Introducing Molecular Life Science Students to Model Building Using Computer Simulations
ERIC Educational Resources Information Center
Aegerter-Wilmsen, Tinri; Kettenis, Dik; Sessink, Olivier; Hartog, Rob; Bisseling, Ton; Janssen, Fred
2006-01-01
Computer simulations can facilitate the building of models of natural phenomena in research, such as in the molecular life sciences. In order to introduce molecular life science students to the use of computer simulations for model building, a digital case was developed in which students build a model of a pattern formation process in…
Toward a Computational Model of Tutoring.
ERIC Educational Resources Information Center
Woolf, Beverly Park
1992-01-01
Discusses the integration of instructional science and computer science. Topics addressed include motivation for building knowledge-based systems; instructional design issues, including cognitive models, representing student intentions, and student models and error diagnosis; representing tutoring knowledge; building a tutoring system, including…
ERIC Educational Resources Information Center
Namdar, Bahadir
2017-01-01
The purpose of this study was to investigate preservice science teachers' collaborative knowledge building through socioscientific argumentation on healthy eating in a multiple representation-rich computer supported collaborative learning (CSCL) environment. This study was conducted with a group of preservice science teachers (n = 18) enrolled in…
Democratizing Computer Science
ERIC Educational Resources Information Center
Margolis, Jane; Goode, Joanna; Ryoo, Jean J.
2015-01-01
Computer science programs are too often identified with a narrow stratum of the student population, often white or Asian boys who have access to computers at home. But because computers play such a huge role in our world today, all students can benefit from the study of computer science and the opportunity to build skills related to computing. The…
ERIC Educational Resources Information Center
Lin, Feng; Chan, Carol K. K.
2018-01-01
This study examined the role of computer-supported knowledge-building discourse and epistemic reflection in promoting elementary-school students' scientific epistemology and science learning. The participants were 39 Grade 5 students who were collectively pursuing ideas and inquiry for knowledge advance using Knowledge Forum (KF) while studying a…
Representing, Running, and Revising Mental Models: A Computational Model
ERIC Educational Resources Information Center
Friedman, Scott; Forbus, Kenneth; Sherin, Bruce
2018-01-01
People use commonsense science knowledge to flexibly explain, predict, and manipulate the world around them, yet we lack computational models of how this commonsense science knowledge is represented, acquired, utilized, and revised. This is an important challenge for cognitive science: Building higher order computational models in this area will…
"Computer Science Can Feed a Lot of Dreams"
ERIC Educational Resources Information Center
Educational Horizons, 2014
2014-01-01
Pat Yongpradit is the director of education at Code.org. He leads all education efforts, including professional development and curriculum creation, and he builds relationships with school districts. Pat joined "Educational Horizons" to talk about why it is important to teach computer science--even for non-computer science teachers. This…
ERIC Educational Resources Information Center
Kite, Vance; Park, Soonhye
2018-01-01
In 2006 Jeannette Wing, a professor of computer science at Carnegie Mellon University, proposed computational thinking (CT) as a literacy just as important as reading, writing, and mathematics. Wing defined CT as a set of skills and strategies computer scientists use to solve complex, computational problems (Wing 2006). The computer science and…
Building machines that adapt and compute like brains.
Kriegeskorte, Nikolaus; Mok, Robert M
2017-01-01
Building machines that learn and think like humans is essential not only for cognitive science, but also for computational neuroscience, whose ultimate goal is to understand how cognition is implemented in biological brains. A new cognitive computational neuroscience should build cognitive-level and neural-level models, understand their relationships, and test both types of models with both brain and behavioral data.
ERIC Educational Resources Information Center
Lee, Aimee T.; Hairston, Rosalina V.; Thames, Rachel; Lawrence, Tonya; Herron, Sherry S.
2002-01-01
Describes the Lateblight computer simulation implemented in the general biology laboratory and science methods course for elementary teachers to reinforce the processes of science and allow students to engage, explore, explain, elaborate, and evaluate the methods of building concepts in science. (Author/KHR)
Dragonfly: strengthening programming skills by building a game engine from scratch
NASA Astrophysics Data System (ADS)
Claypool, Mark
2013-06-01
Computer game development has been shown to be an effective hook for motivating students to learn both introductory and advanced computer science topics. While games can be made from scratch, to simplify the programming required, game development often uses game engines that handle complicated or frequently used components of the game. These game engines present the opportunity to strengthen programming skills and expose students to a range of fundamental computer science topics. While educational efforts have been effective in using game engines to improve computer science education, there have been no published papers describing and evaluating students building a game engine from scratch as part of their course work. This paper presents the Dragonfly approach, in which students build a fully functional game engine from scratch and make a game using their engine as part of a junior-level course. Details on the programming projects are presented, as well as an evaluation of the results from two offerings that used Dragonfly. Student performance on the projects as well as student assessments demonstrates the efficacy of having students build a game engine from scratch in strengthening their programming skills.
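The engine code itself is not reproduced in this abstract; as a rough illustration of what "handling complicated or frequently used components" means in practice, here is a minimal sketch of the fixed-rate game loop at the heart of any such engine. The sketch is in Python for brevity, and all class and method names are hypothetical, not Dragonfly's actual API.

```python
# Minimal sketch of the core loop a student-built game engine provides.
# Illustrative only: these names are invented, not Dragonfly's API.
import time

class GameManager:
    """Drives the fixed-timestep update/draw cycle that game objects rely on."""

    FRAME_TIME = 1 / 30  # target: 30 frames per second

    def __init__(self, max_frames=90):
        self.objects = []          # world objects registered with the engine
        self.max_frames = max_frames

    def run(self):
        for frame in range(self.max_frames):
            start = time.monotonic()
            for obj in list(self.objects):
                obj.update()       # the engine, not the game, schedules updates
            for obj in self.objects:
                obj.draw()         # the engine also owns the drawing pass
            # Sleep off the rest of the frame to keep a steady rate.
            time.sleep(max(0.0, self.FRAME_TIME - (time.monotonic() - start)))

GameManager(max_frames=3).run()    # runs (with no objects) for three frames
```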
Center for Building Science: Annual report, FY 1986
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cairns, E.J.; Rosenfeld, A.H.
1987-05-01
The Center for Building Science consists of four programs in the Applied Science Division: energy analysis, buildings energy systems, windows and lighting, and indoor environment. It was established to provide an umbrella so that groups in different programs but with similar interests could combine to perform joint research, develop new research areas, share resources, and produce joint publications. As detailed below, potential savings for U.S. society from energy-efficient buildings are enormous. But these savings can only be realized through an expanding federal R&D program that develops expertise in this new area. The Center for Building Science develops efficient new building components, computer models, data and information systems, and trains needed building scientists. 135 refs., 72 figs., 18 tabs.
Reproducible research in vadose zone sciences
USDA-ARS?s Scientific Manuscript database
A significant portion of present-day soil and Earth science research is computational, involving complex data analysis pipelines, advanced mathematical and statistical models, and sophisticated computer codes. Opportunities for scientific progress are greatly diminished if reproducing and building o...
Building a Data Science capability for USGS water research and communication
NASA Astrophysics Data System (ADS)
Appling, A.; Read, E. K.
2015-12-01
Interpreting and communicating water issues in an era of exponentially increasing information requires a blend of domain expertise, computational proficiency, and communication skills. The USGS Office of Water Information has established a Data Science team to meet these needs, providing challenging careers for diverse domain scientists and innovators in the fields of information technology and data visualization. Here, we detail the experience of building a Data Science capability as a bridging element between traditional water resources analyses and modern computing tools and data management techniques. This approach includes four major components: 1) building reusable research tools, 2) documenting data-intensive research approaches in peer reviewed journals, 3) communicating complex water resources issues with interactive web visualizations, and 4) offering training programs for our peers in scientific computing. These components collectively improve the efficiency, transparency, and reproducibility of USGS data analyses and scientific workflows.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-11
... house and public hearing will be held at San Juan College, 4601 College Boulevard, Computer Science... Boulevard, Computer Science Building, Room 7103, Farmington, New Mexico 87402, (505) 326-3311. The open...
NASA Astrophysics Data System (ADS)
Lin, Feng; Chan, Carol K. K.
2018-04-01
This study examined the role of computer-supported knowledge-building discourse and epistemic reflection in promoting elementary-school students' scientific epistemology and science learning. The participants were 39 Grade 5 students who were collectively pursuing ideas and inquiry for knowledge advance using Knowledge Forum (KF) while studying a unit on electricity; they also reflected on the epistemic nature of their discourse. A comparison class of 22 students, taught by the same teacher, studied the same unit using the school's established scientific investigation method. We hypothesised that engaging students in idea-driven and theory-building discourse, as well as scaffolding them to reflect on the epistemic nature of their discourse, would help them understand their own scientific collaborative discourse as a theory-building process, and therefore understand scientific inquiry as an idea-driven and theory-building process. As hypothesised, we found that students engaged in knowledge-building discourse and reflection outperformed comparison students in scientific epistemology and science learning, and that students' understanding of collaborative discourse predicted their post-test scientific epistemology and science learning. To further understand the epistemic change process among knowledge-building students, we analysed their KF discourse to understand whether and how their epistemic practice had changed after epistemic reflection. The implications for ways of promoting epistemic change are discussed.
ERIC Educational Resources Information Center
Kim, Karen A.; Fann, Amy J.; Misa-Escalante, Kimberly O.
2011-01-01
Building on research that identifies and addresses issues of women's underrepresentation in computing, this article describes promising practices in undergraduate research experiences that promote women's long-term interest in computer science and engineering. Specifically, this article explores whether and how REU programs include programmatic…
Crane, Michael; Steinwand, Dan; Beckmann, Tim; Krpan, Greg; Liu, Shu-Guang; Nichols, Erin; Haga, Jim; Maddox, Brian; Bilderback, Chris; Feller, Mark; Homer, George
2001-01-01
The overarching goal of this project is to build a spatially distributed infrastructure for information science research by forming a team of information science researchers and providing them with similar hardware and software tools to perform collaborative research. Four geographically distributed Centers of the U.S. Geological Survey (USGS) are developing their own clusters of low-cost personal computers into parallel computing environments that provide a cost-effective way for the USGS to increase participation in the high-performance computing community. Referred to as Beowulf clusters, these hybrid systems provide the robust computing power required for conducting information science research into parallel computing systems and applications.
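The abstract does not include code; the following is a toy sketch of the data-parallel style of work a Beowulf cluster supports, using the mpi4py bindings. The package choice and the summation task are assumptions for illustration, not part of the USGS project.

```python
# Toy illustration of data-parallel work on a Beowulf-style cluster.
# Assumes mpi4py and an MPI runtime; launch with: mpirun -n 4 python sum.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()  # this process's id within the job
size = comm.Get_size()  # total number of cooperating processes

# Each process sums its own interleaved slice of the input range...
local_sum = sum(x for x in range(1_000_000) if x % size == rank)

# ...and the partial sums are combined on the root process.
total = comm.reduce(local_sum, op=MPI.SUM, root=0)
if rank == 0:
    print("total =", total)
```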
ERIC Educational Resources Information Center
Wheeler, David L.
1988-01-01
Scientists feel that progress in artificial intelligence and the availability of thousands of experimental results make this the right time to build and test theories on how people think and learn, using the computer to model minds. (MSE)
The Student/Library Computer Science Collaborative
ERIC Educational Resources Information Center
Hahn, Jim
2015-01-01
With funding from an Institute of Museum and Library Services demonstration grant, librarians of the Undergraduate Library at the University of Illinois at Urbana-Champaign partnered with students in computer science courses to design and build student-centered mobile apps. The grant work called for demonstration of student collaboration…
Promoting Technology-Assisted Active Learning in Computer Science Education
ERIC Educational Resources Information Center
Gao, Jinzhu; Hargis, Jace
2010-01-01
This paper describes specific active learning strategies for teaching computer science, integrating both instructional technologies and non-technology-based strategies shown to be effective in the literature. The theoretical learning components addressed include an intentional method to help students build metacognitive abilities, as well as…
ERIC Educational Resources Information Center
Benbow, Ross J.; Vivyan, Erika
2016-01-01
Building from findings showing that undergraduate computer science continues to have the highest attrition rates proportionally for women within postsecondary science, technology, engineering, and mathematics disciplines--a phenomenon that defies basic social equity goals in a high status field--this paper seeks to better understand how student…
Computer Science Lesson Study: Building Computing Skills among Elementary School Teachers
ERIC Educational Resources Information Center
Newman, Thomas R.
2017-01-01
The lack of diversity in the technology workforce in the United States has proven to be a stubborn problem, resisting even the most well-funded reform efforts. With the absence of computer science education in the mainstream K-12 curriculum, only a narrow band of students in public schools go on to careers in technology. The problem persists…
Formal Methods, Design, and Collaborative Learning in the First Computer Science Course.
ERIC Educational Resources Information Center
Troeger, Douglas R.
1995-01-01
A new introductory computer science course at City College of New York builds on a foundation of logic to teach programming based on a "design idea," a strong departure from conventional programming courses. Reduced attrition and increased student and teacher enthusiasm have resulted. (MSE)
Using Arduino to Teach Programming to First-Year Computer Science Students
ERIC Educational Resources Information Center
Tan, Wee Lum; Venema, Sven; Gonzalez, Ruben
2017-01-01
Transitioning to university is recognised as a challenging endeavour for commencing students. For commencing Computer Science students specifically, evidence suggests a link between poor performance in introductory technical courses, such as programming, and high attrition rates. Building resilience in students, particularly at the start of their…
Computer Networking Strategies for Building Collaboration among Science Educators.
ERIC Educational Resources Information Center
Aust, Ronald
The development and dissemination of science materials can be associated with technical delivery systems such as the Unified Network for Informatics in Teacher Education (UNITE). The UNITE project was designed to investigate ways for using computer networking to improve communications and collaboration among university schools of education and…
A Seminar in Mathematical Model-Building.
ERIC Educational Resources Information Center
Smith, David A.
1979-01-01
A course in mathematical model-building is described. Suggested modeling projects include: urban problems, biology and ecology, economics, psychology, games and gaming, cosmology, medicine, history, computer science, energy, and music. (MK)
76 FR 9765 - Advanced Scientific Computing Advisory Committee
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-22
... DEPARTMENT OF ENERGY Advanced Scientific Computing Advisory Committee AGENCY: Office of Science... Advanced Scientific Computing Advisory Committee (ASCAC). The Federal Advisory Committee Act (Pub. L. 92... INFORMATION CONTACT: Melea Baker, Office of Advanced Scientific Computing Research, SC-21/Germantown Building...
Applying service learning to computer science: attracting and engaging under-represented students
NASA Astrophysics Data System (ADS)
Dahlberg, Teresa; Barnes, Tiffany; Buch, Kim; Bean, Karen
2010-09-01
This article describes a computer science course that uses service learning as a vehicle to accomplish a range of pedagogical and BPC (broadening participation in computing) goals: (1) to attract a diverse group of students and engage them in outreach to younger students to help build a diverse computer science pipeline, (2) to develop leadership and team skills using experiential techniques, and (3) to develop student attitudes associated with success and retention in computer science. First, we describe the course and how it was designed to incorporate good practice in service learning. We then report preliminary results showing a positive impact of the course on all pedagogical goals and discuss the implications of the results for broadening participation in computing.
NASA Astrophysics Data System (ADS)
Shell, Duane F.; Soh, Leen-Kiat
2013-12-01
The goal of the present study was to utilize a profiling approach to understand differences in motivation and strategic self-regulation among post-secondary STEM students in major versus required non-major computer science courses. Participants were 233 students from required introductory computer science courses (194 men; 35 women; 4 unknown) at a large Midwestern state university. Cluster analysis identified five profiles: (1) a strategic profile of a highly motivated by-any-means good strategy user; (2) a knowledge-building profile of an intrinsically motivated autonomous, mastery-oriented student; (3) a surface learning profile of a utility motivated minimally engaged student; (4) an apathetic profile of an amotivational disengaged student; and (5) a learned helpless profile of a motivated but unable to effectively self-regulate student. Among CS majors and students in courses in their major field, the strategic and knowledge-building profiles were the most prevalent. Among non-CS majors and students in required non-major courses, the learned helpless, surface learning, and apathetic profiles were the most prevalent. Students in the strategic and knowledge-building profiles had significantly higher retention of computational thinking knowledge than students in other profiles. Students in the apathetic and surface learning profiles saw little instrumentality of the course for their future academic and career objectives. Findings show that students in STEM fields taking required computer science courses exhibit the same constellation of motivated strategic self-regulation profiles found in other post-secondary and K-12 settings.
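The abstract names cluster analysis but not the algorithm; the sketch below shows one common way such motivation profiles are derived, using k-means on standardized scale scores. The choice of k-means, the four scales, and the synthetic data are assumptions for illustration only, not the study's actual procedure.

```python
# Sketch of a profiling analysis: cluster students by standardized
# motivation/strategy scale scores. k=5 matches the five profiles reported;
# the algorithm and the synthetic data are assumptions.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Columns stand in for scales such as intrinsic motivation, utility value,
# strategy use, and self-efficacy (synthetic data, 233 students as reported).
scores = rng.normal(size=(233, 4))

z = StandardScaler().fit_transform(scores)
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(z)

for k in range(5):
    centroid = z[labels == k].mean(axis=0)
    print(f"profile {k}: n={np.sum(labels == k)}, centroid={np.round(centroid, 2)}")
```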
ERIC Educational Resources Information Center
Katz, Mary Maxwell; And Others
Teacher isolation is a significant problem in the science teaching profession. Traditional inservice solutions are often plagued by logistical difficulties or occur too infrequently to build ongoing teacher networks. Educational Technology Center (ETC) researchers reasoned that computer-based conferencing might promote collegial exchange among…
Building Cognition: The Construction of Computational Representations for Scientific Discovery.
Chandrasekharan, Sanjay; Nersessian, Nancy J
2015-11-01
Novel computational representations, such as simulation models of complex systems and video games for scientific discovery (Foldit, EteRNA etc.), are dramatically changing the way discoveries emerge in science and engineering. The cognitive roles played by such computational representations in discovery are not well understood. We present a theoretical analysis of the cognitive roles such representations play, based on an ethnographic study of the building of computational models in a systems biology laboratory. Specifically, we focus on a case of model-building by an engineer that led to a remarkable discovery in basic bioscience. Accounting for such discoveries requires a distributed cognition (DC) analysis, as DC focuses on the roles played by external representations in cognitive processes. However, DC analyses by and large have not examined scientific discovery, and they mostly focus on memory offloading, particularly how the use of existing external representations changes the nature of cognitive tasks. In contrast, we study discovery processes and argue that discoveries emerge from the processes of building the computational representation. The building process integrates manipulations in imagination and in the representation, creating a coupled cognitive system of model and modeler, where the model is incorporated into the modeler's imagination. This account extends DC significantly, and we present some of the theoretical and application implications of this extended account.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sprague, Michael A.; Boldyrev, Stanislav; Fischer, Paul
This report details the impact exascale will bring to turbulent-flow simulations in applied science and technology. The need for accurate simulation of turbulent flows is evident across the DOE applied-science and engineering portfolios, including combustion, plasma physics, nuclear-reactor physics, wind energy, and atmospheric science. The workshop brought together experts in turbulent-flow simulation, computational mathematics, and high-performance computing. Building upon previous ASCR workshops on exascale computing, participants defined a research agenda and path forward that will enable scientists and engineers to continually leverage, engage, and direct advances in computational systems on the path to exascale computing.
Public Dialogue on Science in Sweden.
ERIC Educational Resources Information Center
Dyring, Annagreta
1988-01-01
Explains how Sweden has proceeded to popularize science. Addresses topics dealing with policy, the energy debate, booklets with large circulation, computers and society, contacts between schools and research, building up small science centers, mass media, literary quality, children's responsibility, and some of the challenges. (RT)
Challenges of Teaching Computer Science in Transition Countries: Albanian University Case
ERIC Educational Resources Information Center
Sotirofski, Kseanela; Kukeli, Agim; Kalemi, Edlira
2010-01-01
The main objective of our study is to determine the challenges faced during the process of teaching Computer Science in a university of a country in transition and make suggestions to improve this teaching process by perfecting the necessary conditions. Our survey builds on the thesis that we live in an information age; information technology is…
Nutrition and the science of disease prevention: a systems approach to support metabolic health
Bennett, Brian J.; Hall, Kevin D.; Hu, Frank B.; McCartney, Anne L.; Roberto, Christina
2017-01-01
Progress in nutritional science, genetics, computer science, and behavioral economics can be leveraged to address the challenge of noncommunicable disease. This report highlights the connection between nutrition and the complex science of preventing disease and discusses the promotion of optimal metabolic health, building on input from several complementary disciplines. The discussion focuses on (1) the basic science of optimal metabolic health, including data from gene–diet interactions, microbiome, and epidemiological research in nutrition, with the goal of defining better targets and interventions, and (2) how nutrition, from pharma to lifestyle, can build on systems science to address complex issues. PMID:26415028
DOE Office of Scientific and Technical Information (OSTI.GOV)
Almgren, Ann; DeMar, Phil; Vetter, Jeffrey
The widespread use of computing in the American economy would not be possible without a thoughtful, exploratory research and development (R&D) community pushing the performance edge of operating systems, computer languages, and software libraries. These are the tools and building blocks — the hammers, chisels, bricks, and mortar — of the smartphone, the cloud, and the computing services on which we rely. Engineers and scientists need ever-more specialized computing tools to discover new material properties for manufacturing, make energy generation safer and more efficient, and provide insight into the fundamentals of the universe, for example. The research division of the U.S. Department of Energy’s (DOE’s) Office of Advanced Scientific Computing Research (ASCR Research) ensures that these tools and building blocks are being developed and honed to meet the extreme needs of modern science. See also http://exascaleage.org/ascr/ for additional information.
ERIC Educational Resources Information Center
DeWitt, Dorothy; Alias, Norlidah; Siraj, Saedah
2014-01-01
Collaborative problem-solving in science instruction allows learners to build their knowledge and understanding through interaction, using the language of science. Computer-mediated communication (CMC) tools facilitate collaboration and may provide the opportunity for interaction when using the language of science in learning. There seems to be…
ERIC Educational Resources Information Center
Danish, Joshua Adam
2009-01-01
Representations such as drawings, graphs, and computer simulations, are central to learning and doing science. Furthermore, ongoing success in science learning requires students to build on the representations and associated practices that they are presumed to have learned throughout their schooling career. Without these practices, students have…
Using Mental Imagery Processes for Teaching and Research in Mathematics and Computer Science
ERIC Educational Resources Information Center
Arnoux, Pierre; Finkel, Alain
2010-01-01
The role of mental representations in mathematics and computer science (for teaching or research) is often downplayed or even completely ignored. Using an ongoing work on the subject, we argue for a more systematic study and use of mental representations, to get an intuition of mathematical concepts, and also to understand and build proofs. We…
Partnership in Computational Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huray, Paul G.
1999-02-24
This is the final report for the "Partnership in Computational Science" (PICS) award in an amount of $500,000 for the period January 1, 1993 through December 31, 1993. A copy of the proposal with its budget is attached as Appendix A. This report first describes the significance of the DOE award in building high-performance computing infrastructure in the Southeast and then describes the work accomplished under this grant and a list of publications resulting from it.
Building Cognition: The Construction of Computational Representations for Scientific Discovery
ERIC Educational Resources Information Center
Chandrasekharan, Sanjay; Nersessian, Nancy J.
2015-01-01
Novel computational representations, such as simulation models of complex systems and video games for scientific discovery (Foldit, EteRNA etc.), are dramatically changing the way discoveries emerge in science and engineering. The cognitive roles played by such computational representations in discovery are not well understood. We present a…
Enabling Wide-Scale Computer Science Education through Improved Automated Assessment Tools
NASA Astrophysics Data System (ADS)
Boe, Bryce A.
There is a proliferating demand for newly trained computer scientists as the number of computer science related jobs continues to increase. University programs will only be able to train enough new computer scientists to meet this demand when two things happen: when there are more primary and secondary school students interested in computer science, and when university departments have the resources to handle the resulting increase in enrollment. To meet these goals, significant effort is being made to both incorporate computational thinking into existing primary school education, and to support larger university computer science class sizes. We contribute to this effort through the creation and use of improved automated assessment tools. To enable wide-scale computer science education we do two things. First, we create a framework called Hairball to support the static analysis of Scratch programs targeted for fourth, fifth, and sixth grade students. Scratch is a popular building-block language utilized to pique interest in and teach the basics of computer science. We observe that Hairball allows for rapid curriculum alterations and thus contributes to wide-scale deployment of computer science curriculum. Second, we create a real-time feedback and assessment system utilized in university computer science classes to provide better feedback to students while reducing assessment time. Insights from our analysis of student submission data show that modifications to the system configuration support the way students learn and progress through course material, making it possible for instructors to tailor assignments to optimize learning in growing computer science classes.
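Hairball's actual plugin API is not described in the abstract; as a hypothetical flavor of the kind of static check such a tool performs on Scratch programs, the sketch below flags scripts that can never run because they lack a starting "hat" block. The simplified project structure is invented for the example.

```python
# Hypothetical sketch of a Hairball-style static check on a Scratch project:
# flag scripts that can never run because they lack a starting "hat" block.
# Illustrative only; this is not Hairball's actual plugin API, and the
# simplified project structure below is an assumption.

HAT_BLOCKS = {"whenGreenFlag", "whenKeyPressed", "whenClicked"}

def dead_scripts(project):
    """project: {"sprites": [{"name": str, "scripts": [[block, ...], ...]}]}"""
    findings = []
    for sprite in project["sprites"]:
        for i, script in enumerate(sprite["scripts"]):
            if script and script[0] not in HAT_BLOCKS:
                findings.append((sprite["name"], i, script[0]))
    return findings

demo = {"sprites": [{"name": "Cat",
                     "scripts": [["whenGreenFlag", "moveSteps"],
                                 ["moveSteps", "turnRight"]]}]}
print(dead_scripts(demo))  # -> [('Cat', 1, 'moveSteps')]
```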
ERIC Educational Resources Information Center
Knipfer, Kristin; Mayr, Eva; Zahn, Carmen; Schwan, Stephan; Hesse, Friedrich W.
2009-01-01
In this article, the potentials of advanced technologies for learning in science exhibitions are outlined. For this purpose, we conceptualize science exhibitions as "dynamic information space for knowledge building" which includes three pathways of knowledge communication. This article centers on the second pathway, that is, knowledge…
Directory of Federal Contractors and Cognizant Audit Offices
1994-12-31
ERIC Educational Resources Information Center
Blaser, Mark; Larsen, Jamie
1996-01-01
Presents five interactive, computer-based activities that mimic scientific tests used by sport researchers to help companies design high-performance athletic shoes, including impact tests, flexion tests, friction tests, video analysis, and computer modeling. Provides a platform for teachers to build connections between chemistry (polymer science),…
AIA Honors Imaginative Solutions to Common Campus Problems.
ERIC Educational Resources Information Center
Chronicle of Higher Education, 1987
1987-01-01
The American Institute of Architects honored five recently completed university buildings whose architects solved the difficulties of site and scale: Columbia University's Computer Science Building, Dartmouth's Hood Museum of Art, Emory's Museum of Art, Princeton's Lewis Thomas Laboratory, and the University of California at Irvine's Computer…
Machine learning: Trends, perspectives, and prospects.
Jordan, M I; Mitchell, T M
2015-07-17
Machine learning addresses the question of how to build computers that improve automatically through experience. It is one of today's most rapidly growing technical fields, lying at the intersection of computer science and statistics, and at the core of artificial intelligence and data science. Recent progress in machine learning has been driven both by the development of new learning algorithms and theory and by the ongoing explosion in the availability of online data and low-cost computation. The adoption of data-intensive machine-learning methods can be found throughout science, technology and commerce, leading to more evidence-based decision-making across many walks of life, including health care, manufacturing, education, financial modeling, policing, and marketing.
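As a minimal concrete instance of "improving automatically through experience" (our illustration, not from the paper), the sketch below fits a one-parameter model by gradient descent: each pass over the data reduces the prediction error.

```python
# Minimal illustration of learning from experience: a one-parameter model
# y ≈ w * x fit by gradient descent on mean squared error. Data are made up.
data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2), (4.0, 8.1)]  # (x, y) pairs

w, lr = 0.0, 0.01  # initial weight and learning rate
for epoch in range(200):
    # Gradient of mean squared error with respect to w
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad  # each pass over the data ("experience") improves w

print(round(w, 3))  # close to 2.0, the slope underlying the data
```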
24 CFR 570.416 - Hispanic-serving institutions work study program.
Code of Federal Regulations, 2010 CFR
2010-04-01
... to pre-professional careers in these fields. (b) Definitions. The following definitions apply to HSI... such as natural sciences, computer sciences, mathematics, accounting, electronics, engineering, and the... pursuing careers in community building, and make them aware of the availability of assistance opportunities...
NASA Technical Reports Server (NTRS)
Brooks, Rodney Allen; Stein, Lynn Andrea
1994-01-01
We describe a project to capitalize on newly available levels of computational resources in order to understand human cognition. We will build an integrated physical system including vision, sound input and output, and dexterous manipulation, all controlled by a continuously operating large-scale parallel MIMD computer. The resulting system will learn to 'think' by building on its bodily experiences to accomplish progressively more abstract tasks. Past experience suggests that in attempting to build such an integrated system we will have to fundamentally change the way artificial intelligence, cognitive science, linguistics, and philosophy think about the organization of intelligence. We expect to be able to better reconcile the theories that will be developed with current work in neuroscience.
NASA Astrophysics Data System (ADS)
Ni, Lijun
Computing education requires qualified computing teachers. The reality is that too few high schools in the U.S. have computing/computer science teachers with formal computer science (CS) training, and many schools do not have a CS teacher at all. Moreover, teacher retention rates are often low. Beginning teacher attrition is particularly high in secondary education. Therefore, in addition to the need for preparing new CS teachers, we also need to support those teachers we have recruited and trained to become better teachers and continue to teach CS. Teacher education literature, especially teacher identity theory, suggests that a strong sense of teacher identity is a major indicator or feature of committed, qualified teachers. However, under the current educational system in the U.S., it could be challenging to establish teacher identity for high school (HS) CS teachers, e.g., due to a lack of teacher certification for CS. This thesis work centers upon understanding the sense of identity HS CS teachers hold and exploring ways of supporting their identity development through a professional development program: the Disciplinary Commons for Computing Educators (DCCE). DCCE has a major focus on promoting reflection on teaching practice and community building. With scaffolded activities such as course portfolio creation, peer review and peer observation among a group of HS CS teachers, it offers opportunities for CS teachers to explicitly reflect on and narrate their teaching, which is a central process of identity building through their participation within the community. In this thesis research, I explore the development of CS teacher identity through professional development programs. I first conducted an interview study with local HS CS teachers to understand their sense of identity and factors influencing their identity formation. I designed and enacted the professional development program (DCCE) and conducted case studies with DCCE participants to understand how their participation in DCCE supported their identity development as a CS teacher. Overall, I found that these CS teachers held different teacher identities with varied features related to their motivation and commitment in teaching CS. I identified four concrete factors that contributed to these teachers' sense of professional identity as a CS teacher. I addressed some of these issues for CS teachers' identity development (especially the issue of lacking community) through offering professional development opportunities with a major focus on teacher reflection and community building. Results from this work indicate a potential model of supporting CS identity development, mapping the characteristics of the professional development program with particular facets of CS teacher identity. This work offers further understanding of the unique challenges that current CS teachers are facing in their CS teaching, as well as the challenges of preparing and supporting CS teachers. My findings also suggest guidelines for teacher education and professional development program design and implementation for building committed, qualified CS teachers in ways that promote the development of CS teacher identity.
A parallel-processing approach to computing for the geographic sciences
Crane, Michael; Steinwand, Dan; Beckmann, Tim; Krpan, Greg; Haga, Jim; Maddox, Brian; Feller, Mark
2001-01-01
The overarching goal of this project is to build a spatially distributed infrastructure for information science research by forming a team of information science researchers and providing them with similar hardware and software tools to perform collaborative research. Four geographically distributed Centers of the U.S. Geological Survey (USGS) are developing their own clusters of low-cost personal computers into parallel computing environments that provide a cost-effective way for the USGS to increase participation in the high-performance computing community. Referred to as Beowulf clusters, these hybrid systems provide the robust computing power required for conducting research into various areas, such as advanced computer architecture, algorithms to meet the processing needs for real-time image and data processing, the creation of custom datasets from seamless source data, rapid turn-around of products for emergency response, and support for computationally intense spatial and temporal modeling.
Math and science technology access and use in South Dakota public schools grades three through five
NASA Astrophysics Data System (ADS)
Schwietert, Debra L.
The development of K-12 technology standards, soon to be added to state testing of technology proficiency, and the increasing presence of computers in homes and classrooms reflect the growing importance of technology in current society. This study examined math and science teachers' responses on a survey of technology use in grades three through five in South Dakota. A researcher-developed survey instrument was used to collect data from a random sample of 100 public schools throughout South Dakota. Forced-choice and open-ended responses were recorded. Most teachers have access to computers, but they lack resources to purchase software for their content areas, especially in science areas. Three-fourths of teachers in this study reported multiple computers in their classrooms and 67% reported access to labs in other areas of the school building. These numbers are lower than the national average of 84% of teachers with computers in their classrooms and 95% with access to computers elsewhere in the building (USDOE, 2000). Almost eight out of 10 teachers noted time as a barrier to learning more about educational software. Additional barriers included lack of school funds (38%), access to relevant training (32%), personal funds (30%), and poor quality of training (7%). Teachers most often use math and science software as supplemental, with practice tutorials cited as another common use. The most common interest for software was math for both boys and girls. The second most common choice for boys was science and for girls, language arts. Teachers reported that there was no preference for either individual or group work on computers for girls or boys. Most teachers do not systematically evaluate software for gender preferences, but review software subjectively.
Computer-Assisted Microscopy in Science Teaching and Research.
ERIC Educational Resources Information Center
Radice, Gary P.
1997-01-01
Describes a technological approach to teaching the relationships between biological form and function. Computer-assisted image analysis was integrated into a microanatomy course. Students spend less time memorizing and more time observing, measuring, and interpreting, building technical and analytical skills. Appendices list hardware and software…
Bringing computational science to the public.
McDonagh, James L; Barker, Daniel; Alderson, Rosanna G
2016-01-01
The increasing use of computers in science allows for the scientific analyses of large datasets at an increasing pace. We provided examples and interactive demonstrations at Dundee Science Centre as part of the 2015 Women in Science festival, to present aspects of computational science to the general public. We used low-cost Raspberry Pi computers to provide hands-on experience in computer programming and demonstrated the application of computers to biology. Computer games were used as a means to introduce computers to younger visitors. The success of the event was evaluated by voluntary feedback forms completed by visitors, in conjunction with our own self-evaluation. This work builds on the original work of the 4273π bioinformatics education program of Barker et al. (2013, BMC Bioinform. 14:243). 4273π provides open source education materials in bioinformatics. This work looks at the potential to adapt similar materials for public engagement events. It appears, at least in our small sample of visitors (n = 13), that basic computational science can be conveyed to people of all ages by means of interactive demonstrations. Children as young as five were able to successfully edit simple computer programs with supervision. This was, in many cases, their first experience of computer programming. The feedback was predominantly positive, showing strong support for improving computational science education, but also included suggestions for improvement. Our conclusions are necessarily preliminary. However, feedback forms suggest methods were generally well received among the participants; "Easy to follow. Clear explanation" and "Very easy. Demonstrators were very informative." Our event, held at a local Science Centre in Dundee, demonstrates that computer games and programming activities suitable for young children can be performed alongside a more specialised and applied introduction to computational science for older visitors.
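The specific programs visitors edited are not given in the abstract; the sketch below is a guess at the style of activity described: a few obvious lines a supervised child can change (the name, the count) with an immediately visible effect.

```python
# Hypothetical beginner activity of the kind the event describes: edit the
# name or the count below and rerun to see the output change. Not taken
# from the event's actual materials.
name = "Dundee"
times = 3
for i in range(times):
    print(f"Hello, {name}! ({i + 1} of {times})")
```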
Center for Technology for Advanced Scientific Component Software (TASCS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kostadin, Damevski
A resounding success of the Scientific Discovery through Advanced Computing (SciDAC) program is that high-performance computational science is now universally recognized as a critical aspect of scientific discovery [71], complementing both theoretical and experimental research. As scientific communities prepare to exploit unprecedented computing capabilities of emerging leadership-class machines for multi-model simulations at the extreme scale [72], it is more important than ever to address the technical and social challenges of geographically distributed teams that combine expertise in domain science, applied mathematics, and computer science to build robust and flexible codes that can incorporate changes over time. The Center for Technology for Advanced Scientific Component Software (TASCS) tackles these issues by exploiting component-based software development to facilitate collaborative high-performance scientific computing.
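The abstract does not show what component-based software development looks like in code; below is an illustrative sketch of the "port" idea behind CCA-style frameworks such as TASCS, where components interact only through declared interfaces so providers can be swapped without touching their users. The interfaces and names are invented for the example, not the actual TASCS/CCA API.

```python
# Sketch of port-based component composition. Components expose and use
# declared ports (interfaces), so any provider of a port is interchangeable.
from typing import Protocol

class IntegratorPort(Protocol):
    def integrate(self, f, a: float, b: float) -> float: ...

class MidpointIntegrator:
    """One interchangeable provider of the IntegratorPort interface."""
    def integrate(self, f, a, b, n=1000):
        h = (b - a) / n
        return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

class Driver:
    """Uses whatever component is wired to its integrator port."""
    def __init__(self, integrator: IntegratorPort):
        self.integrator = integrator
    def run(self):
        return self.integrator.integrate(lambda x: x * x, 0.0, 1.0)

print(Driver(MidpointIntegrator()).run())  # ≈ 1/3
```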
System-on-Chip Design and Implementation
ERIC Educational Resources Information Center
Brackenbury, L. E. M.; Plana, L. A.; Pepper, J.
2010-01-01
The system-on-chip module described here builds on a grounding in digital hardware and system architecture. It is thus appropriate for third-year undergraduate computer science and computer engineering students, for post-graduate students, and as a training opportunity for post-graduate research students. The course incorporates significant…
Building Real World Domain-Specific Social Network Websites as a Capstone Project
ERIC Educational Resources Information Center
Yue, Kwok-Bun; De Silva, Dilhar; Kim, Dan; Aktepe, Mirac; Nagle, Stewart; Boerger, Chris; Jain, Anubha; Verma, Sunny
2009-01-01
This paper describes our experience of using Content Management Software (CMS), specifically Joomla, to build a real world domain-specific social network site (SNS) as a capstone project for graduate information systems and computer science students. As Web 2.0 technologies become increasingly important in driving business application development,…
Build IT: Scaling and Sustaining an Afterschool Computer Science Program for Girls
ERIC Educational Resources Information Center
Koch, Melissa; Gorges, Torie; Penuel, William R.
2012-01-01
"Co-design"--including youth development staff along with curriculum designers--is the key to developing an effective program that is both scalable and sustainable. This article describes Build IT, a two-year afterschool and summer curriculum designed to help middle school girls develop fluency in information technology (IT), interest in…
ERIC Educational Resources Information Center
Allen, Denise
1994-01-01
Reviews three educational computer software products: (1) a compact disc-read only memory (CD-ROM) bundle of five mathematics programs from the Apple Education Series; (2) "Sammy's Science House," with science activities for preschool through second grade (Edmark); and (3) "The Cat Came Back," an interactive CD-ROM game designed to build language…
Big Data: Next-Generation Machines for Big Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hack, James J.; Papka, Michael E.
Addressing the scientific grand challenges identified by the US Department of Energy’s (DOE’s) Office of Science’s programs alone demands a total leadership-class computing capability of 150 to 400 Pflops by the end of this decade. The successors to three of the DOE’s most powerful leadership-class machines are set to arrive in 2017 and 2018—the products of the Collaboration Oak Ridge Argonne Livermore (CORAL) initiative, a national laboratory–industry design/build approach to engineering next-generation petascale computers for grand challenge science. These mission-critical machines will enable discoveries in key scientific fields such as energy, biotechnology, nanotechnology, materials science, and high-performance computing, and serve as a milestone on the path to deploying exascale computing capabilities.
Blending an Android Development Course with Software Engineering Concepts
ERIC Educational Resources Information Center
Chatzigeorgiou, Alexander; Theodorou, Tryfon L.; Violettas, George E.; Xinogalos, Stelios
2016-01-01
The tremendous popularity of mobile computing and Android in particular has attracted millions of developers who see opportunities for building their own start-ups. As a consequence Computer Science students express an increasing interest into the related technology of Java development for Android applications. Android projects are complex by…
ERIC Educational Resources Information Center
Reyes-Palomares, Armando; Sanchez-Jimenez, Francisca; Medina, Miguel Angel
2009-01-01
A comprehensive understanding of biological functions requires new systemic perspectives, such as those provided by systems biology. Systems biology approaches are hypothesis-driven and involve iterative rounds of model building, prediction, experimentation, model refinement, and development. Developments in computer science are allowing for ever…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Potok, Thomas; Schuman, Catherine; Patton, Robert
The White House and Department of Energy have been instrumental in driving the development of a neuromorphic computing program to help the United States continue its lead in basic research into (1) Beyond Exascale—high performance computing beyond Moore’s Law and von Neumann architectures, (2) Scientific Discovery—new paradigms for understanding increasingly large and complex scientific data, and (3) Emerging Architectures—assessing the potential of neuromorphic and quantum architectures. Neuromorphic computing spans a broad range of scientific disciplines from materials science to devices, to computer science, to neuroscience, all of which are required to solve the neuromorphic computing grand challenge. In our workshop we focus on the computer science aspects, specifically from a neuromorphic device through an application. Neuromorphic devices present a very different paradigm to the computer science community from traditional von Neumann architectures, which raises six major questions about building a neuromorphic application from the device level. We used these fundamental questions to organize the workshop program and to direct the workshop panels and discussions. From the white papers, presentations, panels, and discussions, there emerged several recommendations on how to proceed.
Software Build and Delivery Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robey, Robert W.
2016-07-10
This presentation deals with the hierarchy of software build and delivery systems. One of the goals is to maximize the success rate of new users and developers when first trying your software. First impressions are important. Early successes are important. This also reduces critical documentation costs. This is a presentation focused on computer science and goes into detail about code documentation.
Developing a Technology Enhanced CS0 Course for Engineering Students
ERIC Educational Resources Information Center
Lokkila, Erno; Kaila, Erkki; Lindén, Rolf; Laakso, Mikko-Jussi; Sutinen, Erkki
2016-01-01
The CS0 course in the curriculum typically has the role of introducing students into basic concepts and terminology of computer science. Hence, it is used to form a base on which the subsequent programming courses can build on. However, much of the effort to build better methodologies for courses is spent on introductory programming courses…
Building an Effective Interdisciplinary Professional Master's Degree
ERIC Educational Resources Information Center
Kline, Douglas M.; Vetter, Ron; Barnhill, Karen
2013-01-01
This article describes the creation of the Master of Science of Computer Science and Information Systems at University of North Carolina Wilmington. The creation of this graduate degree was funded by the Sloan Foundation as a new type of program, the Professional Master's. The program was designed with significant industry input, and is truly…
Outcomes from the DOE Workshop on Turbulent Flow Simulation at the Exascale
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sprague, Michael; Boldyrev, Stanislav; Chang, Choong-Seock
This paper summarizes the outcomes from the Turbulent Flow Simulation at the Exascale: Opportunities and Challenges Workshop, which was held 4-5 August 2015, and was sponsored by the U.S. Department of Energy Office of Advanced Scientific Computing Research. The workshop objective was to define and describe the challenges and opportunities that computing at the exascale will bring to turbulent-flow simulations in applied science and technology. The need for accurate simulation of turbulent flows is evident across the U.S. Department of Energy applied-science and engineering portfolios, including combustion, plasma physics, nuclear-reactor physics, wind energy, and atmospheric science. The workshop brought together experts in turbulent-flow simulation, computational mathematics, and high-performance computing. Building upon previous ASCR workshops on exascale computing, participants defined a research agenda and path forward that will enable scientists and engineers to continually leverage, engage, and direct advances in computational systems on the path to exascale computing.
ERIC Educational Resources Information Center
Hanna, Philip; Allen, Angela; Kane, Russell; Anderson, Neil; McGowan, Aidan; Collins, Matthew; Hutchison, Malcolm
2015-01-01
This paper outlines a means of improving the employability skills of first-year university students through a closely integrated model of employer engagement within computer science modules. The outlined approach illustrates how employability skills, including communication, teamwork and time management skills, can be contextualised in a manner…
NASA Technical Reports Server (NTRS)
1986-01-01
The primary purpose of the report is to explore management approaches and technology developments for computation and data management systems designed to meet future needs in the space sciences. The report builds on work presented in previous solar-terrestrial and planetary reports, broadening the outlook to all of the space sciences and considering policy aspects related to coordination between data centers, missions, and ongoing research activities, because it is perceived that the rapid growth of data and the wide geographic distribution of relevant facilities will present especially troublesome problems for data archiving, distribution, and analysis.
Electrical Circuits in the Mathematics/Computer Science Classroom.
ERIC Educational Resources Information Center
McMillan, Robert D.
1988-01-01
Shows how, with little or no electrical background, students can apply Boolean algebra concepts to design and build integrated electrical circuits in the classroom that will reinforce important ideas in mathematics. (PK)
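The article's circuits are not reproduced here; as a classroom-style illustration of the Boolean-algebra step (our example expression, not the article's), the snippet below checks with a truth table that a factored expression matches the original before it is wired up as a circuit.

```python
# Verify a Boolean simplification over all input combinations before
# building the circuit. The expression is our example, not the article's.
from itertools import product

original   = lambda a, b, c: (a and b) or (a and c)
simplified = lambda a, b, c: a and (b or c)   # A·B + A·C = A·(B + C)

for a, b, c in product([False, True], repeat=3):
    assert original(a, b, c) == simplified(a, b, c)
print("The two circuits agree on all 8 input rows.")
```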
Building Systems from Scratch: An Exploratory Study of Students Learning about Climate Change
ERIC Educational Resources Information Center
Puttick, Gillian; Tucker-Raymond, Eli
2018-01-01
Science and computational practices such as modeling and abstraction are critical to understanding the complex systems that are integral to climate science. Given the demonstrated affordances of game design in supporting such practices, we implemented a free 4-day intensive workshop for middle school girls that focused on using the visual…
Health sciences library building projects, 1996-1997 survey.
Bowden, V M
1998-01-01
Nine building projects are briefly described, including four new libraries, two renovations, and three combined renovations and additions. The libraries range in size from 657 square feet to 136,832 square feet, with seating varying from 14 to 635. Three hospital libraries and four academic health sciences libraries are described in more detail. In each case an important consideration was the provision for computer access. Two of the libraries expanded their space for historical collections. Three of the libraries added mobile shelving as a way of storing print materials while providing space for other activities. PMID:9549012
The Open System Interconnection as a building block in a health sciences information network.
Boss, R W
1985-01-01
The interconnection of integrated health sciences library systems with other health sciences computer systems to achieve information networks will require either custom linkages among specific devices or the adoption of standards that all systems support. The most appropriate standards appear to be those being developed under the Open System Interconnection (OSI) reference model, which specifies a set of rules and functions that computers must follow to exchange information. The protocols have been modularized into seven different layers. The lowest three layers are generally available as off-the-shelf interfacing products. The higher layers require special development for particular applications. This paper describes the OSI, its application in health sciences networks, and specific tasks that remain to be undertaken. PMID:4052672
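For readers unfamiliar with the reference model, the sketch below is our own summary (not from the paper) of the seven-layer structure, annotated with the paper's 1985 observation that the lowest three layers were available as off-the-shelf products while the higher layers required application-specific development.

```python
# Illustrative only: the seven OSI layers, with the availability status
# the abstract attributes to each group at the time of writing (1985).
OSI_LAYERS = [
    (1, "Physical",     "off-the-shelf"),
    (2, "Data Link",    "off-the-shelf"),
    (3, "Network",      "off-the-shelf"),
    (4, "Transport",    "application-specific development"),
    (5, "Session",      "application-specific development"),
    (6, "Presentation", "application-specific development"),
    (7, "Application",  "application-specific development"),
]

for number, name, status in OSI_LAYERS:
    print(f"Layer {number}: {name:<12} {status}")
```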
Large-scale visualization projects for teaching software engineering.
Müller, Christoph; Reina, Guido; Burch, Michael; Weiskopf, Daniel
2012-01-01
The University of Stuttgart's software engineering major complements the traditional computer science major with more practice-oriented education. Two-semester software projects in various application areas offered by the university's different computer science institutes are a successful building block in the curriculum. With this realistic, complex project setting, students experience the practice of software engineering, including software development processes, technologies, and soft skills. In particular, visualization-based projects are popular with students. Such projects offer them the opportunity to gain profound knowledge that would hardly be possible with only regular lectures and homework assignments.
ERIC Educational Resources Information Center
Geelan, David R.; Taylor, Peter C.
2004-01-01
Computer mediated communication--including web pages, email and web-based bulletin boards--was used to support the development of a cooperative learning community among students in a web-based distance education unit for practicing science and mathematics educators. The students lived in several Australian states and a number of Pacific Rim…
IEA EBC annex 53: Total energy use in buildings—Analysis and evaluation methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoshino, Hiroshi; Hong, Tianzhen; Nord, Natasa
2017-07-18
One of the most significant barriers to achieving deep building energy efficiency is a lack of knowledge about the factors determining energy use. In fact, there is often a significant discrepancy between designed and real energy use in buildings, which is poorly understood but is believed to have more to do with the role of human behavior than with building design. Building energy use is mainly influenced by six factors: climate, building envelope, building services and energy systems, building operation and maintenance, occupants' activities and behavior, and indoor environmental quality. In the past, much research focused on the first three factors. However, the three human-related factors can have an influence as significant as the first three. Annex 53 employed an interdisciplinary approach, integrating building science, architectural engineering, computer modeling and simulation, and social and behavioral science to develop and apply methods to analyze and evaluate the real energy use in buildings considering the six influencing factors. Finally, outcomes from Annex 53 improved understanding and strengthened knowledge regarding the robust prediction of total energy use in buildings, enabling reliable quantitative assessment of energy-saving measures, policies, and techniques.
ERIC Educational Resources Information Center
Vernooy, D. Andrew; Alter, Kevin
2001-01-01
Presents design features of the University of Texas' Applied Computational Engineering and Sciences Building and discusses how institutions can guide the character of their architecture without subverting the architects' responsibility to confront their contemporary culture in a critical manner. (GR)
Schopf, Jennifer M.; Nitzberg, Bill
2002-01-01
The design and implementation of a national computing system and data grid has become a reachable goal from both the computer science and computational science points of view. A distributed infrastructure capable of sophisticated computational functions can bring many benefits to scientific work, but poses many challenges, both technical and socio-political. Technical challenges include having basic software tools, higher-level services, functioning and pervasive security, and standards, while socio-political issues include building a user community, adding incentives for sites to be part of a user-centric environment, and educating funding sources about the needs of this community. This paper details the areas relating to Grid research that we feel still need to be addressed to fully leverage the advantages of the Grid.
12th Annual Science and Engineering Technology Conference/DoD TECH Exposition
2011-06-23
compound when planning horizons grow: long design-test-build-field-adapt lead-times exacerbate uncertain-futures problems, overload designs, and... ERS Environment ERS: Tools and Technologies to Facilitate Adaptability & Trustability... Tying design, physical and computational testing... science, engineering concepts, processes, and design tools to: • Continuously coordinate design, testing, and production with warfighter review to...
ERIC Educational Resources Information Center
Yoon, Susan A.; Anderson, Emma; Koehler-Yom, Jessica; Evans, Chad; Park, Miyoung; Sheldon, Josh; Schoenfeld, Ilana; Wendel, Daniel; Scheintaub, Hal; Klopfer, Eric
2017-01-01
The recent next generation science standards in the United States have emphasized learning about complex systems as a core feature of science learning. Over the past 15 years, a number of educational tools and theories have been investigated to help students learn about complex systems; but surprisingly, little research has been devoted to…
Griffiths, Thomas L; Lieder, Falk; Goodman, Noah D
2015-04-01
Marr's levels of analysis (computational, algorithmic, and implementation) have served cognitive science well over the last 30 years. But the recent increase in the popularity of the computational level raises a new challenge: How do we begin to relate models at different levels of analysis? We propose that it is possible to define levels of analysis that lie between the computational and the algorithmic, providing a way to build a bridge between computational- and algorithmic-level models. The key idea is to push the notion of rationality, often used in defining computational-level models, deeper toward the algorithmic level. We offer a simple recipe for reverse-engineering the mind's cognitive strategies by deriving optimal algorithms for a series of increasingly more realistic abstract computational architectures, which we call "resource-rational analysis." Copyright © 2015 Cognitive Science Society, Inc.
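The resource-rational idea can be illustrated with a toy calculation of our own (the reward and cost numbers below are assumed, not the authors'): a strategy is rational only relative to the cost of the computation it requires, so as computation becomes more expensive, cheaper heuristics become the optimal choice.

```python
# Toy sketch of resource-rational analysis (our illustration, not the
# authors' model): pick the strategy whose expected reward, net of an
# assumed computational cost, is highest for a given cost per step.
strategies = {
    # name: (expected task reward, computation steps required) -- assumed values
    "exhaustive search": (1.00, 1000),
    "sampling (n=10)":   (0.90, 10),
    "single heuristic":  (0.75, 1),
}

def resource_rational_choice(cost_per_step: float) -> str:
    return max(strategies,
               key=lambda s: strategies[s][0] - cost_per_step * strategies[s][1])

# As cognitive time grows more expensive, cheaper algorithms become rational.
for cost in (0.00001, 0.001, 0.1):
    print(cost, "->", resource_rational_choice(cost))
```

Run as written, the choice shifts from exhaustive search to sampling to the single heuristic as the cost per step rises, which is the qualitative pattern the recipe is meant to capture.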
The NASA Science Internet: An integrated approach to networking
NASA Technical Reports Server (NTRS)
Rounds, Fred
1991-01-01
An integrated approach to building a networking infrastructure is an absolute necessity for meeting the multidisciplinary science networking requirements of the Office of Space Science and Applications (OSSA) science community. These networking requirements include communication connectivity between computational resources, databases, and library systems, as well as to other scientists and researchers around the world. A consolidated networking approach allows strategic use of the existing science networking within the Federal government, and it provides networking capability that takes into consideration national and international trends towards multivendor and multiprotocol service. It also offers a practical vehicle for optimizing costs and maximizing performance. Finally, and perhaps most important to the development of high-speed computing, an integrated network constitutes a focus for phasing to the National Research and Education Network (NREN). The NASA Science Internet (NSI) program, established in mid-1988, is structured to provide just such an integrated network. A description of the NSI is presented.
ERIC Educational Resources Information Center
Tang, Kok-Sing; Tan, Seng-Chee
2017-01-01
The study in this article examines and illustrates the intertextual meanings made by a group of high school science students as they embarked on a knowledge building discourse to solve a physics problem. This study is situated in a computer-supported collaborative learning (CSCL) environment designed to support student learning through a science…
Research in Information Processing and Computer Science. Final Technical Report.
ERIC Educational Resources Information Center
Carnegie-Mellon Univ., Pittsburgh, PA. Social Studies Curriculum Center.
This is the final scientific research report for the research in programing at Carnegie-Mellon University during 1968-1970. Three team programing efforts during the past two years have been the development of (1) BLISS--a system building language on the PDP-10 computer, (2) LC2--a conversational system on the IBM/360, and (3) L*--a system building…
Unmet needs for analyzing biological big data: A survey of 704 NSF principal investigators.
Barone, Lindsay; Williams, Jason; Micklos, David
2017-10-01
In a 2016 survey of 704 National Science Foundation (NSF) Biological Sciences Directorate principal investigators (BIO PIs), nearly 90% indicated they are currently or will soon be analyzing large data sets. BIO PIs considered a range of computational needs important to their work, including high performance computing (HPC), bioinformatics support, multistep workflows, updated analysis software, and the ability to store, share, and publish data. Previous studies in the United States and Canada emphasized infrastructure needs. However, BIO PIs said the most pressing unmet needs are training in data integration, data management, and scaling analyses for HPC, acknowledging that data science skills will be required to build a deeper understanding of life. This portends a growing data knowledge gap in biology and challenges institutions and funding agencies to redouble their support for computational training in biology. PMID:29049281
NASA Astrophysics Data System (ADS)
Podrasky, A.; Covitt, B. A.; Woessner, W.
2017-12-01
The availability of clean water to support human uses and ecological integrity has become an urgent interest for many scientists, decision makers and citizens. Likewise, as computational capabilities increasingly revolutionize and become integral to the practice of science, technology, engineering and math (STEM) disciplines, the STEM+ Computing (STEM+C) Partnerships program seeks to integrate the use of computational approaches in K-12 STEM teaching and learning. The Comp Hydro project, funded by a STEM+C grant from the National Science Foundation, brings together a diverse team of scientists, educators, professionals and citizens at sites in Arizona, Colorado, Maryland and Montana to foster water literacy, as well as computational science literacy, by integrating authentic, place- and data-based learning using physical, mathematical, computational and conceptual models. This multi-state project is currently engaging four teams of six teachers who work during two academic years with educators and scientists at each site. Teams work to develop instructional units specific to their region that integrate hydrologic science and computational modeling. The units, currently being piloted in high school earth and environmental science classes, provide a classroom context to investigate student understanding of how computation is used in Earth systems science. To develop effective science instruction that is rich in place- and data-based learning, effective collaborations between researchers, educators, scientists, professionals and citizens are crucial. In this poster, we focus on project implementation in Montana, where an instructional unit has been developed and is being tested through collaboration among University scientists, researchers and educators, high school teachers, and agency and industry scientists and engineers. In particular, we discuss three characteristics of effective collaborative science education design for developing and implementing place- and data-based science education to support students in developing socio-scientific and computational literacy sufficient for making decisions about real-world issues such as groundwater contamination. These characteristics include that science education experiences are real, responsive/accessible, and rigorous.
Building a bioinformatics community of practice through library education programs.
Moore, Margaret E; Vaughan, K T L; Hayes, Barrie E
2004-01-01
This paper addresses the following questions: What makes the community of practice concept an intriguing framework for developing library services for bioinformatics? What is the campus context and setting? What has been the Health Sciences Library's role in bioinformatics at the University of North Carolina (UNC) Chapel Hill? What are the Health Sciences Library's goals? What services are currently offered? How will these services be evaluated and developed? How can libraries demonstrate their value? Providing library services for an emerging community such as bioinformatics and computational biology presents special challenges for libraries, including understanding needs, defining and communicating the library's role, building relationships within the community, preparing staff, and securing funding. Like many academic health sciences libraries, the University of North Carolina (UNC) at Chapel Hill Health Sciences Library is addressing these challenges in the context of its overall mission and goals.
NASA Technical Reports Server (NTRS)
2004-01-01
Since its founding in 1992, Global Science & Technology, Inc. (GST), of Greenbelt, Maryland, has been developing technologies and providing services in support of NASA scientific research. GST specialties include scientific analysis, science data and information systems, data visualization, communications, networking and Web technologies, computer science, and software system engineering. As a longtime contractor to Goddard Space Flight Center's Earth Science Directorate, GST scientific, engineering, and information technology staff have extensive qualifications with the synthesis of satellite, in situ, and Earth science data for weather- and climate-related projects. GST's experience in this arena is end-to-end, from building satellite ground receiving systems and science data systems, to product generation and research and analysis.
Training in Methods in Computational Neuroscience
1989-11-14
...inferior colliculus served as inputs to a sheet of 100 cells within the medial geniculate body, where combination sensitivity is first observed. Inputs from... The course is for advanced graduate students and postdoctoral fellows in neurobiology, physics, electrical engineering, computer science, and psychology...
NASA Astrophysics Data System (ADS)
Hart, Quyen N.
2015-01-01
We present a successful model for organizing a small University-sponsored summer camp that integrates astronomy and physics content with other science disciplines and computer programming content. The aim of our science and technology camp is to engage middle school students in a wide array of critical thinking tasks and hands-on activities centered on science and technology. Additionally, our program seeks to increase and maintain STEM interest among children, particularly in under-represented populations (e.g., Hispanic, African-American, women, and lower socioeconomic individuals), with hopes of decreasing disparities in diversity across many STEM fields. During this four-day camp, organized and facilitated by faculty volunteers, activities rotated through many STEM modules, including optics, telescopes, circuit building, computer hardware, and programming. Specifically, we scaffold camp activities to build upon similar ideas and content where possible. Using knowledge and skills gained through the AAS Astronomy Ambassadors program, we were able to integrate several astronomy activities into the camp, lead students through engaging activities, and conduct educational research. We present best practices for piloting a similar program in a university environment, describe our efforts to connect the learning outcomes common across all the modules, specifically in astronomy and physics, outline future camp activities, and report survey results on the impact of camp activities on attitudes toward science, technology, and science careers.
The Computing and Data Grid Approach: Infrastructure for Distributed Science Applications
NASA Technical Reports Server (NTRS)
Johnston, William E.
2002-01-01
With the advent of Grids - infrastructure for using and managing widely distributed computing and data resources in the science environment - there is now an opportunity to provide a standard, large-scale, computing, data, instrument, and collaboration environment for science that spans many different projects and provides the required infrastructure and services in a relatively uniform and supportable way. Grid technology has evolved over the past several years to provide the services and infrastructure needed for building 'virtual' systems and organizations. We argue that Grid technology provides an excellent basis for the creation of the integrated environments that can combine the resources needed to support the large-scale science projects located at multiple laboratories and universities. We present some science case studies that indicate that a paradigm shift in the process of science will come about as a result of Grids providing transparent and secure access to advanced and integrated information and technologies infrastructure: powerful computing systems, large-scale data archives, scientific instruments, and collaboration tools. These changes will be in the form of services that can be integrated with the user's work environment, and that enable uniform and highly capable access to these computers, data, and instruments, regardless of the location or exact nature of these resources. These services will integrate transient-use resources like computing systems, scientific instruments, and data caches (e.g., as they are needed to perform a simulation or analyze data from a single experiment); persistent-use resources, such as databases, data catalogues, and archives; and collaborators, whose involvement will continue for the lifetime of a project or longer. While we largely address large-scale science in this paper, Grids, particularly when combined with Web Services, will address a broad spectrum of science scenarios, both large and small scale.
NASA Astrophysics Data System (ADS)
Fournier, Frederic
The learning environment created during this research and development constitutes a micro-laboratory allowing students at the secondary and collegial levels to build a measurement system. This approach, based on the concrete manufacture of measuring instruments, showed that students not only acquired knowledge but also developed know-how in the technology of measuring systems, as well as know-how and knowledge in the experimental sciences. In conceptualizing and building their own measurement system in a computer-assisted experimental environment, students perform a scientific investigation in which they must induce a causal relationship between the different variables at stake. They must then isolate this relationship by building a scheme for the control of the variables and model it in algebraic and graphic form. We believe that this approach will allow students to better understand the physical phenomena they will be measuring. The prototypes and software used to build these measuring instruments were evaluated and redesigned at the functional and didactic levels in order to offer a learning environment that fully respects the competency-based approach and the integration of science and technology.
Indoor Multi-Sensor Acquisition System for Projects on Energy Renovation of Buildings.
Armesto, Julia; Sánchez-Villanueva, Claudio; Patiño-Cambeiro, Faustino; Patiño-Barbeito, Faustino
2016-05-28
Energy rehabilitation actions in buildings have become a great economic opportunity for the construction sector. They also constitute a strategic goal in the European Union (EU), given the energy dependence and the climate-change commitments of its member states. About 75% of existing buildings in the EU were built when energy efficiency codes had not yet been developed. Approximately 75% to 90% of those standing buildings are expected to remain in use in 2050. Significant advances have been achieved in energy analysis, simulation tools, and computational fluid dynamics for building energy evaluation. However, the gap between predictions and real savings might still be improved. The geomatics and computer science disciplines can really help in modelling, inspection, and diagnosis procedures. This paper presents a multi-sensor acquisition system capable of automatically and simultaneously capturing the three-dimensional geometric information, thermographic, optical, and panoramic images, ambient temperature map, relative humidity map, and light level map. The system integrates a navigation system based on a Simultaneous Localization and Mapping (SLAM) approach that allows georeferencing every data point to its position in the building. The described equipment optimizes the energy inspection and diagnosis steps and facilitates the energy modelling of the building. PMID:27240379
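The georeferencing step this abstract describes can be sketched in a few lines; the code below is our own illustration (not the authors' software), assuming a simple planar pose. Given a platform position and heading estimated by SLAM, each sensor reading taken at a known offset from the platform is mapped into the building's coordinate frame.

```python
import numpy as np

# Illustrative sketch (ours): georeference a sensor reading using the pose
# (x, y, heading) estimated by a SLAM system -- the core step that ties
# every measurement (temperature, humidity, light) to a building location.
def georeference(pose_xy: np.ndarray, heading_rad: float,
                 offset_xy: np.ndarray) -> np.ndarray:
    """Map a sensor offset in the platform frame to building coordinates."""
    c, s = np.cos(heading_rad), np.sin(heading_rad)
    rotation = np.array([[c, -s], [s, c]])
    return pose_xy + rotation @ offset_xy

# A thermographic reading taken 0.5 m ahead of the platform, which is at
# (12.3, 4.7) in the building frame and facing "up" the y-axis:
pose = np.array([12.3, 4.7])
reading_position = georeference(pose, np.pi / 2, np.array([0.5, 0.0]))
print(reading_position)  # approximately [12.3, 5.2]
```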
Tolaymat, Thabet; El Badawy, Amro; Sequeira, Reynold; Genaidy, Ash
2015-11-15
There is an urgent need for broad and integrated studies that address the risks of engineered nanomaterials (ENMs) along the different endpoints of the society, environment, and economy (SEE) complex adaptive system. This article presents an integrated science-based methodology to assess the potential risks of engineered nanomaterials. To achieve the study objective, two major tasks are accomplished: knowledge synthesis and an algorithmic computational methodology. The knowledge synthesis task is designed to capture "what is known" and to outline the gaps in knowledge from an ENM risk perspective. The algorithmic computational methodology is geared toward the provision of decisions and an understanding of the risks of ENMs along different endpoints for the constituents of the SEE complex adaptive system. The approach presented herein allows for addressing the formidable task of assessing the implications and risks of exposure to ENMs, with the long-term goal to build a decision-support system to guide key stakeholders in the SEE system towards building sustainable ENMs and nano-enabled products. Published by Elsevier B.V.
Advanced Training Techniques Using Computer Generated Imagery.
1981-09-15
Annual Technical Report for the period 16 May 1980 - 15 July 1981, prepared for the Air Force Office of Scientific Research, Directorate of Life Sciences.
3D Topological Indoor Building Modeling Integrated with Open Street Map
NASA Astrophysics Data System (ADS)
Jamali, A.; Rahman, A. Abdul; Boguslawski, P.
2016-09-01
Considering the various fields of application for building surveying and their various demands, the geometry representation of a building is the most crucial aspect of a building survey. The interiors of buildings need to be described, along with the relative locations of rooms, corridors, doors, and exits, in many kinds of emergency response, such as fire, bombs, smoke, and pollution. Topological representation is a challenging task within the Geographic Information Science (GIS) environment, as the data structures required to express these relationships are particularly difficult to develop. Even within the Computer Aided Design (CAD) community, the structures for expressing the relationships between adjacent building parts are complex and often incomplete. In this paper, an integration of 3D topological indoor building modeling in the Dual Half-Edge (DHE) data structure and an outdoor navigation network from Open Street Map (OSM) is presented.
NASA Astrophysics Data System (ADS)
Nguyen, L.; Chee, T.; Minnis, P.; Spangenberg, D.; Ayers, J. K.; Palikonda, R.; Vakhnin, A.; Dubois, R.; Murphy, P. R.
2014-12-01
The processing, storage and dissemination of satellite cloud and radiation products produced at NASA Langley Research Center are key activities for the Climate Science Branch. A constellation of systems operates in sync to accomplish these goals. Because of the complexity involved with operating such intricate systems, there are both high failure rates and high costs for hardware and system maintenance. Cloud computing has the potential to ameliorate cost and complexity issues. Over time, the cloud computing model has evolved and hybrid systems comprising off-site as well as on-site resources are now common. Towards our mission of providing the highest quality research products to the widest audience, we have explored the use of the Amazon Web Services (AWS) Cloud and Storage and present a case study of our results and efforts. This project builds upon NASA Langley Cloud and Radiation Group's experience with operating large and complex computing infrastructures in a reliable and cost effective manner to explore novel ways to leverage cloud computing resources in the atmospheric science environment. Our case study presents the project requirements and then examines the fit of AWS with the LaRC computing model. We also discuss the evaluation metrics, feasibility, and outcomes and close the case study with the lessons we learned that would apply to others interested in exploring the implementation of the AWS system in their own atmospheric science computing environments.
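As a minimal sketch of the hybrid on-site/off-site pattern such a case study explores (our own example, not the branch's operational code; the bucket, key layout, and file name are hypothetical), a product generated on local systems can be pushed to AWS S3 for dissemination using the standard boto3 client:

```python
import boto3

# A minimal sketch (ours): publish a locally generated cloud-and-radiation
# product file to AWS S3 so the dissemination side can run off-site.
# Requires configured AWS credentials and an existing bucket.
s3 = boto3.client("s3")

def publish_product(local_path: str, key: str,
                    bucket: str = "example-cloud-products") -> None:
    """Upload one product file to S3 for public dissemination."""
    s3.upload_file(local_path, bucket, key)

# Example call (assumes the file and bucket exist):
# publish_product("satellite_cloud_product.nc",
#                 "products/2014/satellite_cloud_product.nc")
```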
Critical Thinking Traits of Top-Tier Experts and Implications for Computer Science Education
2007-08-01
...field of cognitive theory," [Papert 1999] used his work while developing the Logo programming language. Although other researchers had developed... of computer expert systems influenced the development of current theories dealing with cognitive abilities. One of the most important initiatives by... multitude of factors involved. He also builds on the cognitive development work of Piaget and is not ready to abandon the generalist approach. Instead, he...
A Long History of Supercomputing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grider, Gary
As part of its national security science mission, Los Alamos National Laboratory and HPC have a long, entwined history dating back to the earliest days of computing. From bringing the first problem to the nation’s first computer to building the first machine to break the petaflop barrier, Los Alamos holds many “firsts” in HPC breakthroughs. Today, supercomputers are integral to stockpile stewardship and the Laboratory continues to work with vendors in developing the future of HPC.
NASA Astrophysics Data System (ADS)
Makino, Junichiro
2002-12-01
We overview our GRAvity PipE (GRAPE) project to develop special-purpose computers for astrophysical N-body simulations. The basic idea of GRAPE is to attach a custom-built computer dedicated to the calculation of gravitational interactions between particles to a general-purpose programmable computer. With this hybrid architecture, we can achieve both a wide range of applications and very high peak performance. Our newest machine, GRAPE-6, achieved a peak speed of 32 Tflops, and sustained performance of 11.55 Tflops, for a total budget of about 4 million USD. We also discuss relative advantages of special-purpose and general-purpose computers and the future of high-performance computing for science and technology.
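The kernel that GRAPE hardwires is the pairwise gravitational interaction. A minimal direct-summation sketch (ours, with an assumed softening length; not the GRAPE pipeline itself) shows the O(N²) computation the custom hardware accelerates:

```python
import numpy as np

# Minimal direct-summation sketch (ours) of the O(N^2) gravitational kernel
# that GRAPE-style hardware accelerates: the acceleration on each particle
# from all others, with a softening length eps to avoid the r -> 0 singularity.
def accelerations(pos: np.ndarray, mass: np.ndarray,
                  G: float = 1.0, eps: float = 1e-3) -> np.ndarray:
    dx = pos[None, :, :] - pos[:, None, :]      # pairwise separation vectors
    r2 = (dx ** 2).sum(-1) + eps ** 2           # softened squared distances
    inv_r3 = r2 ** -1.5
    np.fill_diagonal(inv_r3, 0.0)               # exclude self-interaction
    return G * (inv_r3[:, :, None] * dx * mass[None, :, None]).sum(axis=1)

rng = np.random.default_rng(0)
pos = rng.standard_normal((100, 3))             # 100 particles in 3D
mass = np.full(100, 1.0 / 100)                  # equal masses, total mass 1
print(accelerations(pos, mass).shape)           # -> (100, 3)
```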
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spentzouris, Panagiotis; Cary, John
The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modeling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single-physics-process modeling (covered under SciDAC-1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors.
NASA Astrophysics Data System (ADS)
Graves, S. J.; Keiser, K.; Law, E.; Yang, C. P.; Djorgovski, S. G.
2016-12-01
ECITE (EarthCube Integration and Testing Environment) is providing both cloud-based computational testing resources and an Assessment Framework for Technology Interoperability and Integration. NSF's EarthCube program is funding the development of cyberinfrastructure building block components as technologies to address Earth science research problems. These EarthCube building blocks need to support integration and interoperability objectives to work towards a coherent cyberinfrastructure architecture for the program. ECITE is being developed to provide capabilities to test and assess the interoperability and integration across funded EarthCube technology projects. EarthCube defined criteria for interoperability and integration are applied to use cases coordinating science problems with technology solutions. The Assessment Framework facilitates planning, execution and documentation of the technology assessments for review by the EarthCube community. This presentation will describe the components of ECITE and examine the methodology of cross walking between science and technology use cases.
Model-Based Building Verification in Aerial Photographs.
1987-09-01
...In this paper, we have proposed an experimental knowledge-based verification system; the organization for change detection is outlined. Knowledge rules and...
ERIC Educational Resources Information Center
Kitts, Christopher; Quinn, Neil
2004-01-01
Santa Clara University's Robotic Systems Laboratory conducts an aggressive robotic development and operations program in which interdisciplinary teams of undergraduate students build and deploy a wide range of robotic systems, ranging from underwater vehicles to spacecraft. These year-long projects expose students to the breadth of and…
ERIC Educational Resources Information Center
Stowe, Ryan; Elvey, Jacob
2016-01-01
Chemistry in high school is often presented as a jumbled mass of topics drawn from inorganic, analytical, and physical sub-disciplines. With no central theme to build on, students may have trouble grasping the chemical sciences as a coherent field. In this article, Stowe and Elvey describe an activity that integrates different facets of chemistry…
Asking Research Questions: Theoretical Presuppositions
ERIC Educational Resources Information Center
Tenenberg, Josh
2014-01-01
Asking significant research questions is a crucial aspect of building a research foundation in computer science (CS) education. In this article, I argue that the questions that we ask are shaped by internalized theoretical presuppositions about how the social and behavioral worlds operate. And although such presuppositions are essential in making…
2010-10-18
...August 2010 was building the right game" – World of Warcraft has 30% women (according to womengamers.com). Conclusion: We don't really understand why... Report of the National Academies on Informal Learning • Infancy to late adulthood: Learn about the world & develop important skills for science... Education With Rigor and Vigor – Excitement, interest, and motivation to learn about phenomena in the natural and physical world. – Generate...
Critical thinking traits of top-tier experts and implications for computer science education
NASA Astrophysics Data System (ADS)
Bushey, Dean E.
A documented shortage of technical leadership and top-tier performers in computer science jeopardizes the technological edge, security, and economic well-being of the nation. The 2005 President's Information Technology Advisory Committee (PITAC) report on competitiveness in the computational sciences highlights the major impact of science, technology, and innovation in keeping America competitive in the global marketplace. It stresses the fact that the supply of science, technology, and engineering experts is at the core of America's technological edge, national competitiveness, and security. However, recent data show that both undergraduate and postgraduate production of computer scientists is falling. The decline is "a quiet crisis building in the United States," a crisis that, if allowed to continue unchecked, could endanger America's well-being and preeminence among the world's nations. Past research on expert performance has shown that the cognitive traits of critical thinking, creativity, and problem solving possessed by top-tier performers can be identified, observed, and measured. The studies show that the identified attributes are applicable across many domains and disciplines. Companies have begun to realize that cognitive skills are important for high-level performance and are reevaluating the traditional academic standards they have used to predict success for their top-tier performers in computer science. Previous research in the computer science field has focused either on the programming skills of its experts or has attempted to predict the academic success of students at the undergraduate level. This study, on the other hand, examines the critical-thinking skills found among experts in the computer science field in order to explore the questions, "What cognitive skills do outstanding performers possess that make them successful?" and "How do currently used measures of academic performance correlate to critical-thinking skills among students?" The results of this study suggest a need to examine how critical-thinking abilities are learned in the undergraduate computer science curriculum and the need to foster these abilities in order to produce the high-level, critical-thinking professionals necessary to fill the growing need for these experts. Because current measures of academic performance do not adequately depict students' cognitive abilities, assessment of these skills must be incorporated into existing curricula.
Kapur, Tina; Pieper, Steve; Fedorov, Andriy; Fillion-Robin, J-C; Halle, Michael; O'Donnell, Lauren; Lasso, Andras; Ungi, Tamas; Pinter, Csaba; Finet, Julien; Pujol, Sonia; Jagadeesan, Jayender; Tokuda, Junichi; Norton, Isaiah; Estepar, Raul San Jose; Gering, David; Aerts, Hugo J W L; Jakab, Marianna; Hata, Nobuhiko; Ibanez, Luiz; Blezek, Daniel; Miller, Jim; Aylward, Stephen; Grimson, W Eric L; Fichtinger, Gabor; Wells, William M; Lorensen, William E; Schroeder, Will; Kikinis, Ron
2016-10-01
The National Alliance for Medical Image Computing (NA-MIC) was launched in 2004 with the goal of investigating and developing an open source software infrastructure for the extraction of information and knowledge from medical images using computational methods. Several leading research and engineering groups participated in this effort that was funded by the US National Institutes of Health through a variety of infrastructure grants. This effort transformed 3D Slicer from an internal, Boston-based, academic research software application into a professionally maintained, robust, open source platform with an international leadership and developer and user communities. Critical improvements to the widely used underlying open source libraries and tools (VTK, ITK, CMake, CDash, DCMTK) were an additional consequence of this effort. This project has contributed to close to a thousand peer-reviewed publications and a growing portfolio of US and international funded efforts expanding the use of these tools in new medical computing applications every year. In this editorial, we discuss what we believe are gaps in the way medical image computing is pursued today; how a well-executed research platform can enable discovery, innovation and reproducible science ("Open Science"); and how our quest to build such a software platform has evolved into a productive and rewarding social engineering exercise in building an open-access community with a shared vision. Copyright © 2016 Elsevier B.V. All rights reserved.
Distribution Locational Real-Time Pricing Based Smart Building Control and Management
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hao, Jun; Dai, Xiaoxiao; Zhang, Yingchen
This paper proposes a real-virtual parallel computing scheme for smart building operations aiming at augmenting overall social welfare. The University of Denver's campus power grid and Ritchie fitness center are used for demonstrating the proposed approach. An artificial virtual system is built in parallel to the real physical system to evaluate the overall social cost of the building operation, based on a social-science-based working productivity model, a numerical-experiment-based building energy consumption model, and a power-system-based real-time pricing mechanism. Through interactive feedback exchanged between the real and virtual systems, enlarged social welfare, including monetary cost reduction and energy saving, as well as working productivity improvements, can be achieved.
NASA Astrophysics Data System (ADS)
Crease, Robert P.
2008-04-01
There are few more dramatic illustrations of the vicissitudes of laboratory architecture than the contrast between Building 20 at the Massachusetts Institute of Technology (MIT) and its replacement, the Ray and Maria Stata Center. Building 20 was built hurriedly in 1943 as temporary housing for MIT's famous Rad Lab, the site of wartime radar research, and it remained a productive laboratory space for over half a century. A decade ago it was demolished to make way for the Stata Center, an architecturally striking building designed by Frank Gehry to house MIT's computer science and artificial intelligence labs. But in 2004, just two years after the Stata Center officially opened, the building was criticized for being unsuitable for research and became the subject of still ongoing lawsuits alleging design and construction failures.
NASA Astrophysics Data System (ADS)
Anderson, T.
2015-12-01
The Northeast Fisheries Science Center's (NEFSC) Student Drifters Program is providing education opportunities for students of all ages. Using GPS-tracked ocean drifters, various educational institutions can provide students with hands-on experience in physical oceanography, engineering, and computer science. In building drifters, many high school and undergraduate students may focus on drifter construction, sometimes designing their own drifter or attempting to improve current NEFSC models. While learning basic oceanography, younger students can build drifters with the help of an educator and directions available on the studentdrifters.org website. Once drifters are deployed, often by a local mariner or oceanographic partner, drifter tracks can be visualised on maps provided at http://nefsc.noaa.gov/drifter. With the lesson plans available for those interested in computer science, students may download, process, and plot the drifter position data with the basic Python code provided. Drifter tracks help students to visualize ocean currents, and also allow them to understand real particle tracking applications such as in search and rescue, oil spill dispersion, larval transport, and the movement of injured sea animals. Additionally, ocean circulation modelers can use student drifter paths to validate their models. The Student Drifters Program has worked with over 100 schools, several of them having deployed drifters on the West Coast. Funding for the program often comes from individual schools and small grants but in the future will preferably come from larger government grants. NSF, Sea-Grant, NOAA, and EPA are all possible sources of funding, especially with the support of multiple schools and large marine education associations. The Student Drifters Program is a unique resource for educators, students, and scientists alike.
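In the spirit of the lesson plans mentioned above, a basic track plot takes only a few lines. The sketch below is our own, not the NEFSC-provided code, and the file name and column names are assumptions rather than the program's actual data format:

```python
import pandas as pd
import matplotlib.pyplot as plt

# Our own sketch: plot one drifter's GPS track from a CSV of timestamped
# positions. Assumed columns: time, lat, lon; the file name is hypothetical.
track = pd.read_csv("drifter_positions.csv", parse_dates=["time"])

plt.plot(track["lon"], track["lat"], marker=".", linewidth=0.8)
plt.xlabel("Longitude")
plt.ylabel("Latitude")
plt.title("Drifter track")
plt.show()
```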
Research on application of intelligent computation based LUCC model in urbanization process
NASA Astrophysics Data System (ADS)
Chen, Zemin
2007-06-01
Global change study is an interdisciplinary and comprehensive research activity with international cooperation, which arose in the 1980s and has the largest scope. The interaction between land use and cover change (LUCC), as a research field at the crossing of natural and social science, has become one of the core subjects of global change study as well as its front edge and hot point. It is necessary to develop research on land use and cover change in the urbanization process and to build an analog model of urbanization in order to describe, simulate, and analyze dynamic behaviors in urban development change as well as to understand the basic characteristics and rules of the urbanization process. This has positive practical and theoretical significance for formulating urban and regional sustainable development strategies. The effect of urbanization on land use and cover change is mainly embodied in the change of the quantity structure and space structure of urban space, and the LUCC model in the urbanization process has been an important research subject of urban geography and urban planning. In this paper, based upon previous research achievements, the writer systematically analyzes the research on land use/cover change in the urbanization process with the theories of complexity science and intelligent computation; builds a model for simulating and forecasting the dynamic evolution of urban land use and cover change on the basis of the cellular automaton model of complexity science and multi-agent theory; expands the Markov model, the traditional CA model, and the Agent model; introduces complexity science and intelligent computation theory into the LUCC research model to build an intelligent-computation-based LUCC model for analog research on land use and cover change in urbanization research; and performs case research. The concrete contents are as follows:
1. Complexity of LUCC research in the urbanization process. The urbanization process is analyzed in combination with the contents of complexity science research and the conception of complexity features to reveal the complexity of LUCC research in the urbanization process. The urban space system is a complex economic and cultural phenomenon as well as a social process; it is the comprehensive characterization of urban society, economy, and culture, and a complex space system formed by society, economy, and nature. It has dissipative structure characteristics such as openness, dynamics, self-organization, and non-equilibrium. Traditional models cannot simulate these social, economic, and natural driving forces of LUCC, including the main feedback relations from LUCC to the driving forces.
2. Establishment of an extended Markov model for LUCC analog research in the urbanization process. Firstly, traditional LUCC research models are used to compute the change speed of regional land use through calculating the dynamic degree, exploitation degree, and consumption degree of land use; the theory of fuzzy sets is then used to rewrite the traditional Markov model, establish the structure transfer matrix of land use, and forecast and analyze the dynamic change and development trend of land use; noticeable problems and corresponding measures in the urbanization process are presented according to the research results.
3. Application of intelligent computation and complexity science research methods in the LUCC analog model in the urbanization process. On the basis of a detailed elaboration of the theory and models of LUCC research in the urbanization process, the problems of existing models used in LUCC research (namely, the difficulty of resolving many complexity phenomena in the complex urban space system) are analyzed, and possible structural realization forms of LUCC analog research are discussed in combination with the theories of intelligent computation and complexity science. Application analyses are performed on the BP artificial neural network and genetic algorithms of intelligent computation and on the CA model and MAS technology of complexity science research; their theoretical origins and respective characteristics are discussed in detail, their feasibility in LUCC analog research is elaborated, and improvement methods and measures for the existing problems of this kind of model are brought forward.
4. Establishment of an LUCC analog model in the urbanization process based on the theories of intelligent computation and complexity science. Based on the abovementioned research on the BP artificial neural network, genetic algorithms, the CA model, and multi-agent technology, improvement methods and application assumptions for their expansion in geography are put forward; an LUCC analog model in the urbanization process is built based on the CA model and the Agent model; the learning mechanism of the BP artificial neural network is combined with fuzzy logic reasoning, so that rules are expressed with explicit formulas and the initial rules are amended through self-study; and the network structure of the LUCC analog model and the methods and procedures for model parameters are optimized with genetic algorithms.
In this paper, I introduce the research theory and methods of complexity science into LUCC analog research and present an LUCC analog model based upon the CA model and MAS theory. Meanwhile, I carry out a corresponding expansion of the traditional Markov model and introduce the theory of fuzzy sets into the data screening and parameter amendment of the improved model to improve the accuracy and feasibility of the Markov model in research on land use/cover change.
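The Markov step at the heart of the extended model can be illustrated with a toy transition-matrix projection. This is our own sketch with invented classes and probabilities, not the paper's calibrated matrix:

```python
import numpy as np

# Toy sketch (ours) of the Markov step underlying the model above: project
# land-use class shares forward with a structure transfer matrix. The
# classes and probabilities are illustrative, not the paper's values.
classes = ["urban", "farmland", "forest"]
shares = np.array([0.20, 0.50, 0.30])       # current land-use composition

# T[i, j]: probability that class i converts to class j over one period.
T = np.array([[0.98, 0.01, 0.01],
              [0.10, 0.85, 0.05],
              [0.05, 0.05, 0.90]])

for step in range(3):                        # three periods of urban growth
    shares = shares @ T
    print(step + 1, dict(zip(classes, shares.round(3))))
```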
ERIC Educational Resources Information Center
Korchnoy, Evgeny; Verner, Igor M.
2010-01-01
Growing popularity of robotics education motivates developing its didactics and studying it in teacher training programs. This paper presents a study conducted in the Department of Education in Technology and Science, Technion, in which university students and school pupils cope with robotics challenges of designing, building and operating…
Playing by Programming: Making Gameplay a Programming Activity
ERIC Educational Resources Information Center
Weintrop, David; Wilensky, Uri
2016-01-01
Video games are an oft-cited reason for young learners getting interested in programming and computer science. As such, many learning opportunities build on this interest by having kids program their own video games. This approach, while sometimes successful, has its drawbacks stemming from the fact that the challenge of programming and game…
Mechanical Modeling and Computer Simulation of Protein Folding
ERIC Educational Resources Information Center
Prigozhin, Maxim B.; Scott, Gregory E.; Denos, Sharlene
2014-01-01
In this activity, science education and modern technology are bridged to teach students at the high school and undergraduate levels about protein folding and to strengthen their model building skills. Students are guided from a textbook picture of a protein as a rigid crystal structure to a more realistic view: proteins are highly dynamic…
Language Learning in Mindbodyworld: A Sociocognitive Approach to Second Language Acquisition
ERIC Educational Resources Information Center
Atkinson, Dwight
2014-01-01
Based on recent research in cognitive science, interaction, and second language acquisition (SLA), I describe a sociocognitive approach to SLA. This approach adopts a "non-cognitivist" view of cognition: Instead of an isolated computational process in which input is extracted from the environment and used to build elaborate internal…
ERIC Educational Resources Information Center
Hannemann, Jim; Rice, Thomas R.
1991-01-01
At the Oakland Technical Center, which provides vocational programs for nine Michigan high schools, a one-semester course in Foundations of Technology Systems uses a computer-simulated manufacturing environment to teach applied math, science, language arts, communication skills, problem solving, and teamwork in the context of technology education.…
Avenues for crowd science in Hydrology.
NASA Astrophysics Data System (ADS)
Koch, Julian; Stisen, Simon
2016-04-01
Crowd science describes research that is conducted with the participation of the general public (the crowd) and gives the opportunity to involve the crowd in research design, data collection, and analysis. In various fields, scientists have already drawn on underused human resources to advance research at low cost, with high transparency and broad public acceptance due to the bottom-up structure and the participatory process. Within the hydrological sciences, crowd research has quite recently become more established in the form of crowd observatories that generate hydrological data on water quality, precipitation, or river flow. These innovative observatories complement more traditional ways of monitoring hydrological data and strengthen community-based environmental decision making. However, the full potential of crowd science lies in internet-based participation of the crowd, and it is not yet fully exploited in the field of Hydrology. New avenues have to emerge that are not primarily based on the outsourcing of labor, but instead capitalize on the full potential of human capabilities. In multiple realms of complex problem solving, like image detection, optimization tasks, and the narrowing of possible solutions, humans still remain more effective than computer algorithms. The most successful online crowd science projects, Foldit and Galaxy Zoo, have proven that a collective of tens of thousands of users can clearly outperform traditional computer-based science approaches. Our study takes advantage of well-trained human perception to conduct a spatial sensitivity analysis of land-surface variables of a distributed hydrological model and identify the most sensitive spatial inputs. True spatial performance metrics that quantitatively compare patterns are not trivial to choose, and their applicability is often not universal. On the other hand, humans can quickly integrate spatial information at various scales and are therefore a trusted resource. We selected Zooniverse, the most popular crowd science platform, where over a million registered users contribute to various research projects, to build a survey of human perception. The survey will be shown during the interactive discussion; moreover, for building future avenues of crowd science in Hydrology, the following questions should be discussed: (1) What hydrological problems are suitable for an internet-based crowd science application? (2) How can the complex problem be abstracted to a medium that appeals to the crowd? (3) How can good science with reliable results be secured? (4) Can the crowd replace existing and established computer-based applications like parameter optimization or forecasting at all?
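As one example of the kind of quantitative pattern metric whose choice the authors call non-trivial, a Pearson correlation between two spatial fields can be computed in a few lines. This is our illustration, not the study's metric:

```python
import numpy as np

# Illustration (ours): one simple quantitative pattern metric, the Pearson
# correlation between two spatial fields, of the kind the abstract notes
# is hard to choose well -- which motivates the human-perception survey.
def pattern_correlation(field_a: np.ndarray, field_b: np.ndarray) -> float:
    return float(np.corrcoef(field_a.ravel(), field_b.ravel())[0, 1])

rng = np.random.default_rng(1)
simulated = rng.random((50, 50))                    # a modeled spatial field
observed = simulated + 0.3 * rng.random((50, 50))   # a correlated variant
print(round(pattern_correlation(simulated, observed), 3))
```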
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dress, W.B.
Rosen's modeling relation is embedded in Popper's three worlds to provide a heuristic tool for model building and a guide for thinking about complex systems. The utility of this construct is demonstrated by suggesting a solution to the problem of pseudoscience and a resolution of the famous Bohr-Einstein debates. A theory of bizarre systems is presented by an analogy with entangled particles of quantum mechanics. This theory underscores the poverty of present-day computational systems (e.g., computers) for creating complex and bizarre entities by distinguishing between mechanism and organism.
GeoBrain Computational Cyber-laboratory for Earth Science Studies
NASA Astrophysics Data System (ADS)
Deng, M.; di, L.
2009-12-01
Computational approaches (e.g., computer-based data visualization, analysis and modeling) are critical for conducting increasingly data-intensive Earth science (ES) studies to understand functions and changes of the Earth system. However, Earth scientists, educators, and students currently face two major barriers that prevent them from effectively using computational approaches in their learning, research and application activities: 1) difficulties in finding, obtaining, and using multi-source ES data; and 2) lack of analytic functions and computing resources (e.g., analysis software, computing models, and high performance computing systems) to analyze the data. Taking advantage of recent advances in cyberinfrastructure, Web service, and geospatial interoperability technologies, GeoBrain, a project funded by NASA, has developed a prototype computational cyber-laboratory to effectively remove the two barriers. The cyber-laboratory makes ES data and computational resources at large organizations in distributed locations available to and easily usable by the Earth science community through 1) enabling seamless discovery, access and retrieval of distributed data, 2) federating and enhancing data discovery with a catalogue federation service and a semantically-augmented catalogue service, 3) customizing data access and retrieval at user request with interoperable, personalized, and on-demand data access and services, 4) automating or semi-automating multi-source geospatial data integration, 5) developing a large number of analytic functions as value-added, interoperable, and dynamically chainable geospatial Web services and deploying them in high-performance computing facilities, 6) enabling online geospatial process modeling and execution, and 7) building a user-friendly extensible web portal for users to access the cyber-laboratory resources. Users can interactively discover the needed data and perform on-demand data analysis and modeling through the web portal. The GeoBrain cyber-laboratory provides solutions to meet common needs of ES research and education, such as distributed data access and analysis services, easy access to and use of ES data, and enhanced geoprocessing and geospatial modeling capability. It greatly facilitates ES research, education, and applications. The development of the cyber-laboratory provides insights, lessons learned, and technology readiness to build more capable computing infrastructure for ES studies, which can meet the wide-ranging needs of current and future generations of scientists, researchers, educators, and students for their formal or informal educational training, research projects, career development, and lifelong learning.
Gaussian curvature analysis allows for automatic block placement in multi-block hexahedral meshing.
Ramme, Austin J; Shivanna, Kiran H; Magnotta, Vincent A; Grosland, Nicole M
2011-10-01
Musculoskeletal finite element analysis (FEA) has been essential to research in orthopaedic biomechanics. The generation of a volumetric mesh is often the most challenging step in a FEA. Hexahedral meshing tools that are based on a multi-block approach rely on the manual placement of building blocks for their mesh generation scheme. We hypothesise that Gaussian curvature analysis could be used to automatically develop a building block structure for multi-block hexahedral mesh generation. The Automated Building Block Algorithm incorporates principles from differential geometry, combinatorics, statistical analysis and computer science to automatically generate a building block structure to represent a given surface without prior information. We have applied this algorithm to 29 bones of varying geometries and successfully generated a usable mesh in all cases. This work represents a significant advancement in automating the definition of building blocks.
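A minimal sketch of the curvature quantity underlying such an approach, assuming the common angle-deficit estimate of Gaussian curvature on a triangle mesh; this is illustrative only and is not the authors' Automated Building Block Algorithm:

```python
# Angle-deficit estimate of Gaussian curvature at mesh vertices: near zero on
# flat regions, large at sharp features. A generic sketch, not the paper's code.
import numpy as np

def angle_deficits(vertices, triangles):
    """For each vertex, return 2*pi minus the sum of incident triangle angles."""
    deficits = np.full(len(vertices), 2.0 * np.pi)
    for i, j, k in triangles:
        p = vertices[[i, j, k]]
        for a, b, c in ((0, 1, 2), (1, 2, 0), (2, 0, 1)):
            u, v = p[b] - p[a], p[c] - p[a]
            cos_t = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
            deficits[[i, j, k][a]] -= np.arccos(np.clip(cos_t, -1.0, 1.0))
    return deficits

# Unit tetrahedron: every vertex carries positive curvature.
verts = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
tris = [(0, 1, 2), (0, 1, 3), (0, 2, 3), (1, 2, 3)]
print(angle_deficits(verts, tris))
```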
Enabling Discoveries in Earth Sciences Through the Geosciences Network (GEON)
NASA Astrophysics Data System (ADS)
Seber, D.; Baru, C.; Memon, A.; Lin, K.; Youn, C.
2005-12-01
Taking advantage of state-of-the-art information technology resources, GEON researchers are building a cyberinfrastructure designed to enable data sharing, semantic data integration, high-end computations and 4D visualization in easy-to-use web-based environments. The GEON Network currently allows users to search and register Earth science resources such as data sets (GIS layers, GMT files, geoTIFF images, ASCII files, relational databases, etc.), software applications or ontologies. Portal-based access mechanisms enable developers to build dynamic user interfaces to conduct advanced processing and modeling efforts across distributed computers and supercomputers. Researchers and educators can access the networked resources through the GEON portal and its portlets that were developed to conduct better and more comprehensive science and educational studies. For example, the SYNSEIS portlet in GEON enables users to access near-real-time seismic waveforms from the IRIS Data Management Center, easily build a 3D geologic model within the area of the seismic station(s) and the epicenter, and perform a 3D synthetic seismogram analysis to understand the lithospheric structure and earthquake source parameters for any given earthquake in the US. Similarly, GEON's workbench area enables users to create their own work environment and copy, visualize and analyze any data sets within the network, and create subsets of the data sets for their own purposes. Since all these resources are built as part of a Service-Oriented Architecture (SOA), they can also be used in other development platforms. One such platform is the Kepler workflow system, which can access web service based resources and provides users with graphical programming interfaces to build a model to conduct computations and/or visualization efforts using the networked resources. Developments in the area of semantic integration of the networked datasets continue to advance, and prototype studies can be accessed via the GEON portal at www.geongrid.org.
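For a flavor of the kind of IRIS DMC waveform request a portlet like SYNSEIS wraps, here is a hedged sketch using the (more recent) ObsPy FDSN client; the station, channel, and time window are made-up example values, and this is not GEON's actual service code:

```python
# Illustrative only: fetch waveforms from the IRIS Data Management Center
# with ObsPy's FDSN client. Station/channel/time values are assumed examples.
from obspy import UTCDateTime
from obspy.clients.fdsn import Client

client = Client("IRIS")
t0 = UTCDateTime("2005-06-15T02:50:54")  # hypothetical event origin time
stream = client.get_waveforms(network="IU", station="ANMO", location="00",
                              channel="BHZ", starttime=t0, endtime=t0 + 600)
print(stream)   # summary of the retrieved traces
stream.plot()   # quick look at the seismogram
```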
ISMB 2016 offers outstanding science, networking, and celebration
Fogg, Christiana
2016-01-01
The annual international conference on Intelligent Systems for Molecular Biology (ISMB) is the major meeting of the International Society for Computational Biology (ISCB). Over the past 23 years the ISMB conference has grown to become the world's largest bioinformatics/computational biology conference. ISMB 2016 will be the year's most important computational biology event globally. The conferences provide a multidisciplinary forum for disseminating the latest developments in bioinformatics/computational biology. ISMB brings together scientists from computer science, molecular biology, mathematics, statistics and related fields. Its principal focus is on the development and application of advanced computational methods for biological problems. ISMB 2016 offers the strongest scientific program and the broadest scope of any international bioinformatics/computational biology conference. Building on past successes, the conference is designed to cater to a variety of disciplines within the bioinformatics/computational biology community. ISMB 2016 takes place July 8 - 12 at the Swan and Dolphin Hotel in Orlando, Florida, United States. For two days preceding the conference, additional opportunities including Satellite Meetings, Student Council Symposium, and a selection of Special Interest Group Meetings and Applied Knowledge Exchange Sessions (AKES) are all offered to enable registered participants to learn more on the latest methods and tools within specialty research areas. PMID:27347392
Machine learning for Big Data analytics in plants.
Ma, Chuang; Zhang, Hao Helen; Wang, Xiangfeng
2014-12-01
Rapid advances in high-throughput genomic technology have enabled biology to enter the era of 'Big Data' (large datasets). The plant science community not only needs to build its own Big-Data-compatible parallel computing and data management infrastructures, but also to seek novel analytical paradigms to extract information from the overwhelming amounts of data. Machine learning offers promising computational and analytical solutions for the integrative analysis of large, heterogeneous and unstructured datasets on the Big-Data scale, and is gradually gaining popularity in biology. This review introduces the basic concepts and procedures of machine-learning applications and envisages how machine learning could interface with Big Data technology to facilitate basic research and biotechnology in the plant sciences.
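As a generic illustration of the kind of workflow such a review surveys (not an example from the paper itself), here is a small scikit-learn sketch on simulated gene-expression-style data:

```python
# Random-forest classification on a synthetic "samples x genes" matrix.
# Data and labels are simulated for illustration; this is not plant data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
X = rng.normal(size=(200, 500))          # 200 samples x 500 "genes"
y = (X[:, 0] + X[:, 1] > 0).astype(int)  # labels driven by two features

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", accuracy_score(y_te, model.predict(X_te)))
```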
DOE pushes for useful quantum computing
NASA Astrophysics Data System (ADS)
Cho, Adrian
2018-01-01
The U.S. Department of Energy (DOE) is joining the quest to develop quantum computers, devices that would exploit quantum mechanics to crack problems that overwhelm conventional computers. The initiative comes as Google and other companies race to build a quantum computer that can demonstrate "quantum supremacy" by beating classical computers on a test problem. But reaching that milestone will not mean practical uses are at hand, and the new $40 million DOE effort is intended to spur the development of useful quantum computing algorithms for its work in chemistry, materials science, nuclear physics, and particle physics. With the resources at its 17 national laboratories, DOE could play a key role in developing the machines, researchers say, although finding problems with which quantum computers can help isn't so easy.
NASA Astrophysics Data System (ADS)
Ewald, Mary Lou
2002-10-01
As a land-grant institution, Auburn University is committed to serving the citizens of Alabama through extension services and outreach programs. In following this outreach focus, the College of Sciences and Mathematics (COSAM) at AU has dedicated considerable resources to science and math related K-12 outreach programs, including two of our newest student-aimed programs: Youth Experiences in Science (YES) and Alabama BEST. Youth Experiences in Science (YES) is a Saturday enrichment program for middle school students. It includes a Fall and Spring Saturday component and a Summer camp experience. Activities include: LEGO's with Computers; Blood, Diseases & Forensics; Geometry of Models & Games; GPS Mapping; Polymer Chemistry; Electronics; and Genetics. Last year (2001-02), over 400 students attended a YES program on our campus. Alabama BEST (Boosting Engineering, Science & Technology) is a middle and high school robotics competition co-sponsored by COSAM and the College of Engineering at AU. Teams of students design and build robots and compete in a game format, with a new game theme introduced each year. This year, sixty teams from across Alabama and Georgia will have six weeks to design, build and perfect their robots before competition on October 18 and 19.
An integrated science-based methodology to assess potential ...
There is an urgent need for broad and integrated studies that address the risks of engineered nanomaterials (ENMs) along the different endpoints of the society, environment, and economy (SEE) complex adaptive system. This article presents an integrated science-based methodology to assess the potential risks of engineered nanomaterials. To achieve the study objective, two major tasks are accomplished: knowledge synthesis and an algorithmic computational methodology. The knowledge synthesis task is designed to capture “what is known” and to outline the gaps in knowledge from an ENMs risk perspective. The algorithmic computational methodology is geared toward the provision of decisions and an understanding of the risks of ENMs along different endpoints for the constituents of the SEE complex adaptive system. The approach presented herein allows for addressing the formidable task of assessing the implications and risks of exposure to ENMs, with the long term goal to build a decision-support system to guide key stakeholders in the SEE system towards building sustainable ENMs and nano-enabled products. The following specific aims are formulated to achieve the study objective: (1) to propose a system of systems (SoS) architecture that builds a network management among the different entities in the large SEE system to track the flow of ENMs emission, fate and transport from the source to the receptor; (2) to establish a staged approach for knowledge synthesis methodology…
ERIC Educational Resources Information Center
Turcotte, Sandrine; Hamel, Christine
2016-01-01
This study addressed computer-supported collaborative scientific inquiries in remote networked schools (Quebec, Canada). Three dyads of Grade 5-6 classrooms from remote locations across the province collaborated using the knowledge-building tool Knowledge Forum. Customized scaffold supports embedded in the online tool were used to support student…
Workflow Management Systems for Molecular Dynamics on Leadership Computers
NASA Astrophysics Data System (ADS)
Wells, Jack; Panitkin, Sergey; Oleynik, Danila; Jha, Shantenu
Molecular Dynamics (MD) simulations play an important role in a range of disciplines from materials science to biophysical systems and account for a large fraction of cycles consumed on computing resources. Increasingly, science problems require the successful execution of "many" MD simulations as opposed to a single MD simulation. There is a need to provide scalable and flexible approaches to the execution of the workload. We present preliminary results on the Titan computer at the Oak Ridge Leadership Computing Facility that demonstrate a general capability to manage workload execution agnostic of a specific MD simulation kernel or execution pattern, and in a manner that integrates disparate grid-based and supercomputing resources. Our results build upon our extensive experience of distributed workload management in the high-energy physics ATLAS project using PanDA (Production and Distributed Analysis System), coupled with recent conceptual advances in our understanding of workload management on heterogeneous resources. We will discuss how we will generalize these initial capabilities towards a more production-level service on DOE leadership resources. This research is sponsored by US DOE/ASCR and used resources of the OLCF computing facility.
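A toy sketch of the "many simulations" workload pattern described here, using a plain Python worker pool; `run_md.sh` is a hypothetical launcher script, and this is in no way PanDA itself:

```python
# Toy many-task manager: run a batch of independent MD-style jobs with a
# fixed-size worker pool. Illustrates the workload pattern only.
import subprocess
from concurrent.futures import ThreadPoolExecutor, as_completed

def run_simulation(task_id: int) -> tuple[int, int]:
    """Launch one MD task (hypothetical launcher) and report its exit code."""
    result = subprocess.run(["bash", "run_md.sh", str(task_id)],
                            capture_output=True, text=True)
    return task_id, result.returncode

with ThreadPoolExecutor(max_workers=8) as pool:
    futures = [pool.submit(run_simulation, i) for i in range(64)]
    for fut in as_completed(futures):
        tid, rc = fut.result()
        print(f"task {tid} finished with exit code {rc}")
```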
REVEAL: An Extensible Reduced Order Model Builder for Simulation and Modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Agarwal, Khushbu; Sharma, Poorva; Ma, Jinliang
2013-04-30
Many science domains need to build computationally efficient and accurate representations of high fidelity, computationally expensive simulations. These computationally efficient versions are known as reduced-order models. This paper presents the design and implementation of a novel reduced-order model (ROM) builder, the REVEAL toolset. This toolset generates ROMs based on science- and engineering-domain specific simulations executed on high performance computing (HPC) platforms. The toolset encompasses a range of sampling and regression methods that can be used to generate a ROM, automatically quantifies the ROM accuracy, and provides support for an iterative approach to improve ROM accuracy. REVEAL is designed to be extensible in order to utilize the core functionality with any simulator that has published input and output formats. It also defines programmatic interfaces to include new sampling and regression techniques so that users can 'mix and match' mathematical techniques to best suit the characteristics of their model. In this paper, we describe the architecture of REVEAL and demonstrate its usage with a computational fluid dynamics model used in carbon capture.
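A minimal sketch in the spirit of the sample-then-regress loop described above, assuming Latin hypercube sampling and polynomial regression as the "mix and match" choices; it stands in for, but is not, the REVEAL toolset, and `expensive_simulation()` is a placeholder for an HPC run:

```python
# Build a cheap surrogate (ROM) for an expensive simulator, then check its
# accuracy on held-out samples. Generic sketch; requires scipy >= 1.7.
import numpy as np
from scipy.stats import qmc
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

def expensive_simulation(x):
    """Placeholder for a high-fidelity simulator with 2 inputs in [0, 1)."""
    return np.sin(3 * x[:, 0]) + x[:, 1] ** 2

sampler = qmc.LatinHypercube(d=2, seed=1)   # space-filling input samples
X_train = sampler.random(n=64)
X_test = sampler.random(n=32)

rom = make_pipeline(PolynomialFeatures(degree=4), LinearRegression())
rom.fit(X_train, expensive_simulation(X_train))

err = np.abs(rom.predict(X_test) - expensive_simulation(X_test))
print("max ROM error on held-out samples:", err.max())
```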
Quantum Testbeds Stakeholder Workshop (QTSW) Report meeting purpose and agenda.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hebner, Gregory A.
Quantum computing (QC) is a promising early-stage technology with the potential to provide scientific computing capabilities far beyond what is possible with even an Exascale computer in specific problems of relevance to the Office of Science. These include (but are not limited to) materials modeling, molecular dynamics, and quantum chromodynamics. However, commercial QC systems are not yet available, and the technical maturity of current QC hardware, software, algorithms, and systems integration is woefully incomplete. Thus, there is a significant opportunity for DOE to define the technology building blocks and solve the system integration issues to enable a revolutionary tool. Once realized, QC will have world-changing impact on economic competitiveness, the scientific enterprise, and citizen well-being. Prior to this workshop, DOE/Office of Advanced Scientific Computing Research (ASCR) hosted a workshop in 2015 to explore QC scientific applications. The goal of that workshop was to assess the viability of QC technologies to meet the computational requirements in support of DOE's science and energy mission and to identify the potential impact of these technologies.
An Expert System toward Building An Earth Science Knowledge Graph
NASA Astrophysics Data System (ADS)
Zhang, J.; Duan, X.; Ramachandran, R.; Lee, T. J.; Bao, Q.; Gatlin, P. N.; Maskey, M.
2017-12-01
In this ongoing work, we aim to build foundations of Cognitive Computing for Earth Science research. The goal of our project is to develop an end-to-end automated methodology for incrementally constructing Knowledge Graphs for Earth Science (KG4ES). These knowledge graphs can then serve as the foundational components for building cognitive systems in Earth science, enabling researchers to uncover new patterns and hypotheses that are virtually impossible to identify today. In addition, this research focuses on developing mining algorithms needed to exploit these constructed knowledge graphs. As such, these graphs will free knowledge from publications that are generated in a very linear, deterministic manner, and structure knowledge in a way that users can both interact and connect with relevant pieces of information. Our major contributions are two-fold. First, we have developed an end-to-end methodology for constructing Knowledge Graphs for Earth Science (KG4ES) using existing corpus of journal papers and reports. One of the key challenges in any machine learning, especially deep learning applications, is the need for robust and large training datasets. We have developed techniques capable of automatically retraining models and incrementally building and updating KG4ES, based on ever evolving training data. We also adopt the evaluation instrument based on common research methodologies used in Earth science research, especially in Atmospheric Science. Second, we have developed an algorithm to infer new knowledge that can exploit the constructed KG4ES. In more detail, we have developed a network prediction algorithm aiming to explore and predict possible new connections in the KG4ES and aid in new knowledge discovery.
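Since the abstract does not publish the KG4ES prediction algorithm itself, the sketch below only demonstrates the link-prediction task on a toy graph, scoring currently-absent edges with a standard Jaccard neighborhood similarity; the node names are invented:

```python
# Link prediction on a toy knowledge graph: rank absent edges by the Jaccard
# similarity of the endpoints' neighborhoods. Demonstrates the task only.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("precipitation", "soil moisture"), ("soil moisture", "evaporation"),
    ("precipitation", "runoff"), ("runoff", "streamflow"),
    ("soil moisture", "streamflow"),
])

# High scores suggest plausible new connections worth surfacing to a user.
candidates = nx.jaccard_coefficient(G, [("precipitation", "streamflow"),
                                        ("evaporation", "runoff")])
for u, v, score in candidates:
    print(f"{u} -- {v}: {score:.2f}")
```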
Building a Semantic Framework for eScience
NASA Astrophysics Data System (ADS)
Movva, S.; Ramachandran, R.; Maskey, M.; Li, X.
2009-12-01
The e-Science vision focuses on the use of advanced computing technologies to support scientists. Recent research efforts in this area have focused primarily on "enabling" use of infrastructure resources for both data and computational access, especially in the geosciences. One gap in existing e-Science efforts has been the failure to incorporate stable semantic technologies within the design process itself. In this presentation, we describe our effort in designing a framework for e-Science built using a Service-Oriented Architecture. Our framework provides users capabilities to create science workflows and mine distributed data. Our e-Science framework is being designed around a mass-market tool to promote reusability across many projects. Semantics is an integral part of this framework, and our design goal is to leverage the latest stable semantic technologies. The use of these stable semantic technologies will provide users of our framework useful features such as: allowing search engines to find their content with RDFa tags; creating an RDF triple store for their content; creating RDF endpoints to share with others; and semantically mashing their content with other online content available as RDF endpoints.
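A small, hypothetical illustration of the RDF layer such a framework exposes, built with rdflib; the vocabulary, URIs, and workflow names are invented for the example:

```python
# Build a few RDF triples and serialize them as Turtle, the kind of output
# an RDF endpoint would share. Vocabulary and URIs are made up.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF, RDFS

ESCI = Namespace("http://example.org/escience/")  # hypothetical vocabulary
g = Graph()
g.bind("esci", ESCI)

workflow = URIRef(ESCI["workflow/42"])
g.add((workflow, RDF.type, ESCI.ScienceWorkflow))
g.add((workflow, RDFS.label, Literal("Storm-cell detection workflow")))
g.add((workflow, ESCI.consumesDataset, URIRef(ESCI["dataset/modis-l2"])))

print(g.serialize(format="turtle"))
```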
NASA Astrophysics Data System (ADS)
Blikstein, Paulo
The goal of this dissertation is to explore relations between content, representation, and pedagogy, so as to understand the impact of the nascent field of complexity sciences on science, technology, engineering and mathematics (STEM) learning. Wilensky & Papert coined the term "structurations" to express the relationship between knowledge and its representational infrastructure; a change from one representational infrastructure to another they call a "restructuration." The complexity sciences have introduced a novel and powerful structuration: agent-based modeling. In contradistinction to traditional mathematical modeling, which relies on equational descriptions of macroscopic properties of systems, agent-based modeling focuses on a few archetypical micro-behaviors of "agents" to explain emergent macro-behaviors of the agent collective. Specifically, this dissertation is about a series of studies of undergraduate students' learning of materials science, in which two structurations are compared (equational and agent-based), consisting of both design research and empirical evaluation. I have designed MaterialSim, a constructionist suite of computer models, supporting materials and learning activities designed within the approach of agent-based modeling, and over four years conducted an empirical investigation of an undergraduate materials science course. The dissertation comprises three studies. Study 1 - diagnosis: I investigate current representational and pedagogical practices in engineering classrooms. Study 2 - laboratory studies: I investigate the cognition of students engaging in scientific inquiry through programming their own scientific models. Study 3 - classroom implementation: I investigate the characteristics, advantages, and trajectories of scientific content knowledge that is articulated in epistemic forms and representational infrastructures unique to complexity sciences, as well as the feasibility of the integration of constructionist, agent-based learning environments in engineering classrooms. Data sources include classroom observations, interviews, videotaped sessions of model-building, questionnaires, analysis of computer-generated logfiles, and quantitative and qualitative analysis of artifacts. Results show that (1) current representational and pedagogical practices in engineering classrooms were not up to the challenge of the complex content being taught, (2) by building their own scientific models, students developed a deeper understanding of core scientific concepts, and learned how to better identify unifying principles and behaviors in materials science, and (3) programming computer models was feasible within a regular engineering classroom.
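For readers unfamiliar with the agent-based structuration, here is a generic Potts-style sketch in which grains coarsen purely from local agent rules; it is in the spirit of grain-growth models like those in MaterialSim but is not the dissertation's actual code:

```python
# Zero-temperature Potts-style Monte Carlo: each lattice site ("agent") tends
# to adopt a neighbor-compatible orientation, so grains coarsen from local
# rules alone, with no macroscopic equation in sight. Generic illustration.
import numpy as np

rng = np.random.default_rng(0)
N, STATES, STEPS = 64, 8, 200_000
grid = rng.integers(STATES, size=(N, N))  # random initial orientations

def boundary_energy(g, i, j, s):
    """Count unlike nearest neighbors (periodic boundaries)."""
    nbrs = [g[(i - 1) % N, j], g[(i + 1) % N, j],
            g[i, (j - 1) % N], g[i, (j + 1) % N]]
    return sum(s != n for n in nbrs)

for _ in range(STEPS):
    i, j = rng.integers(N, size=2)
    trial = rng.integers(STATES)
    # Accept the flip whenever it does not increase local boundary energy.
    if boundary_energy(grid, i, j, trial) <= boundary_energy(grid, i, j, grid[i, j]):
        grid[i, j] = trial

print("distinct grain orientations remaining:", len(np.unique(grid)))
```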
Get immersed in the Soil Sciences: the first community of avatars in the EGU Assembly 2015!
NASA Astrophysics Data System (ADS)
Castillo, Sebastian; Alarcón, Purificación; Beato, Mamen; Emilio Guerrero, José; José Martínez, Juan; Pérez, Cristina; Ortiz, Leovigilda; Taguas, Encarnación V.
2015-04-01
Virtual reality and immersive worlds refer to artificial computer-generated environments with which users act and interact as in a known environment through the use of figurative virtual individuals (avatars). Virtual environments will be the technology of the early twenty-first century that will most dramatically change the way we live, particularly in the areas of training and education, product development and entertainment (Schmorrow, 2009). The usefulness of immersive worlds has been proved in different fields. They reduce geographic and social barriers between different stakeholders and create virtual social spaces which can positively impact learning and discussion outcomes (Lorenzo et al. 2012). In this work we present a series of interactive meetings in a virtual building, to celebrate the International Year of Soil and to promote the importance of soil functions and soil conservation. In a virtual room, the avatars of different senior researchers will meet young scientist avatars to talk about: 1) what remains to be done in Soil Sciences; 2) which are their main current limitations and difficulties; and 3) which are the future hot research lines. Interactive participation does not require physically attending the EGU Assembly 2015. In addition, this virtual building inspired by Soil Sciences can be completed with different teaching resources from different locations around the world, and it will be used to improve the learning of Soil Sciences in a multicultural context. REFERENCES: Lorenzo C.M., Sicilia, M.A., Sánchez S. 2012. Studying the effectiveness of multi-user immersive environments for collaborative evaluation tasks. Computers & Education 59 (2012) 1361-1376. Schmorrow D.D. 2009. "Why virtual?" Theoretical Issues in Ergonomics Science 10(3): 279-282.
Optimization of knowledge-based systems and expert system building tools
NASA Technical Reports Server (NTRS)
Yasuda, Phyllis; Mckellar, Donald
1993-01-01
The objectives of the NASA-AMES Cooperative Agreement were to investigate, develop, and evaluate, via test cases, the system parameters and processing algorithms that constrain the overall performance of the Information Sciences Division's Artificial Intelligence Research Facility. Written reports covering various aspects of the grant were submitted to the co-investigators for the grant. Research studies concentrated on the field of artificial intelligence knowledge-based systems technology. Activities included the following areas: (1) AI training classes; (2) merging optical and digital processing; (3) science experiment remote coaching; (4) SSF data management system tests; (5) computer integrated documentation project; (6) conservation of design knowledge project; (7) project management calendar and reporting system; (8) automation and robotics technology assessment; (9) advanced computer architectures and operating systems; and (10) honors program.
Schweppe, M; Geigel, J
2011-01-01
Industry has increasingly emphasized the need for "soft" or interpersonal skills development and team-building experience in the college curriculum. Here, we discuss our experiences with providing such opportunities via a collaborative project called the Virtual Theater. In this joint project between the Rochester Institute of Technology's School of Design and Department of Computer Science, the goal is to enable live performance in a virtual space with participants in different physical locales. Students work in teams, collaborating with other students in and out of their disciplines.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peffer, Therese; Blumstein, Carl; Culler, David
The Project uses state-of-the-art computer science to extend the benefits of Building Automation Systems (BAS), typically found in large buildings (>100,000 square feet), to medium-sized commercial buildings (<50,000 square feet). The BAS developed in this project, termed OpenBAS, uses an open-source and open software architecture platform, user interface, and plug-and-play control devices to facilitate adoption of energy efficiency strategies in the commercial building sector throughout the United States. At the heart of this "turn key" BAS is the platform with three types of controllers (thermostat, lighting controller, and general controller) that are easily "discovered" by the platform in a plug-and-play fashion. The user interface showcases the platform and provides the control system set-up, system status display, and means of automatically mapping the control points in the system.
A DS106 Thing Happened on the Way to the 3M Tech Forum
ERIC Educational Resources Information Center
Lockridge, Rochelle; Levine, Alan; Funes, Mariana
2014-01-01
This case study illustrates how DS106, a computer science course in Digital Storytelling from the University of Mary Washington (UMW) and accessible as an open course on the web, is being explored in a corporate environment at 3M, an American multinational corporation based in St. Paul, Minnesota, to build community, collaboration, and more…
ERIC Educational Resources Information Center
Winkel, Brian
2008-01-01
A complex technology-based problem in visualization and computation for students in calculus is presented. Strategies are shown for its solution and the opportunities for students to put together sequences of concepts and skills to build for success are highlighted. The problem itself involves placing an object under water in order to actually see…
ERIC Educational Resources Information Center
Ensign, Todd I.
2017-01-01
Educational robotics (ER) combines accessible and age-appropriate building materials, programmable interfaces, and computer coding to teach science and mathematics using the engineering design process. ER has been shown to increase K-12 students' understanding of STEM concepts, and can develop students' self-confidence and interest in STEM. As…
State University of New York Institute of Technology (SUNYIT) Visiting Scholars Program
2013-05-01
team members, and build the necessary backend metal interconnections. Baek-Young Choi: Cooperative and Opportunistic Mobile Cloud for Energy Efficient Positioning; Department of Computer Science Electrical Engineering, University of Missouri - Kansas City. The fast-growing popularity of smartphones and tablets enables the use of various intelligent mobile applications. As many of…
Li, Hui; Lee, Taek; Dziubla, Thomas; Pi, Fengmei; Guo, Sijin; Xu, Jing; Li, Chan; Haque, Farzin; Liang, Xing-Jie; Guo, Peixuan
2015-01-01
The value of polymers is manifested in their vital use as building blocks in material and life sciences. Ribonucleic acid (RNA) is a polynucleic acid, but its polymeric nature in materials and technological applications is often overlooked due to an impression that RNA is seemingly unstable. Recent findings that certain modifications can make RNA resistant to RNase degradation while retaining its authentic folding property and biological function, and the discovery of ultra-thermostable RNA motifs, have adequately addressed the concerns of RNA instability. RNA can serve as a unique polymeric material to build a variety of nanostructures including nanoparticles, polygons, arrays, bundles, membranes, and microsponges that have potential applications in biomedical and material sciences. Since 2005, more than a thousand publications on RNA nanostructures have been published in diverse fields, indicating a remarkable increase of interest in the emerging field of RNA nanotechnology. In this review, we aim to: delineate the physical and chemical properties of polymers that can be applied to RNA; introduce the unique properties of RNA as a polymer; review the current methods for the construction of RNA nanostructures; describe its applications in material, biomedical and computer sciences; and discuss the challenges and future prospects in this field. PMID:26770259
NASA Astrophysics Data System (ADS)
Baytak, Ahmet
Among educational researchers and practitioners, there is a growing interest in employing computer games for pedagogical purposes. The present research integrated a technology education class and a science class where 5 th graders learned about environmental issues by designing games that involved environmental concepts. The purposes of this study were to investigate how designing computer games affected the development of students' environmental knowledge, programming knowledge, environmental awareness and interest in computers. It also explored the nature of the artifacts developed and the types of knowledge represented therein. A case study (Yin, 2003) was employed within the context of a 5 th grade elementary science classroom. Fifth graders designed computer games about environmental issues to present to 2nd graders by using Scratch software. The analysis of this study was based on multiple data sources: students' pre- and post-test scores on environmental awareness, their environmental knowledge, their interest in computer science, and their game design. Included in the analyses were also data from students' computer games, participant observations, and structured interviews. The results of the study showed that students were able to successfully design functional games that represented their understanding of environment, even though the gain between pre- and post-environmental knowledge test and environmental awareness survey were minimal. The findings indicate that all students were able to use various game characteristics and programming concepts, but their prior experience with the design software affected their representations. The analyses of the interview transcriptions and games show that students improved their programming skills and that they wanted to do similar projects for other subject areas in the future. Observations showed that game design appeared to lead to knowledge-building, interaction and collaboration among students. This, in turn, encouraged students to test and improve their designs. Sharing the games, it was found, has both positive and negative effects on the students' game design process and the representation of students' understandings of the domain subject.
Building the interspace: Digital library infrastructure for a University Engineering Community
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schatz, B.
A large-scale digital library is being constructed and evaluated at the University of Illinois, with the goal of bringing professional search and display to Internet information services. A testbed planned to grow to 10K documents and 100K users is being constructed in the Grainger Engineering Library Information Center, as a joint effort of the University Library and the National Center for Supercomputing Applications (NCSA), with evaluation and research by the Graduate School of Library and Information Science and the Department of Computer Science. The electronic collection will be articles from engineering and science journals and magazines, obtained directly from publishers in SGML format and displayed containing all text, figures, tables, and equations. The publisher partners include IEEE Computer Society, AIAA (Aerospace Engineering), American Physical Society, and Wiley & Sons. The software will be based upon NCSA Mosaic as a network engine connected to commercial SGML displayers and full-text searchers. The users will include faculty/students across the midwestern universities in the Big Ten, with evaluations via interviews, surveys, and transaction logs. Concurrently, research into scaling the testbed is being conducted. This includes efforts in computer science, information science, library science, and information systems. These efforts will evaluate different semantic retrieval technologies, including automatic thesaurus and subject classification graphs. New architectures will be designed and implemented for a next generation digital library infrastructure, the Interspace, which supports interaction with information spread across information spaces within the Net.
Dery, Samuel; Vroom, Frances da-Costa; Godi, Anthony; Afagbedzi, Seth; Dwomoh, Duah
2016-09-01
Studies have shown that ICT adoption contributes to productivity and economic growth. It is therefore important that health workers have knowledge of ICT to ensure adoption and uptake of ICT tools for efficient health delivery. The objective was to determine the knowledge and use of ICT among students of the College of Health Sciences at the University of Ghana. This was a cross-sectional study conducted among students in all five Schools of the College of Health Sciences at the University of Ghana. A total of 773 students were sampled from the Schools. Sampling proportionate to size was used to determine the sample sizes required for each school, academic programme and level of programme, and simple random sampling was subsequently used to select students from each stratum. Computer knowledge among students was high, at almost 99%. About 83% owned computers (p < 0.001), and self-rated computer knowledge was 87% (p < 0.001). Usage was mostly for studying, at 93% (p < 0.001). This study shows that students have adequate knowledge and use of computers, which presents an opportunity to introduce ICT in healthcare delivery to them and to ensure their preparedness to embrace new ways of delivering care and improve service delivery. Africa Build Project, Grant Number: FP7-266474.
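The proportionate-to-size allocation step described above can be illustrated in a few lines; the school names and sizes below are invented for the example, and rounded counts may differ from the total by one or two:

```python
# Allocate a total sample across strata in proportion to stratum size.
# Stratum sizes are hypothetical; only the arithmetic is illustrated.
school_sizes = {"Medicine": 1200, "Pharmacy": 600, "Nursing": 500,
                "Public Health": 450, "Dentistry": 250}
total_sample = 773

N = sum(school_sizes.values())
allocation = {school: round(total_sample * n / N)
              for school, n in school_sizes.items()}
print(allocation, "-> total:", sum(allocation.values()))
```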
clearScience: Infrastructure for Communicating Data-Intensive Science.
Bot, Brian M; Burdick, David; Kellen, Michael; Huang, Erich S
2013-01-01
Progress in biomedical research requires effective scientific communication to one's peers and to the public. Current research routinely encompasses large datasets and complex analytic processes, and the constraints of traditional journal formats limit useful transmission of these elements. We are constructing a framework through which authors can provide not only the narrative of what was done, but also the primary and derivative data, the source code, the compute environment, and web-accessible virtual machines. This infrastructure allows authors to "hand their machine," prepopulated with libraries, data, and code, to those interested in reviewing or building off of their work. This project, "clearScience," seeks to provide an integrated system that accommodates the ad hoc nature of discovery in the data-intensive sciences and seamless transitions from working to reporting. We demonstrate that rather than merely describing the science being reported, one can deliver the science itself.
NASA Astrophysics Data System (ADS)
Schwab, Ellianna; Faherty, Jacqueline K.; Barua, Prachurjya; Cooper, Ellie; Das, Debjani; Simone-Gonzalez, Luna; Sowah, Maxine; Valdez, Laura; BridgeUP: STEM
2018-01-01
BridgeUP: STEM (BridgeUP) is a program at the American Museum of Natural History (AMNH) that seeks to empower women by providing early-career scientists with research fellowships and high-school aged women with instruction in computer science and algorithmic methods. BridgeUP achieves this goal by employing post-baccalaureate women as Helen Fellows, who, in addition to conducting their own scientific research, mentor and teach high school students from the New York City area. The courses, targeted at early high-school students, are designed to teach algorithmic thinking and scientific methodology through the lens of computational science. In this poster we present the new BridgeUP astronomy curriculum created for 9th and 10th grade girls. The astronomy course we present is designed to introduce basic concepts as well as big data manipulation through a guided exploration of Gaia (DR1). Students learn about measuring astronomical distances through hands-on lab experiments illustrating the brightness/distance relationship, angular size calculations of the height of AMNH buildings, and in-depth Hertzsprung-Russell Diagram activities. Throughout these labs, students increase their proficiency in collecting and analyzing data, while learning to build and share code in teams. The students use their new skills to create color-color diagrams of known co-moving clusters (Oh et al. 2017) in the DR1 dataset using Python, Pandas and Matplotlib. We discuss the successes and lessons learned in the first implementation of this curriculum and show the preliminary work of six of the students, who are continuing with computational astronomy research over the current school year.
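A sketch of the curriculum's core computation, assuming Gaia-style column names (`parallax` in milliarcseconds, `phot_g_mean_mag`, `bp_rp`) in a hypothetical CSV extract; it illustrates the Hertzsprung-Russell activity, not the students' actual notebooks:

```python
# Color-magnitude (HR) diagram from a Gaia-style table. File name and column
# names are assumed for illustration; parallax is in milliarcseconds.
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd

stars = pd.read_csv("gaia_sample.csv")      # hypothetical catalog extract
stars = stars[stars["parallax"] > 0]

# Absolute magnitude from apparent magnitude and parallax:
# d_pc = 1000 / parallax_mas, so M = m + 5*log10(parallax_mas) - 10.
stars["abs_mag"] = stars["phot_g_mean_mag"] + 5 * np.log10(stars["parallax"]) - 10

plt.scatter(stars["bp_rp"], stars["abs_mag"], s=2)
plt.gca().invert_yaxis()                    # brighter stars plot higher
plt.xlabel("BP - RP color")
plt.ylabel("Absolute G magnitude")
plt.title("Hertzsprung-Russell diagram")
plt.show()
```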
40 CFR 262.10 - Purpose, scope, and applicability.
Code of Federal Regulations, 2011 CFR
2011-07-01
... and Life Sciences, Arts and Sciences, Medicine, and Engineering and Mathematics; and Schools of..., Biology, Psychology, Anthropology, Geology and Earth Sciences, and Environmental, Coastal and Ocean Sciences Science Building (Bldg. #080); McCormack Building (Bldg. #020); and Wheatley Building (Bldg. #010...
ERIC Educational Resources Information Center
Yoon, Susan A.; Koehler-Yom, Jessica; Anderson, Emma; Lin, Joyce; Klopfer, Eric
2015-01-01
Background: This exploratory study is part of a larger-scale research project aimed at building theoretical and practical knowledge of complex systems in students and teachers with the goal of improving high school biology learning through professional development and a classroom intervention. Purpose: We propose a model of adaptive expertise to…
2006-07-14
MINDS: Architecture & Design. Technical Report TR 06-022, Department of Computer Science and Engineering, University of Minnesota, 4-192 EECS Building, 200 Union Street SE, Minneapolis, MN 55455-0159 USA. Varun Chandola, Eric Eilertson, Levent Ertoz, Gyorgy Simon, and Vipin… Report date: 14 July 2006.
Maximal aggregation of polynomial dynamical systems
Cardelli, Luca; Tschaikowski, Max
2017-01-01
Ordinary differential equations (ODEs) with polynomial derivatives are a fundamental tool for understanding the dynamics of systems across many branches of science, but our ability to gain mechanistic insight and effectively conduct numerical evaluations is critically hindered when dealing with large models. Here we propose an aggregation technique that rests on two notions of equivalence relating ODE variables whenever they have the same solution (backward criterion) or if a self-consistent system can be written for describing the evolution of sums of variables in the same equivalence class (forward criterion). A key feature of our proposal is to encode a polynomial ODE system into a finitary structure akin to a formal chemical reaction network. This enables the development of a discrete algorithm to efficiently compute the largest equivalence, building on approaches rooted in computer science to minimize basic models of computation through iterative partition refinements. The physical interpretability of the aggregation is shown on polynomial ODE systems for biochemical reaction networks, gene regulatory networks, and evolutionary game theory. PMID:28878023
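A worked micro-example of the forward criterion described above, checked symbolically: for x1' = -a*x1 and x2' = -a*x2, the sum s = x1 + x2 obeys the self-consistent equation s' = -a*s, so the two variables aggregate into one. This toy case is mine, not from the paper:

```python
# Verify with sympy that the derivative of the aggregate s = x1 + x2 is
# expressible in terms of s alone (forward equivalence), i.e. s' = -a*s.
import sympy as sp

t = sp.symbols("t")
a = sp.symbols("a", positive=True)
x1, x2 = sp.Function("x1")(t), sp.Function("x2")(t)

x1dot = -a * x1
x2dot = -a * x2
s = x1 + x2

# x1' + x2' - (-a*s) should simplify to zero:
print(sp.simplify(x1dot + x2dot + a * s))  # -> 0
```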
Computing through Scientific Abstractions in SysBioPS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chin, George; Stephan, Eric G.; Gracio, Deborah K.
2004-10-13
Today, biologists and bioinformaticists have a tremendous amount of computational power at their disposal. With the availability of supercomputers, burgeoning scientific databases and digital libraries such as GenBank and PubMed, and pervasive computational environments such as the Grid, biologists have access to a wealth of computational capabilities and scientific data at hand. Yet, the rapid development of computational technologies has far exceeded the typical biologist's ability to effectively apply the technology in their research. Computational sciences research and development efforts such as the Biology Workbench, BioSPICE (Biological Simulation Program for Intra-Cellular Evaluation), and BioCoRE (Biological Collaborative Research Environment) are important in connecting biologists and their scientific problems to computational infrastructures. On the Computational Cell Environment and Heuristic Entity-Relationship Building Environment projects at the Pacific Northwest National Laboratory, we are jointly developing a new breed of scientific problem solving environment called SysBioPSE that will allow biologists to access and apply computational resources in the scientific research context. In contrast to other computational science environments, SysBioPSE operates as an abstraction layer above a computational infrastructure. The goal of SysBioPSE is to allow biologists to apply computational resources in the context of the scientific problems they are addressing and the scientific perspectives from which they conduct their research. More specifically, SysBioPSE allows biologists to capture and represent scientific concepts and theories and experimental processes, and to link these views to scientific applications, data repositories, and computer systems.
Cellular intelligence: Microphenomenology and the realities of being.
Ford, Brian J
2017-12-01
Traditions of Eastern thought conceptualised life in a holistic sense, emphasising the processes of maintaining health and conquering sickness as manifestations of an essentially spiritual principle that was of overriding importance in the conduct of living. Western science, which drove the overriding and partial eclipse of Eastern traditions, became founded on a reductionist quest for ultimate realities which, in the modern scientific world, has embraced the notion that every living process can be successfully modelled by a digital computer system. It is argued here that the essential processes of cognition, response and decision-making inherent in living cells transcend conventional modelling, and microscopic studies of organisms like the shell-building amoebae and the rhodophyte alga Antithamnion reveal a level of cellular intelligence that is unrecognized by science and is not amenable to computer analysis.
John, Temitope M; Badejo, Joke A; Popoola, Segun I; Omole, David O; Odukoya, Jonathan A; Ajayi, Priscilla O; Aboyade, Mary; Atayero, Aderemi A
2018-06-01
This data article presents data on the academic performance of undergraduate students in Science, Technology, Engineering and Mathematics (STEM) disciplines at Covenant University, Nigeria. The data cover the academic performance of male and female students who graduated from 2010 to 2014. The total population of samples in the observation is 3046 undergraduates mined from Biochemistry (BCH), Building Technology (BLD), Computer Engineering (CEN), Chemical Engineering (CHE), Industrial Chemistry (CHM), Computer Science (CIS), Civil Engineering (CVE), Electrical and Electronics Engineering (EEE), Information and Communication Engineering (ICE), Mathematics (MAT), Microbiology (MCB), Mechanical Engineering (MCE), Management and Information System (MIS), Petroleum Engineering (PET), Industrial Physics-Electronics and IT Applications (PHYE), Industrial Physics-Applied Geophysics (PHYG) and Industrial Physics-Renewable Energy (PHYR). The detailed dataset is made available in the form of a Microsoft Excel spreadsheet in the supplementary material of this article.
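A minimal sketch of loading and summarizing such a spreadsheet with pandas; the file name and column names ("Programme", "Gender", "CGPA") are assumed for illustration and should be checked against the actual supplementary file:

```python
# Load the (hypothetical) spreadsheet and summarize performance by programme
# and gender. Requires the openpyxl engine for .xlsx files.
import pandas as pd

df = pd.read_excel("stem_performance_2010_2014.xlsx")
summary = (df.groupby(["Programme", "Gender"])["CGPA"]
             .agg(["count", "mean"])
             .round(2))
print(summary)
```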
The Future Medical Science and Colorectal Surgeons
Kim, Young Jin
2017-12-01
Future medical technology breakthroughs will build from the incredible progress made in computers, biotechnology, and nanotechnology and from the information learned from the human genome. With such technology and information, computer-aided diagnoses, organ replacement, gene therapy, personalized drugs, and even age reversal will become possible. True 3-dimensional system technology will enable surgeons to envision key clinical features and will help them in planning complex surgery. Surgeons will enter surgical instructions in a virtual space from a remote medical center, order a medical robot to perform the operation, and review the operation in real time on a monitor. Surgeons will be better than artificial intelligence or automated robots when surgeons (or we) love patients and ask questions for a better future. The purpose of this paper is looking at the future medical science and the changes of colorectal surgeons. PMID:29354602
NASA Technical Reports Server (NTRS)
Leiner, Barry M.; Gross, Anthony R. (Technical Monitor)
2002-01-01
The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. Operated by the Universities Space Research Association (a non-profit university consortium), RIACS is located at the NASA Ames Research Center, Moffett Field, California. It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in September 2003. Ames has been designated NASA's Center of Excellence in Information Technology. In this capacity, Ames is charged with the responsibility to build an Information Technology (IT) Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA Ames and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of IT research necessary to meet the future challenges of NASA missions: 1) Automated Reasoning for Autonomous Systems; 2) Human-Centered Computing; and 3) High Performance Computing and Networking. In addition, RIACS collaborates with NASA scientists to apply IT research to a variety of NASA application domains including aerospace technology, earth science, life sciences, and astrobiology. RIACS also engages in other activities, such as workshops, seminars, visiting scientist programs and student summer programs, designed to encourage and facilitate collaboration between the university and NASA IT research communities.
Open Science in the Cloud: Towards a Universal Platform for Scientific and Statistical Computing
NASA Astrophysics Data System (ADS)
Chine, Karim
The UK, through the e-Science program, the US, through the NSF-funded cyberinfrastructure, and the European Union, through the ICT Calls, aimed to provide "the technological solution to the problem of efficiently connecting data, computers, and people with the goal of enabling derivation of novel scientific theories and knowledge" [1]. The Grid (Foster, 2002; Foster, Kesselman, Nick, & Tuecke, 2002), foreseen as a major accelerator of discovery, didn't meet the expectations it had excited at its beginnings and was not adopted by the broad population of research professionals. The Grid is a good tool for particle physicists and it has allowed them to tackle the tremendous computational challenges inherent to their field. However, as a technology and paradigm for delivering computing on demand, it doesn't work and it can't be fixed. On one hand, "the abstractions that Grids expose - to the end-user, to the deployers and to application developers - are inappropriate and they need to be higher level" (Jha, Merzky, & Fox), and on the other hand, academic Grids are inherently economically unsustainable. They can't compete with a service outsourced to the Industry whose quality and price would be driven by market forces. The virtualization technologies and their corollary, the Infrastructure-as-a-Service (IaaS) style cloud, hold the promise to enable what the Grid failed to deliver: a sustainable environment for computational sciences that would lower the barriers for accessing federated computational resources, software tools and data; enable collaboration and resources sharing and provide the building blocks of a ubiquitous platform for traceable and reproducible computational research.
NASA Astrophysics Data System (ADS)
Genoways, Sharon K.
STEM (Science, Technology, Engineering and Math) education creates critical thinkers, increases science literacy, and enables the next generation of innovators, which leads to new products and processes that sustain our economy (Hossain & Robinson, 2012). We have been hearing the warnings for several years that there simply are not enough young scientists entering the STEM professional pathways to replace all of the retiring professionals (Brown, Brown, Reardon, & Merrill, 2011; Harsh, Maltese, & Tai, 2012; Heilbronner, 2011; Scott, 2012). The problem is not necessarily due to a lack of STEM skills and concept proficiency; there also appears to be a lack of interest in these fields. Recent evidence suggests that many of the most proficient students, especially minority students and women, have been gravitating away from science and engineering toward other professions (President's Council of Advisors on Science and Technology, 2010). The purpose of this qualitative research study was to determine how high schools can best prepare and encourage young women for a career in engineering or computer science. This was accomplished by interviewing a pool of 21 women: 5 recent high school graduates planning to major in STEM, 5 college students who had completed at least one full year of coursework in an engineering or computer science major, and 11 professional women who had been employed as an engineer or computer scientist for at least one full year. These women were asked to share the high school courses, activities, and experiences that best prepared them to pursue an engineering or computer science major. Five central themes emerged from this study: coursework in physics and calculus, promotion of STEM camps and clubs, teacher encouragement of STEM capabilities and careers, problem solving, critical thinking and confidence building activities in the classroom, and allowing students the opportunity to fail and ask questions in a safe environment. These themes may be implemented by any instructor, in any course, who wishes to provide students with the means to success in their quest for a STEM career.
NASA Astrophysics Data System (ADS)
Gilbert-Valencia, Daniel H.
California community colleges produce alarmingly few computer science degree or certificate earners. While the literature shows clear K-12 impediments to CS matriculation in higher education, very little is known about the experiences of those who overcome initial impediments to CS yet do not persist through to program completion. This phenomenological study explores that specific experience by interviewing underrepresented, low-income, first-generation college students who began community college intending to transfer to 4-year institutions as CS majors but switched to another field and remain enrolled or have graduated. This study explores the lived experiences of students facing barriers, their avenues for developing interest in CS, and the persistence support systems they encountered, looking specifically at how students constructed their academic choices from these experiences. The growing diversity within California's population necessitates that experiences specific to underrepresented students be considered as part of this exploration. Ten semi-structured interviews and observations were conducted, transcribed, and coded. Artifacts supporting student experiences were also collected. Data were analyzed through a social-constructivist lens to provide insight into these experiences and how they can be navigated to create actionable strategies for community college computer science departments wishing to increase student success. Three major themes emerged from this research: students (1) shared pre-college characteristics; (2) faced similar challenges in college CS courses; and (3) shared similar reactions to the "work" of computer science. Results of the study included: (1) CS interest development hinged on computer ownership in the home; (2) participants shared characteristics that were ideal for college success but not CS success; and (3) encounters in CS departments produced unique challenges for participants. Though CS interest was and remains abundant, opportunities for learning programming skills before college were non-existent, and there were few opportunities in college to build skills or establish peer support networks. Recommendations for institutional leaders and further research are also provided.
Guidelines for Building Science Education
DOE Office of Scientific and Technical Information (OSTI.GOV)
Metzger, Cheryn E.; Rashkin, Samuel; Huelman, Pat
The U.S. Department of Energy’s (DOE) residential research and demonstration program, Building America, has triumphed through 20 years of innovation. Partnering with researchers, builders, remodelers, and manufacturers to develop innovative processes like advanced framing and ventilation standards, Building America has proven that an energy-efficient design can be more cost-effective, healthy, and durable than a standard house. As Building America partners continue to achieve their stretch goals, they have found that the barrier to true market transformation for high-performance homes is the limited knowledge base of the professionals working in the building industry. With dozens of professionals taking part in the design and execution of building and selling homes, each person should have basic building science knowledge relevant to their role and an understanding of how various home components interface with each other. Instead, our industry typically experiences a fragmented approach to home building and design. After obtaining important input from stakeholders at the Building Science Education Kick-Off Meeting, DOE created a building science education strategy addressing the education issues preventing the widespread adoption of high-performance homes. This strategy targets the next generation and provides valuable guidance for the current workforce. The initiative includes: • Race to Zero Student Design Competition: engages universities and provides students who will be the next generation of architects, engineers, construction managers, and entrepreneurs with the necessary skills and experience they need to begin careers in clean energy and generate creative solutions to real-world problems. • Building Science to Sales Translator: simplifies building science into compelling sales language and tools that help builders sell high-performance homes to their customers. • Building Science Education Guidance: brings together industry and academia to solve problems related to building science education. This report summarizes the steps DOE has taken to develop guidance for building science education and outlines a path forward towards creating real change for an industry in need. The Guidelines for Building Science Education outlined in Appendix A of this report have been developed for external stakeholders to use to certify that their programs are incorporating the most important aspects of building science at the most appropriate proficiency level for each role. The guidelines are intended to be used primarily by training organizations, universities, and certification bodies. Each guideline can be printed or saved as a stand-alone document for ease of use by the respective stakeholder group. In 2015, DOE, with leadership from Pacific Northwest National Laboratory (PNNL), is launching a multi-year campaign to promote the adoption of the Guidelines for Building Science Education in a variety of training settings.
High-Productivity Computing in Computational Physics Education
NASA Astrophysics Data System (ADS)
Tel-Zur, Guy
2011-03-01
We describe the development of a new course in Computational Physics at Ben-Gurion University. This elective one-semester course is taught to 3rd-year undergraduates and MSc students. Computational Physics is by now well accepted as the Third Pillar of Science. This paper's claim is that modern Computational Physics education should also deal with High-Productivity Computing. The traditional approach to teaching Computational Physics emphasizes ``Correctness'' and then ``Accuracy''; we add ``Performance.'' Along with topics in mathematical methods and case studies in physics, the course devotes a significant amount of time to ``mini-courses'' on topics such as: High-Throughput Computing with Condor, parallel programming with MPI and OpenMP, how to build a Beowulf cluster, visualization, and Grid and Cloud Computing. The course intends to teach neither new physics nor new mathematics; it is focused on an integrated approach to solving problems, starting from the physics problem, through the corresponding mathematical solution and the numerical scheme, to writing an efficient computer code and finally analysis and visualization.
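To give a concrete flavor of the parallel-programming mini-course material, here is a minimal message-passing sketch using mpi4py; it is an illustration in the spirit of the course, not taken from its materials. Each MPI rank computes part of a midpoint-rule integral of 4/(1+x^2) on [0,1], and the partial sums are reduced to rank 0, which prints an approximation of pi.

```python
# Minimal MPI sketch (illustrative only; file name and problem are invented).
# Run with, e.g.: mpiexec -n 4 python pi_mpi.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n = 1_000_000                      # total number of midpoint-rule intervals
h = 1.0 / n
# Each rank sums every size-th interval, starting at its own rank index.
local = sum(4.0 / (1.0 + ((i + 0.5) * h) ** 2)
            for i in range(rank, n, size)) * h

pi = comm.reduce(local, op=MPI.SUM, root=0)   # single collective reduction
if rank == 0:
    print(f"pi ~= {pi:.10f}")
```

The single collective reduction at the end is where the course's ``Performance'' emphasis would show up: one collective call instead of many point-to-point messages.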
Cranswick, Lachlan Michael David
2008-01-01
The history of crystallographic computing and the use of crystallographic software is one that traces the escape from the drudgery of manual human calculation to a world where the user delegates most of the travail to electronic computers. In practice, this involves practising crystallographers communicating their thoughts to the crystallographic program authors, in the hope that new procedures will be implemented within their software. Against this background, the development of small-molecule single-crystal and powder diffraction software is traced. Starting with the analogue machines and the use of Hollerith tabulators of the late 1930s, it is shown that computing developments have been science-led, with new technologies being harnessed to solve pressing crystallographic problems. The development of software is also traced, with a final caution that few of the computations now performed daily are really understood by the program users. Unless a sufficient body of people continues to dismantle and rebuild programs, the knowledge encoded in the old programs will become as inaccessible as the knowledge of how to build the Great Pyramid at Giza.
Building logical qubits in a superconducting quantum computing system
NASA Astrophysics Data System (ADS)
Gambetta, Jay M.; Chow, Jerry M.; Steffen, Matthias
2017-01-01
The technological world is in the midst of a quantum computing and quantum information revolution. Since Richard Feynman's famous `plenty of room at the bottom' lecture (Feynman, Engineering and Science 23, 22 (1960)), hinting at the notion of novel devices employing quantum mechanics, the quantum information community has taken gigantic strides in understanding the potential applications of a quantum computer and has laid the foundational requirements for building one. We believe that the next significant step will be to demonstrate a quantum memory, in which a system of interacting qubits stores an encoded logical qubit state longer than any of its constituent parts. Here, we describe the important route towards a logical memory with superconducting qubits, employing a rotated version of the surface code. The current status of technology with regard to interconnected superconducting-qubit networks is described, and near-term areas of focus to improve devices are identified. Overall, the progress in this exciting field has been astounding, but we are at an important turning point, where it will be critical to incorporate engineering solutions with quantum architectural considerations, laying the foundation towards scalable fault-tolerant quantum computers in the near future.
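The surface-code machinery discussed in the paper is beyond a short sketch, but the core quantum-memory claim, that an encoded logical state outlives its constituent parts, has a classical toy analogue: the three-bit repetition code under bit-flip noise. The simulation below is purely illustrative, with invented noise parameters, and is not the authors' method.

```python
# Toy classical analogue of a quantum memory: a logical 0 encoded as 000
# with majority-vote correction survives longer than a single noisy bit.
# (Illustrative sketch; the paper concerns surface codes, not this code.)
import random

def survives(p, rounds, encoded):
    bits = [0, 0, 0] if encoded else [0]
    for _ in range(rounds):
        bits = [b ^ (random.random() < p) for b in bits]   # bit-flip noise
        if encoded:
            maj = int(sum(bits) >= 2)
            bits = [maj, maj, maj]                          # correction round
    return (int(sum(bits) >= 2) if encoded else bits[0]) == 0

trials, p, rounds = 20_000, 0.02, 50
raw = sum(survives(p, rounds, False) for _ in range(trials)) / trials
enc = sum(survives(p, rounds, True) for _ in range(trials)) / trials
print(f"single-bit survival: {raw:.3f}   encoded survival: {enc:.3f}")
```

With these parameters the encoded state survives the 50 noise rounds far more often than the bare bit, which is the qualitative behavior a logical quantum memory must demonstrate.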
DOE Office of Scientific and Technical Information (OSTI.GOV)
Windus, Theresa; Banda, Michael; Devereaux, Thomas
Computers have revolutionized every aspect of our lives. Yet in science, the most tantalizing applications of computing lie just beyond our reach. The current quest to build an exascale computer with one thousand times the capability of today's fastest machines (and more than a million times that of a laptop) will take researchers over the next horizon. The field of materials, chemical reactions, and compounds is inherently complex. Imagine millions of new materials with new functionalities waiting to be discovered, while researchers also seek to extend known materials to a dizzying number of new forms. We could translate massive amounts of data from high-precision experiments into new understanding through data mining and analysis. We could have at our disposal the ability to predict the properties of these materials, to follow their transformations during reactions on an atom-by-atom basis, and to discover completely new chemical pathways or physical states of matter. Extending these predictions from the nanoscale to the mesoscale, from the ultrafast world of reactions to long-time simulations that predict the lifetime performance of materials, and to the discovery of new materials and processes will have a profound impact on energy technology. In addition, the discovery of new materials is vital to move computing beyond Moore's law. To realize this vision, more than hardware is needed: new algorithms to take advantage of the increase in computing power, new programming paradigms, and new ways of mining massive data sets are needed as well. This report summarizes the opportunities and the requisite computing ecosystem needed to realize the potential before us. In addition to pursuing new and more complete physical models and theoretical frameworks, this review found that the following broadly grouped areas relevant to the U.S. Department of Energy (DOE) Office of Advanced Scientific Computing Research (ASCR) would directly affect the Basic Energy Sciences (BES) mission need. Simulation, visualization, and data analysis are crucial for advances in energy science and technology. Revolutionary mathematical, software, and algorithm developments are required in all areas of BES science to take advantage of exascale computing architectures and to meet data analysis, management, and workflow needs. In partnership with ASCR, BES has an emerging and pressing need to develop new and disruptive capabilities in data science. More capable and larger high-performance computing (HPC) and data ecosystems are required to support priority research in BES. Continued success in BES research requires developing the next-generation workforce through education and training and by providing sustained career opportunities.
The need and potential for building an integrated knowledge base of the Earth-Human system
NASA Astrophysics Data System (ADS)
Jacobs, Clifford
2011-03-01
The pursuit of scientific understanding is increasingly based on interdisciplinary research. To understand the planet and its interactions more deeply requires a progressively more holistic approach, drawing on knowledge from all scientific and engineering disciplines, including but not limited to biology, chemistry, computer science, geosciences, materials science, mathematics, physics, cyberinfrastructure, and the social sciences. Nowhere is such an approach more critical than in the study of global climate change, in which one of the major challenges is the development of next-generation Earth System Models that include coupled and interactive representations of ecosystems, agricultural working lands and forests, urban environments, biogeochemistry, atmospheric chemistry, ocean and atmospheric currents, the water cycle, land ice, and human activities.
ERIC Educational Resources Information Center
Schlenker, Richard M.
This manual was developed for use as a "how to" training device and provides a step-by-step introduction to using AppleWorks in the database mode. Instructions are given to prepare the original database with the headings of the user's choice. Inserting information records in the new database is covered, along with changing the layout of…
Humanoid Robots: A New Kind of Tool
2000-01-01
[Garbled reference fragments; recoverable citations include: Breazeal (Ferrell), Irie, Kemp, Marjanovic, Scassellati, and Williamson, "Alternate Essences of Intelligence," AAAI 1998; Brooks, Breazeal, Marjanovic, Scassellati, and Williamson, "The Cog Project: Building a Humanoid Robot," Computation for Metaphors, Analogy and Agents; and Marjanovic, Scassellati, and Williamson, "Self-Taught Visually-Guided ..."]
Building Effective Pipelines to Increase Diversity in the Geosciences
NASA Astrophysics Data System (ADS)
Snow, E.; Robinson, C. R.; Neal-Mujahid, R.
2017-12-01
The U.S. Geological Survey (USGS) recognizes and understands the importance of a diverse workforce in advancing our science. Valuing Differences is one of the guiding principles of the USGS and is the critical basis of the collaboration among the Youth and Education in Science (YES) program in the USGS Office of Science, Quality, and Integrity (OSQI), the Office of Diversity and Equal Opportunity (ODEO), and USGS science centers to build pipeline programs targeting diverse young scientists. Pipeline programs are robust, sustained relationships between two entities that provide a pathway from one to the other, in this case from minority-serving institutions to the USGS. The USGS has benefited from pipeline programs for many years. Our longest-running program, with the University of Puerto Rico Mayaguez (UPR), is a targeted outreach and internship program that has been managed by USGS scientists in Florida since the mid-1980s. Originally begun as the Minority Participation in the Earth Sciences (MPES) Program, it has evolved over the years, and in its several forms has brought dozens of interns to the USGS. Based in part on that success, in 2006 USGS scientists in Woods Hole, MA, worked with their Florida counterparts to build a pipeline program with City College of New York (CCNY). In this program, USGS scientists visit CCNY monthly, giving a symposium and meeting with students and faculty. The talks are so successful that the college created a course around them. In 2017, the CCNY and UPR programs brought 12 students to the USGS for summer internships. The CCNY model has been so successful that USGS is exploring creating similar pipeline programs. The YES office is coordinating with ODEO and USGS science centers to identify partner universities and build relationships that will lead to robust partnerships in which USGS scientists visit regularly to engage with faculty and students and recruit students for USGS internships. The ideal partner universities will have a high population of underserved students, strong support for minority and first-generation students, proximity to a USGS office, and faculty and/or majors in several of the fields most important to USGS science: geology, geochemistry, energy, biology, ecology, environmental health, hydrology, climate science, GIS, high-capacity computing, and remote sensing.
The Community Seismic Network: Enabling Observations Through Citizen Science Participation
NASA Astrophysics Data System (ADS)
Kohler, M. D.; Clayton, R. W.; Heaton, T. H.; Bunn, J.; Guy, R.; Massari, A.; Chandy, K. M.
2017-12-01
The Community Seismic Network is a dense accelerometer array deployed in the greater Los Angeles area and represents the future of densely instrumented urban cities where localized vibration measurements are collected continuously throughout the free-field and built environment. The hardware takes advantage of developments in the semiconductor industry in the form of inexpensive MEMS accelerometers that are each coupled with a single board computer. The data processing and archival architecture borrows from developments in cloud computing and network connectedness. The ability to deploy densely in the free field and in upper stories of mid/high-rise buildings is enabled by community hosts for sensor locations. To this end, CSN has partnered with the Los Angeles Unified School District (LAUSD), the NASA-Jet Propulsion Laboratory (JPL), and commercial and civic building owners to host sensors. At these sites, site amplification estimates from RMS noise measurements illustrate the lateral variation in amplification over length scales of 100 m or less, that correlate with gradients in the local geology such as sedimentary basins that abut crystalline rock foothills. This is complemented by high-resolution, shallow seismic velocity models obtained using an H/V method. In addition, noise statistics are used to determine the reliability of sites for ShakeMap and earthquake early warning data. The LAUSD and JPL deployments are examples of how situational awareness and centralized warning products such as ShakeMap and ShakeCast are enabled by citizen science participation. Several buildings have been instrumented with at least one triaxial accelerometer per floor, providing measurements for real-time structural health monitoring through local, customized displays. For real-time and post-event evaluation, the free-field and built environment CSN data and products illustrate the feasibility of order-of-magnitude higher spatial resolution mapping compared to what is currently possible with traditional, regional seismic networks. The JPL experiment in particular represents a miniature prototype for city-wide earthquake monitoring that combines free-field measurements for ground shaking intensities, with mid-rise building response through advanced fragility curve computations.
Learning physical descriptors for materials science by compressed sensing
NASA Astrophysics Data System (ADS)
Ghiringhelli, Luca M.; Vybiral, Jan; Ahmetcik, Emre; Ouyang, Runhai; Levchenko, Sergey V.; Draxl, Claudia; Scheffler, Matthias
2017-02-01
The availability of big data in materials science offers new routes for analyzing materials properties and functions and achieving scientific understanding. Finding structure in these data that is not directly visible with standard tools, and exploiting the scientific information they contain, requires new and dedicated methodology based on approaches from statistical learning, compressed sensing, and other recent methods from applied mathematics, computer science, statistics, signal processing, and information science. In this paper, we explain and demonstrate a compressed-sensing-based methodology for feature selection, specifically for discovering physical descriptors, i.e., physical parameters that describe the material and its properties of interest, together with equations that explicitly and quantitatively describe those relevant properties. As a showcase application and proof of concept, we describe how to build a physical model for the quantitative prediction of the crystal structure of binary compound semiconductors.
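A minimal sketch of the compressed-sensing idea behind descriptor discovery: an L1-regularized (LASSO) fit recovers the few relevant features out of many candidates. The data, feature count, and regularization strength below are synthetic stand-ins, not the paper's materials data or exact procedure.

```python
# Sketch of compressed-sensing feature (descriptor) selection with an
# L1-regularized fit. Synthetic data: only 3 of 200 candidate descriptors
# actually drive the target property.
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_samples, n_features = 80, 200
X = rng.normal(size=(n_samples, n_features))        # candidate descriptors
true_coefs = np.zeros(n_features)
true_coefs[[3, 17, 42]] = [1.5, -2.0, 0.7]          # the "physical" ones
y = X @ true_coefs + 0.05 * rng.normal(size=n_samples)

model = Lasso(alpha=0.1).fit(StandardScaler().fit_transform(X), y)
selected = np.flatnonzero(model.coef_)
print("recovered descriptor indices:", selected)     # expect roughly [3, 17, 42]
```

The sparsity-inducing penalty is what lets far fewer samples than features still identify the relevant descriptors, which is the compressed-sensing premise the paper builds on.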
Lucier, R E
1995-01-01
In 1990, the University of California, San Francisco, dedicated a new library to serve the faculty, staff, and students and to meet their academic information needs for several decades to come. Major environmental changes present new and additional information management challenges, which can effectively be handled only through the widespread use of computing and computing technologies. Over the next five years, a three-pronged strategy will be followed. We are refining the current physical, paper-based library through the continuous application of technology for modernization and functional improvement. At the same time, we have begun the planning, design, and implementation of a "free-standing" Digital Library of the Health Sciences, focusing on the innovative application of technology. To ensure complementarity and product integrity where the two libraries interface, we will look to technology to transform these separate entities into an eventual, integral whole. PMID:7581192
Geoinformatics in the public service: building a cyberinfrastructure across the geological surveys
Allison, M. Lee; Gundersen, Linda C.; Richard, Stephen M.; Keller, G. Randy; Baru, Chaitanya
2011-01-01
Advanced information technology infrastructure is increasingly being employed in the Earth sciences to provide researchers with efficient access to massive central databases and to integrate diversely formatted information from a variety of sources. These geoinformatics initiatives enable manipulation, modeling and visualization of data in a consistent way, and are helping to develop integrated Earth models at various scales, and from the near surface to the deep interior. This book uses a series of case studies to demonstrate computer and database use across the geosciences. Chapters are thematically grouped into sections that cover data collection and management; modeling and community computational codes; visualization and data representation; knowledge management and data integration; and web services and scientific workflows. Geoinformatics is a fascinating and accessible introduction to this emerging field for readers across the solid Earth sciences and an invaluable reference for researchers interested in initiating new cyberinfrastructure projects of their own.
77 FR 48164 - National Institute Environmental Health Sciences; Notice of Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-13
Meeting for the review of applications. Place: Nat. Inst. of Environmental Health Sciences, Building 101, Rodbell Auditorium, 111 T. W. Alexander Drive, Research Triangle Park, NC.
77 FR 26300 - National Institute of Environmental Health Sciences; Notice of Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-03
Meeting sessions (including Structural Biology). Place: Nat. Inst. of Environmental Health Sciences, Building 101, Rodbell Auditorium, 111 T. W. Alexander Drive, Research Triangle Park, NC.
20. SITE BUILDING 002 - SCANNER BUILDING - IN COMPUTER ROOM LOOKING AT "CONSOLIDATED MAINTENANCE OPERATIONS CENTER" JOB AREA AND OPERATION WORK CENTER. TASKS INCLUDE RADAR MAINTENANCE, COMPUTER MAINTENANCE, CYBER COMPUTER MAINTENANCE AND RELATED ACTIVITIES. - Cape Cod Air Station, Technical Facility-Scanner Building & Power Plant, Massachusetts Military Reservation, Sandwich, Barnstable County, MA
The fourth International Conference on Information Science and Cloud Computing
NASA Astrophysics Data System (ADS)
This book comprises the papers accepted by the fourth International Conference on Information Science and Cloud Computing (ISCC), held on 18-19 December 2015 in Guangzhou, China. It contains 70 papers divided into four parts: the first part focuses on Information Theory, with 20 papers; the second part emphasizes Machine Learning, with 21 papers; the third part contains 21 papers in the area of Control Science; and the last part, with 8 papers, is dedicated to Cloud Science. Each part can serve as an excellent reference for engineers, researchers, and students who need to build a knowledge base of the most current advances and state of practice in the topics covered by the ISCC conference. Special thanks go to Professor Deyu Qi, General Chair of ISCC 2015, for his leadership in supervising the organization of the entire conference; to Professor Tinghuai Ma, Program Chair, and the members of the program committee for evaluating all the submissions and ensuring the selection of only the highest-quality papers; and to the authors for sharing their ideas, results, and insights. We sincerely hope that you enjoy reading the papers included in this book.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Barton
2014-06-30
Peta-scale computing environments pose significant challenges for both system and application developers, and addressing them requires more than simply scaling up existing tera-scale solutions. Performance analysis tools play an important role in gaining this understanding, but previous monolithic tools with fixed feature sets have not sufficed. Instead, this project worked on the design, implementation, and evaluation of a general, flexible tool infrastructure supporting the construction of performance tools as "pipelines" of high-quality tool building blocks. These tool building blocks provide common performance tool functionality and are designed for scalability, lightweight data acquisition and analysis, and interoperability. For this project, we built on Open|SpeedShop, a modular and extensible open-source performance analysis tool set. The design and implementation of such a general and reusable infrastructure targeted at petascale systems required us to address several challenging research issues. All components needed to be designed for scale, a task made more difficult by the need to provide general modules. The infrastructure needed to support online data aggregation to cope with the large amounts of performance and debugging data. We needed to be able to map any combination of tool components to each target architecture. And we needed to design interoperable tool APIs and workflows that were concrete enough to support the required functionality, yet provide the necessary flexibility to address a wide range of tools. A major result of this project is the ability to use this scalable infrastructure to quickly create tools that match a machine architecture with a performance problem that needs to be understood. Another benefit is the ability for application engineers to use the highly scalable, interoperable version of Open|SpeedShop, reassembled from the tool building blocks into a flexible, multi-user set of tools. This set of tools is targeted at Office of Science Leadership Class computer systems and selected Office of Science application codes. We describe the contributions made by the team at the University of Wisconsin. The project built on the efforts in Open|SpeedShop funded by DOE/NNSA and the DOE/NNSA Tri-Lab community, extended Open|SpeedShop to the Office of Science Leadership Class Computing Facilities, and addressed new challenges found on these cutting-edge systems. Work done under this project at Wisconsin can be divided into two categories: new algorithms and techniques for debugging, and foundational infrastructure work on our Dyninst binary analysis and instrumentation toolkits and MRNet scalability infrastructure.
Neuroengineering control and regulation of behavior
NASA Astrophysics Data System (ADS)
Wróbel, A.; Radzewicz, C.; Mankiewicz, L.; Hottowy, P.; Knapska, E.; Konopka, W.; Kublik, E.; Radwańska, K.; Waleszczyk, W. J.; Wójcik, D. K.
2014-11-01
To monitor the neuronal circuits involved in emotional modulation of sensory processing, we proposed a plan to establish novel research techniques combining recent biological, technical, and analytical discoveries. The project was funded by the National Science Center, and we have started to build a new experimental model for studying selected circuits of genetically marked and behaviorally activated neurons. To achieve this goal we will combine the pioneering, interdisciplinary expertise of four Polish institutions: (i) the Nencki Institute of Experimental Biology (Polish Academy of Sciences) will deliver expertise on genetically modified mice and rats, mapping of the neuronal circuits activated by behavior, monitoring of complex behaviors measured in the IntelliCage system, electrophysiological brain-activity recordings with multielectrodes in behaving animals, and analysis and modeling of behavioral and electrophysiological data; (ii) the AGH University of Science and Technology (Faculty of Physics and Applied Computer Science) will use its experience in high-throughput electronics to build multichannel systems for recording the brain activity of behaving animals; (iii) the University of Warsaw (Faculty of Physics) and (iv) the Center for Theoretical Physics (Polish Academy of Sciences) will construct an optoelectronic device for remote control of the opto-animals produced in the Nencki Institute, based on their unique experience in laser sources, studies of light propagation and its interaction with condensed media, wireless medical robotic systems, fast readout optoelectronics with control software, and micromechanics.
The HSP, the QCN, and the Dragon: Developing inquiry-based QCN instructional modules in Taiwan
NASA Astrophysics Data System (ADS)
Chen, K. H.; Liang, W.; Chang, C.; Yen, E.; Lin, C.; Lin, G.
2012-12-01
The High Scope Program (HSP) is a long-term project funded by the NSC in Taiwan since 2006. It is designed to elevate the quality of science education by incorporating emerging science and technology into the traditional curricula of senior high schools. The Quake-Catcher Network (QCN), a distributed computing project initiated by Stanford University and UC Riverside, encourages volunteers to install low-cost, novel sensors at home and at school to build a seismic network. To meet both needs, we have developed a model curriculum that introduces QCN, earthquake science, and cloud computing into high school classrooms. Through professional development workshops, the Taiwan cloud-based earthquake science learning platform, and a QCN club on Facebook, we have worked closely with the Lan-Yang Girls' Senior High School teachers' team to design workable teaching plans based on the practical operation of seismic monitoring at home or at school. However, some obstacles to learning have appeared, including QCN installation and maintenance problems, the high self-noise of the sensor, and the difficulty high school teachers face in introducing earthquake science. The challenges of QCN outreach in Taiwan motivate our future plans: (1) development of easy, frequently updated, physics-based QCN experiments for high school teachers, and (2) design of an interactive learning platform with social networking functions for students.
Citizen Science to Support Community-based Flood Early Warning and Resilience Building
NASA Astrophysics Data System (ADS)
Paul, J. D.; Buytaert, W.; Allen, S.; Ballesteros-Cánovas, J. A.; Bhusal, J.; Cieslik, K.; Clark, J.; Dewulf, A.; Dhital, M. R.; Hannah, D. M.; Liu, W.; Nayaval, J. L.; Schiller, A.; Smith, P. J.; Stoffel, M.; Supper, R.
2017-12-01
In Disaster Risk Management, an emerging shift has been noted from broad-scale, top-down assessments towards more participatory, community-based, bottom-up approaches. Combined with technologies for robust and low-cost sensor networks, a citizen science approach has recently emerged as a promising direction in the provision of extensive, real-time information for flood early warning systems. Here we present the framework and initial results of a major new international project, Landslide EVO, aimed at increasing local resilience against hydrologically induced disasters in western Nepal by exploiting participatory approaches to knowledge generation and risk governance. We identify three major technological developments that strongly support our approach to flood early warning and resilience building in Nepal. First, distributed sensor networks, participatory monitoring, and citizen science hold great promise in complementing official monitoring networks and remote sensing by generating site-specific information with local buy-in, especially in data-scarce regions. Secondly, the emergence of open source, cloud-based risk analysis platforms supports the construction of a modular, distributed, and potentially decentralised data processing workflow. Finally, linking data analysis platforms to social computer networks and ICT (e.g. mobile phones, tablets) allows tailored interfaces and people-centred decision- and policy-support systems to be built. Our proposition is that maximum impact is created if end-users are involved not only in data collection, but also over the entire project life-cycle, including the analysis and provision of results. In this context, citizen science complements more traditional knowledge generation practices, and also enhances multi-directional information provision, risk management, early-warning systems and local resilience building.
Building Scalable Knowledge Graphs for Earth Science
NASA Astrophysics Data System (ADS)
Ramachandran, R.; Maskey, M.; Gatlin, P. N.; Zhang, J.; Duan, X.; Bugbee, K.; Christopher, S. A.; Miller, J. J.
2017-12-01
Estimates indicate that the world's information will grow by 800% in the next five years. In any given field, a single researcher or a team of researchers cannot keep up with this rate of knowledge expansion without the help of cognitive systems. Cognitive computing, defined as the use of information technology to augment human cognition, can help tackle large systemic problems. Knowledge graphs, one of the foundational components of cognitive systems, link key entities in a specific domain with other entities via relationships. Researchers can mine these graphs to make probabilistic recommendations and to infer new knowledge. At this point, however, there is a dearth of tools to generate scalable knowledge graphs from the existing corpus of scientific literature for Earth science research. Our project is currently developing an end-to-end automated methodology for incrementally constructing knowledge graphs for Earth science. Semantic Entity Recognition (SER) is one of the key steps in this methodology. SER for Earth science uses external resources (including metadata catalogs and controlled vocabularies) as references to guide entity extraction and recognition (i.e., labeling) from unstructured text, in order to build a large training set to seed the subsequent auto-learning component of our algorithm. Results from several SER experiments will be presented, as well as lessons learned.
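As a hedged illustration of the SER seeding step, the sketch below labels entities in raw text by lookup against a small controlled vocabulary. The vocabulary, entity labels, and sentence are invented for the example; a real system would also handle tokenization, overlapping matches, and ambiguity.

```python
# Sketch of vocabulary-guided semantic entity recognition (SER): a
# controlled vocabulary (invented here) seeds entity labels in raw text,
# producing training examples for a downstream auto-learning component.
GAZETTEER = {                      # hypothetical controlled vocabulary
    "MODIS": "INSTRUMENT",
    "Aqua": "PLATFORM",
    "aerosol optical depth": "VARIABLE",
}

def label_entities(text):
    labels = []
    # Try longer vocabulary phrases first so multiword terms win.
    for phrase in sorted(GAZETTEER, key=len, reverse=True):
        start = text.lower().find(phrase.lower())
        if start != -1:
            labels.append((phrase, GAZETTEER[phrase], start))
    return sorted(labels, key=lambda t: t[2])

sentence = "We derive aerosol optical depth from MODIS on the Aqua platform."
for phrase, tag, pos in label_entities(sentence):
    print(f"{pos:3d}  {tag:<10}  {phrase}")
```

Each (phrase, label, position) triple becomes a weakly supervised training example, which is the "seed" role the abstract assigns to SER.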
The 'Biologically-Inspired Computing' Column
NASA Technical Reports Server (NTRS)
Hinchey, Mike
2006-01-01
The field of Biology changed dramatically in 1953, with the determination by Francis Crick and James Dewey Watson of the double helix structure of DNA. This discovery changed Biology for ever, allowing the sequencing of the human genome, and the emergence of a "new Biology" focused on DNA, genes, proteins, data, and search. Computational Biology and Bioinformatics heavily rely on computing to facilitate research into life and development. Simultaneously, an understanding of the biology of living organisms indicates a parallel with computing systems: molecules in living cells interact, grow, and transform according to the "program" dictated by DNA. Moreover, paradigms of Computing are emerging based on modelling and developing computer-based systems exploiting ideas that are observed in nature. This includes building into computer systems self-management and self-governance mechanisms that are inspired by the human body's autonomic nervous system, modelling evolutionary systems analogous to colonies of ants or other insects, and developing highly-efficient and highly-complex distributed systems from large numbers of (often quite simple) largely homogeneous components to reflect the behaviour of flocks of birds, swarms of bees, herds of animals, or schools of fish. This new field of "Biologically-Inspired Computing", often known in other incarnations by other names, such as: Autonomic Computing, Pervasive Computing, Organic Computing, Biomimetics, and Artificial Life, amongst others, is poised at the intersection of Computer Science, Engineering, Mathematics, and the Life Sciences. Successes have been reported in the fields of drug discovery, data communications, computer animation, control and command, exploration systems for space, undersea, and harsh environments, to name but a few, and augur much promise for future progress.
Biology Needs Evolutionary Software Tools: Let’s Build Them Right
Team, Galaxy; Goecks, Jeremy; Taylor, James
2018-01-01
Research in population genetics and evolutionary biology has always provided a computational backbone for the life sciences as a whole. Today, evolutionary and population biology reasoning is essential for the interpretation of large complex datasets that are characteristic of all domains of today's life sciences, ranging from cancer biology to microbial ecology. This situation makes the algorithms and software tools developed by our community more important than ever before. This means that we, the developers of software tools for molecular evolutionary analyses, now have a shared responsibility to make these tools accessible using modern technological developments, as well as to provide adequate documentation and training. PMID:29688462
Microbiome Tools for Forensic Science.
Metcalf, Jessica L; Xu, Zhenjiang Z; Bouslimani, Amina; Dorrestein, Pieter; Carter, David O; Knight, Rob
2017-09-01
Microbes are present at every crime scene and have been used as physical evidence for over a century. Advances in DNA sequencing and computational approaches have led to recent breakthroughs in the use of microbiome approaches for forensic science, particularly in the areas of estimating postmortem intervals (PMIs), locating clandestine graves, and obtaining soil and skin trace evidence. Low-cost, high-throughput technologies allow us to accumulate molecular data quickly and to apply sophisticated machine-learning algorithms, building generalizable predictive models that will be useful in the criminal justice system. In particular, integrating microbiome and metabolomic data has excellent potential to advance microbial forensics.
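To illustrate the kind of predictive modeling described (not the authors' actual pipeline), the sketch below regresses a postmortem interval on synthetic microbial relative abundances with a random forest and reports cross-validated error. All data here are invented; real studies fit models to sequenced community profiles.

```python
# Hedged sketch of a microbiome-based PMI regression. Synthetic data only.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_samples, n_taxa = 120, 50
# Relative abundances of taxa in each sample (rows sum to 1).
abundances = rng.dirichlet(np.ones(n_taxa), size=n_samples)
# Synthetic "clock": two taxa carry the PMI signal, plus noise.
pmi_days = (200 * abundances[:, 0] + 100 * abundances[:, 1]
            + rng.normal(0, 0.5, n_samples))

model = RandomForestRegressor(n_estimators=200, random_state=0)
scores = cross_val_score(model, abundances, pmi_days,
                         scoring="neg_mean_absolute_error", cv=5)
print(f"cross-validated MAE: {-scores.mean():.2f} days")
```

Cross-validation is the key step for the "generalizable predictive models" claim: the error is measured on samples the model never saw during fitting.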
77 FR 9673 - National Institute of Environmental Health Sciences; Notice of Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-17
... Neurobiology. Place: Nat. Inst. of Environmental Health Sciences, Building 101, Rodbell Auditorium, 111 T. W... Sciences, Building 101, Rodbell Auditorium, 111 T. W. Alexander Drive, Research Triangle Park, NC 27709...: Nat. Inst. of Environmental Health Sciences, Building 101, Rodbell Auditorium, 111 T. W. Alexander...
NASA Technical Reports Server (NTRS)
Keller, Richard M.
1991-01-01
The construction of scientific software models is an integral part of doing science, both within NASA and within the scientific community at large. Typically, model-building is a time-intensive and painstaking process, involving the design of very large, complex computer programs. Despite the considerable expenditure of resources involved, completed scientific models cannot easily be distributed and shared with the larger scientific community due to the low-level, idiosyncratic nature of the implemented code. To address this problem, we have initiated a research project aimed at constructing a software tool called the Scientific Modeling Assistant. This tool provides automated assistance to the scientist in developing, using, and sharing software models. We describe the Scientific Modeling Assistant, and also touch on some human-machine interaction issues relevant to building a successful tool of this type.
Science Gateways, Scientific Workflows and Open Community Software
NASA Astrophysics Data System (ADS)
Pierce, M. E.; Marru, S.
2014-12-01
Science gateways and scientific workflows occupy different ends of the spectrum of user-focused cyberinfrastructure. Gateways, sometimes called science portals, provide a way to enable large numbers of users to take advantage of advanced computing resources (supercomputers, advanced storage systems, science clouds) by providing Web and desktop interfaces and supporting services. Scientific workflows, at the other end of the spectrum, support advanced usage of cyberinfrastructure that enables "power users" to undertake computational experiments that are not easily done through the usual mechanisms (managing simulations across multiple sites, for example). Despite these different target communities, gateways and workflows share many similarities and can potentially be accommodated by the same software system. For example, pipelines to process InSAR imagery sets or to data-mine GPS time series are workflows. The results, and the ability to make downstream products, may be made available through a gateway, and power users may want to provide their own custom pipelines. In this abstract, we discuss our efforts to build an open-source software system, Apache Airavata, that can accommodate both gateway and workflow use cases. Our approach is general, and we have applied the software to problems in a number of scientific domains. In this talk, we discuss our applications to usage scenarios specific to Earth science, focusing on earthquake physics examples drawn from the QuakSim.org and GeoGateway.org efforts. We also examine the role of the Apache Software Foundation's open community model as a way to build up common community codes that do not depend upon a single "owner" to sustain them. Pushing beyond open-source software, we also see the need to provide gateways and workflow systems as cloud services. These services centralize operations, provide well-defined programming interfaces, scale elastically, and have global-scale fault tolerance. We discuss our work providing Apache Airavata as a hosted service to provide these features.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Runnels, Scott Robert; Bachrach, Harrison Ian; Carlson, Nils
The two primary purposes of LANL's Computational Physics Student Summer Workshop are (1) to educate graduate and exceptional undergraduate students in the challenges and applications of computational physics of interest to LANL, and (2) to entice their interest toward those challenges. Computational physics is emerging as a discipline in its own right, combining expertise in mathematics, physics, and computer science. The mathematical aspects focus on numerical methods for solving equations on the computer as well as developing test problems with analytical solutions. The physics aspects are very broad, ranging from low-temperature material modeling to extremely high-temperature plasma physics, radiation transport, and neutron transport. The computer science issues are concerned with matching numerical algorithms to emerging architectures and maintaining the quality of extremely large codes built to perform multi-physics calculations. Although graduate programs associated with computational physics are emerging, it is apparent that the pool of U.S. citizens in this multi-disciplinary field is relatively small and is typically not focused on the aspects that are of primary interest to LANL. Furthermore, a more structured foundation for LANL interaction with universities in computational physics is needed; historically, interactions have relied heavily on individuals' personalities and personal contacts. Thus a tertiary purpose of the Summer Workshop is to build an educational network of LANL researchers, university professors, and emerging students to advance the field and LANL's involvement in it.
Theoretical and technological building blocks for an innovation accelerator
NASA Astrophysics Data System (ADS)
van Harmelen, F.; Kampis, G.; Börner, K.; van den Besselaar, P.; Schultes, E.; Goble, C.; Groth, P.; Mons, B.; Anderson, S.; Decker, S.; Hayes, C.; Buecheler, T.; Helbing, D.
2012-11-01
Modern science is a main driver of technological innovation. The efficiency of the scientific system is of key importance for ensuring the competitiveness of a nation or region. However, the scientific system that we use today was devised centuries ago and is inadequate for our current ICT-based society: the peer review system encourages conservatism, journal publications are monolithic and slow, data are often not available to other scientists, and the independent validation of results is limited. The resulting scientific process is hence slow and sloppy. Building on the Innovation Accelerator paper by Helbing and Balietti [1], this paper takes the initial global vision and reviews the theoretical and technological building blocks that can be used for implementing an innovation (in the first place: science) accelerator platform driven by re-imagining the science system. The envisioned platform would rest on four pillars: (i) redesign the incentive scheme to reduce behavior such as conservatism, herding, and hyping; (ii) advance scientific publications by breaking up the monolithic paper unit and introducing other building blocks such as data, tools, experiment workflows, and resources; (iii) use machine-readable semantics for publications, debate structures, provenance, etc., in order to include the computer as a partner in the scientific process; and (iv) build an online platform for collaboration, including a network of trust and reputation among the different types of stakeholders in the scientific system: scientists, educators, funding agencies, policy makers, students, and industrial innovators, among others. Any such improvements to the scientific system must support the entire scientific process (unlike current tools that chop up the scientific process into disconnected pieces), must facilitate and encourage collaboration and interdisciplinarity (again unlike current tools), must facilitate the inclusion of intelligent computing in the scientific process, and must support not only the core scientific process but also accommodate other stakeholders such as science policy makers, industrial innovators, and the general public. We first describe the current state of the scientific system together with up to a dozen key new initiatives, including an analysis of the role of science as an innovation accelerator. Our brief survey shows that there exist many separate ideas and concepts and diverse stand-alone demonstrator systems for different components of the ecosystem, with many parts still unexplored and overall integration lacking. By analyzing a matrix of stakeholders vs. functionalities, we identify the required innovations. We (non-exhaustively) discuss a few of them: publications that are meaningful to machines, innovative reviewing processes, data publication, workflow archiving and reuse, alternative impact metrics, tools for the detection of trends, community formation and emergence, as well as modular publications, citation objects, and debate graphs. To summarize, the core idea behind the Innovation Accelerator is to develop new incentive models, rules, and interaction mechanisms to stimulate true innovation, revolutionizing the way in which we create knowledge and disseminate information.
On Roles of Models in Information Systems
NASA Astrophysics Data System (ADS)
Sølvberg, Arne
The increasing penetration of computers into all aspects of human activity makes it desirable that the interplay among software, data and the domains where computers are applied is made more transparent. An approach to this end is to explicitly relate the modeling concepts of the domains, e.g., natural science, technology and business, to the modeling concepts of software and data. This may make it simpler to build comprehensible integrated models of the interactions between computers and non-computers, e.g., interaction among computers, people, physical processes, biological processes, and administrative processes. This chapter contains an analysis of various facets of the modeling environment for information systems engineering. The lack of satisfactory conceptual modeling tools seems to be central to the unsatisfactory state-of-the-art in establishing information systems. The chapter contains a proposal for defining a concept of information that is relevant to information systems engineering.
NASA Technical Reports Server (NTRS)
Cohen, Jarrett
1999-01-01
Parallel computers built out of mass-market parts are cost-effectively performing data processing and simulation tasks. The Supercomputing (now known as "SC") series of conferences celebrated its 10th anniversary last November. While vendors have come and gone, the dominant paradigm for tackling big problems is still a shared-resource, commercial supercomputer. Growing numbers of users needing a cheaper or dedicated-access alternative are building their own supercomputers out of mass-market parts. Such machines are generally called Beowulf-class systems, after the 11th-century epic. This modern-day Beowulf story began in 1994 at NASA's Goddard Space Flight Center, a laboratory for the Earth and space sciences, where computing managers threw down a gauntlet to develop a $50,000 gigaFLOPS workstation for processing satellite data sets. Soon, Thomas Sterling and Don Becker were working on the Beowulf concept at the University Space Research Association (USRA)-run Center of Excellence in Space Data and Information Sciences (CESDIS). Beowulf clusters mix three primary ingredients: commodity personal computers or workstations, low-cost Ethernet networks, and the open-source Linux operating system. One of the larger Beowulfs is Goddard's Highly-parallel Integrated Virtual Environment, or HIVE for short.
Harnessing the power of emerging petascale platforms
NASA Astrophysics Data System (ADS)
Mellor-Crummey, John
2007-07-01
As part of the US Department of Energy's Scientific Discovery through Advanced Computing (SciDAC-2) program, science teams are tackling problems that require computational simulation and modeling at the petascale. A grand challenge for computer science is to develop software technology that makes it easier to harness the power of these systems to aid scientific discovery. As part of its activities, the SciDAC-2 Center for Scalable Application Development Software (CScADS) is building open source software tools to support efficient scientific computing on the emerging leadership-class platforms. In this paper, we describe two tools for performance analysis and tuning that are being developed as part of CScADS: a tool for analyzing scalability and performance, and a tool for optimizing loop nests for better node performance. We motivate these tools by showing how they apply to S3D, a turbulent combustion code under development at Sandia National Laboratory. For S3D, our node performance analysis tool helped uncover several performance bottlenecks. Using our loop nest optimization tool, we transformed S3D's most costly loop nest to reduce execution time by a factor of 2.94 for a processor working on a 50³ domain.
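A toy illustration of why loop-nest structure matters for node performance (not the CScADS tools themselves): traversing a two-dimensional array along its memory layout is much faster than striding across it, which is the kind of locality effect loop-nest transformations exploit.

```python
# Toy locality demo: summing a large 2-D array row-by-row follows its
# row-major memory layout (unit stride), while summing column-by-column
# strides across memory and runs measurably slower.
import time
import numpy as np

a = np.ones((4000, 4000))

t0 = time.perf_counter()
row_total = sum(a[i, :].sum() for i in range(a.shape[0]))   # unit stride
t1 = time.perf_counter()
col_total = sum(a[:, j].sum() for j in range(a.shape[1]))   # stride of 4000
t2 = time.perf_counter()

assert row_total == col_total          # same answer, different traversal
print(f"row-order: {t1 - t0:.3f} s   column-order: {t2 - t1:.3f} s")
```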
NASA Astrophysics Data System (ADS)
Klopfer, Eric; Scheintaub, Hal; Huang, Wendy; Wendel, Daniel
Computational approaches to science are radically altering the nature of scientific investigation. Yet these computer programs and simulations are sparsely used in science education, and when they are used, they are typically “canned” simulations that are black boxes to students. StarLogo The Next Generation (TNG) was developed to make the programming of simulations more accessible to students and teachers. StarLogo TNG builds on the StarLogo tradition of agent-based modeling for students and teachers, with the added features of a graphical programming environment and a three-dimensional (3D) world. The graphical programming environment reduces the learning curve of programming, especially syntax. The 3D graphics make for a more immersive and engaging experience for students, including making it easy for them to design and program their own video games. Another change in StarLogo TNG is a fundamental restructuring of the virtual machine to make it more transparent. As a result of these changes, classroom use of TNG is expanding to new areas. The chapter concludes with a description of field tests conducted in middle and high school science classes.
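StarLogo TNG programs are built graphically, but the underlying agent-based style can be sketched in a few lines of text-based code. The model below is invented for illustration: it steps a population of random-walking "turtles" on a grid and reports how far the population spreads each generation of ticks.

```python
# Text-based analogue of a StarLogo-style agent-based model (a sketch;
# StarLogo TNG itself is programmed graphically, not in Python).
import random

class Turtle:
    def __init__(self):
        self.x, self.y = 0, 0
    def step(self):
        dx, dy = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        self.x += dx
        self.y += dy

turtles = [Turtle() for _ in range(500)]
for tick in range(1, 101):
    for t in turtles:
        t.step()                           # every agent acts each tick
    if tick % 25 == 0:
        mean_dist = sum(abs(t.x) + abs(t.y) for t in turtles) / len(turtles)
        print(f"tick {tick:3d}: mean |displacement| = {mean_dist:.1f}")
```

The point of the agent-based paradigm, here as in StarLogo, is that population-level behavior (the spreading cloud) emerges from simple per-agent rules rather than from a global equation.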
An Interrogative Model of Computer-Aided Adaptive Testing: Some Experimental Evidence
1988-09-01
[Garbled OCR reference fragments; recoverable citations include: Brown, J. S., and Harris, "Artificial Intelligence and ... Building an Intelligent Tutoring System," in Methods and Tactics in Cognitive Science (eds. Kintsch, Miller, and Polson), Lawrence Erlbaum Associates; and Sivasankaran, T. R., and Bui, Tung X., "A Bayesian Diagnostic Model for Intelligent CAI Systems."]
Science, Technical Innovation and Applications in Bioacoustics: Summary of a Workshop
2004-07-01
[Garbled slide and figure residue; recoverable points include: binaural processing has been neglected; from a signal-processing standpoint, complex computational methods should be avoided in favor of massively parallel approaches; it remains difficult to design or build transducers or arrays with anywhere near the performance and environmental adaptability of animal binaural systems; and slide fragments on small-animal cardiac imaging in mice (mouse heart: ~7 mm diameter, ~8 beats/sec; L5-10 MHz and L16-28 MHz transducers; laptop-based system).]
NASA Technical Reports Server (NTRS)
Davis, Bruce E.; Elliot, Gregory
1989-01-01
Jackson State University recently established the Center for Spatial Data Research and Applications, a Geographic Information System (GIS) and remote sensing laboratory. Taking advantage of new technologies and new directions in the spatial (geographic) sciences, JSU is building a Center of Excellence in Spatial Data Management. New opportunities for research, applications, and employment are emerging. GIS requires fundamental shifts in, and places new demands on, traditional computer science and geographic training. The Center is not merely another computer lab but one setting the pace on a new applied frontier. GIS and its associated technologies are discussed. The Center's facilities are described. An ARC/INFO GIS runs on a VAX mainframe, with numerous workstations. Image processing packages include ELAS, LIPS, VICAR, and ERDAS. A host of hardware and software peripherals are used in support. Numerous projects are underway, such as the construction of a Gulf of Mexico environmental database, development of AI in image processing, a land use dynamics study of metropolitan Jackson, and others. A new academic interdisciplinary program in Spatial Data Management, combining courses in Geography and Computer Science, is under development. The broad range of JSU's GIS and remote sensing activities is addressed. The discussion concludes with the impacts on changing paradigms in the university and in the professional world.
Advances in Data Management in Remote Sensing and Climate Modeling
NASA Astrophysics Data System (ADS)
Brown, P. G.
2014-12-01
Recent commercial interest in "Big Data" information systems has yielded little more than a sense of déjà vu among scientists whose work has always required getting their arms around extremely large databases and writing programs to explore and analyze them. On the flip side, some commercial DBMS startups are building "Big Data" platforms using techniques taken from earth science, astronomy, high energy physics, and high performance computing. In this talk, we introduce one such platform: Paradigm4's SciDB, the first DBMS designed from the ground up to combine the kinds of quality-of-service guarantees made by SQL DBMS platforms (a high-level data model, query languages, extensibility, transactions) with the kinds of functionality familiar to scientific users (arrays as structural building blocks, integrated linear algebra, and client language interfaces that minimize the learning curve). We will review how SciDB is used to manage and analyze earth science data by several teams of scientific users.
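As an illustration of the array-native analytics such a platform serves (plain NumPy here, not SciDB's actual query interface), the sketch below computes a windowed mean over a gridded field using an integral image, a typical array-database operation over remote-sensing or climate grids.

```python
# Illustration of array-style analytics (NumPy stand-in, not SciDB's API):
# a k x k windowed mean over a gridded field via a cumulative-sum
# (integral image), so the cost is independent of the window size.
import numpy as np

rng = np.random.default_rng(7)
grid = rng.normal(size=(360, 180))        # e.g. a 1-degree global field

def windowed_mean(a, k=3):
    """Mean over every k x k neighborhood of a 2-D array."""
    s = np.cumsum(np.cumsum(a, axis=0), axis=1)
    s = np.pad(s, ((1, 0), (1, 0)))       # prepend a zero row and column
    out = (s[k:, k:] - s[:-k, k:] - s[k:, :-k] + s[:-k, :-k]) / (k * k)
    return out                            # shape (rows-k+1, cols-k+1)

smoothed = windowed_mean(grid)
print(grid.shape, "->", smoothed.shape,
      f"corner window mean: {smoothed[0, 0]:.3f}")
```

In an array DBMS this kind of neighborhood aggregate runs inside the database next to the stored array, instead of requiring the data to be exported into a separate analysis program.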
A Computing Infrastructure for Supporting Climate Studies
NASA Astrophysics Data System (ADS)
Yang, C.; Bambacus, M.; Freeman, S. M.; Huang, Q.; Li, J.; Sun, M.; Xu, C.; Wojcik, G. S.; Cahalan, R. F.; NASA Climate @ Home Project Team
2011-12-01
Climate change is one of the major challenges facing us on planet Earth in the 21st century. Scientists build many models to simulate the past and predict climate change for the coming decades or century. Most of the models run at low resolution, with some targeting high resolution in linkage to practical climate change preparedness. To calibrate and validate the models, millions of model runs are needed to find the best simulation and configuration. This paper introduces the NASA Climate@Home project, an effort to build a supercomputer based on advanced computing technologies such as cloud computing and grid computing. The Climate@Home computing infrastructure includes several aspects: 1) a cloud computing platform is utilized to manage potential spikes in access to the centralized components, such as the grid computing server for dispatching model runs and collecting their results; 2) a grid computing engine is developed based on MapReduce to dispatch models and model configurations and to collect simulation results and contribution statistics; 3) a portal serves as the entry point for the project, providing management, sharing, and data exploration for end users; 4) scientists can access customized tools to configure model runs and visualize model results; and 5) the public can follow the latest project news on Twitter and Facebook. This paper will present the latest progress of the project and demonstrate the operational system during the AGU Fall Meeting. It will also discuss how this technology can become a trailblazer for other climate studies and relevant sciences, and share how the challenges in computation and software integration were solved.
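Aspect 2 above is a classic map/reduce split: fan model configurations out to workers, then fold the results into summary statistics. The sketch below shows that dispatch pattern in miniature; run_model and the config tuples are hypothetical stand-ins, not Climate@Home code:

```python
# Minimal map/reduce-style dispatcher: "map" executes one model
# configuration, "reduce" aggregates the collected results.
from concurrent.futures import ProcessPoolExecutor
from statistics import mean

def run_model(config):
    """Map step: stand-in for executing one climate-model run."""
    resolution, forcing = config
    return resolution * 0.1 + forcing   # placeholder score, not real physics

if __name__ == "__main__":
    configs = [(r, f) for r in (1, 2, 4) for f in (0.5, 1.0, 2.0)]
    with ProcessPoolExecutor() as pool:              # dispatch runs to workers
        results = list(pool.map(run_model, configs))
    # Reduce step: fold per-run results into contributed statistics.
    print("runs:", len(results), "mean score:", round(mean(results), 3))
```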
Efficiency Assessment of a Blended-Learning Educational Methodology in Engineering
NASA Astrophysics Data System (ADS)
Rogado, Ana Belén González; Conde, Ma José Rodríguez; Migueláñez, Susana Olmos; Riaza, Blanca García; Peñalvo, Francisco José García
The content of this presentation highlights the importance of an active learning methodology in engineering university degrees in Spain. We present some of the outcomes from an experimental study carried out during the academic years 2007/08 and 2008/09 with engineering students (Technical Industrial Engineering: Mechanics; Civil Design Engineering: Civil Building; Technical Architecture; and Technical Engineering in Computer Management) at the University of Salamanca. In this research we selected a subject which is common to the four degrees: Computer Science. This study has the aim of contributing to the improvement of education and teaching methods for a better performance of students in Engineering.
Computing exponentially faster: implementing a non-deterministic universal Turing machine using DNA
Currin, Andrew; Korovin, Konstantin; Ababi, Maria; Roper, Katherine; Kell, Douglas B.; Day, Philip J.
2017-01-01
The theory of computer science is based around universal Turing machines (UTMs): abstract machines able to execute all possible algorithms. Modern digital computers are physical embodiments of classical UTMs. For the most important class of problem in computer science, non-deterministic polynomial complete problems, non-deterministic UTMs (NUTMs) are theoretically exponentially faster than both classical UTMs and quantum mechanical UTMs (QUTMs). However, no attempt has previously been made to build an NUTM, and their construction has been regarded as impossible. Here, we demonstrate the first physical design of an NUTM. This design is based on Thue string rewriting systems, and thereby avoids the limitations of most previous DNA computing schemes: all the computation is local (simple edits to strings) so there is no need for communication, and there is no need to order operations. The design exploits DNA's ability to replicate to execute an exponential number of computational paths in P time. Each Thue rewriting step is embodied in a DNA edit implemented using a novel combination of polymerase chain reactions and site-directed mutagenesis. We demonstrate that the design works using both computational modelling and in vitro molecular biology experimentation: the design is thermodynamically favourable, microprogramming can be used to encode arbitrary Thue rules, all classes of Thue rule can be implemented, and rule implementation is non-deterministic. In an NUTM, the resource limitation is space, which contrasts with classical UTMs and QUTMs where it is time. This fundamental difference enables an NUTM to trade space for time, which is significant for both theoretical computer science and physics. It is also of practical importance, for, to quote Richard Feynman, ‘there's plenty of room at the bottom’. This means that a desktop DNA NUTM could potentially utilize more processors than all the electronic computers in the world combined, and thereby outperform the world's current fastest supercomputer, while consuming a tiny fraction of its energy. PMID:28250099
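To make the Thue-rewriting idea concrete, here is a small software emulation. Where the DNA machine explores every applicable rewrite physically in parallel, the sketch below enumerates all branches breadth-first; the rule set is a toy chosen for illustration, not one from the paper:

```python
# Non-deterministic Thue string rewriting, emulated with breadth-first
# search. Each rule (lhs, rhs) may fire at any position, so one string
# can have many successors -- the source of the non-determinism.
from collections import deque

rules = [("ab", "ba"), ("ba", "c")]   # toy Thue rules: replace lhs with rhs

def successors(s):
    """All strings reachable by one rule application at any position."""
    for lhs, rhs in rules:
        start = s.find(lhs)
        while start != -1:
            yield s[:start] + rhs + s[start + len(lhs):]
            start = s.find(lhs, start + 1)

def reachable(start, target, max_steps=10):
    """Is `target` derivable from `start` within `max_steps` rewrites?"""
    frontier, seen = deque([(start, 0)]), {start}
    while frontier:
        s, depth = frontier.popleft()
        if s == target:
            return True
        if depth < max_steps:
            for nxt in successors(s):
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append((nxt, depth + 1))
    return False

print(reachable("aab", "ca"))   # True: aab -> aba -> baa -> ca
```

The emulation pays for branching with time and memory; the DNA design pays with physical space, which is the trade-off the abstract highlights.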
Borlawsky, Tara B.; Dhaval, Rakesh; Hastings, Shannon L.; Payne, Philip R. O.
2009-01-01
In October 2006, the National Institutes of Health launched a new national consortium, funded through Clinical and Translational Science Awards (CTSA), with the primary objective of improving the conduct and efficiency of the inherently multi-disciplinary field of translational research. To help meet this goal, the Ohio State University Center for Clinical and Translational Science has launched a knowledge management initiative that is focused on facilitating widespread semantic interoperability among administrative, basic science, clinical and research computing systems, both internally and among the translational research community at-large, through the integration of domain-specific standard terminologies and ontologies with local annotations. This manuscript describes an agile framework that builds upon prevailing knowledge engineering and semantic interoperability methods, and will be implemented as part this initiative. PMID:21347164
NASA Astrophysics Data System (ADS)
Lescinsky, D. T.; Wyborn, L. A.; Evans, B. J. K.; Allen, C.; Fraser, R.; Rankine, T.
2014-12-01
We present collaborative work on a generic, modular infrastructure for virtual laboratories (VLs, similar to science gateways) that combine online access to data, scientific code, and computing resources as services supporting multiple data-intensive scientific computing needs across a wide range of science disciplines. We are leveraging access to 10+ PB of earth science data on Lustre filesystems at Australia's National Computational Infrastructure (NCI) Research Data Storage Infrastructure (RDSI) node, co-located with NCI's 1.2 PFlop Raijin supercomputer and a 3000-CPU-core research cloud. The development, maintenance, and sustainability of VLs are best accomplished through modularisation and standardisation of interfaces between components. Our approach has been to break up tightly-coupled, specialised application packages into modules, with identified best techniques and algorithms repackaged either as data services or as scientific tools that are accessible across domains. The data services can be used to manipulate, visualise, and transform multiple data types, whilst the scientific tools can be used in concert with multiple scientific codes. We are currently designing a scalable generic infrastructure that will handle scientific code as modularised services and thereby enable the rapid and easy deployment of new codes or versions of codes. The goal is to build open source libraries/collections of scientific tools, scripts, and modelling codes that can be combined in specially designed deployments. Additional services in development include provenance, publication of results, monitoring, and workflow tools. The generic VL infrastructure will be hosted at NCI but can access alternative computing infrastructures (i.e., public/private cloud, HPC). The Virtual Geophysics Laboratory (VGL) was developed as a pilot project to demonstrate the underlying technology. This base is now being redesigned and generalised to develop a Virtual Hazards Impact and Risk Laboratory (VHIRL); any enhancements and new capabilities will be incorporated into the generic VL infrastructure. At the same time, we are scoping seven new VLs and, in the process, identifying other common components to prioritise and focus development.
The effect of technology on student science achievement
NASA Astrophysics Data System (ADS)
Hilton, June Kraft
2003-10-01
Prior research indicates that technology has had little effect on raising student achievement. Little empirical research exists, however, studying the effects of technology as a tool to improve student achievement through development of higher order thinking skills. Also, prior studies have not focused on the manner in which technology is being used in the classroom and at home to enhance teaching and learning. Empirical data from a secondary school representative of those in California were analyzed to determine the effects of technology on student science achievement. The quantitative analysis methods for the school data study included a multiple linear path analysis, using final course grade as the ultimate exogenous variable. In addition, empirical data from a nationwide survey on how Americans use the Internet were disaggregated by age and analyzed to determine the relationships between computer and Internet experience and (a) Internet use at home for school assignments and (b) more general computer use at home for school assignments for school age children. Analysis of data collected from the "A Nation Online" survey conducted by the United States Census Bureau assessed these relationships via correlations and cross-tabulations. Finally, results from these data analyses were assessed in conjunction with systemic reform efforts from 12 states designed to address improvements in science and mathematics education in light of the Third International Mathematics and Science Study (TIMSS). Examination of the technology efforts in those states provided a more nuanced understanding of the impact technology has on student achievement. Key findings included evidence that technology training for teachers increased their use of the computer for instruction but students' final science course grade did not improve; school age children across the country did not use the computer at home for such higher-order cognitive activities as graphics and design or spreadsheets/databases; and states whose systemic reform initiatives included a mix of capacity building and alignment to state standards realized improved student achievement on the 2000 NAEP Science Assessment.
1980-05-01
Comparison of Building Loads Analysis and System Thermodynamics (BLAST) Computer Program... (engineering research laboratory report). ...Building Loads Analysis and System Thermodynamics (BLAST) computer program. A dental clinic and a battalion headquarters and classroom building were... Section headings: Building and HVAC System Data; Computer Simulation; Comparison of Actual and Simulated Results; Analysis and Findings.
NASA Astrophysics Data System (ADS)
Gomez, R.; Gentle, J.
2015-12-01
Modern data pipelines and computational processes require that meticulous methodologies be applied to ensure that the source data, algorithms, and results are properly curated, managed, and retained while remaining discoverable, accessible, and reproducible. Given the complexity of understanding the scientific problem domain being researched, combined with the overhead of learning to use advanced computing technologies, it becomes paramount that the next generation of scientists and researchers learn to embrace best practices. The Integrative Computational Education and Research Traineeship (ICERT) is a National Science Foundation (NSF) Research Experience for Undergraduates (REU) Site at the Texas Advanced Computing Center (TACC). During Summer 2015, two ICERT interns joined the 3DDY project. 3DDY converts geospatial datasets into file types that can take advantage of new formats, such as natural user interfaces, interactive visualization, and 3D printing. Mentored by TACC researchers for ten weeks, students with no previous background in computational science learned to use scripts to build the first prototype of the 3DDY application, and leveraged Wrangler, the newest high performance computing (HPC) resource at TACC. Test datasets for quadrangles in central Texas were used to assemble the 3DDY workflow and code. Test files were successfully converted into stereolithography (STL) format, which is amenable for use with 3D printers. Test files and scripts were documented and shared using the Figshare site, while metadata for the 3DDY application was documented using OntoSoft. These efforts validated a straightforward set of workflows for transforming geospatial data and established the first prototype version of 3DDY. Adding the data and software management procedures helped the students realize a broader set of tangible results (e.g., Figshare entries), document their progress and the final state of their work for the research group and community, follow a clear set of formats and fill in necessary details that might otherwise be lost, and gain exposure to next-generation workflows and practices for digital scholarship and scientific inquiry.
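The core 3DDY conversion step, gridded elevations to printable triangles, can be sketched compactly. The function below is a from-scratch toy under the assumption that the input is a small in-memory heightmap; it is not the project's actual scripts, and real terrain data would first need georeferencing and scaling:

```python
# Toy heightmap-to-STL conversion: two triangles per grid cell, written
# as ASCII STL. Facet normals are placeholders (0 0 1); most slicers
# recompute them from the vertex winding anyway.
import numpy as np

def heightmap_to_stl(z, path, scale=1.0):
    """Write heightmap `z` (rows x cols of elevations) as ASCII STL."""
    rows, cols = z.shape
    with open(path, "w") as f:
        f.write("solid heightmap\n")
        for i in range(rows - 1):
            for j in range(cols - 1):
                # Corner vertices of this grid cell: (x, y, elevation).
                a = (j, i, z[i, j] * scale)
                b = (j + 1, i, z[i, j + 1] * scale)
                c = (j, i + 1, z[i + 1, j] * scale)
                d = (j + 1, i + 1, z[i + 1, j + 1] * scale)
                for tri in ((a, b, c), (b, d, c)):
                    f.write("  facet normal 0 0 1\n    outer loop\n")
                    for x, y, h in tri:
                        f.write(f"      vertex {x} {y} {h}\n")
                    f.write("    endloop\n  endfacet\n")
        f.write("endsolid heightmap\n")

heightmap_to_stl(np.random.default_rng(1).random((4, 4)), "quad.stl")
```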
NASA Astrophysics Data System (ADS)
Orr, C. H.; Mcfadden, R. R.; Manduca, C. A.; Kempler, L. A.
2016-12-01
Teaching with data, simulations, and models in the geosciences can increase many facets of student success in the classroom and in the workforce. Teaching undergraduates about programming and improving students' quantitative and computational skills expands their perception of Geoscience beyond field-based studies. Processing data and developing quantitative models are critically important for Geoscience students. Students need to be able to perform calculations, analyze data, create numerical models and visualizations, and more deeply understand complex systems—all essential aspects of modern science. These skills require students to have comfort and skill with languages and tools such as MATLAB. To achieve comfort and skill, computational and quantitative thinking must build over a 4-year degree program across courses and disciplines. However, in courses focused on Geoscience content it can be challenging to get students comfortable with using computational methods to answer Geoscience questions. To help bridge this gap, we have partnered with MathWorks to develop two workshops focused on collecting and developing strategies and resources to help faculty teach students to incorporate data, simulations, and models into the curriculum at the course and program levels. We brought together faculty members from the sciences, including Geoscience and allied fields, who teach computation and quantitative thinking skills using MATLAB to build a resource collection for teaching. These materials and the outcomes of the workshops are freely available on our website. The workshop outcomes include a collection of teaching activities, essays, and course descriptions that can help faculty incorporate computational skills at the course or program level. The teaching activities include in-class assignments, problem sets, labs, projects, and toolboxes. These activities range from programming assignments to creating and using models. The outcomes also include workshop syntheses that highlight best practices, a set of webpages to support teaching with software such as MATLAB, and an interest group actively discussing aspects of these issues in Geoscience and allied fields. Learn more and view the resources at http://serc.carleton.edu/matlab_computation2016/index.html
A walk through the planned CS building. M.S. Thesis
NASA Technical Reports Server (NTRS)
Khorramabadi, Delnaz
1991-01-01
Using the architectural plan views of our future computer science building as test objects, we have completed the first stage of a building walkthrough system. The inputs to our system are AutoCAD files. An AutoCAD converter translates the geometrical information in these files into a format suitable for 3D rendering. Major model errors, such as incorrect polygon intersections and random face orientations, are detected and fixed automatically. Interactive viewing and editing tools are provided to view the results, to modify and clean the model, and to change surface attributes. Our display system provides a simple-to-use interface for interactive exploration of buildings. Using only the mouse buttons, the user can move inside and outside the building and change floors. Several viewing and rendering options are provided, such as restricting the viewing frustum, avoiding wall collisions, and selecting different rendering algorithms. A plan view of the current floor, with the position of the eye point and viewing direction marked on it, is displayed at all times. The scene illumination can be manipulated by interactively controlling intensity values for five light sources.
A Modelica-based Model Library for Building Energy and Control Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wetter, Michael
2009-04-07
This paper describes an open-source library with component models for building energy and control systems that is based on Modelica, an equation-based object-oriented language that is well positioned to become the standard for modeling of dynamic systems in various industrial sectors. The library is currently developed to support computational science and engineering for innovative building energy and control systems. Early applications will include controls design and analysis, rapid prototyping to support innovation of new building systems, and the use of models during operation for controls, fault detection, and diagnostics. This paper discusses the motivation for selecting an equation-based object-oriented language. It presents the architecture of the library and explains how base models can be used to rapidly implement new models. To demonstrate the capability of analyzing novel energy and control systems, the paper closes with an example in which we compare the dynamic performance of a conventional hydronic heating system with thermostatic radiator valves to an innovative heating system. In the new system, instead of a centralized circulation pump, each of the 18 radiators has a pump whose speed is controlled using a room temperature feedback loop, and the temperature of the boiler is controlled based on the speed of the radiator pumps. All flows are computed by solving for the pressure distribution in the piping network, and the controls include continuous and discrete time controls.
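The per-radiator feedback loop in the closing example can be re-expressed outside Modelica as a few lines of simulation. The sketch below is a deliberately crude Python analogue with a single room, proportional control, and invented coefficients; the real library models the full pressure-flow network and boiler control as well:

```python
# Single-room analogue of a radiator-pump feedback loop: pump speed is
# proportional to the temperature error, room temperature follows a
# lumped energy balance via Euler integration. All numbers are made up.
T_set, T_room, T_out = 20.0, 15.0, 0.0      # degC: setpoint, room, outdoor
k_p, dt = 0.5, 60.0                          # proportional gain, step (s)
ua_loss, q_max, c_room = 80.0, 2000.0, 5e6   # W/K, W, J/K (invented)

for _ in range(120):                         # simulate 2 hours
    error = T_set - T_room
    pump_speed = min(max(k_p * error, 0.0), 1.0)   # saturate to 0..1
    q_heat = pump_speed * q_max                    # heat delivered (W)
    q_loss = ua_loss * (T_room - T_out)            # envelope loss (W)
    T_room += (q_heat - q_loss) * dt / c_room      # Euler step

print(f"room temperature after 2 h: {T_room:.2f} degC")
```

An equation-based language like Modelica declares the same energy balance as acausal equations and lets the compiler derive the solution procedure, which is the design choice the paper motivates.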
The Computational Science Environment (CSE)
2009-08-01
...supported CSE platforms. Developers can also build against different versions of a particular package (e.g., Python 2.4 vs. Python 2.5) via a... 8.2.1 Tk Testing Error and Workaround: Tk tends to produce more testing errors when using KDE, and in some instances the test suite freezes when reaching the Tk select test. These issues have not been seen when using GNOME. 8.2.2 VTK Testing Error and Workaround: VTK test...
Sustaining Open Source Communities through Hackathons - An Example from the ASPECT Community
NASA Astrophysics Data System (ADS)
Heister, T.; Hwang, L.; Bangerth, W.; Kellogg, L. H.
2016-12-01
The ecosystem surrounding a successful scientific open source software package combines both social and technical aspects. Much thought has been given to the technology side of writing sustainable software for large infrastructure projects and software libraries, but less to building the human capacity to perpetuate scientific software used in computational modeling. One effective format for building capacity is the regular multi-day hackathon. Scientific hackathons bring together a group of science domain users and scientific software contributors to make progress on a specific software package. Innovation comes through the chance to work with established and new collaborations. Especially in domain sciences with small communities, hackathons give geographically distributed scientists an opportunity to connect face-to-face. They foster lively discussions amongst scientists with different expertise, promote new collaborations, and increase transparency in both the technical and scientific aspects of code development. ASPECT is an open source, parallel, extensible finite element code for simulating thermal convection that began development in 2011 under the Computational Infrastructure for Geodynamics. ASPECT hackathons over the past 3 years have grown the number of authors to more than 50, training new code maintainers in the process. Hackathons begin with leaders establishing project-specific conventions for development, demonstrating the workflow for code contributions, and reviewing relevant technical skills. Each hackathon expands the developer community: over 20 scientists add more than 6,000 lines of code during each event of a week or more. Participants grow comfortable contributing to the repository, and over half continue to contribute afterwards. A high return rate of participants ensures continuity and stability of the group as well as mentoring for novice members. We hope to build other software communities on this model, but anticipate each will bring its own unique challenges.
76 FR 57065 - National Institute of Environmental Health Sciences; Notice of Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-15
...Place: National Institute of Environmental Health Sciences, Building 101, Rodbell Auditorium, 111 T. W. Alexander Drive, Research Triangle Park, NC 27709. Portions of the meeting concerning personnel issues are closed...
NASA's Participation in the National Computational Grid
NASA Technical Reports Server (NTRS)
Feiereisen, William J.; Zornetzer, Steve F. (Technical Monitor)
1998-01-01
Over the last several years it has become evident that the character of NASA's supercomputing needs has changed. One of the major missions of the agency is to support the design and manufacture of aero- and space-vehicles with technologies that will significantly reduce their cost. It is becoming clear that improvements in the process of aerospace design and manufacturing will require a high performance information infrastructure that allows geographically dispersed teams to draw upon resources that are broader than traditional supercomputing. A computational grid draws together our information resources into one system. We can foresee the time when a Grid will allow engineers and scientists to use the tools of supercomputers, databases, and on-line experimental devices in a virtual environment to collaborate with distant colleagues. The concept of a computational grid has been spoken of for many years, but several recent events are conspiring to allow us to actually build one. In late 1997 the National Science Foundation initiated the Partnerships for Advanced Computational Infrastructure (PACI), which is built around the idea of distributed high performance computing. The Alliance, led by the National Computational Science Alliance (NCSA), and the National Partnership for Advanced Computational Infrastructure (NPACI), led by the San Diego Supercomputer Center, have been instrumental in drawing together the "Grid Community" to identify the technology bottlenecks and propose a research agenda to address them. During the same period NASA has begun to reformulate parts of two major high performance computing research programs to concentrate on distributed high performance computing and has banded together with the PACI centers to address the research agenda in common.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Runnels, Scott Robert; Caldwell, Wendy; Brown, Barton Jed
The two primary purposes of LANL's Computational Physics Student Summer Workshop are (1) to educate graduate and exceptional undergraduate students in the challenges and applications of computational physics of interest to LANL, and (2) to entice their interest toward those challenges. Computational physics is emerging as a discipline in its own right, combining expertise in mathematics, physics, and computer science. The mathematical aspects focus on numerical methods for solving equations on the computer as well as developing test problems with analytical solutions. The physics aspects are very broad, ranging from low-temperature material modeling to extremely high temperature plasma physics, radiation transport, and neutron transport. The computer science issues are concerned with matching numerical algorithms to emerging architectures and maintaining the quality of extremely large codes built to perform multi-physics calculations. Although graduate programs associated with computational physics are emerging, it is apparent that the pool of U.S. citizens in this multi-disciplinary field is relatively small and is typically not focused on the aspects that are of primary interest to LANL. Furthermore, more structured foundations for LANL interaction with universities in computational physics are needed; historically, interactions have relied heavily on individuals' personalities and personal contacts. Thus a tertiary purpose of the Summer Workshop is to build an educational network of LANL researchers, university professors, and emerging students to advance the field and LANL's involvement in it. This report includes both the background for the program and the reports from the students.
The COSPAR Capacity Building Programme
NASA Astrophysics Data System (ADS)
Gabriel, C.
2016-08-01
The provision of scientific data archives and analysis tools by diverse institutions around the world represents a unique opportunity for the development of scientific activities. An example is the European Space Agency's space observatory XMM-Newton, with its Science Operations Centre at the European Space Astronomy Centre near Madrid, Spain. Through its science archive and web pages it provides not only the raw and processed data from the mission but also analysis tools and full documentation, greatly helping their dissemination and use. These data and tools, freely accessible to anyone in the world, are the practical elements around which COSPAR (COmmittee on SPAce Research) Capacity Building Workshops have been conceived, developed, and held for a decade and a half in developing countries. The Programme started with X-ray workshops but has since been broadened to the most diverse space science areas. The workshops help to develop science at the highest level in those countries, in a lasting and sustainable way, with a minimal investment (a computer plus a moderate Internet connection). In this paper we discuss the basis, concepts, and achievements of the Capacity Building Programme. Two instances of the Programme have already taken place in Argentina, one devoted to X-ray astronomy and another to infrared astronomy. Several others have been organised for the Latin American region (Brazil, Uruguay, and Mexico) with a large participation of young investigators from Argentina.
Clocks to Computers: A Machine-Based “Big Picture” of the History of Modern Science.
van Lunteren, Frans
2016-12-01
Over the last few decades there have been several calls for a “big picture” of the history of science. There is a general need for a concise overview of the rise of modern science, with a clear structure allowing for a rough division into periods. This essay proposes such a scheme, one that is both elementary and comprehensive. It focuses on four machines, which can be seen to have mediated between science and society during successive periods of time: the clock, the balance, the steam engine, and the computer. Following an extended developmental phase, each of these machines came to play a highly visible role in Western societies, both socially and economically. Each of these machines, moreover, was used as a powerful resource for the understanding of both inorganic and organic nature. More specifically, their metaphorical use helped to construe and refine some key concepts that would play a prominent role in such understanding. In each case the key concept would at some point be considered to represent the ultimate building block of reality. Finally, in a refined form, each of these machines would eventually make its entry in scientific research, thereby strengthening the ties between these machines and nature.
Toward a Big Data Science: A challenge of "Science Cloud"
NASA Astrophysics Data System (ADS)
Murata, Ken T.; Watanabe, Hidenobu
2013-04-01
During the past 50 years, along with the appearance and development of high-performance computers (and supercomputers), numerical simulation has come to be considered a third methodology for science, following the theoretical (first) and experimental/observational (second) approaches. The variety of data yielded by the second approach keeps growing, owing to progress in experimental and observational technologies, and the amount of data generated by the third methodology keeps growing as well, thanks to the tremendous development of supercomputers and their programming techniques. Most of the data files created by experiments, observations, and numerical simulations are saved in digital formats and analyzed on computers. Researchers (domain experts) are interested not only in how to run experiments, observations, or numerical simulations, but in what information (new findings) to extract from the data. However, data do not usually tell us anything about the science directly; the science is implicitly hidden in the data, and researchers have to extract information from the data files to find new science. This is the basic concept of data-intensive (data-oriented) science for Big Data. As the scales of experiments, observations, and numerical simulations grow, new techniques and facilities are required to extract information from large numbers of data files; this technique, informatics, serves as a fourth methodology for new sciences. Every methodology needs its own facilities: in space science, for example, the space environment is observed via spacecraft and numerical simulations are performed on supercomputers. The facility for informatics, which deals with large-scale data, is a computational cloud system for science. This paper proposes a cloud system for informatics that has been developed at NICT (National Institute of Information and Communications Technology), Japan. The NICT science cloud, which we have named OneSpaceNet (OSN), is the first open cloud system for scientists who are going to carry out informatics for their own science. The science cloud is not for simple uses; many functions are expected of it, such as data standardization, data collection and crawling, large and distributed data storage, security and reliability, databases and meta-databases, data stewardship, long-term data preservation, data rescue, data mining, parallel processing, data publication and provision, the semantic web, 3D and 4D visualization, outreach and in-reach, and capacity building. A schematic picture of the NICT science cloud (figure not shown here) shows that both observational and simulation data are stored in the cloud's storage system. Note that observational data are of two types: one comes from archive sites outside the cloud and is downloaded to it through the Internet, while the other comes from equipment directly connected to the science cloud, an arrangement often called a sensor cloud. In the present talk, we first introduce the NICT science cloud. We then demonstrate its efficiency, showing several scientific results that we achieved with this cloud system. Through these discussions and demonstrations, the potential of the science cloud for any research field will be revealed.
Optimisation of Critical Infrastructure Protection: The SiVe Project on Airport Security
NASA Astrophysics Data System (ADS)
Breiing, Marcus; Cole, Mara; D'Avanzo, John; Geiger, Gebhard; Goldner, Sascha; Kuhlmann, Andreas; Lorenz, Claudia; Papproth, Alf; Petzel, Erhard; Schwetje, Oliver
This paper outlines the scientific goals, ongoing work, and first results of the SiVe research project on critical infrastructure security. The methodology is generic, while pilot studies are chosen from airport security. The outline proceeds in three major steps: (1) building a threat scenario, (2) development of simulation models as scenario refinements, and (3) assessment of alternatives. Advanced techniques of systems analysis and simulation are employed to model relevant airport structures and processes as well as offences. Computer experiments are carried out to compare and optimise alternative solutions. The optimality analyses draw on approaches to quantitative risk assessment recently developed in the operational sciences. To exploit the advantages of the various techniques, an integrated simulation workbench is built up in the project.
ERIC Educational Resources Information Center
Odell, Bill
2005-01-01
The spaces and structures used for undergraduate science often work against new teaching methods and fail to provide environments that attract the brightest students to science. The undergraduate science building often offers little to inspire the imaginations of young minds. The typical undergraduate science building also tends to work against…
Research Institute for Advanced Computer Science: Annual Report October 1998 through September 1999
NASA Technical Reports Server (NTRS)
Leiner, Barry M.; Gross, Anthony R. (Technical Monitor)
1999-01-01
The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center (ARC). It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. ARC has been designated NASA's Center of Excellence in Information Technology. In this capacity, ARC is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA ARC and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities; (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply information technology research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, and visiting scientist programs, designed to encourage and facilitate collaboration between the university and NASA information technology research communities.
Research Institute for Advanced Computer Science
NASA Technical Reports Server (NTRS)
Gross, Anthony R. (Technical Monitor); Leiner, Barry M.
2000-01-01
The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center. It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. Ames has been designated NASA's Center of Excellence in Information Technology. In this capacity, Ames is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA Ames and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth; (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities; (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply information technology research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, and visiting scientist programs, designed to encourage and facilitate collaboration between the university and NASA information technology research communities.
The HARNESS Workbench: Unified and Adaptive Access to Diverse HPC Platforms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sunderam, Vaidy S.
2012-03-20
The primary goal of the Harness WorkBench (HWB) project is to investigate innovative software environments that help enhance the overall productivity of applications science on diverse HPC platforms. Two complementary frameworks were designed: one, a virtualized command toolkit for application building, deployment, and execution that provides a common view across diverse HPC systems, in particular the DOE leadership computing platforms (Cray, IBM, SGI, and clusters); and two, a unified runtime environment that consolidates access to runtime services via an adaptive framework for execution-time and post-processing activities. A prototype of the first was developed based on the concept of a 'system-call virtual machine' (SCVM), to enhance portability of the HPC application deployment process across heterogeneous high-end machines. The SCVM approach to portable builds is based on the insertion of toolkit-interpretable directives into original application build scripts. Modifications resulting from these directives preserve the semantics of the original build instruction flow. The execution of the build script is controlled by our toolkit, which intercepts build script commands in a manner transparent to the end-user. We have applied this approach to a scientific production code (Gamess-US) on the Cray XT5 machine. The second facet, termed Unibus, aims to facilitate provisioning and aggregation of multifaceted resources from resource providers' and end-users' perspectives. To achieve that, Unibus proposes a Capability Model and mediators (resource drivers) to virtualize access to diverse resources, and soft and successive conditioning to enable automatic and user-transparent resource provisioning. A proof-of-concept implementation has demonstrated the viability of this approach on high-end machines, grid systems, and computing clouds.
Final Report National Laboratory Professional Development Workshop for Underrepresented Participants
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taylor, Valerie
The 2013 CMD-IT National Laboratories Professional Development Workshop for Underrepresented Participants (CMD-IT NLPDev 2013) was held at the Oak Ridge National Laboratory campus in Oak Ridge, TN, from June 13-14, 2013. Sponsored by the Department of Energy (DOE) Advanced Scientific Computing Research Program, the primary goal of these workshops is to provide information about career opportunities in computational science at the various national laboratories and to mentor the underrepresented participants through community building and expert presentations focused on career success. This second annual workshop offered sessions to facilitate career advancement and, in particular, the strategies and resources needed to be successful at the national laboratories.
NASA Astrophysics Data System (ADS)
Moore, John W.
2000-05-01
The 1998 annual report of the Research Corporation (http://www.rescorp.org) contains fascinating reading for anyone with an interest in science education at private institutions. An article titled "The Midas Touch: Do Soaring Endowments Have Any Impact on College Science" concludes that "college science is seldom more than an incidental beneficiary of endowment resources, even when they are conspicuously plentiful." Written by Research Corporation director of communication W. Stevenson Bacon, the article reports on a survey of leading undergraduate institutions, dividing them between those with endowments above and below $300 million. The first surprise to me was that Harvard's endowment of $727,522 per full-time equivalent (FTE) student is exceeded by Grinnell's $760,404, and Yale's $612,015 per FTE student is far exceeded by Agnes Scott's $692,914 (much of it in Coca-Cola stock and somewhat restricted) and closely rivaled by Swarthmore's $608,955. Of the eleven institutions in the Research Corporation survey, seven were above $300,000 per FTE student and only four were below. Private-college endowments have soared along with a soaring stock market. The Research Corporation report asks whether this increased endowment income is helping colleges to provide improved education in the sciences. A major use of endowment income and gift funds is for construction of buildings. Seven of the eleven institutions surveyed had building programs under way or planned for the sciences, and three of the four remaining expected to stress science facilities in upcoming campaigns. In some cases new buildings are designed to support science effectively, but in others, according to Research Corporation Vice President Michael Doyle, "the building is an elegant shell without modern instrumentation or flexibility for future uses." New construction serves to make a campus attractive to prospective students who will bring in the tuition fees that support most of a college's budget. An "elegant shell" may serve this goal adequately, and science faculty need to become intimately involved in building plans to ensure that a building is well equipped, flexible, and safe (see page 547 regarding safety). There appears to be little correlation between endowment and support for those who carry out research with undergraduates. Expectations regarding hours spent in classrooms and laboratories seem to depend on tradition. Some institutions below the $300,000/FTE line provide teaching credit for time spent with undergraduate research students, while many above it do not. A positive development is that five of the eleven institutions surveyed are raising endowment funds specifically to support summer student-faculty research programs, with campaign goals in the range from $0.5 to $6 million. This is a trend that could profitably be extended to many more colleges, because there is clear evidence that undergraduate research experience is strongly correlated with the success of students who are potential scientists. Endowment funds are being used to support startup packages for new faculty, which are required to attract the best teachers and researchers. From the survey, packages appear to be in the range from $20,000 to $50,000, and there has been a tenfold increase over the past 15 years. Endowment also supports purchases of instruments, where matching funds are required by federal grants. However, it is not always easy to come up with matching funds for big-ticket items like NMRs.
Also, there is constant pressure to provide the latest in computer equipment, especially for use in teaching. Computers and other technology seem to become obsolete overnight, and maintaining facilities that will attract students who are more and more computer literate is an ongoing drain on endowment income. Recently, competition for the best students has begun to draw endowment income away from science departments. In addition to scholarships based on need, merit awards have become de rigueur. There appears to be a trend to offer to match the best scholarship package a really good student has been able to get from a competing institution. The average tuition and fees paid at most institutions are well below the advertised "sticker price", and the difference is being made up from endowment income and gifts. Two thoughts came to me as I read the Research Corporation report. First, private funding agencies, such as the Research Corporation and the Camille and Henry Dreyfus Foundation (which sponsored JCE's Viewpoints series), are uniquely positioned to influence science research and science education in this country. Their reports and activities provide perspectives and ideas that those of us in the trenches might otherwise be too busy to come up with. Second, science departments in undergraduate institutions have considerable control over their destinies. Quoting the report, "small endowments and even substandard facilities do not rule out vigorous science departments - or even necessarily impact morale, if faculty can see that good use is being made of available resources." I would turn this around. If we don't allow external, uncontrollable forces to get us down, and if we work hard at things that will make a difference, we can accomplish a lot, even with only a little money. The most important factor is what we do, and what attitudes and habits of mind we impart to our students. A college or university that is well endowed with human resources provides the best possible venue for learning.
NASA Astrophysics Data System (ADS)
Berres, A.; Karthik, R.; Nugent, P.; Sorokine, A.; Myers, A.; Pang, H.
2017-12-01
Building an integrated data infrastructure that can meet the needs of sustainable energy-water resource management requires a robust data management and geovisual analytics platform capable of cross-domain scientific discovery and knowledge generation. Such a platform can facilitate the investigation of diverse, complex research and policy questions for emerging priorities in Energy-Water Nexus (EWN) science areas. Using advanced data analytics, machine learning techniques, multi-dimensional statistical tools, and interactive geovisualization components, such a multi-layered federated platform, the Energy-Water Nexus Knowledge Discovery Framework (EWN-KDF), is being developed. The platform utilizes several enterprise-grade software design concepts and standards, such as extensible service-oriented architecture, open standard protocols, an event-driven programming model, an enterprise service bus, and adaptive user interfaces, to provide strategic value to the integrative computational and data infrastructure. EWN-KDF is built on the Compute and Data Environment for Science (CADES) environment at Oak Ridge National Laboratory (ORNL).
Design and Implementation of a Modern Automatic Deformation Monitoring System
NASA Astrophysics Data System (ADS)
Engel, Philipp; Schweimler, Björn
2016-03-01
The deformation monitoring of structures and buildings is an important task in modern engineering surveying, ensuring the stability and reliability of the monitored objects over long periods. Several commercial hardware and software solutions for the realization of such monitoring measurements are available on the market. In addition, a research team at the University of Applied Sciences in Neubrandenburg (NUAS) is actively developing a software package for monitoring purposes in geodesy and geotechnics, which is distributed under an open source licence and free of charge. The task of managing an open source project is well known in computer science, but it is fairly new in a geodetic context. This paper contributes to that issue by detailing applications, frameworks, and interfaces for the design and implementation of open hardware and software solutions for sensor control, sensor networks, and data management in automatic deformation monitoring. We also discuss how the development effort for networked applications can be reduced by using free programming tools, cloud computing technologies, and rapid prototyping methods.
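At the heart of any such monitoring system is a scheduled sensor-polling loop that persists timestamped observations. The sketch below shows that pattern generically; read_total_station is a hypothetical stub, since real deployments speak instrument-specific serial protocols and poll far less frequently:

```python
# Generic sensor-polling pattern: query an instrument on a schedule and
# append timestamped readings to a log. The instrument call is a stub.
import csv
import random
import time
from datetime import datetime, timezone

def read_total_station():
    """Hypothetical stub returning (horizontal angle, vertical angle, slope distance)."""
    return (random.random(), random.random(), 100 + random.random())

with open("observations.csv", "a", newline="") as f:
    writer = csv.writer(f)
    for _ in range(3):                  # three polling cycles for the demo
        hz, v, dist = read_total_station()
        writer.writerow([datetime.now(timezone.utc).isoformat(), hz, v, dist])
        time.sleep(1)                   # real systems poll on much longer intervals
```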
Toward a first-principles integrated simulation of tokamak edge plasmas
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chang, C S; Klasky, Scott A; Cummings, Julian
2008-01-01
Performance of ITER is anticipated to be highly sensitive to the edge plasma condition. The edge pedestal in ITER needs to be predicted from an integrated simulation of the necessary first-principles, multi-scale physics codes. The mission of the SciDAC Fusion Simulation Project (FSP) Prototype Center for Plasma Edge Simulation (CPES) is to deliver such a code integration framework by (1) building new kinetic codes XGC0 and XGC1, which can simulate the edge pedestal buildup; (2) using and improving the existing MHD codes ELITE, M3D-OMP, M3D-MPP, and NIMROD for the study of large-scale edge instabilities called Edge Localized Modes (ELMs); and (3) integrating the codes into a framework using cutting-edge computer science technology. Collaborative effort among physics, computer science, and applied mathematics within CPES has created the first working version of the End-to-end Framework for Fusion Integrated Simulation (EFFIS), which can be used to study the pedestal-ELM cycles.
Global information infrastructure.
Lindberg, D A
1994-01-01
The High Performance Computing and Communications Program (HPCC) is a multiagency federal initiative under the leadership of the White House Office of Science and Technology Policy, established by the High Performance Computing Act of 1991. It has been assigned a critical role in supporting the international collaboration essential to science and to health care. Goals of the HPCC are to extend USA leadership in high performance computing and networking technologies; to improve technology transfer for economic competitiveness, education, and national security; and to provide a key part of the foundation for the National Information Infrastructure. The first component of the National Institutes of Health to participate in the HPCC, the National Library of Medicine (NLM), recently issued a solicitation for proposals to address a range of issues, from privacy to 'testbed' networks, 'virtual reality,' and more. These efforts will build upon the NLM's extensive outreach program and other initiatives, including the Unified Medical Language System (UMLS), MEDLARS, and Grateful Med. New Internet search tools are emerging, such as Gopher and 'Knowbots'. Medicine will succeed in developing future intelligent agents to assist in utilizing computer networks. Our ability to serve patients is so often restricted by lack of information and knowledge at the time and place of medical decision-making. The new technologies, properly employed, will also greatly enhance our ability to serve the patient.
The ACI-REF Program: Empowering Prospective Computational Researchers
NASA Astrophysics Data System (ADS)
Cuma, M.; Cardoen, W.; Collier, G.; Freeman, R. M., Jr.; Kitzmiller, A.; Michael, L.; Nomura, K. I.; Orendt, A.; Tanner, L.
2014-12-01
The ACI-REF program, Advanced Cyberinfrastructure - Research and Education Facilitation, represents a consortium of academic institutions seeking to further advance the capabilities of their respective campus research communities through an extension of the personal connections and educational activities that underlie the unique and often specialized cyberinfrastructure at each institution. This consortium currently includes Clemson University, Harvard University, University of Hawai'i, University of Southern California, University of Utah, and University of Wisconsin. Working together in a coordinated effort, the consortium is dedicated to the adoption of models and strategies which leverage the expertise and experience of its members with a goal of maximizing the impact of each institution's investment in research computing. The ACI-REFs (facilitators) are tasked with making connections and building bridges between the local campus researchers and the many different providers of campus, commercial, and national computing resources. Through these bridges, ACI-REFs assist researchers from all disciplines in understanding their computing and data needs and in mapping these needs to existing capabilities or providing assistance with development of these capabilities. From the Earth sciences perspective, we will give examples of how this assistance improved methods and workflows in geophysics, geography and atmospheric sciences. We anticipate that this effort will expand the number of researchers who become self-sufficient users of advanced computing resources, allowing them to focus on making research discoveries in a more timely and efficient manner.
NASA Technical Reports Server (NTRS)
Srivastava, Deepak; Meyyappan, Meyya; Yan, Jerry (Technical Monitor)
2000-01-01
Advanced miniaturization, a key thrust area to enable new science and exploration missions, provides ultrasmall sensors, power sources, communication, navigation, and propulsion systems with very low mass, volume, and power consumption. Revolutions in electronics and computing will allow reconfigurable, autonomous, 'thinking' spacecraft. Nanotechnology presents a whole new spectrum of opportunities to build device components and systems for entirely new space architectures: (1) networks of ultrasmall probes on planetary surfaces; (2) micro-rovers that drive, hop, fly, and burrow; and (3) collections of microspacecraft making a variety of measurements.
Final Technical Report - Center for Technology for Advanced Scientific Component Software (TASCS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sussman, Alan
2014-10-21
This is a final technical report for the University of Maryland work in the SciDAC Center for Technology for Advanced Scientific Component Software (TASCS). The Maryland work focused on software tools for coupling parallel software components built using the Common Component Architecture (CCA) APIs. Those tools are based on the Maryland InterComm software framework that has been used in multiple computational science applications to build large-scale simulations of complex physical systems that employ multiple separately developed codes.
Selin, Cynthia; Rawlings, Kelly Campbell; de Ridder-Vignone, Kathryn; Sadowski, Jathan; Altamirano Allende, Carlo; Gano, Gretchen; Davies, Sarah R; Guston, David H
2017-08-01
Public engagement with science and technology is now widely used in science policy and communication. Though touted as a means of enhancing democratic discussion of science and technology, public engagement has often been shown by analysts to be only weakly tied to scientific governance. In this article, we suggest that the notion of capacity building might be a way of reframing the democratic potential of public engagement with science and technology activities. Drawing on literatures from public policy and administration, we outline how public engagement with science and technology might build citizen capacity, before using the notion of capacity building to develop five principles for the design of public engagement with science and technology. We demonstrate the use of these principles through a discussion of the development and realization of the pilot for a large-scale public engagement with science and technology activity, the Futurescape City Tours, which was carried out in Arizona in 2012.
Applying science and mathematics to big data for smarter buildings.
Lee, Young M; An, Lianjun; Liu, Fei; Horesh, Raya; Chae, Young Tae; Zhang, Rui
2013-08-01
Many buildings are now collecting a large amount of data on operations, energy consumption, and activities through systems such as a building management system (BMS), sensors, and meters (e.g., submeters and smart meters). However, the majority of data are not utilized and are thrown away. Science and mathematics can play an important role in utilizing these big data and accurately assessing how energy is consumed in buildings and what can be done to save energy, make buildings energy efficient, and reduce greenhouse gas (GHG) emissions. This paper discusses an analytical tool that has been developed to assist building owners, facility managers, operators, and tenants of buildings in assessing, benchmarking, diagnosing, tracking, forecasting, and simulating energy consumption in building portfolios.
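As a concrete flavor of the kind of analysis such a tool performs, the sketch below fits a weather-normalized baseline to daily meter readings and flags days that deviate from it. This is a minimal illustration in Python, not the authors' tool; the data and the quadratic-baseline choice are assumptions made for the example.

```python
# Illustrative sketch (not the paper's tool): fit a simple weather-normalized
# baseline for daily building energy use, then flag days that deviate from it.
import numpy as np

# Hypothetical daily data: outdoor temperature (degrees C) and metered energy (kWh).
temps = np.array([2.0, 5.5, 9.0, 14.0, 18.5, 23.0, 27.5])
energy = np.array([410.0, 380.0, 350.0, 330.0, 345.0, 380.0, 430.0])

# A quadratic fit captures the U-shape created by heating and cooling demand.
coeffs = np.polyfit(temps, energy, deg=2)
baseline = np.polyval(coeffs, temps)

residuals = energy - baseline
threshold = 2.0 * residuals.std()
for t, e, r in zip(temps, energy, residuals):
    flag = "ANOMALY" if abs(r) > threshold else "ok"
    print(f"{t:5.1f} C  {e:6.1f} kWh  residual {r:+6.1f}  {flag}")
```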
Verifying a computational method for predicting extreme ground motion
Harris, R.A.; Barall, M.; Andrews, D.J.; Duan, B.; Ma, S.; Dunham, E.M.; Gabriel, A.-A.; Kaneko, Y.; Kase, Y.; Aagaard, Brad T.; Oglesby, D.D.; Ampuero, J.-P.; Hanks, T.C.; Abrahamson, N.
2011-01-01
In situations where seismological data is rare or nonexistent, computer simulations may be used to predict ground motions caused by future earthquakes. This is particularly practical in the case of extreme ground motions, where engineers of special buildings may need to design for an event that has not been historically observed but which may occur in the far-distant future. Once the simulations have been performed, however, they still need to be tested. The SCEC-USGS dynamic rupture code verification exercise provides a testing mechanism for simulations that involve spontaneous earthquake rupture. We have performed this examination for the specific computer code that was used to predict maximum possible ground motion near Yucca Mountain. Our SCEC-USGS group exercises have demonstrated that the specific computer code that was used for the Yucca Mountain simulations produces similar results to those produced by other computer codes when tackling the same science problem. We also found that the 3D ground motion simulations produced smaller ground motions than the 2D simulations.
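The sketch below illustrates, on assumed synthetic data, the kind of quantitative cross-code comparison such verification exercises rely on: an RMS misfit between two codes' time series for the same benchmark. It is not the SCEC-USGS exercise's actual metric suite, which also compares quantities such as rupture times and slip.

```python
# Minimal sketch of a cross-code verification check: quantify how closely two
# rupture codes agree on the same benchmark via an RMS misfit between their
# ground-velocity time series. The "code_a" and "code_b" series are synthetic
# stand-ins, not output from the actual simulation codes.
import numpy as np

t = np.linspace(0.0, 10.0, 1001)  # seconds
code_a = np.exp(-(t - 4.0) ** 2) * np.sin(8 * t)
code_b = code_a + 0.02 * np.random.default_rng(0).standard_normal(t.size)

rms_misfit = np.sqrt(np.mean((code_a - code_b) ** 2))
rel_misfit = rms_misfit / np.sqrt(np.mean(code_a ** 2))
print(f"RMS misfit: {rms_misfit:.4f}  (relative: {rel_misfit:.2%})")
```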
Expertise transfer for expert system design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boose, J.H.
This book is about the Expertise Transfer System, a computer program which interviews experts and helps them build expert systems, i.e., computer programs that use knowledge from experts to make decisions and judgments under conditions of uncertainty. The techniques are useful to anyone who uses decision-making information based on the expertise of others. The methods can also be applied to personal decision-making. The interviewing methodology is borrowed from a branch of psychology called Personal Construct Theory. It is not necessary to use a computer to take advantage of the techniques from Personal Construct Theory; the fundamental procedures used by the Expertise Transfer System can be performed using paper and pencil. It is not necessary that the reader understand very much about computers to understand the ideas in this book. The few relevant concepts from computer science and expert systems that are needed are explained in a straightforward manner. Ideas from Personal Construct Psychology are also introduced as needed.
Allen Newell's Program of Research: The Video-Game Test.
Gobet, Fernand
2017-04-01
Newell (1973) argued that progress in psychology was slow because research focused on experiments trying to answer binary questions, such as serial versus parallel processing. In addition, not enough attention was paid to the strategies used by participants, and there was a lack of theories implemented as computer models with sufficient precision to be tested rigorously. He proposed a three-headed research program: to develop computational models able to carry out the tasks they aim to explain; to study one complex task in detail, such as chess; and to build computational models that can account for multiple tasks. This article assesses the extent to which the papers in this issue advance Newell's program. While half of the papers devote much attention to strategies, several papers still average across them, a capital sin according to Newell. The three courses of action he proposed were not popular in these papers: only two papers used computational models, with no model being both able to carry out the task and to account for human data; there was no systematic analysis of a specific video game; and no paper proposed a computational model accounting for human data in several tasks. It is concluded that, while they use sophisticated methods of analysis and discuss interesting results, overall these papers contribute little to Newell's program of research. In this respect, they reflect the current state of psychology and cognitive science. This is a shame, as Newell's ideas might help address the current crisis of lack of replication and fraud in psychology.
CE-ACCE: The Cloud Enabled Advanced sCience Compute Environment
NASA Astrophysics Data System (ADS)
Cinquini, L.; Freeborn, D. J.; Hardman, S. H.; Wong, C.
2017-12-01
Traditionally, Earth Science data from NASA remote sensing instruments has been processed by building custom data processing pipelines (often based on a common workflow engine or framework) which are typically deployed and run on an internal cluster of computing resources. This approach has some intrinsic limitations: it requires each mission to develop and deploy a custom software package on top of the adopted framework; it makes use of dedicated hardware, network and storage resources, which must be specifically purchased, maintained and re-purposed at mission completion; and computing services cannot be scaled on demand beyond the capability of the available servers. More recently, the rise of Cloud computing, coupled with other advances in containerization technology (most prominently, Docker) and micro-services architecture, has enabled a new paradigm, whereby space mission data can be processed through standard system architectures, which can be seamlessly deployed and scaled on demand on either on-premise clusters, or commercial Cloud providers. In this talk, we will present one such architecture named CE-ACCE ("Cloud Enabled Advanced sCience Compute Environment"), which we have been developing at the NASA Jet Propulsion Laboratory over the past year. CE-ACCE is based on the Apache OODT ("Object Oriented Data Technology") suite of services for full data lifecycle management, which are turned into a composable array of Docker images, and complemented by a plug-in model for mission-specific customization. We have applied this infrastructure to both flying and upcoming NASA missions, such as ECOSTRESS and SMAP, and demonstrated deployment on the Amazon Cloud, either using simple EC2 instances, or advanced AWS services such as Amazon Lambda and ECS (EC2 Container Services).
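To make the containerization idea concrete, here is a hedged sketch using the Docker SDK for Python to run one processing stage as a disposable container. The image name and command line are hypothetical placeholders, not actual CE-ACCE or OODT artifacts.

```python
# Hedged sketch: launching one containerized processing step with the Docker
# SDK for Python (pip install docker). The image name and command below are
# hypothetical placeholders, not real CE-ACCE/OODT images.
import docker

client = docker.from_env()

# Run a pipeline stage as a disposable container; with detach=False the call
# blocks until the container exits and returns its stdout as bytes.
logs = client.containers.run(
    image="example/science-pge:latest",        # hypothetical processing image
    command=["process", "--granule", "g001"],  # hypothetical CLI
    remove=True,                               # clean up the container on exit
)
print(logs.decode())
```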
NASA Astrophysics Data System (ADS)
Mazingo, Diann Etsuko
Feedback has been identified as a key variable in developing academic self-efficacy. The types of feedback can vary from a traditional, objectivist approach that focuses on minimizing learner errors to a more constructivist approach focusing on facilitating understanding. The influx of computer-based courses, whether online or through a series of computer-assisted instruction (CAI) modules, requires that current research on effective feedback techniques in the classroom be extended to computer environments in order to inform their instructional design. In this study, exposure to different types of feedback during a chemistry CAI module was studied in relation to science self-efficacy (SSE) and performance on an objective-driven assessment (ODA) of the chemistry concepts covered in the unit. The quantitative analysis consisted of two separate ANCOVAs on the dependent variables, using pretest as the covariate and group as the fixed factor. No significant differences were found between the three groups on adjusted posttest means for either the ODA or the SSE measure (F(2, 106) = 1.311, p = 0.274 and F(2, 106) = 1.080, p = 0.344, respectively). However, a mixed methods approach yielded valuable qualitative insights into why only one overall quantitative effect was observed. These findings are discussed in relation to the need to further refine the instruments and methods used in order to more fully explore the possibility that type of feedback might play a role in developing SSE and, consequently, improve academic performance in science. Future research building on this study may reveal significance that could impact instructional design practices for developing online and computer-based instruction.
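For readers unfamiliar with the analysis pattern, the following sketch re-creates it on synthetic data: an ANCOVA of posttest scores with pretest as covariate and feedback group as fixed factor, via statsmodels. The group names and simulated scores are assumptions, not the study's data.

```python
# Illustrative re-creation of the study's analysis pattern (not its data):
# ANCOVA on posttest scores with pretest as covariate and feedback group as
# the fixed factor, using statsmodels.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(42)
n = 36
df = pd.DataFrame({
    "group": np.repeat(["objectivist", "constructivist", "control"], n // 3),
    "pretest": rng.normal(50, 10, n),
})
# Simulated posttest with a pretest effect but no group effect.
df["posttest"] = 0.8 * df["pretest"] + rng.normal(10, 5, n)

model = smf.ols("posttest ~ pretest + C(group)", data=df).fit()
print(anova_lm(model, typ=2))  # F and p for group, adjusted for pretest
```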
Meteor Observations as Big Data Citizen Science
NASA Astrophysics Data System (ADS)
Gritsevich, M.; Vinkovic, D.; Schwarz, G.; Nina, A.; Koschny, D.; Lyytinen, E.
2016-12-01
Meteor science represents an excellent example of the citizen science project, where progress in the field has been largely determined by amateur observations. Over the last couple of decades technological advancements in observational techniques have yielded drastic improvements in the quality, quantity and diversity of meteor data, while even more ambitious instruments are about to become operational. This empowers meteor science to boost its experimental and theoretical horizons and seek more advanced scientific goals. We review some of the developments that push meteor science into the Big Data era that requires more complex methodological approaches through interdisciplinary collaborations with other branches of physics and computer science. We argue that meteor science should become an integral part of large surveys in astronomy, aeronomy and space physics, and tackle the complexity of micro-physics of meteor plasma and its interaction with the atmosphere. The recent increased interest in meteor science triggered by the Chelyabinsk fireball helps in building the case for technologically and logistically more ambitious meteor projects. This requires developing new methodological approaches in meteor research, with Big Data science and close collaboration between citizen science, geoscience and astronomy as critical elements. We discuss possibilities for improvements and promote an opportunity for collaboration in meteor science within the currently established BigSkyEarth http://bigskyearth.eu/ network.
Cheyney University Curriculum and Infrastructure Enhancement in STEM
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eva, Sakkar Ara
Cheyney University is the oldest historically Black educational institution in America. Initially established as a "normal" school emphasizing the matriculation of educators, Cheyney has become a comprehensive university, one of 14 state universities comprising the Pennsylvania State System of Higher Education (PASSHE). Cheyney University graduates still become teachers, but they also enter such fields as journalism, medicine, science, mathematics, law, communication, and government. Cheyney University is a small state-owned HBCU with very limited resources. At present the university has about a thousand students, with 15% in STEM. The CUCIES II grant made a significant contribution to saving the computer science program from being discontinued. The grant enabled the university to hire a temporary faculty member to teach in and update the computer science program. The program is enhanced with three tracks: cyber security, human-computer interaction, and general. The updated and enhanced computer science program will prepare professionals in the area of computer science with the knowledge, skills, and professional ethics needed for the current market. The new curriculum was developed for a professional profile focused on the technologies and techniques currently used in industry. With faculty on board, the university worked with the department to bring the computer science program back from moratorium. Once on the path to being discontinued and losing students, the program is now growing: enrollment has increased from 12 to 30 students, and the university is in the process of hiring a tenure-track faculty member in computer science. Another product of the grant is a proposal for an introductory course in nanotechnology, intended to generate interest in the nanotechnology field. The Natural and Applied Science department, which houses all of the STEM programs at Cheyney University, is currently working to bring the environmental science program back from moratorium. The university has been working to improve minority participation in STEM and has made significant strides in progressing students toward graduate programs and into the professoriate. This success is due to faculty mentors who work closely with students, guiding them through the application processes for research internships and graduate programs; it is also due to the university forming collaborative agreements with research-intensive institutions, federal and state agencies, and industry. The grant assisted in recruiting and retaining students in STEM by offering tuition scholarships, research scholarships, and travel awards. Faculty professional development was supported through funded travel to conferences, meetings, and webinars. Like many HBCUs, Cheyney University is trying to do more with less; because STEM programs are inherently expensive, they suffer most when resources are scarce. One goal of Cheyney University's strategic plan is to strengthen STEM programs consistent with the critical skill needs of the Department of Energy. All of the Cheyney University STEM programs are now located in the new science building funded by the state of Pennsylvania.
Supporting Teachers to Automatically Build Accessible Pedagogical Resources: The APEINTA Project
NASA Astrophysics Data System (ADS)
Iglesias, Ana; Moreno, Lourdes; Jiménez, Javier
Most of the universities in Europe have started their process of adaptation towards a common educational space according to the European Higher Education Area (EHEA). The social dimension of the Bologna Process is a constituent part of the EHEA and a necessary condition for its attractiveness and competitiveness. Two of the main features of the social dimension are equal access for all students and lifelong learning. One of the main problems of the adaptation process to the EHEA is that teachers have no previous references and models for developing new pedagogical experiences accessible to all students, regardless of their abilities, capabilities, or accessibility needs. The APEINTA project presented in this paper can be used as a helpful tool for teachers to cope with the teaching demands of the EHEA, helping them to automatically build accessible pedagogical resources even when they are not accessibility experts. This educational project was successfully used in 2009 in two different degree programs at the Carlos III University of Madrid: Computer Science, and Library and Information Science.
Biotic games and cloud experimentation as novel media for biophysics education
NASA Astrophysics Data System (ADS)
Riedel-Kruse, Ingmar; Blikstein, Paulo
2014-03-01
First-hand, open-ended experimentation is key for effective formal and informal biophysics education. We developed, tested, and assessed multiple new platforms that enable students and children to directly interact with and learn about microscopic biophysical processes: (1) biotic games that enable local and online play using galvano- and phototactic stimulation of micro-swimmers, illustrating concepts such as biased random walks, low Reynolds number hydrodynamics, and Brownian motion; (2) an undergraduate course where students learn optics, electronics, microfluidics, real-time image analysis, and instrument control by building biotic games; and (3) a graduate class on the biophysics of multi-cellular systems that contains a cloud experimentation lab enabling students to run open-ended chemotaxis experiments on slime molds online, analyze their data, and build biophysical models. Our work aims to generate the equivalent excitement and educational impact for biophysics as robotics and video games have had for mechatronics and computer science, respectively. We also discuss how scaled-up cloud experimentation systems can support MOOCs with true lab components and life-science research in general.
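A minimal sketch of the biased-random-walk concept these games illustrate: a 2-D walker whose random steps are nudged toward a light source, loosely mimicking a phototactic micro-swimmer. All parameters are arbitrary choices made for the illustration, not values from the platforms themselves.

```python
# Toy biased random walk: a walker whose random steps are nudged toward a
# light source, loosely mimicking a phototactic micro-swimmer. With bias=0
# the motion reduces to a pure (Brownian-like) random walk.
import numpy as np

rng = np.random.default_rng(1)
light = np.array([10.0, 0.0])  # stimulus position
pos = np.zeros(2)
bias = 0.2                      # 0 = pure random walk, 1 = straight to light

for step in range(500):
    random_dir = rng.normal(size=2)
    random_dir /= np.linalg.norm(random_dir)
    to_light = light - pos
    to_light /= np.linalg.norm(to_light)
    pos += (1 - bias) * random_dir + bias * to_light

print(f"final position after 500 biased steps: {pos.round(2)}")
```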
SRT Status and Plans for Version-7
NASA Technical Reports Server (NTRS)
Susskind, Joel; Blaisdell, John; Iredell, Lena; Kouvaris, Louis
2015-01-01
The AIRS Science Team Version-6 retrieval algorithm is currently producing level-3 Climate Data Records (CDRs) from AIRS that have proven useful to scientists in understanding climate processes. CDRs are gridded level-3 products which include all cases passing AIRS Climate QC. SRT has made significant further improvements to AIRS Version-6, and research is continuing at SRT toward the development of AIRS Version-7. At the last Science Team Meeting, we described results using SRT AIRS Version-6.19. SRT Version-6.19 is now an official build at JPL called 6.2. SRT's latest version is AIRS Version-6.22, which we have also adapted to run with CrIS/ATMS. AIRS Version-6.22 and CrIS Version-6.22 both now run on JPL computers, but are not yet official builds. The main reason for finalizing Version-7, and using it in the relatively near future for the processing and reprocessing of AIRS data, is to produce even better CDRs for use by climate scientists. For this reason, all results shown in this talk use only AIRS Climate QC.
A cognitive computational model inspired by the immune system response.
Abdo Abd Al-Hady, Mohamed; Badr, Amr Ahmed; Mostafa, Mostafa Abd Al-Azim
2014-01-01
The immune system has a cognitive ability to differentiate between healthy and unhealthy cells. The immune system response (ISR) is stimulated by a disorder in the temporary fuzzy state that oscillates between the healthy and unhealthy states. Modeling the immune system is an enormous challenge, however; the paper therefore introduces an extensive summary of how the immune system response functions, as an overview of a complex topic, to present the immune system as a cognitive intelligent agent. The homogeneity and perfection of the natural immune system have always stood out as the sought-after model we attempted to imitate while building our proposed model of cognitive architecture. The paper divides the ISR into four logical phases: setting a computational architectural diagram for each phase, proceeding from functional perspectives (input, process, and output), and their consequences. The proposed architecture components are defined by matching biological operations with computational functions, and hence with the framework of the paper. The architecture also focuses on the interoperability of the main theoretical immunological perspectives (classic, cognitive, and danger theory) as related to computer science terminology. The paper presents a descriptive model of the immune system, to figure out the nature of the response, deemed to be intrinsic for building a hybrid computational model based on a cognitive intelligent agent perspective and inspired by natural biology. To that end, the paper highlights the ISR phases as applied to a case study on the hepatitis C virus, while illustrating our proposed architecture perspective. PMID:25003131
Science Support: The Building Blocks of Active Data Curation
NASA Astrophysics Data System (ADS)
Guillory, A.
2013-12-01
While the scientific method is built on reproducibility and transparency, and results are published in peer-reviewed literature, we have come to the digital age of very large datasets (now of the order of petabytes and soon exabytes) which cannot be published in the traditional way. To preserve reproducibility and transparency, active curation is necessary to keep and protect the information in the long term, and 'science support' activities provide the building blocks for active data curation. With the explosive growth of data in all fields in recent years, there is a pressing need for data centres to provide adequate services to ensure long-term preservation and digital curation of project data outputs, however complex those may be. Science support provides advice and support to science projects on data and information management, from file formats through to general data management awareness. Another purpose of science support is to raise awareness in the science community of data and metadata standards and best practice, engendering a culture where data outputs are seen as valued assets. At the heart of science support is the Data Management Plan (DMP), which sets out a coherent approach to data issues pertaining to the data-generating project. It provides an agreed record of the data management needs and issues within the project. The DMP is agreed upon with project investigators to ensure that a high-quality, documented data archive is created. It includes conditions of use and deposit to clearly express the ownership, responsibilities, and rights associated with the data. Project-specific needs are also identified for data processing, visualization tools, and data sharing services. As part of the National Centre for Atmospheric Science (NCAS) and National Centre for Earth Observation (NCEO), the Centre for Environmental Data Archival (CEDA) fulfills this science support role, facilitating atmospheric and Earth observation data-generating projects to ensure successful management of the data and accompanying information for reuse and repurposing. Specific examples at CEDA include science support provided to FAAM (Facility for Airborne Atmospheric Measurements) aircraft campaigns and large-scale modelling projects such as UPSCALE, the largest ever PRACE (Partnership for Advanced Computing in Europe) computational project, dependent on CEDA to provide the high-performance storage, transfer capability, and data analysis environment on the 'super-data-cluster' JASMIN. The impact of science support on scientific research is conspicuous: better-documented datasets with a growing collection of metadata associated with the archived data, easier data sharing through the use of standards for formats and metadata, and data citation. These establish a high quality of data management, ensuring long-term preservation and enabling reuse by peer scientists, which ultimately leads to faster-paced progress in science.
Fault-tolerant computer study. [logic designs for building block circuits
NASA Technical Reports Server (NTRS)
Rennels, D. A.; Avizienis, A. A.; Ercegovac, M. D.
1981-01-01
A set of building block circuits is described which can be used with commercially available microprocessors and memories to implement fault-tolerant distributed computer systems. Each building block circuit is intended for VLSI implementation as a single chip. Several building blocks and associated processor and memory chips form a self-checking computer module with self-contained input/output and interfaces to redundant communications buses. Fault tolerance is achieved by connecting self-checking computer modules into a redundant network in which backup buses and computer modules are provided to circumvent failures. The requirements and design methodology which led to the definition of the building block circuits are discussed.
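As a conceptual illustration of the redundancy principle behind such designs (not the report's actual building-block circuits), the following sketch votes three module outputs bitwise so that any single faulty module is outvoted.

```python
# Conceptual sketch of 2-of-3 majority voting, the classic mechanism behind
# fault-tolerant redundancy: three module outputs are voted bitwise, so any
# single faulty module is masked. This illustrates the principle only, not
# the building-block circuits described in the report.
def majority_vote(a: int, b: int, c: int) -> int:
    """Bitwise 2-of-3 majority of three word-sized outputs."""
    return (a & b) | (a & c) | (b & c)

good = 0b10110100
faulty = 0b01001011  # one module producing a corrupted word
assert majority_vote(good, good, faulty) == good  # faulty module outvoted
print(bin(majority_vote(good, good, faulty)))
```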
77 FR 61771 - National Institute of Environmental Health Sciences; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-11
... applications. Place: National Institute of Environmental Health Sciences, Building 101, Rodbell Auditorium, 111... Sciences, Building 101, Rodbell Auditorium, 111 T. W. Alexander Drive, Research Triangle Park, NC 27709...
Project Mapping to Build Capacity and Demonstrate Impact in the Earth Sciences
NASA Astrophysics Data System (ADS)
Hemmings, S. N.; Searby, N. D.; Murphy, K. J.; Mataya, C. J.; Crepps, G.; Clayton, A.; Stevens, C. L.
2017-12-01
Diverse organizations are increasingly using project mapping to communicate location-based information about their activities. NASA's Earth Science Division (ESD), through the Earth Science Data Systems and Applied Sciences' Capacity Building Program (CBP), has created a geographic information system of all ESD projects to support internal program management for the agency. The CBP's NASA DEVELOP program has built an interactive mapping tool to support capacity building for the program's varied constituents. This presentation will explore the types of programmatic opportunities provided by a geographic approach to management, communication, and strategic planning. We will also discuss the various external benefits that mapping supports and that build capacity in the Earth sciences. These include activities such as project matching (location-focused synergies), portfolio planning, inter- and intra-organizational collaboration, science diplomacy, and basic impact analysis.
Can the Computer Design a School Building?
ERIC Educational Resources Information Center
Roberts, Charles
The implications of computer technology and architecture are discussed with reference to school building design. A brief introduction is given of computer applications in other fields leading to the conclusions that computers alone cannot design school buildings but may serve as a useful tool in the overall design process. Specific examples are…
Relevancy in Problem Solving: A Computational Framework
ERIC Educational Resources Information Center
Kwisthout, Johan
2012-01-01
When computer scientists discuss the computational complexity of, for example, finding the shortest path from building A to building B in some town or city, their starting point typically is a formal description of the problem at hand, e.g., a graph with weights on every edge where buildings correspond to vertices, routes between buildings to…
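The formalization described here can be made concrete with a short Dijkstra implementation over a toy building graph; the graph and its weights below are invented for the illustration.

```python
# A concrete instance of the abstract's formalization: buildings as vertices,
# routes as weighted edges, and Dijkstra's algorithm for the shortest path
# from building A to building B. The graph and weights are invented.
import heapq

graph = {
    "A": [("C", 2), ("D", 5)],
    "C": [("A", 2), ("D", 1), ("B", 6)],
    "D": [("A", 5), ("C", 1), ("B", 2)],
    "B": [],
}

def dijkstra(start: str, goal: str) -> float:
    dist = {start: 0.0}
    queue = [(0.0, start)]
    while queue:
        d, node = heapq.heappop(queue)
        if node == goal:
            return d
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        for nbr, w in graph[node]:
            nd = d + w
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                heapq.heappush(queue, (nd, nbr))
    return float("inf")

print(dijkstra("A", "B"))  # A -> C -> D -> B: 2 + 1 + 2 = 5
```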
DOE Office of Scientific and Technical Information (OSTI.GOV)
Langer, S; Rotman, D; Schwegler, E
The Institutional Computing Executive Group (ICEG) review of FY05-06 Multiprogrammatic and Institutional Computing (M and IC) activities is presented in the attached report. In summary, we find that the M and IC staff does an outstanding job of acquiring and supporting a wide range of institutional computing resources to meet the programmatic and scientific goals of LLNL. The responsiveness and high quality of support given to users and the programs investing in M and IC reflects the dedication and skill of the M and IC staff. M and IC has successfully managed serial capacity, parallel capacity, and capability computing resources. Serial capacity computing supports a wide range of scientific projects which require access to a few high performance processors within a shared memory computer. Parallel capacity computing supports scientific projects that require a moderate number of processors (up to roughly 1000) on a parallel computer. Capability computing supports parallel jobs that push the limits of simulation science. M and IC has worked closely with Stockpile Stewardship, and together they have made LLNL a premier institution for computational and simulation science. Such a standing is vital to the continued success of laboratory science programs and to the recruitment and retention of top scientists. This report provides recommendations to build on M and IC's accomplishments and improve simulation capabilities at LLNL. We recommend that the institution fully fund (1) operation of the atlas cluster purchased in FY06 to support a few large projects; (2) operation of the thunder and zeus clusters to enable 'mid-range' parallel capacity simulations during normal operation and a limited number of large simulations during dedicated application time; (3) operation of the new yana cluster to support a wide range of serial capacity simulations; (4) improvements to the reliability and performance of the Lustre parallel file system; (5) support for the new GDO petabyte-class storage facility on the green network for use in data intensive external collaborations; and (6) continued support for visualization and other methods for analyzing large simulations. We also recommend that M and IC begin planning in FY07 for the next upgrade of its parallel clusters. LLNL investments in M and IC have resulted in a world-class simulation capability leading to innovative science. We thank the LLNL management for its continued support and thank the M and IC staff for its vision and dedicated efforts to make it all happen.
Health sciences library building projects, 1998 survey.
Bowden, V M
1999-01-01
Twenty-eight health sciences library building projects are briefly described, including twelve new buildings and sixteen additions, remodelings, and renovations. The libraries range in size from 2,144 square feet to 190,000 gross square feet. Twelve libraries are described in detail. These include three hospital libraries, one information center sponsored by ten institutions, and eight academic health sciences libraries. PMID:10550027
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barker, Ashley D.; Bernholdt, David E.; Bland, Arthur S.
Oak Ridge National Laboratory’s (ORNL’s) Leadership Computing Facility (OLCF) continues to surpass its operational target goals: supporting users; delivering fast, reliable systems; creating innovative solutions for high-performance computing (HPC) needs; and managing risks, safety, and security aspects associated with operating one of the most powerful computers in the world. The results can be seen in the cutting-edge science delivered by users and the praise from the research community. Calendar year (CY) 2015 was filled with outstanding operational results and accomplishments: a very high rating from users on overall satisfaction that ties the highest-ever mark set in CY 2014; the greatest number of core-hours delivered to research projects; the largest percentage of capability usage since the OLCF began tracking the metric in 2009; and success in delivering on the allocation of 60, 30, and 10% of core hours offered for the INCITE (Innovative and Novel Computational Impact on Theory and Experiment), ALCC (Advanced Scientific Computing Research Leadership Computing Challenge), and Director’s Discretionary programs, respectively. These accomplishments, coupled with the extremely high utilization rate, represent the fulfillment of the promise of Titan: maximum use by maximum-size simulations. The impact of all of these successes and more is reflected in the accomplishments of OLCF users, with publications this year in notable journals Nature, Nature Materials, Nature Chemistry, Nature Physics, Nature Climate Change, ACS Nano, Journal of the American Chemical Society, and Physical Review Letters, as well as many others. The achievements included in the 2015 OLCF Operational Assessment Report reflect first-ever or largest simulations in their communities; for example, Titan enabled engineers in Los Angeles and the surrounding region to design and begin building improved critical infrastructure by enabling the highest-resolution Cybershake map for Southern California to date. The Titan system provides the largest extant heterogeneous architecture for computing and computational science. Usage is high, delivering on the promise of a system well-suited for capability simulations for science. This success is due in part to innovations in tracking and reporting the activity on the compute nodes, and using this information to further enable and optimize applications, extending and balancing workload across the entire node. The OLCF continues to invest in innovative processes, tools, and resources necessary to meet continuing user demand. The facility’s leadership in data analysis and workflows was featured at the Department of Energy (DOE) booth at SC15, for the second year in a row, highlighting work with researchers from the National Library of Medicine coupled with unique computational and data resources serving experimental and observational data across facilities. Effective operations of the OLCF play a key role in the scientific missions and accomplishments of its users. Building on the exemplary year of 2014, as shown by the 2014 Operational Assessment Report (OAR) review committee response in Appendix A, this OAR delineates the policies, procedures, and innovations implemented by the OLCF to continue delivering a multi-petaflop resource for cutting-edge research. This report covers CY 2015, which, unless otherwise specified, denotes January 1, 2015, through December 31, 2015.
Building the biomedical data science workforce.
Dunn, Michelle C; Bourne, Philip E
2017-07-01
This article describes efforts at the National Institutes of Health (NIH) from 2013 to 2016 to train a national workforce in biomedical data science. We provide an analysis of the Big Data to Knowledge (BD2K) training program strengths and weaknesses with an eye toward future directions aimed at any funder and potential funding recipient worldwide. The focus is on extramurally funded programs that have a national or international impact rather than the training of NIH staff, which was addressed by the NIH's internal Data Science Workforce Development Center. From its inception, the major goal of BD2K was to narrow the gap between needed and existing biomedical data science skills. As biomedical research increasingly relies on computational, mathematical, and statistical thinking, supporting the training and education of the workforce of tomorrow requires new emphases on analytical skills. From 2013 to 2016, BD2K jump-started training in this area for all levels, from graduate students to senior researchers. PMID:28715407
Enlarging the STEM pipeline working with youth-serving organizations
NASA Astrophysics Data System (ADS)
Porro, I.
2005-12-01
The After-School Astronomy Project (ASAP) is a comprehensive initiative to promote the pursuit of science learning among underrepresented youth. To this end, ASAP specifically aims at building the capacity of urban community-based centers to deliver innovative out-of-school science programming to their youth. ASAP makes use of a modular curriculum consisting of a combination of hands-on activities and youth-led explorations of the night sky using MicroObservatory. Through project-based investigations students reinforce learning in astronomy and develop an understanding of science as inquiry, while also developing communication and computer skills. Through MicroObservatory, students gain access to a network of educational telescopes that they control over the Internet, software analysis tools, and an online community of users. An integral part of ASAP is to provide professional development opportunities for after-school workers. This promotes long-term, self-sustaining implementation of ASAP and fosters the creation of a cadre of after-school professionals dedicated to facilitating science-based programs.
SAFETY IN THE DESIGN OF SCIENCE LABORATORIES AND BUILDING CODES.
ERIC Educational Resources Information Center
HOROWITZ, HAROLD
THE DESIGN OF COLLEGE AND UNIVERSITY BUILDINGS USED FOR SCIENTIFIC RESEARCH AND EDUCATION IS DISCUSSED IN TERMS OF LABORATORY SAFETY AND BUILDING CODES AND REGULATIONS. MAJOR TOPIC AREAS ARE--(1) SAFETY RELATED DESIGN FEATURES OF SCIENCE LABORATORIES, (2) LABORATORY SAFETY AND BUILDING CODES, AND (3) EVIDENCE OF UNSAFE DESIGN. EXAMPLES EMPHASIZE…
Data issues in the life sciences.
Thessen, Anne E; Patterson, David J
2011-01-01
We review technical and sociological issues facing the Life Sciences as they transform into more data-centric disciplines - the "Big New Biology". Three major challenges are: 1) lack of comprehensive standards; 2) lack of incentives for individual scientists to share data; 3) lack of appropriate infrastructure and support. Technological advances with standards, bandwidth, distributed computing, exemplar successes, and a strong presence in the emerging world of Linked Open Data are sufficient to conclude that technical issues will be overcome in the foreseeable future. While the community is motivated to have a shared open infrastructure and data pool, and pressured by funding agencies to move in this direction, the sociological issues determine progress. Major sociological issues include our lack of understanding of the heterogeneous data cultures within the Life Sciences, and the impediments to progress include a lack of incentives to build appropriate infrastructures into projects and institutions or to encourage scientists to make data openly available. PMID:22207805
Earth Science Data Fusion with Event Building Approach
NASA Technical Reports Server (NTRS)
Lukashin, C.; Bartle, Ar.; Callaway, E.; Gyurjyan, V.; Mancilla, S.; Oyarzun, R.; Vakhnin, A.
2015-01-01
Objectives of the NASA Information And Data System (NAIADS) project are to develop a prototype of a conceptually new middleware framework to modernize and significantly improve the efficiency of Earth Science data fusion, big data processing, and analytics. The key components of the NAIADS include: a Service Oriented Architecture (SOA) multi-lingual framework, a multi-sensor coincident data Predictor, fast in-memory data Staging, a multi-sensor data-Event Builder, complete data-Event streaming (a workflow with minimized I/O), and on-line data processing control and analytics services. The NAIADS project is leveraging the CLARA framework, developed at Jefferson Lab, integrated with the ZeroMQ messaging library. The science services are prototyped and incorporated into the system. Merging of SCIAMACHY Level-1 observations, MODIS/Terra Level-2 (Clouds and Aerosols) data products, and ECMWF re-analysis will be used for NAIADS demonstration and performance tests in compute Cloud and Cluster environments.
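As a hedged illustration of the data-Event Builder idea (not NAIADS code), the sketch below groups records from different sensors into one event whenever their timestamps fall within a coincidence window; the records, sensor names, and window are invented for the example.

```python
# Toy "event building": records from different sensors are grouped into one
# event when their timestamps fall within a coincidence window, the
# multi-sensor analogue of event building in data acquisition systems.
from collections import defaultdict

WINDOW = 1.0  # seconds; arbitrary coincidence window

# Hypothetical (timestamp, sensor, payload) records, already time-sorted.
records = [
    (100.0, "SCIAMACHY", "rad_0"),
    (100.4, "MODIS", "cloud_0"),
    (100.9, "ECMWF", "met_0"),
    (205.2, "SCIAMACHY", "rad_1"),
    (205.3, "MODIS", "cloud_1"),
]

events, current, t0 = [], defaultdict(list), None
for ts, sensor, payload in records:
    if t0 is not None and ts - t0 > WINDOW:
        events.append(dict(current))       # close the current event
        current, t0 = defaultdict(list), None
    if t0 is None:
        t0 = ts                            # open a new event at this record
    current[sensor].append(payload)
if current:
    events.append(dict(current))

for i, ev in enumerate(events):
    print(f"event {i}: {ev}")
```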
The building of the EUDAT Cross-Disciplinary Data Infrastructure
NASA Astrophysics Data System (ADS)
Lecarpentier, Damien; Michelini, Alberto; Wittenburg, Peter
2013-04-01
The EUDAT project is a European data initiative that brings together a unique consortium of 25 partners - including research communities, national data and high performance computing (HPC) centers, technology providers, and funding agencies - from 13 countries. EUDAT aims to build a sustainable cross-disciplinary and cross-national Common Data Infrastructure (CDI) that provides a set of shared services for accessing and preserving research data. The design and deployment of these services is being coordinated by multi-disciplinary task forces comprising representatives from research communities and data centers. One of EUDAT's fundamental goals is the facilitation of cross-disciplinary data-intensive science. By providing opportunities for disciplines from across the spectrum to share data and cross-fertilize ideas, the CDI will encourage progress towards this vision of open and participatory data-intensive science. EUDAT will also facilitate this process through the creation of teams of experts from different disciplines, aiming to cooperatively develop services to meet the needs of several communities. Five research communities joined the EUDAT initiative at the start - CLARIN (Linguistics), ENES (Climate Modeling), EPOS (Earth Sciences), LifeWatch (Environmental Sciences - Biodiversity), and VPH (Biological and Medical Sciences). They are acting as partners in the project and have clear tasks and commitments. Since EUDAT started on the 1st of October 2011, we have been reviewing the approaches and requirements of these five communities regarding the deployment and use of a cross-disciplinary and persistent data e-Infrastructure. This analysis was conducted through interviews and frequent interactions with representatives of the communities. This talk will provide an updated status of the current CDI, with specific reference to the solid Earth science community of EPOS.
Multiunit Sequences in First Language Acquisition.
Theakston, Anna; Lieven, Elena
2017-07-01
Theoretical and empirical reasons suggest that children build their language not only out of individual words but also out of multiunit strings. These are the basis for the development of schemas containing slots. The slots are putative categories that build in abstraction while the schemas eventually connect to other schemas in terms of both meaning and form. Evidence comes from the nature of the input, the ways in which children construct novel utterances, the systematic errors that children make, and the computational modeling of children's grammars. However, much of this research is on English, which is unusual in its rigid word order and impoverished inflectional morphology. We summarize these results and explore their implications for languages with more flexible word order and/or much richer inflectional morphology.
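A toy sketch of the schema-with-slots idea: scanning a handful of invented child utterances for recurring frames whose final position varies, the kind of pattern usage-based accounts treat as a building block. Real modeling work of the sort the abstract cites uses large corpora and far richer statistics.

```python
# Toy slot-and-frame extraction: find recurring two-word frames whose final
# position varies (a "slot"). The utterances are invented examples.
from collections import defaultdict

utterances = [
    "want more juice", "want more milk", "want more bread",
    "where is teddy", "where is ball",
]

frames = defaultdict(set)
for utt in utterances:
    words = utt.split()
    if len(words) == 3:
        frames[(words[0], words[1])].add(words[2])  # frame "w1 w2 ___"

for (w1, w2), fillers in frames.items():
    if len(fillers) > 1:  # a frame counts as a schema if its slot varies
        print(f"schema: '{w1} {w2} X'  with X in {sorted(fillers)}")
```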
Practical skills of the future innovator
NASA Astrophysics Data System (ADS)
Kaurov, Vitaliy
2015-03-01
Physics graduates face, and are often disoriented by, the complex and turbulent world of startups, incubators, emergent technologies, big data, social network engineering, and so on. In order to build curricula that foster the skills necessary to navigate this world, we will look at experiences from the Wolfram Science Summer School, which has gathered international students annually for more than a decade. Through example projects we will trace the development of skills such as innovative thinking, data mining, machine learning, cloud technologies, device connectivity and the Internet of Things, network analytics, geo-information systems, formalized computable knowledge, and adjacent applied research skills from graph theory to image processing and beyond. This should give solid ideas to educators who will build standard curricula adapted for innovation and entrepreneurship education.
Know Your Discipline: Teaching the Philosophy of Computer Science
ERIC Educational Resources Information Center
Tedre, Matti
2007-01-01
The diversity and interdisciplinarity of computer science and the multiplicity of its uses in other sciences make it hard to define computer science and to prescribe how computer science should be carried out. The diversity of computer science also causes friction between computer scientists from different branches. Computer science curricula, as…
The Theoretical Astrophysical Observatory: Cloud-based Mock Galaxy Catalogs
NASA Astrophysics Data System (ADS)
Bernyk, Maksym; Croton, Darren J.; Tonini, Chiara; Hodkinson, Luke; Hassan, Amr H.; Garel, Thibault; Duffy, Alan R.; Mutch, Simon J.; Poole, Gregory B.; Hegarty, Sarah
2016-03-01
We introduce the Theoretical Astrophysical Observatory (TAO), an online virtual laboratory that houses mock observations of galaxy survey data. Such mocks have become an integral part of the modern analysis pipeline. However, building them requires expert knowledge of galaxy modeling and simulation techniques, significant investment in software development, and access to high performance computing. These requirements make it difficult for a small research team or individual to quickly build a mock catalog suited to their needs. To address this, TAO offers access to multiple cosmological simulations and semi-analytic galaxy formation models from an intuitive and clean web interface. Results can be funnelled through science modules and sent to a dedicated supercomputer for further processing and manipulation. These modules include the ability to (1) construct custom observer light cones from the simulation data cubes; (2) generate the stellar emission from star formation histories, apply dust extinction, and compute absolute and/or apparent magnitudes; and (3) produce mock images of the sky. All of TAO’s features can be accessed without any programming requirements. The modular nature of TAO opens it up for further expansion in the future.
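As a worked instance of step (2), the standard distance-modulus relation m = M + 5 log10(d / 10 pc) converts an absolute magnitude into an apparent one; the sketch below applies it with the dust-extinction term set to zero. The galaxy magnitude and distance are arbitrary example values, not TAO output.

```python
# Worked example of the distance modulus, m = M + 5*log10(d / 10 pc), the
# standard conversion behind computing apparent magnitudes from absolute
# ones (extinction term A set to zero here).
import math

def apparent_magnitude(M: float, d_pc: float, A: float = 0.0) -> float:
    """Apparent magnitude at luminosity distance d_pc (parsecs)."""
    return M + 5.0 * math.log10(d_pc / 10.0) + A

# A Milky Way-like galaxy (M ~ -20.8) placed at 100 Mpc:
print(f"m = {apparent_magnitude(-20.8, 100e6):.2f}")  # about 14.2
```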
"Walk along Life Science Bldg>(Chemistry & I Bldg. in view)." ...
"Walk along Life Science Bldg>(Chemistry & I Bldg. in view)." 1960. Photo no. 548. Partial oblique view of the south front, Life Science Building, looking to the northeast. - San Bernardino Valley College, Life Science Building, 701 South Mount Vernon Avenue, San Bernardino, San Bernardino County, CA
Building an International Collaboration for GeoInformatics
NASA Astrophysics Data System (ADS)
Snyder, W. S.; Lehnert, K.; Klump, J.
2005-12-01
Geoinformatics (cyberinfrastructure for the geosciences) is being developed as a linked system of sites that provide to the Earth science community a library of research data, research-grade tools to manipulate, mine, analyze and model interdisciplinary data, and mechanisms to provide the necessary computational resources for these activities. Our science is global in scope and hence geoinformatics (GI) must be an international effort. How do we build this international GI? What are the main challenges presented by the political, cultural, organizational, and technical diversity of the global science community that we need to address to achieve a truly global cyberinfrastructure for the Geosciences? GI needs to be developed in an internet-like fashion, establishing connections among independent globally distributed sites ('nodes') that will share, link, and integrate their data holdings and services. Independence of the GI pieces with respect to goals, scope, and approaches is critical to sustain commitment from people to build a GI node for which they feel ownership and get credit. This should not be fought by funding agencies - and certainly not by state and federal agencies. Communication, coordination, and collaboration are the core efforts to build the connections, but incentives and resources are required to advance and support them. Part of the coordination effort is development and maintenance of standards. Who should set these standards and govern their modification? Do we need an official international body to do so, and should this be a "governing body" or an "advisory body"? What role should international commissions and bodies such as CODATA/ICSU or IUGS-CGI, international societies and unions, the national geological surveys and other federal agencies play? Guidance from the science community is key to construct a system that geo-researchers will want to use, and that meets their needs. Only when the community endorses GI as a fundamental platform to support science research will we be able to address the challenging question of how to ensure sustained funding of GI, which will undoubtedly be a costly effort, convincing governments and funding agencies to invest in a global effort. Perhaps the most challenging problems are cultural ones, such as the "my data" issue: the reluctance to share data even when they were generated with public funding. This is slowly being resolved by some funding agencies through moratorium periods for use of data before they are available to everyone, but it will require a sustained "education" effort within the geoscience research community. Geoinformatics is the platform for a new paradigm in how we conduct our research. The challenges to building an international GI are quite serious, some might say daunting, but the conveners of this session feel that the effort is not only worth it, but required for the sake of our science research.
Computational Infrastructure for Geodynamics (CIG)
NASA Astrophysics Data System (ADS)
Gurnis, M.; Kellogg, L. H.; Bloxham, J.; Hager, B. H.; Spiegelman, M.; Willett, S.; Wysession, M. E.; Aivazis, M.
2004-12-01
Solid earth geophysicists have a long tradition of writing scientific software to address a wide range of problems. In particular, computer simulations came into wide use in geophysics during the decade after the plate tectonic revolution. Solution schemes and numerical algorithms that developed in other areas of science, most notably engineering, fluid mechanics, and physics, were adapted with considerable success to geophysics. This software has largely been the product of individual efforts and although this approach has proven successful, its strength for solving problems of interest is now starting to show its limitations as we try to share codes and algorithms or when we want to recombine codes in novel ways to produce new science. With funding from the NSF, the US community has embarked on a Computational Infrastructure for Geodynamics (CIG) that will develop, support, and disseminate community-accessible software for the greater geodynamics community from model developers to end-users. The software is being developed for problems involving mantle and core dynamics, crustal and earthquake dynamics, magma migration, seismology, and other related topics. With a high level of community participation, CIG is leveraging state-of-the-art scientific computing into a suite of open-source tools and codes. The infrastructure that we are now starting to develop will consist of: (a) a coordinated effort to develop reusable, well-documented and open-source geodynamics software; (b) the basic building blocks - an infrastructure layer - of software by which state-of-the-art modeling codes can be quickly assembled; (c) extension of existing software frameworks to interlink multiple codes and data through a superstructure layer; (d) strategic partnerships with the larger world of computational science and geoinformatics; and (e) specialized training and workshops for both the geodynamics and broader Earth science communities. The CIG initiative has already started to leverage and develop long-term strategic partnerships with open source development efforts within the larger thrusts of scientific computing and geoinformatics. These strategic partnerships are essential as the frontier has moved into multi-scale and multi-physics problems in which many investigators now want to use simulation software for data interpretation, data assimilation, and hypothesis testing.
Study of Wind Effects on Unique Buildings
NASA Astrophysics Data System (ADS)
Olenkov, V.; Puzyrev, P.
2017-11-01
The article deals with a numerical simulation of wind effects on the building of the Church of the Intercession of the Holy Virgin in the village of Bulzi in the Chelyabinsk region. We present a calculation algorithm and the resulting pressure fields, velocity fields, and fields of kinetic energy of the wind stream, as well as streamlines. Computational fluid dynamics (CFD) evolved three decades ago at the interface of computational mathematics and theoretical hydromechanics and has become a separate branch of science whose subject is the numerical simulation of fluid and gas flows and the solution of the associated problems using computer systems. This scientific field, which is of great practical value, is developing intensively. The growth in CFD calculations is driven by improvements in computer technology and by the creation of multipurpose, easy-to-use CFD packages that are available to a wide group of researchers and cope with various tasks. Such programs are not only competitive with physical experiments; sometimes they provide the only way to answer the research questions. The following advantages of computer simulation can be pointed out: a) reduction in the time spent on design and development of a model in comparison with a real experiment (variation of boundary conditions); b) a numerical experiment allows for the simulation of conditions that are not reproducible in environmental tests (use of an ideal gas as the medium); c) computational gas dynamics methods provide a researcher with the complete information necessary to fully describe the processes of the experiment; d) the economic efficiency of computer calculations is more attractive than that of an experiment; e) the possibility to modify a computational model, which ensures efficient timing (change of the sizes of wall-layer cells in accordance with the chosen turbulence model).
SERVIR Science Applications for Capacity Building
NASA Technical Reports Server (NTRS)
Limaye, Ashutosh; Searby, Nancy D.; Irwin, Daniel
2012-01-01
SERVIR is a regional visualization and monitoring system using Earth observations to support environmental management, climate adaptation, and disaster response in developing countries. SERVIR is jointly sponsored by NASA and the U.S. Agency for International Development (USAID). SERVIR has been instrumental in the development of science applications that support decision-making and capacity building in developing countries through the SERVIR Hubs. In 2011, NASA Research Opportunities in Space and Earth Sciences (ROSES) included a call for proposals to form a SERVIR Applied Sciences Team (SERVIR AST) under the Applied Sciences Capacity Building Program. Eleven proposals were selected, the Principal Investigators of which comprise the core of the SERVIR AST. The expertise on the team spans several societal benefit areas, including agriculture, disasters, public health and air quality, water, climate, and terrestrial carbon assessments. This presentation will cover the existing SERVIR science applications, capacity building components, an overview of the SERVIR AST projects, and anticipated impacts.
Reflections on the history of indoor air science, focusing on the last 50 years.
Sundell, J
2017-07-01
The scientific articles and Indoor Air conference publications of the indoor air sciences (IAS) during the last 50 years are summarized. In total, 7524 presentations from 79 countries have been made at Indoor Air conferences held between 1978 (49 presentations) and 2014 (1049 presentations). In the Web of Science, 26,992 articles on indoor air research (with the word "indoor" as a search term) had been found as of 1 January 2016, of which 70% were published during the last 10 years. The modern scientific history started in the 1970s with a question: did indoor air pose a threat to health as did outdoor air? Soon it was recognized that indoor air is more important, from a health point of view, than outdoor air. Topics of concern were first radon, environmental tobacco smoke, and lung cancer, followed by volatile organic compounds, formaldehyde and sick building syndrome, house dust mites, asthma and allergies, Legionnaires' disease, and other airborne infections. Later emerged dampness/mold-associated allergies and today's concern with "modern exposures-modern diseases." Ventilation, thermal comfort, indoor air chemistry, semi-volatile organic compounds, building simulation by computational fluid dynamics, and fine particulate matter are common topics today. From their beginnings in Denmark and Sweden, then in the USA, the indoor air sciences now show increasing activity in East and Southeast Asia.
NASA Astrophysics Data System (ADS)
Alameda, J. C.
2011-12-01
Development and optimization of computational science models on high performance computers - and, with the advent of ubiquitous multicore processors, on practically every system - have long been accomplished with basic software tools: typically command-line compilers, debuggers, and performance tools that have not changed substantially since the days of serial and early vector computers. However, model complexity, including the complexity added by modern message passing libraries such as MPI, and the need for hybrid code models (such as OpenMP plus MPI) to take full advantage of high performance computers with increasing core counts per shared memory node, have made development and optimization of such codes an increasingly arduous task. Additional architectural developments, such as many-core processors, only complicate the situation further. In this paper, we describe how our NSF-funded project, "SI2-SSI: A Productive and Accessible Development Workbench for HPC Applications Using the Eclipse Parallel Tools Platform" (WHPC), seeks to improve the Eclipse Parallel Tools Platform (PTP), an environment designed to support scientific code development targeted at a diverse set of high performance computing systems. Our WHPC project takes an application-centric view: we use a set of scientific applications, each with its own challenges, both to drive improvements to the applications themselves and to expose shortcomings in Eclipse PTP from an application developer's perspective, which in turn shapes the list of improvements we seek to make. We are also partnering with performance tool providers to drive higher quality performance tool integration. We have partnered with the Cactus group at Louisiana State University to improve Eclipse's ability to work with computational frameworks and extremely complex build systems, as well as to develop educational materials for computational science and engineering codes. Finally, we are partnering with the lead PTP developers at IBM to ensure we are as effective as possible within the Eclipse development community. We are also conducting training and outreach to our user community, including conference BOF sessions, monthly user calls, and an annual user meeting, so that we can best inform the improvements we make to Eclipse PTP. With these activities we endeavor to encourage the use of modern software engineering practices, as enabled through the Eclipse IDE, in computational science and engineering applications. These practices include proper use of source code repositories, tracking and rectifying issues, measuring and monitoring code performance against both optimizations and ever-changing software stacks and configurations on HPC systems, and ultimately encouraging the development and maintenance of testing suites - things that have become commonplace in many software endeavors but have lagged in the development of science applications. We believe that the increased complexity of both HPC systems and science applications demands better software engineering methods, preferably enabled by modern tools such as Eclipse PTP, to help the computational science community thrive as we evolve the HPC landscape.
Polynomial-time algorithms for building a consensus MUL-tree.
Cui, Yun; Jansson, Jesper; Sung, Wing-Kin
2012-09-01
A multi-labeled phylogenetic tree, or MUL-tree, is a generalization of a phylogenetic tree that allows each leaf label to be used many times. MUL-trees have applications in biogeography, the study of host-parasite cospeciation, gene evolution studies, and computer science. Here, we consider the problem of inferring a consensus MUL-tree that summarizes a given set of conflicting MUL-trees, and present the first polynomial-time algorithms for solving it. In particular, we give a straightforward, fast algorithm for building a strict consensus MUL-tree for any input set of MUL-trees with identical leaf label multisets, as well as a polynomial-time algorithm for building a majority rule consensus MUL-tree for the special case where every leaf label occurs at most twice. We also show that, although it is NP-hard to find a majority rule consensus MUL-tree in general, the variant that we call the singular majority rule consensus MUL-tree can be constructed efficiently whenever it exists.
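As an editor's illustration of the core idea (not the authors' implementation), the sketch below encodes a MUL-tree as nested tuples, collects the leaf-label multiset below each internal node as a hashable "cluster," and intersects the cluster sets across the input trees; for inputs sharing one leaf-label multiset, these common clusters are the building blocks of the strict consensus MUL-tree. The tree encoding and labels are hypothetical.

```python
from collections import Counter

# A MUL-tree is encoded as nested tuples; leaves are label strings and
# labels may repeat. Encoding and labels are hypothetical.

def clusters(tree):
    """Collect the leaf-label multiset under each internal node,
    canonicalized into a hashable form."""
    out = set()
    def walk(node):
        if isinstance(node, str):              # leaf
            return Counter([node])
        total = Counter()
        for child in node:
            total += walk(child)
        out.add(frozenset(total.items()))      # hashable multiset
        return total
    walk(tree)
    return out

def strict_consensus_clusters(trees):
    """Clusters present in every input MUL-tree; these define the strict
    consensus (inputs must share one leaf-label multiset)."""
    common = clusters(trees[0])
    for t in trees[1:]:
        common &= clusters(t)
    return common

t1 = ((("a", "b"), "a"), "c")
t2 = (("a", "b"), ("a", "c"))
print(strict_consensus_clusters([t1, t2]))
```

Each tree contributes at most one cluster per internal node, so the intersection step scales roughly linearly in the total input size, consistent with the fast strict consensus algorithm the abstract describes.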
Riding the Hype Wave: Evaluating new AI Techniques for their Applicability in Earth Science
NASA Astrophysics Data System (ADS)
Ramachandran, R.; Zhang, J.; Maskey, M.; Lee, T. J.
2016-12-01
Every few years a new technology rides a hype wave generated by the computer science community. Converts to the new technology, who surface from both the science community and the informatics community, promulgate the claim that it can radically improve or even change the existing scientific process. Recent examples of new technologies following in the footsteps of "big data" include deep learning algorithms and knowledge graphs. Deep learning algorithms mimic the human brain and process information through multiple stages of transformation and representation. These algorithms are able to learn complex functions that map pixels directly to outputs without relying on human-crafted features, and they solve some of the complex classification problems that exist in science. Similarly, knowledge graphs aggregate information around defined topics, enabling users to resolve a query without having to navigate and assemble information manually. Knowledge graphs could potentially be used in scientific research to assist in hypothesis formulation, testing, and review. The challenge for the Earth science research community is to evaluate these new technologies by asking the right questions and considering what-if scenarios. What does the new technology enable or provide that is innovative and different? Can one justify the adoption costs with respect to the research returns? Since nothing comes for free, utilizing a new technology entails adoption costs that may outweigh the benefits. Furthermore, these technologies may require significant computing infrastructure to be utilized effectively. Results from two different projects will be presented along with lessons learned from testing these technologies. The first project primarily evaluates deep learning techniques for different applications of image retrieval within Earth science, while the second project builds a prototype knowledge graph constructed for hurricane science.
Short-term Temperature Prediction Using Adaptive Computing on Dynamic Scales
NASA Astrophysics Data System (ADS)
Hu, W.; Cervone, G.; Jha, S.; Balasubramanian, V.; Turilli, M.
2017-12-01
When predicting temperature, there are specific places and times when high accuracy predictions are harder to obtain. For example, not all sub-regions in a domain require the same amount of computing resources to generate an accurate prediction: plateau areas might require less computing than mountainous areas because of the steeper temperature gradients in the latter. However, it is difficult to estimate beforehand the optimal allocation of computational resources, because several parameters besides orography play a role in determining the accuracy of the forecasts. The allocation of resources to perform simulations can become a bottleneck because it requires human intervention to stop jobs or start new ones. The goal of this project is to design and develop a dynamic approach to generating short-term temperature predictions that automatically determines the required computing resources and the geographic scales of the predictions based on the spatial and temporal uncertainties. The predictions and the prediction quality metrics are computed using the Analog Ensemble (AnEn) technique applied to numerical weather prediction data, and the parallelization on high performance computing systems is accomplished using the Ensemble Toolkit, one component of the RADICAL-Cybertools family of tools. RADICAL-Cybertools decouples the science needs from the computational capabilities by building an intermediate layer that runs general ensemble patterns, regardless of the science. In this research, we show how the Ensemble Toolkit allows generating high resolution temperature forecasts at different spatial and temporal resolutions. The AnEn algorithm is run using NAM analysis and forecast data for the continental United States over a period of 2 years. The results show that the AnEn temperature forecasts perform well according to different probabilistic and deterministic statistical tests.
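The Analog Ensemble idea admits a compact single-predictor sketch. This is an editor's illustration on synthetic data; the study's implementation uses multiple predictors, a time window around each forecast lead time, and the Ensemble Toolkit for parallelization.

```python
import numpy as np

rng = np.random.default_rng(0)

def analog_ensemble(new_fcst, hist_fcst, hist_obs, n_members=20):
    """Return the observations paired with the n_members historical
    forecasts most similar to the current forecast."""
    dist = np.abs(hist_fcst - new_fcst)    # single-predictor similarity
    idx = np.argsort(dist)[:n_members]     # indices of the best analogs
    return hist_obs[idx]                   # ensemble of past outcomes

# Synthetic history: forecasts with a simple error model around truth.
hist_fcst = rng.normal(15.0, 5.0, 1000)
hist_obs = hist_fcst + rng.normal(0.0, 1.0, 1000)

ens = analog_ensemble(17.3, hist_fcst, hist_obs)
print(ens.mean(), ens.std())   # ensemble mean and spread (uncertainty)
```

Because each station and lead time is searched independently, the analog search is embarrassingly parallel, which is what makes ensemble-pattern tools a natural fit for scaling it.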
Key Lessons in Building "Data Commons": The Open Science Data Cloud Ecosystem
NASA Astrophysics Data System (ADS)
Patterson, M.; Grossman, R.; Heath, A.; Murphy, M.; Wells, W.
2015-12-01
Cloud computing technology has created a shift around data and data analysis by allowing researchers to push computation to the data, as opposed to having to pull data to an individual researcher's computer. Subsequently, cloud-based resources can provide unique opportunities to capture the computing environments used both to access raw data in its original form and to create analysis products that may be the source of the data for tables and figures presented in research publications. Since 2008, the Open Cloud Consortium (OCC) has operated the Open Science Data Cloud (OSDC), which provides scientific researchers with computational resources for storing, sharing, and analyzing large (terabyte- and petabyte-scale) scientific datasets. The OSDC has provided compute and storage services to over 750 researchers in a wide variety of data intensive disciplines. Recently, internal users have logged about 2 million core hours each month. The OSDC also serves the research community by colocating these resources with nearly a petabyte of public scientific datasets in a variety of fields, which are also externally accessible for download by the public. In our experience operating these resources, researchers are well served by "data commons," meaning cyberinfrastructure that colocates data archives, computing, and storage infrastructure and supports essential tools and services for working with scientific data. In addition to the OSDC public data commons, the OCC operates a data commons in collaboration with NASA and is developing a data commons for NOAA datasets. As cloud-based infrastructures for distributing and computing over data become more pervasive, we ask, "What does it mean to publish data in a data commons?" Here we present the OSDC perspective and discuss several services that are key in architecting data commons, including digital identifier services.
High-Performance Compute Infrastructure in Astronomy: 2020 Is Only Months Away
NASA Astrophysics Data System (ADS)
Berriman, B.; Deelman, E.; Juve, G.; Rynge, M.; Vöckler, J. S.
2012-09-01
By 2020, astronomy will be awash with as much as 60 PB of public data. Full scientific exploitation of such massive volumes of data will require high-performance computing on server farms co-located with the data. Development of this computing model will be a community-wide enterprise that has profound cultural and technical implications. Astronomers must be prepared to develop environment-agnostic applications that support parallel processing. The community must investigate the applicability and cost-benefit of emerging technologies such as cloud computing to astronomy, and must engage the Computer Science community to develop science-driven cyberinfrastructure such as workflow schedulers and optimizers. We report here the results of collaborations between a science center, IPAC, and a Computer Science research institute, ISI. These collaborations may be considered pathfinders in developing a high-performance compute infrastructure in astronomy. They investigated two exemplar large-scale science-driver workflow applications: 1) calculation of an infrared atlas of the Galactic Plane at 18 different wavelengths, by placing data from multiple surveys on a common plate scale and co-registering all the pixels; and 2) calculation of an atlas of periodicities present in the public Kepler data sets, which currently contain 380,000 light curves. These products have been generated with two workflow applications, written in C for performance and designed to support parallel processing on multiple environments and platforms, but with different compute resource needs: the Montage image mosaic engine is I/O-bound, and the NASA Star and Exoplanet Database periodogram code is CPU-bound. Our presentation will report cost and performance metrics and lessons learned for continuing development. Applicability of Cloud Computing: Commercial cloud providers generally charge for all operations, including processing, transfer of input and output data, and storage of data, and so the costs of running applications vary widely according to how they use resources. The cloud is well suited to processing CPU-bound (and memory-bound) workflows such as the periodogram code, given the relatively low cost of processing in comparison with I/O operations. I/O-bound applications such as Montage perform best on high-performance clusters with fast networks and parallel file systems. Science-driven Cyberinfrastructure: Montage has been widely used as a driver application to develop workflow management services, such as task scheduling in distributed environments, designing fault tolerance techniques for job schedulers, and developing workflow orchestration techniques. Running Parallel Applications Across Distributed Cloud Environments: Data processing will eventually take place in parallel, distributed across cyberinfrastructure environments having different architectures. We have used the Pegasus Workflow Management System (WMS) to successfully run applications across three very different environments: TeraGrid, OSG (Open Science Grid), and FutureGrid. Provisioning resources across different grids and clouds (also referred to as Sky Computing) involves establishing a distributed environment where issues such as remote job submission, data management, and security need to be addressed. This environment also requires building virtual machine images that can run in different environments. Usually, each cloud provides basic images that can be customized with additional software and services.
In most of our work, we provisioned compute resources using a custom application called Wrangler. Pegasus WMS abstracts the architectures of the compute environments away from the end-user, and can be considered a first-generation tool suitable for scientists to run their applications on disparate environments.
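As a rough illustration of why periodogram workloads are CPU-bound and embarrassingly parallel, here is a toy Lomb-Scargle periodogram for an irregularly sampled light curve. This is an editor's sketch in Python for clarity; the production codes discussed above are written in C, and this is not the NASA Star and Exoplanet Database service code.

```python
import numpy as np

rng = np.random.default_rng(1)

def lomb_scargle(t, y, freqs):
    """Classical (unnormalized) Lomb-Scargle power; one pass per frequency."""
    y = y - y.mean()
    power = np.empty_like(freqs)
    for i, f in enumerate(freqs):
        w = 2 * np.pi * f
        tau = np.arctan2(np.sum(np.sin(2 * w * t)),
                         np.sum(np.cos(2 * w * t))) / (2 * w)
        c = np.cos(w * (t - tau))
        s = np.sin(w * (t - tau))
        power[i] = 0.5 * ((y @ c) ** 2 / (c @ c) + (y @ s) ** 2 / (s @ s))
    return power

t = np.sort(rng.uniform(0, 30, 500))                 # irregular sampling
y = np.sin(2 * np.pi * 0.4 * t) + 0.5 * rng.standard_normal(500)
freqs = np.linspace(0.05, 2.0, 2000)
print(freqs[np.argmax(lomb_scargle(t, y, freqs))])   # ~0.4, the injected frequency
```

Each trial frequency is evaluated independently, so light curves and frequency ranges can be farmed out across workers with essentially no I/O - the profile that favors cloud processing in the cost analysis above.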
The Caltech Concurrent Computation Program - Project description
NASA Technical Reports Server (NTRS)
Fox, G.; Otto, S.; Lyzenga, G.; Rogstad, D.
1985-01-01
The Caltech Concurrent Computation Program, which studies basic issues in computational science, is described. The research builds on initial work in which novel concurrent hardware, the necessary systems software to use it, and twenty significant scientific implementations running on the initial 32-, 64-, and 128-node hypercube machines were constructed. A major goal of the program will be to extend this work into new disciplines and more complex algorithms, including general packages that decompose arbitrary problems in major application areas. New high-performance concurrent processors with up to 1024 nodes, over a gigabyte of memory, and multi-gigaflop performance are being constructed. The implementations cover a wide range of problems in areas such as high energy physics and astrophysics, condensed matter, chemical reactions, plasma physics, applied mathematics, geophysics, simulation, CAD for VLSI, graphics, and image processing. The products of the research program include the concurrent algorithms, hardware, systems software, and complete program implementations.
North side, facing the courtyard. Life Science Building is to ...
North side, facing the courtyard. The Life Science Building is to the left, out of view, and the library is to the right, also out of view. - San Bernardino Valley College, Classics Building, 701 South Mount Vernon Avenue, San Bernardino, San Bernardino County, CA
Envisioning Science Environment Technology and Society
NASA Astrophysics Data System (ADS)
Maknun, J.; Busono, T.; Surasetja, I.
2018-02-01
The Science, Environment, Technology and Society (SETS) approach helps students to connect science concepts with the other aspects. This allows them to achieve a clearer depiction of how each concept is linked with the other concepts in SETS. Taking SETS into account guides students to utilize science as a productive concept in inventing and developing technology, while minimizing its negative impacts on the environment and society. This article discusses the implementation of Sundanese local wisdom, as found in the local stilt house (rumah panggung), in the Building Construction subject of the Building Drawing Technique specialization in vocational high school. The stilt house structural system employs ties, pupurus joints, and wedges on its floor, wall, and truss frames, as well as its beams. This local knowledge was incorporated into the Building Construction learning program and applied to the following basic competences: applying wood's specification and characteristics for building construction, managing wood's specification and characteristics for building construction, analyzing building structures' types and functions based on their characteristics, reasoning about building structures' types and functions based on their characteristics, categorizing wood construction works, and reasoning about wood construction works. The research result is a Sundanese-local-wisdom-based learning design for the Building Construction subject.
Chiang, Harry; Robinson, Lucy C; Brame, Cynthia J; Messina, Troy C
2013-01-01
Over the past 20 years, the biological sciences have increasingly incorporated chemistry, physics, computer science, and mathematics to aid in the development and use of mathematical models. Such combined approaches have been used to address problems from protein structure-function relationships to the workings of complex biological systems. Computer simulations of molecular events can now be accomplished quickly and with standard computer technology. Also, simulation software is freely available for most computing platforms, and online support for the novice user is ample. We have therefore created a molecular dynamics (MD) laboratory module to enhance undergraduate student understanding of the molecular events underlying organismal phenotype. This module builds on a previously described project in which students use site-directed mutagenesis to investigate functions of conserved sequence features in members of a eukaryotic protein kinase family. In this report, we detail the laboratory activities of an MD module that complements phenotypic outcomes by providing a hypothesis-driven and quantifiable measure of predicted structural changes caused by targeted mutations. We also present examples of analyses students may perform. These laboratory activities can be integrated with genetics or biochemistry experiments as described, but could also be used independently in any course that would benefit from a quantitative approach to protein structure-function relationships.
Noel, Jean-Paul; Blanke, Olaf; Serino, Andrea
2018-06-06
Integrating information across sensory systems is a critical step toward building a cohesive representation of the environment and one's body, and, as illustrated by numerous illusions, it scaffolds the subjective experience of the world and self. In recent years, classic principles of multisensory integration elucidated in the subcortex have been translated into the language of statistical inference understood by the neocortical mantle. Most importantly, a mechanistic systems-level description of multisensory computations via probabilistic population coding and divisive normalization is actively being put forward. In parallel, by describing and understanding bodily illusions, researchers have suggested multisensory integration of bodily inputs within the peripersonal space as a key mechanism in bodily self-consciousness. Certain aspects of bodily self-consciousness, although still very much a minority, have recently been cast in the light of modern computational understandings of multisensory integration. In doing so, we argue, the field of bodily self-consciousness may borrow mechanistic descriptions regarding the neural implementation of inference computations outlined by the multisensory field. This computational approach, leveraged on the understanding of multisensory processes generally, promises to advance scientific comprehension of one of the most mysterious questions puzzling humankind: how our brain creates the experience of a self in interaction with the environment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barhen, Jacob; Imam, Neena
2007-01-01
Revolutionary computing technologies are defined in terms of technological breakthroughs, which leapfrog over near-term projected advances in conventional hardware and software to produce paradigm shifts in computational science. For underwater threat source localization using information provided by a dynamical sensor network, one of the most promising computational advances builds upon the emergence of digital optical-core devices. In this article, we present initial results of sensor network calculations that focus on the concept of signal wavefront time-difference-of-arrival (TDOA). The corresponding algorithms are implemented on the EnLight processing platform recently introduced by Lenslet Laboratories. This tera-scale digital optical core processor is optimized for array operations, which it performs in a fixed-point-arithmetic architecture. Our results (i) illustrate the ability to reach the required accuracy in the TDOA computation, and (ii) demonstrate that a considerable speed-up can be achieved when using the EnLight 64a prototype processor as compared to a dual Intel Xeon processor.
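For orientation, the following is a minimal sketch of a TDOA estimate between two sensors via cross-correlation, the kind of array operation the abstract says the optical-core processor accelerates. It uses synthetic signals and NumPy on a conventional CPU; it is not the paper's fixed-point EnLight implementation, and all signal parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(42)
fs = 10_000.0                          # sample rate in Hz (invented)
t = np.arange(0, 0.1, 1 / fs)
src = np.sin(2 * np.pi * 400 * t) * np.exp(-30 * t)   # synthetic ping

true_delay = 23                        # arrival offset in samples
s1 = np.concatenate([src, np.zeros(true_delay)])      # sensor 1
s2 = np.concatenate([np.zeros(true_delay), src])      # sensor 2 hears it later
s1 = s1 + 0.05 * rng.standard_normal(s1.size)
s2 = s2 + 0.05 * rng.standard_normal(s2.size)

# The cross-correlation peak gives the relative delay: a dense array
# operation of exactly the kind suited to a vector/optical processor.
xcorr = np.correlate(s2, s1, mode="full")
lag = int(np.argmax(xcorr)) - (s1.size - 1)           # lag in samples
print(lag, lag / fs)                                  # -> 23 samples, 0.0023 s
```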
Validation of Computational Models in Biomechanics
Henninger, Heath B.; Reese, Shawn P.; Anderson, Andrew E.; Weiss, Jeffrey A.
2010-01-01
The topics of verification and validation (V&V) have increasingly been discussed in the field of computational biomechanics, and many recent articles have applied these concepts in an attempt to build credibility for models of complex biological systems. V&V are evolving techniques that, if used improperly, can lead to false conclusions about a system under study. In basic science these erroneous conclusions may lead to failure of a subsequent hypothesis, but they can have more profound effects if the model is designed to predict patient outcomes. While several authors have reviewed V&V as they pertain to traditional solid and fluid mechanics, it is the intent of this manuscript to present them in the context of computational biomechanics. Specifically, the task of model validation will be discussed with a focus on current techniques. It is hoped that this review will encourage investigators to engage and adopt the V&V process in an effort to increase peer acceptance of computational biomechanics models.
Building an Integrated Environment for Multimedia
NASA Technical Reports Server (NTRS)
1997-01-01
Multimedia courseware on the solar system and earth science suitable for use in elementary, middle, and high schools was developed under this grant. The courseware runs on Silicon Graphics, Incorporated (SGI) workstations and personal computers (PCs). There is also a version of the courseware accessible via the World Wide Web. Accompanying multimedia database systems were also developed to enhance the multimedia courseware. The database systems accompanying the PC software are based on the relational model, while the database systems accompanying the SGI software are based on the object-oriented model.
Phytozome Comparative Plant Genomics Portal
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goodstein, David; Batra, Sajeev; Carlson, Joseph
2014-09-09
The Dept. of Energy Joint Genome Institute is a genomics user facility supporting DOE mission science in the areas of Bioenergy, Carbon Cycling, and Biogeochemistry. The Plant Program at the JGI applies genomic, analytical, computational and informatics platforms and methods to: 1. Understand and accelerate the improvement (domestication) of bioenergy crops 2. Characterize and moderate plant response to climate change 3. Use comparative genomics to identify constrained elements and infer gene function 4. Build high quality genomic resource platforms of JGI Plant Flagship genomes for functional and experimental work 5. Expand functional genomic resources for Plant Flagship genomes
Integrating interactive computational modeling in biology curricula.
Helikar, Tomáš; Cutucache, Christine E; Dahlquist, Lauren M; Herek, Tyler A; Larson, Joshua J; Rogers, Jim A
2015-03-01
While the use of computer tools to simulate complex processes such as computer circuits is normal practice in fields like engineering, the majority of life sciences/biological sciences courses continue to rely on the traditional textbook and memorization approach. To address this issue, we explored the use of the Cell Collective platform as a novel, interactive, and evolving pedagogical tool to foster student engagement, creativity, and higher-level thinking. Cell Collective is a Web-based platform used to create and simulate dynamical models of various biological processes. Students can create models of cells, diseases, or pathways themselves or explore existing models. This technology was implemented in both undergraduate and graduate courses as a pilot study to determine the feasibility of such software at the university level. First, a new (In Silico Biology) class was developed to enable students to learn biology by "building and breaking it" via computer models and their simulations. This class and technology also provide a non-intimidating way to incorporate mathematical and computational concepts into a class with students who have a limited mathematical background. Second, we used the technology to mediate the use of simulations and modeling modules as a learning tool for traditional biological concepts, such as T cell differentiation or cell cycle regulation, in existing biology courses. Results of this pilot application suggest that there is promise in the use of computational modeling and software tools such as Cell Collective to provide new teaching methods in biology and contribute to the implementation of the "Vision and Change" call to action in undergraduate biology education by providing a hands-on approach to biology.
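To make the modeling style concrete, here is a minimal synchronous Boolean-network simulation of the kind such platforms let students build and "break." The node names and update rules are invented for illustration and are not taken from Cell Collective.

```python
# Node names and update rules are invented for illustration.
rules = {
    "GeneA": lambda s: not s["Repressor"],                 # repressed by Repressor
    "Repressor": lambda s: s["GeneA"],                     # activated by GeneA
    "Output": lambda s: s["GeneA"] and not s["Repressor"],
}

def step(state):
    """Synchronous update: every node evaluates its rule on the old state."""
    return {node: bool(rule(state)) for node, rule in rules.items()}

state = {"GeneA": True, "Repressor": False, "Output": False}
for t in range(6):
    print(t, state)
    state = step(state)   # the negative feedback loop makes the state oscillate
```

Flipping a rule (for example, removing the repression) and re-running the loop is the "breaking it" step: students see immediately how the long-term behavior of the network changes.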
NASA Astrophysics Data System (ADS)
Idaszak, R.; Lenhardt, W. C.; Jones, M. B.; Ahalt, S.; Schildhauer, M.; Hampton, S. E.
2014-12-01
The NSF, in an effort to support the creation of sustainable science software, funded 16 science software institute conceptualization efforts. The goal of these conceptualization efforts is to explore approaches to creating the institutional, sociological, and physical infrastructures to support sustainable science software. This paper will present the lessons learned from two of these conceptualization efforts, the Institute for Sustainable Earth and Environmental Software (ISEES - http://isees.nceas.ucsb.edu) and the Water Science Software Institute (WSSI - http://waters2i2.org). ISEES is a multi-partner effort led by the National Center for Ecological Analysis and Synthesis (NCEAS). WSSI, also a multi-partner effort, is led by the Renaissance Computing Institute (RENCI). The two conceptualization efforts have been collaborating due to the complementarity of their approaches and the potential synergies of their science focus. ISEES and WSSI have engaged in a number of activities to address the challenges of science software, such as workshops, hackathons, and coding efforts. More recently, the two institutes have also collaborated on joint activities including training, proposals, and papers. In addition to presenting lessons learned, this paper will synthesize across the two efforts to project a unified vision for a science software institute.
A Rationale for Building a Comprehensive Science Program for Inner-City Education.
ERIC Educational Resources Information Center
Martin, Charles Arthur
The intent of this dissertation was to develop a science curriculum from an inner-city perspective. Five units and a rationale for inner-city education are included. The units include both physical and biological science topics. The units are as follows: (1) Rationale for Building a Comprehensive Science Program for Inner-City Education; (2) With…
Final report: Prototyping a combustion corridor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rutland, Christopher J.; Leach, Joshua
2001-12-15
The Combustion Corridor is a concept in which researchers in combustion and thermal sciences have unimpeded access to large volumes of remote computational results. This will enable remote, collaborative analysis and visualization of state-of-the-art combustion science results. The Engine Research Center (ERC) at the University of Wisconsin - Madison partnered with Lawrence Berkeley National Laboratory, Argonne National Laboratory, Sandia National Laboratories, and several other universities to build and test the first stages of a combustion corridor. The ERC served two important functions in this partnership. First, we work extensively with combustion simulations, so we were able to provide real world research data sets for testing the Corridor concepts. Second, the ERC was part of an extension of the high bandwidth based DOE National Laboratory connections to universities.
Identifying logical planes formed of compute nodes of a subcommunicator in a parallel computer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, Kristan D.; Faraj, Daniel
In a parallel computer, a plurality of logical planes formed of compute nodes of a subcommunicator may be identified by: for each compute node of the subcommunicator and for a number of dimensions beginning with a first dimension: establishing, by a plane building node, in a positive direction of the first dimension, all logical planes that include the plane building node and compute nodes of the subcommunicator in a positive direction of a second dimension, where the second dimension is orthogonal to the first dimension; and establishing, by the plane building node, in a negative direction of the first dimension, all logical planes that include the plane building node and compute nodes of the subcommunicator in the positive direction of the second dimension.
Determining position inside building via laser rangefinder and handheld computer
Ramsey, Jr James L. [Albuquerque, NM; Finley, Patrick [Albuquerque, NM; Melton, Brad [Albuquerque, NM
2010-01-12
An apparatus, computer software, and a method of determining position inside a building, comprising: selecting on a PDA at least two walls of a room in a digitized map of a building or a portion of a building; pointing and firing a laser rangefinder at the corresponding physical walls; transmitting the collected range information to the PDA; and computing on the PDA the position of the laser rangefinder within the room.
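A minimal sketch of how such a position fix could be computed, assuming each selected wall comes from the digitized map as a line n·p = b with unit normal n pointing into the room, and that the rangefinder is fired perpendicular to the wall so the measured range d gives n·p = b + d. Two non-parallel walls then yield a 2x2 linear system for the position. This is an editor's geometric illustration, not the patented method.

```python
import numpy as np

def position_from_ranges(walls, ranges):
    """walls: list of ((nx, ny), b) wall lines n.p = b with unit normals
    pointing into the room; ranges: perpendicular distances measured to
    each wall. Solves n.p = b + d for the 2D position p."""
    normals = np.array([n for n, b in walls], dtype=float)
    rhs = np.array([b + d for (n, b), d in zip(walls, ranges)])
    return np.linalg.solve(normals, rhs)

# Hypothetical room corner at the origin: west wall x = 0, south wall y = 0.
walls = [((1.0, 0.0), 0.0), ((0.0, 1.0), 0.0)]
print(position_from_ranges(walls, [3.2, 4.7]))   # -> [3.2 4.7]
```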
LFRic: Building a new Unified Model
NASA Astrophysics Data System (ADS)
Melvin, Thomas; Mullerworth, Steve; Ford, Rupert; Maynard, Chris; Hobson, Mike
2017-04-01
The LFRic project, named for Lewis Fry Richardson, aims to develop a replacement for the Met Office Unified Model in order to meet the challenges that will be presented by the next generation of exascale supercomputers. The project, a collaboration between the Met Office, STFC Daresbury, and the University of Manchester, builds on the earlier GungHo project to redesign the dynamical core, in partnership with NERC. The new atmospheric model aims to retain the performance of the current ENDGame dynamical core and associated subgrid physics, while enabling far greater scalability and the flexibility to accommodate future supercomputer architectures. The design of the model revolves around the principle of 'separation of concerns', whereby the natural science aspects of the code can be developed without worrying about the underlying architecture, while machine-dependent optimisations can be carried out at a high level. These principles are put into practice through the development of an autogenerated Parallel Systems software layer (known as the PSy layer) using a domain-specific compiler called PSyclone. The prototype model includes a re-write of the dynamical core using a mixed finite element method, in which different function spaces are used to represent the various fields. It is able to run in parallel with MPI and OpenMP and has been tested on over 200,000 cores. In this talk an overview of both the natural science and computational science implementations of the model will be presented.
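As a loose illustration of the 'separation of concerns' idea (an editor's sketch, not PSyclone or LFRic code): the science kernel below knows nothing about the mesh decomposition or parallelization, while a separate driver layer, which in LFRic would be the autogenerated PSy layer, owns the looping and could be retargeted (OpenMP, MPI, etc.) without touching the kernel.

```python
import numpy as np

# Science layer: a pointwise kernel with no knowledge of decomposition.
def advect_kernel(field, wind, dt, dx):
    """First-order upwind update for one periodic column of data."""
    return field - wind * dt / dx * (field - np.roll(field, 1))

# Driver ("PSy"-style) layer: owns the data layout and the loop over
# columns. In a real model this layer is generated and can be swapped
# for a threaded or distributed version, unchanged from the kernel's
# point of view.
def apply_kernel(columns, wind, dt, dx):
    return [advect_kernel(col, wind, dt, dx) for col in columns]

columns = [np.sin(np.linspace(0, 2 * np.pi, 64)) for _ in range(8)]
columns = apply_kernel(columns, wind=1.0, dt=0.1, dx=0.5)
print(columns[0][:4])
```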
DOE Office of Scientific and Technical Information (OSTI.GOV)
Willis, D. K.
2016-12-01
High performance computing (HPC) has been a defining strength of Lawrence Livermore National Laboratory (LLNL) since its founding. Livermore scientists have designed and used some of the world's most powerful computers to drive breakthroughs in nearly every mission area. Today, the Laboratory is recognized as a world leader in the application of HPC to complex science, technology, and engineering challenges. Most importantly, HPC has been integral to the National Nuclear Security Administration's (NNSA's) Stockpile Stewardship Program, designed to ensure the safety, security, and reliability of our nuclear deterrent without nuclear testing. A critical factor behind Lawrence Livermore's preeminence in HPC is the ongoing investments made by the Laboratory Directed Research and Development (LDRD) Program in cutting-edge concepts to enable efficient utilization of these powerful machines. Congress established the LDRD Program in 1991 to maintain the technical vitality of the Department of Energy (DOE) national laboratories. Since then, LDRD has been, and continues to be, an essential tool for exploring anticipated needs that lie beyond the planning horizon of our programs and for attracting the next generation of talented visionaries. Through LDRD, Livermore researchers can examine future challenges, propose and explore innovative solutions, and deliver creative approaches to support our missions. The present scientific and technical strengths of the Laboratory are, in large part, a product of past LDRD investments in HPC. Here, we provide seven examples of LDRD projects from the past decade that have played a critical role in building LLNL's HPC, computer science, mathematics, and data science research capabilities, and describe how they have impacted LLNL's mission.
A research program in empirical computer science
NASA Technical Reports Server (NTRS)
Knight, J. C.
1991-01-01
During the grant reporting period our primary activities have been to begin preparation for the establishment of a research program in experimental computer science. The focus of research in this program will be safety-critical systems. Many questions that arise in the effort to improve software dependability can only be addressed empirically. For example, there is no way to predict the performance of the various proposed approaches to building fault-tolerant software. Performance models, though valuable, are parameterized and cannot be used to make quantitative predictions without experimental determination of underlying distributions. In the past, experimentation has been able to shed some light on the practical benefits and limitations of software fault tolerance. It is common, also, for experimentation to reveal new questions or new aspects of problems that were previously unknown. A good example is the Consistent Comparison Problem that was revealed by experimentation and subsequently studied in depth. The result was a clear understanding of a previously unknown problem with software fault tolerance. The purpose of a research program in empirical computer science is to perform controlled experiments in the area of real-time, embedded control systems. The goal of the various experiments will be to determine better approaches to the construction of the software for computing systems that have to be relied upon. As such it will validate research concepts from other sources, provide new research results, and facilitate the transition of research results from concepts to practical procedures that can be applied with low risk to NASA flight projects. The target of experimentation will be the production software development activities undertaken by any organization prepared to contribute to the research program. Experimental goals, procedures, data analysis and result reporting will be performed for the most part by the University of Virginia.
How to build better memory training games
Deveau, Jenni; Jaeggi, Susanne M.; Zordan, Victor; Phung, Calvin; Seitz, Aaron R.
2015-01-01
Can we create engaging training programs that improve working memory (WM) skills? While there are numerous procedures that attempt to do so, there is a great deal of controversy regarding their efficacy. Nonetheless, recent meta-analytic evidence shows consistent improvements across studies on lab-based tasks, generalizing beyond the specific training effects (Au et al., 2014; Karbach and Verhaeghen, 2014); however, there is little research into how WM training aids participants in their daily life. Here we propose that incorporating design principles from the fields of Perceptual Learning (PL) and Computer Science might augment the efficacy of WM training and ultimately lead to greater learning and transfer. In particular, the field of PL has identified numerous mechanisms (including attention, reinforcement, multisensory facilitation, and multi-stimulus training) that promote brain plasticity. Also, computer science has made great progress in the scientific approach to game design that can be used to create engaging environments for learning. We suggest that approaches integrating knowledge across these fields may lead to more effective WM interventions and better reflect real world conditions.
NASA Astrophysics Data System (ADS)
Engel, P.; Schweimler, B.
2016-04-01
The deformation monitoring of structures and buildings is an important task in modern engineering surveying, ensuring the stability and reliability of the monitored objects over a long period. Several commercial hardware and software solutions for such monitoring measurements are available on the market. In addition to them, a research team at the Neubrandenburg University of Applied Sciences (NUAS) is actively developing a software package for monitoring purposes in geodesy and geotechnics, distributed under an open source licence and free of charge. The task of managing an open source project is well-known in computer science, but it is fairly new in a geodetic context. This paper contributes to that issue by detailing applications, frameworks, and interfaces for the design and implementation of open hardware and software solutions for sensor control, sensor networks, and data management in automatic deformation monitoring. It discusses how the development effort for networked applications can be reduced by using free programming tools, cloud computing technologies, and rapid prototyping methods.
One-Click Data Analysis Software for Science Operations
NASA Astrophysics Data System (ADS)
Navarro, Vicente
2015-12-01
One of the important activities of the ESA Science Operations Centre is to provide Data Analysis Software (DAS) that enables users and scientists to process data further to higher levels. During operations and post-operations, DAS is fully maintained and updated for new OS and library releases. Nonetheless, once a mission goes into the "legacy" phase, funds are very limited and long-term preservation becomes more and more difficult. Building on Virtual Machine (VM), cloud computing, and Software as a Service (SaaS) technologies, this project has aimed at providing long-term preservation of Data Analysis Software for the following missions: - PIA for ISO (1995) - SAS for XMM-Newton (1999) - HIPE for Herschel (2009) - EXIA for EXOSAT (1983) The following goals have guided the architecture: - support for all operations, post-operations, and archive/legacy phases; - support for local (user's computer) and cloud environments (ESAC Cloud, Amazon AWS); - support for expert users, requiring full capabilities; - provision of a simple web-based interface. This talk describes the architecture, challenges, results, and lessons learnt gathered in this project.
Place-Based Learning: Interactive Learning and Net-Zero Design
ERIC Educational Resources Information Center
Holser, Alec; Becker, Michael
2011-01-01
Food and conservation science curriculum, net-zero design and student-based building performance monitoring have come together in the unique and innovative new Music and Science Building for Oregon's Hood River Middle School. The school's Permaculture-based curriculum both informed the building design and was also transformed through the…
Building a Science Communication Culture: One Agency's Approach
NASA Astrophysics Data System (ADS)
DeWitt, S.; Tenenbaum, L. F.; Betz, L.
2014-12-01
Science communication does not have to be a solitary practice. And yet, many scientists go about it alone and with little support from their peers and organizations. To strengthen community and build support for science communicators, NASA designed a training course aimed at two goals: 1) to develop individual scientists' communication skills, and 2) to begin to build a science communication culture at the agency. NASA offered a pilot version of this training course in 2014: the agency's first multidisciplinary face-to-face learning experience for science communicators. Twenty-six Earth, space and life scientists from ten field centers came together for three days of learning. They took part in fundamental skill-building exercises, individual development planning, and high-impact team projects. This presentation will describe the course design and learning objectives, the experience of the participants, and the evaluation results that will inform future offerings of communication training for NASA scientists and others.
Factors influencing exemplary science teachers' levels of computer use
NASA Astrophysics Data System (ADS)
Hakverdi, Meral
This study examines exemplary science teachers' use of technology in science instruction, the factors influencing their level of computer use, their level of knowledge/skills in using specific computer applications for science instruction, their use of computer-related applications/tools during instruction, and their students' use of computer applications/tools in or for their science class. After a relevant review of the literature, certain variables were selected for analysis. These variables included personal self-efficacy in teaching with computers, outcome expectancy, pupil-control ideology, level of computer use, age, gender, teaching experience, personal computer use, professional computer use, and science teachers' level of knowledge/skills in using specific computer applications for science instruction. The sample for this study includes middle and high school science teachers who received the Presidential Award for Excellence in Science Teaching (sponsored by the White House and the National Science Foundation) between 1997 and 2003, from all 50 states and U.S. territories. Award-winning science teachers were contacted about the survey via e-mail or letter with an enclosed return envelope. Of the 334 award-winning science teachers, usable responses were received from 92, a response rate of 27.5%. Analysis of the survey responses indicated that exemplary science teachers have a variety of knowledge/skills in using computer-related applications/tools. The most commonly used computer applications/tools are information retrieval via the Internet, presentation tools, online communication, digital cameras, and data collection probes. Results of the study revealed that students' use of technology in the science classroom is highly correlated with the frequency of their science teachers' use of computer applications/tools. The multiple regression analysis revealed that personal self-efficacy was related to the exemplary science teachers' level of computer use, suggesting that computer use depends on perceived ability at using computers. The teachers' use of computer-related applications/tools during class, and their personal self-efficacy, age, and gender, were highly related to their level of knowledge/skills in using specific computer applications for science instruction. The teachers' level of knowledge/skills in using specific computer applications for science instruction and their gender were related to their use of computer-related applications/tools during class and to the students' use of computer-related applications/tools in or for their science class. In conclusion, exemplary science teachers need assistance in learning and using computer-related applications/tools in their science classes.
Cloudbus Toolkit for Market-Oriented Cloud Computing
NASA Astrophysics Data System (ADS)
Buyya, Rajkumar; Pandey, Suraj; Vecchiola, Christian
This keynote paper: (1) presents the 21st century vision of computing and identifies various IT paradigms promising to deliver computing as a utility; (2) defines the architecture for creating market-oriented Clouds and computing atmosphere by leveraging technologies such as virtual machines; (3) provides thoughts on market-based resource management strategies that encompass both customer-driven service management and computational risk management to sustain SLA-oriented resource allocation; (4) presents the work carried out as part of our new Cloud Computing initiative, called Cloudbus: (i) Aneka, a Platform as a Service software system containing SDK (Software Development Kit) for construction of Cloud applications and deployment on private or public Clouds, in addition to supporting market-oriented resource management; (ii) internetworking of Clouds for dynamic creation of federated computing environments for scaling of elastic applications; (iii) creation of 3rd party Cloud brokering services for building content delivery networks and e-Science applications and their deployment on capabilities of IaaS providers such as Amazon along with Grid mashups; (iv) CloudSim supporting modelling and simulation of Clouds for performance studies; (v) Energy Efficient Resource Allocation Mechanisms and Techniques for creation and management of Green Clouds; and (vi) pathways for future research.
NASA Astrophysics Data System (ADS)
Anbar, Ariel; Center for Education Through eXploration
2018-01-01
Advances in scientific visualization and public access to data have transformed science outreach and communication, but have yet to realize their potential impacts in the realm of education. Computer-based learning is a clear bridge between visualization and education that benefits students through adaptive personalization and enhanced access. Building this bridge requires close partnerships among scientists, technologists, and educators. The Infiniscope project fosters such partnerships to produce exploration-driven online learning experiences that teach basic science concepts using a combination of authentic space science narratives, data, and images, and a personalized guided inquiry approach. Infiniscope includes a web portal to host these digital learning experiences, as well as a teaching network of educators using and modifying them. Infiniscope experiences are built around a new theory of digital learning design that we call "education through exploration" (ETX), developed during the creation of successful online, interactive science courses offered at ASU and other institutions. ETX builds on the research-based practices of active learning and guided inquiry to provide a set of design principles that aim to develop higher order thinking skills in addition to understanding of content. It is employed in these experiences by asking students to solve problems and actively discover relationships, supported by an intelligent tutoring system which provides immediate, personalized feedback and scaffolds scientific thinking and methods. The project is led by ASU's School of Earth and Space Exploration, working with learning designers in the Center for Education Through eXploration, with support from NASA's Science Mission Directorate as part of the NASA Exploration Connection program. We will present an overview of ETX design, the Infiniscope project, and emerging evidence of effectiveness.
Marine Science Building Dedicated
NASA Technical Reports Server (NTRS)
2003-01-01
Officials cut the ribbon during dedication ceremonies of the George A. Knauer Marine Science Building on Oct. 17 at NASA Stennis Space Center (SSC). The $2.75 million facility, the first building at the test site funded by the state of Mississippi, houses six science labs, classrooms and office space for 40 faculty and staff. Pictured are, from left, Rear Adm. Thomas Donaldson, commander of the Naval Meteorology and Oceanography Command; SSC Assistant Director David Throckmorton; Dr. George A. Knauer, founder of the Center of Marine Science at the University of Southern Mississippi (USM); Lt. Gov. Amy Tuck; and USM President Dr. Shelby Thames.
West elevation. San Bernardino Valley Union Junior College, Science Building. ...
West elevation. San Bernardino Valley Union Junior College, Science Building. Also includes plan of entrance, section EE showing tiling and typical transom design, and a full size detail of a door jamb for inside concrete walls. Howard E. Jones, Architect, San Bernardino, California. Sheet 7, job no. 311. Scale: 1/2 inch to the foot. February 15, 1927. - San Bernardino Valley College, Life Science Building, 701 South Mount Vernon Avenue, San Bernardino, San Bernardino County, CA
Development of a PC-based ground support system for a small satellite instrument
NASA Astrophysics Data System (ADS)
Deschambault, Robert L.; Gregory, Philip R.; Spenler, Stephen; Whalen, Brian A.
1993-11-01
The importance of effective ground support for the remote control and data retrieval of a satellite instrument cannot be overstated. Problems with ground support may include the need to base personnel at a ground tracking station for extended periods, and the delay between the instrument observation and the processing of the data by the science team. Flexible solutions to such problems in the case of small satellite systems are provided by using low-cost, powerful personal computers and off-the-shelf software for data acquisition and processing, and by using the Internet as a communication pathway to enable scientists to view and manipulate satellite data in real time at any ground location. The personal-computer-based ground support system is illustrated for the case of the cold plasma analyzer flown on the Freja satellite. Commercial software was used as building blocks for writing the ground support equipment software. Several levels of hardware support, including unit tests and development, functional tests, and integration, were provided by portable and desktop personal computers. Satellite stations in Saskatchewan and Sweden were linked to the science team via phone lines and the Internet, which provided remote control through a central point. These successful strategies will be used on future small satellite space programs.
Science Facilities Bibliography.
ERIC Educational Resources Information Center
National Science Foundation, Washington, DC.
A bibliographic collection on science buildings and facilities is cited with many different reference sources for those concerned with the design, planning, and layout of science facilities. References are given covering a broad scope of information on--(1) physical plant planning, (2) management and safety, (3) building type studies, (4) design…
NASA Astrophysics Data System (ADS)
Warner, T. T.; Swerdlin, S. P.; Chen, F.; Hayden, M.
2009-05-01
The innovative use of Computational Fluid-Dynamics (CFD) models to define the building- and street-scale atmospheric environment in urban areas can benefit society in a number of ways. Design criteria used by architectural climatologists, who help plan the livable cities of the future, require information about air movement within street canyons for different seasons and weather regimes. Understanding indoor urban air-quality problems and their mitigation, especially for older buildings, requires data on air movement and associated dynamic pressures near buildings. Learning how heat waves and anthropogenic forcing in cities collectively affect the health of vulnerable residents is a problem in building thermodynamics, human behavior, and neighborhood-scale and street-canyon-scale atmospheric sciences. And, predicting the movement of plumes of hazardous material released in urban industrial or transportation accidents requires detailed information about vertical and horizontal air motions in the street canyons. These challenges are closer to being addressed because of advances in CFD modeling, the coupling of CFD models with models of indoor air motion and air quality, and the coupling of CFD models with mesoscale weather-prediction models. This paper will review some of the new knowledge and technologies that are being developed to meet these atmospheric-environment needs of our growing urban populations.
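The plume-prediction need mentioned in the abstract is often introduced through the textbook Gaussian plume model before moving to full CFD. The sketch below is that textbook model only, not one of the coupled CFD systems reviewed in the paper, and its dispersion coefficients are invented for illustration.

    import numpy as np

    def gaussian_plume(Q, u, x, y, z, H, a=0.08, b=0.06):
        """Steady-state concentration (g/m^3) downwind of a point release.

        Q: emission rate (g/s); u: mean wind speed (m/s);
        x, y, z: downwind, crosswind, and vertical receptor position (m);
        H: effective release height (m); a, b: illustrative coefficients
        (sigma = coeff * x); real studies use stability-class tables.
        """
        sig_y, sig_z = a * x, b * x
        crosswind = np.exp(-y**2 / (2 * sig_y**2))
        # Ground reflection is modeled with an image source at height -H.
        vertical = (np.exp(-(z - H)**2 / (2 * sig_z**2)) +
                    np.exp(-(z + H)**2 / (2 * sig_z**2)))
        return Q / (2 * np.pi * u * sig_y * sig_z) * crosswind * vertical

    # A 10 g/s release at 30 m, sampled 500 m downwind at ground level.
    print(gaussian_plume(Q=10.0, u=3.0, x=500.0, y=0.0, z=0.0, H=30.0))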
Creative Building Design for Innovative Earth Science Teaching and Outreach (Invited)
NASA Astrophysics Data System (ADS)
Chan, M. A.
2009-12-01
Earth Science departments can blend the physical “bricks and mortar” facility with programs and educational displays to create a facility that is a permanent outreach tool and a welcoming home for teaching and research. The new Frederick Albert Sutton building at the University of Utah is one of the first LEED (Leadership in Energy and Environmental Design) certified Earth Science buildings in the country. Throughout the structure, creative architectural designs are combined with sustainability, artful geologic displays, and community partnerships. Distinctive features of the building include: 1) Unique, inviting geologic designs such as a cross bedding pattern in the concrete foundation; “a river runs through it” (a pebble tile “stream” inside the entrance); a “confluence” lobby with spectacular Eocene Green River fossil fish and plant walls; polished rock slabs; and many natural stone elements. All displays are also designed as teaching tools. 2) Student-generated, energy efficient, sustainable projects such as: solar tube lights, xeriscape & rock monoliths, rainwater collection, roof garden, pervious cement, and energy monitoring. 3) Reinforced concrete foundation for vibration-free analytical measurements, and exposed lab ceilings for duct work and infrastructure adaptability. The spectacular displays for this special project were made possible by new partnerships within the community. Companies participated with generous, in-kind donations (e.g., services, stone flooring and slabs, and landscape rocks). They received recognition in the building and in literature acknowledging donors. A beautiful built environment creates space that students, faculty, and staff are proud of. People feel good about coming to work, and they are happy about their surroundings. This makes a strong recruiting tool, with more productive and satisfied employees. Buildings with architectural interest and displays can showcase geology as art and science, while highlighting what Earth Scientists do. This approach can transform our Earth Science buildings into destinations for visitors that evoke inquiry. The building becomes a centerpiece, not another blank box on campus. Administrators at the University of Utah now want other new building structures to emulate our geoscience example. Done right, “bricks and mortar” can build stronger departments, infuse Earth Science into the community, and enhance our educational missions. LEED-certified Earth Science building with Eocene fossil fish wall, river pebble pattern in floor tile, displays, and student gathering areas.
Why Machine-Information Metaphors are Bad for Science and Science Education
NASA Astrophysics Data System (ADS)
Pigliucci, Massimo; Boudry, Maarten
2011-05-01
Genes are often described by biologists using metaphors derived from computational science: they are thought of as carriers of information, as being the equivalent of "blueprints" for the construction of organisms. Likewise, cells are often characterized as "factories" and organisms themselves become analogous to machines. Accordingly, when the human genome project was initially announced, the promise was that we would soon know how a human being is made, just as we know how to make airplanes and buildings. Importantly, modern proponents of Intelligent Design, the latest version of creationism, have exploited biologists' use of the language of information and blueprints to make their spurious case, based on pseudoscientific concepts such as "irreducible complexity" and on flawed analogies between living cells and mechanical factories. However, the living organism = machine analogy was criticized already by David Hume in his Dialogues Concerning Natural Religion. In line with Hume's criticism, over the past several years a more nuanced and accurate understanding of what genes are and how they operate has emerged, ironically in part from the work of computational scientists who take biology, and in particular developmental biology, more seriously than some biologists seem to do. In this article we connect Hume's original criticism of the living organism = machine analogy with the modern ID movement, and illustrate how the use of misleading and outdated metaphors in science can play into the hands of pseudoscientists. Thus, we argue that dropping the blueprint and similar metaphors will improve both the science of biology and its understanding by the general public.
Automatic computation for optimum height planning of apartment buildings to improve solar access
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seong, Yoon-Bok; Kim, Yong-Yee; Seok, Ho-Tae
2011-01-15
The objective of this study is to suggest a mathematical model and an optimal algorithm for determining the height of apartment buildings to satisfy the solar rights of survey buildings or survey housing units. The objective is also to develop an automatic computation model for the optimum height of apartment buildings and then to clarify its performance and expected effects. To accomplish the objective of this study, the following procedures were followed: (1) The necessity of height planning of obstruction buildings to satisfy the solar rights of survey buildings or survey housing units is demonstrated by analyzing, through a literature review, the recent trend of disputes related to solar rights and by examining the social requirements in terms of solar rights. In addition, the necessity of an automatic computation system for height planning of apartment buildings is demonstrated, and a suitable analysis method for this system is chosen by investigating the characteristics of analysis methods for solar rights assessment. (2) A case study on the process of height planning of apartment buildings is briefly described, and the problems occurring in this process are then examined carefully. (3) To develop an automatic computation model for height planning of apartment buildings, the geometrical elements forming apartment buildings are defined by analyzing the geometrical characteristics of apartment buildings. In addition, design factors and regulations required in height planning of apartment buildings are investigated. Based on this knowledge, the methodology and mathematical algorithm to adjust the height of apartment buildings by automatic computation are suggested, and probable problems and ways to resolve them are discussed. Finally, the methodology and algorithm for the optimization are suggested. (4) Based on the suggested methodology and mathematical algorithm, the automatic computation model for the optimum height of apartment buildings is developed, and the developed system is verified through the application of some cases. The effects of the suggested model are then demonstrated quantitatively and qualitatively. (author)
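The geometry underlying such a height check is compact enough to sketch. The function below, a toy illustration rather than the paper's model, computes the tallest obstruction building whose shadow still clears a point on the survey facade at a given sun altitude; the distances and angles are invented.

    import math

    def max_obstruction_height(gap_m, sun_altitude_deg, sill_height_m=0.0):
        """Tallest obstruction (m) that leaves the survey facade sunlit.

        gap_m: horizontal distance between obstruction and survey facade (m)
        sun_altitude_deg: solar altitude used for the solar-rights check
        sill_height_m: point on the facade that must remain in sunlight (m)
        """
        # A building of height h casts a shadow of length (h - s) / tan(alt)
        # above facade height s, so the limit is h = s + gap * tan(alt).
        return sill_height_m + gap_m * math.tan(math.radians(sun_altitude_deg))

    # Buildings 40 m apart, winter-solstice altitude 25 deg, sill at 1 m.
    print(round(max_obstruction_height(40.0, 25.0, 1.0), 1), "m")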
Why Johnny can't reengineer health care processes with information technology.
Webster, C; McLinden, S; Begler, K
1995-01-01
Many educational institutions are developing curricula that integrate computer and business knowledge and skills concerning a specific industry, such as banking or health care. We have developed a curriculum that emphasizes, equally, medical, computer, and business management concepts. Along the way we confronted a formidable obstacle, namely the domain specificity of the reference disciplines. Knowledge within each domain is sufficiently different from other domains that it reduces the leverage of building on preexisting knowledge and skills. We review this problem from the point of view of cognitive science (in particular, knowledge representation and machine learning) to suggest strategies for coping with incommensurate domain ontologies. These strategies include reflective judgment, implicit learning, abstraction, generalization, analogy, multiple inheritance, project-orientation, selectivity, goal- and failure-driven learning, and case- and story-based learning.
77 FR 60448 - National Institute of Environmental Health Sciences Notice of Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-03
... Health Sciences, Building 101, Rodbell Auditorium, 111 T. W. Alexander Drive, Research Triangle Park, NC..., Rodbell Auditorium, 111 T. W. Alexander Drive, Research Triangle Park, NC 27709. Closed: November 5, 2012... Sciences, Building 101, Rodbell Auditorium, 111 T. W. Alexander Drive, Research Triangle Park, NC 27709...
78 FR 59042 - National Institute of Environmental Health Sciences; Notice of Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-25
...: Nat. Inst. of Environmental Health Sciences, Building 101, Rodbell Auditorium, 111 T. W. Alexander... Auditorium, 111 T. W. Alexander Drive, Research Triangle Park, NC 27709. Closed: October 21, 2013, 11:15 a.m... Environmental Health Sciences, Building 101, Rodbell Auditorium, 111 T. W. Alexander Drive, Research Triangle...
77 FR 3480 - National Institute of Environmental Health Sciences Notice of Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-24
..., Rodbell Auditorium, 111 T. W. Alexander Drive, Research Triangle Park, NC 27709. Closed: February 15, 2012... Environmental Health Sciences, Building 101, Rodbell Auditorium, 111 T. W. Alexander Drive, Research Triangle... and issues. Place: Nat. Inst. of Environmental Health Sciences, Building 101, Rodbell Auditorium, 111...
77 FR 18252 - National Institute of Environmental Health Sciences; Notice of Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-27
... Environmental Health Sciences, Building 101, Rodbell Auditorium, 111 T.W. Alexander Drive, Research Triangle... issues. Place: Nat. Inst. of Environmental Health Sciences, Building 101, Rodbell Auditorium, 111 T. W..., Rodbell Auditorium, 111 T.W. Alexander Drive, Research Triangle Park, NC 27709. Contact Person: Gwen W...
75 FR 3474 - National Institute of Environmental Health Sciences; Notice of Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-21
...: Nat. Inst. of Environmental Health Sciences, Building 101, Rodbell Auditorium, 111 T. W. Alexander..., Rodbell Auditorium, 111 T.W. Alexander Drive, Research Triangle Park, NC 27709. Closed: February 19, 2010... Environmental Health Sciences, Building 101, Rodbell Auditorium, 111 T. W. Alexander Drive, Research Triangle...
78 FR 48695 - National Institute of Environmental Health Sciences; Notice of Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-09
... Health Sciences, Building 101, Rodbell Auditorium, 111 T. W. Alexander Drive, Research Triangle Park, NC.... of Environmental Health Sciences, Building 101, Rodbell Auditorium, 111 T. W. Alexander Drive..., Rodbell Auditorium, 111 T. W. Alexander Drive, Research Triangle Park, NC 27709. Contact Person: Gwen W...
77 FR 74198 - National Institute of Environmental Health Sciences Notice of Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-13
... Health Sciences, Building 101, Rodbell Auditorium, 111 T. W. Alexander Drive, Research Triangle Park, NC... issues. Place: Nat. Inst. of Environmental Health Sciences, Building 101, Rodbell Auditorium, 111 T. W... Auditorium, 111 T. W. Alexander Drive, Research Triangle Park, NC 27709. Contact Person: Gwen W. Collman, Ph...
76 FR 6146 - National Institute of Environmental Health Sciences; Notice of Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-03
...: National Institute of Environmental Health Sciences, Building 101, Rodbell Auditorium, 111 T.W. Alexander... 101, Rodbell Auditorium, 111 T.W. Alexander Drive, Research Triangle Park, NC 27709. Open: February 17... Institute of Environmental Health Sciences, Building 101, Rodbell Auditorium, 111 T.W. Alexander Drive...
75 FR 19981 - National Institute of Environmental Health Sciences; Notice of Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-16
... Environmental Health Sciences, Building 101, Rodbell Auditorium, 111 T. W. Alexander Drive, Research Triangle... issues. Place: Nat. Inst. of Environmental Health Sciences, Building 101, Rodbell Auditorium, 111 T. W..., Rodbell Auditorium, 111 T. W. Alexander Drive, Research Triangle Park, NC 27709. Contact Person: Gwen W...
NASA Astrophysics Data System (ADS)
Bérczi, Sz.; Hegyi, S.; Hudoba, Gy.; Hargitai, H.; Kokiny, A.; Drommer, B.; Gucsik, A.; Pintér, A.; Kovács, Zs.
Several teachers and students had the opportunity to visit the International Space Camp in the vicinity of NASA's MSFC in Huntsville, Alabama, USA, where they learned about the success of simulators in space science education. To apply these results in universities and colleges in Hungary, we began unified complex modelling in planetary geology, robotics, electronics, and complex environmental analysis by constructing an experimental space probe model system. First a university experimental lander, HUNVEYOR (Hungarian UNiversity surVEYOR), and then a rover named HUSAR (Hungarian University Surface Analyser Rover) were built. For Hunveyor, the idea and example was the historical Surveyor program of NASA in the 1960s; for the Husar, the idea and example was the Pathfinder's rover, Sojourner. The first step was the construction of the lander; a year later the rover followed. The main goals are: (1) to build the lander structure and basic electronics from cheap, everyday, PC-compatible elements; (2) to construct basic experiments and their instruments; (3) to use the system as a space activity simulator; (4) this simulator contains a lander with an on-board computer for work on a test planetary surface and a terrestrial control computer; (5) to harmonize the assemblage of the electronic system and instruments at various levels of autonomy, from the power and communication circuits; (6) to use the complex system in education for in situ understanding of complex planetary environmental problems; (7) to build various planetary environments for application of the…
ERIC Educational Resources Information Center
Swart, Sandra, Ed.; Friesen, Barbara, Ed.; Holman, Ariel, Ed.; Aue, Nicole, Ed.
2009-01-01
The State of the Science conference was held in May 2007 as part of the ongoing series of national conferences, "Building on Family Strengths," conducted by the Research and Training Center on Family Support and Children's Mental Health at Portland State University. The theme of this State-of-the-Science conference was "Effective…
Investigation of wind behaviour around high-rise buildings
NASA Astrophysics Data System (ADS)
Mat Isa, Norasikin; Fitriah Nasir, Nurul; Sadikin, Azmahani; Ariff Hairul Bahara, Jamil
2017-09-01
A study of wind behaviour around high-rise buildings was conducted through wind tunnel experiments and computational fluid dynamics. High-rise buildings refer to buildings or structures that have more than 12 floors. Wind is invisible to the naked eye; thus, it is hard to see and analyse its flow around and over buildings without proper methods, such as a wind tunnel and computational fluid dynamics software. The study was conducted on buildings located in Presint 4, Putrajaya, Malaysia, namely the Ministry of Rural and Regional Development, the Ministry of Information Communications and Culture, the Ministry of Urban Wellbeing, Housing and Local Government, and the Ministry of Women, Family, and Community, by making scaled models of the buildings. The parameters of the study are four wind velocities, chosen according to the seasonal monsoons, and the wind direction. The ANSYS Fluent workbench software is used to compute the simulations in order to achieve the objectives of this study. The computational fluid dynamics results are validated against the wind tunnel experiment. From these results, the study identifies the characteristics of wind around buildings, including the boundary layer of the buildings, flow separation, and the wake region. Analyses are then conducted on the phenomena produced as wind passes the buildings, based on the velocity difference before and after the wind passes the buildings.
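One standard ingredient of both the wind tunnel and the CFD setup is the approach-flow profile imposed upstream of the models. The sketch below uses a generic power-law atmospheric boundary layer profile; the exponent and reference values are assumptions for illustration and are not taken from the paper.

    def power_law_wind(z, u_ref=5.0, z_ref=10.0, alpha=0.25):
        """Mean wind speed (m/s) at height z (m) from a power-law profile.

        u_ref: speed at the reference height z_ref (m/s); alpha: terrain-
        dependent exponent (about 0.25 for suburban surroundings).
        """
        return u_ref * (z / z_ref) ** alpha

    # Inlet profile sampled over the height of a ~40 m (12-storey) facade.
    for z in (10, 20, 30, 40):
        print(z, "m:", round(power_law_wind(z), 2), "m/s")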
Careers in Data Science: A Berkeley Perspective
NASA Astrophysics Data System (ADS)
Koy, K.
2015-12-01
Last year, I took on an amazing opportunity to serve as the Executive Director of the new Berkeley Institute for Data Science (BIDS). After a 15-year career working with geospatial data to advance our understanding of the environment, I have been presented with a unique opportunity through BIDS to work with talented researchers from a wide variety of backgrounds. Founded in 2013, BIDS is a central hub of research and education at UC Berkeley designed to facilitate and nurture data-intensive science. We are building a community centered on a cohort of talented data science fellows and senior fellows who are representative of the world-class researchers from across our campus and are leading the data science revolution within their disciplines. Our initiatives are designed to bring together broad constituents of the data science community, including domain experts from the life, social, and physical sciences and methodological experts from computer science, statistics, and applied mathematics. While many of these individuals rarely cross professional paths, BIDS actively seeks new and creative ways to engage and foster collaboration across these different research fields. In this presentation, I will share my own story, along with some insights into how BIDS is supporting the careers of data scientists, including graduate students, postdocs, faculty, and research staff. I will also describe how the individuals we support are working to address a number of data-science-related challenges in scientific research.
The new library building at the University of Texas Health Science Center at San Antonio.
Kronick, D A; Bowden, V M; Olivier, E R
1985-01-01
The new University of Texas Health Science Center at San Antonio Library opened in June 1983, replacing the 1968 library building. Planning a new library building provides an opportunity for the staff to rethink their philosophy of service. Of paramount concern and importance is the need to convey this philosophy to the architects. This paper describes the planning process and the building's external features, interior layouts, and accommodations for technology. Details of the move to the building are considered and various aspects of the building are reviewed. PMID:3995205
UNLV’s environmentally friendly Science and Engineering Building is monitored for earthquake shaking
Kalkan, Erol; Savage, Woody; Reza, Shahneam; Knight, Eric; Tian, Ying
2013-01-01
The University of Nevada Las Vegas’ (UNLV) Science and Engineering Building is at the cutting edge of environmentally friendly design. As the result of a recent effort by the U.S. Geological Survey’s National Strong Motion Project in cooperation with UNLV, the building is now also in the forefront of buildings installed with structural monitoring systems to measure response during earthquakes. This is particularly important because this is the first such building in Las Vegas. The seismic instrumentation will provide essential data to better understand the structural performance of buildings, especially in this seismically active region.
The Laboratory for Terrestrial Physics
NASA Technical Reports Server (NTRS)
2003-01-01
The Laboratory for Terrestrial Physics is dedicated to the advancement of knowledge in Earth and planetary science, by conducting innovative research using space technology. The Laboratory's mission and activities support the work and new initiatives at NASA's Goddard Space Flight Center (GSFC). The Laboratory's success contributes to the Earth Science Directorate as a national resource for studies of Earth from space. The Laboratory is part of the Earth Science Directorate based at the GSFC in Greenbelt, MD. The Directorate itself comprises the Global Change Data Center (GCDC), the Space Data and Computing Division (SDCD), and four science laboratories, including the Laboratory for Terrestrial Physics, the Laboratory for Atmospheres, and the Laboratory for Hydrospheric Processes, all in Greenbelt, MD. The fourth research organization, the Goddard Institute for Space Studies (GISS), is in New York, NY. Relevant to NASA's Strategic Plan, the Laboratory ensures that all work undertaken and completed is within the vision of GSFC. The philosophy of the Laboratory is to balance the completion of near-term goals while building on the Laboratory's achievements as a foundation for the scientific challenges of the future.
Weighing the Balance of Science Literacy in Education and Public Policy
NASA Astrophysics Data System (ADS)
Buxner, S.; Impey, C.; Johnson, B.
2015-11-01
Science literacy is a concern of educators and policy makers in the United States and all over the world. Science literacy is defined by society and includes important knowledge for individuals that varies with culture and local knowledge systems. The technological societies of the western world have delegated the knowledge that underpins their everyday world to mechanics who know how their cars work, technicians who know how their computers work, and policy wonks who know how their individual choices and actions will affect the environment and their health. The scientific principles that frame and sculpt the technological world are invisible and mysterious to most people. A question for debate is whether this is a healthy situation, and if not, what to do about it. The panelists shared the prospects and challenges of building science literacy with individuals in the United States and with Tibetan monks. As they discussed their efforts working with these different populations, they shared lessons based on common issues and unique solutions grounded in local knowledge systems and communities of learners.
NASA Astrophysics Data System (ADS)
Lawton, B.; Hemenway, M. K.; Mendez, B.; Odenwald, S.
2013-04-01
Among NASA's major education goals is the training of students in the Science, Technology, Engineering, and Math (STEM) disciplines. The use of real data, from some of the most sophisticated observatories in the world, provides formal educators the opportunity to teach their students real-world applications of the STEM subjects. Combining real space science data with lessons aimed at meeting state and national education standards provides a memorable educational experience that students can build upon throughout their academic careers. Many of our colleagues have adopted the use of real data in their education and public outreach (EPO) programs. There are challenges in creating resources using real data for classroom use that include, but are not limited to, accessibility to computers/Internet and proper instruction. Understanding and sharing these difficulties and best practices with the larger EPO community is critical to the development of future resources. In this session, we highlight three examples of how NASA data is being utilized in the classroom: the Galaxies and Cosmos Explorer Tool (GCET) that utilizes real Hubble Space Telescope data; the computer image-analysis resources utilized by the NASA WISE infrared mission; and the space science derived math applications from SpaceMath@NASA featuring the Chandra and Kepler space telescopes. Challenges and successes are highlighted for these projects. We also facilitate small-group discussions that focus on additional benefits and challenges of using real data in the formal education environment. The report-outs from those discussions are given here.
Building a cloud based distributed active archive data center
NASA Astrophysics Data System (ADS)
Ramachandran, Rahul; Baynes, Katie; Murphy, Kevin
2017-04-01
NASA's Earth Science Data System (ESDS) Program serves as a central cog in facilitating the implementation of NASA's Earth Science strategic plan. Since 1994, the ESDS Program has committed to the full and open sharing of Earth science data obtained from NASA instruments to all users. One of the key responsibilities of the ESDS Program is to continuously evolve the entire data and information system to maximize returns on the collected NASA data. An independent review was conducted in 2015 to holistically review the EOSDIS in order to identify gaps. The review recommendations were to investigate two areas: one, whether commercial cloud providers offer potential for storage, processing, and operational efficiencies, and two, the potential development of new data access and analysis paradigms. In response, ESDS has initiated several prototypes investigating the advantages and risks of leveraging cloud computing. This poster will provide an overview of one such prototyping activity, "Cumulus". Cumulus is being designed and developed as a "native" cloud-based data ingest, archive and management system that can be used for all future NASA Earth science data streams. The long term vision for Cumulus, its requirements, overall architecture, and implementation details, as well as lessons learned from the completion of the first phase of this prototype will be covered. We envision Cumulus will foster design of new analysis/visualization tools to leverage collocated data from all of the distributed DAACs as well as elastic cloud computing resources to open new research opportunities.
Gendered Expectations: Examining How Peers Shape Female Students' Intent to Pursue STEM Fields
Riegle-Crumb, Catherine; Morton, Karisma
2017-01-01
Building on prior psychological and sociological research on the power of local environments to shape gendered outcomes in STEM fields, this study focuses on the critical stage of adolescence to explore the potential negative impact of exposure to exclusionary messages from peers within girls' science classrooms, as well as the positive potential impact of inclusionary messages. Specifically, utilizing longitudinal data from a diverse sample of adolescent youth, analyses examine how the presence of biased male peers, as well as confident female peers, shape girls' subsequent intentions to pursue different STEM fields, focusing specifically on intentions to pursue the male-dominated fields of computer science and engineering, as well as more gender equitable fields. Results reveal that exposure to a higher percentage of 8th grade male peers in the classroom who endorsed explicit gender/STEM stereotypes significantly and negatively predicted girls' later intentions to pursue a computer science/engineering (CS/E) major. Yet results also reveal that exposure to a higher percentage of confident female peers in the science classroom positively predicted such intentions. These results were specific to CS/E majors, suggesting that peers are an important source of messages regarding whether or not girls should pursue non-traditional STEM fields. This study calls attention to the importance of examining both positive and negative sources of influence within the local contexts where young people live and learn. Limitations and directions for future research are also discussed. PMID:28360868
SENSE IT: Student Enabled Network of Sensors for the Environment using Innovative Technology
NASA Astrophysics Data System (ADS)
Hotaling, L. A.; Stolkin, R.; Kirkey, W.; Bonner, J. S.; Lowes, S.; Lin, P.; Ojo, T.
2010-12-01
SENSE IT is a project funded by the National Science Foundation (NSF) which strives to enrich science, technology, engineering and mathematics (STEM) education by providing teacher professional development and classroom projects in which high school students build from first principles, program, test and deploy sensors for water quality monitoring. Sensor development is a broad and interdisciplinary area, providing motivating scenarios in which to teach a multitude of STEM subjects, from mathematics and physics to biology and environmental science, while engaging students with hands-on problems that reinforce conventional classroom learning by re-presenting theory as practical tools for building real-life working devices. The SENSE IT program is currently developing and implementing a set of high school educational modules which teach environmental science and basic engineering through the lens of fundamental STEM principles, at the same time introducing students to a new set of technologies that are increasingly important in the world of environmental research. Specifically, the project provides students with the opportunity to learn the engineering design process through the design, construction, programming and testing of a student-implemented water monitoring network in the Hudson and St. Lawrence Rivers in New York. These educational modules are aligned to state and national technology and science content standards and are designed to be compatible with standard classroom curricula to support a variety of core science, technology and mathematics classroom material. For example, while designing, programming and calibrating the sensors, the students are led through a series of tasks in which they must use core mathematics and physics theory to solve the real problems of making their sensors work. In later modules, students can explore environmental science and environmental engineering curricula while deploying and monitoring their sensors in local rivers. This presentation will provide an overview of the educational modules. A variety of sensors will be described, which are suitably simple for design and construction from first principles by high school students while being accurate enough for students to make meaningful environmental measurements. The presentation will also describe how the sensor-building activities can be tied to core curricula classroom theory, enabling the modules to be utilized in regular classes by mathematics, science and computing teachers without disrupting their semester's teaching goals. Furthermore, the presentation will address the first two years of the SENSE IT project, during which 39 teachers have been equipped with and trained on these materials and have implemented the modules with approximately 2,000 high school students.
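A representative classroom task of the kind described, turning raw readings from a student-built sensor into physical units, reduces to a least-squares line fit. The sketch below uses invented calibration data for a hypothetical temperature sensor; it illustrates the mathematics involved and is not material from the SENSE IT modules.

    import numpy as np

    # Raw ADC counts from a student-built temperature sensor, paired with
    # reference thermometer readings taken during calibration (invented).
    raw = np.array([212.0, 348.0, 505.0, 661.0, 803.0])
    ref_c = np.array([5.1, 10.2, 15.9, 21.0, 26.2])

    # Fit temperature = m * counts + b by ordinary least squares.
    m, b = np.polyfit(raw, ref_c, 1)

    def to_celsius(counts):
        """Convert a raw sensor reading to degrees Celsius."""
        return m * counts + b

    print(round(to_celsius(450.0), 2), "degrees C")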
Hydrological analysis in R: Topmodel and beyond
NASA Astrophysics Data System (ADS)
Buytaert, W.; Reusser, D.
2011-12-01
R is quickly gaining popularity in the hydrological sciences community. The wide range of statistical and mathematical functionality makes it an excellent tool for data analysis, modelling, and uncertainty analysis. Topmodel was one of the first hydrological models to be implemented as an R package and distributed through R's own distribution network, CRAN. This facilitated pre- and postprocessing of data, such as parameter sampling, calculation of prediction bounds, and advanced visualisation. However, apart from these basic functionalities, the package did not use many of the more advanced features of the R environment, especially R's object-oriented functionality. With R's increasing expansion into arenas such as high-performance computing, big data analysis, and cloud services, we revisit the topmodel package and use it as an example of how to build and deploy the next generation of hydrological models. R provides a convenient environment and attractive features to build and couple hydrological, and in extension other environmental, models; to develop flexible and effective data assimilation strategies; and to take the model beyond the individual computer by linking into cloud services for both data provision and computing. However, in order to maximise the benefit of these approaches, it will be necessary to adopt standards and ontologies for model interaction and information exchange. Some of those, such as the OGC web processing standards, are currently being developed, while others have yet to be defined.
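The parameter-sampling and prediction-bounds workflow mentioned above follows a common GLUE-style pattern. For consistency with the other sketches in this document the illustration below is in Python rather than R, with a one-parameter stand-in model and invented observations; it shows the shape of the workflow, not the topmodel package itself.

    import numpy as np

    rng = np.random.default_rng(42)
    obs = np.array([1.0, 0.8, 0.9, 1.3, 1.1])  # invented observations

    def toy_model(k):
        """Stand-in for a hydrological model run with parameter k."""
        return k * np.array([1.0, 0.9, 0.95, 1.2, 1.05])

    # Sample parameters, score each run, keep the 'behavioural' ones.
    ks = rng.uniform(0.5, 1.5, size=5000)
    sims = np.array([toy_model(k) for k in ks])
    rmse = np.sqrt(((sims - obs) ** 2).mean(axis=1))
    behavioural = sims[rmse < np.quantile(rmse, 0.1)]

    # Prediction bounds from the behavioural ensemble.
    lower, upper = np.percentile(behavioural, [5, 95], axis=0)
    print(lower.round(2), upper.round(2))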
NASA Astrophysics Data System (ADS)
Stoilescu, Dorian; Egodawatte, Gunawardena
2010-12-01
Research shows that female and male students in undergraduate computer science programs view computer culture differently. Female students are interested more in the use of computers than in programming, whereas male students see computer science mainly as a programming activity. The overall purpose of our research was not to find new definitions for computer science culture but to see how male and female students see themselves involved in computer science practices, how they see computer science as a successful career, and what they like and dislike about current computer science practices. The study took place in a mid-sized university in Ontario. Sixteen students and two instructors were interviewed to get their views. We found that male and female views differ on computer use, programming, and the pattern of student interactions. Neither female nor male students had any major issues in using computers. In computer programming, female students were much less involved in computing activities, whereas male students were heavily involved. As for opinions about successful computer science professionals, both female and male students emphasized hard work, detail-oriented approaches, and enjoyment of playing with computers. The myth of the geek as the typical profile of successful computer science students was not found to be true.
Interoperability of GADU in using heterogeneous Grid resources for bioinformatics applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sulakhe, D.; Rodriguez, A.; Wilde, M.
2008-03-01
Bioinformatics tools used for efficient and computationally intensive analysis of genetic sequences require large-scale computational resources to accommodate the growing data. Grid computational resources such as the Open Science Grid and TeraGrid have proved useful for scientific discovery. The genome analysis and database update system (GADU) is a high-throughput computational system developed to automate the steps involved in accessing the Grid resources for running bioinformatics applications. This paper describes the requirements for building an automated scalable system such as GADU that can run jobs on different Grids. The paper describes the resource-independent configuration of GADU using the Pegasus-based virtual data system that makes high-throughput computational tools interoperable on heterogeneous Grid resources. The paper also highlights the features implemented to make GADU a gateway to computationally intensive bioinformatics applications on the Grid. The paper will not go into the details of the problems involved or the lessons learned in using individual Grid resources, as these have already been published in our paper on the genome analysis research environment (GNARE); it focuses primarily on the architecture that makes GADU resource independent and interoperable across heterogeneous Grid resources.
Knowledge-Building Activity Structures in Japanese Elementary Science Pedagogy
ERIC Educational Resources Information Center
Oshima, Jun; Oshima, Ritsuko; Murayama, Isao; Inagaki, Shigenori; Takenaka, Makiko; Yamamoto, Tomokazu; Yamaguchi, Etsuji; Nakayama, Hayashi
2006-01-01
The purpose of this study is to refine Japanese elementary science activity structures by using a CSCL approach to transform the classroom into a knowledge-building community. We report design studies on two science lessons in two consecutive years and describe the progressive refinement of the activity structures. Through comparisons of student…
78 FR 18997 - National Institute of Environmental Health Sciences; Notice of Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-28
... Health Sciences, Building 101, Rodbell Auditorium, 111 T. W. Alexander Drive, Research Triangle Park, NC... personnel issues. Place: Nat. Inst. of Environmental Health Sciences, Building 101, Rodbell Auditorium, 111... Auditorium, 111 T. W. Alexander Drive, Research Triangle Park, NC 27709. Closed: April 15, 2013, 3:15 p.m. to...
Using Google Earth to Teach Plate Tectonics and Science Explanations
ERIC Educational Resources Information Center
Blank, Lisa M.; Plautz, Mike; Almquist, Heather; Crews, Jeff; Estrada, Jen
2012-01-01
"A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas" emphasizes that the practice of science is inherently a model-building activity focused on constructing explanations using evidence and reasoning (NRC 2012). Because building and refining is an iterative process, middle school students may view this practice…
Earth Science Education in Zimbabwe
NASA Astrophysics Data System (ADS)
Walsh, Kevin L.
1999-05-01
Zimbabwe is a mineral-rich country with a long history of Earth Science Education. The establishment of a University Geology Department in 1960 allowed the country to produce its own earth science graduates. These graduates are readily absorbed by the mining industry and few are without work. Demand for places at the University is high and entry standards reflect this. Students enter the University after GCE A levels in three science subjects and most go on to graduate. Degree programmes include B.Sc. General in Geology (plus another science), B.Sc. Honours in Geology and M.Sc. in Exploration Geology and in Geophysics. The undergraduate curriculum is broad-based and increasingly vocationally orientated. A well-equipped building caters for relatively large student numbers and also houses analytical facilities used for research and teaching. Computers are used in teaching from the first year onwards. Staff are on average poorly qualified compared to other universities, but there is an impressive research element. The Department has good links with many overseas universities and external funding agencies play a strong supporting role. That said, financial constraints remain the greatest barrier to future development, although increasing links with the mining industry may cushion this.
NASA Astrophysics Data System (ADS)
Spellman, K.
2017-12-01
A changing climate has impacted Alaska communities at unprecedented rates, and the need for efficient and effective climate change learning in the Boreal and Arctic regions is urgent. Learning programs that can both increase personal understanding and connection to climate change science and also inform large scale scientific research about climate change are an attractive option for building community adaptive capacity at multiple scales. Citizen science has emerged as a powerful tool for facilitating learning across scales, and for building partnerships across natural sciences research, education, and outreach disciplines. As an early career scientist and interdisciplinary researcher, citizen science has become the centerpiece of my work and has provided some of the most rewarding moments of my career. I will discuss my early career journey building a research and leadership portfolio integrating climate change research, learning research, and public outreach through citizen science. I will share key experiences from graduate student to early career PI that cultivated my leadership skills and ability to build partnerships necessary to create citizen science programs that emphasize synergy between climate change research and education.
NASA Astrophysics Data System (ADS)
Bader, D. C.
2015-12-01
The Accelerated Climate Modeling for Energy (ACME) Project is concluding its first year. Supported by the Office of Science in the U.S. Department of Energy (DOE), its vision is to be "an ongoing, state-of-the-science Earth system modeling, simulation and prediction project that optimizes the use of DOE laboratory resources to meet the science needs of the nation and the mission needs of DOE." Included in the "laboratory resources" is a large investment in computational, network and information technologies that will be utilized both to build better and more accurate climate models and to broadly disseminate the data they generate. Current model diagnostic analysis and data dissemination technologies will not scale to the size of the simulations and the complexity of the models envisioned by ACME and other top-tier international modeling centers. In this talk, the ACME Workflow component's plans to meet these future needs will be described and early implementation examples will be highlighted.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, Kristan D.; Faraj, Daniel A.
In a parallel computer, a plurality of logical planes formed of compute nodes of a subcommunicator may be identified by: for each compute node of the subcommunicator and for a number of dimensions beginning with a first dimension: establishing, by a plane building node, in a positive direction of the first dimension, all logical planes that include the plane building node and compute nodes of the subcommunicator in a positive direction of a second dimension, where the second dimension is orthogonal to the first dimension; and establishing, by the plane building node, in a negative direction of the first dimension, all logical planes that include the plane building node and compute nodes of the subcommunicator in the positive direction of the second dimension.
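Read as an algorithm, the description above grows, from each plane-building node, every fully populated axis-aligned rectangle of subcommunicator nodes. The toy two-dimensional sketch below works on plain coordinate pairs with no MPI and is only an interpretation of the patent-style text, not the patented implementation.

    def logical_planes(nodes, anchor):
        """Rectangles of lattice nodes anchored at `anchor`, as corner pairs.

        nodes: set of (x, y) coordinates in the subcommunicator.
        Sweeps the +/- directions of dimension 1 and the + direction of
        dimension 2, mirroring the description above.
        """
        x0, y0 = anchor
        planes = []
        for step in (+1, -1):  # positive, then negative, first dimension
            width = 0
            while (x0 + step * (width + 1), y0) in nodes:
                width += 1
                height = 0
                # Grow upward while every node of the next row is present.
                while all((x0 + step * i, y0 + height + 1) in nodes
                          for i in range(width + 1)):
                    height += 1
                    planes.append(((x0, y0), (x0 + step * width, y0 + height)))
        return planes

    grid = {(x, y) for x in range(4) for y in range(3)}  # a full 4x3 block
    print(len(logical_planes(grid, (0, 0))))  # 6 planes anchored at (0, 0)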
NASA Astrophysics Data System (ADS)
Erickson, T. A.; Granger, B.; Grout, J.; Corlay, S.
2017-12-01
The volume of Earth science data gathered from satellites, aircraft, drones, and field instruments continues to increase. For many scientific questions in the Earth sciences, managing this large volume of data is a barrier to progress, as it is difficult to explore and analyze large volumes of data using the traditional paradigm of downloading datasets to a local computer for analysis. Furthermore, methods for communicating Earth science algorithms that operate on large datasets in an easily understandable and reproducible way are needed. Here we describe a system for developing, interacting with, and sharing well-documented Earth science algorithms that combines existing software components: Jupyter Notebook: An open-source, web-based environment that supports documents that combine code and computational results with text narrative, mathematics, images, and other media. These notebooks provide an environment for interactive exploration of data and development of well-documented algorithms. Jupyter Widgets / ipyleaflet: An architecture for creating interactive user interface controls (such as sliders, text boxes, etc.) in Jupyter Notebooks that communicate with Python code. This architecture includes a default set of UI controls (sliders, dropboxes, etc.) as well as APIs for building custom UI controls. The ipyleaflet project is one example that offers a custom interactive map control that allows a user to display and manipulate geographic data within the Jupyter Notebook. Google Earth Engine: A cloud-based geospatial analysis platform that provides access to petabytes of Earth science data via a Python API. The combination of Jupyter Notebooks, Jupyter Widgets, ipyleaflet, and Google Earth Engine makes it possible to explore and analyze massive Earth science datasets via a web browser, in an environment suitable for interactive exploration, teaching, and sharing. Using these environments can make Earth science analyses easier to understand and reproduce, which may increase the rate of scientific discoveries and the transition of discoveries into real-world impacts.
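A minimal notebook cell combining the pieces named above might look like the sketch below. The ipyleaflet map and the ipywidgets slider are real APIs used in their simplest form; the slider callback is hypothetical, standing in for code that would fetch or swap a data layer, for example one produced through the Google Earth Engine Python API.

    import ipywidgets as widgets
    from ipyleaflet import Map
    from IPython.display import display

    # An interactive map rendered inline in the notebook.
    m = Map(center=(64.8, -147.7), zoom=4)

    year = widgets.IntSlider(value=2010, min=2000, max=2017,
                             description='Year')

    def on_year_change(change):
        # Hypothetical handler: a real analysis would add or swap a tile
        # layer for the selected year here.
        print('Selected year:', change['new'])

    year.observe(on_year_change, names='value')
    display(m, year)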
More Than Words: The Role of Multiword Sequences in Language Learning and Use.
Christiansen, Morten H; Arnon, Inbal
2017-07-01
The ability to convey our thoughts using an infinite number of linguistic expressions is one of the hallmarks of human language. Understanding the nature of the psychological mechanisms and representations that give rise to this unique productivity is a fundamental goal for the cognitive sciences. A long-standing hypothesis is that single words and rules form the basic building blocks of linguistic productivity, with multiword sequences being treated as units only in peripheral cases such as idioms. The new millennium, however, has seen a shift toward construing multiword linguistic units not as linguistic rarities, but as important building blocks for language acquisition and processing. This shift, which originated within theoretical approaches that emphasize language learning and use, has far-reaching implications for theories of language representation, processing, and acquisition. Incorporating multiword units as integral building blocks blurs the distinction between grammar and lexicon; calls for models of production and comprehension that can accommodate and give rise to the effect of multiword information on processing; and highlights the importance of such units to learning. In this special topic, we bring together cutting-edge work on multiword sequences in theoretical linguistics, first-language acquisition, psycholinguistics, computational modeling, and second-language learning to present a comprehensive overview of the prominence and importance of such units in language, their possible role in explaining differences between first- and second-language learning, and the challenges the combined findings pose for theories of language. Copyright © 2017 Cognitive Science Society, Inc.
Avogadro: an advanced semantic chemical editor, visualization, and analysis platform.
Hanwell, Marcus D; Curtis, Donald E; Lonie, David C; Vandermeersch, Tim; Zurek, Eva; Hutchison, Geoffrey R
2012-08-13
The Avogadro project has developed an advanced molecule editor and visualizer designed for cross-platform use in computational chemistry, molecular modeling, bioinformatics, materials science, and related areas. It offers flexible, high quality rendering, and a powerful plugin architecture. Typical uses include building molecular structures, formatting input files, and analyzing output of a wide variety of computational chemistry packages. By using the CML file format as its native document type, Avogadro seeks to enhance the semantic accessibility of chemical data types. The work presented here details the Avogadro library, which is a framework providing a code library and application programming interface (API) with three-dimensional visualization capabilities; and has direct applications to research and education in the fields of chemistry, physics, materials science, and biology. The Avogadro application provides a rich graphical interface using dynamically loaded plugins through the library itself. The application and library can each be extended by implementing a plugin module in C++ or Python to explore different visualization techniques, build/manipulate molecular structures, and interact with other programs. We describe some example extensions, one which uses a genetic algorithm to find stable crystal structures, and one which interfaces with the PackMol program to create packed, solvated structures for molecular dynamics simulations. The 1.0 release series of Avogadro is the main focus of the results discussed here. Avogadro offers a semantic chemical builder and platform for visualization and analysis. For users, it offers an easy-to-use builder, integrated support for downloading from common databases such as PubChem and the Protein Data Bank, extracting chemical data from a wide variety of formats, including computational chemistry output, and native, semantic support for the CML file format. For developers, it can be easily extended via a powerful plugin mechanism to support new features in organic chemistry, inorganic complexes, drug design, materials, biomolecules, and simulations. Avogadro is freely available under an open-source license from http://avogadro.openmolecules.net.
Building Fossils in the Elementary School and Writing about Them Using Computers.
ERIC Educational Resources Information Center
Schlenker, Richard M.; Yoshida, Sarah
This material describes a fossil-building activity using sea shells, chicken bones, and plaster for grade one through three students. Related process skills, vocabulary, computer principles, time requirements, and materials are listed. Two methods of building the fossils are discussed. After building the fossils, classes may be divided into pairs…
ERIC Educational Resources Information Center
Lin, Che-Li; Liang, Jyh-Chong; Su, Yi-Ching; Tsai, Chin-Chung
2013-01-01
Teacher-centered instruction has been widely adopted in college computer science classrooms and has some benefits in training computer science undergraduates. Meanwhile, student-centered contexts have been advocated to promote computer science education. How computer science learners respond to or prefer the two types of teacher authority,…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spentzouris, P.; /Fermilab; Cary, J.
The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modelling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single physics process modeling (covered under SciDAC-1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors. ComPASS is in the first year of executing its plan to develop the next-generation HPC accelerator modeling tools. ComPASS aims to develop an integrated simulation environment that will utilize existing and new accelerator physics modules with petascale capabilities, by employing modern computing and solver technologies. The ComPASS vision is to deliver to accelerator scientists a virtual accelerator and virtual prototyping modeling environment, with the necessary multiphysics, multiscale capabilities. The plan for this development includes delivering accelerator modeling applications appropriate for each stage of the ComPASS software evolution. Such applications are already being used to address challenging problems in accelerator design and optimization. The ComPASS organization for software development and applications accounts for the natural domain areas (beam dynamics, electromagnetics, and advanced acceleration), and all areas depend on the enabling technologies activities, such as solvers and component technology, to deliver the desired performance and integrated simulation environment. The ComPASS applications focus on computationally challenging problems important for the design or performance optimization of all major HEP, NP, and BES accelerator facilities. With the cost and complexity of particle accelerators rising, the use of computation to optimize their designs and find improved operating regimes becomes essential, potentially leading to significant cost savings with modest investment.
Health sciences libraries building survey, 1999-2009.
Ludwig, Logan
2010-04-01
A survey was conducted of health sciences libraries to obtain information about newer buildings, additions, remodeling, and renovations. An online survey was developed, and announcements of survey availability posted to three major email discussion lists: Medical Library Association (MLA), Association of Academic Health Sciences Libraries (AAHSL), and MEDLIB-L. Previous discussions of library building projects on email discussion lists, a literature review, personal communications, and the author's consulting experiences identified additional projects. Seventy-eight health sciences library building projects at seventy-three institutions are reported. Twenty-two are newer facilities built within the last ten years; two are space expansions; forty-five are renovation projects; and nine are combinations of new and renovated space. Six institutions report multiple or ongoing renovation projects during the last ten years. The survey results confirm a continuing migration from print-based to digitally based collections and reveal trends in library space design. Some health sciences libraries report loss of space as they move toward creating space for "community" building. Libraries are becoming more proactive in using or retooling space for concentration, collaboration, contemplation, communication, and socialization. All are moving toward a clearer operational vision of the library as the institution's information nexus and not merely as a physical location with print collections.
Opting in and Creating Demand: Why Young People Choose to Teach Mathematics to Each Other
NASA Astrophysics Data System (ADS)
Tucker-Raymond, Eli; Lewis, Naama; Moses, Maisha; Milner, Chad
2016-12-01
Access to science, technology, engineering, and mathematics fields serves as a key entry point to economic mobility and civic enfranchisement. Such access must take seriously the intellectual power of the knowledge and practices of non-dominant youth. In our case, this has meant to shift epistemic authority in mathematics from academic institutions to young people themselves. This article is about why high school-aged students, from underrepresented groups, choose to participate in an out-of-school time program in which they teach younger children in the domains of mathematics and computer science. It argues for programmatic principles based on access, identity engagement, relationship building, and connections to community to support underrepresented youth as learners, teachers, leaders, and organizers in mathematics-related activities using game design as the focus of activity.
Cristescu, Melania E
2014-10-01
DNA-based species identification, known as barcoding, transformed the traditional approach to the study of biodiversity science. The field is transitioning from barcoding individuals to metabarcoding communities. This revolution involves new sequencing technologies, bioinformatics pipelines, computational infrastructure, and experimental designs. In this dynamic genomics landscape, metabarcoding studies remain insular and biodiversity estimates depend on the particular methods used. In this opinion article, I discuss the need for a coordinated advancement of DNA-based species identification that integrates taxonomic and barcoding information. Such an approach would facilitate access to almost 3 centuries of taxonomic knowledge and 1 decade of building repository barcodes. Conservation projects are time sensitive, research funding is becoming restricted, and informed decisions depend on our ability to embrace integrative approaches to biodiversity science. Copyright © 2014 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Twelve small businesses that are developing equipment and computer programs for geophysics have won Small Business Innovative Research (SBIR) grants from the National Science Foundation for their 1989 proposals. The SBIR program was set up to encourage the private sector to undertake costly, advanced experimental work that has potential for great benefit. The geophysical research projects are a long-path intracavity laser spectrometer for measuring atmospheric trace gases, optimizing a local weather forecast model, a new platform for high-altitude atmospheric science, an advanced density logging tool, a deep-Earth sampling system, superconducting seismometers, a phased-array Doppler current profiler, monitoring mesoscale surface features of the ocean through automated analysis, krypton-81 dating in polar ice samples, discrete stochastic modeling of thunderstorm winds, a layered soil-synthetic liner base system to isolate buildings from earthquakes, and a low-cost continuous on-line organic-content monitor for water-quality determination.
Creating technical heritage object replicas in a virtual environment
NASA Astrophysics Data System (ADS)
Egorova, Olga; Shcherbinin, Dmitry
2016-03-01
The paper presents innovative informatics methods for creating virtual technical heritage replicas, which are of significant scientific and practical importance not only to researchers but to the public in general. 3D modeling and animation of aircraft, spaceships, architectural-engineering buildings, and other technical objects support learning while promoting the preservation of the replicas for future generations. Modern approaches based on the wide usage of computer technologies attract a greater number of young people to explore the history of science and technology and renew their interest in the field of mechanical engineering.
A Boon for the Architect Engineer
NASA Technical Reports Server (NTRS)
1992-01-01
Langley Research Center's need for an improved construction specification system led to an automated system called SPECSINTACT. A catalog of specifications, the system enables designers to retrieve relevant sections from computer storage and modify them as needed. SPECSINTACT has also been adopted by government agencies. The system is an integral part of the Construction Criteria Base (CCB), a single disc containing design and construction information for 10 government agencies including the American Institute of Architects' MASTERSPEC. CCB employs CD-ROM technologies and is available from the National Institute of Building Sciences. Users report substantial savings in time and productivity.
New theory insights and experimental opportunities in Majorana wires
NASA Astrophysics Data System (ADS)
Alicea, Jason
Over the past decade, the quest for Majorana zero modes in exotic superconductors has undergone transformational advances on the design, fabrication, detection, and characterization fronts. The field now seems primed for a new era aimed at Majorana control and readout. This talk will survey intertwined theory and experimental developments that illuminate a practical path toward these higher-level goals. In particular, I will highlight near-term opportunities for testing fundamentals of topological quantum computing and longer-term strategies for building scalable hardware. Supported by the National Science Foundation (DMR-1341822), Institute for Quantum Information and Matter, and Walter Burke Institute at Caltech.
Hyperspectral Imaging and Related Field Methods: Building the Science
NASA Technical Reports Server (NTRS)
Goetz, Alexander F. H.; Steffen, Konrad; Wessman, Carol
1999-01-01
The proposal requested funds for the computing power to bring hyperspectral image processing into undergraduate and graduate remote sensing courses. This upgrade made it possible to handle more students in these oversubscribed courses and to enhance CSES' summer short course entitled "Hyperspectral Imaging and Data Analysis" provided for government, industry, university, and military personnel. Funds were also requested to build field measurement capabilities through the purchase of spectroradiometers, canopy radiation sensors, and a differential GPS system. These instruments provided systematic and complete sets of field data for the analysis of hyperspectral data with the appropriate radiometric and wavelength calibration as well as atmospheric data needed for application of radiative transfer models. The proposed field equipment made it possible to team-teach a new field methods course, unique in the country, that took advantage of the expertise of the investigators rostered in three different departments: Geology, Geography, and Biology.
Computational toxicology using the OpenTox application programming interface and Bioclipse
2011-01-01
Background: Toxicity is a complex phenomenon involving the potential adverse effect on a range of biological functions. Predicting toxicity involves using a combination of experimental data (endpoints) and computational methods to generate a set of predictive models. Such models rely strongly on being able to integrate information from many sources. The required integration of biological and chemical information sources requires, however, a common language to express our knowledge ontologically, and interoperating services to build reliable predictive toxicology applications. Findings: This article describes progress in extending the integrative bio- and cheminformatics platform Bioclipse to interoperate with OpenTox, a semantic web framework which supports open data exchange and toxicology model building. The Bioclipse workbench environment enables functionality from OpenTox web services and easy access to OpenTox resources for evaluating toxicity properties of query molecules. Relevant cases and interfaces based on ten neurotoxins are described to demonstrate the capabilities provided to the user. The integration takes advantage of semantic web technologies, thereby providing an open and simplifying communication standard. Additionally, the use of ontologies ensures proper interoperation and reliable integration of toxicity information from both experimental and computational sources. Conclusions: A novel computational toxicity assessment platform was generated from integration of two open science platforms related to toxicology: Bioclipse, which combines a rich scriptable and graphical workbench environment for integration of diverse sets of information sources, and OpenTox, a platform for interoperable toxicology data and computational services. The combination provides improved reliability and operability for handling large data sets by the use of the Open Standards from the OpenTox Application Programming Interface. This enables simultaneous access to a variety of distributed predictive toxicology databases, and algorithm and model resources, taking advantage of the Bioclipse workbench handling the technical layers. PMID:22075173
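As a concrete illustration of calling an OpenTox-style RESTful service from outside Bioclipse, the hedged Python sketch below posts a query molecule to a prediction endpoint. The base URL, route, and JSON payload are assumptions for exposition; the real OpenTox API defines its own resources and representations.

```python
# Hedged sketch: the endpoint URL and JSON layout below are assumptions
# for illustration; consult the OpenTox API documentation for the real
# resources and representations.
import requests

def predict_toxicity(smiles, base_url="https://opentox.example.org"):
    """Submit a molecule (as a SMILES string) to a hypothetical
    OpenTox-style prediction service and return the parsed response."""
    resp = requests.post(
        f"{base_url}/model/toxicity/predict",  # hypothetical route
        json={"compound": smiles},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    # Caffeine, as an example query molecule
    print(predict_toxicity("CN1C=NC2=C1C(=O)N(C(=O)N2C)C"))
```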
Academic computer science and gender: A naturalistic study investigating the causes of attrition
NASA Astrophysics Data System (ADS)
Declue, Timothy Hall
Far fewer women than men take computer science classes in high school, enroll in computer science programs in college, or complete advanced degrees in computer science. The computer science pipeline begins to shrink for women even before they enter college, but it is at the college level that the "brain drain" is most evident numerically, especially in the first class taken by most computer science majors, called "Computer Science 1" or CS-I. The result, for both academia and industry, is a pronounced technological gender disparity in academic and industrial computer science. The study revealed the existence of several factors influencing success in CS-I. First, and most clearly, the effect of attribution processes seemed to be quite strong. These processes tend to work against success for females and in favor of success for males. Likewise, evidence was discovered which strengthens theories related to prior experience and the perception that computer science has a culture which is hostile to females. Two unanticipated themes emerged relating to the motivation and persistence of successful computer science majors. The findings did not support the belief that females have greater logistical problems in computer science than males, or that females tend to have a different programming style than males which adversely affects the females' ability to succeed in CS-I.
Computer-Game Construction: A Gender-Neutral Attractor to Computing Science
ERIC Educational Resources Information Center
Carbonaro, Mike; Szafron, Duane; Cutumisu, Maria; Schaeffer, Jonathan
2010-01-01
Enrollment in Computing Science university programs is at a dangerously low level. A major reason for this is the general lack of interest in Computing Science by females. In this paper, we discuss our experience with using a computer game construction environment as a vehicle to encourage female participation in Computing Science. Experiments…
Secretary Chu visits Argonne—Groundbreaking ceremony for new Energy Sciences building
DOE Office of Scientific and Technical Information (OSTI.GOV)
Isaacs, Eric; Zimmer, Robert; Durbin, Dick
2011-01-01
U.S. Department of Energy Secretary Steven Chu, joined Senator Richard Durbin, University of Chicago President Robert Zimmer and Argonne Director Eric Isaacs to break ground for Argonne's new Energy and Sciences building.
Life sciences building, north rear, also showing north hall to the right, and the library in the center distance. - San Bernardino Valley College, 701 South Mount Vernon Avenue, San Bernardino, San Bernardino County, CA
Intelligent Systems Technologies and Utilization of Earth Observation Data
NASA Technical Reports Server (NTRS)
Ramapriyan, H. K.; McConaughy, G. R.; Morse, H. S.
2004-01-01
The addition of raw data and derived geophysical parameters from several Earth observing satellites over the last decade to the data held by NASA data centers has created a data-rich environment for the Earth science research and applications communities. The data products are being distributed to a large and diverse community of users. Due to advances in computational hardware, networks and communications, information management and software technologies, significant progress has been made in the last decade in archiving and providing data to users. However, to realize the full potential of the growing data archives, further progress is necessary in the transformation of data into information, and information into knowledge that can be used in particular applications. Sponsored by NASA's Intelligent Systems Project within the Computing, Information and Communication Technology (CICT) Program, a conceptual architecture study has been conducted to examine ideas to improve data utilization through the addition of intelligence into the archives in the context of an overall knowledge building system (KBS). Potential Intelligent Archive concepts include: 1) Mining archived data holdings to improve metadata to facilitate data access and usability; 2) Building intelligence about transformations on data, information, knowledge, and accompanying services; 3) Recognizing the value of results, indexing and formatting them for easy access; 4) Interacting as a cooperative node in a web of distributed systems to perform knowledge building; and 5) Being aware of other nodes in the KBS, participating in open systems interfaces and protocols for virtualization, and achieving collaborative interoperability.
ERIC Educational Resources Information Center
Zendler, Andreas; Klaudt, Dieter
2012-01-01
The significance of computer science for economics and society is undisputed. In particular, computer science is acknowledged to play a key role in schools (e.g., by opening multiple career paths). The provision of effective computer science education in schools is dependent on teachers who are able to properly represent the discipline and whose…
Building Capacity for Actionable Science and Decision Making in Alaska
NASA Astrophysics Data System (ADS)
Timm, K.; Kettle, N.; Buxbaum, T. M.; Trainor, S.; Walsh, J. E.; York, A.
2017-12-01
Investigations of the processes for developing actionable science and supporting partnerships between researchers and practitioners have received increasing attention over the past decade. These studies highlight the importance of leveraging existing relationships and trust, supporting iterative interactions, and dedicating sufficient financial and human capital to the development of usable climate science. However, significant gaps remain in our understanding of how to build capacity for more effective partnerships. To meet these ends, the Alaska Center for Climate Assessment and Policy (ACCAP) is developing a series of trainings for scientists and practitioners to build capacity for producing actionable science. This process includes three phases: scoping and development, training, and evaluation. This presentation reports on the scoping and development phase of the project, which draws on an extensive web-based search of past and present capacity building and training activities, document analysis, and surveys of trainers. A synthesis of successful formats (e.g., training, placements, etc.), curriculum topics (e.g., climate science, interpersonal communication), and approaches to recruitment and curriculum development will be outlined. We then outline our approach for co-developing trainings in three different sectors, which engages other boundary organizations to leverage trust and existing network connections to tailor the training activities. Through this effort we ultimately seek to understand how the processes and outcomes for co-developing trainings in actionable science vary across sectors and their implications for building capacity.
The Spatial and the Visual in Mental Spatial Reasoning: An Ill-Posed Distinction
NASA Astrophysics Data System (ADS)
Schultheis, Holger; Bertel, Sven; Barkowsky, Thomas; Seifert, Inessa
It is an ongoing and controversial debate in cognitive science which aspects of knowledge humans process visually and which ones they process spatially. Similarly, artificial intelligence (AI) and cognitive science research, in building computational cognitive systems, tended to use strictly spatial or strictly visual representations. The resulting systems, however, were suboptimal both with respect to computational efficiency and cognitive plausibility. In this paper, we propose that the problems in both research strands stem from a misconception of the visual and the spatial in mental spatial knowledge processing. Instead of viewing the visual and the spatial as two clearly separable categories, they should be conceptualized as the extremes of a continuous dimension of representation. Regarding psychology, a continuous dimension avoids the need to exclusively assign processes and representations to either one of the categories and, thus, facilitates a more unambiguous rating of processes and representations. Regarding AI and cognitive science, the concept of a continuous spatial/visual dimension provides the possibility of representation structures which can vary continuously along the spatial/visual dimension. As a first step in exploiting these potential advantages of the proposed conception we (a) introduce criteria allowing for a non-dichotomic judgment of processes and representations and (b) present an approach towards representation structures that can flexibly vary along the spatial/visual dimension.
A prototype Upper Atmospheric Research Collaboratory (UARC)
NASA Technical Reports Server (NTRS)
Clauer, C. R.; Atkins, D. E.; Weymouth, T. E.; Olson, G. M.; Niciejewski, R.; Finholt, T. A.; Prakash, A.; Rasmussen, C. E.; Killeen, T.; Rosenberg, T. J.
1995-01-01
The National Collaboratory concept has great potential for enabling 'critical mass' working groups and highly interdisciplinary research projects. We report here on a new program to build a prototype collaboratory using the Sondrestrom Upper Atmospheric Research Facility in Kangerlussuaq, Greenland and a group of associated scientists. The Upper Atmospheric Research Collaboratory (UARC) is a joint venture of researchers in upper atmospheric and space science, computer science, and behavioral science to develop a testbed for collaborative remote research. We define the 'collaboratory' as an advanced information technology environment which enables teams to work together over distance and time on a wide variety of intellectual tasks. It provides: (1) human-to-human communications using shared computer tools and work spaces; (2) group access and use of a network of information, data, and knowledge sources; and (3) remote access and control of instruments for data acquisition. The UARC testbed is being implemented to support a distributed community of space scientists so that they have network access to the remote instrument facility in Kangerlussuaq and are able to interact among geographically distributed locations. The goal is to enable them to use the UARC rather than physical travel to Greenland to conduct team research campaigns. Even on short notice through the collaboratory from their home institutions, participants will be able to meet together to operate a battery of remote interactive observations and to acquire, process, and interpret the data.
The implementation and use of Ada on distributed systems with high reliability requirements
NASA Technical Reports Server (NTRS)
Knight, J. C.
1986-01-01
The general inadequacy of Ada for programming systems that must survive processor loss was shown. A solution to the problem was proposed in which there are no syntactic changes to Ada. The approach was evaluated using a full-scale, realistic application. The application used was the Advanced Transport Operating System (ATOPS), an experimental computer control system developed for a modified Boeing 737 aircraft. The ATOPS system is a full authority, real-time avionics system providing a large variety of advanced features. Methods of building fault tolerance into concurrent systems were explored. A set of criteria by which the proposed method will be judged was examined. Extensive interaction with personnel from Computer Sciences Corporation and NASA Langley occurred to determine the requirements of the ATOPS software. Backward error recovery in concurrent systems was assessed.
NASA Astrophysics Data System (ADS)
Hanna, Philip; Allen, Angela; Kane, Russell; Anderson, Neil; McGowan, Aidan; Collins, Matthew; Hutchison, Malcolm
2015-07-01
This paper outlines a means of improving the employability skills of first-year university students through a closely integrated model of employer engagement within computer science modules. The outlined approach illustrates how employability skills, including communication, teamwork and time management skills, can be contextualised in a manner that directly relates to student learning but can still be linked forward into employment. The paper tests the premise that developing employability skills early within the curriculum will result in improved student engagement and learning within later modules. The paper concludes that embedding employer participation within first-year modules can help turn a distant notion of employability into something of more immediate relevance in terms of how students can best approach learning. Further, by enhancing employability skills early within the curriculum, it becomes possible to improve academic attainment within later modules.
NASA Astrophysics Data System (ADS)
Hutton, Christopher; Wagener, Thorsten; Freer, Jim; Han, Dawei; Duffy, Chris; Arheimer, Berit
2017-03-01
In this article, we reply to a comment made by Melsen et al. [2017] on our previous commentary regarding reproducibility in computational hydrology. Re-executing someone else's code and workflow to derive a set of published results does not by itself constitute reproducibility. However, it forms a key part of the process: it demonstrates that all the degrees of freedom and choices made by the scientist in running the experiment are contained within that code and workflow. This does not only allow us to build and extend directly from the original work, but with full knowledge of decisions made in the original experimental setup, we can then focus our attention to the degrees of freedom of interest: those that occur in hydrological systems that are ultimately our subject of study.
Nanoscale integration is the next frontier for nanotechnology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Picraux, Samuel T
2009-01-01
Nanoscale integration of materials and structures is the next critical step to exploit the promise of nanomaterials. Many novel and fascinating properties have been revealed for nanostructured materials. But if nanotechnology is to live up to its promise we must incorporate these nanoscale building blocks into functional systems that connect to the micro- and macroscale world. To do this we will inevitably need to understand and exploit the resulting combined unique properties of these integrated nanosystems. Much science waits to be discovered in the process. Nanoscale integration extends from the synthesis and fabrication of individual nanoscale building blocks, to the assembly of these building blocks into composite structures, and finally to the formation of complex functional systems. As illustrated in Figure 1, the building blocks may be homogeneous or heterogeneous, the composite materials may be nanocomposite or patterned structures, and the functional systems will involve additional combinations of materials. Nanoscale integration involves assembling diverse nanoscale materials across length scales to design and achieve new properties and functionality. At each stage size-dependent properties, the influence of surfaces in close proximity, and a multitude of interfaces all come into play. Whether the final system involves coherent electrons in a quantum computing approach, the combined flow of phonons and electrons for a high efficiency thermoelectric micro-generator, or a molecular recognition structure for bio-sensing, the combined effects of size, surface, and interface will be critical. In essence, one wants to combine the novel functions available through nanoscale science to achieve unique multi-functionalities not available in bulk materials. Perhaps the best-known example of integration is that of combining electronic components together into very large scale integrated circuits (VLSI). The integrated circuit has revolutionized electronics in many ways, from exploiting field-effect transistor devices and low power complementary logic to enable the electronic watch and hand calculator in the 1970's, to today's microprocessors and memories with billions of devices and a computational power not imagined a few decades ago. The manipulation of charges on a chip, the new concepts in combining devices for logic functions, and the new approaches to computation, information processing, and imaging have all emerged from Kilby and Noyce's simple concept of integrating devices on a single chip. Moving from hard to soft materials, a second more recent example of integration is the DNA microarray. These microarrays, with up to millions of elements in a planar array that can be optically read out, can simultaneously measure the expression of 10's of thousands of genes to study the effects of disease and treatment, or screen for single nucleotide polymorphisms for uses ranging from forensics to predisposition to disease. While still at an early stage, microarrays have revolutionized biosciences by providing the means to interrogate the complex genetic control of biological functions. Just as integrated circuits and microarrays have led to completely new functionalities and performance, the integration of nanoscale materials and structures is anticipated to lead to new performance and enable the design of new functionalities not previously envisioned.
The fundamental questions underlying integration go beyond just complex fabrication or the engineering of known solutions; they lead to new discoveries and new science. The scientific challenges around nanoscale integration necessitate the development of new knowledge that is central to the advance of nanotechnology. To move forward one must address key science questions that arise in nanoscience integration and go beyond a single system or materials area. New science and discoveries especially await around three questions. How does one: (1) Control energy transfer and other interactions across interfaces and over multiple length scales? (2) Understand and control the interactions between nanoscale building blocks to assemble specific integrated structures? (3) Design and exploit interactions within assembled structures to achieve new properties and specific functionalities? These high-level questions can serve to drive research, and to advance understanding of the complex phenomena and multifunctionality that may emerge from integration. For example, in photonics there is considerable effort to understand and control the response of nanoscale conducting structures on dielectrics, to allow one to localize, manipulate, and control electromagnetic energy in integrated systems such as in the field known as metamaterials. Essential to this area is a fundamental understanding of energy transfer across multiple length scales (question 1 above).
Advanced Computational Methods in Bio-Mechanics.
Al Qahtani, Waleed M S; El-Anwar, Mohamed I
2018-04-15
A novel partnership between surgeons and machines, made possible by advances in computing and engineering technology, could overcome many of the limitations of traditional surgery. By extending surgeons' ability to plan and carry out surgical interventions more accurately and with fewer traumas, computer-integrated surgery (CIS) systems could help to improve clinical outcomes and the efficiency of healthcare delivery. CIS systems could have a similar impact on surgery to that long since realised in computer-integrated manufacturing. Mathematical modelling and computer simulation have proved tremendously successful in engineering. Computational mechanics has enabled technological developments in virtually every area of our lives. One of the greatest challenges for mechanists is to extend the success of computational mechanics to fields outside traditional engineering, in particular to biology, the biomedical sciences, and medicine. Biomechanics has significant potential for applications in the orthopaedic industry and the performing arts, since the skills needed for these activities are visibly related to the human musculoskeletal and nervous systems. Although biomechanics is widely used nowadays in the orthopaedic industry to design orthopaedic implants for human joints, dental parts, external fixations and other medical purposes, numerous research efforts, funded by billions of dollars, are still under way to build a new future for sports and human healthcare in what is called the biomechanics era.
Burns, Randal; Roncal, William Gray; Kleissas, Dean; Lillaney, Kunal; Manavalan, Priya; Perlman, Eric; Berger, Daniel R.; Bock, Davi D.; Chung, Kwanghun; Grosenick, Logan; Kasthuri, Narayanan; Weiler, Nicholas C.; Deisseroth, Karl; Kazhdan, Michael; Lichtman, Jeff; Reid, R. Clay; Smith, Stephen J.; Szalay, Alexander S.; Vogelstein, Joshua T.; Vogelstein, R. Jacob
2013-01-01
We describe a scalable database cluster for the spatial analysis and annotation of high-throughput brain imaging data, initially for 3-d electron microscopy image stacks, but for time-series and multi-channel data as well. The system was designed primarily for workloads that build connectomes—neural connectivity maps of the brain—using the parallel execution of computer vision algorithms on high-performance compute clusters. These services and open-science data sets are publicly available at openconnecto.me. The system design inherits much from NoSQL scale-out and data-intensive computing architectures. We distribute data to cluster nodes by partitioning a spatial index. We direct I/O to different systems—reads to parallel disk arrays and writes to solid-state storage—to avoid I/O interference and maximize throughput. All programming interfaces are RESTful Web services, which are simple and stateless, improving scalability and usability. We include a performance evaluation of the production system, highlighting the effectiveness of spatial data organization. PMID:24401992
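The spatial-index partitioning described here can be sketched generically. One standard approach is to interleave the bits of a block's 3-d coordinates into a Morton (Z-order) key and hand contiguous key ranges to cluster nodes, so spatially nearby blocks usually land together. The snippet below illustrates that general idea under stated assumptions; it is not the project's actual implementation.

```python
# Generic illustration of partitioning by a spatial index: interleave the
# bits of 3-d block coordinates into a Morton (Z-order) key, then assign
# contiguous key ranges to cluster nodes. Illustrative only, not the
# project's actual code.

def morton3(x, y, z, bits=3):
    """Interleave the low `bits` bits of x, y, z into one Z-order key."""
    key = 0
    for i in range(bits):
        key |= ((x >> i) & 1) << (3 * i)
        key |= ((y >> i) & 1) << (3 * i + 1)
        key |= ((z >> i) & 1) << (3 * i + 2)
    return key

def node_for_block(x, y, z, num_nodes, bits=3):
    """Assign a block to a node by contiguous Z-order key ranges, so that
    spatially nearby blocks usually land on the same node."""
    total = 1 << (3 * bits)  # number of possible keys for this grid size
    return morton3(x, y, z, bits) * num_nodes // total

# Example: route an 8x8x8 grid of image blocks across 4 nodes
assignments = {(x, y, z): node_for_block(x, y, z, 4)
               for x in range(8) for y in range(8) for z in range(8)}
print(assignments[(0, 0, 0)], assignments[(7, 7, 7)])  # 0 3
```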
Analysis on the University’s Network Security Level System in the Big Data Era
NASA Astrophysics Data System (ADS)
Li, Tianli
2017-12-01
The rapid development of science and technology and the continuous expansion of computer network applications have gradually improved social productivity and had a positive impact on the production efficiency and industrial scale of China's industries. In the big data era, however, network viruses, hackers, and other attack modes threaten network security and pose a potential threat to the safe use of computer networks in colleges and universities. In view of this situation, universities need to analyze the conditions of the big data age and, in line with network security requirements, build a reliable cyberspace security system spanning equipment, systems, data, and other levels in order to avoid network security risks. On this basis, this paper analyzes the hierarchical cyberspace security system for universities in the big data era.
Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing
2011-01-01
Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century. PMID:21444779
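As a toy illustration of applying a spatial principle (locality) to distributed computation, the sketch below splits a 2-d grid into contiguous row bands so each worker exchanges halo data with at most two neighbors. It is a deliberate simplification under stated assumptions, not the authors' system.

```python
# Toy sketch of spatially aware work decomposition: contiguous row bands
# keep each worker's halo exchange limited to two neighbors. This is an
# illustrative simplification, not the authors' system.

def decompose_rows(n_rows, n_workers):
    """Split n_rows into n_workers contiguous (start, stop) bands."""
    base, extra = divmod(n_rows, n_workers)
    bands, start = [], 0
    for w in range(n_workers):
        stop = start + base + (1 if w < extra else 0)
        bands.append((start, stop))
        start = stop
    return bands

# Example: a 1000-row grid over 4 workers
print(decompose_rows(1000, 4))  # [(0, 250), (250, 500), (500, 750), (750, 1000)]
```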
An Ethical Governor for Constraining Lethal Action in an Autonomous System
2009-01-01
Fragment of a prohibited-targets table from the report: cultural property is prohibited from being attacked, including buildings dedicated to religion, art, science, charitable purposes, and historic monuments.
A Financial Technology Entrepreneurship Program for Computer Science Students
ERIC Educational Resources Information Center
Lawler, James P.; Joseph, Anthony
2011-01-01
Education in entrepreneurship is becoming a critical area of curricula for computer science students. Few schools of computer science have a concentration in entrepreneurship in the computing curricula. The paper presents Technology Entrepreneurship in the curricula at a leading school of computer science and information systems, in which students…
ERIC Educational Resources Information Center
Menekse, Muhsin
2015-01-01
While there has been a remarkable interest to make computer science a core K-12 academic subject in the United States, there is a shortage of K-12 computer science teachers to successfully implement computer sciences courses in schools. In order to enhance computer science teacher capacity, training programs have been offered through teacher…
The Brazilian INPE-UFSM NANOSATC-BR CubeSat Development Capacity Building Program
NASA Astrophysics Data System (ADS)
Schuch, Nelson Jorge; Cupertino Durao, Otavio S.
The Brazilian INPE-UFSM NANOSATC-BR CubeSat Development Capacity Building Program (CBP) and the results of NANOSATC-BR1, the first Brazilian CubeSat, whose launch is expected in the first semester of 2014, are presented. The CBP consists of two CubeSats, NANOSATC-BR 1 (1U) & 2 (2U), each expected to operate in orbit for at least 12 months, with capacity building in space science, engineering, and computer sciences for the development of space technologies using CubeSat satellites. The INPE-UFSM CBP is a cooperation basically among: (i) the Southern Regional Space Research Center (CRS) of the Brazilian INPE/MCTI, where the Program's General Coordinator and the NANOSATC-BR 1 & 2 Projects Manager are based, with technical collaboration and management from the Mission's General Coordinator for Engineering and Space Technology at INPE's Headquarters (HQ) in São José dos Campos, São Paulo; (ii) the Santa Maria Space Science Laboratory (LACESM/CT) of the Federal University of Santa Maria (UFSM); (iii) the Santa Maria Design House (SMDH); (iv) the Graduate Program in Microelectronics of the Federal University of Rio Grande do Sul (MG/II/UFRGS); and (v) the Aeronautic Institute of Technology (ITA/DCTA/CA-MD). The INPE-UFSM CBP involves undergraduate students from UFSM and graduate students from INPE/MCTI, MG/II/UFRGS, and ITA/DCTA/CA-MD. The capacity-building operation of the NANOSATC-BR 1 & 2 Projects Ground Stations (GS), with VHF/UHF-band and S-band antennas, is described in two specific papers at COSPAR-2014. This paper focuses on the development of NANOSATC-BR 1 & 2 and on the launch of NANOSATC-BR1. The Projects' concepts were developed to: i) monitor, in real time, the Geospace, the Ionosphere, the energetic particle precipitation, and the disturbances of the Earth's Magnetosphere over the Brazilian Territory, and ii) determine their effects on regions such as the South American Magnetic Anomaly (SAMA) and the Brazilian sector of the Equatorial Electrojet (EEJ). The Program has support from the Brazilian Space Agency (AEB).
NASA Astrophysics Data System (ADS)
Strayer, Michael
2007-09-01
Good morning. Welcome to Boston, the home of the Red Sox, Celtics and Bruins, baked beans, tea parties, Robert Parker, and SciDAC 2007. A year ago I stood before you to share the legacy of the first SciDAC program and identify the challenges that we must address on the road to petascale computing—a road E. E. Cummings described as `. . . never traveled, gladly beyond any experience.' Today, I want to explore the preparations for the rapidly approaching extreme scale (X-scale) generation. These preparations are the first step propelling us along the road of burgeoning scientific discovery enabled by the application of X-scale computing. We look to petascale computing and beyond to open up a world of discovery that cuts across scientific fields and leads us to a greater understanding of not only our world, but our universe. As part of the President's American Competitiveness Initiative, the ASCR Office has been preparing a ten-year vision for computing. As part of this planning, LBNL together with ORNL and ANL hosted three town hall meetings on Simulation and Modeling at the Exascale for Energy, Ecological Sustainability and Global Security (E3). The proposed E3 initiative is organized around four programmatic themes: engaging our top scientists, engineers, computer scientists and applied mathematicians; investing in pioneering large-scale science; developing scalable analysis algorithms and storage architectures to accelerate discovery; and accelerating the build-out and future development of the DOE open computing facilities. It is clear that we have only just started down the path to extreme scale computing. Plan to attend Thursday's session on the out-briefing and discussion of these meetings. The road to the petascale has been at best rocky. In FY07, the continuing resolution provided 12% less money for Advanced Scientific Computing than either the President, the Senate, or the House. As a consequence, many of you had to absorb a no-cost extension for your SciDAC work. I am pleased that the President's FY08 budget restores the funding for SciDAC. Quoting from the Advanced Scientific Computing Research description in the House Energy and Water Development Appropriations Bill for FY08, "Perhaps no other area of research at the Department is so critical to sustaining U.S. leadership in science and technology, revolutionizing the way science is done and improving research productivity." As a society we need to revolutionize our approaches to energy, environmental and global security challenges. As we go forward along the road to the X-scale generation, the use of computation will continue to be a critical tool along with theory and experiment in understanding the behavior of the fundamental components of nature as well as for fundamental discovery and exploration of the behavior of complex systems. The foundation to overcome these societal challenges will build from the experiences and knowledge gained as you, members of our SciDAC research teams, work together to attack problems at the tera- and peta-scale. If SciDAC is viewed as an experiment for revolutionizing scientific methodology, then a strategic goal of the ASCR program must be to broaden the intellectual base prepared to address the challenges of the new X-scale generation of computing. We must focus our computational science experiences gained over the past five years on the opportunities introduced with extreme scale computing. Our facilities are on a path to provide the resources needed to undertake the first part of our journey.
Using the newly upgraded 119 teraflop Cray XT system at the Leadership Computing Facility, SciDAC research teams have in three days performed a 100-year study of the time evolution of the atmospheric CO2 concentration originating from the land surface. The simulation of the El Nino/Southern Oscillation which was part of this study has been characterized as `the most impressive new result in ten years'. Another team gained new insight into the behavior of superheated ionic gas in the ITER reactor as a result of an AORSA run on 22,500 processors that achieved over 87 trillion calculations per second (87 teraflops), 74% of the system's theoretical peak. Tomorrow, Argonne and IBM will announce that the first IBM Blue Gene/P, a 100 teraflop system, will be shipped to the Argonne Leadership Computing Facility later this fiscal year. By the end of FY2007, ASCR high performance and leadership computing resources will include the 114 teraflop IBM Blue Gene/P; a 102 teraflop Cray XT4 at NERSC; and a 119 teraflop Cray XT system at Oak Ridge. Before ringing in the New Year, Oak Ridge will upgrade to 250 teraflops with the replacement of the dual-core processors with quad-core processors, and Argonne will upgrade to between 250 and 500 teraflops; next year, a petascale Cray Baker system is scheduled for delivery at Oak Ridge. The multidisciplinary teams in our SciDAC Centers for Enabling Technologies and our SciDAC Institutes must continue to work with our Scientific Application teams to overcome the barriers that prevent effective use of these new systems. These challenges include: the need for new algorithms as well as operating system and runtime software and tools which scale to parallel systems composed of hundreds of thousands of processors; program development environments and tools which scale effectively and provide ease of use for developers and scientific end users; and visualization and data management systems that support moving, storing, analyzing, manipulating and visualizing multi-petabytes of scientific data and objects. The SciDAC Centers, located primarily at our DOE national laboratories, will take the lead in ensuring that critical computer science and applied mathematics issues are addressed in a timely and comprehensive fashion and in addressing issues associated with the research software lifecycle. In contrast, the SciDAC Institutes, which are university-led centers of excellence, will have more flexibility to pursue new research topics through a range of research collaborations. The Institutes will also work to broaden the intellectual and researcher base—conducting short courses and summer schools to take advantage of new high performance computing capabilities. The SciDAC Outreach Center at Lawrence Berkeley National Laboratory complements the outreach efforts of the SciDAC Institutes. The Outreach Center is our clearinghouse for SciDAC activities and resources and will communicate with the high performance computing community in part to understand their needs for workshops, summer schools and institutes. SciDAC is not ASCR's only effort to broaden the computational science community needed to meet the challenges of the new X-scale generation. I hope that you were able to attend the Computational Science Graduate Fellowship poster session last night. ASCR developed the fellowship in 1991 to meet the nation's growing need for scientists and technology professionals with advanced computer skills. CSGF, now jointly funded between ASCR and NNSA, is more than a traditional academic fellowship.
It has provided more than 200 of the best and brightest graduate students with guidance, support and community in preparing them as computational scientists. Today CSGF alumni are bringing their diverse top-level skills and knowledge to research teams at DOE laboratories and in industries such as Procter & Gamble, Lockheed Martin and Intel. At universities they are working to train the next generation of computational scientists. To build on this success, we intend to develop a wholly new Early Career Principal Investigator's (ECPI) program. Our objective is to stimulate academic research in scientific areas within ASCR's purview, especially among faculty in the early stages of their academic careers. Last February, we lost Ken Kennedy, one of the leading lights of our community. As we move forward into the extreme computing generation, his vision and insight will be greatly missed. In memory of Ken Kennedy, we shall designate the ECPI grants to beginning faculty in Computer Science as the Ken Kennedy Fellowship. Watch the ASCR website for more information about ECPI and other early career programs in the computational sciences. We look to you, our scientists, researchers, and visionaries to take X-scale computing and use it to explode scientific discovery in your fields. We at SciDAC will work to ensure that this tool is the sharpest and most precise and efficient instrument to carve away the unknown and reveal the most exciting secrets and stimulating scientific discoveries of our time. The partnership between research and computing is the marriage that will spur greater discovery, and as Spenser said to Susan in Robert Parker's novel `Sudden Mischief', `We stick together long enough, and we may get as smart as hell'. Michael Strayer
Computation Directorate 2008 Annual Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crawford, D L
2009-03-25
Whether a computer is simulating the aging and performance of a nuclear weapon, the folding of a protein, or the probability of rainfall over a particular mountain range, the necessary calculations can be enormous. Our computers help researchers answer these and other complex problems, and each new generation of system hardware and software widens the realm of possibilities. Building on Livermore's historical excellence and leadership in high-performance computing, Computation added more than 331 trillion floating-point operations per second (teraFLOPS) of power to LLNL's computer room floors in 2008. In addition, Livermore's next big supercomputer, Sequoia, advanced ever closer to its 2011-2012 delivery date, as architecture plans and the procurement contract were finalized. Hyperion, an advanced technology cluster test bed that teams Livermore with 10 industry leaders, made a big splash when it was announced during Michael Dell's keynote speech at the 2008 Supercomputing Conference. The Wall Street Journal touted Hyperion as a 'bright spot amid turmoil' in the computer industry. Computation continues to measure and improve the costs of operating LLNL's high-performance computing systems by moving hardware support in-house, by measuring causes of outages to apply resources asymmetrically, and by automating most of the account and access authorization and management processes. These improvements enable more dollars to go toward fielding the best supercomputers for science, while operating them at less cost and greater responsiveness to the customers.
Dinov, Ivo D
2016-01-01
Managing, processing and understanding big healthcare data is challenging, costly and demanding. Without a robust fundamental theory for representation, analysis and inference, a roadmap for uniform handling and analyzing of such complex data remains elusive. In this article, we outline various big data challenges, opportunities, modeling methods and software techniques for blending complex healthcare data, advanced analytic tools, and distributed scientific computing. Using imaging, genetic and healthcare data we provide examples of processing heterogeneous datasets using distributed cloud services, automated and semi-automated classification techniques, and open-science protocols. Despite substantial advances, new innovative technologies need to be developed that enhance, scale and optimize the management and processing of large, complex and heterogeneous data. Stakeholder investments in data acquisition, research and development, computational infrastructure and education will be critical to realize the huge potential of big data, to reap the expected information benefits and to build lasting knowledge assets. Multi-faceted proprietary, open-source, and community developments will be essential to enable broad, reliable, sustainable and efficient data-driven discovery and analytics. Big data will affect every sector of the economy and their hallmark will be 'team science'.
Evaluating Emulation-based Models of Distributed Computing Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, Stephen T.; Gabert, Kasimir G.; Tarman, Thomas D.
Emulation-based models of distributed computing systems are collections of virtual machines, virtual networks, and other emulation components configured to stand in for operational systems when performing experimental science, training, analysis of design alternatives, test and evaluation, or idea generation. As with any tool, we should carefully evaluate whether our uses of emulation-based models are appropriate and justified. Otherwise, we run the risk of using a model incorrectly and creating meaningless results. The various uses of emulation-based models each have their own goals and deserve thoughtful evaluation. In this paper, we enumerate some of these uses and describe approaches that one can take to build an evidence-based case that a use of an emulation-based model is credible. Predictive uses of emulation-based models, where we expect a model to tell us something true about the real world, set the bar especially high, and the principal evaluation method, called validation, is commensurately rigorous. We spend the majority of our time describing and demonstrating the validation of a simple predictive model using a well-established methodology inherited from decades of development in the computational science and engineering community.
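The validation approach described above boils down to comparing emulated results against referent (real-world) measurements with an acceptance criterion declared in advance. The following Python sketch is a minimal illustration under assumed data, metric, and tolerance, not the paper's full methodology:

```python
# Minimal validation sketch: accept the model only if relative error
# against referent measurements stays within a pre-declared tolerance.
import numpy as np

referent = np.array([9.8, 10.1, 10.4, 9.9, 10.2])   # e.g., measured throughput (Gb/s)
emulated = np.array([9.5, 10.3, 10.1, 9.7, 10.6])   # same scenario in the emulation

rel_err = np.abs(emulated - referent) / referent
tolerance = 0.10  # declared before the experiment, not tuned afterward

print("max relative error:", rel_err.max())
print("model credible for this use:", bool((rel_err <= tolerance).all()))
```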
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bartlett, Roscoe A; Heroux, Dr. Michael A; Willenbring, James
2012-01-01
Software lifecycles are becoming an increasingly important issue for computational science & engineering (CSE) software. The process by which a piece of CSE software begins life as a set of research requirements and then matures into a trusted high-quality capability is both commonplace and extremely challenging. Although an implicit lifecycle is obviously being used in any effort, the challenges of this process--respecting the competing needs of research vs. production--cannot be overstated. Here we describe a proposal for a well-defined software lifecycle process based on modern Lean/Agile software engineering principles. What we propose is appropriate for many CSE software projects that are initially heavily focused on research but also are expected to eventually produce usable high-quality capabilities. The model is related to TriBITS, a build, integration and testing system, which serves as a strong foundation for this lifecycle model, and aspects of this lifecycle model are ingrained in the TriBITS system. Indeed this lifecycle process, if followed, will enable large-scale sustainable integration of many complex CSE software efforts across several institutions.
Computers in Science Education: Can They Go Far Enough? Have We Gone Too Far?
ERIC Educational Resources Information Center
Schrock, John Richard
1984-01-01
Indicates that although computers may churn out creative research, science is still dependent on science education, and that science education consists of increasing human experience. Also considers uses and misuses of computers in the science classroom, examining Edgar Dale's "cone of experience" related to laboratory computer and "extended…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robinson, Alastair; Regnier, Cindy; Settlemyre, Kevin
Massachusetts Institute of Technology (MIT) partnered with the U.S. Department of Energy (DOE) to develop and implement solutions to retrofit existing buildings to reduce energy consumption by at least 30% as part of DOE's Commercial Building Partnerships (CBP) Program. Lawrence Berkeley National Laboratory (LBNL) provided technical expertise in support of this DOE program. MIT is one of the U.S.'s foremost higher education institutions, occupying a campus that is nearly 100 years old, with a building floor area totaling more than 12 million square feet. The CBP project focused on improving the energy performance of two campus buildings, the Ray and Maria Stata Center (RMSC) and the Building W91 (BW91) data center. A key goal of the project was to identify energy saving measures that could be applied to other buildings both within MIT's portfolio and at other higher education institutions. The CBP retrofits at MIT are projected to reduce energy consumption by approximately 48%, including a reduction of around 72% in RMSC lighting energy and a reduction of approximately 55% in RMSC server room HVAC energy. The energy efficiency measure (EEM) package proposed for the BW91 data center is expected to reduce heating, ventilation, and air-conditioning (HVAC) energy use by 30% to 50%, depending on the final air intake temperature that is established for the server racks. The RMSC, an iconic building designed by Frank Gehry, houses the Computer Science and Artificial Intelligence Laboratory, the Laboratory for Information and Decision Systems, and the Department of Linguistics and Philosophy.
Student Practices, Learning, and Attitudes When Using Computerized Ranking Tasks
NASA Astrophysics Data System (ADS)
Lee, Kevin M.; Prather, E. E.; Collaboration of Astronomy Teaching Scholars CATS
2011-01-01
Ranking Tasks are a novel type of conceptual exercise based on a technique called rule assessment. Ranking Tasks present students with a series of four to eight icons that describe slightly different variations of a basic physical situation. Students are then asked to identify the order, or ranking, of the various situations based on some physical outcome or result. The structure of Ranking Tasks makes it difficult for students to rely strictly on memorized answers and mechanical substitution of formulae. In addition, by changing the presentation of the different scenarios (e.g., photographs, line diagrams, graphs, tables, etc.), we find that Ranking Tasks require students to develop mental schema that are more flexible and robust. Ranking Tasks may be implemented on the computer, where students order the icons through drag-and-drop. Computer implementation allows the incorporation of background material, grading with feedback, and additional similar versions of the task through randomization, so that students can build expertise through practice. This poster will summarize the results of a study of student usage of computerized Ranking Tasks. We will investigate 1) student practices (How do they make use of these tools?), 2) knowledge and skill building (Do student scores improve with iteration and are there diminishing returns?), and 3) student attitudes toward using computerized Ranking Tasks (Do they like using them?). This material is based upon work supported by the National Science Foundation under Grant No. 0715517, a CCLI Phase III Grant for the Collaboration of Astronomy Teaching Scholars (CATS). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.
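As a rough illustration of how such a computerized Ranking Task might be randomized and graded with feedback, here is a hypothetical Python sketch (not the CATS software); the physical situation and scoring rule are invented for the example:

```python
# Hypothetical ranking-task generator and grader.
import random

def make_task(rng=random):
    # Invented situation: rank drops by impact speed, which grows with
    # drop height (air resistance ignored). Heights randomized per attempt.
    heights = rng.sample(range(1, 20), 5)
    icons = [f"drop from {h} m" for h in heights]
    answer = sorted(icons, key=lambda s: int(s.split()[2]))
    return icons, answer

def grade(student_order, answer):
    correct = sum(a == b for a, b in zip(student_order, answer))
    feedback = "correct!" if student_order == answer else \
        f"{correct} of {len(answer)} positions right -- try again"
    return correct / len(answer), feedback

icons, answer = make_task()
score, feedback = grade(sorted(icons), answer)  # stand-in for a student's attempt
print(score, feedback)
```

Randomizing the heights on each attempt is what lets students "build expertise through practice" without memorizing a fixed answer key.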
NASA Astrophysics Data System (ADS)
Gulland, E.-K.; Veenendaal, B.; Schut, A. G. T.
2012-07-01
Problem-solving knowledge and skills are important attributes of spatial sciences graduates. The challenge of higher education is to build a teaching and learning environment that enables students to acquire these skills in relevant and authentic applications. This study investigates the effectiveness of traditional face-to-face teaching and online learning technologies in supporting student learning of problem-solving and computer programming skills, techniques and solutions. The student cohort for this study comprises students in the surveying and geographic information science (GISc) disciplines, studying across a range of learning modes including on-campus, distance and blended. Student feedback and past studies reveal a lack of student interest and engagement in problem solving and computer programming. Many students do not see such skills as directly relevant to their perceptions of what future spatial careers hold. A range of teaching and learning methods for both face-to-face teaching and distance learning were introduced to address some of the perceived weaknesses of the learning environment. These included initiating greater student interaction in lectures, modifying assessments to provide greater feedback and student accountability, and the provision of more interactive and engaging online learning resources. The paper presents and evaluates the teaching methods used to support the student learning environment. Responses of students in relation to their learning experiences were collected via two anonymous, online surveys, and these results were analysed with respect to student pass and retention rates. The study found a clear distinction between the expectations and engagement of surveying students in comparison to GISc students. A further outcome revealed that students who were already engaged in their learning benefited the most from the interactive learning resources and opportunities provided.
Health sciences libraries building survey, 1999–2009
Ludwig, Logan
2010-01-01
Objective: A survey was conducted of health sciences libraries to obtain information about newer buildings, additions, remodeling, and renovations. Method: An online survey was developed, and announcements of survey availability posted to three major email discussion lists: Medical Library Association (MLA), Association of Academic Health Sciences Libraries (AAHSL), and MEDLIB-L. Previous discussions of library building projects on email discussion lists, a literature review, personal communications, and the author's consulting experiences identified additional projects. Results: Seventy-eight health sciences library building projects at seventy-three institutions are reported. Twenty-two are newer facilities built within the last ten years; two are space expansions; forty-five are renovation projects; and nine are combinations of new and renovated space. Six institutions report multiple or ongoing renovation projects during the last ten years. Conclusions: The survey results confirm a continuing migration from print-based to digitally based collections and reveal trends in library space design. Some health sciences libraries report loss of space as they move toward creating space for “community” building. Libraries are becoming more proactive in using or retooling space for concentration, collaboration, contemplation, communication, and socialization. All are moving toward a clearer operational vision of the library as the institution's information nexus and not merely as a physical location with print collections. PMID:20428277
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
The vision described here builds on the present U.S. activities in fusion plasma and materials science relevant to the energy goal and extends plasma science at the frontier of discovery. The plan is founded on recommendations made by the National Academies, a number of recent studies by the Fusion Energy Sciences Advisory Committee (FESAC), and the Administration's views on the greatest opportunities for U.S. scientific leadership. This report highlights five areas of critical importance for the U.S. fusion energy sciences enterprise over the next decade: 1) Massively parallel computing with the goal of validated whole-fusion-device modeling will enable a transformation in predictive power, which is required to minimize risk in future fusion energy development steps; 2) Materials science as it relates to plasma and fusion sciences will provide the scientific foundations for greatly improved plasma confinement and heat exhaust; 3) Research in the prediction and control of transient events that can be deleterious to toroidal fusion plasma confinement will provide greater confidence in machine designs and operation with stable plasmas; 4) Continued stewardship of discovery in plasma science that is not expressly driven by the energy goal will address frontier science issues underpinning great mysteries of the visible universe and help attract and retain a new generation of plasma/fusion science leaders; 5) FES user facilities will be kept world-leading through robust operations support and regular upgrades. Finally, we will continue leveraging resources among agencies and institutions and strengthening our partnerships with international research facilities.
Big Thinking: The Power of Nanoscience (LBNL Science at the Theater)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Milliron, Delia; Sanii, Babak; Weber-Bargioni, Alex
2011-06-06
Science at the Theater, June 6th, 2011: Berkeley Lab scientists reveal how nanoscience will bring us cleaner energy, faster computers, and improved medicine. Alex Weber-Bargioni: How can we see things at the nanoscale? Alex is pioneering new methods that provide unprecedented insight into nanoscale materials and molecular interactions. The goal is to create rules for building nanoscale materials. Babak Sanii: Nature is an expert at making nanoscale devices such as proteins. Babak is developing ways to see these biological widgets, which could help scientists develop synthetic devices that mimic the best that nature has to offer. Ting Xu: How are we going to make nanoscale devices? A future in which materials and devices are able to assemble themselves may not be that far down the road. Ting is finding ways to induce a wide range of nanoscopic building blocks to assemble into complex structures. Delia Milliron: The dividends of nanoscience could reshape the way we live, from smart windows and solar cells to artificial photosynthesis and improved medical diagnosis. Delia is at the forefront of converting fundamental research into nanotechnology. Moderated by Jim DeYoreo, interim director of the Molecular Foundry, a facility located at Berkeley Lab where scientists from around the world address the myriad challenges in nanoscience.
NASA Technical Reports Server (NTRS)
1987-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April 1, 1986 through September 30, 1986 is summarized.
78 FR 10180 - Annual Computational Science Symposium; Conference
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-13
...] Annual Computational Science Symposium; Conference AGENCY: Food and Drug Administration, HHS. ACTION... Computational Science Symposium.'' The purpose of the conference is to help the broader community align and share experiences to advance computational science. At the conference, which will bring together FDA...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hules, John
This 1998 annual report from the National Energy Research Scientific Computing Center (NERSC) presents the year in review for the following categories: Computational Science; Computer Science and Applied Mathematics; and Systems and Services. Also presented are science highlights in the following categories: Basic Energy Sciences; Biological and Environmental Research; Fusion Energy Sciences; High Energy and Nuclear Physics; and Advanced Scientific Computing Research and Other Projects.
New library buildings: the Health Sciences Library, Memorial University of Newfoundland, St. John's.
Fredericksen, R B
1979-01-01
The new Health Sciences Library of Memorial University of Newfoundland is described and illustrated. A library facility that forms part of a larger health sciences center, this is a medium-sized academic health sciences library built on a single level. Along with a physical description of the library and its features, the concepts of single-level libraries, phased occupancy, and the project management approach to building a large health center library are discussed in detail. PMID:476319
NASA Astrophysics Data System (ADS)
Berland, Matthew W.
As scientists use the tools of computational and complex systems theory to broaden science perspectives (e.g., Bar-Yam, 1997; Holland, 1995; Wolfram, 2002), so can middle-school students broaden their perspectives using appropriate tools. The goals of this dissertation project are to build, study, evaluate, and compare activities designed to foster both computational and complex systems fluencies through collaborative constructionist virtual and physical robotics. In these activities, each student builds an agent (e.g., a robot-bird) that must interact with fellow students' agents to generate a complex aggregate (e.g., a flock of robot-birds) in a participatory simulation environment (Wilensky & Stroup, 1999a). In a participatory simulation, students collaborate by acting in a common space, teaching each other, and discussing content with one another. As a result, the students improve both their computational fluency and their complex systems fluency, where fluency is defined as the ability to both consume and produce relevant content (DiSessa, 2000). To date, several systems have been designed to foster computational and complex systems fluencies through computer programming and collaborative play (e.g., Hancock, 2003; Wilensky & Stroup, 1999b); this study suggests that, by supporting the relevant fluencies through collaborative play, they become mutually reinforcing. In this work, I will present both the design of the VBOT virtual/physical constructionist robotics learning environment and a comparative study of student interaction with the virtual and physical environments across four middle-school classrooms, focusing on the contrast in systems perspectives differently afforded by the two environments. In particular, I found that while performance gains were similar overall, the physical environment supported agent perspectives on aggregate behavior, and the virtual environment supported aggregate perspectives on agent behavior. The primary research questions are: (1) What are the relative affordances of virtual and physical constructionist robotics systems towards computational and complex systems fluencies? (2) What can middle school students learn using computational/complex systems learning environments in a collaborative setting? (3) In what ways are these environments and activities effective in teaching students computational and complex systems fluencies?
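The agent-to-aggregate relationship at the heart of these participatory simulations can be illustrated in a few lines of Python. This is a generic flocking sketch, not the VBOT code, and the rule coefficients are arbitrary:

```python
# Generic sketch: each "robot-bird" follows simple local rules; a flock
# (the aggregate) emerges from many agents acting at once.
import numpy as np

rng = np.random.default_rng(1)
pos = rng.uniform(0, 100, (30, 2))   # 30 agents in a 2-D space
vel = rng.normal(0, 1, (30, 2))

def step(pos, vel, cohesion=0.01, alignment=0.05):
    # Each agent steers toward the flock's center and matches the mean heading.
    vel = vel + cohesion * (pos.mean(axis=0) - pos) \
              + alignment * (vel.mean(axis=0) - vel)
    return pos + vel, vel

for _ in range(100):
    pos, vel = step(pos, vel)
print("flock spread after 100 steps:", pos.std(axis=0))
```

The pedagogical point mirrors the dissertation's contrast: writing the per-agent rule exercises an agent perspective, while watching the spread statistic exercises an aggregate perspective on the same system.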
Enduring Influence of Stereotypical Computer Science Role Models on Women's Academic Aspirations
ERIC Educational Resources Information Center
Cheryan, Sapna; Drury, Benjamin J.; Vichayapai, Marissa
2013-01-01
The current work examines whether a brief exposure to a computer science role model who fits stereotypes of computer scientists has a lasting influence on women's interest in the field. One-hundred undergraduate women who were not computer science majors met a female or male peer role model who embodied computer science stereotypes in appearance…
A Web of Resources for Introductory Computer Science.
ERIC Educational Resources Information Center
Rebelsky, Samuel A.
As the field of Computer Science has grown, the syllabus of the introductory Computer Science course has changed significantly. No longer is it a simple introduction to programming or a tutorial on computer concepts and applications. Rather, it has become a survey of the field of Computer Science, touching on a wide variety of topics from digital…
Using Automated Scores of Student Essays to Support Teacher Guidance in Classroom Inquiry
NASA Astrophysics Data System (ADS)
Gerard, Libby F.; Linn, Marcia C.
2016-02-01
Computer scoring of student written essays about an inquiry topic can be used to diagnose student progress both to alert teachers to struggling students and to generate automated guidance. We identify promising ways for teachers to add value to automated guidance to improve student learning. Three teachers from two schools and their 386 students participated. We draw on evidence from student progress, observations of how teachers interact with students, and reactions of teachers. The findings suggest that alerts for teachers prompted rich teacher-student conversations about energy in photosynthesis. In one school, the combination of the automated guidance plus teacher guidance was more effective for student science learning than two rounds of personalized, automated guidance. In the other school, both approaches resulted in equal learning gains. These findings suggest optimal combinations of automated guidance and teacher guidance to support students to revise explanations during inquiry and build integrated understanding of science.
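The alerting logic the study describes, flagging students whose essay scores stall so the teacher can step in with a conversation rather than another canned hint, might look like the following hypothetical sketch (the automated scoring model itself is not reproduced here):

```python
# Hypothetical teacher-alert sketch built on automated essay scores.
def students_to_alert(score_history, min_gain=1):
    """score_history: {student: [automated score after each revision]}.
    Flag students whose latest revision gained less than min_gain."""
    return [s for s, scores in score_history.items()
            if len(scores) >= 2 and scores[-1] - scores[-2] < min_gain]

scores = {"ana": [2, 2], "ben": [1, 3], "cy": [3, 3]}
print(students_to_alert(scores))   # -> ['ana', 'cy']
```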
Dupoux, Emmanuel
2018-04-01
Spectacular progress in the information processing sciences (machine learning, wearable sensors) promises to revolutionize the study of cognitive development. Here, we analyse the conditions under which 'reverse engineering' language development, i.e., building an effective system that mimics infants' achievements, can contribute to our scientific understanding of early language development. We argue that, on the computational side, it is important to move from toy problems to the full complexity of the learning situation, and to take as input reconstructions of the sensory signals available to infants that are as faithful as possible. On the data side, accessible but privacy-preserving repositories of home data have to be set up. On the psycholinguistic side, specific tests have to be constructed to benchmark humans and machines at different linguistic levels. We discuss the feasibility of this approach and present an overview of current results. Copyright © 2017 Elsevier B.V. All rights reserved.
Infrared Astrophysics in the SOFIA Era - An Overview
NASA Astrophysics Data System (ADS)
Yorke, Harold W.
2018-06-01
The Stratospheric Observatory for Infrared Astronomy (SOFIA) provides the international astronomical community access to a broad range of instrumentation that covers wavelengths spanning the near to far infrared. The high spectral resolution of many of these instruments in several wavelength bands is unmatched by any existing or near future planned facility. The far infrared polarization capabilities of one of its instruments, HAWC+, is also unique. Moreover, SOFIA allows for additional instrument augmentations, as new state-of-the-art photometric, spectrometric, and polarimetric capabilities have been added and are being further improved. The fact that SOFIA provides ample mass, power, computing capabilities as well as 4K cooling eases the constraints on future instrument design, technical readiness, and the instrument build to an extent not possible for space-borne missions. We will review SOFIA's current and future planned capabilities and highlight specific science areas for which the stratospheric observatory will be able to significantly advance Origins science topics.
Raghavan, Ramesh; Camarata, Stephen; White, Karl; Barbaresi, William; Parish, Susan; Krahn, Gloria
2018-05-17
The aim of the study was to provide an overview of population science as applied to speech and language disorders, illustrate data sources, and advance a research agenda on the epidemiology of these conditions. Computer-aided database searches were performed to identify key national surveys and other sources of data necessary to establish the incidence, prevalence, and course and outcome of speech and language disorders. This article also summarizes a research agenda that could enhance our understanding of the epidemiology of these disorders. Although the data yielded estimates of prevalence and incidence for speech and language disorders, existing sources of data are inadequate to establish reliable rates of incidence, prevalence, and outcomes for speech and language disorders at the population level. Greater support for inclusion of speech and language disorder-relevant questions is necessary in national health surveys to build the population science in the field.
ORNL Sustainable Campus Initiative
DOE Office of Scientific and Technical Information (OSTI.GOV)
Halford, Christopher K
2012-01-01
The research conducted at Oak Ridge National Laboratory (ORNL) spans many disciplines and has the potential for far-reaching impact in many areas of everyday life. ORNL researchers and operations staff work on projects in areas as diverse as nuclear power generation, transportation, materials science, computing, and building technologies. As the U.S. Department of Energy's (DOE) largest science and energy research facility, ORNL seeks to establish partnerships with industry in the development of innovative new technologies. The primary focus of this current research is developing technologies which improve or maintain the quality of life for humans while reducing the overall impact on the environment. In its interactions with industry, ORNL serves both as a facility for sustainable research and as a representative of DOE to the private sector. For these reasons it is important that the everyday operations of the Laboratory reflect a dedication to the concepts of stewardship and sustainability.
NASA Technical Reports Server (NTRS)
1988-01-01
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April 1, 1988 through September 30, 1988.
NASA Technical Reports Server (NTRS)
1984-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science during the period October 1, 1983 through March 31, 1984 is summarized.
NASA Technical Reports Server (NTRS)
1987-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 1, 1986 through March 31, 1987 is summarized.
High school computer science education paves the way for higher education: the Israeli case
NASA Astrophysics Data System (ADS)
Armoni, Michal; Gal-Ezer, Judith
2014-07-01
The gap between enrollments in higher education computing programs and the high-tech industry's demands is widely reported, and is especially prominent for women. Increasing the availability of computer science education in high school is one of the strategies suggested in order to address this gap. We look at the connection between exposure to computer science in high school and pursuing computing in higher education. We also examine the gender gap in the context of high school computer science education. We show that in Israel, students who took the high-level computer science matriculation exam were more likely to pursue computing in higher education. Regarding the issue of gender, we show that, in general, in Israel the difference between males and females who take computer science in high school is relatively small, and a larger, though still not very large, difference exists only for the highest exam level. In addition, exposing females to high-level computer science in high school has more relative impact on their pursuing higher education in computing.
NASA Astrophysics Data System (ADS)
Childs, L. M.; Rogers, L.; Favors, J.; Ruiz, M.
2012-12-01
Through the years, NASA has played a vital role in advancing Earth system science to meet the challenges of environmental management and policy decision making. Within NASA's Earth Science Division Applied Sciences Program, the DEVELOP National Program seeks to extend NASA Earth science for societal benefit. DEVELOP is a capacity building program providing young professionals and students the opportunity to utilize NASA Earth observations and model output to demonstrate practical applications of those resources to society. Under the guidance of science advisors, DEVELOP teams work in alignment with local, regional, national and international partner organizations to identify the widest array of practical uses for NASA data to enhance related management decisions. The program's structure facilitates a two-fold approach to capacity building by fostering an environment of scientific and professional development opportunities for young professionals and students, while also providing end-user organizations enhanced management and decision making tools for issues impacting their communities. With the competitive nature and growing societal role of science and technology in today's global workplace, DEVELOP is building capacity in the next generation of scientists and leaders by fostering a learning and growing environment where young professionals gain an increased understanding of teamwork, personal development, scientific and professional development, and NASA's Earth Observing System. DEVELOP young professionals are partnered with end-user organizations to conduct 10-week feasibility studies that demonstrate the use of NASA Earth science data for enhanced decision making. As a result of the partnership, end-user organizations are introduced to NASA Earth science technologies and capabilities, new methods to augment current practices, hands-on training with practical applications of remote sensing and NASA Earth science, improved remote sensing and geographic information science (GIS) capabilities, and opportunities for networking with the NASA and Earth science community. By engaging young professionals and end-user organizations, DEVELOP strives to build capacity through the extension of NASA Earth science outcomes to the public through projects that innovatively use NASA Earth observations to address environmental concerns and inform policy and decision making.
Prognocean Plus: the Science-Oriented Sea Level Prediction System as a Tool for Public Stakeholders
NASA Astrophysics Data System (ADS)
Świerczyńska, M. G.; Miziński, B.; Niedzielski, T.
2015-12-01
The novel real-time system for sea level prediction, known as Prognocean Plus, has been developed as a new-generation service available through the Polish supercomputing grid infrastructure. Researchers can access the service at https://prognocean.plgrid.pl/. Although the system is science-oriented, we wish to discuss herein its potential to enhance ocean management studies carried out routinely by public stakeholders. The system produces short- and medium-term predictions of global altimetric gridded Sea Level Anomaly (SLA) time series, updated daily. The spatial resolution of the SLA forecasts is 1/4° x 1/4°, while the temporal resolution of prognoses is equal to 1 day. The system computes the predictions of time-variable ocean topography using five data-based models, which are not computationally demanding, enabling us to compare their skill relative to the physically based approaches commonly used by other sea level prediction systems. However, the aim of the system is not only to compute predictions for science purposes, but primarily to build a user-oriented platform that serves the prognoses and their statistics to a broader community. Thus, we deliver the SLA forecasts as a rapid service available online. In order to provide potential users with access to science results, the Web Map Service (WMS) for Prognocean Plus is designed. We regularly publish the forecasts, both in the interactive graphical WMS service, available from the browser, as well as through the Web Coverage Service (WCS) standard. The Prognocean Plus system, as an early-response system, may be of interest to public stakeholders. It may be used for marine navigation as well as for climate risk management (delineating areas vulnerable to local sea level rise), marine management (advice offered for offshore activities) and coastal management (early warnings against coastal flooding).
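A stakeholder could consume such a WMS service programmatically, for example with the OWSLib Python client. In this sketch the endpoint path, layer name, and bounding box are hypothetical placeholders; the service's actual GetCapabilities response would supply the real values:

```python
# Sketch of pulling one day's SLA forecast map from a WMS endpoint.
from owslib.wms import WebMapService

wms = WebMapService("https://prognocean.plgrid.pl/wms", version="1.3.0")  # hypothetical path
img = wms.getmap(
    layers=["sla_forecast"],          # hypothetical layer name
    srs="EPSG:4326",
    bbox=(-30.0, 40.0, 10.0, 65.0),   # example North Atlantic window
    size=(800, 500),
    format="image/png",
    time="2015-12-20",                # matches the 1-day temporal resolution
)
open("sla_forecast.png", "wb").write(img.read())
```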
Participatory Design of Human-Centered Cyberinfrastructure (Invited)
NASA Astrophysics Data System (ADS)
Pennington, D. D.; Gates, A. Q.
2010-12-01
Cyberinfrastructure, by definition, is about people sharing resources to achieve outcomes that cannot be reached independently. CI depends not just on creating discoverable resources, or tools that allow those resources to be processed, integrated, and visualized, but on human activation of flows of information across those resources. CI must be centered on human activities. Yet for those CI projects that are directed towards observational science, there are few models for organizing collaborative research in ways that align individual research interests into a collective vision of CI-enabled science. Given that the emerging technologies are themselves expected to change the way science is conducted, it is not simply a matter of conducting requirements analysis on how scientists currently work, or building consensus among the scientists on what is needed. Developing effective CI depends on generating a new, creative vision of problem solving within a community based on computational concepts that are, in some cases, still very abstract and theoretical. The computer science theory may (or may not) be well formalized, but the potential for impact on any particular domain is typically ill-defined. In this presentation we will describe approaches being developed and tested at the CyberShARE Center of Excellence at the University of Texas at El Paso for ill-structured problem solving within cross-disciplinary teams of scientists and computer scientists working on data-intensive environmental science and geoscience. These approaches deal with the challenges associated with sharing and integrating knowledge across disciplines; the challenges of developing effective teamwork skills in a culture that favors independent effort; and the challenges of evolving shared, focused research goals from ill-structured, vague starting points - all issues that must be confronted by every interdisciplinary CI project. We will introduce visual and semantic-based tools that can enable the collaborative research design process and illustrate their application in designing and developing useful end-to-end data solutions for scientists. Lastly, we will outline areas of future investigation within CyberShARE that we believe have the potential for high impact.
NASA Astrophysics Data System (ADS)
Wyborn, L. A.; Evans, B. J. K.
2015-12-01
The National Computational Infrastructure (NCI) at the Australian National University (ANU) has evolved to become Australia's peak computing centre for national computational and data-intensive Earth system science. More recently NCI collocated 10 petabytes of 34 major national and international environmental, climate, earth system, geophysics and astronomy data collections to create the National Environmental Research Interoperability Data Platform (NERDIP). Spatial scales of the collections range from global to local ultra-high resolution, whilst sizes range from 3 PB down to a few GB. The data is highly connected to both NCI HPC and cloud resources via low-latency internal networks with massive bandwidth. Now that the collections are collocated on a single data platform, the 'Hype' and expectations around potential use cases for the NERDIP are high. Issues are emerging, not unexpectedly, around access, licensing, ownership, and incompatible data standards. Many communities are standardised within their own domain, but achieving true interdisciplinary science will require all communities to move towards open interoperable data formats such as NetCDF4/HDF5, a transition that will impact software using proprietary or non-open standards. But before we reach the 'Plateau of Productivity', users need greater 'Enlightenment': encouragement to realise that this unprecedented Earth system science platform provides a rich mine of opportunities for discovery and innovation across both domain-specific and interdisciplinary investigations, including climate and weather research, impact analysis, environment, remote sensing and geophysics. New and innovative interdisciplinary use cases will guide those architecting the system, help minimise the amplitude of the 'Trough of Disillusionment', and ensure greater productivity and uptake of the collections that make NERDIP unique in the next generation of data-intensive science.
Defining Computational Thinking for Mathematics and Science Classrooms
NASA Astrophysics Data System (ADS)
Weintrop, David; Beheshti, Elham; Horn, Michael; Orton, Kai; Jona, Kemi; Trouille, Laura; Wilensky, Uri
2016-02-01
Science and mathematics are becoming computational endeavors. This fact is reflected in the recently released Next Generation Science Standards and the decision to include "computational thinking" as a core scientific practice. With this addition, and the increased presence of computation in mathematics and scientific contexts, a new urgency has come to the challenge of defining computational thinking and providing a theoretical grounding for what form it should take in school science and mathematics classrooms. This paper presents a response to this challenge by proposing a definition of computational thinking for mathematics and science in the form of a taxonomy consisting of four main categories: data practices, modeling and simulation practices, computational problem solving practices, and systems thinking practices. In formulating this taxonomy, we draw on the existing computational thinking literature, interviews with mathematicians and scientists, and exemplary computational thinking instructional materials. This work was undertaken as part of a larger effort to infuse computational thinking into high school science and mathematics curricular materials. In this paper, we argue for the approach of embedding computational thinking in mathematics and science contexts, present the taxonomy, and discuss how we envision the taxonomy being used to bring current educational efforts in line with the increasingly computational nature of modern science and mathematics.
Cloud-based Jupyter Notebooks for Water Data Analysis
NASA Astrophysics Data System (ADS)
Castronova, A. M.; Brazil, L.; Seul, M.
2017-12-01
The development and adoption of technologies by the water science community to improve our ability to openly collaborate and share workflows will have a transformative impact on how we address the challenges associated with collaborative and reproducible scientific research. Jupyter notebooks offer one solution by providing an open-source platform for creating metadata-rich toolchains for modeling and data analysis applications. Adoption of this technology within the water sciences, coupled with publicly available datasets from agencies such as USGS, NASA, and EPA enables researchers to easily prototype and execute data intensive toolchains. Moreover, implementing this software stack in a cloud-based environment extends its native functionality to provide researchers a mechanism to build and execute toolchains that are too large or computationally demanding for typical desktop computers. Additionally, this cloud-based solution enables scientists to disseminate data processing routines alongside journal publications in an effort to support reproducibility. For example, these data collection and analysis toolchains can be shared, archived, and published using the HydroShare platform or downloaded and executed locally to reproduce scientific analysis. This work presents the design and implementation of a cloud-based Jupyter environment and its application for collecting, aggregating, and munging various datasets in a transparent, sharable, and self-documented manner. The goals of this work are to establish a free and open source platform for domain scientists to (1) conduct data intensive and computationally intensive collaborative research, (2) utilize high performance libraries, models, and routines within a pre-configured cloud environment, and (3) enable dissemination of research products. This presentation will discuss recent efforts towards achieving these goals, and describe the architectural design of the notebook server in an effort to support collaborative and reproducible science.
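A typical data-collection cell of the kind described might pull public USGS streamflow records into pandas. The NWIS instantaneous-values endpoint below is a public USGS API; the gauge number is an example site (Potomac River near Washington, DC):

```python
# Notebook-style sketch: fetch a week of USGS discharge data into a DataFrame.
import pandas as pd
import requests

url = "https://waterservices.usgs.gov/nwis/iv/"
params = {
    "format": "json",
    "sites": "01646500",      # example gauge: Potomac River near Washington, DC
    "parameterCd": "00060",   # discharge, cubic feet per second
    "period": "P7D",          # last seven days
}
series = requests.get(url, params=params).json()["value"]["timeSeries"][0]
df = pd.DataFrame(series["values"][0]["value"])
df["value"] = df["value"].astype(float)
print(df[["dateTime", "value"]].head())
```

Because the cell carries its own imports, endpoint, and parameters, sharing it (for example via HydroShare, as the abstract suggests) is enough for another researcher to rerun the data collection exactly.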
Curricular Design for Intelligent Systems in Geosciences Using Urban Groundwater Studies.
NASA Astrophysics Data System (ADS)
Cabral-Cano, E.; Pierce, S. A.; Fuentes-Pineda, G.; Arora, R.
2016-12-01
Geosciences research frequently focuses on process-centered phenomena, studying combinations of physical, geological, chemical, biological, ecological, and anthropogenic factors. These interconnected Earth systems can be best understood through the use of digital tools whose use should be documented as workflows. To develop intelligent systems, it is important that geoscientists and computing and information sciences experts collaborate to: (1) develop a basic understanding of the geosciences and computing and information sciences disciplines so that the problem and solution approach are clear to all stakeholders, and (2) implement the desired intelligent system with a short turnaround time. However, these interactions and techniques are seldom covered in traditional Earth Sciences curricula. We have developed an exchange course on Intelligent Systems for Geosciences to support workforce development and build capacity to facilitate skill development at the undergraduate level. The first version of this course was offered jointly by the University of Texas at Austin and the Universidad Nacional Autónoma de México as an intensive, study-abroad summer course. Content included: a basic Linux introduction, shell scripting and high-performance computing, data management, expert systems, field data collection exercises, and basics of machine learning. Additionally, student teams were tasked with developing term projects centered on applications of intelligent systems to urban and karst groundwater systems. Projects included expert system and reusable workflow development for subsidence hazard analysis in Celaya, Mexico; a classification model to analyze land use change over a 30-year period in Austin, Texas; big data processing and decision support for central Texas groundwater case studies; and 3D mapping with point cloud processing at three Texas field sites. We will share experiences and pedagogical insights to improve future versions of this course.
Building America Top Innovations 2013 Profile – Building America Solution Center
DOE Office of Scientific and Technical Information (OSTI.GOV)
none,
2013-09-01
This Top Innovation profile provides information about the Building America Solution Center created by Pacific Northwest National Laboratory, a web tool connecting users to thousands of pieces of building science information developed by DOE’s Building America research partners.
ERIC Educational Resources Information Center
Margolis, Jane; Goode, Joanna; Bernier, David
2011-01-01
Broadening computer science learning to include more students is a crucial item on the United States' education agenda, these authors say. Although policymakers advocate more computer science expertise, computer science offerings in high schools are few--and actually shrinking. In addition, poorly resourced schools with a high percentage of…
NASA Technical Reports Server (NTRS)
1989-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 1, 1988 through March 31, 1989 is summarized.
Virtual working systems to support R&D groups
NASA Astrophysics Data System (ADS)
Dew, Peter M.; Leigh, Christine; Drew, Richard S.; Morris, David; Curson, Jayne
1995-03-01
The paper reports on progress at Leeds University to build a Virtual Science Park (VSP) to enhance the University's ability to interact with industry and to grow its applied research and workplace learning activities. The VSP exploits advances in real-time collaborative computing and networking to provide an environment that meets the objectives of physically based science parks without the need for the organizations to relocate. It provides an integrated set of services (e.g. virtual consultancy, work-based learning) built around a structured person-centered information model. This model supports the integration of tools for: (a) navigating around the information space; (b) browsing information stored within the VSP database; (c) communicating through a variety of person-to-person collaborative tools; and (d) managing the information stored in the VSP, including its relationships to other information, in support of the underlying model. The paper gives an overview of a generic virtual working system based on X.500 directory services and the World-Wide Web that can be used to support the Virtual Science Park. Finally the paper discusses some of the research issues that need to be addressed to fully realize a Virtual Science Park.
Experimenter's laboratory for visualized interactive science
NASA Technical Reports Server (NTRS)
Hansen, Elaine R.; Klemp, Marjorie K.; Lasater, Sally W.; Szczur, Marti R.; Klemp, Joseph B.
1992-01-01
The science activities of the 1990's will require the analysis of complex phenomena and large diverse sets of data. In order to meet these needs, we must take advantage of advanced user interaction techniques: modern user interface tools; visualization capabilities; affordable, high performance graphics workstations; and interoperable data standards and translators. To meet these needs, we propose to adopt and upgrade several existing tools and systems to create an experimenter's laboratory for visualized interactive science. Intuitive human-computer interaction techniques have already been developed and demonstrated at the University of Colorado. A Transportable Applications Executive (TAE+), developed at GSFC, is a powerful user interface tool for general purpose applications. A 3D visualization package developed by NCAR provides both color shaded surface displays and volumetric rendering in either index or true color. The Network Common Data Form (NetCDF) data access library developed by Unidata supports creation, access and sharing of scientific data in a form that is self-describing and network transparent. The combination and enhancement of these packages constitutes a powerful experimenter's laboratory capable of meeting key science needs of the 1990's. This proposal encompasses the work required to build and demonstrate this capability.
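The "self-describing" property of NetCDF that the proposal relies on is easy to demonstrate with the netCDF4 Python library: variable names, units, and dimensions travel inside the file, so any NetCDF-aware reader can interpret it without outside documentation. A minimal sketch with invented data:

```python
# Write and read a tiny self-describing NetCDF file.
import numpy as np
from netCDF4 import Dataset

with Dataset("demo.nc", "w") as nc:
    nc.createDimension("time", 24)
    t = nc.createVariable("temperature", "f4", ("time",))
    t.units = "K"                        # metadata travels with the data
    t.long_name = "surface air temperature"
    t[:] = 288.0 + 5.0 * np.random.rand(24)

with Dataset("demo.nc") as nc:           # any NetCDF-aware tool can do this
    var = nc.variables["temperature"]
    print(var.long_name, var.units, var[:].mean())
```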
Experimenter's laboratory for visualized interactive science
NASA Technical Reports Server (NTRS)
Hansen, Elaine R.; Klemp, Marjorie K.; Lasater, Sally W.; Szczur, Marti R.; Klemp, Joseph B.
1993-01-01
The science activities of the 1990's will require the analysis of complex phenomena and large diverse sets of data. In order to meet these needs, we must take advantage of advanced user interaction techniques: modern user interface tools; visualization capabilities; affordable, high performance graphics workstations; and interoperable data standards and translators. To meet these needs, we propose to adopt and upgrade several existing tools and systems to create an experimenter's laboratory for visualized interactive science. Intuitive human-computer interaction techniques have already been developed and demonstrated at the University of Colorado. A Transportable Applications Executive (TAE+), developed at GSFC, is a powerful user interface tool for general purpose applications. A 3D visualization package developed by NCAR provides both color-shaded surface displays and volumetric rendering in either index or true color. The Network Common Data Form (NetCDF) data access library developed by Unidata supports creation, access and sharing of scientific data in a form that is self-describing and network transparent. The combination and enhancement of these packages constitutes a powerful experimenter's laboratory capable of meeting key science needs of the 1990's. This proposal encompasses the work required to build and demonstrate this capability.
EarthCube GeoLink: Semantics and Linked Data for the Geosciences
NASA Astrophysics Data System (ADS)
Arko, R. A.; Carbotte, S. M.; Chandler, C. L.; Cheatham, M.; Fils, D.; Hitzler, P.; Janowicz, K.; Ji, P.; Jones, M. B.; Krisnadhi, A.; Lehnert, K. A.; Mickle, A.; Narock, T.; O'Brien, M.; Raymond, L. M.; Schildhauer, M.; Shepherd, A.; Wiebe, P. H.
2015-12-01
The NSF EarthCube initiative is building next-generation cyberinfrastructure to aid geoscientists in collecting, accessing, analyzing, sharing, and visualizing their data and knowledge. The EarthCube GeoLink Building Block project focuses on a specific set of software protocols and vocabularies, often characterized as the Semantic Web and "Linked Data", to publish data online in a way that is easily discoverable, accessible, and interoperable. GeoLink brings together specialists from the computer science, geoscience, and library science domains, and includes data from a network of NSF-funded repositories that support scientific studies in marine geology, marine ecosystems, biogeochemistry, and paleoclimatology. We are working collaboratively with closely-related Building Block projects including EarthCollab and CINERGI, and solicit feedback from RCN projects including Cyberinfrastructure for Paleogeosciences (C4P) and iSamples. GeoLink has developed a modular ontology that describes essential geoscience research concepts; published data from seven collections (to date) on the Web as geospatially-enabled Linked Data using this ontology; matched and mapped data between collections using shared identifiers for investigators, repositories, datasets, funding awards, platforms, research cruises, physical specimens, and gazetteer features; and aggregated the results in a shared knowledgebase that can be queried via a standard SPARQL endpoint. Client applications have been built around the knowledgebase, including a Web/map-based data browser using the Leaflet JavaScript library and a simple query service using the OpenSearch format. Future development will include extending and refining the GeoLink ontology, adding content from additional repositories, developing semi-automated algorithms to enhance metadata, and further work on client applications.
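Querying such a Linked Data knowledgebase typically goes through its SPARQL endpoint, as the abstract notes. The sketch below uses the SPARQLWrapper Python library; the endpoint URL, prefix, and property names are hypothetical placeholders rather than actual GeoLink ontology terms:

```python
# Sketch: list research cruises from a hypothetical SPARQL endpoint.
from SPARQLWrapper import SPARQLWrapper, JSON

sparql = SPARQLWrapper("http://example.org/geolink/sparql")  # hypothetical endpoint
sparql.setQuery("""
    PREFIX gl: <http://example.org/geolink/view#>   # hypothetical prefix
    SELECT ?cruise ?label WHERE {
        ?cruise a gl:Cruise ;
                gl:hasLabel ?label .
    } LIMIT 10
""")
sparql.setReturnFormat(JSON)
for row in sparql.query().convert()["results"]["bindings"]:
    print(row["cruise"]["value"], row["label"]["value"])
```

Because all seven repositories publish against the same ontology, one query like this can match cruises, specimens, or investigators across collections through their shared identifiers.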
Building bridges between the physical and biological sciences.
Ninham, B W; Boström, M
2005-12-16
This paper attempts to identify major conceptual issues that have inhibited the application of physical chemistry to problems in the biological sciences. We will trace out where theories went wrong, how to repair the present foundations, and discuss current progress toward building a better dialogue.
Courtyard between the library, at left, and the life sciences ...
Courtyard between the library, at left, and the life sciences building, at right. The north end of the administration building is just out of view to the right. - San Bernardino Valley College, 701 South Mount Vernon Avenue, San Bernardino, San Bernardino County, CA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Collins, W. E.
2004-08-16
Computational Science plays a big role in research and development in mathematics, science, engineering and biomedical disciplines. The Alliance for Computational Science Collaboration (ACSC) has the goal of training African-American and other minority scientists in the computational science field for eventual employment with the Department of Energy (DOE). The involvement of Historically Black Colleges and Universities (HBCUs) in the Alliance provides avenues for producing future DOE African-American scientists. Fisk University has been participating in this program through grants from the DOE. The DOE grant supported computational science activities at Fisk University. The research areas included energy-related projects, distributed computing, visualization of scientific systems and biomedical computing. Students' involvement in computational science research included undergraduate summer research at Oak Ridge National Lab, on-campus research involving the participation of undergraduates, participation of undergraduate students and faculty members in workshops, and mentoring of students. These activities enhanced research and education in computational science, thereby adding to Fisk University's spectrum of research and educational capabilities. Among the successes of the computational science activities is the acceptance of three undergraduate students to graduate schools with full scholarships beginning fall 2002 (one for a master's degree program and two for doctoral degree programs).
Theoretical basis of the DOE-2 building energy use analysis program
NASA Astrophysics Data System (ADS)
Curtis, R. B.
1981-04-01
A user-oriented, public-domain computer program was developed that enables architects and engineers to perform design and retrofit studies of the energy use of buildings under realistic weather conditions. DOE-2.1A has been named by the US DOE as the standard evaluation technique for the Congressionally mandated building energy performance standards (BEPS). A number of program design decisions were made that determine the breadth of applicability of DOE-2.1. Such design decisions are intrinsic to all building energy use analysis computer programs and determine the types of buildings or the kinds of HVAC systems that can be modeled. In particular, the weighting factor method used in DOE-2 has both advantages and disadvantages relative to other computer programs.
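The weighting factor method mentioned above computes each hour's load as a weighted combination of the current and preceding hours' heat gains, so the building's thermal mass delays and damps instantaneous gains. A worked Python sketch with illustrative weights (not DOE-2's actual coefficients):

```python
# Illustrative weighting-factor calculation: load responds to current and
# past heat gains, showing the delay-and-damp effect of thermal mass.
def hourly_loads(gains, weights=(0.5, 0.3, 0.15, 0.05)):
    loads = []
    for t in range(len(gains)):
        load = sum(w * gains[t - k] for k, w in enumerate(weights) if t - k >= 0)
        loads.append(round(load, 2))
    return loads

gains = [0, 0, 10, 10, 10, 0, 0, 0]   # kW of instantaneous heat gain
print(hourly_loads(gains))            # load persists after the gain stops
```

Running it shows the load ramping up over the first gain hours and tailing off after the gain ends, which is exactly the behavior the weighting factors are meant to capture.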
Changing the face of science: Lessons from the 2017 Science-A-Thon
NASA Astrophysics Data System (ADS)
Barnes, R. T.; Licker, R.; Burt, M. A.; Holloway, T.
2017-12-01
Studies have shown that over two-thirds of Americans cannot name a living scientist. This disconnect is a concern for science and scientists, considering the large role of public funding for science and the importance of science in many policy issues. As a large-scale public outreach initiative and fundraiser, the Earth Science Women's Network (ESWN) launched "Science-A-Thon" on July 13, 2017. This "day of science" invited participants to share 12 photos over 12 hours of a day, including both personal routines and professional endeavors. Over 200 scientists participated, with the #DayofScience hashtag trending on Twitter for the day. Earth scientists represented the largest portion of participants, but the event engaged cancer biologists, computer scientists, and more, including scientists from more than 10 countries. Science-A-Thon builds on the success and visibility of other social media campaigns, such as #actuallivingscientist and #DresslikeaWoman. Importantly, these efforts share a common goal: by providing diverse images of scientists, we can shift the public perception of who a scientist is and what science looks like in the real world. This type of public engagement offers a wide range of potential role models for students, and individual stories that increase public engagement with science. The actions and conversations emerging from Science-A-Thon included scientists talking about (1) their science and motivation, (2) the purpose of and need for ESWN, and (3) why they chose to participate; these conversations extended the reach of the social media campaign and fundraiser.
The NASA computer science research program plan
NASA Technical Reports Server (NTRS)
1983-01-01
A taxonomy of computer science is included, and the state of the art of each of the major computer science categories is summarized. A functional breakdown of NASA programs under Aeronautics R&D, Space R&T, and institutional support is also included. These areas were assessed against the computer science categories. Concurrent processing, highly reliable computing, and information management are identified.
Knowledge-Based Environmental Context Modeling
NASA Astrophysics Data System (ADS)
Pukite, P. R.; Challou, D. J.
2017-12-01
As we move from the oil-age to an energy infrastructure based on renewables, the need arises for new educational tools to support the analysis of geophysical phenomena and their behavior and properties. Our objective is to present models of these phenomena to make them amenable for incorporation into more comprehensive analysis contexts. Starting at the level of a college-level computer science course, the intent is to keep the models tractable and therefore practical for student use. Based on research performed via an open-source investigation managed by DARPA and funded by the Department of Interior [1], we have adapted a variety of physics-based environmental models for a computer-science curriculum. The original research described a semantic web architecture based on patterns and logical archetypal building-blocks (see figure) well suited for a comprehensive environmental modeling framework. The patterns span a range of features that cover specific land, atmospheric and aquatic domains intended for engineering modeling within a virtual environment. The modeling engine contained within the server relied on knowledge-based inferencing capable of supporting formal terminology (through NASA JPL's Semantic Web for Earth and Environmental Technology (SWEET) ontology and a domain-specific language) and levels of abstraction via integrated reasoning modules. One of the key goals of the research was to simplify models that were ordinarily computationally intensive to keep them lightweight enough for interactive or virtual environment contexts. The breadth of the elements incorporated is well-suited for learning as the trend toward ontologies and applying semantic information is vital for advancing an open knowledge infrastructure. As examples of modeling, we have covered such geophysics topics as fossil-fuel depletion, wind statistics, tidal analysis, and terrain modeling, among others. Techniques from the world of computer science will be necessary to promote efficient use of our renewable natural resources. [1] C2M2L (Component, Context, and Manufacturing Model Library) Final Report, https://doi.org/10.13140/RG.2.1.4956.3604
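As a sketch of the kind of lightweight, student-tractable model the abstract describes, the fragment below implements a logistic (Hubbert-style) fossil-fuel depletion curve, one of the listed geophysics topics; the function names and all parameter values (U, k, t0) are hypothetical and not taken from the cited work.

```python
import math

# Illustrative sketch only (not the authors' code): a logistic "Hubbert curve"
# model of cumulative fossil-fuel extraction. U is the ultimately recoverable
# resource, k the growth rate, t0 the peak year -- all hypothetical values.

def cumulative_extraction(t, U=2000.0, k=0.06, t0=2005.0):
    """Cumulative production Q(t) = U / (1 + exp(-k (t - t0)))."""
    return U / (1.0 + math.exp(-k * (t - t0)))

def production_rate(t, **params):
    """Annual production, approximated as dQ/dt via a central difference."""
    return (cumulative_extraction(t + 0.5, **params)
            - cumulative_extraction(t - 0.5, **params))

for year in (1980, 2005, 2030):
    print(year, round(production_rate(year), 1))  # rate peaks near t0
```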
On teaching computer ethics within a computer science department.
Quinn, Michael J
2006-04-01
The author has surveyed a quarter of the accredited undergraduate computer science programs in the United States. More than half of these programs offer a 'social and ethical implications of computing' course taught by a computer science faculty member, and there appears to be a trend toward teaching ethics classes within computer science departments. Although the decision to create an 'in house' computer ethics course may sometimes be a pragmatic response to pressure from the accreditation agency, this paper argues that teaching ethics within a computer science department can provide students and faculty members with numerous benefits. The paper lists topics that can be covered in a computer ethics course and offers some practical suggestions for making the course successful.
Computational Science News | Computational Science | NREL
News from the National Renewable Energy Laboratory (NREL) Computational Science Center, including an item on high-performance computing technology at the ESIF and the February 28, 2018 announcement that NREL launched a revamped website for users of the lab's high-performance computing (HPC) systems.
On Study of Building Smart Campus under Conditions of Cloud Computing and Internet of Things
NASA Astrophysics Data System (ADS)
Huang, Chao
2017-12-01
Cloud computing and the Internet of Things are two new concepts of the information era; although they are defined differently, they are closely related. Building a smart campus by virtue of cloud computing, the Internet of Things, and other internet technologies is a new measure to realize leap-forward development of the campus. This paper, centering on the construction of a smart campus, analyzes and compares the differences between the network in a traditional campus and that in a smart campus, and finally makes proposals on how to build a smart campus from the perspectives of cloud computing and the Internet of Things.
Empirical Determination of Competence Areas to Computer Science Education
ERIC Educational Resources Information Center
Zendler, Andreas; Klaudt, Dieter; Seitz, Cornelia
2014-01-01
The authors discuss empirically determined competence areas to K-12 computer science education, emphasizing the cognitive level of competence. The results of a questionnaire with 120 professors of computer science serve as a database. By using multi-dimensional scaling and cluster analysis, four competence areas to computer science education…
Factors Influencing Exemplary Science Teachers' Levels of Computer Use
ERIC Educational Resources Information Center
Hakverdi, Meral; Dana, Thomas M.; Swain, Colleen
2011-01-01
The purpose of this study was to examine exemplary science teachers' use of technology in science instruction, factors influencing their level of computer use, their level of knowledge/skills in using specific computer applications for science instruction, their use of computer-related applications/tools during their instruction, and their…
Preparing Future Secondary Computer Science Educators
ERIC Educational Resources Information Center
Ajwa, Iyad
2007-01-01
Although nearly every college offers a major in computer science, many computer science teachers at the secondary level have received little formal training. This paper presents details of a project that could make a significant contribution to national efforts to improve computer science education by combining teacher education and professional…
NASA Astrophysics Data System (ADS)
Bender, Jason D.
Understanding hypersonic aerodynamics is important for the design of next-generation aerospace vehicles for space exploration, national security, and other applications. Ground-level experimental studies of hypersonic flows are difficult and expensive; thus, computational science plays a crucial role in this field. Computational fluid dynamics (CFD) simulations of extremely high-speed flows require models of chemical and thermal nonequilibrium processes, such as dissociation of diatomic molecules and vibrational energy relaxation. Current models are outdated and inadequate for advanced applications. We describe a multiscale computational study of gas-phase thermochemical processes in hypersonic flows, starting at the atomic scale and building systematically up to the continuum scale. The project was part of a larger effort centered on collaborations between aerospace scientists and computational chemists. We discuss the construction of potential energy surfaces for the N4, N2O2, and O4 systems, focusing especially on the multi-dimensional fitting problem. A new local fitting method named L-IMLS-G2 is presented and compared with a global fitting method. Then, we describe the theory of the quasiclassical trajectory (QCT) approach for modeling molecular collisions. We explain how we implemented the approach in a new parallel code for high-performance computing platforms. Results from billions of QCT simulations of high-energy N2 + N2, N2 + N, and N2 + O2 collisions are reported and analyzed. Reaction rate constants are calculated and sets of reactive trajectories are characterized at both thermal equilibrium and nonequilibrium conditions. The data shed light on fundamental mechanisms of dissociation and exchange reactions -- and their coupling to internal energy transfer processes -- in thermal environments typical of hypersonic flows. We discuss how the outcomes of this investigation and other related studies lay a rigorous foundation for new macroscopic models for hypersonic CFD. This research was supported by the Department of Energy Computational Science Graduate Fellowship and by the Air Force Office of Scientific Research Multidisciplinary University Research Initiative.
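A minimal sketch of the QCT bookkeeping described above, under loud assumptions: the trajectory integration on a potential energy surface is replaced by a toy "reacts if the impact parameter is small and the sampled collision energy clears an invented barrier" rule, so only the impact-parameter sampling and the conversion of a reaction probability into a thermal rate constant are illustrated.

```python
import math
import random

KB = 1.380649e-23  # Boltzmann constant, J/K

def qct_rate_constant(T, mu, b_max, n_traj=100_000, seed=1):
    """Toy QCT estimator: k(T) = <v_rel> * pi * b_max^2 * P_react.

    A real QCT code would integrate Hamilton's equations on a fitted
    potential energy surface for each sampled trajectory; here a crude
    surrogate 'reacts' when the impact parameter is small and the
    sampled collision energy exceeds a hypothetical barrier.
    """
    rng = random.Random(seed)
    barrier = 1.5e-19  # J, invented activation energy
    mean_speed = math.sqrt(8.0 * KB * T / (math.pi * mu))
    reacted = 0
    for _ in range(n_traj):
        b = b_max * math.sqrt(rng.random())                # uniform in pi*b^2
        e_col = -KB * T * math.log(1.0 - rng.random())     # ~exponential energy
        if b < 0.4 * b_max and e_col > barrier:
            reacted += 1
    p_react = reacted / n_traj
    return mean_speed * math.pi * b_max**2 * p_react  # m^3/s per molecule pair

# N2 + N2 style reduced mass and a 4 Angstrom b_max, purely for illustration
mu = 0.5 * 28 * 1.66054e-27
print(qct_rate_constant(5000.0, mu, 4e-10))
```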
NASA Astrophysics Data System (ADS)
Mentzel, C.
2017-12-01
Modern scientific data continue to increase in volume, variety, and velocity, and though the hype of big data has subsided, its usefulness for scientific discovery has only just begun. Harnessing these data for new insights, more efficient decision making, and other mission-critical uses requires a combination of skills and expertise, often labeled data science. Data science can be thought of as a combination of statistics, computation, and the domain from which the data arise, and so is a truly interdisciplinary pursuit. Though it has reaped large benefits in companies able to afford the high cost of a severely limited talent pool, it suffers from a lack of support in mission-driven organizations. Because it does not sit purely within any one historical field, data science is proving difficult to house in traditional university academic departments and other research organizations. The landscape of data science efforts, from academia, industry, and government, can be characterized as nascent, enthusiastic, uneven, and highly competitive. Part of the challenge in documenting these trends is the lack of agreement about what data science is and who is a data scientist. Defining these terms too closely and too early runs the risk of cutting off a tremendous amount of productive creativity, but waiting too long leaves many people without a sustainable career and many organizations without the skills needed to gain value from their data. This talk will explore the landscape of data science efforts in the US, including how organizations are building and sustaining data science teams.
OPENING REMARKS: SciDAC: Scientific Discovery through Advanced Computing
NASA Astrophysics Data System (ADS)
Strayer, Michael
2005-01-01
Good morning. Welcome to SciDAC 2005 and San Francisco. SciDAC is all about computational science and scientific discovery. In a large sense, computational science characterizes SciDAC and its intent is change. It transforms both our approach and our understanding of science. It opens new doors and crosses traditional boundaries while seeking discovery. In terms of twentieth-century methodologies, computational science may be said to be transformational. There are a number of examples to this point. First are the sciences that encompass climate modeling. The application of computational science has in essence created the field of climate modeling. This community is now international in scope and has provided precision results that are challenging our understanding of our environment. A second example is that of lattice quantum chromodynamics. Lattice QCD, while adding precision and insight to our fundamental understanding of strong interaction dynamics, has transformed our approach to particle and nuclear science. The individual investigator approach has evolved to teams of scientists from different disciplines working side by side towards a common goal. SciDAC is also undergoing a transformation. This meeting is a prime example. Last year it was a small programmatic meeting tracking progress in SciDAC. This year, we have a major computational science meeting with a variety of disciplines and enabling technologies represented. SciDAC 2005 should position itself as a new cornerstone for computational science and its impact on science. As we look to the immediate future, FY2006 will bring a new cycle to SciDAC. Most of the program elements of SciDAC will be re-competed in FY2006. The re-competition will involve new instruments for computational science, new approaches for collaboration, as well as new disciplines. There will be new opportunities for virtual experiments in carbon sequestration, fusion, and nuclear power and nuclear waste, as well as collaborations with industry and virtual prototyping. New instruments of collaboration will include institutes and centers, while summer schools, workshops, and outreach will invite new talent and expertise. Computational science adds new dimensions to science and its practice. The disciplines of fusion, accelerator science, and combustion are poised to blur the boundaries between pure and applied science. As we open the door into FY2006 we shall see a landscape of new scientific challenges: in biology, chemistry, materials, and astrophysics, to name a few. The enabling technologies of SciDAC have been transformational as drivers of change. Planning for major new software systems assumes a baseline employing Common Component Architectures, and this has become a household word for new software projects. While grid algorithms and mesh refinement software have transformed applications software, data management and visualization have transformed our understanding of science from data. The Gordon Bell prize now seems to be dominated by computational science and solvers developed by the TOPS ISIC. The priorities of the Office of Science in the Department of Energy are clear. The 20-year facilities plan is driven by new science. High-performance computing is placed among the two highest priorities. Moore's law says that by the end of the next cycle of SciDAC we shall have petaflop computers. The challenges of petascale computing are enormous. These and the associated computational science are the highest priorities for computing within the Office of Science.
Our effort in Leadership Class computing is just a first step towards this goal. Clearly, computational science at this scale will face enormous challenges and possibilities. Performance evaluation and prediction will be critical to unraveling the needed software technologies. We must not lose sight of our overarching goal: that of scientific discovery. Science does not stand still, and the landscape of science discovery and computing holds immense promise. In this environment, I believe it is necessary to institute a system of science-based performance metrics to help quantify our progress towards science goals and scientific computing. As a final comment I would like to reaffirm that the shifting landscapes of science will force changes to our computational sciences, and leave you with the quote from Richard Hamming: 'The purpose of computing is insight, not numbers'.
Li, Y; Nielsen, P V
2011-12-01
There has been a rapid growth of scientific literature on the application of computational fluid dynamics (CFD) in the research of ventilation and indoor air science. With a 1,000- to 10,000-fold increase in computer hardware capability over the past 20 years, CFD has become an integral part of scientific research and engineering development of complex air distribution and ventilation systems in buildings. This review discusses the major and specific challenges of CFD in terms of turbulence modelling, numerical approximation, and boundary conditions relevant to building ventilation. We emphasize the growing need for CFD verification and validation, suggest ongoing needs for analytical and experimental methods to support the numerical solutions, and discuss the growing capacity of CFD in opening up new research areas. We suggest that CFD has not become a replacement for experiment and theoretical analysis in ventilation research; rather, it has become an increasingly important partner. We believe that an effective scientific approach for ventilation studies is still to combine experiments, theory, and CFD. We argue that CFD verification and validation are becoming more crucial than ever as more complex ventilation problems are solved. It is anticipated that ventilation problems at the city scale will be tackled by CFD in the next 10 years. © 2011 John Wiley & Sons A/S.
Energy consumption program: A computer model simulating energy loads in buildings
NASA Technical Reports Server (NTRS)
Stoller, F. W.; Lansing, F. L.; Chai, V. W.; Higgins, S.
1978-01-01
The JPL energy consumption computer program, developed as a useful tool in the ongoing building modification studies of the DSN energy conservation project, is described. The program simulates building heating and cooling loads and computes thermal and electric energy consumption and cost. Accuracy is not sacrificed, however, since the results lie within a ±10 percent margin of readings from energy meters. The program is carefully structured to reduce both the user's time and running cost by requiring minimal information from the user and eliminating many internal time-consuming computational loops. Many unique features not found in any other program were added to handle two-level electronics control rooms.
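The JPL program itself is not reproduced here; as a hedged stand-in for what simulating heating and cooling loads and computing energy cost can mean at its simplest, this sketch uses a degree-day-style balance with invented building parameters (UA value, balance temperatures, COP, tariff).

```python
# Minimal illustrative sketch (not the JPL program): estimate daily heating
# and cooling energy from outdoor temperatures with a UA-based load model.
# UA, setpoints, COP, and tariff are hypothetical values.

UA = 450.0                    # overall heat-loss coefficient, W/K
T_HEAT, T_COOL = 18.0, 24.0   # heating/cooling balance temperatures, deg C
COP, RATE = 3.0, 0.12         # cooling COP, $ per kWh

def daily_energy_kwh(t_out_mean):
    """Return (heating_kWh, cooling_kWh) for one day at mean temp t_out_mean."""
    hours = 24.0
    heating = max(T_HEAT - t_out_mean, 0.0) * UA * hours / 1000.0
    cooling = max(t_out_mean - T_COOL, 0.0) * UA * hours / (1000.0 * COP)
    return heating, cooling

days = [2.0, 10.0, 21.0, 30.0]  # example mean outdoor temperatures, deg C
total = sum(sum(daily_energy_kwh(t)) for t in days)
print(f"energy: {total:.1f} kWh, cost: ${total * RATE:.2f}")
```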
Science strategy for Core Science Systems in the U.S. Geological Survey, 2013-2023
Bristol, R. Sky; Euliss, Ned H.; Booth, Nathaniel L.; Burkardt, Nina; Diffendorfer, Jay E.; Gesch, Dean B.; McCallum, Brian E.; Miller, David M.; Morman, Suzette A.; Poore, Barbara S.; Signell, Richard P.; Viger, Roland J.
2012-01-01
Core Science Systems is a new mission of the U.S. Geological Survey (USGS) that grew out of the 2007 Science Strategy, “Facing Tomorrow’s Challenges: U.S. Geological Survey Science in the Decade 2007–2017.” This report describes the vision for this USGS mission and outlines a strategy for Core Science Systems to facilitate integrated characterization and understanding of the complex Earth system. The vision and suggested actions are bold and far-reaching, describing a conceptual model and framework to enhance the ability of the USGS to bring its core strengths to bear on pressing societal problems through data integration and scientific synthesis across the breadth of science. The context of this report is inspired by a direction set forth in the 2007 Science Strategy. Specifically, ecosystem-based approaches provide the underpinnings for essentially all science themes that define the USGS. Every point on Earth falls within a specific ecosystem where data, other information assets, and the expertise of the USGS and its many partners can be employed to quantitatively understand how that ecosystem functions and how it responds to natural and anthropogenic disturbances. Every benefit society obtains from the planet (food, water, raw materials to build infrastructure, homes, and automobiles, fuel to heat homes and cities, and many others) is derived from or affects ecosystems. The vision for Core Science Systems builds on core strengths of the USGS in characterizing and understanding complex Earth and biological systems through research, modeling, mapping, and the production of high-quality data on the nation’s natural resource infrastructure. Together, these research activities provide a foundation for ecosystem-based approaches through geologic mapping, topographic mapping, and biodiversity mapping. The vision describes a framework founded on these core mapping strengths that makes it easier for USGS scientists to discover critical information, share and publish results, and identify potential collaborations that transcend all USGS missions. The framework is designed to improve the efficiency of scientific work within the USGS by establishing a means to preserve and recall data for future applications, organizing existing scientific knowledge and data to facilitate new use of older information, and establishing a future workflow that naturally integrates new data, applications, and other science products to make it easier and more efficient to conduct interdisciplinary research over time. Given the increasing need for integrated data and interdisciplinary approaches to solve modern problems, leadership by the Core Science Systems mission will facilitate problem solving by all USGS missions in ways not formerly possible. The report lays out a strategy to achieve this vision through three goals with accompanying objectives and actions. The first goal builds on and enhances the strengths of the Core Science Systems mission in characterizing and understanding the Earth system from the geologic framework to the topographic characteristics of the land surface and biodiversity across the nation. The second goal enhances and develops new strengths in computer and information science to make it easier for USGS scientists to discover data and models, share and publish results, and discover connections between scientific information and knowledge. 
The third goal brings additional focus to research and development methods to address complex issues affecting society that require integration of knowledge and new methods for synthesizing scientific information. Collectively, the report lays out a strategy to create a seamless connection between all USGS activities to accelerate and make USGS science more efficient by fully integrating disciplinary expertise within a new and evolving science paradigm for a changing world in the 21st century.
Executable research compendia in geoscience research infrastructures
NASA Astrophysics Data System (ADS)
Nüst, Daniel
2017-04-01
From generation through analysis and collaboration to communication, scientific research requires the right tools. Scientists create their own software using third-party libraries and platforms. Cloud computing, Open Science, public data infrastructures, and Open Source offer scientists unprecedented opportunities, nowadays often in a field "Computational X" (e.g. computational seismology) or X-informatics (e.g. geoinformatics) [0]. This increases complexity and generates more innovation, e.g. Environmental Research Infrastructures (environmental RIs [1]). Researchers in Computational X write their software relying on both source code (e.g. from https://github.com) and binary libraries (e.g. from package managers such as APT, https://wiki.debian.org/Apt, or CRAN, https://cran.r-project.org/). They download data from domain-specific (cf. https://re3data.org) or generic (e.g. https://zenodo.org) data repositories, and deploy computations remotely (e.g. the European Open Science Cloud). The results themselves are archived, given persistent identifiers, connected to other works (e.g. using https://orcid.org/), and listed in metadata catalogues. A single researcher, intentionally or not, interacts with all sub-systems of RIs: data acquisition, data access, data processing, data curation, and community support [2]. To preserve computational research, [3] proposes the Executable Research Compendium (ERC), a container format that closes the gap of dependency preservation by encapsulating the runtime environment. ERCs and RIs can be integrated for different uses: (i) Coherence: ERC services validate completeness, integrity, and results; (ii) Metadata: ERCs connect the different parts of a piece of research and facilitate discovery; (iii) Exchange and Preservation: ERCs as usable building blocks are the shared and archived entities; (iv) Self-consistency: ERCs remove dependence on ephemeral sources; (v) Execution: ERC services create and execute a packaged analysis but integrate with existing platforms for display and control. These integrations are vital for capturing workflows in RIs and connect key stakeholders (scientists, publishers, librarians). They are demonstrated using developments by the DFG-funded project Opening Reproducible Research (http://o2r.info). Semi-automatic creation of ERCs based on research workflows is a core goal of the project. References [0] Tony Hey, Stewart Tansley, Kristin Tolle (eds), 2009. The Fourth Paradigm: Data-Intensive Scientific Discovery. Microsoft Research. [1] P. Martin et al., Open Information Linking for Environmental Research Infrastructures, 2015 IEEE 11th International Conference on e-Science, Munich, 2015, pp. 513-520. doi: 10.1109/eScience.2015.66 [2] Y. Chen et al., Analysis of Common Requirements for Environmental Science Research Infrastructures, The International Symposium on Grids and Clouds (ISGC) 2013, Taipei, 2013, http://pos.sissa.it/archive/conferences/179/032/ISGC [3] Opening Reproducible Research, Geophysical Research Abstracts Vol. 18, EGU2016-7396, 2016, http://meetingorganizer.copernicus.org/EGU2016/EGU2016-7396.pdf
Are "New Building" Learning Gains Sustainable?
ERIC Educational Resources Information Center
Walczak, Mary M.; Van Wylen, David G. L.
2015-01-01
New science facilities have become a reality on many college campuses in the last few decades. Large time investments in creating shared programmatic vision and designing flexible spaces, partnered with large fiscal investments, have created a new generation of science buildings. Unfortunately, few studies provide evidence about whether the…
Chemistry in a Large, Multidisciplinary Laboratory.
ERIC Educational Resources Information Center
Lingren, Wesley E.; Hughson, Robert C.
1982-01-01
Describes a science facility built at Seattle Pacific University for approximately 70 percent of the capital cost of a conventional science building. The building serves seven disciplines on a regular basis. The operation of the multidisciplinary laboratory, special features, laboratory security, and student experience/reactions are highlighted.…
Requirements Engineering in Building Climate Science Software
ERIC Educational Resources Information Center
Batcheller, Archer L.
2011-01-01
Software has an important role in supporting scientific work. This dissertation studies teams that build scientific software, focusing on the way that they determine what the software should do. These requirements engineering processes are investigated through three case studies of climate science software projects. The Earth System Modeling…
ERIC Educational Resources Information Center
Forwood, Bruce S.
This bibliography has been produced as part of a research program attempting to develop a new approach to building environment and service systems design using computer-aided design techniques. As such it not only classifies available literature on the service systems themselves, but also contains sections on the application of computers and…
Sequential visibility-graph motifs
NASA Astrophysics Data System (ADS)
Iacovacci, Jacopo; Lacasa, Lucas
2016-04-01
Visibility algorithms transform time series into graphs and encode dynamical information in their topology, paving the way for graph-theoretical time series analysis as well as building a bridge between nonlinear dynamics and network science. In this work we introduce and study the concept of sequential visibility-graph motifs, smaller substructures of n consecutive nodes that appear with characteristic frequencies. We develop a theory to compute in an exact way the motif profiles associated with general classes of deterministic and stochastic dynamics. We find that this simple property is indeed a highly informative and computationally efficient feature capable of distinguishing among different dynamics and robust against noise contamination. We finally confirm that it can be used in practice to perform unsupervised learning, by extracting motif profiles from experimental heart-rate series and being able, accordingly, to disentangle meditative from other relaxation states. Applications of this general theory include the automatic classification and description of physical, biological, and financial time series.
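A minimal sketch of the computation the abstract describes, assuming the standard natural-visibility criterion: slide a window of n consecutive points along the series, build the visibility graph inside each window, and tabulate the relative frequency of each adjacency pattern. Window size and the test series are illustrative choices.

```python
from itertools import combinations
from collections import Counter
import random

def visible(x, i, j):
    """Natural visibility criterion between points i < j of series x."""
    return all(x[k] < x[j] + (x[i] - x[j]) * (j - k) / (j - i)
               for k in range(i + 1, j))

def motif_profile(x, n=4):
    """Relative frequencies of size-n sequential visibility-graph motifs."""
    counts = Counter()
    for start in range(len(x) - n + 1):
        w = x[start:start + n]
        # encode the motif as the set of visible pairs within the window
        motif = frozenset((i, j) for i, j in combinations(range(n), 2)
                          if visible(w, i, j))
        counts[motif] += 1
    total = sum(counts.values())
    return {m: c / total for m, c in counts.items()}

random.seed(0)
series = [random.random() for _ in range(500)]  # white-noise test series
for motif, freq in sorted(motif_profile(series).items(), key=lambda kv: -kv[1]):
    print(sorted(motif), round(freq, 3))
```

Different dynamics (periodic, chaotic, stochastic) leave different fingerprints in these frequencies, which is what makes the profile usable as a classification feature.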
Federated and Cloud Enabled Resources for Data Management and Utilization
NASA Astrophysics Data System (ADS)
Rankin, R.; Gordon, M.; Potter, R. G.; Satchwill, B.
2011-12-01
The emergence of cloud computing over the past three years has led to a paradigm shift in how data can be managed, processed and made accessible. Building on the federated data management system offered through the Canadian Space Science Data Portal (www.cssdp.ca), we demonstrate how heterogeneous and geographically distributed data sets and modeling tools have been integrated to form a virtual data center and computational modeling platform that has services for data processing and visualization embedded within it. We also discuss positive and negative experiences in utilizing Eucalyptus and OpenStack cloud applications, and job scheduling facilitated by Condor and Star Cluster. We summarize our findings by demonstrating use of these technologies in the Cloud Enabled Space Weather Data Assimilation and Modeling Platform CESWP (www.ceswp.ca), which is funded through Canarie's (canarie.ca) Network Enabled Platforms program in Canada.
Furenlid, Lars R.; Barrett, Harrison H.; Barber, H. Bradford; Clarkson, Eric W.; Kupinski, Matthew A.; Liu, Zhonglin; Stevenson, Gail D.; Woolfenden, James M.
2015-01-01
During the past two decades, researchers at the University of Arizona’s Center for Gamma-Ray Imaging (CGRI) have explored a variety of approaches to gamma-ray detection, including scintillation cameras, solid-state detectors, and hybrids such as the intensified Quantum Imaging Device (iQID) configuration where a scintillator is followed by optical gain and a fast CCD or CMOS camera. We have combined these detectors with a variety of collimation schemes, including single and multiple pinholes, parallel-hole collimators, synthetic apertures, and anamorphic crossed slits, to build a large number of preclinical molecular-imaging systems that perform Single-Photon Emission Computed Tomography (SPECT), Positron Emission Tomography (PET), and X-Ray Computed Tomography (CT). In this paper, we discuss the themes and methods we have developed over the years to record and fully use the information content carried by every detected gamma-ray photon. PMID:26236069
Building Science Identity in Disadvantaged Teenage Girls using an Apprenticeship Model
NASA Astrophysics Data System (ADS)
Pettit, E. C.; Conner, L.; Tzou, C.
2015-12-01
Expeditionary science differs from laboratory science in that expeditionary science teams conduct investigations in conditions that are often physically and socially, as well as intellectually, challenging. Team members live in close quarters for extended periods of time, team building and leadership affect the scientific process, and research tools are limited to what is available on site. Girls on Ice is an expeditionary science experience primarily for disadvantaged girls; it fully immerses girls in a mini scientific expedition to study alpine, glacierized environments. In addition to mentoring the girls through conducting their own scientific research, we encourage awareness and discussion of different sociocultural perspectives on the relation between the natural world, science, and society. The experience aligns closely with the apprenticeship model of learning, which can be effective in enhancing identification with science. Using a mixed-methods approach, we show that the Girls on Ice model helps girls (1) increase their interest and engagement in science and build a stronger science identity; (2) develop confidence, importantly a combined physical and intellectual confidence; (3) engage in authentic scientific thinking, including critical thinking and problem solving; and (4) enhance leadership self-confidence. We discuss these results in a learning sciences framework, which posits that learning is inseparable from the social and physical contexts in which it takes place.
Programmers, professors, and parasites: credit and co-authorship in computer science.
Solomon, Justin
2009-12-01
This article presents an in-depth analysis of past and present publishing practices in academic computer science to suggest the establishment of a more consistent publishing standard. Historical precedent for academic publishing in computer science is established through the study of anecdotes as well as statistics collected from databases of published computer science papers. After examining these facts alongside information about analogous publishing situations and standards in other scientific fields, the article concludes with a list of basic principles that should be adopted in any computer science publishing standard. These principles would contribute to the reliability and scientific nature of academic publications in computer science and would allow for more straightforward discourse in future publications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cai, Jie; Kim, Donghun; Braun, James E.
It is important to have practical methods for constructing a good mathematical model of a building's thermal system for energy audits, retrofit analysis, and advanced building controls, e.g. model predictive control. Identification approaches based on semi-physical model structures are popular in building science for these purposes. However, conventional gray-box identification approaches applied to thermal networks can fail when significant unmeasured heat gains are present in the estimation data. Although this situation is common in practice, there has been little research tackling this issue in building science. This paper presents an overall identification approach to alleviate the influence of unmeasured disturbances and hence to obtain improved gray-box building models. The approach was applied to an existing open-space building and its performance is demonstrated.
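A hedged sketch of the failure mode the paper addresses, not the authors' remedy: fit a simple discrete 1R1C zone model to synthetic data by ordinary least squares while ignoring an unmeasured internal gain, and observe the biased parameter estimates. Model form, coefficients, and data are all invented.

```python
import numpy as np

# Illustrative 1R1C gray-box identification sketch (synthetic data, not the
# paper's method). Discrete model: T[k+1] = a*T[k] + b*T_out[k] + c*q[k],
# where a, b, c encode the zone's thermal resistance and capacitance.

rng = np.random.default_rng(0)
n = 500
T_out = 10 + 5 * np.sin(np.arange(n) * 2 * np.pi / 96)   # outdoor temperature
q = rng.uniform(0, 1000, n)                               # measured heat input, W
unmeasured = 300 * (np.arange(n) % 96 > 48)               # hidden internal gains

a_true, b_true, c_true = 0.95, 0.05, 1e-3
T = np.empty(n)
T[0] = 20.0
for k in range(n - 1):
    T[k + 1] = a_true * T[k] + b_true * T_out[k] + c_true * (q[k] + unmeasured[k])

# Least-squares fit that ignores the unmeasured gains: estimates come out biased.
X = np.column_stack([T[:-1], T_out[:-1], q[:-1]])
a, b, c = np.linalg.lstsq(X, T[1:], rcond=None)[0]
print("true:", a_true, b_true, c_true)
print("fit :", round(a, 4), round(b, 4), round(c, 6))
```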
Increasing Diversity in Computer Science: Acknowledging, yet Moving Beyond, Gender
NASA Astrophysics Data System (ADS)
Larsen, Elizabeth A.; Stubbs, Margaret L.
Lack of diversity within the computer science field has, thus far, been examined most fully through the lens of gender. This article is based on a follow-on to Margolis and Fisher's (2002) study and includes interviews with 33 Carnegie Mellon University students from the undergraduate senior class of 2002 in the School of Computer Science. We found evidence of similarities among the perceptions of these women and men on definitions of computer science, explanations for the notoriously low proportion of women in the field, characterizations of a typical computer science student, impressions of recent curricular changes, a sense of the atmosphere/culture in the program, views of the Women@SCS campus organization, and suggestions for attracting and retaining well-rounded students in computer science. We conclude that efforts to increase diversity in the computer science field will benefit from a more broad-based approach that considers, but is not limited to, notions of gender difference.
Evolution of the Virtualized HPC Infrastructure of Novosibirsk Scientific Center
NASA Astrophysics Data System (ADS)
Adakin, A.; Anisenkov, A.; Belov, S.; Chubarov, D.; Kalyuzhny, V.; Kaplin, V.; Korol, A.; Kuchin, N.; Lomakin, S.; Nikultsev, V.; Skovpen, K.; Sukharev, A.; Zaytsev, A.
2012-12-01
Novosibirsk Scientific Center (NSC), also known worldwide as Akademgorodok, is one of the largest Russian scientific centers, hosting Novosibirsk State University (NSU) and more than 35 research organizations of the Siberian Branch of the Russian Academy of Sciences, including the Budker Institute of Nuclear Physics (BINP), the Institute of Computational Technologies, and the Institute of Computational Mathematics and Mathematical Geophysics (ICM&MG). Since each institute has specific requirements on the architecture of the computing farms involved in its research field, there are currently several computing facilities hosted by NSC institutes, each optimized for a particular set of tasks; the largest of these are the NSU Supercomputer Center, the Siberian Supercomputer Center (ICM&MG), and the Grid Computing Facility of BINP. A dedicated optical network with an initial bandwidth of 10 Gb/s connecting these three facilities was built in order to make it possible to share computing resources among the research communities, thus increasing the efficiency of operating the existing computing facilities and offering a common platform for building the computing infrastructure for future scientific projects. Unification of the computing infrastructure is achieved by extensive use of virtualization technology based on the XEN and KVM platforms. This contribution gives a thorough review of the present status and future development prospects for the NSC virtualized computing infrastructure and the experience gained while using it to run production data analysis jobs related to HEP experiments carried out at BINP, especially the KEDR detector experiment at the VEPP-4M electron-positron collider.
ERIC Educational Resources Information Center
Abuzaghleh, Omar; Goldschmidt, Kathleen; Elleithy, Yasser; Lee, Jeongkyu
2013-01-01
With the advances in computing power, high-performance computing (HPC) platforms have had an impact on not only scientific research in advanced organizations but also computer science curriculum in the educational community. For example, multicore programming and parallel systems are highly desired courses in the computer science major. However,…
Computer Science and the Liberal Arts
ERIC Educational Resources Information Center
Shannon, Christine
2010-01-01
Computer science and the liberal arts have much to offer each other. Yet liberal arts colleges, in particular, have been slow to recognize the opportunity that the study of computer science provides for achieving the goals of a liberal education. After the precipitous drop in computer science enrollments during the first decade of this century,…
Marrying Content and Process in Computer Science Education
ERIC Educational Resources Information Center
Zendler, A.; Spannagel, C.; Klaudt, D.
2011-01-01
Constructivist approaches to computer science education emphasize that as well as knowledge, thinking skills and processes are involved in active knowledge construction. K-12 computer science curricula must not be based on fashions and trends, but on contents and processes that are observable in various domains of computer science, that can be…
ERIC Educational Resources Information Center
Master, Allison; Cheryan, Sapna; Meltzoff, Andrew N.
2016-01-01
Computer science has one of the largest gender disparities in science, technology, engineering, and mathematics. An important reason for this disparity is that girls are less likely than boys to enroll in necessary "pipeline courses," such as introductory computer science. Two experiments investigated whether high-school girls' lower…
Approaching Gender Parity: Women in Computer Science at Afghanistan's Kabul University
ERIC Educational Resources Information Center
Plane, Jandelyn
2010-01-01
This study explores the representation of women in computer science at the tertiary level through data collected about undergraduate computer science education at Kabul University in Afghanistan. Previous studies have theorized reasons for underrepresentation of women in computer science, and while many of these reasons are indeed present in…
Some Hail 'Computational Science' as Biggest Advance Since Newton, Galileo.
ERIC Educational Resources Information Center
Turner, Judith Axler
1987-01-01
Computational science is defined as science done on a computer. A computer can serve as a laboratory for researchers who cannot experiment with their subjects, and as a calculator for those who otherwise might need centuries to solve some problems mathematically. The National Science Foundation's support of supercomputers is discussed. (MLW)
African-American males in computer science---Examining the pipeline for clogs
NASA Astrophysics Data System (ADS)
Stone, Daryl Bryant
The literature on African-American males (AAM) begins with a statement to the effect that "Today young Black men are more likely to be killed or sent to prison than to graduate from college." Why are the numbers of African-American male college graduates decreasing? Why are those enrolled in college not majoring in the science, technology, engineering, and mathematics (STEM) disciplines? This research explored why African-American males are not filling the well-recognized industry need for computer scientists and technologists by choosing college tracks toward these careers. The literature on the STEM disciplines focuses largely on women in STEM, as opposed to minorities, and within minorities there is a noticeable research gap in addressing the needs and opportunities available to African-American males. The primary goal of this study was therefore to examine the computer science "pipeline" from the African-American male perspective. The method included distributing a "Computer Science Degree Self-Efficacy Scale" to five groups of African-American male students: (1) fourth graders, (2) eighth graders, (3) eleventh graders, (4) underclass undergraduate computer science majors, and (5) upperclass undergraduate computer science majors. In addition to the 30-question self-efficacy test, subjects from each group were asked to participate in a group discussion about "African-American males in computer science." The audio record of each group meeting provides qualitative data for the study. The hypotheses include the following: (1) There is no significant difference in "Computer Science Degree" self-efficacy between fourth and eighth graders. (2) There is no significant difference in "Computer Science Degree" self-efficacy between eighth and eleventh graders. (3) There is no significant difference in "Computer Science Degree" self-efficacy between eleventh graders and lower-level computer science majors. (4) There is no significant difference in "Computer Science Degree" self-efficacy between lower-level computer science majors and upper-level computer science majors. (5) There is no significant difference in "Computer Science Degree" self-efficacy between each of the five groups of students. Finally, the researcher selected African-American male students attending six primary schools, including the predominately African-American elementary, middle, and high school that the researcher attended during his own academic career, as well as a racially mixed elementary, middle, and high school from the same county in Maryland. Bowie State University provided both the underclass and upperclass computer science majors surveyed in this study. Of the five hypotheses, the sample provided enough evidence to support the claim that there are significant differences in "Computer Science Degree" self-efficacy between the five groups of students. ANOVA analysis by question and by total self-efficacy score provided further statistically significant results. Additionally, factor analysis and review of the qualitative data provide more insightful results. Overall, the data suggest that 'a clog' may exist at the middle school level and that students attending racially mixed schools were more confident in their computer, math, and science skills. African-American males admit to spending a great deal of time on social networking websites and emailing, but are unaware of the skills and knowledge needed to study in the computing disciplines. 
The majority of the subjects knew few, if any, AAMs in the 'computing discipline pipeline'. The collegiate African-American males in this study agree that computer programming is a difficult area and serves as a 'major clog in the pipeline'.
Development of EarthCube Governance: An Agile Approach
NASA Astrophysics Data System (ADS)
Pearthree, G.; Allison, M. L.; Patten, K.
2013-12-01
Governance of geosciences cyberinfrastructure is a complex and essential undertaking, critical in enabling distributed knowledge communities to collaborate and communicate across disciplines, distances, and cultures. Advancing science with respect to "grand challenges," such as global climate change, weather prediction, and core fundamental science, depends not just on technical cyber systems, but also on social systems for strategic planning, decision-making, project management, learning, teaching, and building a community of practice. Simply put, a robust, agile technical system depends on an equally robust and agile social system. Cyberinfrastructure development is wrapped in social, organizational, and governance challenges, which may significantly impede progress. An agile development process is underway for governance of transformative investments in geosciences cyberinfrastructure through the NSF EarthCube initiative. Agile development is iterative and incremental, and promotes adaptive planning and rapid and flexible response. Such iterative deployment across a variety of EarthCube stakeholders encourages transparency, consensus, accountability, and inclusiveness. A project Secretariat acts as the coordinating body, carrying out duties for planning, organizing, communicating, and reporting. A broad coalition of stakeholder groups comprises an Assembly (Mainstream Scientists, Cyberinfrastructure Institutions, Information Technology/Computer Sciences, NSF EarthCube Investigators, Science Communities, EarthCube End-User Workshop Organizers, Professional Societies) to serve as a preliminary venue for identifying, evaluating, and testing potential governance models. To offer opportunity for broader end-user input, a crowd-sourcing approach will engage stakeholders not otherwise involved. An Advisory Committee from the Earth, ocean, atmosphere, social, computer, and library sciences is guiding the process from a high-level policy point of view. Developmental evaluators from the social sciences embedded in the project provide real-time review and adjustments. While a large number of agencies and organizations have agreed to participate, community-selected leaders, yet to be identified, will play key roles through an Assembly Advisory Council in order to ensure an open and inclusive process. Once consensus is reached on a governing framework, a community-selected demonstration governance pilot will help facilitate community convergence on system design.
Dynamical Approach Study of Spurious Numerics in Nonlinear Computations
NASA Technical Reports Server (NTRS)
Yee, H. C.; Mansour, Nagi (Technical Monitor)
2002-01-01
The last two decades have been an era when computation is ahead of analysis and when very large-scale practical computations are increasingly used in poorly understood multiscale complex nonlinear physical problems and non-traditional fields. Ensuring a higher level of confidence in the predictability and reliability (PAR) of these numerical simulations could play a major role in furthering the design, understanding, affordability, and safety of our next-generation air and space transportation systems and systems for planetary and atmospheric sciences, and in understanding the evolution and origin of life. The need to guarantee PAR becomes acute when computations offer the only way of solving these types of data-limited problems. Employing theory from nonlinear dynamical systems, some building blocks to ensure a higher level of confidence in the PAR of numerical simulations have been revealed by the author and world-expert collaborators in relevant fields. Five building blocks with supporting numerical examples are discussed. The next step is to utilize knowledge gained by including nonlinear dynamics, bifurcation, and chaos theories as an integral part of the numerical process. The third step is to design integrated criteria for reliable and accurate algorithms that cater to the different multiscale nonlinear physics. This includes, but is not limited to, the construction of appropriate adaptive spatial and temporal discretizations that are suitable for the underlying governing equations. In addition, a multiresolution wavelets approach for adaptive numerical dissipation/filter controls for high-speed turbulence, acoustics, and combustion simulations will be sought. These steps are cornerstones for guarding against spurious numerical solutions that are solutions of the discretized counterparts but are not solutions of the underlying governing equations.
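The canonical textbook illustration of such spurious numerics (an assumption of this note, not an example taken from the abstract) is explicit Euler applied to the logistic equation u' = u(1 - u): for small time steps the discrete map converges to the true equilibrium u = 1, while larger steps produce period-doubled and chaotic iterates that solve the discretized equations but not the ODE.

```python
# Spurious dynamics from explicit Euler on u' = u*(1 - u).
# The ODE has a single stable equilibrium u = 1 for u0 > 0, but the
# discrete map u <- u + dt*u*(1 - u) bifurcates as dt grows past 2.

def euler_tail(dt, u0=0.1, n_transient=2000, keep=8):
    """Iterate the Euler map, discard transients, return the last iterates."""
    u = u0
    for _ in range(n_transient):
        u += dt * u * (1.0 - u)
    tail = []
    for _ in range(keep):
        u += dt * u * (1.0 - u)
        tail.append(round(u, 4))
    return tail

# dt=0.5: converges to 1; dt=2.2: period-2 cycle; dt=2.5: period-4; dt=2.9: chaos
for dt in (0.5, 2.2, 2.5, 2.9):
    print(dt, euler_tail(dt))
```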
DOE Office of Scientific and Technical Information (OSTI.GOV)
Radman, Ali M
The Division of Natural Sciences and Mathematics is housed in the Wilson-Booker Science Building (WBSB), which previously consisted of six classrooms, a lecture room, three biology laboratories, one physics laboratory, one chemistry laboratory, one research laboratory, and three computer laboratories. However, due to rapid expansion in STEM majors, there was a dire need for more classroom and laboratory space to accommodate this expansion. Further, since the College started integrating research into the curriculum in 2004 in order to keep pace with the national trend in science education, it has become apparent that one small research laboratory that accommodates 10 students will not keep pace with the growing needs of new students interested in research. Therefore, it became imperative to add another research laboratory to augment the existing one. The new instrumentation/research laboratory will provide space for the new equipment and research space for an additional 8-10 students. In addition, the new WBSB wing also houses a biochemistry/molecular biology laboratory, an organic chemistry laboratory, an animal laboratory, a seminar room, two spacious classrooms, and three faculty offices. The impact of the new facility will be far-reaching.
The value and use of social media as communication tool in the plant sciences.
Osterrieder, Anne
2013-07-11
Social media now complements many parts of our lives. Facebook, Twitter, YouTube and many other social networking sites allow users to share and interact with online content and to connect with like-minded people. Its strengths - rapid dissemination and amplification of content and the ability to lead informal conversations - make it a powerful tool to use in a professional context. This commentary explains the overall concept of social media and offers suggestions on usage and possible types of scientific content. It advises researchers on the potential benefits and how to take a strategic approach towards building a social media presence. It also presents examples of effective social media use within the plant science community. Common reasons for scientists to not engage with social media include the fear of appearing unprofessional, posting something wrong or being misunderstood, or a lack of confidence in their computer skills. With the rapid changes in academic publishing, dissemination and science communication, as well as the rise of 'altmetrics' to track online engagement with scientific content, digital literacy will become an essential skill in a scientist's tool kit.
Search and rescue in collapsed structures: engineering and social science aspects.
El-Tawil, Sherif; Aguirre, Benigno
2010-10-01
This paper discusses the social science and engineering dimensions of search and rescue (SAR) in collapsed buildings. First, existing information is presented on factors that influence the behaviour of trapped victims, particularly human, physical, socioeconomic and circumstantial factors. Trapped victims are most often discussed in the context of structural collapse and injuries sustained. Most studies in this area focus on earthquakes as the type of disaster that produces the most extensive structural damage. Second, information is set out on the engineering aspects of urban search and rescue (USAR) in the United States, including the role of structural engineers in USAR operations, training and certification of structural specialists, and safety and general procedures. The use of computational simulation to link the engineering and social science aspects of USAR is discussed. This could supplement training of local SAR groups and USAR teams, allowing them to understand better the collapse process and how voids form in a rubble pile. A preliminary simulation tool developed for this purpose is described. © 2010 The Author(s). Journal compilation © Overseas Development Institute, 2010.
NASA Astrophysics Data System (ADS)
Schiller, Q.; Li, X.; Palo, S. E.; Blum, L. W.; Gerhardt, D.
2015-12-01
The Colorado Student Space Weather Experiment (CSSWE) is a spacecraft mission developed and operated by students at the University of Colorado, Boulder. The 3U CubeSat was launched from Vandenberg Air Force Base in September 2012. The massively successful mission far outlived its estimated 4-month lifetime and stopped transmitting data in December 2014, after more than two years in orbit. CSSWE has contributed to 15 scientific or engineering peer-reviewed journal publications. During the course of the project, over 65 undergraduate and graduate students from CU's Computer Science, Aerospace, and Mechanical Engineering Departments, as well as the Astrophysical and Planetary Sciences Department, participated. The students were responsible for design, development, build, integration, testing, and operations from the component to the system level. The variety of backgrounds on this unique project gave the students valuable experience in their own focus areas as well as cross-discipline and system-level involvement. Though the perseverance of the students brought the mission to fruition, it was only possible through the mentoring and support of professionals in the Aerospace Engineering Sciences Department and CU's Laboratory for Atmospheric and Space Physics.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Diamond, Rick; Moezzi, Mithra
Within the energy research community, the social sciences tend to be viewed fairly narrowly, often as simply a marketing tool to change the behavior of consumers and decision makers and to 'attack market barriers'. As the authors see it, the social sciences, which draw on sociology, psychology, political science, business administration, and other academic disciplines, are capable of far more. A social science perspective can re-align questions in ways that lead to the development of technologies and technology policy that are much stronger and potentially more successful than they would be otherwise. In most energy policies governing commercial buildings, the prevailing R&D directives are firmly rooted in a technology framework, one that is generally more quantitative and evaluative than that fostered by the social sciences. To illustrate how social science thinking would approach the goal of achieving high energy performance in the commercial building sector, the authors take the US Department of Energy's Roadmap for commercial buildings (DOE 2000) as a starting point. By 'deconstructing' the four strategies provided by the Roadmap, they set the stage for proposing a closer partnership between advocates of technology-based and social science-based approaches.
Girls in computer science: A female only introduction class in high school
NASA Astrophysics Data System (ADS)
Drobnis, Ann W.
This study examined the impact of an all-girls classroom environment in a high school introductory computer science class on students' attitudes towards computer science and their thoughts on future involvement with computer science, asking whether an all-girls introductory class could counter declining female enrollment and strengthen female students' efficacy towards computer science. The research was conducted in a summer school program through a regional magnet school for science and technology that these students attend during the school year. Three groups of students were examined: female students in an all-girls class, female students in mixed-gender classes, and male students in mixed-gender classes. A survey, Attitudes about Computers and Computer Science (ACCS), was designed to capture the students' thoughts, preconceptions, attitudes, knowledge of computer science, and future intentions around computer science, both in education and in careers. Students in all three groups completed the ACCS before and after the class. In addition, students in the all-girls class kept journals throughout the course, and some of those students were also interviewed upon its completion. The data were analyzed using quantitative and qualitative techniques. While no major differences were found in the quantitative data, the girls in the all-girls class were truly excited by what they had learned and were more open to the idea of computer science being a part of their future.
Teaching Building Science with Simulations
ERIC Educational Resources Information Center
Hatherly, Amanda
2017-01-01
Teaching building science to community college students can be challenging, given both the macro level (how houses respond to changing seasons) and the micro level (heat transfer, moisture movement) of the topics taught. Simulations and games can provide a way of learning material that students can otherwise find difficult to understand. In this…
Building STEAM in Design-Related Technology
ERIC Educational Resources Information Center
Maldonado, Elaine; Pearson, Karen R.
2013-01-01
TECH-FIT is a National Science Foundation initiative at FIT, part of the State University of New York. At an institution where over 85% of the students are female, this interdisciplinary, design-related STEM (Science, Technology, Engineering, and Mathematics) project sought to increase inclusion and student performance in STEM. Building on new and existing…
Tour Brookhaven Lab's Future Hub for Energy Research: The Interdisciplinary Science Building
Stokes, Gerry; Misewich, Jim; Caradonna, Peggy; Sullivan, John; Olsen, Jim
2018-04-16
Construction is under way for the Interdisciplinary Science Building (ISB), a future world-class facility for energy research at Brookhaven Lab. Meet two scientists who will develop solutions at the ISB to tackle some of the nation's energy challenges, and tour the construction site.
ERIC Educational Resources Information Center
Schroth, Stephen T.; Helfer, Jason A.
2017-01-01
Environmental studies provide an ideal opportunity for gifted children of any age to develop critical- and creative-thinking skills while also building skills in science, technology, engineering, and mathematics (STEM) areas. Exploring issues related to sustainability and environmental concerns permits gifted learners to identify problems, develop…
ERIC Educational Resources Information Center
Peterson, Amelia
2016-01-01
As a systemic approach to improving educational practice through research, "What Works" has come under repeated challenge from alternative approaches, most recently that of improvement science. While "What Works" remains a dominant paradigm for centralized knowledge-building efforts, there is a need to understand why this…