Sample records for DOE computational science

  1. Alliance for Computational Science Collaboration HBCU Partnership at Fisk University. Final Report 2001

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collins, W. E.

    2004-08-16

    Computational science plays a major role in research and development in mathematics, science, engineering, and biomedical disciplines. The Alliance for Computational Science Collaboration (ACSC) has the goal of training African-American and other minority scientists in the computational science field for eventual employment with the Department of Energy (DOE). The involvement of Historically Black Colleges and Universities (HBCUs) in the Alliance provides avenues for producing future DOE African-American scientists. Fisk University has been participating in this program through grants from the DOE, which supported computational science activities at the university. The research areas included energy-related projects, distributed computing, visualization of scientific systems, and biomedical computing. Student involvement in computational science research included undergraduate summer research at Oak Ridge National Lab, on-campus research involving undergraduates, participation of undergraduates and faculty members in workshops, and mentoring of students. These activities enhanced research and education in computational science, thereby adding to Fisk University's spectrum of research and educational capabilities. Among the successes of the computational science activities was the acceptance of three undergraduate students to graduate schools with full scholarships beginning fall 2002 (one for a master's degree program and two for doctoral degree programs).

  2. DOE Advanced Scientific Computing Advisory Committee (ASCAC): Workforce Subcommittee Letter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chapman, Barbara; Calandra, Henri; Crivelli, Silvia

    2014-07-23

    Simulation and computing are essential to much of the research conducted at the DOE national laboratories. Experts in the ASCR-relevant Computing Sciences, which encompass a range of disciplines including Computer Science, Applied Mathematics, Statistics, and the domain Computational Sciences, are an essential element of the workforce in nearly all of the DOE national laboratories. This report seeks to identify the gaps and challenges facing DOE with respect to this workforce. This letter is ASCAC’s response to the charge of February 19, 2014 to identify disciplines in which significantly greater emphasis in workforce training at the graduate or postdoctoral levels is necessary to address workforce gaps in current and future Office of Science mission needs.

  3. Computational Science and Innovation

    NASA Astrophysics Data System (ADS)

    Dean, D. J.

    2011-09-01

    Simulations - utilizing computers to solve complicated science and engineering problems - are a key ingredient of modern science. The U.S. Department of Energy (DOE) is a world leader in the development of high-performance computing (HPC), the development of applied math and algorithms that utilize the full potential of HPC platforms, and the application of computing to science and engineering problems. An interesting general question is whether the DOE can strategically utilize its capability in simulations to advance innovation more broadly. In this article, I will argue that this is certainly possible.

  4. Scientific Computing Strategic Plan for the Idaho National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whiting, Eric Todd

    Scientific computing is a critical foundation of modern science. Without innovations in the field of computational science, the essential missions of the Department of Energy (DOE) would go unrealized. Taking a leadership role in such innovations is Idaho National Laboratory’s (INL’s) challenge and charge, and is central to INL’s ongoing success. Computing is an essential part of INL’s future. DOE science and technology missions rely firmly on computing capabilities in various forms. Modeling and simulation, fueled by innovations in computational science and validated through experiment, are a critical foundation of science and engineering. Big data analytics from an increasing number of widely varied sources is opening new windows of insight and discovery. Computing is a critical tool in education, science, engineering, and experiments. Advanced computing capabilities in the form of people, tools, computers, and facilities will position INL competitively to deliver results and solutions on important national science and engineering challenges. A computing strategy must include much more than simply computers. The foundational enabling component of computing at many DOE national laboratories is the combination of a showcase-like data center facility coupled with a very capable supercomputer. In addition, network connectivity, disk storage systems, and visualization hardware are critical and generally tightly coupled to the computer system and co-located in the same facility. The existence of these resources in a single data center facility opens the doors to many opportunities that would not otherwise be possible.

  5. Alliance for Computational Science Collaboration: HBCU Partnership at Alabama A&M University Continuing High Performance Computing Research and Education at AAMU

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qian, Xiaoqing; Deng, Z. T.

    2009-11-10

    This is the final report for the Department of Energy (DOE) project DE-FG02-06ER25746, entitled "Continuing High Performance Computing Research and Education at AAMU". This three-year project started on August 15, 2006, and ended on August 14, 2009. The objective of this project was to enhance high performance computing research and education capabilities at Alabama A&M University (AAMU), and to train African-American and other minority students and scientists in the computational science field for eventual employment with DOE. AAMU has successfully completed all the proposed research and educational tasks. Through the support of DOE, AAMU was able to provide opportunities to minority students through summer internships and the DOE computational science scholarship program. In the past three years, AAMU (1) supported three graduate research assistants in image processing for a hypersonic shockwave control experiment and in computational science related areas; (2) recruited and provided full financial support for six AAMU undergraduate summer research interns to participate in the Research Alliance in Math and Science (RAMS) program at Oak Ridge National Lab (ORNL); (3) awarded 30 highly competitive DOE High Performance Computing Scholarships ($1,500 each) to qualified top AAMU undergraduate students in science and engineering majors; (4) improved the high performance computing laboratory at AAMU with the addition of three high performance Linux workstations; and (5) conducted image analysis for an electromagnetic shockwave control experiment and computation of shockwave interactions to verify the design and operation of the AAMU supersonic wind tunnel. The high performance computing research and education activities at AAMU had a great impact on minority students.
    As praised by the Accreditation Board for Engineering and Technology (ABET) in 2009, "The work on high performance computing that is funded by the Department of Energy provides scholarships to undergraduate students as computational science scholars. This is a wonderful opportunity to recruit under-represented students." Three ASEE papers were published in the 2007, 2008, and 2009 proceedings of the ASEE Annual Conferences, respectively, and presentations of these papers were made at those conferences. It is critical to continue these research and education activities.

  6. ASCR Workshop on Quantum Computing for Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aspuru-Guzik, Alan; Van Dam, Wim; Farhi, Edward

    This report details the findings of the DOE ASCR Workshop on Quantum Computing for Science that was organized to assess the viability of quantum computing technologies to meet the computational requirements of the DOE’s science and energy mission, and to identify the potential impact of quantum technologies. The workshop was held on February 17-18, 2015, in Bethesda, MD, to solicit input from members of the quantum computing community. The workshop considered models of quantum computation and programming environments, physical science applications relevant to DOE's science mission as well as quantum simulation, and applied mathematics topics including potential quantum algorithms for linear algebra, graph theory, and machine learning. This report summarizes these perspectives into an outlook on the opportunities for quantum computing to impact problems relevant to the DOE’s mission as well as the additional research required to bring quantum computing to the point where it can have such impact.

  7. What Does CALL Have to Offer Computer Science and What Does Computer Science Have to Offer CALL?

    ERIC Educational Resources Information Center

    Cushion, Steve

    2006-01-01

    We will argue that CALL can usefully be viewed as a subset of computer software engineering and can profit from adopting some of the recent progress in software development theory. The unified modelling language has become the industry standard modelling technique and the accompanying unified process is rapidly gaining acceptance. The manner in…

  8. The grand challenge of managing the petascale facility.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aiken, R. J.; Mathematics and Computer Science

    2007-02-28

    This report is the result of a study of networks and how they may need to evolve to support petascale leadership computing and science. As Dr. Ray Orbach, director of the Department of Energy's Office of Science, says in the spring 2006 issue of SciDAC Review, 'One remarkable example of growth in unexpected directions has been in high-end computation'. In the same article Dr. Michael Strayer states, 'Moore's law suggests that before the end of the next cycle of SciDAC, we shall see petaflop computers'. Given the Office of Science's strong leadership and support for petascale computing and facilities, we should expect to see petaflop computers in operation in support of science before the end of the decade, and DOE/SC Advanced Scientific Computing Research programs are focused on making this a reality. This study took its lead from this strong focus on petascale computing and the networks required to support such facilities, but it grew to include almost all aspects of the DOE/SC petascale computational and experimental science facilities, all of which will face daunting challenges in managing and analyzing the voluminous amounts of data expected. In addition, trends indicate the increased coupling of unique experimental facilities with computational facilities, along with the integration of multidisciplinary datasets and high-end computing with data-intensive computing; and we can expect these trends to continue at the petascale level and beyond. Coupled with recent technology trends, they clearly indicate the need for including capability petascale storage, networks, and experiments, as well as collaboration tools and programming environments, as integral components of the Office of Science's petascale capability metafacility. The objective of this report is to recommend a new cross-cutting program to support the management of petascale science and infrastructure.
The appendices of the report document current and projected DOE computation facilities, science trends, and technology trends, whose combined impact can affect the manageability and stewardship of DOE's petascale facilities. This report is not meant to be all-inclusive. Rather, the facilities, science projects, and research topics presented are to be considered examples to clarify a point.

  9. Crosscut report: Exascale Requirements Reviews, March 9–10, 2017 – Tysons Corner, Virginia. An Office of Science review sponsored by: Advanced Scientific Computing Research, Basic Energy Sciences, Biological and Environmental Research, Fusion Energy Sciences, High Energy Physics, Nuclear Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerber, Richard; Hack, James; Riley, Katherine

    The mission of the U.S. Department of Energy Office of Science (DOE SC) is the delivery of scientific discoveries and major scientific tools to transform our understanding of nature and to advance the energy, economic, and national security missions of the United States. To achieve these goals in today’s world requires investments not only in the traditional scientific endeavors of theory and experiment, but also in computational science and the facilities that support large-scale simulation and data analysis. The Advanced Scientific Computing Research (ASCR) program addresses these challenges in the Office of Science. ASCR’s mission is to discover, develop, and deploy computational and networking capabilities to analyze, model, simulate, and predict complex phenomena important to DOE. ASCR supports research in computational science, three high-performance computing (HPC) facilities — the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory and Leadership Computing Facilities at Argonne (ALCF) and Oak Ridge (OLCF) National Laboratories — and the Energy Sciences Network (ESnet) at Berkeley Lab. ASCR is guided by science needs as it develops research programs, computers, and networks at the leading edge of technologies. As we approach the era of exascale computing, technology changes are creating challenges for science programs in SC for those who need to use high performance computing and data systems effectively. Numerous significant modifications to today’s tools and techniques will be needed to realize the full potential of emerging computing systems and other novel computing architectures. To assess these needs and challenges, ASCR held a series of Exascale Requirements Reviews in 2015–2017, one with each of the six SC program offices, and a subsequent Crosscut Review that sought to integrate the findings from each.
Participants at the reviews were drawn from the communities of leading domain scientists, experts in computer science and applied mathematics, ASCR facility staff, and DOE program managers in ASCR and the respective program offices. The purpose of these reviews was to identify mission-critical scientific problems within the DOE Office of Science (including experimental facilities) and determine the requirements for the exascale ecosystem that would be needed to address those challenges. The exascale ecosystem includes exascale computing systems, high-end data capabilities, efficient software at scale, libraries, tools, and other capabilities. This effort will contribute to the development of a strategic roadmap for ASCR compute and data facility investments and will help the ASCR Facility Division establish partnerships with Office of Science stakeholders. It will also inform the Office of Science research needs and agenda. The results of the six reviews have been published in reports available on the web at http://exascaleage.org/. This report presents a summary of the individual reports and of common and crosscutting findings, and it identifies opportunities for productive collaborations among the DOE SC program offices.

  10. ASCR Cybersecurity for Scientific Computing Integrity - Research Pathways and Ideas Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peisert, Sean; Potok, Thomas E.; Jones, Todd

    At the request of the U.S. Department of Energy's (DOE) Office of Science (SC) Advanced Scientific Computing Research (ASCR) program office, a workshop was held June 2-3, 2015, in Gaithersburg, MD, to identify potential long-term (10 to 20+ year) fundamental cybersecurity research and development challenges, strategies, and roadmaps facing future high performance computing (HPC), networks, data centers, and extreme-scale scientific user facilities. This workshop was a follow-on to the workshop held January 7-9, 2015, in Rockville, MD, that examined higher-level ideas about scientific computing integrity specific to the mission of the DOE Office of Science. Issues included research computation and simulation that take place on ASCR computing facilities and networks, as well as network-connected scientific instruments, such as those run by various DOE Office of Science programs. Workshop participants included researchers and operational staff from DOE national laboratories, as well as academic researchers and industry experts. Participants were selected based on the submission of abstracts relating to the topics discussed in the previous workshop report [1] and also from other ASCR reports, including "Abstract Machine Models and Proxy Architectures for Exascale Computing" [27], the DOE "Preliminary Conceptual Design for an Exascale Computing Initiative" [28], and the January 2015 machine learning workshop [29]. The workshop was also attended by several observers from DOE and other government agencies. The workshop was divided into three topic areas: (1) Trustworthy Supercomputing, (2) Extreme-Scale Data, Knowledge, and Analytics for Understanding and Improving Cybersecurity, and (3) Trust within High-end Networking and Data Centers. Participants were divided into three corresponding teams based on the category of their abstracts.
The workshop began with a series of talks from the program manager and workshop chair, followed by the leaders for each of the three topics and a representative of each of the four major DOE Office of Science Advanced Scientific Computing Research Facilities: the Argonne Leadership Computing Facility (ALCF), the Energy Sciences Network (ESnet), the National Energy Research Scientific Computing Center (NERSC), and the Oak Ridge Leadership Computing Facility (OLCF). The rest of the workshop consisted of topical breakout discussions and focused writing periods that produced much of this report.

  11. Alliance for Computational Science Collaboration, HBCU Partnership at Alabama A&M University Final Performance Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deng, Z.T.

    2001-11-15

    The objective of this project was to conduct high-performance computing research and teaching at AAMU, and to train African-American and other minority students and scientists in the computational science field for eventual employment with DOE. During the project period, eight tasks were accomplished. Student research assistantships, work-study positions, summer internships, and scholarships proved to be among the best ways to attract top-quality minority students. With the support of DOE, through research, summer internships, collaborations, and scholarship programs, AAMU successfully provided research and educational opportunities to minority students in fields related to computational science.

  12. Classrooms Matter: The Design of Virtual Classrooms Influences Gender Disparities in Computer Science Classes

    ERIC Educational Resources Information Center

    Cheryan, Sapna; Meltzoff, Andrew N.; Kim, Saenam

    2011-01-01

    Three experiments examined whether the design of virtual learning environments influences undergraduates' enrollment intentions and anticipated success in introductory computer science courses. Changing the design of a virtual classroom--from one that conveys current computer science stereotypes to one that does not--significantly increased…

  13. Scalable Analysis Methods and In Situ Infrastructure for Extreme Scale Knowledge Discovery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bethel, Wes

    2016-07-24

    The primary challenge motivating this team’s work is the widening gap between the ability to compute information and to store it for subsequent analysis. This gap adversely impacts science code teams, who are able to perform analysis on only a small fraction of the data they compute, resulting in the very real likelihood of lost or missed science when results are computed but not analyzed. Our approach is to perform as much analysis or visualization processing as possible on data while it is still resident in memory, an approach known as in situ processing. The idea of in situ processing was not new at the start of this effort in 2014, but efforts in that space were largely ad hoc, and there was no concerted effort within the research community aimed at fostering production-quality software tools suitable for use by DOE science projects. Broadly, our objective was to produce and enable the use of production-quality in situ methods and infrastructure, at scale, on DOE HPC facilities, though we expected to have impact beyond DOE due to the widespread nature of the challenges, which affect virtually all large-scale computational science efforts. To achieve that objective, we assembled a unique team of researchers consisting of representatives from DOE national laboratories, academia, and industry, engaged in software technology R&D, and formed close partnerships with DOE science code teams to produce software technologies that were shown to run effectively at scale on DOE HPC platforms.
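
    The core idea in this abstract, reducing data in memory during the run rather than writing full snapshots for later analysis, can be illustrated with a minimal sketch; all function names here are hypothetical placeholders, not the project's actual software:

```python
# Minimal in situ analysis sketch: analyze each field while it is still
# resident in memory, storing only a small reduction instead of the full data.
# Illustrative only; not any production in situ API.

def simulate_step(step, n=1000):
    """Stand-in for one simulation timestep producing a full field in memory."""
    return [((i * 31 + step) % 97) / 97.0 for i in range(n)]

def in_situ_analysis(field):
    """Reduce the full field to a few statistics before it is discarded."""
    return {"min": min(field), "max": max(field),
            "mean": sum(field) / len(field)}

results = []
for step in range(10):
    field = simulate_step(step)              # full data exists only in memory
    results.append(in_situ_analysis(field))  # persist the tiny reduction only

print(len(results), sorted(results[0]))
```

    The design point is that only `results` (a few numbers per step) would ever touch storage, sidestepping the compute/storage gap the abstract describes.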

  14. Final Report Scalable Analysis Methods and In Situ Infrastructure for Extreme Scale Knowledge Discovery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Leary, Patrick

    The primary challenge motivating this project is the widening gap between the ability to compute information and to store it for subsequent analysis. This gap adversely impacts science code teams, who can perform analysis on only a small fraction of the data they calculate, resulting in the substantial likelihood of lost or missed science when results are computed but not analyzed. Our approach is to perform as much analysis or visualization processing as possible on data while it is still resident in memory, which is known as in situ processing. The idea of in situ processing was not new at the start of this effort in 2014, but efforts in that space were largely ad hoc, and there was no concerted effort within the research community aimed at fostering production-quality software tools suitable for use by Department of Energy (DOE) science projects. Our objective was to produce and enable the use of production-quality in situ methods and infrastructure, at scale, on DOE high-performance computing (HPC) facilities, though we expected to have an impact beyond DOE due to the widespread nature of the challenges, which affect virtually all large-scale computational science efforts. To achieve this objective, we engaged in software technology research and development (R&D), in close partnership with DOE science code teams, to produce software technologies that were shown to run efficiently at scale on DOE HPC platforms.

  15. Does it matter what we call it?

    USDA-ARS?s Scientific Manuscript database

    Agronomy, soil science, plant science, crop science, agricultural science, computer science, environmental science, environmental engineering, agricultural and irrigation engineering, hydrology, meteorology – all are names that describe fields of study relevant to agriculture and the environment in ...

  16. First principles calculations of thermal conductivity with out of equilibrium molecular dynamics simulations

    NASA Astrophysics Data System (ADS)

    Puligheddu, Marcello; Gygi, Francois; Galli, Giulia

    The prediction of the thermal properties of solids and liquids is central to numerous problems in condensed matter physics and materials science, including the study of thermal management of opto-electronic and energy conversion devices. We present a method to compute the thermal conductivity of solids by performing ab initio molecular dynamics under non-equilibrium conditions. Our formulation is based on a generalization of the approach-to-equilibrium technique, using sinusoidal temperature gradients, and it only requires calculations of first principles trajectories and atomic forces. We discuss results and computational requirements for a representative, simple oxide, MgO, and compare with experiments and data obtained with classical potentials. This work was supported by MICCoM as part of the Computational Materials Science Program funded by the U.S. Department of Energy (DOE), Office of Science, Basic Energy Sciences (BES), Materials Sciences and Engineering Division under Grant DOE/BES 5J-30.
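
    The sinusoidal approach-to-equilibrium idea can be sketched numerically: a temperature perturbation with wavevector k decays as exp(-alpha k^2 t), where alpha = kappa/(rho c) is the thermal diffusivity, so fitting the amplitude decay yields the conductivity kappa. The numbers below are synthetic placeholders, not the paper's MgO or ab initio data:

```python
import math

# Generate a synthetic amplitude decay A(t) = A0 * exp(-alpha * k^2 * t)
# and recover kappa from a log-linear fit, mimicking the analysis step.
rho_c = 3.3e6          # assumed volumetric heat capacity rho*c [J/(m^3 K)]
kappa_true = 50.0      # conductivity [W/(m K)] used to generate the data
alpha = kappa_true / rho_c
L = 2.0e-9             # assumed supercell length [m]; sets k = 2*pi/L
k = 2.0 * math.pi / L

times = [i * 1.0e-15 for i in range(20)]   # sample times [s]
amps = [math.exp(-alpha * k * k * t) for t in times]

# Least-squares slope of ln(A) vs t gives -alpha*k^2, hence kappa.
xs, ys = times, [math.log(a) for a in amps]
n = len(xs)
slope = (n * sum(x * y for x, y in zip(xs, ys)) - sum(xs) * sum(ys)) / (
    n * sum(x * x for x in xs) - sum(xs) ** 2)
kappa_fit = -slope * rho_c / (k * k)
print(kappa_fit)
```

    In the actual method the amplitude history would come from the ab initio trajectory rather than a closed-form exponential, but the extraction of kappa follows the same decay-rate logic.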

  17. DOE pushes for useful quantum computing

    NASA Astrophysics Data System (ADS)

    Cho, Adrian

    2018-01-01

    The U.S. Department of Energy (DOE) is joining the quest to develop quantum computers, devices that would exploit quantum mechanics to crack problems that overwhelm conventional computers. The initiative comes as Google and other companies race to build a quantum computer that can demonstrate "quantum supremacy" by beating classical computers on a test problem. But reaching that milestone will not mean practical uses are at hand, and the new $40 million DOE effort is intended to spur the development of useful quantum computing algorithms for its work in chemistry, materials science, nuclear physics, and particle physics. With the resources at its 17 national laboratories, DOE could play a key role in developing the machines, researchers say, although finding problems with which quantum computers can help isn't so easy.

  18. PARTNERING WITH DOE TO APPLY ADVANCED BIOLOGICAL, ENVIRONMENTAL, AND COMPUTATIONAL SCIENCE TO ENVIRONMENTAL ISSUES

    EPA Science Inventory

    On February 18, 2004, the U.S. Environmental Protection Agency and Department of Energy signed a Memorandum of Understanding to expand the research collaboration of both agencies to advance biological, environmental, and computational sciences for protecting human health and the ...

  19. Template Interfaces for Agile Parallel Data-Intensive Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramakrishnan, Lavanya; Gunter, Daniel; Pastorello, Gilberto Z.

    Tigres provides a programming library to compose and execute large-scale data-intensive scientific workflows from desktops to supercomputers. DOE User Facilities and large science collaborations are increasingly generating large enough data sets that it is no longer practical to download them to a desktop to operate on them. They are instead stored at centralized compute and storage resources such as high performance computing (HPC) centers. Analysis of this data requires an ability to run on these facilities, but with current technologies, scaling an analysis to an HPC center and to a large data set is difficult even for experts. Tigres is addressing the challenge of enabling collaborative analysis of DOE Science data through a new concept of reusable "templates" that enable scientists to easily compose, run and manage collaborative computational tasks. These templates define common computation patterns used in analyzing a data set.
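
    The "template" concept of reusable computation patterns into which scientists plug their own tasks can be sketched generically; this is an illustration of the idea only, not the actual Tigres API:

```python
# Sketch of template-style workflow composition: reusable "sequence" and
# "parallel" patterns parameterized by user tasks. Hypothetical names only.
from concurrent.futures import ThreadPoolExecutor

def sequence(tasks, data):
    """Template: run tasks one after another, piping each output onward."""
    for task in tasks:
        data = task(data)
    return data

def parallel(task, inputs):
    """Template: apply one task to many inputs concurrently (fan-out)."""
    with ThreadPoolExecutor() as pool:
        return list(pool.map(task, inputs))  # map preserves input order

# Hypothetical analysis tasks plugged into the templates.
clean = lambda xs: [x for x in xs if x is not None]
total = lambda xs: sum(xs)

# Fan out over data chunks; within each chunk, run a cleaning->sum pipeline.
partials = parallel(lambda chunk: sequence([clean, total], chunk),
                    [[1, 2, None], [3, None, 4], [5]])
print(partials, sum(partials))
```

    The appeal of the pattern is that swapping the executor (threads, batch jobs, an HPC scheduler) changes where the work runs without changing how the workflow is composed.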

  20. Argonne Chemical Sciences & Engineering - Awards Home

    Science.gov Websites


  1. Science-Driven Computing: NERSC's Plan for 2006-2010

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simon, Horst D.; Kramer, William T.C.; Bailey, David H.

    NERSC has developed a five-year strategic plan focusing on three components: Science-Driven Systems, Science-Driven Services, and Science-Driven Analytics. (1) Science-Driven Systems: Balanced introduction of the best new technologies for complete computational systems--computing, storage, networking, visualization and analysis--coupled with the activities necessary to engage vendors in addressing the DOE computational science requirements in their future roadmaps. (2) Science-Driven Services: The entire range of support activities, from high-quality operations and user services to direct scientific support, that enable a broad range of scientists to effectively use NERSC systems in their research. NERSC will concentrate on resources needed to realize the promise of the new highly scalable architectures for scientific discovery in multidisciplinary computational science projects. (3) Science-Driven Analytics: The architectural and systems enhancements and services required to integrate NERSC's powerful computational and storage resources to provide scientists with new tools to effectively manipulate, visualize, and analyze the huge data sets derived from simulations and experiments.

  2. Advanced Artificial Science. The development of an artificial science and engineering research infrastructure to facilitate innovative computational modeling, analysis, and application to interdisciplinary areas of scientific investigation.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saffer, Shelley

    2014-12-01

    This is a final report of the DOE award DE-SC0001132, Advanced Artificial Science. The development of an artificial science and engineering research infrastructure to facilitate innovative computational modeling, analysis, and application to interdisciplinary areas of scientific investigation. This document describes the achievements of the goals, and resulting research made possible by this award.

  3. 2009 ALCF annual report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beckman, P.; Martin, D.; Drugan, C.

    2010-11-23

    This year the Argonne Leadership Computing Facility (ALCF) delivered nearly 900 million core-hours of science. The research conducted at this leadership-class facility touched our lives in both minute and massive ways - whether it was studying the catalytic properties of gold nanoparticles, predicting protein structures, or unearthing the secrets of exploding stars. The ALCF remained true to its vision to act as the forefront computational center in extending science frontiers by solving pressing problems for the nation. Its success in this endeavor was due mainly to the Department of Energy's (DOE) INCITE (Innovative and Novel Computational Impact on Theory and Experiment) program. The program awards significant amounts of computing time to computationally intensive, unclassified research projects that can make high-impact scientific advances. This year, DOE allocated 400 million hours of time to 28 research projects at the ALCF. Scientists from around the world conducted the research, representing such esteemed institutions as the Princeton Plasma Physics Laboratory, the National Institute of Standards and Technology, and the European Center for Research and Advanced Training in Scientific Computation. Argonne also provided Director's Discretionary allocations for research challenges, addressing such issues as reducing aerodynamic noise, critical for next-generation 'green' energy systems. Intrepid - the ALCF's 557-teraflops IBM Blue Gene/P supercomputer - enabled astounding scientific solutions and discoveries. Intrepid went into full production five months ahead of schedule. As a result, the ALCF nearly doubled the days of production computing available to the DOE Office of Science, INCITE awardees, and Argonne projects. One of the fastest supercomputers in the world for open science, the energy-efficient system uses about one-third as much electricity as a machine of comparable size built with more conventional parts.
    In October 2009, President Barack Obama recognized the excellence of the entire Blue Gene series by awarding it the National Medal of Technology and Innovation. Other noteworthy achievements included the ALCF's collaboration with the National Energy Research Scientific Computing Center (NERSC) to examine cloud computing as a potential new computing paradigm for scientists. Named Magellan, the DOE-funded initiative will explore which science application programming models work well within the cloud, as well as evaluate the challenges that come with this new paradigm. The ALCF obtained approval for its next-generation machine, a 10-petaflops system to be delivered in 2012. This system will allow us to resolve ever more pressing problems, even more expeditiously, through breakthrough science in the years to come.

  4. ESnet: Large-Scale Science and Data Management (LBNL Summer Lecture Series)

    ScienceCinema

    Johnston, Bill

    2017-12-09

    Summer Lecture Series 2004: Bill Johnston of Berkeley Lab's Computing Sciences is a distinguished networking and computing researcher. He managed the Energy Sciences Network (ESnet), a leading-edge, high-bandwidth network funded by DOE's Office of Science. Used for everything from videoconferencing to climate modeling, and flexible enough to accommodate a wide variety of data-intensive applications and services, ESnet's traffic volume is doubling every year and currently surpasses 200 terabytes per month.

  5. High School Students Gear Up for Battle of the Brains

    Science.gov Websites

    The tournament focuses on physics, math, biology, astronomy, chemistry, computers, and the earth sciences. DOE began the National Science Bowl 11 years ago to help stimulate interest in science and math.

  6. Laboratory Directed Research and Development Program FY 2006

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hansen

    2007-03-08

    The Ernest Orlando Lawrence Berkeley National Laboratory (Berkeley Lab or LBNL) is a multi-program national research facility operated by the University of California for the Department of Energy (DOE). As an integral element of DOE's National Laboratory System, Berkeley Lab supports DOE's missions in fundamental science, energy resources, and environmental quality. Berkeley Lab programs advance four distinct goals for DOE and the nation: (1) To perform leading multidisciplinary research in the computing sciences, physical sciences, energy sciences, biosciences, and general sciences in a manner that ensures employee and public safety and protection of the environment. (2) To develop and operate unique national experimental facilities for qualified investigators. (3) To educate and train future generations of scientists and engineers to promote national science and education goals. (4) To transfer knowledge and technological innovations and to foster productive relationships among Berkeley Lab's research programs, universities, and industry in order to promote national economic competitiveness.

  7. Principles versus Artifacts in Computer Science Curriculum Design

    ERIC Educational Resources Information Center

    Machanick, Philip

    2003-01-01

    Computer Science is a subject which has difficulty in marketing itself. Further, pinning down a standard curriculum is difficult--there are many preferences which are hard to accommodate. This paper argues the case that part of the problem is the fact that, unlike more established disciplines, the subject does not clearly distinguish the study of…

  8. Turbulent Flow Simulation at the Exascale: Opportunities and Challenges Workshop: August 4-5, 2015, Washington, D.C.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sprague, Michael A.; Boldyrev, Stanislav; Fischer, Paul

    This report details the impact exascale will bring to turbulent-flow simulations in applied science and technology. The need for accurate simulation of turbulent flows is evident across the DOE applied-science and engineering portfolios, including combustion, plasma physics, nuclear-reactor physics, wind energy, and atmospheric science. The workshop brought together experts in turbulent-flow simulation, computational mathematics, and high-performance computing. Building upon previous ASCR workshops on exascale computing, participants defined a research agenda and path forward that will enable scientists and engineers to continually leverage, engage, and direct advances in computational systems on the path to exascale computing.

  9. A Purposeful MOOC to Alleviate Insufficient CS Education in Finnish Schools

    ERIC Educational Resources Information Center

    Kurhila, Jaakko; Vihavainen, Arto

    2015-01-01

    The Finnish national school curriculum, effective from 2004, does not include any topics related to Computer Science (CS). To alleviate the problem that school students are not able to study CS-related topics, the Department of Computer Science at the University of Helsinki prepared a completely online course that is open to pupils and students in…

  10. National Geographic Society Kids Network: Report on 1994 teacher participants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    In 1994, National Geographic Society Kids Network, a computer/telecommunications-based science curriculum, was presented to elementary and middle school teachers through summer programs sponsored by NGS and US DOE. The network program assists teachers in understanding the process of doing science; understanding the role of computers and telecommunications in the study of science, math, and engineering; and utilizing computers and telecommunications appropriately in the classroom. The program enables teachers to integrate science, math, and technology with other subjects with the ultimate goal of encouraging students of all abilities to pursue careers in science/math/engineering. This report assesses the impact of the network program on participating teachers.

  11. Extraordinary Tools for Extraordinary Science: The Impact of SciDAC on Accelerator Science & Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ryne, Robert D.

    2006-08-10

    Particle accelerators are among the most complex and versatile instruments of scientific exploration. They have enabled remarkable scientific discoveries and important technological advances that span all programs within the DOE Office of Science (DOE/SC). The importance of accelerators to the DOE/SC mission is evident from an examination of the DOE document, ''Facilities for the Future of Science: A Twenty-Year Outlook''. Of the 28 facilities listed, 13 involve accelerators. Thanks to SciDAC, a powerful suite of parallel simulation tools has been developed that represent a paradigm shift in computational accelerator science. Simulations that used to take weeks or more now take hours, and simulations that were once thought impossible are now performed routinely. These codes have been applied to many important projects of DOE/SC including existing facilities (the Tevatron complex, the Relativistic Heavy Ion Collider), facilities under construction (the Large Hadron Collider, the Spallation Neutron Source, the Linac Coherent Light Source), and to future facilities (the International Linear Collider, the Rare Isotope Accelerator). The new codes have also been used to explore innovative approaches to charged particle acceleration. These approaches, based on the extremely intense fields that can be present in lasers and plasmas, may one day provide a path to the outermost reaches of the energy frontier. Furthermore, they could lead to compact, high-gradient accelerators that would have huge consequences for US science and technology, industry, and medicine. In this talk I will describe the new accelerator modeling capabilities developed under SciDAC, the essential role of multi-disciplinary collaboration with applied mathematicians, computer scientists, and other IT experts in developing these capabilities, and provide examples of how the codes have been used to support DOE/SC accelerator projects.

  12. Extraordinary tools for extraordinary science: the impact of SciDAC on accelerator science and technology

    NASA Astrophysics Data System (ADS)

    Ryne, Robert D.

    2006-09-01

    Particle accelerators are among the most complex and versatile instruments of scientific exploration. They have enabled remarkable scientific discoveries and important technological advances that span all programs within the DOE Office of Science (DOE/SC). The importance of accelerators to the DOE/SC mission is evident from an examination of the DOE document, ''Facilities for the Future of Science: A Twenty-Year Outlook.'' Of the 28 facilities listed, 13 involve accelerators. Thanks to SciDAC, a powerful suite of parallel simulation tools has been developed that represent a paradigm shift in computational accelerator science. Simulations that used to take weeks or more now take hours, and simulations that were once thought impossible are now performed routinely. These codes have been applied to many important projects of DOE/SC including existing facilities (the Tevatron complex, the Relativistic Heavy Ion Collider), facilities under construction (the Large Hadron Collider, the Spallation Neutron Source, the Linac Coherent Light Source), and to future facilities (the International Linear Collider, the Rare Isotope Accelerator). The new codes have also been used to explore innovative approaches to charged particle acceleration. These approaches, based on the extremely intense fields that can be present in lasers and plasmas, may one day provide a path to the outermost reaches of the energy frontier. Furthermore, they could lead to compact, high-gradient accelerators that would have huge consequences for US science and technology, industry, and medicine. In this talk I will describe the new accelerator modeling capabilities developed under SciDAC, the essential role of multi-disciplinary collaboration with applied mathematicians, computer scientists, and other IT experts in developing these capabilities, and provide examples of how the codes have been used to support DOE/SC accelerator projects.

  13. Journal of Undergraduate Research, Volume VI, 2006

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faletra, P.; Schuetz, A.; Cherkerzian, D.

    Students who conducted research at DOE National Laboratories during 2005 were invited to include their research abstracts, and for a select few, their completed research papers in this Journal. This Journal is direct evidence of students collaborating with their mentors. Fields in which these students worked include: Biology; Chemistry; Computer Science; Engineering; Environmental Science; General Sciences; Materials Sciences; Medical and Health Sciences; Nuclear Sciences; Physics; and Science Policy.

  14. CILogon-HA. Higher Assurance Federated Identities for DOE Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Basney, James

    The CILogon-HA project extended the existing open source CILogon service (initially developed with funding from the National Science Foundation) to provide credentials at multiple levels of assurance to users of DOE facilities for collaborative science. CILogon translates mechanism and policy across higher education and grid trust federations, bridging from the InCommon identity federation (which federates university and DOE lab identities) to the Interoperable Global Trust Federation (which defines standards across the Worldwide LHC Computing Grid, the Open Science Grid, and other cyberinfrastructure). The CILogon-HA project expanded the CILogon service to support over 160 identity providers (including 6 DOE facilities) and 3 internationally accredited certification authorities. To provide continuity of operations upon the end of the CILogon-HA project period, project staff transitioned the CILogon service to operation by XSEDE.

  15. NNSA Administrator Addresses the Next Generation of Nuclear Security Professionals: Part 2

    ScienceCinema

    Thomas D'Agostino

    2017-12-09

    Administrator Thomas D'Agostino of the National Nuclear Security Administration addressed the next generation of nuclear security professionals during the opening session of the 2009 Department of Energy (DOE) Computational Science Graduate Fellowship Annual Conference. Administrator D'Agostino discussed NNSA's role in implementing President Obama's nuclear security agenda and encouraged the computing science fellows to consider careers in nuclear security.

  16. NNSA Administrator Addresses the Next Generation of Nuclear Security Professionals: Part 1

    ScienceCinema

    Thomas D'Agostino

    2017-12-09

    Administrator Thomas D'Agostino of the National Nuclear Security Administration addressed the next generation of nuclear security professionals during the opening session of the 2009 Department of Energy (DOE) Computational Science Graduate Fellowship Annual Conference. Administrator D'Agostino discussed NNSA's role in implementing President Obama's nuclear security agenda and encouraged the computing science fellows to consider careers in nuclear security.

  17. Showing Up Is Half the Battle: Assessing Different Contextualized Learning Tools to Increase the Performance in Introductory Computer Science Courses

    ERIC Educational Resources Information Center

    Rolka, Christine; Remshagen, Anja

    2015-01-01

    Contextualized learning is considered beneficial for student success. In this article, we assess the impact of context-based learning tools on student grade performance in an introductory computer science course. In particular, we investigate two central questions: (1) does the use of context-based learning tools, robots and animations, affect…

  18. Laboratory directed research and development program FY 1999

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hansen, Todd; Levy, Karin

    2000-03-08

    The Ernest Orlando Lawrence Berkeley National Laboratory (Berkeley Lab or LBNL) is a multi-program national research facility operated by the University of California for the Department of Energy (DOE). As an integral element of DOE's National Laboratory System, Berkeley Lab supports DOE's missions in fundamental science, energy resources, and environmental quality. Berkeley Lab programs advance four distinct goals for DOE and the nation: (1) To perform leading multidisciplinary research in the computing sciences, physical sciences, energy sciences, biosciences, and general sciences in a manner that ensures employee and public safety and protection of the environment. (2) To develop and operate unique national experimental facilities for qualified investigators. (3) To educate and train future generations of scientists and engineers to promote national science and education goals. (4) To transfer knowledge and technological innovations and to foster productive relationships among Berkeley Lab's research programs, universities, and industry in order to promote national economic competitiveness. This is the annual report on the Laboratory Directed Research and Development (LDRD) program for FY99.

  19. Final Report for DOE Award ER25756

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kesselman, Carl

    2014-11-17

    The SciDAC-funded Center for Enabling Distributed Petascale Science (CEDPS) was established to address technical challenges that arise due to the frequent geographic distribution of data producers (in particular, supercomputers and scientific instruments) and data consumers (people and computers) within the DOE laboratory system. Its goal is to produce technical innovations that meet DOE end-user needs for (a) rapid and dependable placement of large quantities of data within a distributed high-performance environment, and (b) the convenient construction of scalable science services that provide for the reliable and high-performance processing of computation and data analysis requests from many remote clients. The Center is also addressing (c) the important problem of troubleshooting these and other related ultra-high-performance distributed activities from the perspective of both performance and functionality.

  20. Using Computing and Data Grids for Large-Scale Science and Engineering

    NASA Technical Reports Server (NTRS)

    Johnston, William E.

    2001-01-01

    We use the term "Grid" to refer to a software system that provides uniform and location independent access to geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. These emerging data and computing Grids promise to provide a highly capable and scalable environment for addressing large-scale science problems. We describe the requirements for science Grids, the resulting services and architecture of NASA's Information Power Grid (IPG) and DOE's Science Grid, and some of the scaling issues that have come up in their implementation.

  1. Partnership in Computational Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huray, Paul G.

    1999-02-24

    This is the final report for the "Partnership in Computational Science" (PICS) award in the amount of $500,000 for the period January 1, 1993 through December 31, 1993. A copy of the proposal with its budget is attached as Appendix A. This report first describes the significance of the DOE award in building high-performance computing infrastructure in the Southeast, then describes the work accomplished under this grant and lists the publications resulting from it.

  2. Final Technical Progress Report; Closeout Certifications; CSSV Newsletter Volume I; CSSV Newsletter Volume II; CSSV Activity Journal; CSSV Final Financial Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Houston, Johnny L; Geter, Kerry

    This report covers the Project's third and final year of implementation, 2007-2008, as designated by Elizabeth City State University (ECSU), in cooperation with the National Association of Mathematicians (NAM), Inc., in an effort to promote research and research training programs in computational science - scientific visualization (CSSV). A major goal of the Project was to attract energetic and productive faculty, graduate students, and upper-division undergraduate students of diverse ethnicities to a program that investigates science and computational science issues of long-term interest to the Department of Energy (DOE) and the nation. The breadth and depth of computational science-scientific visualization and the magnitude of resources available are enormous, permitting a variety of research activities. ECSU's Computational Science-Scientific Visualization Center will serve as a conduit for directing users to these enormous resources.

  3. Advanced Scientific Computing Research Network Requirements: ASCR Network Requirements Review Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bacon, Charles; Bell, Greg; Canon, Shane

    The Energy Sciences Network (ESnet) is the primary provider of network connectivity for the U.S. Department of Energy (DOE) Office of Science (SC), the single largest supporter of basic research in the physical sciences in the United States. In support of SC programs, ESnet regularly updates and refreshes its understanding of the networking requirements of the instruments, facilities, scientists, and science programs that it serves. This focus has helped ESnet to be a highly successful enabler of scientific discovery for over 25 years. In October 2012, ESnet and the Office of Advanced Scientific Computing Research (ASCR) of the DOE SC organized a review to characterize the networking requirements of the programs funded by the ASCR program office. The requirements identified at the review are summarized in the Findings section, and are described in more detail in the body of the report.

  4. Computational Science at the Argonne Leadership Computing Facility

    NASA Astrophysics Data System (ADS)

    Romero, Nichols

    2014-03-01

    The goal of the Argonne Leadership Computing Facility (ALCF) is to extend the frontiers of science by solving problems that require innovative approaches and the largest-scale computing systems. ALCF's most powerful computer - Mira, an IBM Blue Gene/Q system - has nearly one million cores. How does one program such systems? What software tools are available? Which scientific and engineering applications are able to utilize such levels of parallelism? This talk will address these questions and describe a sampling of projects that are using ALCF systems in their research, including ones in nanoscience, materials science, and chemistry. Finally, the ways to gain access to ALCF resources will be presented. This research used resources of the Argonne Leadership Computing Facility at Argonne National Laboratory, which is supported by the Office of Science of the U.S. Department of Energy under contract DE-AC02-06CH11357.

  5. Climate Science Performance, Data and Productivity on Titan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mayer, Benjamin W; Worley, Patrick H; Gaddis, Abigail L

    2015-01-01

    Climate Science models are flagship codes for the largest of high performance computing (HPC) resources, both in visibility, with the newly launched Department of Energy (DOE) Accelerated Climate Model for Energy (ACME) effort, and in terms of significant fractions of system usage. The performance of the DOE ACME model is captured with application-level timers and examined through a sizeable run archive. Performance and variability of compute, queue time, and ancillary services are examined. As Climate Science advances in the use of HPC resources, there has been an increase in the human and data systems required to achieve program goals. A description of current workflow processes (hardware, software, human) and planned automation of the workflow, along with historical and projected data-in-motion and data-at-rest usage, are detailed. The combination of these two topics motivates a description of future systems requirements for DOE Climate Modeling efforts, focusing on the growth of data storage and the network and disk bandwidth required to handle data at an acceptable rate.

  6. The future of scientific workflows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deelman, Ewa; Peterka, Tom; Altintas, Ilkay

    Today’s computational, experimental, and observational sciences rely on computations that involve many related tasks. The success of a scientific mission often hinges on the computer automation of these workflows. In April 2015, the US Department of Energy (DOE) invited a diverse group of domain and computer scientists from national laboratories supported by the Office of Science, the National Nuclear Security Administration, from industry, and from academia to review the workflow requirements of DOE’s science and national security missions, to assess the current state of the art in science workflows, to understand the impact of emerging extreme-scale computing systems on those workflows, and to develop requirements for automated workflow management in future and existing environments. This article is a summary of the opinions of over 50 leading researchers attending this workshop. We highlight use cases, computing systems, and workflow needs, and conclude by summarizing the remaining challenges this community sees that inhibit large-scale scientific workflows from becoming a mainstream tool for extreme-scale science.

  7. 78 FR 41046 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-09

    ... Services Administration, notice is hereby given that the Advanced Scientific Computing Advisory Committee will be renewed for a two-year period beginning on July 1, 2013. The Committee will provide advice to the Director, Office of Science (DOE), on the Advanced Scientific Computing Research Program managed...

  8. Radio Interference Modeling and Prediction for Satellite Operation Applications

    DTIC Science & Technology

    2015-08-25

    Department of Electrical Engineering and Computer Science, The Catholic University of America, Washington, DC 20064. Final report, 25 August 2015.

  9. Department of Energy - Office of Science Early Career Research Program

    NASA Astrophysics Data System (ADS)

    Horwitz, James

    The Department of Energy (DOE) Office of Science Early Career Program began in FY 2010. The program objectives are to support the development of individual research programs of outstanding scientists early in their careers and to stimulate research careers in the disciplines supported by the DOE Office of Science. Both university and DOE national laboratory early career scientists are eligible. Applicants must be within 10 years of receiving their PhD. For universities, the PI must be an untenured Assistant Professor or Associate Professor on the tenure track. DOE laboratory applicants must be full-time, non-postdoctoral employees. University awards are at least $150,000 per year for 5 years for summer salary and expenses. DOE laboratory awards are at least $500,000 per year for 5 years for full annual salary and expenses. The Program is managed by the Office of the Deputy Director for Science Programs and supports research in the following Offices: Advanced Scientific Computing Research, Biological and Environmental Research, Basic Energy Sciences, Fusion Energy Sciences, High Energy Physics, and Nuclear Physics. A new Funding Opportunity Announcement is issued each year with a detailed description of the topical areas encouraged for early career proposals. Preproposals are required. This talk will introduce the DOE Office of Science Early Career Research Program and describe opportunities for research relevant to the condensed matter physics community. http://science.energy.gov/early-career/

  10. Army Maneuver Center of Excellence

    DTIC Science & Technology

    2012-10-18

    Agreements throughout DoD and beyond: DARPA, JIEDDO, DHS, FAA, DoE, NSA, NASA, SMDC, etc. Strategic partnerships benefit the Army Materiel Enterprise. Emerging science areas include neuroscience, network sciences, hierarchical computing, extreme energy science, autonomous systems technology, and meso-scale phenomena. Improvements in Soldier-system overall performance draw on operational neuroscience and advanced simulation and training technologies.

  11. Science 101: How Does Speech-Recognition Software Work?

    ERIC Educational Resources Information Center

    Robertson, Bill

    2016-01-01

    This column provides background science information for elementary teachers. Many innovations with computer software begin with analysis of how humans do a task. This article takes a look at how humans recognize spoken words and explains the origins of speech-recognition software.

  12. Does Computer Use Matter? The Influence of Computer Usage on Eighth-Grade Students' Mathematics Reasoning

    ERIC Educational Resources Information Center

    Ayieko, Rachel A.; Gokbel, Elif N.; Nelson, Bryan

    2017-01-01

    This study uses the 2011 Trends in International Mathematics and Science Study to investigate the relationships among students' and teachers' computer use, and eighth-grade students' mathematical reasoning in three high-achieving nations: Finland, Chinese Taipei, and Singapore. The study found a significant negative relationship in all three…

  13. PREPARING FOR EXASCALE: ORNL Leadership Computing Application Requirements and Strategy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joubert, Wayne; Kothe, Douglas B; Nam, Hai Ah

    2009-12-01

    In 2009 the Oak Ridge Leadership Computing Facility (OLCF), a U.S. Department of Energy (DOE) facility at the Oak Ridge National Laboratory (ORNL) National Center for Computational Sciences (NCCS), elicited petascale computational science requirements from leading computational scientists in the international science community. This effort targeted science teams whose projects received large computer allocation awards on OLCF systems. A clear finding of this process was that in order to reach their science goals over the next several years, multiple projects will require computational resources in excess of an order of magnitude more powerful than those currently available. Additionally, for the longer term, next-generation science will require computing platforms of exascale capability in order to reach DOE science objectives over the next decade. It is generally recognized that achieving exascale in the proposed time frame will require disruptive changes in computer hardware and software. Processor hardware will become necessarily heterogeneous and will include accelerator technologies. Software must undergo the concomitant changes needed to extract the available performance from this heterogeneous hardware. This disruption portends to be substantial, not unlike the change to the message passing paradigm in the computational science community over 20 years ago. Since technological disruptions take time to assimilate, we must aggressively embark on this course of change now, to ensure that science applications and their underlying programming models are mature and ready when exascale computing arrives. This includes initiation of application readiness efforts to adapt existing codes to heterogeneous architectures, support of relevant software tools, and procurement of next-generation hardware testbeds for porting and testing codes.
The 2009 OLCF requirements process identified numerous actions necessary to meet this challenge: (1) Hardware capabilities must be advanced on multiple fronts, including peak flops, node memory capacity, interconnect latency, interconnect bandwidth, and memory bandwidth. (2) Effective parallel programming interfaces must be developed to exploit the power of emerging hardware. (3) Science application teams must now begin to adapt and reformulate application codes to the new hardware and software, typified by hierarchical and disparate layers of compute, memory and concurrency. (4) Algorithm research must be realigned to exploit this hierarchy. (5) When possible, mathematical libraries must be used to encapsulate the required operations in an efficient and useful way. (6) Software tools must be developed to make the new hardware more usable. (7) Science application software must be improved to cope with the increasing complexity of computing systems. (8) Data management efforts must be readied for the larger quantities of data generated by larger, more accurate science models. Requirements elicitation, analysis, validation, and management comprise a difficult and inexact process, particularly in periods of technological change. Nonetheless, the OLCF requirements modeling process is becoming increasingly quantitative and actionable, as the process becomes more developed and mature, and the process this year has identified clear and concrete steps to be taken. 
This report discloses (1) the fundamental science case driving the need for the next generation of computer hardware, (2) application usage trends that illustrate the science need, (3) application performance characteristics that drive the need for increased hardware capabilities, (4) resource and process requirements that make the development and deployment of science applications on next-generation hardware successful, and (5) summary recommendations for the required next steps within the computer and computational science communities.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bland, Arthur S Buddy; Hack, James J; Baker, Ann E

    Oak Ridge National Laboratory's (ORNL's) Cray XT5 supercomputer, Jaguar, kicked off the era of petascale scientific computing in 2008 with applications that sustained more than a thousand trillion floating point calculations per second - or 1 petaflop. Jaguar continues to grow even more powerful as it helps researchers broaden the boundaries of knowledge in virtually every domain of computational science, including weather and climate, nuclear energy, geosciences, combustion, bioenergy, fusion, and materials science. Their insights promise to broaden our knowledge in areas that are vitally important to the Department of Energy (DOE) and the nation as a whole, particularly energy assurance and climate change. The science of the 21st century, however, will demand further revolutions in computing, supercomputers capable of a million trillion calculations a second - 1 exaflop - and beyond. These systems will allow investigators to continue attacking global challenges through modeling and simulation and to unravel longstanding scientific questions. Creating such systems will also require new approaches to daunting challenges. High-performance systems of the future will need to be codesigned for scientific and engineering applications with best-in-class communications networks and data-management infrastructures and teams of skilled researchers able to take full advantage of these new resources. The Oak Ridge Leadership Computing Facility (OLCF) provides the nation's most powerful open resource for capability computing, with a sustainable path that will maintain and extend national leadership for DOE's Office of Science (SC). The OLCF has engaged a world-class team to support petascale science and to take a dramatic step forward, fielding new capabilities for high-end science.
This report highlights the successful delivery and operation of a petascale system and shows how the OLCF fosters application development teams, developing cutting-edge tools and resources for next-generation systems.

  15. Final Report on the Proposal to Provide Asian Science and Technology Information

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kahaner, David K.

    2003-07-23

    The Asian Technology Information Program (ATIP) conducted a seven-month Asian science and technology information program for the Office of Energy Research (ER), U.S. Department of Energy (DOE). The seven-month program consisted of 1) monitoring, analyzing, and disseminating science and technology trends and developments associated with Asian high performance computing and communications (HPC), networking, and associated topics, 2) access to ATIP's annual series of Asian S&T reports for ER and HPC related personnel, and 3) supporting DOE and ER designated visits to Asia to study and assess Asian HPC.

  16. Magneto Caloric Effect in Ni-Mn-Ga alloys: First Principles and Experimental studies

    NASA Astrophysics Data System (ADS)

    Odbadrakh, Khorgolkhuu; Nicholson, Don; Brown, Gregory; Rusanu, Aurelian; Rios, Orlando; Hodges, Jason; Safa-Sefat, Athena; Ludtka, Gerard; Eisenbach, Markus; Evans, Boyd

    2012-02-01

    Understanding the Magneto-Caloric Effect (MCE) in alloys with real technological potential is important to the development of viable MCE-based products. We report results of a computational and experimental investigation of candidate MCE materials, Ni-Mn-Ga alloys. The Wang-Landau statistical method is used in tandem with the Locally Self-consistent Multiple Scattering (LSMS) method to explore magnetic states of the system. A classical Heisenberg Hamiltonian is parametrized based on these states and used to obtain the density of magnetic states. The Curie temperature, isothermal entropy change, and adiabatic temperature change are then calculated from the density of states. Experiments to observe the structural and magnetic phase transformations were performed at the Spallation Neutron Source (SNS) at Oak Ridge National Laboratory (ORNL) on alloys of Ni-Mn-Ga and Fe-Ni-Mn-Ga-Cu. Data from the observations are discussed in comparison with the computational studies. This work was sponsored by the Laboratory Directed Research and Development Program (ORNL); by the Mathematical, Information, and Computational Sciences Division, Office of Advanced Scientific Computing Research (US DOE); and by the Materials Sciences and Engineering Division, Office of Basic Energy Sciences (US DOE).
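    The pipeline described in the abstract (Wang-Landau sampling of a classical spin Hamiltonian, then thermodynamic quantities from the resulting density of states) can be illustrated with a toy sketch. This is not the LSMS-coupled Heisenberg calculation of the paper: as an assumption for illustration, a tiny periodic 1D Ising chain stands in for the parametrized Hamiltonian, and the entropy is computed from the estimated density of states with k_B = 1.

    ```python
    import math
    import random

    def wang_landau_ising(n=8, flatness=0.8, lnf_final=1e-6, batch=20000, seed=1):
        """Wang-Landau estimate of ln g(E) for a periodic 1D Ising chain.

        Toy stand-in for the parametrized Heisenberg Hamiltonian of the
        abstract; the n-spin chain has energy levels E = 2k - n, k even.
        """
        rng = random.Random(seed)
        levels = [2 * k - n for k in range(0, n + 1, 2)]
        ln_g = {E: 0.0 for E in levels}
        hist = {E: 0 for E in levels}
        spins = [rng.choice((-1, 1)) for _ in range(n)]
        E = -sum(spins[i] * spins[(i + 1) % n] for i in range(n))
        lnf = 1.0
        while lnf > lnf_final:
            for _ in range(batch):
                i = rng.randrange(n)
                # Energy change from flipping spin i (negative index wraps).
                dE = 2 * spins[i] * (spins[i - 1] + spins[(i + 1) % n])
                # Accept with probability min(1, g(E) / g(E + dE)).
                if rng.random() < math.exp(min(0.0, ln_g[E] - ln_g[E + dE])):
                    spins[i] = -spins[i]
                    E += dE
                ln_g[E] += lnf
                hist[E] += 1
            counts = list(hist.values())
            if min(counts) > flatness * (sum(counts) / len(counts)):
                hist = {k: 0 for k in hist}  # histogram is flat: refine f
                lnf *= 0.5
        # Normalize using the known ground-state degeneracy of 2.
        shift = ln_g[-n] - math.log(2.0)
        return {E: v - shift for E, v in ln_g.items()}

    def entropy(ln_g, T):
        """S(T) = ln Z + U/T from the density of states (k_B = 1)."""
        m = max(lg - E / T for E, lg in ln_g.items())
        w = {E: math.exp(lg - E / T - m) for E, lg in ln_g.items()}
        Z = sum(w.values())
        U = sum(E * wE for E, wE in w.items()) / Z
        return m + math.log(Z) + U / T
    ```

    An isothermal entropy change would then be a difference `entropy(ln_g_B, T) - entropy(ln_g_0, T)` between densities of states computed with and without an applied-field term; the sketch above stops at the zero-field case.
    
    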

  17. DOE High Performance Computing Operational Review (HPCOR): Enabling Data-Driven Scientific Discovery at HPC Facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerber, Richard; Allcock, William; Beggio, Chris

    2014-10-17

    U.S. Department of Energy (DOE) High Performance Computing (HPC) facilities are on the verge of a paradigm shift in the way they deliver systems and services to science and engineering teams. Research projects are producing a wide variety of data at unprecedented scale and level of complexity, with community-specific services that are part of the data collection and analysis workflow. On June 18-19, 2014, representatives from six DOE HPC centers met in Oakland, CA at the DOE High Performance Operational Review (HPCOR) to discuss how they can best provide facilities and services to enable large-scale data-driven scientific discovery at the DOE national laboratories. The report contains findings from that review.

  18. UC Merced Center for Computational Biology Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Colvin, Michael; Watanabe, Masakatsu

    Final report for the UC Merced Center for Computational Biology. The Center for Computational Biology (CCB) was established to support multidisciplinary scientific research and academic programs in computational biology at the new University of California campus in Merced. In 2003, the growing gap between biology research and education was documented in a report from the National Academy of Sciences, Bio2010: Transforming Undergraduate Education for Future Research Biologists. We believed that a new type of biological sciences undergraduate and graduate program that emphasized biological concepts and treated biology as an information science would have a dramatic impact in enabling the transformation of biology. UC Merced, as the newest UC campus and the first new U.S. research university of the 21st century, was ideally suited to adopt an alternate strategy: to create new Biological Sciences majors and a graduate group that incorporated the strong computational and mathematical vision articulated in the Bio2010 report. CCB aimed to leverage this strong commitment at UC Merced to develop a new educational program based on the principle of biology as a quantitative, model-driven science. We also expected that the center would enable the dissemination of computational biology course materials to other universities and feeder institutions, and foster research projects that exemplify a mathematical and computation-based approach to the life sciences. As this report describes, the CCB has been successful in achieving these goals, and multidisciplinary computational biology is now an integral part of UC Merced undergraduate, graduate, and research programs in the life sciences. The CCB began in fall 2004 with the aid of an award from the U.S. Department of Energy (DOE), under its Genomes to Life program of support for the development of research and educational infrastructure in the modern biological sciences.
This report to DOE describes the research and academic programs made possible by the CCB from its inception until August 2010, the end of the final extension. Although DOE support for the center ended in August 2010, the CCB will continue to exist and support its original objectives. The research and academic programs fostered by the CCB have led to additional extramural funding from other agencies, and we anticipate that CCB will continue to support the quantitative and computational biology program at UC Merced for many years to come. Since its inception in fall 2004, CCB research projects have involved continuous multi-institutional collaboration with Lawrence Livermore National Laboratory (LLNL) and the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign, as well as individual collaborators at other sites. CCB-affiliated faculty cover a broad range of computational and mathematical research, including molecular modeling, cell biology, applied math, evolutionary biology, and bioinformatics. The CCB sponsored the first distinguished speaker series at UC Merced, which had an important role in spreading the word about the computational biology emphasis at this new campus. One of CCB's original goals was to help train a new generation of biologists who bridge the gap between the computational and life sciences. To achieve this goal, by summer 2006 a summer undergraduate internship program had been established under the CCB to train students in highly mathematical and computationally intensive biological sciences research. By the end of summer 2010, 44 undergraduate students had gone through this program. Of those participants, 11 students have been admitted to graduate schools and 10 more are interested in pursuing graduate studies in the sciences.
The center is also continuing to facilitate the development and dissemination of undergraduate and graduate course materials based on the latest research in computational biology.

  19. Workflow Management Systems for Molecular Dynamics on Leadership Computers

    NASA Astrophysics Data System (ADS)

    Wells, Jack; Panitkin, Sergey; Oleynik, Danila; Jha, Shantenu

    Molecular Dynamics (MD) simulations play an important role in a range of disciplines from materials science to biophysics and account for a large fraction of cycles consumed on computing resources. Increasingly, science problems require the successful execution of "many" MD simulations as opposed to a single MD simulation. There is a need to provide scalable and flexible approaches to the execution of the workload. We present preliminary results on the Titan computer at the Oak Ridge Leadership Computing Facility that demonstrate a general capability to manage workload execution agnostic of a specific MD simulation kernel or execution pattern, and in a manner that integrates disparate grid-based and supercomputing resources. Our results build upon our extensive experience of distributed workload management in the high-energy physics ATLAS project using PanDA (the Production and Distributed Analysis System), coupled with recent conceptual advances in our understanding of workload management on heterogeneous resources. We will discuss how we will generalize these initial capabilities towards a more production-level service on DOE leadership resources. This research is sponsored by US DOE/ASCR and used resources of the OLCF computing facility.
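    At its core, the kernel-agnostic "many simulations" pattern the abstract describes is a task-pool scheduler: a bag of independent tasks is dispatched to a fixed pool of workers, with failed tasks resubmitted. The sketch below is not PanDA and makes no claim about its design; it is a minimal stand-alone illustration using Python's standard `concurrent.futures`, with hypothetical task names, of the general idea.

    ```python
    from concurrent.futures import ThreadPoolExecutor, as_completed

    def run_workload(tasks, workers=4, max_retries=2):
        """Run a bag of independent simulation tasks on a fixed worker pool.

        `tasks` maps a task id to a zero-argument callable (the "kernel",
        which could wrap any MD code); tasks that raise are resubmitted up
        to `max_retries` times, then recorded as failed (None).
        """
        results = {}
        attempts = {tid: 0 for tid in tasks}
        pending = dict(tasks)
        with ThreadPoolExecutor(max_workers=workers) as pool:
            while pending:
                futures = {pool.submit(fn): tid for tid, fn in pending.items()}
                pending = {}
                for fut in as_completed(futures):
                    tid = futures[fut]
                    try:
                        results[tid] = fut.result()
                    except Exception:
                        attempts[tid] += 1
                        if attempts[tid] <= max_retries:
                            pending[tid] = tasks[tid]  # resubmit transient failure
                        else:
                            results[tid] = None  # permanent failure
        return results
    ```

    A production workload manager additionally handles data staging, backfill scheduling, and heterogeneous resources, none of which this sketch attempts.
    
    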

  20. Quantum Testbeds Stakeholder Workshop (QTSW) Report meeting purpose and agenda.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hebner, Gregory A.

    Quantum computing (QC) is a promising early-stage technology with the potential to provide scientific computing capabilities far beyond what is possible with even an Exascale computer in specific problems of relevance to the Office of Science. These include (but are not limited to) materials modeling, molecular dynamics, and quantum chromodynamics. However, commercial QC systems are not yet available, and the technical maturity of current QC hardware, software, algorithms, and systems integration is woefully incomplete. Thus, there is a significant opportunity for DOE to define the technology building blocks and solve the system integration issues to enable a revolutionary tool. Once realized, QC will have world-changing impact on economic competitiveness, the scientific enterprise, and citizen well-being. Prior to this workshop, DOE's Office of Advanced Scientific Computing Research (ASCR) hosted a workshop in 2015 to explore QC scientific applications. The goal of that workshop was to assess the viability of QC technologies to meet the computational requirements in support of DOE's science and energy mission and to identify the potential impact of these technologies.

  1. The Magellan Final Report on Cloud Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coghlan, Susan; Yelick, Katherine

    The goal of Magellan, a project funded through the U.S. Department of Energy (DOE) Office of Advanced Scientific Computing Research (ASCR), was to investigate the potential role of cloud computing in addressing the computing needs of the DOE Office of Science (SC), particularly related to serving the needs of mid-range computing and future data-intensive computing workloads. A set of research questions was formed to probe various aspects of cloud computing, including performance, usability, and cost. To address these questions, a distributed testbed infrastructure was deployed at the Argonne Leadership Computing Facility (ALCF) and the National Energy Research Scientific Computing Center (NERSC). The testbed was designed to be flexible and capable enough to explore a variety of computing models and hardware design points in order to understand the impact for various scientific applications. During the project, the testbed also served as a valuable resource to application scientists. Applications from a diverse set of projects such as MG-RAST (a metagenomics analysis server), the Joint Genome Institute, the STAR experiment at the Relativistic Heavy Ion Collider, and the Laser Interferometer Gravitational Wave Observatory (LIGO) were used by the Magellan project for benchmarking within the cloud, but the project teams were also able to accomplish important production science utilizing the Magellan cloud resources.

  2. Sandia QIS Capabilities.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Muller, Richard P.

    2017-07-01

    Sandia National Laboratories has developed a broad set of capabilities in quantum information science (QIS), including elements of quantum computing, quantum communications, and quantum sensing. The Sandia QIS program is built atop unique DOE investments at the laboratories, including the MESA microelectronics fabrication facility, the Center for Integrated Nanotechnologies (CINT) facilities (joint with LANL), the Ion Beam Laboratory, and ASC High Performance Computing (HPC) facilities. Sandia has invested $75 M of LDRD funding over 12 years to develop unique, differentiating capabilities that leverage these DOE infrastructure investments.

  3. To naturalize or not to naturalize? An issue for cognitive science as well as anthropology.

    PubMed

    Stenning, Keith

    2012-07-01

    Several of Beller, Bender, and Medin's (2012) issues are as relevant within cognitive science as between it and anthropology. Knowledge-rich human mental processes impose hermeneutic tasks, both on subjects and researchers. Psychology's current philosophy of science is ill suited to analyzing these: its demand for "stimulus control" needs to give way to "negotiation of mutual interpretation." Cognitive science has ways to address these issues, as does anthropology. An example from my own work is how defeasible logics are mathematical models of some aspects of simple hermeneutic processes. They explain processing relative to databases of knowledge and belief, that is, content. A specific example is syllogistic reasoning, which raises issues of experimenters' interpretations of subjects' reasoning. Science, especially since the advent of understandings of computation, does not have to be reductive. How does this approach transfer onto anthropological topics? Recent cognitive science approaches to anthropological topics have taken a reductive stance in terms of modules. We end with some speculations about a different cognitive approach to, for example, religion. Copyright © 2012 Cognitive Science Society, Inc.
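    The defeasibility the abstract appeals to can be made concrete with a toy sketch. This is an illustrative example of my own, not Stenning's formal system: a rule fires unless one of its exceptions is already among the known facts, so the conclusions drawn depend on the content of the belief base, as in the classic birds/penguins example.

    ```python
    def derive(facts, rules):
        """Naive forward chaining over defeasible rules.

        Each rule is (premise, conclusion, exceptions): it fires when its
        premise is known and no exception is currently known. Exceptions
        are checked against the facts known at firing time, so this is a
        deliberately simple, order-sensitive sketch of defeasibility.
        """
        known = set(facts)
        changed = True
        while changed:
            changed = False
            for premise, conclusion, exceptions in rules:
                if (premise in known and conclusion not in known
                        and not any(e in known for e in exceptions)):
                    known.add(conclusion)
                    changed = True
        return known

    # "Birds fly, unless they are penguins; penguins are birds."
    RULES = [("bird", "flies", {"penguin"}),
             ("penguin", "bird", set())]
    ```

    With `facts = {"bird"}` the reasoner concludes `flies`; with `facts = {"penguin"}` it still derives `bird` but withholds `flies`, illustrating how one extra piece of content defeats an otherwise valid-looking inference.
    
    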

  4. Scarecrow

    NASA Image and Video Library

    2007-10-04

    The team developing NASA Mars Science Laboratory calls this test rover Scarecrow because the vehicle does not include a computer brain. Mobility engineers use this test rover to evaluate mobility and suspension performance.

  5. LBNL Computational Research and Theory Facility Groundbreaking - Full Press Conference. Feb 1st, 2012

    ScienceCinema

    Yelick, Kathy

    2018-01-24

    Energy Secretary Steven Chu, along with Berkeley Lab and UC leaders, broke ground on the Lab's Computational Research and Theory (CRT) facility yesterday. The CRT will be at the forefront of high-performance supercomputing research and be DOE's most efficient facility of its kind. Joining Secretary Chu as speakers were Lab Director Paul Alivisatos, UC President Mark Yudof, Office of Science Director Bill Brinkman, and UC Berkeley Chancellor Robert Birgeneau. The festivities were emceed by Associate Lab Director for Computing Sciences, Kathy Yelick, and Berkeley Mayor Tom Bates joined in the shovel ceremony.

  6. LBNL Computational Research and Theory Facility Groundbreaking. February 1st, 2012

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yelick, Kathy

    2012-02-02

    Energy Secretary Steven Chu, along with Berkeley Lab and UC leaders, broke ground on the Lab's Computational Research and Theory (CRT) facility yesterday. The CRT will be at the forefront of high-performance supercomputing research and be DOE's most efficient facility of its kind. Joining Secretary Chu as speakers were Lab Director Paul Alivisatos, UC President Mark Yudof, Office of Science Director Bill Brinkman, and UC Berkeley Chancellor Robert Birgeneau. The festivities were emceed by Associate Lab Director for Computing Sciences, Kathy Yelick, and Berkeley Mayor Tom Bates joined in the shovel ceremony.

  7. LBNL Computational Research and Theory Facility Groundbreaking. February 1st, 2012

    ScienceCinema

    Yelick, Kathy

    2017-12-09

    Energy Secretary Steven Chu, along with Berkeley Lab and UC leaders, broke ground on the Lab's Computational Research and Theory (CRT) facility yesterday. The CRT will be at the forefront of high-performance supercomputing research and be DOE's most efficient facility of its kind. Joining Secretary Chu as speakers were Lab Director Paul Alivisatos, UC President Mark Yudof, Office of Science Director Bill Brinkman, and UC Berkeley Chancellor Robert Birgeneau. The festivities were emceed by Associate Lab Director for Computing Sciences, Kathy Yelick, and Berkeley Mayor Tom Bates joined in the shovel ceremony.

  8. Outcomes from the DOE Workshop on Turbulent Flow Simulation at the Exascale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sprague, Michael; Boldyrev, Stanislav; Chang, Choong-Seock

    This paper summarizes the outcomes from the Turbulent Flow Simulation at the Exascale: Opportunities and Challenges Workshop, which was held 4-5 August 2015, and was sponsored by the U.S. Department of Energy Office of Advanced Scientific Computing Research. The workshop objective was to define and describe the challenges and opportunities that computing at the exascale will bring to turbulent-flow simulations in applied science and technology. The need for accurate simulation of turbulent flows is evident across the U.S. Department of Energy applied-science and engineering portfolios, including combustion, plasma physics, nuclear-reactor physics, wind energy, and atmospheric science. The workshop brought together experts in turbulent-flow simulation, computational mathematics, and high-performance computing. Building upon previous ASCR workshops on exascale computing, participants defined a research agenda and path forward that will enable scientists and engineers to continually leverage, engage, and direct advances in computational systems on the path to exascale computing.

  9. Biological and Environmental Research Exascale Requirements Review. An Office of Science review sponsored jointly by Advanced Scientific Computing Research and Biological and Environmental Research, March 28-31, 2016, Rockville, Maryland

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arkin, Adam; Bader, David C.; Coffey, Richard

    Understanding the fundamentals of genomic systems or the processes governing impactful weather patterns are examples of the types of simulation and modeling performed on the most advanced computing resources in America. High-performance computing and computational science together provide a necessary platform for the mission science conducted by the Biological and Environmental Research (BER) office at the U.S. Department of Energy (DOE). This report reviews BER’s computing needs and their importance for solving some of the toughest problems in BER’s portfolio. BER’s impact on science has been transformative. Mapping the human genome, including the U.S.-supported international Human Genome Project that DOE began in 1987, initiated the era of modern biotechnology and genomics-based systems biology. And since the 1950s, BER has been a core contributor to atmospheric, environmental, and climate science research, beginning with atmospheric circulation studies that were the forerunners of modern Earth system models (ESMs) and by pioneering the implementation of climate codes onto high-performance computers. See http://exascaleage.org/ber/ for more information.

  10. Computational Understanding: Analysis of Sentences and Context

    DTIC Science & Technology

    1974-05-01

    Computer Science Department, Stanford, California. [Fragmentary abstract; scanned report documentation page residue.] Among the needs discussed is the need for programs that can respond in useful ways to information expressed in a natural language. The described Lisp program builds a buying structure because "Mary" appears where it does, and the time for analysis was rarely over five seconds of computer time.

  11. Final Report: A Broad Research Project on the Sciences of Complexity, September 15, 1994 - November 15, 1999

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2000-02-01

    DOE support for a broad research program in the sciences of complexity permitted the Santa Fe Institute to initiate new collaborative research within its integrative core activities as well as to host visitors to participate in research on specific topics that serve as motivation and testing ground for the study of the general principles of complex systems. Results are presented on computational biology, biodiversity and ecosystem research, and advanced computing and simulation.

  12. International Symposium on 21st Century Challenges in Computational Engineering and Science

    DTIC Science & Technology

    2010-02-26

    [Report documentation page residue (Form Approved OMB No. 0704-0188).] Report date: 26-02-2010. Title: 21st Century Challenges in Computational Engineering and Science. Grant number: FA9550-09-1-0648.

  13. DOE planning workshop advanced biomedical technology initiative

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1994-06-01

    The Department of Energy has made major contributions in the biomedical sciences with programs in medical applications and instrumentation development, molecular biology, human genome, and computational sciences. In an effort to help determine DOE's role in applying these capabilities to the nation's health care needs, a planning workshop was held on January 11-12, 1994. The workshop was co-sponsored by the Department's Office of Energy Research and Defense Programs organizations. Participants represented industry, medical research institutions, national laboratories, and several government agencies. They attempted to define the needs of the health care industry, identify DOE laboratory capabilities that address these needs, and determine how DOE, in cooperation with other team members, could begin an initiative with the goals of reducing health care costs while improving the quality of health care delivery through the proper application of technology and computational systems. This document is a report of that workshop. Seven major technology development thrust areas were considered. Each involves development of various aspects of imaging, optical, sensor, and data processing and storage technologies. The thrust areas as prioritized for DOE are: (1) Minimally Invasive Procedures; (2) Technologies for Individual Self Care; (3) Outcomes Research; (4) Telemedicine; (5) Decision Support Systems; (6) Assistive Technology; (7) Prevention and Education.

  14. Fusion Energy Sciences Exascale Requirements Review. An Office of Science review sponsored jointly by Advanced Scientific Computing Research and Fusion Energy Sciences, January 27-29, 2016, Gaithersburg, Maryland

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, Choong-Seock; Greenwald, Martin; Riley, Katherine

    The additional computing power offered by the planned exascale facilities could be transformational across the spectrum of plasma and fusion research — provided that the new architectures can be efficiently applied to our problem space. The collaboration that will be required to succeed should be viewed as an opportunity to identify and exploit cross-disciplinary synergies. To assess the opportunities and requirements as part of the development of an overall strategy for computing in the exascale era, the Exascale Requirements Review meeting of the Fusion Energy Sciences (FES) community was convened January 27–29, 2016, with participation from a broad range of fusion and plasma scientists, specialists in applied mathematics and computer science, and representatives from the U.S. Department of Energy (DOE) and its major computing facilities. This report is a summary of that meeting and the preparatory activities for it and includes a wealth of detail to support the findings. Technical opportunities, requirements, and challenges are detailed in this report (and in the recent report on the Workshop on Integrated Simulation). Science applications are described, along with mathematical and computational enabling technologies. Also see http://exascaleage.org/fes/ for more information.

  15. The computationalist reformulation of the mind-body problem.

    PubMed

    Marchal, Bruno

    2013-09-01

    Computationalism, or digital mechanism, or simply mechanism, is a hypothesis in cognitive science according to which we can be emulated by a computer without changing our private subjective feeling. We provide a weaker form of that hypothesis, weaker than the one commonly referred to in the (vast) literature, and show how to recast the mind-body problem in that setting. We show that such a mechanist hypothesis does not solve the mind-body problem per se, but does help to reduce partially the mind-body problem into another problem which admits a formulation in pure arithmetic. We will explain that once we adopt the computationalist hypothesis, which is a form of mechanist assumption, we have to derive from it how our belief in the physical laws can emerge from *only* arithmetic and classical computer science. In that sense we reduce the mind-body problem to a body-problem appearance in computer science, or in arithmetic. The general shape of the possible solution of that subproblem, if it exists, is shown to be closer to "Platonist or neoplatonist theology" than to the "Aristotelian theology". In Plato's theology, the physical or observable reality is only the shadow of a vaster hidden nonphysical and nonobservable, perhaps mathematical, reality. The main point is that the derivation is constructive, and it provides the technical means to derive physics from arithmetic, and this will make the computationalist hypothesis empirically testable, and thus scientific in the Popperian analysis of science. In case computationalism is wrong, the derivation leads to a procedure for measuring "our local degree of noncomputationalism". Copyright © 2013 Elsevier Ltd. All rights reserved.

  16. XPRESS: eXascale PRogramming Environment and System Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brightwell, Ron; Sterling, Thomas; Koniges, Alice

    The XPRESS Project is one of four major projects of the DOE Office of Science Advanced Scientific Computing Research X-stack Program initiated in September 2012. The purpose of XPRESS is to devise an innovative system software stack to enable practical and useful exascale computing around the end of the decade, with near-term contributions to efficient and scalable operation of trans-petaflops performance systems in the next two to three years, for DOE mission-critical applications. To this end, XPRESS directly addresses critical challenges in computing of efficiency, scalability, and programmability through introspective methods of dynamic adaptive resource management and task scheduling.

  17. How Data Becomes Physics: Inside the RACF

    ScienceCinema

    Ernst, Michael; Rind, Ofer; Rajagopalan, Srini; Lauret, Jerome; Pinkenburg, Chris

    2018-06-22

    The RHIC & ATLAS Computing Facility (RACF) at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory sits at the center of a global computing network. It connects more than 2,500 researchers around the world with the data generated by millions of particle collisions taking place each second at Brookhaven Lab's Relativistic Heavy Ion Collider (RHIC, a DOE Office of Science User Facility for nuclear physics research), and the ATLAS experiment at the Large Hadron Collider in Europe. Watch this video to learn how the people and computing resources of the RACF serve these scientists to turn petabytes of raw data into physics discoveries.

  18. The Mind Research Network - Mental Illness Neuroscience Discovery Grant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roberts, J.; Calhoun, V.

    The scientific and technological programs of the Mind Research Network (MRN) reflect DOE missions in basic science and associated instrumentation, computational modeling, and experimental techniques. MRN's technical goals over the course of this project have been to develop and apply integrated, multi-modality functional imaging techniques derived from a decade of DOE-supported research and technology development.

  19. 26 CFR 1.448-1T - Limitation on the use of the cash receipts and disbursements method of accounting (temporary).

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ..., (E) Accounting, (F) Actuarial science, (G) Performing arts, or (H) Consulting. Substantially all of... the client. The taxpayer does not, however, provide the client with additional computer programming... processing systems. The client will then order computers and other data processing equipment through the...

  20. 26 CFR 1.448-1T - Limitation on the use of the cash receipts and disbursements method of accounting (temporary).

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ..., (E) Accounting, (F) Actuarial science, (G) Performing arts, or (H) Consulting. Substantially all of... the client. The taxpayer does not, however, provide the client with additional computer programming... processing systems. The client will then order computers and other data processing equipment through the...

  1. 26 CFR 1.448-1T - Limitation on the use of the cash receipts and disbursements method of accounting (temporary).

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ..., (E) Accounting, (F) Actuarial science, (G) Performing arts, or (H) Consulting. Substantially all of... the client. The taxpayer does not, however, provide the client with additional computer programming... processing systems. The client will then order computers and other data processing equipment through the...

  2. 26 CFR 1.448-1T - Limitation on the use of the cash receipts and disbursements method of accounting (temporary).

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ..., (E) Accounting, (F) Actuarial science, (G) Performing arts, or (H) Consulting. Substantially all of... the client. The taxpayer does not, however, provide the client with additional computer programming... processing systems. The client will then order computers and other data processing equipment through the...

  3. Teach or No Teach: Is Large System Education Resurging?

    ERIC Educational Resources Information Center

    Sharma, Aditya; Murphy, Marianne C.

    2011-01-01

    Legacy or not, mainframe education is being taught at many U.S. universities. Some computer science programs have always had some large-system content, but there does appear to be a resurgence of mainframe-related content in business programs such as Management Information Systems (MIS) and Computer Information Systems (CIS). Many companies such as…

  4. Inner-shell photoionization of atomic chlorine near the 2p-1 edge: a Breit-Pauli R-matrix calculation

    NASA Astrophysics Data System (ADS)

    Felfli, Z.; Deb, N. C.; Manson, S. T.; Hibbert, A.; Msezane, A. Z.

    2009-05-01

    An R-matrix calculation which takes into account relativistic effects via the Breit-Pauli (BP) operator is performed for photoionization cross sections of atomic Cl near the 2p threshold. The wavefunctions are constructed with orbitals generated from a careful large scale configuration interaction (CI) calculation with relativistic corrections using the CIV3 code of Hibbert [1] and Glass and Hibbert [2]. The results are contrasted with the calculation of Martins [3], which uses a CI with relativistic corrections, and compared with the most recent measurements [4]. [1] A. Hibbert, Comput. Phys. Commun. 9, 141 (1975) [2] R. Glass and A. Hibbert, Comput. Phys. Commun. 16, 19 (1978) [3] M. Martins, J. Phys. B 34, 1321 (2001) [4] D. Lindle et al (private communication) Research supported by U.S. DOE, Division of Chemical Sciences, NSF and CAU CFNM, NSF-CREST Program. Computing facilities at Queen's University of Belfast, UK and of DOE Office of Science, NERSC are appreciated.

  5. A Spacelab Expert System for Remote Engineering and Science

    NASA Technical Reports Server (NTRS)

    Groleau, Nick; Colombano, Silvano; Friedland, Peter (Technical Monitor)

    1994-01-01

    NASA's space science program is based on strictly pre-planned activities. This approach does not always result in the best science. We describe an existing computer system that enables space science to be conducted in a more reactive manner through advanced automation techniques that were recently used on the SLS-2 space shuttle flight of October 1993. Advanced computing techniques, usually developed in the field of Artificial Intelligence, allow large portions of the scientific investigator's knowledge to be "packaged" in a portable computer to present advice to the astronaut operator. We strongly believe that this technology has wide applicability to other forms of remote science/engineering. In this brief article, we present the technology of remote science/engineering assistance as implemented for the SLS-2 space shuttle flight. We begin with a logical overview of the system (paying particular attention to the implementation details relevant to the use of the embedded knowledge for system reasoning), then describe its use and success in space, and conclude with ideas about possible earth uses of the technology in the life and medical sciences.

  6. 78 FR 64931 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-30

    ..., we request that members of the public notify the DFO, Christine Chalk, that you intend to call-into the meeting via email at: christine.chalk@science.doe.gov . FOR FURTHER INFORMATION CONTACT: Melea...

  7. The Argonne Leadership Computing Facility 2010 annual report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Drugan, C.

    Researchers found more ways than ever to conduct transformative science at the Argonne Leadership Computing Facility (ALCF) in 2010. Both familiar initiatives and innovative new programs at the ALCF are now serving a growing, global user community with a wide range of computing needs. The Department of Energy's (DOE) INCITE Program remained vital in providing scientists with major allocations of leadership-class computing resources at the ALCF. For calendar year 2011, 35 projects were awarded 732 million supercomputer processor-hours for computationally intensive, large-scale research projects with the potential to significantly advance key areas in science and engineering. Argonne also continued to provide Director's Discretionary allocations - 'start up' awards - for potential future INCITE projects. And DOE's new ASCR Leadership Computing Challenge (ALCC) Program allocated resources to 10 ALCF projects, with an emphasis on high-risk, high-payoff simulations directly related to the Department's energy mission, national emergencies, or broadening the research community capable of using leadership computing resources. While delivering more science today, we've also been laying a solid foundation for high performance computing in the future. After a successful DOE Lehman review, a contract was signed to deliver Mira, the next-generation Blue Gene/Q system, to the ALCF in 2012. The ALCF is working with the 16 projects that were selected for the Early Science Program (ESP) to enable them to be productive as soon as Mira is operational. Preproduction access to Mira will enable ESP projects to adapt their codes to its architecture and collaborate with ALCF staff in shaking down the new system. We expect the 10-petaflops system to stoke economic growth and improve U.S. competitiveness in key areas such as advancing clean energy and addressing global climate change.
Ultimately, we envision Mira as a stepping-stone to exascale-class computers that will be faster than petascale-class computers by a factor of a thousand. Pete Beckman, who served as the ALCF's Director for the past few years, has been named director of the newly created Exascale Technology and Computing Institute (ETCi). The institute will focus on developing exascale computing to extend scientific discovery and solve critical science and engineering problems. Just as Pete's leadership propelled the ALCF to great success, we know that the ETCi will benefit immensely from his expertise and experience. Without question, the future of supercomputing is in good hands. I would like to thank Pete for all his effort over the past two years, during which he oversaw the establishment of ALCF2, the deployment of the Magellan project, and increases in utilization, availability, and the number of projects using ALCF1. He managed the rapid growth of ALCF staff and made the facility what it is today. All the staff and users are better for Pete's efforts.

  8. Parameter-Free Computational Characterization of Defects in Transition Metal Oxides with Diffusion Quantum Monte Carlo

    NASA Astrophysics Data System (ADS)

    Santana, Juan A.; Krogel, Jaron T.; Kent, Paul R.; Reboredo, Fernando

    Materials based on transition metal oxides (TMOs) are among the most challenging systems for computational characterization. Reliable and practical computations are possible by directly solving the many-body problem for TMOs with quantum Monte Carlo (QMC) methods. These methods are very computationally intensive, but recent developments in algorithms and computational infrastructure have enabled their application to real materials. We will show our efforts in applying the diffusion quantum Monte Carlo (DMC) method to study the formation of defects in binary and ternary TMOs and in TMO heterostructures. We will also outline current limitations in hardware and algorithms. This work is supported by the Materials Sciences & Engineering Division of the Office of Basic Energy Sciences, U.S. Department of Energy (DOE).
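    The projection idea behind the diffusion Monte Carlo method described in this record can be illustrated on a toy problem. The sketch below is not the production QMC codes used for TMOs; it is a minimal, assumption-laden illustration applying DMC to a 1D harmonic oscillator in atomic units, where the exact ground-state energy is 0.5. The function name and all parameters are hypothetical.

```python
import numpy as np

def dmc_ground_state_energy(n_walkers=2000, n_steps=2000, dt=0.01, seed=0):
    """Toy diffusion Monte Carlo for a 1D harmonic oscillator, V(x) = x^2 / 2.

    Walkers diffuse under the kinetic operator and branch (copy or die) with
    weight exp(-dt * (V - E_T)); the trial energy E_T is steered to hold the
    population steady and converges to the ground-state energy (0.5 a.u.).
    """
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, 1.0, n_walkers)                # initial walker cloud
    e_t = 0.0                                          # trial energy guess
    samples = []
    for step in range(n_steps):
        x = x + rng.normal(0.0, np.sqrt(dt), x.size)   # diffusion move
        w = np.exp(-dt * (0.5 * x**2 - e_t))           # branching weight
        copies = (w + rng.random(x.size)).astype(int)  # stochastic rounding
        x = np.repeat(x, copies)                       # birth/death step
        e_t -= 0.1 * np.log(x.size / n_walkers)        # population feedback
        if step > n_steps // 2:                        # skip equilibration
            samples.append(e_t)
    return float(np.mean(samples))
```

    Calling `dmc_ground_state_energy()` should land near 0.5; the residual deviation reflects the O(dt) time-step bias and population-control error of this deliberately minimal scheme.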

  9. High Performance Computing and Storage Requirements for Nuclear Physics: Target 2017

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerber, Richard; Wasserman, Harvey

    2014-04-30

    In April 2014, NERSC, ASCR, and the DOE Office of Nuclear Physics (NP) held a review to characterize high performance computing (HPC) and storage requirements for NP research through 2017. This review is the 12th in a series of reviews held by NERSC and Office of Science program offices that began in 2009. It is the second for NP, and the final in the second round of reviews that covered the six Office of Science program offices. This report is the result of that review.

  10. Large Scale Computing and Storage Requirements for High Energy Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerber, Richard A.; Wasserman, Harvey

    2010-11-24

    The National Energy Research Scientific Computing Center (NERSC) is the leading scientific computing facility for the Department of Energy's Office of Science, providing high-performance computing (HPC) resources to more than 3,000 researchers working on about 400 projects. NERSC provides large-scale computing resources and, crucially, the support and expertise needed for scientists to make effective use of them. In November 2009, NERSC, DOE's Office of Advanced Scientific Computing Research (ASCR), and DOE's Office of High Energy Physics (HEP) held a workshop to characterize the HPC resources needed at NERSC to support HEP research through the next three to five years. The effort is part of NERSC's legacy of anticipating users' needs and deploying resources to meet those demands. The workshop revealed several key points, in addition to achieving its goal of collecting and characterizing computing requirements. The chief findings: (1) Science teams need access to a significant increase in computational resources to meet their research goals; (2) Research teams need to be able to read, write, transfer, store online, archive, analyze, and share huge volumes of data; (3) Science teams need guidance and support to implement their codes on future architectures; and (4) Projects need predictable, rapid turnaround of their computational jobs to meet mission-critical time constraints. This report expands upon these key points and includes others. It also presents a number of case studies as representative of the research conducted within HEP. Workshop participants were asked to codify their requirements in this case study format, summarizing their science goals, methods of solution, current and three-to-five year computing requirements, and software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, multi-core environment that is expected to dominate HPC architectures over the next few years.
The report includes a section that describes efforts already underway or planned at NERSC that address requirements collected at the workshop. NERSC has many initiatives in progress that address key workshop findings and are aligned with NERSC's strategic plans.

  11. Introduction to Library Public Services. Sixth Edition. Library and Information Science Text Series.

    ERIC Educational Resources Information Center

    Evans, G. Edward; Amodeo, Anthony J.; Carter, Thomas L.

    This book covers the role, purpose, and philosophy related to each of the major functional areas of library public service. This sixth edition, on the presumption that most people know the basic facts about computer hardware, does not include the previous edition's chapter on computer basics and instead integrates specific technological…

  12. Evaluating the Comparability of Paper- and Computer-Based Science Tests across Sex and SES Subgroups

    ERIC Educational Resources Information Center

    Randall, Jennifer; Sireci, Stephen; Li, Xueming; Kaira, Leah

    2012-01-01

    As access and reliance on technology continue to increase, so does the use of computerized testing for admissions, licensure/certification, and accountability exams. Nonetheless, full computer-based test (CBT) implementation can be difficult due to limited resources. As a result, some testing programs offer both CBT and paper-based test (PBT)…

  13. Acid/base equilibria in clusters and their role in proton exchange membranes: Computational insight

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Glezakou, Vanda A; Dupuis, Michel; Mundy, Christopher J

    2007-10-24

    We describe molecular orbital theory and ab initio molecular dynamics studies of acid/base equilibria of clusters AH:(H2O)n ↔ A-:H+(H2O)n in the low hydration regime (n = 1-4), where AH is a model of the perfluorinated sulfonic acids, RSO3H (R = CF3CF2), encountered in polymeric electrolyte membranes of fuel cells. Free energy calculations on the neutral and ion pair structures for n = 3 indicate that the two configurations are close in energy and are accessible in the fluctuation dynamics of proton transport. For n = 1, 2 the only relevant configuration is the neutral form. This was verified through ab initio metadynamics simulations. These findings suggest that bases are directly involved in proton transport at low hydration levels. In addition, the gas phase proton affinity of the model sulfonic acid RSO3H was found to be comparable to the proton affinity of water. Thus, protonated acids can also play a role in proton transport under low hydration conditions and under high concentration of protons. This work was supported by the Division of Chemical Sciences, Office of Basic Energy Sciences, US Department of Energy (DOE), under Contract DE-AC05-76RL01830. Computations were performed on computers of the Molecular Interactions and Transformations (MI&T) group and the MSCF facility of EMSL, sponsored by US DOE and OBER and located at PNNL. This work benefited from resources of the National Energy Research Scientific Computing Center, supported by the Office of Science of the US DOE under Contract No. DE-AC03-76SF00098.

  14. Preface: SciDAC 2008

    NASA Astrophysics Data System (ADS)

    Stevens, Rick

    2008-07-01

    The fourth annual Scientific Discovery through Advanced Computing (SciDAC) Conference was held June 13-18, 2008, in Seattle, Washington. The SciDAC conference series is the premier communitywide venue for presentation of results from the DOE Office of Science's interdisciplinary computational science program. Started in 2001 and renewed in 2006, the DOE SciDAC program is the country's - and arguably the world's - most significant interdisciplinary research program supporting the development of advanced scientific computing methods and their application to fundamental and applied areas of science. SciDAC supports computational science across many disciplines, including astrophysics, biology, chemistry, fusion sciences, and nuclear physics. Moreover, the program actively encourages the creation of long-term partnerships among scientists focused on challenging problems and computer scientists and applied mathematicians developing the technology and tools needed to address those problems. The SciDAC program has played an increasingly important role in scientific research by allowing scientists to create more accurate models of complex processes, simulate problems once thought to be impossible, and analyze the growing amount of data generated by experiments. To help further the research community's ability to tap into the capabilities of current and future supercomputers, Under Secretary for Science, Raymond Orbach, launched the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program in 2003. The INCITE program was conceived specifically to seek out computationally intensive, large-scale research projects with the potential to significantly advance key areas in science and engineering. The program encourages proposals from universities, other research institutions, and industry. During the first two years of the INCITE program, 10 percent of the resources at NERSC were allocated to INCITE awardees. 
However, demand for supercomputing resources far exceeded available systems; and in 2003, the Office of Science identified increasing computing capability by a factor of 100 as the second priority on its Facilities of the Future list. The goal was to establish leadership-class computing resources to support open science. As a result of a peer reviewed competition, the first leadership computing facility was established at Oak Ridge National Laboratory in 2004. A second leadership computing facility was established at Argonne National Laboratory in 2006. This expansion of computational resources led to a corresponding expansion of the INCITE program. In 2008, Argonne, Lawrence Berkeley, Oak Ridge, and Pacific Northwest national laboratories all provided resources for INCITE. By awarding large blocks of computer time on the DOE leadership computing facilities, the INCITE program enables the largest-scale computations to be pursued. In 2009, INCITE will award over half a billion node-hours of time. The SciDAC conference celebrates progress in advancing science through large-scale modeling and simulation. Over 350 participants attended this year's talks, poster sessions, and tutorials, spanning the disciplines supported by DOE. While the principal focus was on SciDAC accomplishments, this year's conference also included invited presentations and posters from DOE INCITE awardees. Another new feature in the SciDAC conference series was an electronic theater and video poster session, which provided an opportunity for the community to see over 50 scientific visualizations in a venue equipped with many high-resolution large-format displays. 
To highlight the growing international interest in petascale computing, this year's SciDAC conference included a keynote presentation by Herman Lederer from the Max Planck Institut, one of the leaders of the DEISA (Distributed European Infrastructure for Supercomputing Applications) project and a member of the PRACE consortium, Europe's main petascale project. We also heard excellent talks from several European groups, including Laurent Gicquel of CERFACS, who spoke on `Large-Eddy Simulations of Turbulent Reacting Flows of Real Burners: Status and Challenges', and Jean-Francois Hamelin from EDF, who presented a talk on `Getting Ready for Petaflop Capacities and Beyond: A Utility Perspective'. Two other compelling addresses gave attendees a glimpse into the future. Tomas Diaz de la Rubia of Lawrence Livermore National Laboratory spoke on a vision for a fusion/fission hybrid reactor known as the `LIFE Engine' and discussed some of the materials and modeling challenges that need to be overcome to realize the vision for a 1000-year greenhouse-gas-free power source. Dan Reed from Microsoft gave a capstone talk on the convergence of technology, architecture, and infrastructure for cloud computing, data-intensive computing, and exascale computing (10^18 flops/sec). High-performance computing is making rapid strides. The SciDAC community's computational resources are expanding dramatically. In the summer of 2008 the first general purpose petascale system (the IBM Cell-based RoadRunner at Los Alamos National Laboratory) was recognized in the Top 500 list of fastest machines, heralding the dawn of the petascale era. The DOE's leadership computing facility at Argonne reached number three on the Top 500 and is at the moment the most capable open science machine, based on an IBM BG/P system with a peak performance of over 550 teraflops/sec. Later this year Oak Ridge is expected to deploy a 1 petaflops/sec Cray XT system.
And even before the scientific community has had an opportunity to make significant use of petascale systems, the computer science research community is forging ahead with ideas and strategies for development of systems that may by the end of the next decade sustain exascale performance. Several talks addressed barriers to, and strategies for, achieving exascale capabilities. The last day of the conference was devoted to tutorials hosted by Microsoft Research at a new conference facility in Redmond, Washington. Over 90 people attended the tutorials, which covered topics ranging from an introduction to BG/P programming to advanced numerical libraries. The SciDAC and INCITE programs and the DOE Office of Advanced Scientific Computing Research core program investments in applied mathematics, computer science, and computational and networking facilities provide a nearly optimum framework for advancing computational science for DOE's Office of Science. At a broader level this framework also is benefiting the entire American scientific enterprise. As we look forward, it is clear that computational approaches will play an increasingly significant role in addressing challenging problems in basic science, energy, and environmental research. It takes many people to organize and support the SciDAC conference, and I would like to thank as many of them as possible. The backbone of the conference is the technical program; and the task of selecting, vetting, and recruiting speakers is the job of the organizing committee. I thank the members of this committee for all the hard work and the many tens of conference calls that enabled a wonderful program to be assembled. 
This year the following people served on the organizing committee: Jim Ahrens, LANL; David Bader, LLNL; Bryan Barnett, Microsoft; Peter Beckman, ANL; Vincent Chan, GA; Jackie Chen, SNL; Lori Diachin, LLNL; Dan Fay, Microsoft; Ian Foster, ANL; Mark Gordon, Ames; Mohammad Khaleel, PNNL; David Keyes, Columbia University; Bob Lucas, University of Southern California; Tony Mezzacappa, ORNL; Jeff Nichols, ORNL; David Nowak, ANL; Michael Papka, ANL; Thomas Schultess, ORNL; Horst Simon, LBNL; David Skinner, LBNL; Panagiotis Spentzouris, Fermilab; Bob Sugar, UCSB; and Kathy Yelick, LBNL. I owe a special thanks to Mike Papka and Jim Ahrens for handling the electronic theater. I also thank all those who submitted videos. It was a highly successful experiment. Behind the scenes an enormous amount of work is required to make a large conference go smoothly. First I thank Cheryl Zidel for her tireless efforts as organizing committee liaison and posters chair and, in general, handling all of my end of the program and keeping me calm. I also thank Gail Pieper for her work in editing the proceedings, Beth Cerny Patino for her work on the Organizing Committee website and electronic theater, and Ken Raffenetti for his work in keeping that website working. Jon Bashor and John Hules did an excellent job in handling conference communications. I thank Caitlin Youngquist for the striking graphic design; Dan Fay for tutorials arrangements; and Lynn Dory, Suzanne Stevenson, Sarah Pebelske and Sarah Zidel for on-site registration and conference support. We all owe Yeen Mankin an extra-special thanks for choosing the hotel, handling contracts, arranging menus, securing venues, and reassuring the chair that everything was under control. We are pleased to have obtained corporate sponsorship from Cray, IBM, Intel, HP, and SiCortex. I thank all the speakers and panel presenters. 
I also thank the former conference chairs Tony Mezzacappa, Bill Tang, and David Keyes, who were never far away for advice and encouragement. Finally, I offer my thanks to Michael Strayer, without whose leadership, vision, and persistence the SciDAC program would not have come into being and flourished. I am honored to be part of his program and to be his friend. Rick Stevens Seattle, Washington July 18, 2008

  15. Graphene Oxide Catalyzed C-H Bond Activation: The Importance of Oxygen Functional Groups for Biaryl Construction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gao, Yongjun; Tang, Pei; Zhou, Hu

    A heterogeneous, inexpensive, and environment-friendly carbon catalytic system was developed for the C-H bond arylation of benzene, resulting in the formation of biaryl compounds. The oxygen-containing groups on these graphene oxide sheets play an essential role in the observed catalytic activity. The catalytic results for model compounds and DFT calculations show that these functional groups promote the reaction by stabilizing and activating K ions while facilitating the departure of I. Further mechanistic studies show that the charge-inducing capability of the oxygen groups attached to the specific carbon skeleton, together with the giant π-reaction platform provided by the π-domain of graphene, plays the vital role in the observed excellent catalytic activity. D. Mei acknowledges the support from the US Department of Energy (DOE), Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences & Biosciences. Pacific Northwest National Laboratory (PNNL) is a multiprogram national laboratory operated for DOE by Battelle. Computing time was granted by the grand challenge of computational catalysis of the William R. Wiley Environmental Molecular Sciences Laboratory.

  16. Site Environmental Report for 2010, Volumes 1 & 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baskin, David; Bauters, Tim; Borglin, Ned

    2011-09-01

    LBNL is a multiprogram scientific facility operated by the UC for the DOE. LBNL’s research is directed toward the physical, biological, environmental, and computational sciences, in order to deliver scientific knowledge and discoveries pertinent to DOE’s missions. This annual Site Environmental Report covers activities conducted in CY 2010. The format and content of this report satisfy the requirements of DOE Order 231.1A, Environment, Safety, and Health Reporting, and the operating contract between UC and DOE.

  17. The Quantum Engineering Conundrum

    NASA Astrophysics Data System (ADS)

    Monroe, Christopher

    2017-04-01

    There is a newfound rush and excitement in Quantum Information Science, as this field seems to be moving toward an industrial/engineering phase. However, this evolution will require that quantum science, long the domain of academics and other researchers, make the leap to sustained engineering efforts in order to fabricate practical devices. I will address the conundrum that full-blooded engineering does not generally happen on campuses, while many in the professional engineering and computer science community do not believe in quantum physics!

  18. Exploring the role of pendant amines in transition metal complexes for the reduction of N2 to hydrazine and ammonia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhattacharya, Papri; Prokopchuk, Demyan E.; Mock, Michael T.

    2017-03-01

    This review examines the synthesis and acid reactivity of transition metal dinitrogen complexes bearing diphosphine ligands containing pendant amine groups in the second coordination sphere. This manuscript is a review of the work performed in the Center for Molecular Electrocatalysis. This work was supported as part of the Center for Molecular Electrocatalysis, an Energy Frontier Research Center funded by the U.S. Department of Energy (U.S. DOE), Office of Science, Office of Basic Energy Sciences. EPR studies on Fe were performed using EMSL, a national scientific user facility sponsored by the DOE’s Office of Biological and Environmental Research and located at PNNL. Computational resources were provided by the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory. Pacific Northwest National Laboratory is operated by Battelle for the U.S. DOE.

  19. ASCR Cybersecurity for Scientific Computing Integrity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peisert, Sean

    The Department of Energy (DOE) has the responsibility to address the energy, environmental, and nuclear security challenges that face our nation. Much of DOE’s enterprise involves distributed, collaborative teams; a significant fraction involves “open science,” which depends on multi-institutional, often international collaborations that must access or share significant amounts of information between institutions and over networks around the world. The mission of the Office of Science is the delivery of scientific discoveries and major scientific tools to transform our understanding of nature and to advance the energy, economic, and national security of the United States. The ability of DOE to execute its responsibilities depends critically on its ability to assure the integrity and availability of scientific facilities and computer systems, and of the scientific, engineering, and operational software and data that support its mission.

  20. Gregarious Convection and Radiative Feedbacks in Idealized Worlds

    DTIC Science & Technology

    2016-08-29

    exist neither on the globe nor within the cloud model. Since mesoscales impose great computational costs on atmosphere models, as well as inconven...Atmospheric Science, University of Miami, Miami, Florida, USA Abstract What role does convection play in cloud feedbacks? What role does convective... cloud fields depends systematically on global temperature, then convective organization could be a climate system feedback. How reconcilable and how

  1. ISCR Annual Report: Fiscal Year 2004

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McGraw, J R

    2005-03-03

    Large-scale scientific computation and all of the disciplines that support and help to validate it have been placed at the focus of Lawrence Livermore National Laboratory (LLNL) by the Advanced Simulation and Computing (ASC) program of the National Nuclear Security Administration (NNSA) and the Scientific Discovery through Advanced Computing (SciDAC) initiative of the Office of Science of the Department of Energy (DOE). The maturation of computational simulation as a tool of scientific and engineering research is underscored in the November 2004 statement of the Secretary of Energy that, ''high performance computing is the backbone of the nation's science and technologymore » enterprise''. LLNL operates several of the world's most powerful computers--including today's single most powerful--and has undertaken some of the largest and most compute-intensive simulations ever performed. Ultrascale simulation has been identified as one of the highest priorities in DOE's facilities planning for the next two decades. However, computers at architectural extremes are notoriously difficult to use efficiently. Furthermore, each successful terascale simulation only points out the need for much better ways of interacting with the resulting avalanche of data. Advances in scientific computing research have, therefore, never been more vital to LLNL's core missions than at present. Computational science is evolving so rapidly along every one of its research fronts that to remain on the leading edge, LLNL must engage researchers at many academic centers of excellence. In Fiscal Year 2004, the Institute for Scientific Computing Research (ISCR) served as one of LLNL's main bridges to the academic community with a program of collaborative subcontracts, visiting faculty, student internships, workshops, and an active seminar series. 
The ISCR identifies researchers from the academic community for computer science and computational science collaborations with LLNL and hosts them for short- and long-term visits with the aim of encouraging long-term academic research agendas that address LLNL's research priorities. Through such collaborations, ideas and software flow in both directions, and LLNL cultivates its future workforce. The Institute strives to be LLNL's ''eyes and ears'' in the computer and information sciences, keeping the Laboratory aware of and connected to important external advances. It also attempts to be the ''feet and hands'' that carry those advances into the Laboratory and incorporate them into practice. ISCR research participants are integrated into LLNL's Computing and Applied Research (CAR) Department, especially into its Center for Applied Scientific Computing (CASC). In turn, these organizations address computational challenges arising throughout the rest of the Laboratory. Administratively, the ISCR flourishes under LLNL's University Relations Program (URP). Together with the other five institutes of the URP, it navigates a course that allows LLNL to benefit from academic exchanges while preserving national security. While it is difficult to operate an academic-like research enterprise within the context of a national security laboratory, the results declare the challenges well met and worth the continued effort.

  2. First Principles Approach to the Magnetocaloric Effect: Application to Ni2MnGa

    NASA Astrophysics Data System (ADS)

    Odbadrakh, Khorgolkhuu; Nicholson, Don; Rusanu, Aurelian; Eisenbach, Markus; Brown, Gregory; Evans, Boyd, III

    2011-03-01

    The magneto-caloric effect (MCE) has potential application in heating and cooling technologies. In this work, we present the calculated magnetic structure of a candidate MCE material, Ni2MnGa. The magnetic configurations of a 144-atom supercell are first explored using first-principles calculations; the results are then used to fit the exchange parameters of a Heisenberg Hamiltonian. The Wang-Landau method is used to calculate the magnetic density of states of the Heisenberg Hamiltonian. Based on this classical estimate, the magnetic density of states is then recalculated using the Wang-Landau method with energies obtained directly from the first-principles method. The Curie temperature and other thermodynamic properties are calculated from the density of states. The relationships between the density of magnetic states and the field-induced adiabatic temperature change and isothermal entropy change are discussed. This work was sponsored by the Laboratory Directed Research and Development Program (ORNL), by the Mathematical, Information, and Computational Sciences Division; Office of Advanced Scientific Computing Research (US DOE), and by the Materials Sciences and Engineering Division; Office of Basic Energy Sciences (US DOE).
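
    The Wang-Landau flat-histogram idea referenced in this abstract can be illustrated on a much simpler spin system. The sketch below estimates ln g(E) for a small 2D Ising model rather than the Ni2MnGa Heisenberg Hamiltonian of the study; the lattice size, flatness criterion, sweep length, and modification-factor schedule are all illustrative choices, not the authors' settings.

```python
import math
import random

def wang_landau_ising(L=4, f_final=1.01, flat=0.8, seed=1):
    """Estimate the log density of states ln g(E) of a periodic 2D Ising
    ferromagnet via Wang-Landau sampling (illustrative sketch only)."""
    random.seed(seed)
    N = L * L
    spins = [[1] * L for _ in range(L)]
    E = -2 * N                       # energy of the all-up configuration
    lnf = 1.0                        # modification factor, halved on flatness
    lng = {E: lnf}                   # running estimate of ln g(E)
    hist = {E: 1}                    # visit histogram for the flatness test
    while lnf > math.log(f_final):
        for _ in range(1000 * N):
            i, j = random.randrange(L), random.randrange(L)
            # Energy change from flipping spin (i, j), periodic boundaries.
            nn = (spins[(i + 1) % L][j] + spins[(i - 1) % L][j]
                  + spins[i][(j + 1) % L] + spins[i][(j - 1) % L])
            Enew = E + 2 * spins[i][j] * nn
            # Accept with probability min(1, g(E)/g(Enew)).
            diff = lng.get(E, 0.0) - lng.get(Enew, 0.0)
            if diff >= 0 or random.random() < math.exp(diff):
                spins[i][j] *= -1
                E = Enew
            lng[E] = lng.get(E, 0.0) + lnf
            hist[E] = hist.get(E, 0) + 1
        # If the histogram is roughly flat, refine f and start a new stage.
        if min(hist.values()) > flat * (sum(hist.values()) / len(hist)):
            lnf /= 2
            hist = {}
    return lng
```

Thermodynamic quantities such as the Curie temperature then follow from Boltzmann-weighted sums over the estimated g(E), which is the step the abstract describes for the realistic Hamiltonian.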

  3. Density functional theory in materials science.

    PubMed

    Neugebauer, Jörg; Hickel, Tilmann

    2013-09-01

    Materials science is a highly interdisciplinary field. It is devoted to the understanding of the relationship between (a) fundamental physical and chemical properties governing processes at the atomistic scale with (b) typically macroscopic properties required of materials in engineering applications. For many materials, this relationship is not only determined by chemical composition, but strongly governed by microstructure. The latter is a consequence of carefully selected process conditions (e.g., mechanical forming and annealing in metallurgy or epitaxial growth in semiconductor technology). A key task of computational materials science is to unravel the often hidden composition-structure-property relationships using computational techniques. The present paper does not aim to give a complete review of all aspects of materials science. Rather, we will present the key concepts underlying the computation of selected material properties and discuss the major classes of materials to which they are applied. Specifically, our focus will be on methods used to describe single or polycrystalline bulk materials of semiconductor, metal or ceramic form.

  4. US Department of Energy High School Student Supercomputing Honors Program: A follow-up assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1987-01-01

    The US DOE High School Student Supercomputing Honors Program was designed to recognize high school students with superior skills in mathematics and computer science and to provide them with formal training and experience with advanced computer equipment. This document reports on the participants who attended the first such program, which was held at the National Magnetic Fusion Energy Computer Center at the Lawrence Livermore National Laboratory (LLNL) during August 1985.

  5. Applied Mathematics at the U.S. Department of Energy: Past, Present and a View to the Future

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, D L; Bell, J; Estep, D

    2008-02-15

    Over the past half-century, the Applied Mathematics program in the U.S. Department of Energy's Office of Advanced Scientific Computing Research has made significant, enduring advances in applied mathematics that have been essential enablers of modern computational science. Motivated by the scientific needs of the Department of Energy and its predecessors, advances have been made in mathematical modeling, numerical analysis of differential equations, optimization theory, mesh generation for complex geometries, adaptive algorithms and other important mathematical areas. High-performance mathematical software libraries developed through this program have contributed as much or more to the performance of modern scientific computer codes as the high-performance computers on which these codes run. The combination of these mathematical advances and the resulting software has enabled high-performance computers to be used for scientific discovery in ways that could only be imagined at the program's inception. Our nation, and indeed our world, face great challenges that must be addressed in coming years, and many of these will be addressed through the development of scientific understanding and engineering advances yet to be discovered. The U.S. Department of Energy (DOE) will play an essential role in providing science-based solutions to many of these problems, particularly those that involve the energy, environmental and national security needs of the country. As the capability of high-performance computers continues to increase, the types of questions that can be answered by applying this huge computational power become more varied and more complex. It will be essential that we find new ways to develop and apply the mathematics necessary to enable the new scientific and engineering discoveries that are needed.
In August 2007, a panel of experts in applied, computational and statistical mathematics met for a day and a half in Berkeley, California to understand the mathematical developments required to meet the future science and engineering needs of the DOE. It is important to emphasize that the panelists were not asked to speculate only on advances that might be made in their own research specialties. Instead, the guidance this panel was given was to consider the broad science and engineering challenges that the DOE faces and identify the corresponding advances that must occur across the field of mathematics for these challenges to be successfully addressed. As preparation for the meeting, each panelist was asked to review strategic planning and other informational documents available for one or more of the DOE Program Offices, including the Offices of Science, Nuclear Energy, Fossil Energy, Environmental Management, Legacy Management, Energy Efficiency & Renewable Energy, Electricity Delivery & Energy Reliability and Civilian Radioactive Waste Management as well as the National Nuclear Security Administration. The panelists reported on science and engineering needs for each of these offices, and then discussed and identified mathematical advances that will be required if these challenges are to be met. A review of DOE challenges in energy, the environment and national security brings to light a broad and varied array of questions that the DOE must answer in the coming years. A representative subset of such questions includes: (1) Can we predict the operating characteristics of a clean coal power plant? (2) How stable is the plasma containment in a tokamak? (3) How quickly is climate change occurring and what are the uncertainties in the predicted time scales? (4) How quickly can an introduced bio-weapon contaminate the agricultural environment in the US? 
(5) How do we modify models of the atmosphere and clouds to incorporate newly collected data, possibly of new types? (6) How quickly can the United States recover if part of the power grid becomes inoperable? (7) What are optimal locations and communication protocols for sensing devices in a remote-sensing network? (8) How can new materials be designed with a specified desirable set of properties? In comparing and contrasting these and other questions of importance to DOE, the panel found that while the scientific breadth of the requirements is enormous, a central theme emerges: Scientists are being asked to identify or provide technology, or to give expert analysis that informs policy-makers, and this requires the scientific understanding of increasingly complex physical and engineered systems. In addition, as the complexity of the systems of interest increases, neither experimental observation nor mathematical and computational modeling alone can access all components of the system over the entire range of scales or conditions needed to provide the required scientific understanding.

  6. National resource for computation in chemistry, phase I: evaluation and recommendations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1980-05-01

    The National Resource for Computation in Chemistry (NRCC) was inaugurated at the Lawrence Berkeley Laboratory (LBL) in October 1977, with joint funding by the Department of Energy (DOE) and the National Science Foundation (NSF). The chief activities of the NRCC include: assembling a staff of eight postdoctoral computational chemists, establishing an office complex at LBL, purchasing a midi-computer and graphics display system, administering grants of computer time, conducting nine workshops in selected areas of computational chemistry, compiling a library of computer programs with adaptations and improvements, initiating a software distribution system, providing user assistance and consultation on request. This report presents assessments and recommendations of an Ad Hoc Review Committee appointed by the DOE and NSF in January 1980. The recommendations are that NRCC should: (1) not fund grants for computing time or research but leave that to the relevant agencies, (2) continue the Workshop Program in a mode similar to Phase I, (3) abandon in-house program development and establish instead a competitive external postdoctoral program in chemistry software development administered by the Policy Board and Director, and (4) not attempt a software distribution system (leaving that function to the QCPE). Furthermore, (5) DOE should continue to make its computational facilities available to outside users (at normal cost rates) and should find some way to allow the chemical community to gain occasional access to a CRAY-level computer.

  7. The Importance of Simulation Workflow and Data Management in the Accelerated Climate Modeling for Energy Project

    NASA Astrophysics Data System (ADS)

    Bader, D. C.

    2015-12-01

    The Accelerated Climate Modeling for Energy (ACME) Project is concluding its first year. Supported by the Office of Science in the U.S. Department of Energy (DOE), its vision is to be "an ongoing, state-of-the-science Earth system modeling, simulation and prediction project that optimizes the use of DOE laboratory resources to meet the science needs of the nation and the mission needs of DOE." Included in the "laboratory resources" is a large investment in computational, network and information technologies that will be utilized both to build better and more accurate climate models and to broadly disseminate the data they generate. Current model diagnostic analysis and data dissemination technologies will not scale to the size of the simulations and the complexity of the models envisioned by ACME and other top-tier international modeling centers. In this talk, the ACME Workflow component's plans to meet these future needs will be described and early implementation examples will be highlighted.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    van der Eide, Edwin F.; Yang, Ping; Walter, Eric D.

    Unlike the very labile, unobservable radical cations [{CpM(CO)₃}₂]•⁺ (M = W, Mo), derivatives [{CpM(CO)₂(PMe₃)}₂]•⁺ are stable enough to be isolated and characterized. Experimental and theoretical studies show that the shortened M-M bonds are of order 1.5, and that they are not supported by bridging ligands. The unpaired electron is fully delocalized, with a spin density of ca. 45% on each metal atom. We thank the U.S. Department of Energy (DOE), Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Biosciences and Geosciences for support of this work. Pacific Northwest National Laboratory (PNNL) is a multiprogram national laboratory operated for DOE by Battelle. The EPR and computational studies were performed using EMSL, a national scientific user facility sponsored by the DOE's Office of Biological and Environmental Research and located at PNNL. We thank Dr. Charles Windisch for access to his UV-Vis-NIR spectrometer.

  9. 2008 ALCF annual report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Drugan, C.

    2009-12-07

    The word 'breakthrough' aptly describes the transformational science and milestones achieved at the Argonne Leadership Computing Facility (ALCF) throughout 2008. The number of research endeavors undertaken at the ALCF through the U.S. Department of Energy's (DOE) Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program grew from 9 in 2007 to 20 in 2008. The allocation of computer time awarded to researchers on the Blue Gene/P also spiked significantly - from nearly 10 million processor hours in 2007 to 111 million in 2008. To support this research, we expanded the capabilities of Intrepid, an IBM Blue Gene/P system at the ALCF, to 557 teraflops (TF) for production use. Furthermore, we enabled breakthrough levels of productivity and capability in visualization and data analysis with Eureka, a powerful installation of NVIDIA Quadro Plex S4 external graphics processing units. Eureka delivered a quantum leap in visual compute density, providing more than 111 TF and more than 3.2 terabytes of RAM. On April 21, 2008, the dedication of the ALCF realized DOE's vision to bring the power of the Department's high performance computing to open scientific research. In June, the IBM Blue Gene/P supercomputer at the ALCF debuted as the world's fastest for open science and third fastest overall. No question that the science benefited from this growth and system improvement. Four research projects spearheaded by Argonne National Laboratory computer scientists and ALCF users were named to the list of top ten scientific accomplishments supported by DOE's Advanced Scientific Computing Research (ASCR) program. Three of the top ten projects used extensive grants of computing time on the ALCF's Blue Gene/P to model the molecular basis of Parkinson's disease, design proteins at atomic scale, and create enzymes. As the year came to a close, the ALCF was recognized with several prestigious awards at SC08 in November.
We provided resources for Linear Scaling Divide-and-Conquer Electronic Structure Calculations for Thousand Atom Nanostructures, a collaborative effort between Argonne, Lawrence Berkeley National Laboratory, and Oak Ridge National Laboratory that received the ACM Gordon Bell Prize Special Award for Algorithmic Innovation. The ALCF also was named a winner in two of the four categories in the HPC Challenge best performance benchmark competition.« less

  10. Reverse engineering the world: a commentary on Hoffman, Singh, and Prakash, "The interface theory of perception".

    PubMed

    Fields, Chris

    2015-12-01

    Does perception hide the truth? Information theory, computer science, and quantum theory all suggest that the answer is "yes." They suggest, indeed, that useful perception is only feasible because the truth can be hidden.

  11. Does Cloud Computing in the Atmospheric Sciences Make Sense? A case study of hybrid cloud computing at NASA Langley Research Center

    NASA Astrophysics Data System (ADS)

    Nguyen, L.; Chee, T.; Minnis, P.; Spangenberg, D.; Ayers, J. K.; Palikonda, R.; Vakhnin, A.; Dubois, R.; Murphy, P. R.

    2014-12-01

    The processing, storage and dissemination of satellite cloud and radiation products produced at NASA Langley Research Center are key activities for the Climate Science Branch. A constellation of systems operates in sync to accomplish these goals. Because of the complexity involved with operating such intricate systems, there are both high failure rates and high costs for hardware and system maintenance. Cloud computing has the potential to ameliorate cost and complexity issues. Over time, the cloud computing model has evolved and hybrid systems comprising off-site as well as on-site resources are now common. Towards our mission of providing the highest quality research products to the widest audience, we have explored the use of the Amazon Web Services (AWS) Cloud and Storage and present a case study of our results and efforts. This project builds upon NASA Langley Cloud and Radiation Group's experience with operating large and complex computing infrastructures in a reliable and cost effective manner to explore novel ways to leverage cloud computing resources in the atmospheric science environment. Our case study presents the project requirements and then examines the fit of AWS with the LaRC computing model. We also discuss the evaluation metrics, feasibility, and outcomes and close the case study with the lessons we learned that would apply to others interested in exploring the implementation of the AWS system in their own atmospheric science computing environments.

  12. ALCF Data Science Program: Productive Data-centric Supercomputing

    NASA Astrophysics Data System (ADS)

    Romero, Nichols; Vishwanath, Venkatram

    The ALCF Data Science Program (ADSP) is targeted at big data science problems that require leadership computing resources. The goal of the program is to explore and improve a variety of computational methods that will enable data-driven discoveries across all scientific disciplines. The projects will focus on data science techniques covering a wide area of discovery including but not limited to uncertainty quantification, statistics, machine learning, deep learning, databases, pattern recognition, image processing, graph analytics, data mining, real-time data analysis, and complex and interactive workflows. Project teams will be among the first to access Theta, ALCF's forthcoming 8.5-petaflops Intel/Cray system. The program will transition to the 200-petaflops Aurora supercomputing system when it becomes available. In 2016, four projects have been selected to kick off the ADSP. The selected projects span experimental and computational sciences and range from modeling the brain to discovering new materials for solar-powered windows to simulating collision events at the Large Hadron Collider (LHC). The program will have a regular call for proposals, with the next call expected in Spring 2017 (http://www.alcf.anl.gov/alcf-data-science-program). This research used resources of the ALCF, which is a DOE Office of Science User Facility supported under Contract DE-AC02-06CH11357.

  13. ONRASIA Scientific Information Bulletin. Volume 8, Number 3, July-September 1993

    DTIC Science & Technology

    1993-09-01

    the Ninth Symposium on Preconditioned Conjugate Gradient Methods, which he organized ... Dr. Steven F. Ashby, Computing Sciences Department ... Preconditioned Conjugate Gradient Methods, held at Keio University (Yokohama). During this meeting, I discussed iterative methods for linear systems ... and is currently a topic of considerable interest in the United States. In Japan, on the other hand, this technique does not appear to be too well

  14. Emerging Science And Technologies: Securing The Nation Through Discovery and Innovation

    DTIC Science & Technology

    2013-04-01

    potential material for use in quantum computing and spintronics. R&D in the area of advanced carbon-based materials has the potential to revolutionize... seem to involve a dual-approach strategy. First, the vast majority of our sensory input information does not reach the level of consciousness ... Relevant technology areas that support Protection of the Intelligence Enterprise include: Quantum Computing and Associated

  15. Contention Bounds for Combinations of Computation Graphs and Network Topologies

    DTIC Science & Technology

    2014-08-08

    member of STARnet, a Semiconductor Research Corporation program sponsored by MARCO and DARPA, and ASPIRE Lab industrial sponsors and affiliates Intel... Google, Nokia, NVIDIA, Oracle, MathWorks and Samsung. Also funded by U.S. DOE Office of Science, Office of Advanced Scientific Computing Research... DARPA Award Number HR0011-12-2-0016, the Center for Future Architecture Research, a member of STARnet, a Semiconductor Research Corporation

  16. A Combined Experimental and Computational Study on the Stability of Nanofluids Containing Metal Organic Frameworks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Annapureddy, Harsha Vardhan Reddy; Nune, Satish K.; Motkuri, Radha K.

    2015-01-08

    Computational studies on nanofluids composed of metal organic frameworks (MOFs) were performed using molecular modeling techniques. Grand Canonical Monte Carlo (GCMC) simulations were used to study adsorption behavior of 1,1,1,3,3-pentafluoropropane (R-245fa) in a MIL-101 MOF at various temperatures. To understand the stability of the nanofluid composed of MIL-101 particles, we performed molecular dynamics simulations to compute potentials of mean force between hypothetical MIL-101 fragments terminated with two different kinds of modulators in R-245fa and water. Our computed potentials of mean force indicate that the MOF particles tend to disperse better in water than in R-245fa. The reasons for this observation were analyzed and discussed. Our results agree with experiment, indicating that the employed potential models and modeling approaches provide a good description of the molecular interactions and are reliable. Work performed by LXD was supported by the U.S. Department of Energy (DOE), Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences, and Biosciences. Work performed by HVRA, SKN, RKM, and PBM was supported by the Office of Energy Efficiency and Renewable Energy, Geothermal Technologies Program. Pacific Northwest National Laboratory is a multiprogram national laboratory operated for DOE by Battelle.
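
    The potential-of-mean-force quantity used in this abstract has a simple one-dimensional analogue: given a radial distribution function g(r) between two species, the free-energy profile along r is W(r) = -kT ln g(r). The sketch below applies that relation to illustrative g(r) values; the function name and the sample numbers are invented for the example and are not the MIL-101/R-245fa data of the study.

```python
import math

K_B = 0.0019872041  # Boltzmann constant in kcal/(mol K)

def pmf_from_rdf(g_vals, temperature=300.0):
    """Potential of mean force W(r) = -kT ln g(r), in kcal/mol.

    g_vals: radial distribution function sampled on some r grid.
    Entries with g = 0 (never-visited separations) map to +inf."""
    kT = K_B * temperature
    return [-kT * math.log(g) if g > 0 else float("inf") for g in g_vals]

# A peak in g(r) (g > 1) corresponds to a free-energy well (W < 0),
# i.e. particles that prefer to aggregate; a depleted region (g < 1)
# corresponds to a barrier (W > 0), i.e. better dispersion.
profile = pmf_from_rdf([0.2, 1.0, 2.5, 1.1])
```

A deeper fragment-fragment well in one solvent than another is exactly the signature the authors use to argue that MIL-101 particles disperse better in water than in R-245fa.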

  17. An information technology emphasis in biomedical informatics education.

    PubMed

    Kane, Michael D; Brewer, Jeffrey L

    2007-02-01

    Unprecedented growth in the interdisciplinary domain of biomedical informatics reflects the recent advancements in genomic sequence availability, high-content biotechnology screening systems, as well as the expectations of computational biology to command a leading role in drug discovery and disease characterization. These forces have moved much of life sciences research almost completely into the computational domain. Importantly, educational training in biomedical informatics has been limited to students enrolled in the life sciences curricula, yet many of the skills needed to succeed in biomedical informatics involve or augment training in information technology curricula. This manuscript describes the methods and rationale for training students enrolled in information technology curricula in the field of biomedical informatics. The approach augments the existing information technology curriculum, provides training on specific subjects in biomedical informatics not emphasized in the bioinformatics courses offered in life science programs, and does not require prerequisite courses in the life sciences.

  18. Bethune-Cookman University STEM Research Lab. DOE Renovation Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thompson, Herbert W.

    DOE funding was used to renovate 4,500 square feet of aging laboratories and classrooms that support science, engineering, and mathematics disciplines (specifically environmental science and computer engineering). The expansion of the labs was needed to support robotics and environmental science research, and to better accommodate a wide variety of teaching situations. The renovated space includes a robotics laboratory, two multi-use labs, safe spaces for the storage of instrumentation, modern ventilation equipment, and other “smart” learning venues. The renovated areas feature technologies that are environmentally friendly with reduced energy costs. A campus showcase, the laboratories are a reflection of the University’s commitment to the environment and research as a tool for teaching. As anticipated, the labs facilitate the exploration of emerging technologies that are compatible with local and regional economic plans.

  19. Laboratory Directed Research and Development Annual Report for 2009

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hughes, Pamela J.

    This report documents progress made on all LDRD-funded projects during fiscal year 2009. As a US Department of Energy (DOE) Office of Science (SC) national laboratory, Pacific Northwest National Laboratory (PNNL) has an enduring mission to bring molecular and environmental sciences and engineering strengths to bear on DOE missions and national needs. Their vision is to be recognized worldwide and valued nationally for leadership in accelerating the discovery and deployment of solutions to challenges in energy, national security, and the environment. To achieve this mission and vision, they provide distinctive, world-leading science and technology in: (1) the design and scalable synthesis of materials and chemicals; (2) climate change science and emissions management; (3) efficient and secure electricity management from generation to end use; and (4) signature discovery and exploitation for threat detection and reduction. PNNL leadership also extends to operating EMSL: the Environmental Molecular Sciences Laboratory, a national scientific user facility dedicated to providing integrated experimental and computational resources for discovery and technological innovation in the environmental molecular sciences.

  20. Stretching the Traditional Notion of Experiment in Computing: Explorative Experiments.

    PubMed

    Schiaffonati, Viola

    2016-06-01

    Experimentation represents today a 'hot' topic in computing. While experiments made with the support of computers, such as computer simulations, have received increasing attention from philosophers of science and technology, questions such as "what does it mean to do experiments in computer science and engineering, and what are their benefits?" emerged only recently as central in the debate over the field's disciplinary status. In this work we aim at showing, also by means of paradigmatic examples, how the traditional notion of controlled experiment should be revised to take into account a part of the experimental practice in computing along the lines of experimentation as exploration. Taking inspiration from the discussion on exploratory experimentation in the philosophy of science (experimentation that is not theory-driven), we advance the idea of explorative experiments that, although not new, can contribute to enlarging the debate about the nature and role of experimental methods in computing. In order to further refine this concept we recast explorative experiments as socio-technical experiments that test new technologies in their socio-technical contexts. We suggest that, when experiments are explorative, control should be understood in an a posteriori form, in opposition to the a priori form that usually takes place in traditional experimental contexts.

  1. Opening Comments: SciDAC 2009

    NASA Astrophysics Data System (ADS)

    Strayer, Michael

    2009-07-01

    Welcome to San Diego and the 2009 SciDAC conference. Over the next four days, I would like to present an assessment of the SciDAC program. We will look at where we've been, how we got to where we are and where we are going in the future. Our vision is to be first in computational science, to be best in class in modeling and simulation. When Ray Orbach asked me what I would do, in my job interview for the SciDAC Director position, I said we would achieve that vision. And with our collective dedicated efforts, we have managed to achieve this vision. In the last year, we have gained the most powerful supercomputer for open science: Jaguar, the Cray XT system at the Oak Ridge Leadership Computing Facility (OLCF). We also have NERSC, probably the best-in-the-world program for productivity in science that the Office of Science so depends on. And the Argonne Leadership Computing Facility offers architectural diversity with its IBM Blue Gene/P system as a counterbalance to Oak Ridge. There is also ESnet, which is often understated—the 40 gigabit per second dual backbone ring that connects all the labs and many DOE sites. In the President's Recovery Act funding, there is exciting news that ESnet is going to build out to a 100 gigabit per second network using new optical technologies. This is very exciting news for simulations and large-scale scientific facilities. But as one noted SciDAC luminary said, it's not all about the computers—it's also about the science—and we are also achieving our vision in this area. Together with having the fastest supercomputer for science, at the SC08 conference, SciDAC researchers won two ACM Gordon Bell Prizes for the outstanding performance of their applications. The DCA++ code, which solves some very interesting problems in materials, achieved a sustained performance of 1.3 petaflops, an astounding result and a mark I suspect will last for some time.
The LS3DF application for studying nanomaterials also required the development of a new and novel algorithm to produce results up to 400 times faster than a similar application, and was recognized with a prize for algorithm innovation—a remarkable achievement. Day one of our conference will include examples of petascale science enabled at the OLCF. Although Jaguar has not been officially commissioned, it has gone through its acceptance tests, and during its shakedown phase there have been pioneer applications used for the acceptance tests, and they are running at scale. These include applications in the areas of astrophysics, biology, chemistry, combustion, fusion, geosciences, materials science, nuclear energy and nuclear physics. We also have a whole compendium of science we do at our facilities; these have been documented and reviewed at our last SciDAC conference. Many of these were highlighted in our Breakthroughs Report. One session at this week's conference will feature a cross-section of these breakthroughs. In the area of scalable electromagnetic simulations, the Auxiliary-space Maxwell Solver (AMS) uses specialized finite element discretizations and multigrid-based techniques, which decompose the original problem into easier-to-solve subproblems. Congratulations to the mathematicians on this. Another application on the list of breakthroughs was the authentication of PETSc, which provides scalable solvers used in many DOE applications and has solved problems with over 3 billion unknowns and scaled to over 16,000 processors on DOE leadership-class computers. This is becoming a very versatile and useful toolkit to achieve performance at scale. With the announcement of SIAM's first class of Fellows, we are remarkably well represented. Of the group of 191, more than 40 of these Fellows are in the 'DOE space.' We are so delighted that SIAM has recognized them for their many achievements. 
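
    The scalable solvers mentioned above, such as those in PETSc, are built around Krylov subspace iterations whose core is quite small. Below is a minimal, illustrative conjugate-gradient solver in pure Python (PETSc itself is a large parallel C library with preconditioning, distributed data structures and much more), applied to a tiny symmetric positive-definite test system; all names here are invented for the example.

```python
def conjugate_gradient(matvec, b, tol=1e-10, max_iter=1000):
    """Solve A x = b for symmetric positive-definite A, given only a
    function computing the matrix-vector product A @ p."""
    x = [0.0] * len(b)
    r = list(b)                      # residual b - A x (x starts at zero)
    p = list(r)                      # current search direction
    rs = sum(ri * ri for ri in r)
    for _ in range(max_iter):
        Ap = matvec(p)
        alpha = rs / sum(pi * api for pi, api in zip(p, Ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new < tol * tol:       # residual small enough: converged
            break
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x

# 1D Laplacian (tridiagonal 2, -1), a standard SPD test matrix.
def laplacian_1d(p):
    n = len(p)
    return [2 * p[i] - (p[i - 1] if i > 0 else 0)
            - (p[i + 1] if i < n - 1 else 0) for i in range(n)]

x = conjugate_gradient(laplacian_1d, [0.0, 0.0, 4.0])  # true solution [1, 2, 3]
```

Because only `matvec` is needed, the same iteration scales to billions of unknowns when the matrix-vector product is distributed across processors, which is the design point of toolkits like PETSc.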
In the coming months, we will illustrate our leadership in applied math and computer science by looking at our contributions in the areas of programming models, development and performance tools, math libraries, system software, collaboration, and visualization and data analytics. This is a large and diverse portfolio. We have asked for two panels. One, chaired by David Keyes and composed of many of the nation's leading mathematicians, will produce a report on the most significant accomplishments in applied mathematics over the last eight years, taking us back to the start of the SciDAC program. A similar panel in computer science, to be chaired by Kathy Yelick, will identify the computer science accomplishments of the same eight years. These accomplishments are difficult to get a handle on, and I'm looking forward to this report. We will also have a follow-on to our report on breakthroughs in computational science, again reaching back eight years and looking at the many accomplishments under the SciDAC and INCITE programs; this will be chaired by Tony Mezzacappa. So, where are we going with the SciDAC program? It might help to take a look at computational science and how it got started. I go back to Ken Wilson, who made the model and wrote on computational science and computational science education. His model was this: the computational scientist plays the role of the experimentalist, the math and CS researchers play the role of theorists, and the computers themselves are the experimental apparatus. In simulation science, we carry out numerical experiments on the nature of physical and biological systems. Peter Lax, in the same time frame, developed a report on large-scale computing in science and engineering. 
Peter remarked, 'Perhaps the most important applications of scientific computing come not in the solution of old problems, but in the discovery of new phenomena through numerical experimentation.' And in the early years, I think the person who provided the most guidance, the most innovation and the most vision for where the future might lie was Ed Oliver. Ed Oliver died last year. Ed did a number of things in science. He had this personality where he knew exactly what to do, but he preferred to stay out of the limelight so that others could enjoy the fruits of his vision. We in the SciDAC program and the ASCR Facilities are still enjoying the benefits of that vision. We will miss him. Twenty years after Ken Wilson, Ray Orbach laid out the fundamental premise for SciDAC in an interview that appeared in SciDAC Review: 'SciDAC is unique in the world. There isn't any other program like it anywhere else, and it has the remarkable ability to do science by bringing together physical scientists, mathematicians, applied mathematicians, and computer scientists who recognize that computation is not something you do at the end, but rather it needs to be built into the solution of the very problem that one is addressing.' As you look at the Lax report from 1982, it talks about how 'Future significant improvements may have to come from architectures embodying parallel processing elements—perhaps several thousands of processors.' And it continues, 'Research in languages, algorithms and numerical analysis will be crucial in learning to exploit these new architectures fully.' In the early '90s, Sterling, Messina and Smith developed a workshop report on petascale computing and concluded, 'A petaflops computer system will be feasible in two decades, or less, and rely in part on the continual advancement of the semiconductor industry both in speed enhancement and cost reduction through improved fabrication processes.' 
So they were not wrong, and today we are embarking on a forward look at a different scale: the exascale, going to 10^18 flops. In 2007, Stevens, Simon and Zacharia chaired a series of town hall meetings looking at exascale computing, and in their report wrote, 'Exascale computer systems are expected to be technologically feasible within the next 15 years, or perhaps sooner. These systems will push the envelope in a number of important technologies: processor architecture, scale of multicore integration, power management and packaging.' Computing on the Jaguar machine already involves hundreds of thousands of cores, as do the IBM systems that are currently out there. The scale of computing with systems with billions of processors is staggering to me, and I don't know how the software and math folks feel about it. We have now embarked on a road toward extreme scale computing. We have held a series of town hall meetings and are now in the process of holding workshops that address what is called, in DOE speak, 'the mission need': the scientific justification for computing at that scale. We are going to have a total of 13 workshops. The workshops on climate, high energy physics, nuclear physics, fusion, and nuclear energy have been held. The report from the workshop on climate is out and available, and the other reports are being completed. The upcoming workshops are on biology, materials, and chemistry; the workshops that engage science for nuclear security are a partnership between NNSA and ASCR. There are additional workshops on the applied math, computer science, and architecture needed for computing at the exascale. These extreme scale workshops will provide the foundation in our office, the Office of Science, the NNSA and DOE, and we will engage the National Science Foundation and the Department of Defense as partners. We envision a 10-year program for an exascale initiative. 
It will be an integrated R&D program initially—you can think about five years for research and development—that would be in hardware, operating systems, file systems, networking and so on, as well as software for applications. Application software and the operating system and the hardware all need to be bundled in this period so that at the end the system will execute the science applications at scale. We also believe that this process will have to have considerable investment from the manufacturers and vendors to be successful. We have formed laboratory, university and industry working groups to start this process and formed a panel to look at where SciDAC needs to go to compute at the extreme scale, and we have formed an executive committee within the Office of Science and the NNSA to focus on these activities. We will have outreach to DoD in the next few months. We are anticipating a solicitation within the next two years in which we will compete this bundled R&D process. We don't know how we will incorporate SciDAC into extreme scale computing, but we do know there will be many challenges. And as we have shown over the years, we have the expertise and determination to surmount these challenges.
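    To put the ESnet numbers above in perspective, a quick back-of-envelope calculation shows what the upgrade from 40 to 100 gigabits per second means for moving the output of a large simulation campaign. This is ideal link-rate arithmetic only; real transfers see protocol and filesystem overheads.

```python
# Back-of-envelope: time to move a 1 PB dataset over ESnet at the current
# 40 Gb/s backbone versus the planned 100 Gb/s build-out. Idealized
# link-rate arithmetic only.

PETABYTE_BITS = 1e15 * 8.0      # 1 PB (decimal) in bits

def transfer_hours(gigabits_per_second):
    """Hours to move one petabyte at the given line rate."""
    return PETABYTE_BITS / (gigabits_per_second * 1e9) / 3600.0

hours_at_40 = transfer_hours(40.0)     # about 56 hours
hours_at_100 = transfer_hours(100.0)   # about 22 hours
print(f"40 Gb/s: {hours_at_40:.1f} h, 100 Gb/s: {hours_at_100:.1f} h")
```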

  2. Large Scale Many-Body Perturbation Theory calculations: methodological developments, data collections, validation

    NASA Astrophysics Data System (ADS)

    Govoni, Marco; Galli, Giulia

    Green's function based many-body perturbation theory (MBPT) methods are well-established approaches to compute quasiparticle energies and electronic lifetimes. However, their application to large systems - for instance to heterogeneous systems and nanostructured, disordered, and defective materials - has been hindered by high computational costs. We will discuss recent MBPT methodological developments leading to an efficient formulation of electron-electron and electron-phonon interactions that can be applied to systems with thousands of electrons. Results using a formulation that requires neither the explicit calculation of virtual states nor the storage and inversion of large dielectric matrices will be presented. We will discuss data collections obtained using the WEST code, the advantages of the algorithms used in WEST over standard techniques, and the parallel performance. Work done in collaboration with I. Hamada, R. McAvoy, P. Scherpelz, and H. Zheng. This work was supported by MICCoM, as part of the Computational Materials Sciences Program funded by the U.S. Department of Energy, Office of Science, Basic Energy Sciences, Materials Sciences and Engineering Division, and by ANL.

  3. Community Petascale Project for Accelerator Science And Simulation: Advancing Computational Science for Future Accelerators And Accelerator Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spentzouris, Panagiotis (Fermilab); Cary, John

    The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modeling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single-physics-process modeling (covered under SciDAC-1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors.

  4. Journal of Undergraduate Research, Volume IX, 2009

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stiner, K. S.; Graham, S.; Khan, M.

    Each year more than 600 undergraduate students are awarded paid internships at the Department of Energy’s (DOE) National Laboratories. These interns are paired with research scientists who serve as mentors in authentic research projects. All participants write a research abstract and present at a poster session and/or complete a full-length research paper. Abstracts and selected papers from our 2007–2008 interns that represent the breadth and depth of undergraduate research performed each year at our National Laboratories are published here in the Journal of Undergraduate Research. The fields in which these students worked included: Biology; Chemistry; Computer Science; Engineering; Environmental Science; General Science; Materials Science; Medical and Health Sciences; Nuclear Science; Physics; Science Policy; and Waste Management.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mattsson, Ann E.

    Density Functional Theory (DFT) based Equation of State (EOS) construction is a prominent part of Sandia’s capabilities to support engineering sciences. This capability is based on augmenting experimental data with information gained from computational investigations, especially in those parts of the phase space where experimental data is hard, dangerous, or expensive to obtain. A key part of the success of the Sandia approach is the fundamental science work supporting the computational capability. Not only does this work enhance the capability to perform highly accurate calculations, but it also provides crucial insight into the limitations of the computational tools, providing high confidence in the results even where results cannot be, or have not yet been, validated by experimental data. This report concerns the key ingredient of projector augmented-wave (PAW) potentials for use in pseudopotential computational codes. Using the tools discussed in SAND2012-7389, we assess the standard Vienna Ab-initio Simulation Package (VASP) PAWs for molybdenum.

  6. Enabling Extreme Scale Earth Science Applications at the Oak Ridge Leadership Computing Facility

    NASA Astrophysics Data System (ADS)

    Anantharaj, V. G.; Mozdzynski, G.; Hamrud, M.; Deconinck, W.; Smith, L.; Hack, J.

    2014-12-01

    The Oak Ridge Leadership Computing Facility (OLCF), established at the Oak Ridge National Laboratory (ORNL) under the auspices of the U.S. Department of Energy (DOE), welcomes investigators from universities, government agencies, national laboratories and industry who are prepared to perform breakthrough research across a broad domain of scientific disciplines, including earth and space sciences. Titan, the OLCF flagship system, is currently listed as #2 in the Top500 list of supercomputers in the world, and is the largest available for open science. The computational resources are allocated primarily via the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program, sponsored by the U.S. DOE Office of Science. In 2014, over 2.25 billion core hours on Titan were awarded via INCITE projects, including 14% of the allocation toward earth sciences. The INCITE competition is also open to research scientists based outside the USA; in fact, international research projects account for 12% of the INCITE awards in 2014, and the INCITE scientific review panel includes 20% participation from international experts. Recent accomplishments in earth sciences at OLCF include the world's first continuous simulation of 21,000 years of earth's climate history (2009) and an unprecedented simulation of a magnitude 8 earthquake over 125 sq. miles. One of the ongoing international projects involves scaling the ECMWF Integrated Forecasting System (IFS) model to over 200K cores of Titan. ECMWF is a partner in the EU-funded Collaborative Research into Exascale Systemware, Tools and Applications (CRESTA) project. The significance of the research carried out within this project is the demonstration of techniques required to scale current-generation Petascale-capable simulation codes towards the performance levels required for running on future Exascale systems. 
One of the techniques pursued by ECMWF is to use Fortran 2008 coarrays to overlap computations and communications and to reduce the total volume of data communicated. Use of Titan has enabled ECMWF to plan future scalability developments and resource requirements. We will also discuss the best practices developed over the years in navigating the logistical, legal and regulatory hurdles involved in supporting the facility's diverse user community.
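    The overlap technique described above, posting the halo exchange and computing independent work while it completes, is language-agnostic. The sketch below is purely illustrative, with Python threads standing in for Fortran 2008 coarray transfers; it is not IFS code, and all names and values are invented.

```python
# The overlap pattern: post the data exchange, do computation that does
# not depend on it, then wait for the exchange before the dependent step.
import threading
import time

def exchange_halo(send_buf, inbox):
    """Pretend remote-image transfer: deliver neighbor data after a delay."""
    time.sleep(0.05)                       # simulated network latency
    inbox["halo"] = [x + 1 for x in send_buf]

def step():
    interior = list(range(1000))
    inbox = {}
    comm = threading.Thread(target=exchange_halo, args=([7, 8, 9], inbox))
    comm.start()                                  # 1) post the communication
    interior_sum = sum(x * x for x in interior)   # 2) overlap independent work
    comm.join()                                   # 3) wait before dependent step
    return interior_sum + sum(inbox["halo"])

total = step()
```

    When the independent computation takes at least as long as the transfer, the communication cost is effectively hidden, which is the payoff the IFS scaling work is after.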

  7. Peer Collaboration: The Relation of Regulatory Behaviors to Learning with Hypermedia

    ERIC Educational Resources Information Center

    Winters, Fielding I.; Alexander, Patricia A.

    2011-01-01

    Peer collaboration is a pedagogical method currently used to facilitate learning in classrooms. Similarly, computer-learning environments (CLEs) are often used to promote student learning in science classrooms, in particular. However, students often have difficulty utilizing these environments effectively. Does peer collaboration help students…

  8. OPENING REMARKS: SciDAC: Scientific Discovery through Advanced Computing

    NASA Astrophysics Data System (ADS)

    Strayer, Michael

    2005-01-01

    Good morning. Welcome to SciDAC 2005 and San Francisco. SciDAC is all about computational science and scientific discovery. In a large sense, computational science characterizes SciDAC, and its intent is change. It transforms both our approach to and our understanding of science. It opens new doors and crosses traditional boundaries while seeking discovery. In terms of twentieth century methodologies, computational science may be said to be transformational. There are a number of examples to this point. First are the sciences that encompass climate modeling. The application of computational science has in essence created the field of climate modeling. This community is now international in scope and has provided precision results that are challenging our understanding of our environment. A second example is that of lattice quantum chromodynamics. Lattice QCD, while adding precision and insight to our fundamental understanding of strong interaction dynamics, has transformed our approach to particle and nuclear science. The individual investigator approach has evolved to teams of scientists from different disciplines working side-by-side towards a common goal. SciDAC is also undergoing a transformation. This meeting is a prime example. Last year it was a small programmatic meeting tracking progress in SciDAC. This year, we have a major computational science meeting with a variety of disciplines and enabling technologies represented. SciDAC 2005 should position itself as a new cornerstone for computational science and its impact on science. As we look to the immediate future, FY2006 will bring a new cycle to SciDAC. Most of the program elements of SciDAC will be re-competed in FY2006. The re-competition will involve new instruments for computational science, new approaches for collaboration, as well as new disciplines. 
There will be new opportunities for virtual experiments in carbon sequestration, fusion, and nuclear power and nuclear waste, as well as collaborations with industry and virtual prototyping. New instruments of collaboration will include institutes and centers, while summer schools, workshops and outreach will invite new talent and expertise. Computational science adds new dimensions to science and its practice. The disciplines of fusion, accelerator science, and combustion are poised to blur the boundaries between pure and applied science. As we open the door into FY2006 we shall see a landscape of new scientific challenges: in biology, chemistry, materials, and astrophysics, to name a few. The enabling technologies of SciDAC have been transformational as drivers of change. Planning for major new software systems assumes a baseline employing Common Component Architectures, which has become a household word for new software projects. While grid algorithms and mesh refinement software have transformed applications software, data management and visualization have transformed our understanding of science from data. The Gordon Bell Prize now seems to be dominated by computational science and solvers developed by the TOPS ISIC. The priorities of the Office of Science in the Department of Energy are clear. The 20-year facilities plan is driven by new science. High performance computing is placed among the two highest priorities. Moore's law says that by the end of the next cycle of SciDAC we shall have petaflop computers. The challenges of petascale computing are enormous. These and the associated computational science are the highest priorities for computing within the Office of Science. Our effort in Leadership Class computing is just a first step towards this goal. Clearly, computational science at this scale will face enormous challenges and possibilities. Performance evaluation and prediction will be critical to unraveling the needed software technologies. 
We must not lose sight of our overarching goal—that of scientific discovery. Science does not stand still, and the landscape of science discovery and computing holds immense promise. In this environment, I believe it is necessary to institute a system of science-based performance metrics to help quantify our progress towards science goals and scientific computing. As a final comment I would like to reaffirm that the shifting landscapes of science will force changes to our computational sciences, and leave you with the quote from Richard Hamming, 'The purpose of computing is insight, not numbers'.

  9. USRA/RIACS

    NASA Technical Reports Server (NTRS)

    Oliger, Joseph

    1992-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on 6 June 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under a cooperative agreement with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. A flexible scientific staff is provided through a university faculty visitor program, a postdoctoral program, and a student visitor program. Not only does this provide appropriate expertise but it also introduces scientists outside of NASA to NASA problems. A small group of core RIACS staff provides continuity and interacts with an ARC technical monitor and scientific advisory group to determine the RIACS mission. RIACS activities are reviewed and monitored by a USRA advisory council and ARC technical monitor. Research at RIACS is currently being done in the following areas: Parallel Computing; Advanced Methods for Scientific Computing; Learning Systems; High Performance Networks and Technology; Graphics, Visualization, and Virtual Environments.

  10. Lattice QCD Application Development within the US DOE Exascale Computing Project

    NASA Astrophysics Data System (ADS)

    Brower, Richard; Christ, Norman; DeTar, Carleton; Edwards, Robert; Mackenzie, Paul

    2018-03-01

    In October 2016, the US Department of Energy launched the Exascale Computing Project, which aims to deploy exascale computing resources for science and engineering in the early 2020s. The project brings together application teams, software developers, and hardware vendors in order to realize this goal. Lattice QCD is one of the applications. Members of the US lattice gauge theory community with significant collaborators abroad are developing algorithms and software for exascale lattice QCD calculations. We give a short description of the project, our activities, and our plans.

  11. Lattice QCD Application Development within the US DOE Exascale Computing Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brower, Richard; Christ, Norman; DeTar, Carleton

    In October 2016, the US Department of Energy launched the Exascale Computing Project, which aims to deploy exascale computing resources for science and engineering in the early 2020s. The project brings together application teams, software developers, and hardware vendors in order to realize this goal. Lattice QCD is one of the applications. Members of the US lattice gauge theory community with significant collaborators abroad are developing algorithms and software for exascale lattice QCD calculations. We give a short description of the project, our activities, and our plans.

  12. Heterolysis of H2 Across a Classical Lewis Pair, 2,6-Lutidine-BCl3: Synthesis, Characterization, and Mechanism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ginovska-Pangovska, Bojana; Autrey, Thomas; Parab, Kshitij K.

    We report on a combined computational and experimental study of the activation of hydrogen using 2,6-lutidine (Lut)/BCl3 Lewis pairs. Herein we describe the synthetic approach used to obtain a new FLP, Lut-BCl3, that activates molecular H2 at ~10 bar and 100 °C in toluene or lutidine as the solvent. The resulting compound is an unexpected neutral hydride, LutBHCl2, rather than the ion pair, which we attribute to ligand redistribution. The mechanism for activation was modeled with density functional theory and accurate G3(MP2)B3 theory. The dative bond in Lut-BCl3 is calculated to have a bond enthalpy of 15 kcal/mol. The separated pair is calculated to react with H2 and form the [LutH+][HBCl3–] ion pair with a barrier of 13 kcal/mol. Metathesis with LutBCl3 produces LutBHCl2 and [LutH][BCl4]. The overall reaction is exothermic by 8.5 kcal/mol. An alternative pathway involving a lutidine–borenium cation pair activating H2 was also explored. This work was supported by the U.S. Department of Energy's (DOE) Office of Basic Energy Sciences, Division of Chemical Sciences, Biosciences, and Geosciences, and was performed in part using the Molecular Science Computing Facility (MSCF) in the William R. Wiley Environmental Molecular Sciences Laboratory, a DOE national scientific user facility sponsored by the Department of Energy's Office of Biological and Environmental Research and located at the Pacific Northwest National Laboratory (PNNL). PNNL is operated by Battelle for DOE.

  13. Computational studies of adsorption in metal organic frameworks and interaction of nanoparticles in condensed phases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Annapureddy, Harsha V.; Motkuri, Radha K.; Nguyen, Phuong T.

    In this review, we describe recent efforts in which computer simulations were used to systematically study nano-structured metal organic frameworks, with particular emphasis on their application in heating and cooling processes. These materials also are known as metal organic heat carriers. We used both molecular dynamics and Grand Canonical Monte Carlo simulation techniques to gain a molecular-level understanding of the adsorption mechanism of gases in these porous materials. We investigated the uptake of various gases such as the refrigerants R12 and R143a and also the elemental gases Xe and Rn by the metal organic framework Ni2(dhtp). We also evaluated the effects of temperature and pressure on the uptake mechanism. Our computed results compared reasonably well with available experimental measurements, thus validating our potential models and approaches. In addition, we also investigated the structural, diffusive, and adsorption properties of different hydrocarbons in Ni2(dhtp). To elucidate the mechanism of nanoparticle dispersion in condensed phases, we also studied the interactions among nanoparticles in various liquids, such as n-hexane, water and methanol. This work was performed at Pacific Northwest National Laboratory (PNNL) and was supported by the Division of Chemical Sciences, Geosciences and Biosciences, Office of Basic Energy Sciences, U.S. Department of Energy (DOE). PNNL is operated by Battelle for the DOE. The authors also gratefully acknowledge support received from the National Energy Technology Laboratory of DOE's Office of Fossil Energy.
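    The Monte Carlo machinery behind adsorption studies like this rests on the Metropolis acceptance rule: accept a trial move with probability min(1, exp(-beta * dU)). As a purely illustrative sketch, a toy canonical-ensemble simulation of a particle in a 1D harmonic well (not the authors' GCMC code; the potential and all parameters are invented) recovers the equipartition result <x^2> = 1/(beta * k):

```python
# Toy Metropolis Monte Carlo for a particle in a 1D harmonic well,
# illustrating the acceptance rule that also underlies Grand Canonical
# Monte Carlo. Invented parameters; not the authors' simulation code.
import math
import random

def metropolis_accept(delta_u, beta, rng):
    """Accept a trial move with probability min(1, exp(-beta * delta_u))."""
    return delta_u <= 0.0 or rng.random() < math.exp(-beta * delta_u)

def sample_harmonic(beta=1.0, k=1.0, steps=200_000, step_size=1.0, seed=1):
    rng = random.Random(seed)
    x, accepted, sum_x2 = 0.0, 0, 0.0
    for _ in range(steps):
        trial = x + rng.uniform(-step_size, step_size)
        delta_u = 0.5 * k * (trial * trial - x * x)   # U(x) = k x^2 / 2
        if metropolis_accept(delta_u, beta, rng):
            x = trial
            accepted += 1
        sum_x2 += x * x
    return sum_x2 / steps, accepted / steps

mean_x2, acceptance = sample_harmonic()   # equipartition: <x^2> = 1/(beta*k)
```

    Grand Canonical Monte Carlo adds particle insertion and deletion moves whose acceptance probabilities depend on the chemical potential, but the accept/reject core shown here is the same.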

  14. ISCR FY2005 Annual Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keyes, D E; McGraw, J R

    2006-02-02

    Large-scale scientific computation and all of the disciplines that support and help validate it have been placed at the focus of Lawrence Livermore National Laboratory (LLNL) by the Advanced Simulation and Computing (ASC) program of the National Nuclear Security Administration (NNSA) and the Scientific Discovery through Advanced Computing (SciDAC) initiative of the Office of Science of the Department of Energy (DOE). The maturation of simulation as a fundamental tool of scientific and engineering research is underscored in the President's Information Technology Advisory Committee (PITAC) June 2005 finding that ''computational science has become critical to scientific leadership, economic competitiveness, and national security''. LLNL operates several of the world's most powerful computers--including today's single most powerful--and has undertaken some of the largest and most compute-intensive simulations ever performed, most notably the molecular dynamics simulation that sustained more than 100 Teraflop/s and won the 2005 Gordon Bell Prize. Ultrascale simulation has been identified as one of the highest priorities in DOE's facilities planning for the next two decades. However, computers at architectural extremes are notoriously difficult to use in an efficient manner. Furthermore, each successful terascale simulation only points out the need for much better ways of interacting with the resulting avalanche of data. Advances in scientific computing research have, therefore, never been more vital to the core missions of LLNL than at present. Computational science is evolving so rapidly along every one of its research fronts that to remain on the leading edge, LLNL must engage researchers at many academic centers of excellence. 
In FY 2005, the Institute for Scientific Computing Research (ISCR) served as one of LLNL's main bridges to the academic community with a program of collaborative subcontracts, visiting faculty, student internships, workshops, and an active seminar series. The ISCR identifies researchers from the academic community for computer science and computational science collaborations with LLNL and hosts them for both brief and extended visits with the aim of encouraging long-term academic research agendas that address LLNL research priorities. Through these collaborations, ideas and software flow in both directions, and LLNL cultivates its future workforce. The Institute strives to be LLNL's ''eyes and ears'' in the computer and information sciences, keeping the Laboratory aware of and connected to important external advances. It also attempts to be the ''hands and feet'' that carry those advances into the Laboratory and incorporate them into practice. ISCR research participants are integrated into LLNL's Computing Applications and Research (CAR) Department, especially into its Center for Applied Scientific Computing (CASC). In turn, these organizations address computational challenges arising throughout the rest of the Laboratory. Administratively, the ISCR flourishes under LLNL's University Relations Program (URP). Together with the other four institutes of the URP, the ISCR navigates a course that allows LLNL to benefit from academic exchanges while preserving national security. While it is difficult to operate an academic-like research enterprise within the context of a national security laboratory, the results declare the challenges well met and worth the continued effort. The pages of this annual report summarize the activities of the faculty members, postdoctoral researchers, students, and guests from industry and other laboratories who participated in LLNL's computational mission under the auspices of the ISCR during FY 2005.

  15. Quantum Information Science: An Update

    NASA Astrophysics Data System (ADS)

    Kwek, L. C.; Zen, Freddy P.

    2016-08-01

    It is now roughly thirty years since the incipient ideas of quantum information science were concretely formalized. Over the last three decades there has been much development in this field, and at least one technology, namely devices for quantum cryptography, is now commercialized. Yet the holy grail of a workable quantum computing machine still lies far away on the horizon. In any case, it took several centuries after the first mechanical calculators were constructed before the vacuum tube was invented, and several more decades for the transistor to bring current computer technology to fruition. In this review, we provide a short survey of the current development and progress in quantum information science. It clearly does not do justice to the amount of work of the past thirty years. Nevertheless, despite the modest attempt, this review hopes to entice younger researchers into this exciting field.

  16. Graduate student theses supported by DOE's Environmental Sciences Division

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cushman, Robert M.; Parra, Bobbi M.

    1995-07-01

    This report provides complete bibliographic citations, abstracts, and keywords for 212 doctoral and master's theses supported fully or partly by the U.S. Department of Energy's Environmental Sciences Division (and its predecessors) in the following areas: Atmospheric Sciences; Marine Transport; Terrestrial Transport; Ecosystems Function and Response; Carbon, Climate, and Vegetation; Information; Computer Hardware, Advanced Mathematics, and Model Physics (CHAMMP); Atmospheric Radiation Measurement (ARM); Oceans; National Institute for Global Environmental Change (NIGEC); Unmanned Aerial Vehicles (UAV); Integrated Assessment; Graduate Fellowships for Global Change; and Quantitative Links. Information on the major professor, department, principal investigator, and program area is given for each abstract. Indexes are provided for major professor, university, principal investigator, program area, and keywords. This bibliography is also available in various machine-readable formats (ASCII text file, WordPerfect® files, and PAPYRUS™ files).

  17. CRD's Daniela Ushizima Receives DOE Early Career Award

    Science.gov Websites

    The award will fund research into developing new methods to help scientists extract more information from their data, applying state-of-the-art data analysis methods with an emphasis on pattern recognition and machine learning.

  18. Quality Improvement: Does the Air Force Systems Command Practice What It Preaches

    DTIC Science & Technology

    1990-03-01

    ...without his assistance in getting supplies, computers, and plotters. Another special thanks goes to my committee chairman, Dr Stephen Blank, who provided... N.J.: Prentice-Hall, 1986), 166. 5. Ibid., 181. 6. Sidney Siegel, Nonparametric Statistics for the Behavioral Sciences (New York: McGraw-Hill, 1956)

  19. Cloudbursting - Solving the 3-body problem

    NASA Astrophysics Data System (ADS)

    Chang, G.; Heistand, S.; Vakhnin, A.; Huang, T.; Zimdars, P.; Hua, H.; Hood, R.; Koenig, J.; Mehrotra, P.; Little, M. M.; Law, E.

    2014-12-01

    Many science projects in the future will be accomplished through collaboration among two or more NASA centers along with, potentially, external scientists. Science teams will be composed of more geographically dispersed individuals and groups. However, the current computing environment does not make such collaboration easy or seamless. By sharing computing resources among members of a multi-center team working on a science/engineering project, limited pre-competition funds could be applied more efficiently, and technical work could be conducted more effectively with less time spent moving data or waiting for computing resources to free up. Based on work from a NASA CIO IT Labs task, this presentation will highlight our prototype work in identifying the feasibility of, and the obstacles (both technical and managerial) to, performing "Cloudbursting" among private clouds located at three different centers. We will demonstrate the use of private cloud computing infrastructure at the Jet Propulsion Laboratory, Langley Research Center, and Ames Research Center to provide elastic computation to each other to perform parallel Earth Science data imaging. We leverage elastic load balancing and auto-scaling features at each data center so that each location can independently define how many resources to allocate to a particular job "bursted" from another data center, and demonstrate that compute capacity scales up and down with the job. We will also discuss future work in the area, which could include the use of cloud infrastructure from different cloud framework providers as well as other cloud service providers.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chow, Edmond

    Solving sparse problems is at the core of many DOE computational science applications. We focus on the challenge of developing sparse algorithms that can fully exploit the parallelism in extreme-scale computing systems, in particular systems with massive numbers of cores per node. Our approach is to express a sparse matrix factorization as a large number of bilinear constraint equations and then solve these equations via an asynchronous iterative method. The unknowns in these equations are the entries of the desired factorization.
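    The abstract describes the approach only at a high level. As an illustrative sketch (not the report's actual implementation), the idea behind fine-grained factorization methods of this kind (in the spirit of Chow and Patel's fine-grained parallel ILU) fits in a few lines: each entry of the factors is the unknown of one bilinear constraint a_ij = sum_k l_ik * u_kj, and all unknowns are updated by repeated sweeps over these equations. The function name and the small dense example below are assumptions for illustration; with a full sparsity pattern the sweeps reproduce the exact LU factors.

```python
import numpy as np

def fixed_point_lu(A, sweeps=5):
    """Hypothetical sketch: treat the entries of L (unit lower-triangular)
    and U (upper-triangular) as unknowns in the bilinear constraints
    a_ij = sum_k l_ik * u_kj, and update each unknown from its own
    constraint in repeated in-place sweeps. In a parallel setting the
    updates would run asynchronously over all entries at once; here
    they are sequential for clarity."""
    n = A.shape[0]
    L = np.eye(n)          # unit diagonal is fixed, not an unknown
    U = np.triu(A).copy()  # initial guess for the upper factor
    for _ in range(sweeps):
        for i in range(n):
            for j in range(n):
                if i > j:   # unknown l_ij, solved from its constraint
                    L[i, j] = (A[i, j] - L[i, :j] @ U[:j, j]) / U[j, j]
                else:       # unknown u_ij, solved from its constraint
                    U[i, j] = A[i, j] - L[i, :i] @ U[:i, j]
    return L, U

# Small diagonally dominant example (assumed for illustration).
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 4.0, 1.0],
              [0.0, 1.0, 4.0]])
L, U = fixed_point_lu(A)
assert np.allclose(L @ U, A)
```

In the sequential ordering used here a single sweep already satisfies all the constraints; the iterative, fixed-point character of the method matters when the updates run asynchronously in parallel and may read stale values of neighboring entries.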

  1. Laboratory Directed Research and Development FY2011 Annual Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Craig, W; Sketchley, J; Kotta, P

    2012-03-22

    A premier applied-science laboratory, Lawrence Livermore National Laboratory (LLNL) has earned a reputation as a leader in providing science and technology solutions to the most pressing national and global security problems. The LDRD Program, established by Congress at all DOE national laboratories in 1991, is LLNL's single most important resource for fostering excellent science and technology for today's needs and tomorrow's challenges. The LDRD internally directed research and development funding at LLNL enables high-risk, potentially high-payoff projects at the forefront of science and technology. The LDRD Program at Livermore serves to: (1) Support the Laboratory's missions, strategic plan, and foundational science; (2) Maintain the Laboratory's science and technology vitality; (3) Promote recruiting and retention; (4) Pursue collaborations; (5) Generate intellectual property; and (6) Strengthen the U.S. economy. Myriad LDRD projects over the years have made important contributions to every facet of the Laboratory's mission and strategic plan, including its commitment to nuclear, global, and energy and environmental security, as well as cutting-edge science, technology, and engineering in high-energy-density matter, high-performance computing and simulation, materials and chemistry at the extremes, information systems, measurements and experimental science, and energy manipulation. A summary of each project was submitted by the principal investigator. Project summaries include the scope, motivation, goals, relevance to DOE/NNSA and LLNL mission areas, the technical progress achieved in FY11, and a list of publications that resulted from the research.
The projects are: (1) Nuclear Threat Reduction; (2) Biosecurity; (3) High-Performance Computing and Simulation; (4) Intelligence; (5) Cybersecurity; (6) Energy Security; (7) Carbon Capture; (8) Material Properties, Theory, and Design; (9) Radiochemistry; (10) High-Energy-Density Science; (11) Laser Inertial-Fusion Energy; (12) Advanced Laser Optical Systems and Applications; (13) Space Security; (14) Stockpile Stewardship Science; (15) National Security; (16) Alternative Energy; and (17) Climatic Change.

  2. Federal Technology Catalog 1982: Summaries of practical technology

    NASA Astrophysics Data System (ADS)

    The catalog presents summaries of practical technology selected for commercial potential and/or promising applications to the fields of computer technology, electrotechnology, energy, engineering, life sciences, machinery and tools, manufacturing, materials, physical sciences, and testing and instrumentation. Each summary not only describes a technology, but gives a source for further information. This publication describes some 1,100 new processes, inventions, equipment, software, and techniques developed by and for dozens of Federal agencies during 1982. Included is coverage of NASA Tech Briefs, DOE Energygrams, and Army Manufacturing Notes.

  3. Pacific Northwest National Laboratory Annual Site Environmental Report for Calendar Year 2013

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duncan, Joanne P.; Sackschewsky, Michael R.; Tilden, Harold T.

    2014-09-30

    Pacific Northwest National Laboratory (PNNL), one of the U.S. Department of Energy (DOE) Office of Science’s 10 national laboratories, provides innovative science and technology development in the areas of energy and the environment, fundamental and computational science, and national security. DOE’s Pacific Northwest Site Office (PNSO) is responsible for oversight of PNNL at its Campus in Richland, Washington, as well as its facilities in Sequim, Seattle, and North Bonneville, Washington, and Corvallis and Portland, Oregon.

  4. DOE Advanced Scientific Computing Advisory Subcommittee (ASCAC) Report: Top Ten Exascale Research Challenges

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lucas, Robert; Ang, James; Bergman, Keren

    2014-02-10

    Exascale computing systems are essential for the scientific fields that will transform the 21st century global economy, including energy, biotechnology, nanotechnology, and materials science. Progress in these fields is predicated on the ability to perform advanced scientific and engineering simulations and to analyze the resulting deluge of data. On July 29, 2013, ASCAC was charged by Patricia Dehmer, the Acting Director of the Office of Science, to assemble a subcommittee to provide advice on exascale computing. This subcommittee was directed to return a list of no more than ten technical approaches (hardware and software) that will enable the development of a system that achieves the Department's goals for exascale computing. Numerous reports over the past few years have documented the technical challenges and the non-viability of simply scaling existing computer designs to reach exascale. The technical challenges revolve around energy consumption, memory performance, resilience, extreme concurrency, and big data. Drawing from these reports and more recent experience, this ASCAC subcommittee has identified the top ten computing technology advancements that are critical to making a capable, economically viable exascale system.

  5. Situating Computer Simulation Professional Development: Does It Promote Inquiry-Based Simulation Use?

    ERIC Educational Resources Information Center

    Gonczi, Amanda L.; Maeng, Jennifer L.; Bell, Randy L.; Whitworth, Brooke A.

    2016-01-01

    This mixed-methods study sought to identify professional development implementation variables that may influence participant (a) adoption of simulations, and (b) use for inquiry-based science instruction. Two groups (Cohort 1, N = 52; Cohort 2, N = 104) received different professional development. Cohort 1 was focused on Web site use mechanics.…

  6. Increasing Advocacy for Information Systems Students with Disabilities through Disability Film Festivals at a Major Metropolitan University

    ERIC Educational Resources Information Center

    Joseph, Anthony; Lawler, James

    2018-01-01

    Colleges do not foster enough engagement of computer science and information systems students with higher-functioning people with disabilities. Information systems students without disabilities do not have enough experiences in diversity with equivalently skilled students with disabilities. In this paper, the authors expand the knowledge of…

  7. What Does Quality Programming Mean for High Achieving Students?

    ERIC Educational Resources Information Center

    Samudzi, Cleo

    2008-01-01

    The Missouri Academy of Science, Mathematics and Computing (Missouri Academy) is a two-year accelerated, early-entrance-to-college, residential school that matches the level, complexity and pace of the curriculum with the readiness and motivation of high achieving high school students. The school is a part of Northwest Missouri State University…

  8. Life sciences and environmental sciences

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1992-02-01

    The DOE laboratories play a unique role in bringing multidisciplinary talents -- in biology, physics, chemistry, computer sciences, and engineering -- to bear on major problems in the life and environmental sciences. Specifically, the laboratories utilize these talents to fulfill OHER's mission of exploring and mitigating the health and environmental effects of energy use, and of developing health and medical applications of nuclear energy-related phenomena. At Lawrence Berkeley Laboratory (LBL), support of this mission is evident across the spectrum of OHER-sponsored research, especially in the broad areas of genomics, structural biology, basic cell and molecular biology, carcinogenesis, energy and environment, applications to biotechnology, and molecular, nuclear, and radiation medicine. These research areas are briefly described.

  9. Laboratory Directed Research and Development Program FY 2008 Annual Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    editor, Todd C Hansen

    2009-02-23

    The Ernest Orlando Lawrence Berkeley National Laboratory (Berkeley Lab or LBNL) is a multi-program national research facility operated by the University of California for the Department of Energy (DOE). As an integral element of DOE's National Laboratory System, Berkeley Lab supports DOE's missions in fundamental science, energy resources, and environmental quality. Berkeley Lab programs advance four distinct goals for DOE and the nation: (1) To perform leading multidisciplinary research in the computing sciences, physical sciences, energy sciences, biosciences, and general sciences in a manner that ensures employee and public safety and protection of the environment. (2) To develop and operate unique national experimental facilities for qualified investigators. (3) To educate and train future generations of scientists and engineers to promote national science and education goals. (4) To transfer knowledge and technological innovations and to foster productive relationships among Berkeley Lab's research programs, universities, and industry in order to promote national economic competitiveness. Berkeley Lab's research and the Laboratory Directed Research and Development (LDRD) program support DOE's Strategic Themes that are codified in DOE's 2006 Strategic Plan (DOE/CF-0010), with a primary focus on Scientific Discovery and Innovation. For that strategic theme, the Fiscal Year (FY) 2008 LDRD projects support each one of the three goals through multiple strategies described in the plan. In addition, LDRD efforts support the four goals of Energy Security, the two goals of Environmental Responsibility, and Nuclear Security (unclassified fundamental research that supports stockpile safety and nonproliferation programs). The LDRD program supports Office of Science strategic plans, including the 20-year Scientific Facilities Plan and the Office of Science Strategic Plan.
The research also supports the strategic directions periodically under consideration and review by the Office of Science Program Offices, such as LDRD projects germane to new research facility concepts and new fundamental science directions. The Berkeley Lab LDRD program also plays an important role in leveraging DOE capabilities for national needs. The fundamental scientific research and development conducted in the program advances the skills and technologies of importance to our Work For Others (WFO) sponsors. Among many directions, these include a broad range of health-related science and technology of interest to the National Institutes of Health, breast cancer and accelerator research supported by the Department of Defense, detector technologies that should be useful to the Department of Homeland Security, and particle detection that will be valuable to the Environmental Protection Agency. The Berkeley Lab Laboratory Directed Research and Development Program FY2008 report is compiled from annual reports submitted by principal investigators following the close of the fiscal year. This report describes the supported projects and summarizes their accomplishments. It constitutes a part of the LDRD program planning and documentation process that includes an annual planning cycle, project selection, implementation, and review.

  10. Conformational Dynamics and Proton Relay Positioning in Nickel Catalysts for Hydrogen Production and Oxidation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Franz, James A.; O'Hagan, Molly J.; Ho, Ming-Hsun

    2013-12-09

    The [Ni(PR2NR′2)2]2+ catalysts (where PR2NR′2 is 1,5-R′-3,7-R-1,5-diaza-3,7-diphosphacyclooctane) are some of the fastest reported for hydrogen production and oxidation; however, chair/boat isomerization and the presence of a fifth solvent ligand have the potential to slow catalysis by incorrectly positioning the pendant amines or blocking the addition of hydrogen. Here, we report the structural dynamics of a series of [Ni(PR2NR′2)2]n+ complexes, characterized by NMR spectroscopy and theoretical modeling. A fast exchange process, dependent on the ligand, was observed for the [Ni(CH3CN)(PR2NR′2)2]2+ complexes. This exchange was identified to occur through a three-step mechanism comprising dissociation of the acetonitrile, boat/chair isomerization of each of the four rings defined by the phosphine ligands (including nitrogen inversion), and reassociation of acetonitrile on the opposite side of the complex. The rate of the chair/boat inversion can be influenced by varying the substituent on the nitrogen atom, but the rate of the overall exchange process is at least an order of magnitude faster than the catalytic rate in acetonitrile, demonstrating that the structural dynamics of the [Ni(PR2NR′2)2]2+ complexes do not hinder catalysis. This material is based upon work supported as part of the Center for Molecular Electrocatalysis, an Energy Frontier Research Center funded by the US Department of Energy, Office of Science, Office of Basic Energy Sciences under FWP56073. Research by J.A.F., M.O., M-H.H., M.L.H., D.L.D., A.M.A., S.R., and R.M.B. was carried out in the Center for Molecular Electrocatalysis, an Energy Frontier Research Center funded by the U.S. Department of Energy, Office of Science. W.J.S. and S.L. were funded by the DOE Office of Science Early Career Research Program through the Office of Basic Energy Sciences. T.L.
was supported by the US Department of Energy, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences & Biosciences. Pacific Northwest National Laboratory (PNNL) is a multiprogram national laboratory operated for DOE by Battelle. Computational resources were provided at W. R. Wiley Environmental Molecular Science Laboratory (EMSL), a national scientific user facility sponsored by the Department of Energy's Office of Biological and Environmental Research located at Pacific Northwest National Laboratory; the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory; and the Jaguar supercomputer at Oak Ridge National Laboratory (INCITE 2008-2011 award supported by the Office of Science of the U.S. DOE under Contract No. DE-AC0500OR22725).

  11. RIACS/USRA

    NASA Technical Reports Server (NTRS)

    Oliger, Joseph

    1993-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on 6 June 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under contract with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. A flexible scientific staff is provided through a university faculty visitor program, a post doctoral program, and a student visitor program. Not only does this provide appropriate expertise but it also introduces scientists outside of NASA to NASA problems. A small group of core RIACS staff provides continuity and interacts with an ARC technical monitor and scientific advisory group to determine the RIACS mission. RIACS activities are reviewed and monitored by a USRA advisory council and ARC technical monitor. Research at RIACS is currently being done in the following areas: Parallel Computing, Advanced Methods for Scientific Computing, High Performance Networks and Technology, and Learning Systems. Parallel compiler techniques, adaptive numerical methods for flows in complicated geometries, and optimization were identified as important problems to investigate for ARC's involvement in the Computational Grand Challenges of the next decade.

  12. The Practical Obstacles of Data Transfer: Why researchers still love scp

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nam, Hai Ah; Hill, Jason J; Parete-Koon, Suzanne T

    The importance of computing facilities is heralded every six months with the announcement of the new Top500 list, showcasing the world's fastest supercomputers. Unfortunately, with great computing capability does not come great long-term data storage capacity, which often means users must move their data to their local site archive, to remote sites where they may be doing future computation or analysis, or back to their home institution, or else face the dreaded data purge that most HPC centers employ to keep utilization of large parallel filesystems low to manage performance and capacity. At HPC centers, data transfer is crucial to the scientific workflow and will increase in importance as computing systems grow in size. The Energy Sciences Network (ESnet) recently launched its fifth generation network, a 100 Gbps high-performance, unclassified national network connecting more than 40 DOE research sites to support scientific research and collaboration. Despite the tenfold increase in bandwidth to DOE research sites amenable to multiple data transfer streams and high throughput, in practice, researchers often under-utilize the network and resort to painfully slow single-stream transfer methods such as scp to avoid the complexity of using multiple-stream tools such as GridFTP and bbcp, and contend with frustration from the lack of consistency of available tools between sites. In this study we survey and assess the data transfer methods provided at several DOE-supported computing facilities, including both leadership-computing facilities, connected through ESnet. We present observed transfer rates, suggest optimizations, and discuss the obstacles the tools must overcome to receive widespread adoption over scp.

  13. Performance Analysis, Modeling and Scaling of HPC Applications and Tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhatele, Abhinav

    2016-01-13

    Efficient use of supercomputers at DOE centers is vital for maximizing system throughput, minimizing energy costs, and enabling science breakthroughs faster. This requires complementary efforts along several directions to optimize the performance of scientific simulation codes and the underlying runtimes and software stacks. This in turn requires providing scalable performance analysis tools and modeling techniques that can provide feedback to the physicists and computer scientists developing the simulation codes and runtimes, respectively. The PAMS project is using time allocations on supercomputers at ALCF, NERSC, and OLCF to further the goals described above by performing research along the following fronts: 1. Scaling Study of HPC applications; 2. Evaluation of Programming Models; 3. Hardening of Performance Tools; 4. Performance Modeling of Irregular Codes; and 5. Statistical Analysis of Historical Performance Data. We are a team of computer and computational scientists funded by both DOE/NNSA and DOE/ASCR programs such as ECRP, XStack (Traleika Glacier, PIPER), ExaOSR (ARGO), SDMAV II (MONA), and PSAAP II (XPACC). This allocation will enable us to study big data issues when analyzing performance on leadership-computing-class systems and to assist the HPC community in making the most effective use of these resources.

  14. Motivating Students on ICT-Related Study Programs to Engage with the Subject of Sustainable Development

    ERIC Educational Resources Information Center

    Hilty, Lorenz M.; Huber, Patrizia

    2018-01-01

    Purpose: Sustainable development (SD) does not usually form part of the curriculum of ICT-related study programs such as Computer Science, Information Technology, Information Systems, and Informatics. However, many topics form a bridge between SD and ICT and could potentially be integrated into ICT-related study programs. This paper reports the…

  15. Preface: SciDAC 2009

    NASA Astrophysics Data System (ADS)

    Simon, Horst

    2009-07-01

    By almost any measure, the SciDAC community has come a long way since DOE launched the SciDAC program back in 2001. At the time, we were grappling with how to efficiently run applications on terascale systems (the November 2001 TOP500 list was led by DOE's ASCI White IBM system at Lawrence Livermore achieving 7.2 teraflop/s). And the results stemming from the first round of SciDAC projects were summed up in two-page reports. The scientific results were presented at annual meetings, which were by invitation only and typically were attended by about 75 researchers. Fast forward to 2009 and we now have SciDAC Review, a quarterly magazine showcasing the scientific computing contributions of SciDAC projects and related programs, all focused on presenting a comprehensive look at Scientific Discovery through Advanced Computing. That is also the motivation behind the annual SciDAC conference, which in 2009 was held from June 14-18 in San Diego. The annual conference, which can also be described as a celebration of all things SciDAC, grew out of those meetings organized in the early days of the program. In 2005, the meeting was held in San Francisco and attendance was opened up to all members of the SciDAC community. The schedule was also expanded to include a keynote address, plenary speakers and other features found in a conference format. This year marks the fifth such SciDAC conference, which now comprises four days of computational science presentations, multiple poster sessions and, since last year, an evening event showcasing simulations and modeling runs resulting from SciDAC projects. The fifth annual SciDAC conference was remarkable on several levels. The primary purpose, of course, is to showcase the research accomplishments resulting from SciDAC programs in particular and computational science in general. It is these accomplishments, represented in 38 papers and 52 posters, that comprise this set of conference proceedings.
These proceedings can stand alone as evidence of the success of DOE's innovative SciDAC efforts. But from the outset, a critical driver for the program was to foster increased collaboration among researchers across disciplines and organizations. In particular, SciDAC wanted to engage scientists at universities in the projects, both to expand the community and to develop the next generation of computational scientists. At the meeting in San Diego, the fruits of this emphasis were clearly visible, from the special poster session highlighting the work of the DOE Computational Science Graduate Fellows, to the informal discussions in hotel hallways, to focused side meetings apart from the main presentations. A highlight of the meeting was the keynote address by Dr Ray Orbach, until recently the DOE Under Secretary for Science and head of the Office of Science. It was during his tenure that the first round of projects matured and the second set of SciDAC projects were launched. And complementing these research projects was Dr Orbach's vision for INCITE, DOE's Innovative and Novel Computational Impact on Theory and Experiment program, inaugurated in 2003. This program allocated significant HPC resources to scientists tackling high-impact problems, including some of those addressed by SciDAC teams. Together, SciDAC and INCITE are dramatically accelerating the field of computational science. As has been noted before, the SciDAC conference celebrates progress in advancing science through large-scale modeling and simulation. Over 400 people registered to attend this year's talks, poster sessions and tutorials, all spanning the disciplines supported by DOE. While the principal focus was on SciDAC accomplishments, this year's conference also included invited presentations and posters from colleagues whose research is supported by other agencies. 
At the 2009 meeting we also formalized a developing synergy with the Department of Defense's HPC Users Group Meeting, which has occasionally met in parallel with the SciDAC meeting. But in San Diego, we took the additional steps of organizing a joint poster session and a joint plenary session, further advancing opportunities for broader networking. Throughout the four-day program, attendees at both meetings had the option of sitting in on sessions at either conference. We also included several of the NSF Petascale applications in the program, and have also extended invitations to our computational colleagues in other federal agencies, including the National Science Foundation, NASA, and the National Oceanographic and Atmospheric Administration, as well as international collaborators to join us in San Diego. In 2009 we also reprised one of the more popular sessions from Seattle in 2008, the Electronic Visualization and Poster Night, during which 29 scientific visualizations were presented on high-resolution large-format displays. The best entries were awarded one of the coveted 'OASCR Awards.' The conference also featured a session about breakthroughs in computational science, based on the 'Breakthrough Report' that was published in 2008, led by Tony Mezzacappa (ORNL). Tony was also the chair of the SciDAC 2005 conference. For the third consecutive year, the conference was followed by a day of tutorials organized by the SciDAC Outreach Center and aimed primarily at students interested in scientific computing. This year, nearly 100 participants attended the tutorials, hosted by the San Diego Supercomputer Center and General Atomics. This outreach to the broader community is really what SciDAC is all about - Scientific Discovery through Advanced Computing. 
Such discoveries are not confined by organizational lines, but rather are often the result of researchers reaching out and collaborating with others, using their combined expertise to push our boundaries of knowledge. I am happy to see that this vision is shared by so many researchers in computational science, who all decided to join SciDAC 2009. While credit for the excellent presentations and posters goes to the teams of researchers, the success of this year's conference is due to the strong efforts and support from members of the 2009 SciDAC Program Committee and Organizing Committee, and I would like to extend my heartfelt thanks to them for helping to make the 2009 meeting the largest and most successful to date. Program Committee members were: David Bader, LLNL; Pete Beckman, ANL; John Bell, LBNL; John Boisseau, University of Texas; Paul Bonoli, MIT; Hank Childs, LBNL; Bill Collins, LBNL; Jim Davenport, BNL; David Dean, ORNL; Thom Dunning, NCSA; Peg Folta, LLNL; Glenn Hammond, PNNL; Maciej Haranczyk, LBNL; Robert Harrison, ORNL; Paul Hovland, ANL; Paul Kent, ORNL; Aram Kevorkian, SPAWAR; David Keyes, Columbia University; Kwok Ko, SLAC; Felice Lightstone, LLNL; Bob Lucas, ISI/USC; Paul Mackenzie, Fermilab; Tony Mezzacappa, ORNL; John Negele, MIT; Jeff Nichols, ORNL; Mike Norman, UCSD; Joe Oefelein, SNL; Jeanie Osburn, NRL; Peter Ostroumov, ANL; Valerio Pascucci, University of Utah; Ruth Pordes, Fermilab; Rob Ross, ANL; Nagiza Samatova, ORNL; Martin Savage, University of Washington; Tim Scheibe, PNNL; Ed Seidel, NSF; Arie Shoshani, LBNL; Rick Stevens, ANL; Bob Sugar, UCSB; Bill Tang, PPPL; Bob Wilhelmson, NCSA; Kathy Yelick, NERSC/LBNL; Dave Zachmann, Vista Computational Technology LLC. Organizing Committee members were: Communications: Jon Bashor, LBNL. Contracts/Logistics: Mary Spada and Cheryl Zidel, ANL. Posters: David Bailey, LBNL. Proceedings: John Hules, LBNL. Proceedings Database Developer: Beth Cerny Patino, ANL. 
Program Committee Liaison/Conference Web Site: Yeen Mankin, LBNL. Tutorials: David Skinner, NERSC/LBNL. Visualization Night: Hank Childs, LBNL; Valerio Pascucci, Chems Touati, Nathan Galli, and Erik Jorgensen, University of Utah. Again, my thanks to all. Horst Simon San Diego, California June 18, 2009

  16. U.S. Department of Energy's Bioenergy Research Centers: An Overview of the Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2009-07-01

    Alternative fuels from renewable cellulosic biomass--plant stalks, trunks, stems, and leaves--are expected to significantly reduce U.S. dependence on imported oil while enhancing national energy security and decreasing the environmental impacts of energy use. Ethanol and other advanced biofuels from cellulosic biomass are renewable alternatives that could increase domestic production of transportation fuels, revitalize rural economies, and reduce carbon dioxide and pollutant emissions. According to U.S. Secretary of Energy Steven Chu, 'Developing the next generation of biofuels is key to our effort to end our dependence on foreign oil and address the climate crisis while creating millions of new jobs that can't be outsourced'. In the United States, the Energy Independence and Security Act (EISA) of 2007 is an important driver for the sustainable development of renewable biofuels. As part of EISA, the Renewable Fuel Standard mandates that 36 billion gallons of biofuels be produced annually by 2022, of which 16 billion gallons are expected to come from cellulosic feedstocks. Although cellulosic ethanol production has been demonstrated at the pilot level, developing a cost-effective, commercial-scale cellulosic biofuel industry will require transformational science to significantly streamline current production processes. Woodchips, grasses, cornstalks, and other cellulosic biomass are widely abundant but more difficult to break down into sugars than corn grain--the primary source of U.S. ethanol fuel production today. Biological research is key to accelerating the deconstruction of cellulosic biomass into sugars that can be converted to biofuels. The Department of Energy (DOE) Office of Science has played a major role in inspiring, supporting, and guiding the biotechnology revolution over the past 25 years. The DOE Genomic Science Program is advancing a new generation of research focused on achieving whole-systems understanding for biology.
This program is bringing together scientists in diverse fields to understand the complex biology underlying solutions to DOE missions in energy production, environmental remediation, and climate change science. New interdisciplinary research communities are emerging, as are knowledgebases and scientific and computational resources critical to advancing large-scale, genome-based biology. To focus the most advanced biotechnology-based resources on the biological challenges of biofuel production, DOE established three Bioenergy Research Centers (BRCs) in September 2007. Each center is pursuing the basic research underlying a range of high-risk, high-return biological solutions for bioenergy applications. Advances resulting from the BRCs will provide the knowledge needed to develop new biobased products, methods, and tools that the emerging biofuel industry can use. The scientific rationale for these centers and for other fundamental genomic research critical to the biofuel industry was established at a DOE workshop involving members of the research community (see sidebar, Biofuel Research Plan, below). The DOE BRCs have developed automated, high-throughput analysis pipelines that will accelerate scientific discovery for biology-based biofuel research. The three centers, which were selected through a scientific peer-review process, are based in geographically diverse locations--the Southeast, the Midwest, and the West Coast--with partners across the nation. DOE's Oak Ridge National Laboratory leads the BioEnergy Science Center (BESC) in Tennessee; the University of Wisconsin-Madison leads the Great Lakes Bioenergy Research Center (GLBRC); and DOE's Lawrence Berkeley National Laboratory leads the DOE Joint BioEnergy Institute (JBEI) in California. 
    Each center represents a multidisciplinary partnership with expertise spanning the physical and biological sciences, including genomics, microbial and plant biology, analytical chemistry, computational biology and bioinformatics, and engineering. Institutional partners include DOE national laboratories, universities, private companies, and nonprofit organizations.

  17. DOE Network 2025: Network Research Problems and Challenges for DOE Scientists. Workshop Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None, None

    2016-02-01

    The growing investments in large science instruments and supercomputers by the US Department of Energy (DOE) hold enormous promise for accelerating the scientific discovery process. They facilitate unprecedented collaborations of geographically dispersed teams of scientists that use these resources. These collaborations critically depend on the production, sharing, movement, and management of, as well as interactive access to, large, complex data sets at sites dispersed across the country and around the globe. In particular, they call for significant enhancements in network capacities to sustain large data volumes and, equally important, the capabilities to collaboratively access the data across computing, storage, and instrument facilities by science users and automated scripts and systems. Improvements in network backbone capacities of several orders of magnitude are essential to meet these challenges, in particular, to support exascale initiatives. Yet, raw network speed represents only a part of the solution. Indeed, the speed must be matched by network and transport layer protocols and higher layer tools that scale in ways that aggregate, compose, and integrate the disparate subsystems into a complete science ecosystem. Just as important, agile monitoring and management services need to be developed to operate the network at peak performance levels. Finally, these solutions must be made an integral part of the production facilities by using sound approaches to develop, deploy, diagnose, operate, and maintain them over the science infrastructure.

  18. Computational Science: A Research Methodology for the 21st Century

    NASA Astrophysics Data System (ADS)

    Orbach, Raymond L.

    2004-03-01

    Computational simulation - a means of scientific discovery that employs computer systems to simulate a physical system according to laws derived from theory and experiment - has attained peer status with theory and experiment. Important advances in basic science are accomplished by a new "sociology" for ultrascale scientific computing capability (USSCC), a fusion of sustained advances in scientific models, mathematical algorithms, computer architecture, and scientific software engineering. Expansion of current capabilities by factors of 100 - 1000 opens up new vistas for scientific discovery: long-term climatic variability and change, macroscopic material design from correlated behavior at the nanoscale, design and optimization of magnetic confinement fusion reactors, strong interactions on a computational lattice through quantum chromodynamics, and stellar explosions and element production. The "virtual prototype," made possible by this expansion, can markedly reduce time-to-market for industrial applications such as jet engines and safer, cleaner, more fuel-efficient cars. In order to develop USSCC, the National Energy Research Scientific Computing Center (NERSC) announced the competition "Innovative and Novel Computational Impact on Theory and Experiment" (INCITE), with no requirement for current DOE sponsorship. Fifty-nine proposals for grand challenge scientific problems were submitted for a small number of awards. The successful grants, and their preliminary progress, will be described.

  19. Wildlife software: procedures for publication of computer software

    USGS Publications Warehouse

    Samuel, M.D.

    1990-01-01

    Computers and computer software have become an integral part of the practice of wildlife science. Computers now play an important role in teaching, research, and management applications. Because of the specialized nature of wildlife problems, specific computer software is usually required to address a given problem (e.g., home range analysis). This type of software is not usually available from commercial vendors and therefore must be developed by those wildlife professionals with particular skill in computer programming. Current journal publication practices generally prevent a detailed description of computer software associated with new techniques. In addition, peer review of journal articles does not usually include a review of associated computer software. Thus, many wildlife professionals are usually unaware of computer software that would meet their needs or of major improvements in software they commonly use. Indeed most users of wildlife software learn of new programs or important changes only by word of mouth.

  20. Synergistic Effect of Nitrogen in Cobalt Nitride and Nitrogen-Doped Hollow Carbon Spheres for Oxygen Reduction Reaction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhong, Xing; Liu, Lin; Jiang, Yu

    The need for inexpensive and high-activity oxygen reduction reaction (ORR) electrocatalysts has attracted considerable research interest in recent years. Here we report a novel hybrid that contains cobalt nitride/nitrogen-rich hollow carbon spheres (CoxN/NHCS) as a high-performance catalyst for ORR. The CoxN nanoparticles were uniformly dispersed and confined in the hollow NHCS shell. The performance of the resulting CoxN/NHCS hybrid was comparable to that of a commercial Pt/C at the same catalyst loading toward ORR, but the mass activity of the former was 5.7 times better than that of the latter. The nitrogen in both CoxN and NHCS, especially CoxN, could weaken the adsorption of reaction intermediates (O and OOH), consistent with the favourable reaction pathway on CoxN/NHCS indicated by the DFT-calculated Gibbs free energy diagrams. Our results demonstrate a new strategy for designing and developing inexpensive, non-precious metal electrocatalysts for next-generation fuels. The authors acknowledge the financial support from the National Basic Research Program (973 program, No. 2013CB733501) and the National Natural Science Foundation of China (No. 21306169, 21101137, 21136001, 21176221 and 91334013). Dr. D. Mei is supported by the US Department of Energy (DOE), Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences & Biosciences. Pacific Northwest National Laboratory (PNNL) is a multiprogram national laboratory operated for DOE by Battelle. Computing time was granted by the grand challenge of computational catalysis of the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL). EMSL is a national scientific user facility located at Pacific Northwest National Laboratory (PNNL) and sponsored by DOE’s Office of Biological and Environmental Research.

  1. What do computer scientists tweet? Analyzing the link-sharing practice on Twitter.

    PubMed

    Schmitt, Marco; Jäschke, Robert

    2017-01-01

    Twitter communication has permeated every sphere of society. To highlight and share small pieces of information with possibly vast audiences or small circles of the interested has some value in almost any aspect of social life. But what is the value exactly for a scientific field? We perform a comprehensive study of computer scientists using Twitter and their tweeting behavior concerning the sharing of web links. Discerning the domains, hosts and individual web pages being tweeted and the differences between computer scientists and a Twitter sample enables us to look in depth at the Twitter-based information sharing practices of a scientific community. Additionally, we aim at providing a deeper understanding of the role and impact of altmetrics in computer science and offer a glimpse of the publications mentioned on Twitter that are most relevant for the computer science community. Our results show a link-sharing culture that concentrates more heavily on public and professional-quality information than the Twitter sample does. The results also show a broad variety in linked sources and especially in linked publications, with some publications clearly related to community-specific interests of computer scientists, while others relate strongly to attention mechanisms in social media. This reflects the observation that Twitter is a hybrid form of social media between an information service and a social network service. Overall, the computer scientists' style of usage leans toward the information-oriented side and, to some degree, toward professional usage. Therefore, altmetrics are of considerable use in analyzing computer science.
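The domain- and host-level link analysis described in this abstract can be sketched in a few lines. A minimal illustration (the `host_counts` helper and the example URLs are invented for this sketch, not taken from the study):

```python
from collections import Counter
from urllib.parse import urlparse

def host_counts(urls):
    """Count how often each web host appears in a list of tweeted URLs."""
    counts = Counter()
    for url in urls:
        host = urlparse(url).netloc.lower()
        # Fold the common "www." prefix so www.example.org and example.org match.
        if host.startswith("www."):
            host = host[4:]
        if host:
            counts[host] += 1
    return counts

# Invented example links, standing in for a harvested tweet corpus.
tweets = [
    "https://arxiv.org/abs/1701.00001",
    "https://www.acm.org/publications",
    "https://arxiv.org/abs/1701.00002",
]
print(host_counts(tweets).most_common(1))  # [('arxiv.org', 2)]
```

Comparing such per-host counts between a community sample and a general Twitter sample is, in essence, the contrast the study draws.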

  2. What do computer scientists tweet? Analyzing the link-sharing practice on Twitter

    PubMed Central

    Schmitt, Marco

    2017-01-01

    Twitter communication has permeated every sphere of society. To highlight and share small pieces of information with possibly vast audiences or small circles of the interested has some value in almost any aspect of social life. But what is the value exactly for a scientific field? We perform a comprehensive study of computer scientists using Twitter and their tweeting behavior concerning the sharing of web links. Discerning the domains, hosts and individual web pages being tweeted and the differences between computer scientists and a Twitter sample enables us to look in depth at the Twitter-based information sharing practices of a scientific community. Additionally, we aim at providing a deeper understanding of the role and impact of altmetrics in computer science and offer a glimpse of the publications mentioned on Twitter that are most relevant for the computer science community. Our results show a link-sharing culture that concentrates more heavily on public and professional-quality information than the Twitter sample does. The results also show a broad variety in linked sources and especially in linked publications, with some publications clearly related to community-specific interests of computer scientists, while others relate strongly to attention mechanisms in social media. This reflects the observation that Twitter is a hybrid form of social media between an information service and a social network service. Overall, the computer scientists’ style of usage leans toward the information-oriented side and, to some degree, toward professional usage. Therefore, altmetrics are of considerable use in analyzing computer science. PMID:28636619

  3. Ammonia Oxidation by Abstraction of Three Hydrogen Atoms from a Mo–NH 3 Complex

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhattacharya, Papri; Heiden, Zachariah M.; Wiedner, Eric S.

    We report ammonia oxidation by homolytic cleavage of all three H atoms from a Mo-15NH3 complex using the 2,4,6-tri-tert-butylphenoxyl radical to afford a Mo-alkylimido (Mo=15NR) complex (R = 2,4,6-tri-t-butylcyclohexa-2,5-dien-1-one). Reductive cleavage of Mo=15NR generates a terminal Mo≡N nitride, and a [Mo-15NH]+ complex is formed by protonation. Computational analysis describes the energetic profile for the stepwise removal of three H atoms from the Mo-15NH3 complex and the formation of Mo=15NR. Acknowledgment. This work was supported as part of the Center for Molecular Electrocatalysis, an Energy Frontier Research Center funded by the U.S. Department of Energy (U.S. DOE), Office of Science, Office of Basic Energy Sciences. EPR and mass spectrometry experiments were performed using EMSL, a national scientific user facility sponsored by the DOE’s Office of Biological and Environmental Research and located at PNNL. The authors thank Dr. Eric D. Walter and Dr. Rosalie Chu for assistance in performing EPR and mass spectrometry analysis, respectively. Computational resources were provided by the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory. Pacific Northwest National Laboratory is operated by Battelle for the U.S. DOE.

  4. A National Study of the Relationship between Home Access to a Computer and Academic Performance Scores of Grade 12 U.S. Science Students: An Analysis of the 2009 NAEP Data

    NASA Astrophysics Data System (ADS)

    Coffman, Mitchell Ward

    The purpose of this dissertation was to examine the relationship between student access to a computer at home and academic achievement. The 2009 National Assessment of Educational Progress (NAEP) dataset was probed using the National Data Explorer (NDE) to investigate correlations among the subsets of SES, Parental Education, Race, and Gender as they relate to home computer access and improved performance scores for U.S. public school grade 12 science students. A causal-comparative approach was employed to clarify the relationship between home access and performance scores. The influence of home access cannot overcome the challenges students of lower SES face. The achievement gap, or a second digital divide, for underprivileged classes of students, including minorities, does not appear to contract via student access to a home computer. Nonetheless, in tests for significance, statistically significant improvement in science performance scores was reported for those having access to a computer at home compared to those without. Additionally, regression models reported evidence of correlations between and among subsets of controls for the demographic factors gender, race, and socioeconomic status. Variability in these correlations was high, suggesting that influence from unobserved factors may have more impact upon the dependent variable. Having access to a computer at home increases performance scores for grade 12 general science students of all races, genders, and socioeconomic levels. However, the performance gap is roughly equivalent to the existing performance gap in the national average for science scores, suggesting little influence from access to a computer on academic achievement. The variability of scores reported in the regression analysis models reflects a moderate to low effect, suggesting an absence of causation. 
These statistical results confirm the literature review: having access to a computer at home and the other predictor variables were found to have a significant impact on performance scores, although the data suggest that computer access at home is less influential upon performance scores than poverty and its correlates.
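The group comparison behind these regression models can be made concrete with a toy calculation. For a single binary predictor such as home computer access, the ordinary least-squares slope equals the difference between the two group means; all numbers below are invented for illustration, not NAEP data:

```python
def mean(xs):
    return sum(xs) / len(xs)

def ols_slope(x, y):
    """Slope of the least-squares line of y on x."""
    mx, my = mean(x), mean(y)
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    var = sum((xi - mx) ** 2 for xi in x)
    return cov / var

access = [1, 1, 1, 0, 0, 0]              # 1 = computer at home, 0 = none
scores = [155, 160, 150, 140, 145, 138]  # hypothetical science scores

# With a binary predictor, the regression slope is the gap in group means.
gap = mean([s for a, s in zip(access, scores) if a == 1]) - \
      mean([s for a, s in zip(access, scores) if a == 0])
print(ols_slope(access, scores), gap)  # prints: 14.0 14.0
```

Adding controls for SES, race, and gender, as the dissertation does, adjusts this raw gap for the correlated demographic factors.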

  5. Reverse Engineering and Software Products Reuse to Teach Collaborative Web Portals: A Case Study with Final-Year Computer Science Students

    ERIC Educational Resources Information Center

    Medina-Dominguez, Fuensanta; Sanchez-Segura, Maria-Isabel; Mora-Soto, Arturo; Amescua, Antonio

    2010-01-01

    The development of collaborative Web applications does not follow a software engineering methodology. This is because when university students study Web applications in general, and collaborative Web portals in particular, they are not being trained in the use of software engineering techniques to develop collaborative Web portals. This paper…

  6. Does Like Seek Like?: The Formation of Working Groups in a Programming Project

    ERIC Educational Resources Information Center

    Sanou Gozalo, Eduard; Hernández-Fernández, Antoni; Arias, Marta; Ferrer-i-Cancho, Ramon

    2017-01-01

    In a course of the degree of computer science, the programming project has changed from individual to teamed work, tentatively in couples (pair programming). Students have full freedom to team up with minimum intervention from teachers. The analysis of the working groups made indicates that students do not tend to associate with students with a…

  7. Dehydration of 1-octadecanol over H-BEA: A combined experimental and computational study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Song, Wenji; Liu, Yuanshuai; Barath, Eszter

    Liquid phase dehydration of 1-octadecanol, which is intermediately formed during the hydrodeoxygenation of microalgae oil, has been explored in a combined experimental and computational study. The alkyl chain of the C18 alcohol interacts with acid sites during diffusion inside the zeolite pores, resulting in an inefficient utilization of the Brønsted acid sites for samples with high acid site concentrations. The parallel intra- and intermolecular dehydration pathways having different activation energies pass through alternative reaction intermediates. Formation of surface-bound alkoxide species is the rate-limiting step during intramolecular dehydration, whereas intermolecular dehydration proceeds via a bulky dimer intermediate. Octadecene is the primary dehydration product over H-BEA at 533 K. Despite the dominant contribution of Brønsted acid sites to both dehydration pathways, Lewis acid sites are also active in the formation of dioctadecyl ether. The intramolecular dehydration to octadecene and cleavage of the intermediately formed ether, however, require strong Brønsted acid sites (BAS). L. Wang, D. Mei and J. A. Lercher acknowledge the partial support from the US Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences & Biosciences. Pacific Northwest National Laboratory (PNNL) is a multiprogram national laboratory operated for DOE by Battelle. Computing time was granted by the grand challenge of computational catalysis of the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) and by the National Energy Research Scientific Computing Center (NERSC). EMSL is a national scientific user facility located at Pacific Northwest National Laboratory (PNNL) and sponsored by DOE’s Office of Biological and Environmental Research.

  8. Mechanistic insights into aqueous phase propanol dehydration in H-ZSM-5 zeolite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mei, Donghai; Lercher, Johannes A.

    Aqueous phase dehydration of 1-propanol over H-ZSM-5 zeolite was investigated using density functional theory (DFT) calculations. The water molecules in the zeolite pores prefer to aggregate via the hydrogen bonding network and be protonated at the Brønsted acidic sites (BAS). Two typical configurations of water molecules, i.e., dispersed and clustered, were identified by ab initio molecular dynamics simulation of an H-ZSM-5 zeolite unit cell containing 20 water molecules to mimic the aqueous phase. DFT-calculated Gibbs free energies suggest that the dimeric propanol-propanol, the propanol-water complex, and the trimeric propanol-propanol-water are formed at high propanol concentrations, which provide a kinetically feasible dehydration reaction channel of 1-propanol to propene. However, calculation results also indicate that propanol dehydration via the unimolecular mechanism becomes kinetically discouraged due to the enhanced stability of the protonated dimeric propanol and the protonated water cluster acting as the BAS for the alcohol dehydration reaction. This work was supported by the US Department of Energy (DOE), Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences & Biosciences. Pacific Northwest National Laboratory (PNNL) is a multiprogram national laboratory operated for DOE by Battelle. Computing time was granted by the grand challenge of computational catalysis of the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL). EMSL is a national scientific user facility located at Pacific Northwest National Laboratory (PNNL) and sponsored by DOE’s Office of Biological and Environmental Research.

  9. Reply to comment by Melsen et al. on "Most computational hydrology is not reproducible, so is it really science?"

    NASA Astrophysics Data System (ADS)

    Hutton, Christopher; Wagener, Thorsten; Freer, Jim; Han, Dawei; Duffy, Chris; Arheimer, Berit

    2017-03-01

    In this article, we reply to a comment made by Melsen et al. [2017] on our previous commentary regarding reproducibility in computational hydrology. Re-executing someone else's code and workflow to derive a set of published results does not by itself constitute reproducibility. However, it forms a key part of the process: it demonstrates that all the degrees of freedom and choices made by the scientist in running the experiment are contained within that code and workflow. Not only does this allow us to build and extend directly on the original work; with full knowledge of the decisions made in the original experimental setup, we can then focus our attention on the degrees of freedom of interest: those that occur in hydrological systems and that are ultimately our subject of study.

  10. U.S. Department of Energy's Bioenergy Research Centers An Overview of the Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2010-07-01

    Alternative fuels from renewable cellulosic biomass - plant stalks, trunks, stems, and leaves - are expected to significantly reduce U.S. dependence on imported oil while enhancing national energy security and decreasing the environmental impacts of energy use. Ethanol and other advanced biofuels from cellulosic biomass are renewable alternatives that could increase domestic production of transportation fuels, revitalize rural economies, and reduce carbon dioxide and pollutant emissions. According to U.S. Secretary of Energy Steven Chu, 'Developing the next generation of biofuels is key to our effort to end our dependence on foreign oil and address the climate crisis while creating millions of new jobs that can't be outsourced.' Although cellulosic ethanol production has been demonstrated at the pilot level, developing a cost-effective, commercial-scale cellulosic biofuel industry will require transformational science to significantly streamline current production processes. Woodchips, grasses, cornstalks, and other cellulosic biomass are widely abundant but more difficult to break down into sugars than corn grain - the primary source of U.S. ethanol fuel production today. Biological research is key to accelerating the deconstruction of cellulosic biomass into sugars that can be converted to biofuels. The Department of Energy (DOE) Office of Science has played a major role in inspiring, supporting, and guiding the biotechnology revolution over the past 30 years. The DOE Genomic Science program is advancing a new generation of research focused on achieving whole-systems understanding of biology. This program is bringing together scientists in diverse fields to understand the complex biology underlying solutions to DOE missions in energy production, environmental remediation, and climate change science. For more information on the Genomic Science program, see p. 26. 
To focus the most advanced biotechnology-based resources on the biological challenges of biofuel production, DOE established three Bioenergy Research Centers (BRCs) in September 2007. Each center is pursuing the basic research underlying a range of high-risk, high-return biological solutions for bioenergy applications. Advances resulting from the BRCs are providing the knowledge needed to develop new biobased products, methods, and tools that the emerging biofuel industry can use (see sidebar, Bridging the Gap from Fundamental Biology to Industrial Innovation for Bioenergy, p. 6). The DOE BRCs have developed automated, high-throughput analysis pipelines that will accelerate scientific discovery for biology-based biofuel research. The three centers, which were selected through a scientific peer-review process, are based in geographically diverse locations - the Southeast, the Midwest, and the West Coast - with partners across the nation (see U.S. map, DOE Bioenergy Research Centers and Partners, on back cover). DOE's Lawrence Berkeley National Laboratory leads the DOE Joint BioEnergy Institute (JBEI) in California; DOE's Oak Ridge National Laboratory leads the BioEnergy Science Center (BESC) in Tennessee; and the University of Wisconsin-Madison leads the Great Lakes Bioenergy Research Center (GLBRC). Each center represents a multidisciplinary partnership with expertise spanning the physical and biological sciences, including genomics, microbial and plant biology, analytical chemistry, computational biology and bioinformatics, and engineering. Institutional partners include DOE national laboratories, universities, private companies, and nonprofit organizations.

  11. 2013 Progress Report -- DOE Joint Genome Institute

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2013-11-01

    In October 2012, we introduced a 10-Year Strategic Vision [http://bit.ly/JGI-Vision] for the Institute. A central focus of this Strategic Vision is to bridge the gap between sequenced genomes and an understanding of biological functions at the organism and ecosystem level. This involves the continued massive-scale generation of sequence data, complemented by orthogonal new capabilities to functionally annotate these large sequence data sets. Our Strategic Vision lays out a path to guide our decisions and ensure that the evolving set of experimental and computational capabilities available to DOE JGI users will continue to enable groundbreaking science.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heroux, Michael; Lethin, Richard

    Programming models and environments play the essential role in high performance computing of enabling the conception, design, implementation, and execution of science and engineering application codes. Programmer productivity is strongly influenced by the effectiveness of our programming models and environments, as is software sustainability, since our codes have lifespans measured in decades. The advent of new computing architectures, increased concurrency, concerns for resilience, and the increasing demands for high-fidelity, multi-physics, multi-scale, and data-intensive computations mean that we have new challenges to address as part of our fundamental R&D requirements. Fortunately, we also have new tools and environments that make design, prototyping, and delivery of new programming models easier than ever. The combination of new and challenging requirements and new, powerful toolsets enables significant synergies for the next generation of programming models and environments R&D. This report presents the topics discussed and results from the 2014 DOE Office of Science Advanced Scientific Computing Research (ASCR) Programming Models & Environments Summit, and subsequent discussions among the summit participants and contributors to topics in this report.

  13. Extending Landauer's bound from bit erasure to arbitrary computation

    NASA Astrophysics Data System (ADS)

    Wolpert, David

    The minimal thermodynamic work required to erase a bit, known as Landauer's bound, has been extensively investigated both theoretically and experimentally. However, when viewed as a computation that maps inputs to outputs, bit erasure has a very special property: the output does not depend on the input. Existing analyses of the thermodynamics of bit erasure implicitly exploit this property, and thus cannot be directly extended to analyze the computation of arbitrary input-output maps. Here we show how to extend these earlier analyses of bit erasure to analyze the thermodynamics of arbitrary computations. Doing this establishes a formal connection between the thermodynamics of computers and much of theoretical computer science. We use this extension to analyze the thermodynamics of the canonical ``general purpose computer'' considered in computer science theory: a universal Turing machine (UTM). We consider a UTM which maps input programs to output strings, where inputs are drawn from an ensemble of random binary sequences, and prove: i) The minimal work needed by a UTM to run some particular input program X and produce output Y is the Kolmogorov complexity of Y minus the log of the ``algorithmic probability'' of Y. This minimal amount of thermodynamic work has a finite upper bound, which is independent of the output Y, depending only on the details of the UTM. ii) The expected work needed by a UTM to compute some given output Y is infinite. As a corollary, the overall expected work to run a UTM is infinite. iii) The expected work needed by an arbitrary Turing machine T (not necessarily universal) to compute some given output Y can be either infinite or finite, depending on Y and the details of T. To derive these results we must combine ideas from nonequilibrium statistical physics with fundamental results from computer science, such as Levin's coding theorem and other theorems about universal computation. I would like to acknowledge the Santa Fe Institute, Grant No. TWCF0079/AB47 from the Templeton World Charity Foundation, Grant No. FQXi-RHl3-1349 from the FQXi foundation, and Grant No. CHE-1648973 from the U.S. National Science Foundation.
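Result (i) can be written compactly. A sketch in notation of our own choosing (not the talk's): let K_U(Y) denote the Kolmogorov complexity of output Y on the UTM U, and P_U(Y) the algorithmic probability of Y under random input programs. Since log_2 P_U(Y) <= 0, the "minus the log" above is read here as adding log_2 P_U(Y); that is exactly the combination Levin's coding theorem bounds by a machine-dependent constant:

```latex
% Minimal work to run program X on UTM U and produce output Y (in physical
% units; k_B T \ln 2 converts bits to joules). Symbols are our own labels.
W_{\min}(X \to Y) \;=\; k_B T \ln 2 \,\bigl[\, K_U(Y) + \log_2 P_U(Y) \,\bigr],
\qquad
P_U(Y) \;=\; \sum_{X :\, U(X) = Y} 2^{-|X|} .
% Levin's coding theorem gives the finite, Y-independent upper bound c_U:
0 \;\le\; K_U(Y) + \log_2 P_U(Y) \;\le\; c_U .
```

The bound c_U depends only on the details of U, which matches the claim that the minimal work has a finite upper bound independent of the output Y.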

  14. 78 FR 716 - DOE/NSF Nuclear Science Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-04

    ... DEPARTMENT OF ENERGY DOE/NSF Nuclear Science Advisory Committee AGENCY: Office of Science, DOE. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the DOE/NSF Nuclear Science... Energy and the National Science Foundation on scientific priorities within the field of basic nuclear...

  15. Calendar Year 2001 Annual Site Environmental Report, Sandia National Laboratories, Albuquerque, New Mexico

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    VIGIL, FRANCINE S.; SANCHEZ, REBECCA D.; WAGNER, KATRINA

    2002-09-01

    Sandia National Laboratories, New Mexico (SNL/NM) is a government-owned, contractor-operated facility overseen by the U.S. Department of Energy (DOE), National Nuclear Security Administration (NNSA) through the Albuquerque Operations Office (AL), Office of Kirtland Site Operations (OKSO). Sandia Corporation, a wholly-owned subsidiary of Lockheed Martin Corporation, operates SNL/NM. Work performed at SNL/NM is in support of the DOE and Sandia Corporation's mission to provide weapon component technology and hardware for the needs of the nation's security. Sandia Corporation also conducts fundamental research and development (R&D) to advance technology in energy research, computer science, waste management, microelectronics, materials science, and transportation safety for hazardous and nuclear components. In support of Sandia Corporation's mission, the Integrated Safety and Security (ISS) Center and the Environmental Restoration (ER) Project at SNL/NM have established extensive environmental programs to assist Sandia Corporation's line organizations in meeting all applicable local, state, and federal environmental regulations and DOE requirements. This annual report summarizes data and the compliance status of Sandia Corporation's environmental protection and monitoring programs through December 31, 2001. Major environmental programs include air quality, water quality, groundwater protection, terrestrial surveillance, waste management, pollution prevention (P2), environmental remediation, oil and chemical spill prevention, and the National Environmental Policy Act (NEPA). Environmental monitoring and surveillance programs are required by DOE Order 5400.1, General Environmental Protection Program (DOE 1990) and DOE Order 231.1, Environment, Safety, and Health Reporting (DOE 1996).

  16. Computational immunology--from bench to virtual reality.

    PubMed

    Chan, Cliburn; Kepler, Thomas B

    2007-02-01

    Drinking from a fire-hose is an old cliché for the experience of learning basic and clinical sciences in medical school, and the pipe has been growing fatter at an alarming rate. Of course, it does not stop when one graduates; if anything, both the researcher and clinician are flooded with even more information. Slightly embarrassingly, while modern science is very good at generating new information, our ability to weave multiple strands of data into a useful and coherent story lags quite far behind. Bioinformatics, systems biology and computational medicine have arisen in recent years to address just this challenge. This essay is an introduction to the problem of data synthesis and integration in biology and medicine, and how the relatively new art of biological simulation can provide a new kind of map for understanding physiology and pathology. The nascent field of computational immunology will be used for illustration, but similar trends are occurring broadly across all of biology and medicine.

  17. Mars Science Laboratory Heatshield Aerothermodynamics: Design and Reconstruction

    NASA Technical Reports Server (NTRS)

    Edquist, Karl T.; Hollis, Brian R.; Johnston, Christopher O.; Bose, Deepak; White, Todd R.; Mahzari, Milad

    2013-01-01

    The Mars Science Laboratory heatshield was designed to withstand a fully turbulent heat pulse based on test results and computational analysis on a pre-flight design trajectory. Instrumentation on the flight heatshield measured in-depth temperatures in the thermal protection system. The data indicate that boundary layer transition occurred at 5 of 7 thermocouple locations prior to peak heating. Data oscillations at 3 pressure measurement locations may also indicate transition. This paper presents the heatshield temperature and pressure data, possible explanations for the timing of boundary layer transition, and a qualitative comparison of reconstructed and computational heating on the as-flown trajectory. Boundary layer Reynolds numbers that are typically used to predict transition are compared to observed transition at various heatshield locations. A uniform smooth-wall transition Reynolds number does not explain the timing of boundary layer transition observed during flight. A roughness-based Reynolds number supports the possibility of transition due to discrete or distributed roughness elements on the heatshield. However, the distributed roughness height would have needed to be larger than the pre-flight assumption. The instrumentation confirmed the predicted location of maximum turbulent heat flux near the leeside shoulder. The reconstructed heat flux at that location is bounded by smooth-wall turbulent calculations on the reconstructed trajectory, indicating that augmentation due to surface roughness probably did not occur. Turbulent heating on the downstream side of the heatshield nose exceeded smooth-wall computations, indicating that roughness may have augmented heating. The stagnation region also experienced heating that exceeded computational levels, but shock layer radiation does not fully explain the differences.

  18. Basic Energy Sciences Exascale Requirements Review. An Office of Science review sponsored jointly by Advanced Scientific Computing Research and Basic Energy Sciences, November 3-5, 2015, Rockville, Maryland

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Windus, Theresa; Banda, Michael; Devereaux, Thomas

    Computers have revolutionized every aspect of our lives. Yet in science, the most tantalizing applications of computing lie just beyond our reach. The current quest to build an exascale computer with one thousand times the capability of today’s fastest machines (and more than a million times that of a laptop) will take researchers over the next horizon. The field of materials, chemical reactions, and compounds is inherently complex. Imagine millions of new materials with new functionalities waiting to be discovered — while researchers also seek to extend those materials that are known to a dizzying number of new forms. We could translate massive amounts of data from high precision experiments into new understanding through data mining and analysis. We could have at our disposal the ability to predict the properties of these materials, to follow their transformations during reactions on an atom-by-atom basis, and to discover completely new chemical pathways or physical states of matter. Extending these predictions from the nanoscale to the mesoscale, from the ultrafast world of reactions to long-time simulations to predict the lifetime performance of materials, and to the discovery of new materials and processes will have a profound impact on energy technology. In addition, discovery of new materials is vital to move computing beyond Moore’s law. To realize this vision, more than hardware is needed. New algorithms to take advantage of the increase in computing power, new programming paradigms, and new ways of mining massive data sets are needed as well. This report summarizes the opportunities and the requisite computing ecosystem needed to realize the potential before us. In addition to pursuing new and more complete physical models and theoretical frameworks, this review found that the following broadly grouped areas relevant to the U.S. Department of Energy (DOE) Office of Advanced Scientific Computing Research (ASCR) would directly affect the Basic Energy Sciences (BES) mission need. Simulation, visualization, and data analysis are crucial for advances in energy science and technology. Revolutionary mathematical, software, and algorithm developments are required in all areas of BES science to take advantage of exascale computing architectures and to meet data analysis, management, and workflow needs. In partnership with ASCR, BES has an emerging and pressing need to develop new and disruptive capabilities in data science. More capable and larger high-performance computing (HPC) and data ecosystems are required to support priority research in BES. Continued success in BES research requires developing the next-generation workforce through education and training and by providing sustained career opportunities.

  19. ORNL Sustainable Campus Initiative

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Halford, Christopher K

    2012-01-01

    The research conducted at Oak Ridge National Laboratory (ORNL) spans many disciplines and has the potential for far-reaching impact in many areas of everyday life. ORNL researchers and operations staff work on projects in areas as diverse as nuclear power generation, transportation, materials science, computing, and building technologies. As the U.S. Department of Energy's (DOE) largest science and energy research facility, ORNL seeks to establish partnerships with industry in the development of innovative new technologies. The primary focus of this current research deals with developing technologies which improve or maintain the quality of life for humans while reducing the overall impact on the environment. In its interactions with industry, ORNL serves both as a facility for sustainable research and as a representative of DOE to the private sector. For these reasons it is important that the everyday operations of the Laboratory reflect a dedication to the concepts of stewardship and sustainability.

  20. FLASH Interface; a GUI for managing runtime parameters in FLASH simulations

    NASA Astrophysics Data System (ADS)

    Walker, Christopher; Tzeferacos, Petros; Weide, Klaus; Lamb, Donald; Flocke, Norbert; Feister, Scott

    2017-10-01

    We present FLASH Interface, a novel graphical user interface (GUI) for managing runtime parameters in simulations performed with the FLASH code. FLASH Interface supports full text search of available parameters; provides descriptions of each parameter's role and function; allows for the filtering of parameters based on categories; performs input validation; and maintains all comments and non-parameter information already present in existing parameter files. The GUI can be used to edit existing parameter files or generate new ones. FLASH Interface is open source and was implemented with the Electron framework, making it available on macOS, Windows, and Linux operating systems. The new interface lowers the entry barrier for new FLASH users and provides an easy-to-use tool for experienced FLASH simulators. U.S. Department of Energy (DOE), NNSA ASC/Alliances Center for Astrophysical Thermonuclear Flashes, U.S. DOE NNSA ASC through the Argonne Institute for Computing in Science, U.S. National Science Foundation.
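
    As an illustration of the kind of file such a tool manages, the sketch below parses a FLASH-style "name = value" runtime parameter file while preserving comments and blank lines, as the abstract says the GUI does. This is a hypothetical reconstruction for illustration only, not code from FLASH Interface; the parameter names shown are common FLASH runtime parameters.

```python
# Hypothetical sketch of parsing a FLASH-style flash.par file, keeping
# comments and blank lines verbatim the way FLASH Interface is described
# as doing. Not code from the actual Electron-based GUI.
def parse_par(text):
    """Return (params, other_lines): parameter name -> value string,
    plus every comment/blank line preserved with its line number."""
    params, other = {}, []
    for lineno, line in enumerate(text.splitlines()):
        stripped = line.strip()
        if "=" in stripped and not stripped.startswith("#"):
            name, _, value = stripped.partition("=")
            params[name.strip()] = value.strip()
        else:
            other.append((lineno, line))    # preserved, not discarded
    return params, other

sample = """# Sedov blast wave setup
basenm = "sedov_"
lrefine_max = 6

cfl = 0.8
"""
params, other = parse_par(sample)
assert params["lrefine_max"] == "6"
assert params["cfl"] == "0.8"
assert other[0] == (0, "# Sedov blast wave setup")
```

    A real tool would additionally validate values against the parameter metadata FLASH generates at setup time; this sketch only separates parameters from the preserved non-parameter text.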

  1. Final Report on DOE Project entitled Dynamic Optimized Advanced Scheduling of Bandwidth Demands for Large-Scale Science Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramamurthy, Byravamurthy

    2014-05-05

    In this project, we developed scheduling frameworks and algorithms for dynamic bandwidth demands of large-scale science applications. Apart from theoretical approaches such as Integer Linear Programming, Tabu Search, and Genetic Algorithm heuristics, we used practical data from the ESnet OSCARS project (from our DOE lab partners) to conduct realistic simulations of our approaches. We disseminated our work through conference paper presentations, journal papers, and a book chapter. We addressed the problem of scheduling lightpaths over optical wavelength division multiplexed (WDM) networks and published several conference and journal papers on this topic. We also addressed the problem of joint allocation of computing, storage, and networking resources in Grid/Cloud networks and proposed energy-efficient mechanisms for operating optical WDM networks.
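
    To make the flavor of advance bandwidth scheduling concrete, the sketch below implements a minimal admission test for one reservation request on a single link: far simpler than the ILP, Tabu Search, and Genetic Algorithm formulations the project studied, and with hypothetical names and a hypothetical capacity model.

```python
# Illustrative sketch only: greedy admission control for advance bandwidth
# reservations on a single link. All names and units are hypothetical; the
# actual project used ILP and metaheuristic formulations over WDM topologies.

def admit(reservations, start, end, demand, capacity):
    """Accept a request for `demand` Gb/s over [start, end) if, at every
    instant it overlaps, existing reservations plus the request stay
    within `capacity`. On success, record the reservation."""
    # Link load can only change at a reservation start time, so it
    # suffices to check the request's own start and every overlapping
    # reservation's start that falls inside the request window.
    points = {start} | {s for s, e, d in reservations if start < e and s < end}
    for t in points:
        if t < start:
            continue
        load = sum(d for s, e, d in reservations if s <= t < e)
        if load + demand > capacity:
            return False
    reservations.append((start, end, demand))
    return True

link = []                               # shared reservation list for one link
assert admit(link, 0, 10, 40, 100)      # 40 Gb/s for t in [0, 10)
assert admit(link, 5, 15, 50, 100)      # overlaps; 90 <= 100, accepted
assert not admit(link, 8, 9, 20, 100)   # would exceed 100 Gb/s at t = 8
```

    Real advance-reservation systems must also search alternative start times and paths when a request is rejected, which is where the optimization formulations come in.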

  2. Management, Analysis, and Visualization of Experimental and Observational Data – The Convergence of Data and Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bethel, E. Wes; Greenwald, Martin; Kleese van Dam, Kerstin

    Scientific user facilities—particle accelerators, telescopes, colliders, supercomputers, light sources, sequencing facilities, and more—operated by the U.S. Department of Energy (DOE) Office of Science (SC) generate ever increasing volumes of data at unprecedented rates from experiments, observations, and simulations. At the same time there is a growing community of experimentalists who require real-time data analysis feedback to steer their complex experimental instruments toward optimized scientific outcomes and new discoveries. Recent efforts in DOE-SC have focused on articulating the data-centric challenges and opportunities facing these science communities. Key challenges include difficulties coping with data size, rate, and complexity in the context of both real-time and post-experiment data analysis and interpretation. Solutions will require algorithmic and mathematical advances, as well as hardware and software infrastructures that adequately support data-intensive scientific workloads. This paper presents the summary findings of a workshop held by DOE-SC in September 2015, convened to identify the major challenges and the research that is needed to meet those challenges.

  4. Final report, DOE/industry matching grant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumar, Arvind S.

    2003-02-25

    The Department of Energy/Industry Matching Grant was used to help improve nuclear engineering and science education at the University of Missouri-Rolla. The funds helped in the areas of recruitment and retention. Funds allowed the department to give scholarships to over 100 students (names included). Funds were also used for equipment upgrades and research, including two computers with peripherals, two NaI detectors, and a thermoluminescent dosimeter.

  5. High-Productivity Computing in Computational Physics Education

    NASA Astrophysics Data System (ADS)

    Tel-Zur, Guy

    2011-03-01

    We describe the development of a new course in Computational Physics at Ben-Gurion University. This elective course for 3rd-year undergraduates and M.Sc. students is taught over one semester. Computational Physics is by now well accepted as the Third Pillar of Science. This paper's claim is that modern Computational Physics education should also address High-Productivity Computing. The traditional approach to teaching Computational Physics emphasizes ``Correctness'' and then ``Accuracy''; we add ``Performance.'' Along with topics in Mathematical Methods and case studies in Physics, the course devotes a significant amount of time to ``Mini-Courses'' on topics such as: High-Throughput Computing with Condor, Parallel Programming with MPI and OpenMP, how to build a Beowulf cluster, Visualization, and Grid and Cloud Computing. The course intends to teach neither new physics nor new mathematics; instead it focuses on an integrated approach to problem solving, starting from the physics problem, through the corresponding mathematical solution and numerical scheme, to writing an efficient computer code, and finally analysis and visualization.
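
    The progression the abstract describes (correctness, then accuracy, then performance) can be sketched with a standard numerical example. The code below is a hypothetical illustration, not course material: a composite trapezoid rule, an accuracy check against a known integral, and a parallel version using Python's standard-library multiprocessing in place of MPI/OpenMP.

```python
# Hypothetical illustration of the correctness -> accuracy -> performance
# progression: composite trapezoid integration, checked against a known
# answer, then parallelized with the standard library's multiprocessing.
import math
from multiprocessing import Pool

def trapezoid(f, a, b, n):
    """Composite trapezoid rule on [a, b] with n intervals."""
    h = (b - a) / n
    s = 0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n))
    return h * s

def _chunk(args):
    a, b, n = args
    return trapezoid(math.sin, a, b, n)

def parallel_trapezoid(a, b, n, nproc=4):
    """Split [a, b] into nproc subintervals and sum the partial integrals."""
    edges = [a + (b - a) * k / nproc for k in range(nproc + 1)]
    tasks = [(edges[k], edges[k + 1], n // nproc) for k in range(nproc)]
    with Pool(nproc) as pool:
        return sum(pool.map(_chunk, tasks))

if __name__ == "__main__":
    exact = 2.0  # integral of sin(x) over [0, pi]
    approx = trapezoid(math.sin, 0.0, math.pi, 10_000)
    assert abs(approx - exact) < 1e-6            # accuracy check
    assert abs(parallel_trapezoid(0.0, math.pi, 10_000) - exact) < 1e-6
```

    The same decomposition-by-subinterval pattern carries over directly to an MPI version, where each rank integrates its own subinterval and the partial sums are reduced.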

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sjoreen, Terrence P

    The Oak Ridge National Laboratory (ORNL) Laboratory Directed Research and Development (LDRD) Program reports its status to the U.S. Department of Energy (DOE) in March of each year. The program operates under the authority of DOE Order 413.2A, 'Laboratory Directed Research and Development' (January 8, 2001), which establishes DOE's requirements for the program while providing the Laboratory Director broad flexibility for program implementation. LDRD funds are obtained through a charge to all Laboratory programs. This report describes all ORNL LDRD research activities supported during FY 2005 and includes final reports for completed projects and shorter progress reports for projects that were active, but not completed, during this period. The FY 2005 ORNL LDRD Self-Assessment (ORNL/PPA-2006/2) provides financial data about the FY 2005 projects and an internal evaluation of the program's management process. ORNL is a DOE multiprogram science, technology, and energy laboratory with distinctive capabilities in materials science and engineering, neutron science and technology, energy production and end-use technologies, biological and environmental science, and scientific computing. With these capabilities ORNL conducts basic and applied research and development (R&D) to support DOE's overarching national security mission, which encompasses science, energy resources, environmental quality, and national nuclear security. As a national resource, the Laboratory also applies its capabilities and skills to the specific needs of other federal agencies and customers through the DOE Work For Others (WFO) program. Information about the Laboratory and its programs is available on the Internet at . LDRD is a relatively small but vital DOE program that allows ORNL, as well as other multiprogram DOE laboratories, to select a limited number of R&D projects for the purpose of: (1) maintaining the scientific and technical vitality of the Laboratory; (2) enhancing the Laboratory's ability to address future DOE missions; (3) fostering creativity and stimulating exploration of forefront science and technology; (4) serving as a proving ground for new research; and (5) supporting high-risk, potentially high-value R&D. Through LDRD the Laboratory is able to improve its distinctive capabilities and enhance its ability to conduct cutting-edge R&D for its DOE and WFO sponsors. To meet the LDRD objectives and fulfill the particular needs of the Laboratory, ORNL has established a program with two components: the Director's R&D Fund and the Seed Money Fund. As outlined in Table 1, these two funds are complementary. The Director's R&D Fund develops new capabilities in support of the Laboratory initiatives, while the Seed Money Fund is open to all innovative ideas that have the potential for enhancing the Laboratory's core scientific and technical competencies. Provision for multiple routes of access to ORNL LDRD funds maximizes the likelihood that novel and seminal ideas with scientific and technological merit will be recognized and supported.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sjoreen, Terrence P

    The Oak Ridge National Laboratory (ORNL) Laboratory Directed Research and Development (LDRD) Program reports its status to the U.S. Department of Energy (DOE) in March of each year. The program operates under the authority of DOE Order 413.2A, 'Laboratory Directed Research and Development' (January 8, 2001), which establishes DOE's requirements for the program while providing the Laboratory Director broad flexibility for program implementation. LDRD funds are obtained through a charge to all Laboratory programs. This report describes all ORNL LDRD research activities supported during FY 2004 and includes final reports for completed projects and shorter progress reports for projects that were active, but not completed, during this period. The FY 2004 ORNL LDRD Self-Assessment (ORNL/PPA-2005/2) provides financial data about the FY 2004 projects and an internal evaluation of the program's management process. ORNL is a DOE multiprogram science, technology, and energy laboratory with distinctive capabilities in materials science and engineering, neutron science and technology, energy production and end-use technologies, biological and environmental science, and scientific computing. With these capabilities ORNL conducts basic and applied research and development (R&D) to support DOE's overarching national security mission, which encompasses science, energy resources, environmental quality, and national nuclear security. As a national resource, the Laboratory also applies its capabilities and skills to the specific needs of other federal agencies and customers through the DOE Work For Others (WFO) program. Information about the Laboratory and its programs is available on the Internet at . LDRD is a relatively small but vital DOE program that allows ORNL, as well as other multiprogram DOE laboratories, to select a limited number of R&D projects for the purpose of: (1) maintaining the scientific and technical vitality of the Laboratory; (2) enhancing the Laboratory's ability to address future DOE missions; (3) fostering creativity and stimulating exploration of forefront science and technology; (4) serving as a proving ground for new research; and (5) supporting high-risk, potentially high-value R&D. Through LDRD the Laboratory is able to improve its distinctive capabilities and enhance its ability to conduct cutting-edge R&D for its DOE and WFO sponsors. To meet the LDRD objectives and fulfill the particular needs of the Laboratory, ORNL has established a program with two components: the Director's R&D Fund and the Seed Money Fund. As outlined in Table 1, these two funds are complementary. The Director's R&D Fund develops new capabilities in support of the Laboratory initiatives, while the Seed Money Fund is open to all innovative ideas that have the potential for enhancing the Laboratory's core scientific and technical competencies. Provision for multiple routes of access to ORNL LDRD funds maximizes the likelihood that novel and seminal ideas with scientific and technological merit will be recognized and supported.

  8. Transportable, university-level educational programs in interactive information storage and retrieval systems

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D.; Roquemore, Leroy

    1984-01-01

    Pursuant to the specifications of a research contract entered into in December 1983 with NASA, the Computer Science Departments of the University of Southwestern Louisiana and Southern University will be working jointly to address a variety of research and educational issues relating to the use, by non-computer professionals, of some of the largest and most sophisticated interactive information storage and retrieval systems available. Over the projected 6- to 8-year life of the project, in addition to NASA/RECON, the following systems will be examined: Lockheed DIALOG, DOE/RECON, DOD/DTIC, EPA/CSIN, and LLNL/TIS.

  9. The Center for Nanophase Materials Sciences

    NASA Astrophysics Data System (ADS)

    Lowndes, Douglas

    2005-03-01

    The Center for Nanophase Materials Sciences (CNMS) located at Oak Ridge National Laboratory (ORNL) will be the first DOE Nanoscale Science Research Center to begin operation, with construction to be completed in April 2005 and initial operations in October 2005. The CNMS' scientific program has been developed through workshops with the national community, with the goal of creating a highly collaborative research environment to accelerate discovery and drive technological advances. Research at the CNMS is organized under seven Scientific Themes selected to address challenges to understanding and to exploit particular ORNL strengths (see http://cnms.ornl.gov). These include extensive synthesis and characterization capabilities for soft, hard, nanostructured, magnetic and catalytic materials and their composites; neutron scattering at the Spallation Neutron Source and High Flux Isotope Reactor; computational nanoscience in the CNMS' Nanomaterials Theory Institute and utilizing facilities and expertise of the Center for Computational Sciences and the new Leadership Scientific Computing Facility at ORNL; a new CNMS Nanofabrication Research Laboratory; and a suite of unique and state-of-the-art instruments to be made reliably available to the national community for imaging, manipulation, and properties measurements on nanoscale materials in controlled environments. The new research facilities will be described together with the planned operation of the user research program, the latter illustrated by the current ``jump start'' user program that utilizes existing ORNL/CNMS facilities.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sjoreen, Terrence P

    The Oak Ridge National Laboratory (ORNL) Laboratory Directed Research and Development (LDRD) program reports its status to the U.S. Department of Energy (DOE) in March of each year. The program operates under the authority of DOE Order 413.2B, 'Laboratory Directed Research and Development' (April 19, 2006), which establishes DOE's requirements for the program while providing the Laboratory Director broad flexibility for program implementation. LDRD funds are obtained through a charge to all Laboratory programs. This report includes summaries for all ORNL LDRD research activities supported during FY 2007. The associated FY 2007 ORNL LDRD Self-Assessment (ORNL/PPA-2008/2) provides financial data and an internal evaluation of the program's management process. ORNL is a DOE multiprogram science, technology, and energy laboratory with distinctive capabilities in materials science and engineering, neutron science and technology, energy production and end-use technologies, biological and environmental science, and scientific computing. With these capabilities ORNL conducts basic and applied research and development (R&D) to support DOE's overarching mission to advance the national, economic, and energy security of the United States and promote scientific and technological innovation in support of that mission. As a national resource, the Laboratory also applies its capabilities and skills to specific needs of other federal agencies and customers through the DOE Work for Others (WFO) program. Information about the Laboratory and its programs is available on the Internet at http://www.ornl.gov/. LDRD is a relatively small but vital DOE program that allows ORNL, as well as other DOE laboratories, to select a limited number of R&D projects for the purpose of: (1) maintaining the scientific and technical vitality of the Laboratory; (2) enhancing the Laboratory's ability to address future DOE missions; (3) fostering creativity and stimulating exploration of forefront science and technology; (4) serving as a proving ground for new research; and (5) supporting high-risk, potentially high-value R&D. Through LDRD the Laboratory is able to improve its distinctive capabilities and enhance its ability to conduct cutting-edge R&D for its DOE and WFO sponsors. To meet the LDRD objectives and fulfill the particular needs of the Laboratory, ORNL has established a program with two components: the Director's R&D Fund and the Seed Money Fund. As outlined in Table 1, these two funds are complementary. The Director's R&D Fund develops new capabilities in support of the Laboratory initiatives, while the Seed Money Fund is open to all innovative ideas that have the potential for enhancing the Laboratory's core scientific and technical competencies. Provision for multiple routes of access to ORNL LDRD funds maximizes the likelihood that novel ideas with scientific and technological merit will be recognized and supported.

  11. Biological and Environmental Research Network Requirements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balaji, V.; Boden, Tom; Cowley, Dave

    2013-09-01

    The Energy Sciences Network (ESnet) is the primary provider of network connectivity for the U.S. Department of Energy (DOE) Office of Science (SC), the single largest supporter of basic research in the physical sciences in the United States. In support of SC programs, ESnet regularly updates and refreshes its understanding of the networking requirements of the instruments, facilities, scientists, and science programs that it serves. This focus has helped ESnet be a highly successful enabler of scientific discovery for over 25 years. In November 2012, ESnet and the Office of Biological and Environmental Research (BER) of the DOE SC organized a review to characterize the networking requirements of the programs funded by the BER program office. Several key findings resulted from the review. Among them: 1) The scale of data sets available to science collaborations continues to increase exponentially. This has broad impact, both on the network and on the computational and storage systems connected to the network. 2) Many science collaborations require assistance to cope with the systems and network engineering challenges inherent in managing the rapid growth in data scale. 3) Several science domains operate distributed facilities that rely on high-performance networking for success. Key examples illustrated in this report include the Earth System Grid Federation (ESGF) and the Systems Biology Knowledgebase (KBase). This report expands on these points, and addresses others as well. The report contains a findings section as well as the text of the case studies discussed at the review.

  12. Computational ecology as an emerging science

    PubMed Central

    Petrovskii, Sergei; Petrovskaya, Natalia

    2012-01-01

    It has long been recognized that numerical modelling and computer simulations can be used as a powerful research tool to understand, and sometimes to predict, the tendencies and peculiarities in the dynamics of populations and ecosystems. It has been much less appreciated, however, that the context of modelling and simulations in ecology is essentially different from that in other natural sciences. In our paper, we review the computational challenges arising in modern ecology in the spirit of computational mathematics, i.e. with our main focus on the choice and use of adequate numerical methods. Somewhat paradoxically, the complexity of ecological problems does not always require the use of complex computational methods. This paradox, however, can be easily resolved if we recall that the application of sophisticated computational methods usually requires a clear and unambiguous mathematical problem statement as well as clearly defined benchmark information for model validation. At the same time, many ecological problems still do not have a mathematically accurate and unambiguous description, and available field data are often very noisy; hence it can be hard to understand how the results of computations should be interpreted from the ecological viewpoint. In this scientific context, computational ecology has to deal with a new paradigm: conventional issues of numerical modelling such as convergence and stability become less important than the qualitative analysis that can be provided with the help of computational techniques. We discuss this paradigm by considering computational challenges arising in several specific ecological applications. PMID:23565336
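
    One concrete instance of the convergence and stability issues referred to above can be sketched with explicit Euler integration of logistic growth, dN/dt = rN(1 - N/K). The example below is an illustration of the general point, not taken from the paper: with a small step the solution settles at the carrying capacity, while for r·dt > 2 the scheme oscillates and never converges.

```python
# Illustration (not from the paper): explicit Euler for logistic growth
# dN/dt = r*N*(1 - N/K). Small steps converge to the carrying capacity K;
# for r*dt > 2 the fixed point becomes unstable and the iterates oscillate.
def euler_logistic(n0, r, k, dt, steps):
    n = n0
    for _ in range(steps):
        n = n + dt * r * n * (1.0 - n / k)
    return n

stable = euler_logistic(n0=10.0, r=1.0, k=100.0, dt=0.1, steps=2000)
assert abs(stable - 100.0) < 1e-6        # settles at carrying capacity

unstable = euler_logistic(n0=10.0, r=1.0, k=100.0, dt=2.5, steps=2000)
assert abs(unstable - 100.0) > 1.0       # r*dt = 2.5: bounded oscillation
```

    In an ecological setting such numerically induced oscillations could easily be mistaken for real population cycles, which is exactly why the choice of numerical method matters.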

  13. Using Left Overs to Make Energy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steuterman, Sally; Czarnecki, Alicia; Hurley, Paul

    Representing the Materials Science of Actinides (MSA) center, this document is one of the entries in the Ten Hundred and One Word Challenge. As part of the challenge, the 46 Energy Frontier Research Centers were invited to represent their science in images, cartoons, photos, words, and original paintings, but any descriptions or words could only use the 1000 most commonly used words in the English language, with the addition of one word important to each of the EFRCs and the mission of DOE: energy. The mission of MSA is to conduct transformative research in the actinide sciences with full integration of experimental and computational approaches, and an emphasis on research questions that are important to the energy future of the nation.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Raugei, Simone; DuBois, Daniel L.; Rousseau, Roger J.

    Rational design of molecular catalysts requires a systematic approach to designing ligands with specific functionality and precisely tailored electronic and steric properties. It then becomes possible to devise computer protocols to predict accurately the required properties and ultimately to design catalysts by computer. In this account we first review how thermodynamic properties such as oxidation-reduction potentials (E0), acidities (pKa), and hydride donor abilities (ΔGH-) form the basis for a systematic design of molecular catalysts for reactions that are critical for a secure energy future (hydrogen evolution and oxidation, oxygen and nitrogen reduction, and carbon dioxide reduction). We highlight how density functional theory allows us to determine and predict these properties within “chemical” accuracy (~0.06 eV for redox potentials, ~1 pKa unit for pKa values, and ~1.5 kcal/mol for hydricities). These quantities determine free energy maps and profiles associated with catalytic cycles, i.e. the relative energies of intermediates, and help us distinguish between desirable and high-energy pathways and mechanisms. Good catalysts have flat profiles that avoid high activation barriers due to low- and high-energy intermediates. We illustrate how the criterion of a flat energy profile lends itself to the prediction of design points by computer for optimum catalysts. This research was carried out in the Center for Molecular Electrocatalysis, an Energy Frontier Research Center funded by the U.S. Department of Energy (DOE), Office of Science, Office of Basic Energy Sciences. Pacific Northwest National Laboratory (PNNL) is operated for the DOE by Battelle.
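
    The thermodynamic quantities named above map onto free energies through standard textbook conversions (added here for orientation, not equations quoted from the paper), e.g. at 298 K:

```latex
% Standard conversions at T = 298 K (textbook relations, not from the paper):
\Delta G^{0}_{\mathrm{redox}} = -nF E^{0}
  \qquad \text{(couple transferring $n$ electrons)} \\
\Delta G^{0}_{\mathrm{deprot}} = 2.303\,R T\,\mathrm{p}K_{\mathrm{a}}
  \approx 1.37\,\mathrm{p}K_{\mathrm{a}}\ \mathrm{kcal\,mol^{-1}}
```

    Combining such terms around a thermochemical cycle is what produces the free energy maps of catalytic intermediates the authors describe.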

  15. Mechanistic Insights into the Structure-Dependent Selectivity of Catalytic Furfural Conversion on Platinum Catalysts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cai, Qiuxia; Wang, Jianguo; Wang, Yang-Gang

    The effects of structure and size on the selectivity of catalytic furfural conversion over supported Pt catalysts in the presence of hydrogen have been studied using first-principles density functional theory (DFT) calculations and microkinetic modeling. Four Pt model systems, i.e., periodic Pt(111) and Pt(211) surfaces as well as small nanoclusters (Pt13 and Pt55), are chosen to represent the terrace, step, and corner sites of Pt nanoparticles. Our DFT results show that the reaction routes for furfural hydrogenation and decarbonylation depend strongly on the type of reactive site, which leads to the different selectivities. On the basis of the size-dependent site distribution rule, we correlate the site distributions with the Pt particle size. Our microkinetic results indicate that the critical particle size controlling the furfural selectivity is about 1.0 nm, in good agreement with the reported experimental value under reaction conditions. This work was supported by the National Basic Research Program of China (973 Program) (2013CB733501) and the National Natural Science Foundation of China (NSFC-21306169, 21176221, 21136001, 21101137 and 91334103). This work was also partially supported by the US Department of Energy (DOE), Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences & Biosciences. Pacific Northwest National Laboratory (PNNL) is a multiprogram national laboratory operated for DOE by Battelle. Computing time was granted by the grand challenge of computational catalysis of the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL). EMSL is a national scientific user facility located at PNNL and sponsored by DOE’s Office of Biological and Environmental Research.

  16. Unsupervised Learning Through Randomized Algorithms for High-Volume High-Velocity Data (ULTRA-HV).

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pinar, Ali; Kolda, Tamara G.; Carlberg, Kevin Thomas

    Through long-term investments in computing, algorithms, facilities, and instrumentation, DOE is an established leader in massive-scale, high-fidelity simulations, as well as science-leading experimentation. In both cases, DOE is generating more data than it can analyze, and the problem is intensifying quickly. The need for advanced algorithms that can automatically convert the abundance of data into a wealth of useful information by discovering hidden structures is well recognized. Such efforts, however, are hindered by the massive volume of the data and its high velocity. Here, the challenge is developing unsupervised learning methods to discover hidden structure in high-volume, high-velocity data.

  17. A hybrid Gerchberg-Saxton-like algorithm for DOE and CGH calculation

    NASA Astrophysics Data System (ADS)

    Wang, Haichao; Yue, Weirui; Song, Qiang; Liu, Jingdan; Situ, Guohai

    2017-02-01

    The Gerchberg-Saxton (GS) algorithm is widely used in various disciplines of modern science and technology where phase retrieval is required. However, this classic algorithm often stagnates after a few iterations. Many attempts have been made to improve this situation. Here we propose to introduce a gradient-descent strategy and a weighting technique into the GS algorithm, and demonstrate it with two examples: the design of a diffractive optical element (DOE) to achieve off-axis illumination in lithographic tools, and the design of a computer-generated hologram (CGH) for holographic display. Both numerical simulations and optical experiments are carried out for demonstration.
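
    The basic GS iteration (alternating amplitude constraints between the DOE plane and the far field) can be sketched as follows. This is a minimal textbook version assuming uniform illumination in the DOE plane and a target intensity in the Fourier plane; it is not the authors' weighted gradient-descent variant, whose modification would go where the far-field amplitude is enforced.

```python
# Minimal Gerchberg-Saxton sketch for phase-only DOE design.
# Assumptions: unit-amplitude illumination, target amplitude given
# in the Fourier (far-field) plane.
import numpy as np

def gs_phase(target_amp, n_iter=50, rng=None):
    rng = rng or np.random.default_rng(0)
    phase = rng.uniform(0, 2 * np.pi, target_amp.shape)
    field = np.exp(1j * phase)               # unit-amplitude DOE plane
    for _ in range(n_iter):
        far = np.fft.fft2(field)
        # enforce target amplitude, keep phase (a weighting/gradient
        # step to fight stagnation would modify this line)
        far = target_amp * np.exp(1j * np.angle(far))
        near = np.fft.ifft2(far)
        field = np.exp(1j * np.angle(near))  # enforce unit amplitude
    return np.angle(field)

# Toy target: a single bright off-axis spot
target = np.zeros((64, 64))
target[10, 20] = 1.0
doe_phase = gs_phase(target)

# Diffraction efficiency into the target spot
recon = np.abs(np.fft.fft2(np.exp(1j * doe_phase)))
print(recon[10, 20] / recon.sum())
```

For this toy target an exact solution (a tilted plane wave) exists, so nearly all the energy lands in the desired spot; realistic multi-spot targets are where stagnation appears and weighting helps.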

  18. Learning physical biology via modeling and simulation: A new course and textbook for science and engineering undergraduates

    NASA Astrophysics Data System (ADS)

    Nelson, Philip

    To a large extent, undergraduate physical-science curricula remain firmly rooted in pencil-and-paper calculation, despite the fact that most research is done with computers. Likewise, undergraduate life-science curricula remain firmly rooted in descriptive approaches, despite the fact that much current research involves quantitative modeling. Not only does our pedagogy not reflect current reality; it also creates a spurious barrier between the fields, reinforcing the narrow silos that prevent students from connecting them. I'll describe an intermediate-level course on ``Physical Models of Living Systems.'' The prerequisite is first-year university physics and calculus. The course is a response to rapidly growing interest among undergraduates in a broad range of science and engineering majors. Students acquire several research skills that are often not addressed in traditional undergraduate courses: basic modeling skills; probabilistic modeling skills; data analysis methods; computer programming using a general-purpose platform such as MATLAB or Python; pulling datasets from the Web for analysis; data visualization; and dynamical systems, particularly feedback control. Partially supported by the NSF under Grants EF-0928048 and DMR-0832802.

  19. Development of a GPU-Accelerated 3-D Full-Wave Code for Electromagnetic Wave Propagation in a Cold Plasma

    NASA Astrophysics Data System (ADS)

    Woodbury, D.; Kubota, S.; Johnson, I.

    2014-10-01

    Computer simulations of electromagnetic wave propagation in magnetized plasmas are an important tool for both plasma heating and diagnostics. For active millimeter-wave and microwave diagnostics, accurately modeling the evolution of the beam parameters for launched, reflected, or scattered waves in a toroidal plasma requires that calculations be done using the full 3-D geometry. Previously, we reported on the application of GPGPU (General-Purpose computing on Graphics Processing Units) to a 3-D vacuum Maxwell code using the FDTD (Finite-Difference Time-Domain) method. Tests were done for Gaussian beam propagation with a hard source antenna, utilizing the parallel processing capabilities of the NVIDIA K20M. In the current study, we have modified the 3-D code to include a soft source antenna and an induced current density based on the cold plasma approximation. Results from Gaussian beam propagation in an inhomogeneous anisotropic plasma, along with comparisons to ray- and beam-tracing calculations, will be presented. Additional enhancements, such as advanced coding techniques for improved speedup, will also be investigated. Supported by U.S. DoE Grant DE-FG02-99-ER54527 and in part by the U.S. DoE, Office of Science, WDTS under the Science Undergraduate Laboratory Internship program.
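
    The FDTD update at the heart of such codes is easiest to see in one dimension. The following is a minimal vacuum sketch in normalized units (Courant number S = 1) with a soft Gaussian source; the 3-D codes described above generalize this leapfrog update to the full Yee grid and add the cold-plasma current term.

```python
# 1-D vacuum FDTD (Yee leapfrog) sketch in normalized units, S = 1.
# A soft source adds a Gaussian pulse at one grid point; the pulse
# then splits and propagates one cell per time step.
import numpy as np

n, steps = 200, 150
ez = np.zeros(n)   # electric field at integer grid points
hy = np.zeros(n)   # magnetic field at half-integer grid points
for t in range(steps):
    hy[:-1] += ez[1:] - ez[:-1]               # H half-step update
    ez[1:]  += hy[1:] - hy[:-1]               # E half-step update
    ez[50]  += np.exp(-((t - 30) / 10) ** 2)  # soft Gaussian source
print(float(np.abs(ez).max()))
```

A soft source (added to the field) lets waves pass through the source point, unlike the hard source mentioned in the earlier vacuum tests, which overwrites the field there.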

  20. Standardized input for Hanford environmental impact statements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Napier, B.A.

    1981-05-01

    Models and computer programs for simulating the environmental behavior of radionuclides and the resulting radiation dose to humans have been developed over the years by the Environmental Analysis Section staff, Ecological Sciences Department, at the Pacific Northwest Laboratory (PNL). Methodologies have evolved for calculating radiation doses from many exposure pathways for any type of release mechanism. Depending on the situation or process being simulated, different sets of computer programs, assumptions, and modeling techniques must be used. This report is a compilation of recommended computer programs and necessary input information for use in calculating doses to members of the general public for environmental impact statements prepared for DOE activities to be conducted on or near the Hanford Reservation.

  1. Remediation of Groundwater Contaminated by Nuclear Waste

    NASA Astrophysics Data System (ADS)

    Parker, Jack; Palumbo, Anthony

    2008-07-01

    A Workshop on Accelerating Development of Practical Field-Scale Bioremediation Models; An Online Meeting, 23 January to 20 February 2008; A Web-based workshop sponsored by the U.S. Department of Energy Environmental Remediation Sciences Program (DOE/ERSP) was organized in early 2008 to assess the state of the science and knowledge gaps associated with the use of computer models to facilitate remediation of groundwater contaminated by wastes from Cold War era nuclear weapons development and production. Microbially mediated biological reactions offer a potentially efficient means to treat these sites, but considerable uncertainty exists in the coupled biological, chemical, and physical processes and their mathematical representation.

  2. Bricklayer Static Analysis

    NASA Astrophysics Data System (ADS)

    Harris, Christopher

    In the U.S., science and math are taking the spotlight in education, and rightfully so, as they directly impact economic progress. Curiously absent is computer science, which does not receive as much focus despite its numerous job opportunities and growth. This thesis develops a source-code analysis framework using language translation and machine-learning classifiers to analyze programs written in Bricklayer, for the purposes of programmatically identifying the relative success or failure of a student's Bricklayer program, helping teachers scale the number of students they can support, and providing better messaging. The thesis uses a set of student programs as a case study to demonstrate the possibilities of the framework.

  3. Advances in Cross-Cutting Ideas for Computational Climate Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ng, Esmond; Evans, Katherine J.; Caldwell, Peter

    This report presents results from the DOE-sponsored workshop titled ``Advancing X-Cutting Ideas for Computational Climate Science Workshop,'' known as AXICCS, held on September 12--13, 2016 in Rockville, MD. The workshop brought together experts in climate science, computational climate science, computer science, and mathematics to discuss interesting but unsolved science questions regarding climate modeling and simulation, promoted collaboration among the diverse scientists in attendance, and brainstormed about possible tools and capabilities that could be developed to help address them. Several research opportunities that the group felt could significantly advance climate science emerged from the discussions. These include (1) process-resolving models to provide insight into important processes and features of interest and inform the development of advanced physical parameterizations, (2) a community effort to develop and provide integrated model credibility, (3) including, organizing, and managing increasingly connected model components that increase model fidelity but also complexity, and (4) treating Earth system models as one interconnected organism without numerical or data-based boundaries that limit interactions. The group also identified several cross-cutting advances in mathematics, computer science, and computational science that would be needed to enable one or more of these big ideas. It is critical to address the need for organized, verified, and optimized software, which enables the models to grow and continue to provide solutions in which the community can have confidence. Effectively utilizing the newest computer hardware enables simulation efficiency and the ability to handle output from increasingly complex and detailed models. This will be accomplished through hierarchical multiscale algorithms in tandem with new strategies for data handling, analysis, and storage. 
These big ideas and cross-cutting technologies for enabling breakthrough climate simulation advancements also need the "glue" of outreach and learning across the scientific domains to be successful. The workshop identified several strategies to allow productive, continuous engagement across those who have a broad knowledge of the various angles of the problem. Specific ideas to foster education and tools to make material progress were discussed. Examples include follow-on cross-cutting meetings that enable unstructured discussions of the types this workshop fostered. A concerted effort is needed to recruit undergraduate and graduate students from all relevant domains and provide them experience, training, and networking beyond their immediate expertise. This will broaden and expand their exposure to future needs and solutions, and provide a pipeline of scientists with a diversity of knowledge and know-how. Providing real-world experience with subject matter experts from multiple angles may also motivate the students to attack these problems and even come up with the missing solutions.

  5. Neuromorphic Computing – From Materials Research to Systems Architecture Roundtable

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schuller, Ivan K.; Stevens, Rick; Pino, Robinson

    2015-10-29

    Computation in its many forms is the engine that fuels our modern civilization. Modern computation—based on the von Neumann architecture—has allowed, until now, the development of continuous improvements, as predicted by Moore’s law. However, computation using current architectures and materials will inevitably—within the next 10 years—reach a limit because of fundamental scientific reasons. DOE convened a roundtable of experts in neuromorphic computing systems, materials science, and computer science in Washington on October 29-30, 2015 to address the following basic questions: Can brain-like (“neuromorphic”) computing devices based on new material concepts and systems be developed to dramatically outperform conventional CMOS-based technology? If so, what are the basic research challenges for materials science and computing? The overarching answer that emerged was: The development of novel functional materials and devices incorporated into unique architectures will allow a revolutionary technological leap toward the implementation of a fully “neuromorphic” computer. To address this challenge, the following issues were considered: (1) the main differences between neuromorphic and conventional computing, as related to signaling models, timing/clock, non-volatile memory, architecture, fault tolerance, integrated memory and compute, noise tolerance, analog vs. digital, and in situ learning; (2) the new neuromorphic architectures needed to produce lower energy consumption, potential novel nanostructured materials, and enhanced computation; (3) the device and materials properties needed to implement functions such as hysteresis, stability, and fault tolerance; and (4) comparisons of different implementations (spin torque, memristors, resistive switching, phase change, and optical schemes) for enhanced breakthroughs in performance, cost, fault tolerance, and/or manufacturability.

  6. Combinatorial Algorithms to Enable Computational Science and Engineering: Work from the CSCAPES Institute

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boman, Erik G.; Catalyurek, Umit V.; Chevalier, Cedric

    2015-01-16

    This final progress report summarizes the work accomplished at the Combinatorial Scientific Computing and Petascale Simulations (CSCAPES) Institute. We developed Zoltan, a parallel mesh partitioning library that made use of accurate hypergraph models to provide load balancing in mesh-based computations. We developed several graph coloring algorithms for computing Jacobian and Hessian matrices and organized them into a software package called ColPack. We developed parallel algorithms for graph coloring and graph matching problems, and also designed multi-scale graph algorithms. Three PhD students graduated, six more are continuing their PhD studies, and four postdoctoral scholars were advised. Six of these students and Fellows have joined DOE labs (Sandia, Berkeley) as staff scientists or as postdoctoral scientists. We also organized the SIAM Workshop on Combinatorial Scientific Computing (CSC) in 2007, 2009, and 2011 to continue to foster the CSC community.
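
    The coloring idea behind such Jacobian computations can be illustrated with a small sketch: columns of a sparse Jacobian that share no nonzero row are structurally orthogonal, so they may receive the same color and be probed together with a single finite-difference evaluation. The greedy ordering below is a deliberate simplification, not the actual ColPack algorithms.

```python
# Sketch: greedy column coloring for sparse Jacobian compression.
# Columns with overlapping row supports must get distinct colors;
# each color class can then be estimated in one function evaluation.

def greedy_column_coloring(rows_of_col):
    """rows_of_col[j] = set of row indices where column j is nonzero."""
    colors = {}
    for j, rows in enumerate(rows_of_col):
        # colors already used by structurally overlapping columns
        forbidden = {colors[k] for k in colors if rows_of_col[k] & rows}
        c = 0
        while c in forbidden:
            c += 1
        colors[j] = c
    return colors

# Arrowhead-like sparsity: column 0 hits every row, the rest are disjoint,
# so 2 colors (hence 2 evaluations) suffice instead of 4.
pattern = [{0, 1, 2, 3}, {1}, {2}, {3}]
coloring = greedy_column_coloring(pattern)
print(coloring)
```

The number of colors, rather than the number of columns, then bounds the number of Jacobian-vector probes needed.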

  7. Preface: SciDAC 2006

    NASA Astrophysics Data System (ADS)

    Tang, William M., Dr.

    2006-01-01

    The second annual Scientific Discovery through Advanced Computing (SciDAC) Conference was held from June 25-29, 2006 at the new Hyatt Regency Hotel in Denver, Colorado. This conference showcased outstanding SciDAC-sponsored computational science results achieved during the past year across many scientific domains, with an emphasis on science at scale. Exciting computational science that has been accomplished outside of the SciDAC program both nationally and internationally was also featured to help foster communication between SciDAC computational scientists and those funded by other agencies. This was illustrated by many compelling examples of how domain scientists collaborated productively with applied mathematicians and computer scientists to effectively take advantage of terascale computers (capable of performing trillions of calculations per second) not only to accelerate progress in scientific discovery in a variety of fields but also to show great promise for being able to utilize the exciting petascale capabilities in the near future. The SciDAC program was originally conceived as an interdisciplinary computational science program based on the guiding principle that strong collaborative alliances between domain scientists, applied mathematicians, and computer scientists are vital to accelerated progress and associated discovery on the world's most challenging scientific problems. Associated verification and validation are essential in this successful program, which was funded by the US Department of Energy Office of Science (DOE OS) five years ago. As is made clear in many of the papers in these proceedings, SciDAC has fundamentally changed the way that computational science is now carried out in response to the exciting challenge of making the best use of the rapid progress in the emergence of more and more powerful computational platforms. In this regard, Dr. Raymond Orbach, Energy Undersecretary for Science at the DOE and Director of the OS, has stated: `SciDAC has strengthened the role of high-end computing in furthering science. It is defining whole new fields for discovery.' (SciDAC Review, Spring 2006, p8). Application domains within the SciDAC 2006 conference agenda encompassed a broad range of science including: (i) the DOE core mission of energy research involving combustion studies relevant to fuel efficiency and pollution issues faced today and magnetic fusion investigations impacting prospects for future energy sources; (ii) fundamental explorations into the building blocks of matter, ranging from quantum chromodynamics - the basic theory that describes how quarks make up the protons and neutrons of all matter - to the design of modern high-energy accelerators; (iii) the formidable challenges of predicting and controlling the behavior of molecules in quantum chemistry and the complex biomolecules determining the evolution of biological systems; (iv) studies of exploding stars for insights into the nature of the universe; and (v) integrated climate modeling to enable realistic analysis of earth's changing climate. Associated research has made it quite clear that advanced computation is often the only means by which timely progress is feasible when dealing with these complex, multi-component physical, chemical, and biological systems operating over huge ranges of temporal and spatial scales. Working with the domain scientists, applied mathematicians and computer scientists have continued to develop the discretizations of the underlying equations and the complementary algorithms to enable improvements in solutions on modern parallel computing platforms as they evolve from the terascale toward the petascale regime. 
Moreover, the associated tremendous growth of data generated from the terabyte to the petabyte range demands not only the advanced data analysis and visualization methods to harvest the scientific information but also the development of efficient workflow strategies which can deal with the data input/output, management, movement, and storage challenges. If scientific discovery is expected to keep pace with the continuing progression from tera- to petascale platforms, the vital alliance between domain scientists, applied mathematicians, and computer scientists will be even more crucial. During the SciDAC 2006 Conference, some of the future challenges and opportunities in interdisciplinary computational science were emphasized in the Advanced Architectures Panel and by Dr. Victor Reis, Senior Advisor to the Secretary of Energy, who gave a featured presentation on `Simulation, Computation, and the Global Nuclear Energy Partnership.' Overall, the conference provided an excellent opportunity to highlight the rising importance of computational science in the scientific enterprise and to motivate future investment in this area. As Michael Strayer, SciDAC Program Director, has noted: `While SciDAC may have started out as a specific program, Scientific Discovery through Advanced Computing has become a powerful concept for addressing some of the biggest challenges facing our nation and our world.' Looking forward to next year, the SciDAC 2007 Conference will be held from June 24-28 at the Westin Copley Plaza in Boston, Massachusetts. Chairman: David Keyes, Columbia University. The Organizing Committee for the SciDAC 2006 Conference would like to acknowledge the individuals whose talents and efforts were essential to the success of the meeting. 
Special thanks go to Betsy Riley for her leadership in building the infrastructure support for the conference, for identifying and then obtaining contributions from our corporate sponsors, for coordinating all media communications, and for her efforts in organizing and preparing the conference proceedings for publication; to Tim Jones for handling the hotel scouting, subcontracts, and exhibits and stage production; to Angela Harris for handling supplies, shipping, and tracking, poster sessions set-up, and for her efforts in coordinating and scheduling the promotional activities that took place during the conference; to John Bui and John Smith for their superb wireless networking and A/V set-up and support; to Cindy Latham for Web site design, graphic design, and quality control of proceedings submissions; and to Pamelia Nixon-Hartje of Ambassador for budget and quality control of catering. We are grateful for the highly professional dedicated efforts of all of these individuals, who were the cornerstones of the SciDAC 2006 Conference. Thanks also go to Angela Beach of the ORNL Conference Center for her efforts in executing the contracts with the hotel, Carolyn James of Colorado State for on-site registration supervision, Lora Wolfe and Brittany Hagen for administrative support at ORNL, and Dami Rich and Andrew Sproles for graphic design and production. We are also most grateful to the Oak Ridge National Laboratory, especially Jeff Nichols, and to our corporate sponsors, Data Direct Networks, Cray, IBM, SGI, and Institute of Physics Publishing for their support. We especially express our gratitude to the featured speakers, invited oral speakers, invited poster presenters, session chairs, and advanced architecture panelists and chair for their excellent contributions on behalf of SciDAC 2006. 
We would like to express our deep appreciation to Lali Chatterjee, Graham Douglas, Margaret Smith, and the production team of Institute of Physics Publishing, who worked tirelessly to publish the final conference proceedings in a timely manner. Finally, heartfelt thanks are extended to Michael Strayer, Associate Director for OASCR and SciDAC Director, and to the DOE program managers associated with SciDAC for their continuing enthusiasm and strong support for the annual SciDAC Conferences as a special venue to showcase the exciting scientific discovery achievements enabled by the interdisciplinary collaborations championed by the SciDAC program.

  8. On the Reaction Mechanism of Acetaldehyde Decomposition on Mo(110)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mei, Donghai; Karim, Ayman M.; Wang, Yong

    2012-02-16

    The strong Mo-O bond provides promising reactivity of Mo-based catalysts for the deoxygenation of biomass-derived oxygenates. Combining the novel dimer saddle-point searching method with periodic spin-polarized density functional theory calculations, we investigated the reaction pathways of acetaldehyde decomposition on the clean Mo(110) surface. Two reaction pathways were identified: a selective deoxygenation pathway and a nonselective fragmentation pathway. We found that acetaldehyde preferentially adsorbs at the pseudo 3-fold hollow site in the η2(C,O) configuration on Mo(110). Among four possible bond cleavages (β-C-H, γ-C-H, C-O, and C-C), the initial decomposition of the adsorbed acetaldehyde produces either ethylidene via the C-O bond scission or acetyl via the β-C-H bond scission, while the C-C and γ-C-H bond cleavages of acetaldehyde, leading to the formation of methyl (and formyl) and formylmethyl, are unlikely. Further dehydrogenations of ethylidene into either ethylidyne or vinyl are competing and very facile, with low activation barriers of 0.24 and 0.31 eV, respectively. Concurrently, the formed acetyl would deoxygenate into ethylidyne via C-O cleavage rather than breaking the C-C or C-H bonds. The selective deoxygenation of acetaldehyde to ethylene is inhibited by the relatively weak hydrogenation capability of the Mo(110) surface. Instead, the nonselective pathway via vinyl and vinylidene dehydrogenations to ethynyl as the final hydrocarbon fragment is kinetically favorable. On the other hand, the strong interaction between ethylene and the Mo(110) surface also leads to ethylene decomposition instead of desorption into the gas phase. This work was financially supported by the National Advanced Biofuels Consortium (NABC). Computing time was granted by a user project (emsl42292) at the Molecular Science Computing Facility in the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL). The EMSL is a U.S. Department of Energy (DOE) national scientific user facility located at Pacific Northwest National Laboratory (PNNL) and supported by the DOE Office of Biological and Environmental Research. Pacific Northwest National Laboratory is operated by Battelle for the U.S. Department of Energy.

  9. A Community Publication and Dissemination System for Hydrology Education Materials

    NASA Astrophysics Data System (ADS)

    Ruddell, B. L.

    2015-12-01

    Hosted by CUAHSI and the Science Education Resource Center (SERC), federated by the National Science Digital Library (NSDL), and allied with the Water Data Center (WDC), Hydrologic Information System (HIS), and HydroShare projects, a simple cyberinfrastructure has been launched for the publication and dissemination of data and model driven university hydrology education materials. This lightweight system's metadata describes learning content as a data-driven module with defined data inputs and outputs. This structure allows a user to mix and match modules to create sequences of content that teach both hydrology and computer learning outcomes. Importantly, this modular infrastructure allows an instructor to substitute a module based on updated computer methods for one based on outdated computer methods, hopefully solving the problem of rapid obsolescence that has hampered previous community efforts. The prototype system is now available from CUAHSI and SERC, with some example content. The system is designed to catalog, link to, make visible, and make accessible the existing and future contributions of the community; this system does not create content. Submissions from hydrology educators are eagerly solicited, especially for existing content.

  10. When does a physical system compute?

    PubMed

    Horsman, Clare; Stepney, Susan; Wagner, Rob C; Kendon, Viv

    2014-09-08

    Computing is a high-level process of a physical system. Recent interest in non-standard computing systems, including quantum and biological computers, has brought this physical basis of computing to the forefront. There has been, however, no consensus on how to tell if a given physical system is acting as a computer or not, leading to confusion over novel computational devices, and even claims that every physical event is a computation. In this paper, we introduce a formal framework that can be used to determine whether a physical system is performing a computation. We demonstrate how the abstract computational level interacts with the physical device level, in comparison with the use of mathematical models in experimental science. This powerful formulation allows a precise description of experiments, technology, computation and simulation, giving our central conclusion: physical computing is the use of a physical system to predict the outcome of an abstract evolution. We give conditions for computing, illustrated using a range of non-standard computing scenarios. The framework also covers broader computing contexts, where there is no obvious human computer user. We introduce the notion of a 'computational entity', and its critical role in defining when computing is taking place in physical systems.

  11. When does a physical system compute?

    PubMed Central

    Horsman, Clare; Stepney, Susan; Wagner, Rob C.; Kendon, Viv

    2014-01-01

    Computing is a high-level process of a physical system. Recent interest in non-standard computing systems, including quantum and biological computers, has brought this physical basis of computing to the forefront. There has been, however, no consensus on how to tell if a given physical system is acting as a computer or not, leading to confusion over novel computational devices and even to claims that every physical event is a computation. In this paper, we introduce a formal framework that can be used to determine whether a physical system is performing a computation. We demonstrate how the abstract computational level interacts with the physical device level, in comparison with the use of mathematical models in experimental science. This powerful formulation allows a precise description of experiments, technology, computation and simulation, giving our central conclusion: physical computing is the use of a physical system to predict the outcome of an abstract evolution. We give conditions for computing, illustrated using a range of non-standard computing scenarios. The framework also covers broader computing contexts, where there is no obvious human computer user. We introduce the notion of a ‘computational entity’, and its critical role in defining when computing is taking place in physical systems. PMID:25197245

  12. Accelerating scientific discovery : 2007 annual report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beckman, P.; Dave, P.; Drugan, C.

    2008-11-14

    As a gateway for scientific discovery, the Argonne Leadership Computing Facility (ALCF) works hand in hand with the world's best computational scientists to advance research in a diverse span of scientific domains, ranging from chemistry, applied mathematics, and materials science to engineering physics and life sciences. At the ALCF, which is sponsored by the U.S. Department of Energy's (DOE) Office of Science, researchers are using the IBM Blue Gene/L supercomputer to study and explore key scientific problems that underlie important challenges facing our society. For instance, a research team at the University of California, San Diego/SDSC is studying the molecular basis of Parkinson's disease. The researchers plan to use the knowledge they gain to discover new drugs to treat the disease and to identify risk factors for other diseases that are equally prevalent. Likewise, scientists from Pratt & Whitney are using the Blue Gene to understand the complex processes within aircraft engines. Expanding our understanding of jet engine combustors is key to improved fuel efficiency and reduced emissions. Lessons learned from the scientific simulations of jet engine combustors have already led Pratt & Whitney to newer designs with unprecedented reductions in emissions, noise, and cost of ownership. ALCF staff members provide in-depth expertise and assistance in using the Blue Gene/L and optimizing user applications. Both the Catalyst and the Applications Performance Engineering and Data Analytics (APEDA) teams support the users' projects. In addition to working with scientists running experiments on the Blue Gene/L, we have become a nexus for the broader global community. In partnership with the Mathematics and Computer Science Division at Argonne National Laboratory, we have created an environment where the world's most challenging computational science problems can be addressed. Our expertise in high-end scientific computing enables us to provide guidance for applications that are transitioning to petascale as well as to produce software that facilitates their development, such as the MPICH library, which provides a portable and efficient implementation of the MPI standard--the prevalent programming model for large-scale scientific applications--and the PETSc toolkit, which provides a programming paradigm that eases the development of many scientific applications on high-end computers.

  13. New crystal structures in hexagonal CuInS2 nanocrystals

    NASA Astrophysics Data System (ADS)

    Shen, Xiao; Hernández-Pagan, Emil A.; Zhou, Wu; Puzyrev, Yevgeniy S.; Idrobo, Juan C.; MacDonald, Janet E.; Pennycook, Stephen J.; Pantelides, Sokrates T.

    2013-03-01

    CuInS2 is one of the best candidate materials for solar energy harvesting. Its nanocrystals with a hexagonal lattice structure that is different from the bulk chalcopyrite phase have been synthesized by many groups. The structure of these CuInS2 nanocrystals has been previously identified as the wurtzite structure, in which the copper and indium atoms randomly occupy the cation sites. Using first-principles total energy and electronic structure calculations based on density functional theory, UV-vis absorption spectroscopy, X-ray diffraction, and atomic resolution Z-contrast images obtained in an aberration-corrected scanning transmission electron microscope, we show that CuInS2 nanocrystals do not form a random wurtzite structure. Instead, the CuInS2 nanocrystals consist of several wurtzite-related crystal structures with ordered cation sublattices, some of which are reported for the first time here. This work is supported by the NSF TN-SCORE (JEM), by NSF (WZ), by ORNL's Shared Research Equipment User Program (JCI) sponsored by DOE BES, by DOE BES Materials Sciences and Engineering Division (SJP, STP), and used resources of the National Energy Research Scientific Computing Center, supported by the DOE Office of Science under Contract No. DE-AC02-05CH11231.

  14. JPRS Report, Science & Technology, USSR: Computers

    DTIC Science & Technology

    1987-08-11

    Based on Game Models (V.O. Groppen, AVTOMATIKA I TELEMEKHANIKA, No 8, Aug 86) •••• 25 Problems of Documenting Activity of Data Bank Administration...DOKUMENTY, No 12, Dec 86) 55 Standardization of Management Documents - One of Methods of Qualitative Increase of Their Effectiveness (V.I. Kokorev...minimal effect or does not produce anything at all. Machines must be used for the entire manufacturing process cycle, at its most "critical

  15. Efficient Memory Access with NumPy Global Arrays using Local Memory Access

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Daily, Jeffrey A.; Berghofer, Dan C.

    This paper discusses work completed on Global Arrays of data on distributed multi-computer systems and on improving their performance. The tasks were completed at Pacific Northwest National Laboratory in the Science Undergraduate Laboratory Internship program in the summer of 2013, for the Data Intensive Computing Group in the Fundamental and Computational Sciences Directorate. The work was done on the Global Arrays Toolkit developed by this group. This toolkit is an interface that lets programmers more easily create arrays of data on networks of computers. This is useful because scientific computation is often done on large amounts of data, sometimes so large that individual computers cannot hold all of it. This data is held in array form and is best processed on supercomputers, which often consist of a network of individual computers doing their computation in parallel. One major challenge for this sort of programming is that operating on arrays spread across multiple computers is very complex, so an interface is needed to make these arrays appear as if they were on a single computer; this is what Global Arrays does. The work described here uses more efficient operations on that data that require less copying, which saves considerable time because copying data across many different computers is time intensive. The approach is as follows: when the operands of a binary operation reside on the same computer, they are not copied when accessed; when they reside on separate computers, only one operand is copied. This saves time overall, even though more individual data-access operations are performed, because less copying is done.
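
    The copy-avoidance idea described above can be illustrated with plain NumPy views (a sketch only; this is not the Global Arrays Toolkit API, and the array names and bounds here are invented for illustration). A view of locally held data shares memory with the original, so accessing it moves no bytes, whereas a get-style access makes a fresh copy.

```python
import numpy as np

# Stand-in for the locally held block of a distributed global array.
# (Names and sizes are illustrative; this is not the Global Arrays API.)
global_array = np.arange(1_000_000, dtype=np.float64)
lo, hi = 250_000, 500_000  # bounds of the block "owned" by this process

# Get-style access (analogous to fetching remote data): duplicates bytes.
copied = global_array[lo:hi].copy()

# Local access (analogous to touching data already on this node):
# a NumPy view shares memory with the original, so nothing is copied.
view = global_array[lo:hi]

# A binary operation on two local views needs no up-front copies;
# only the result array is allocated.
result = view + global_array[lo:hi]
```

    Here `copied.base is None` (a fresh buffer) while `view.base is global_array` (shared memory), which is the distinction the abstract exploits: skip the copy whenever both operands are already local.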

  16. Toward a Big Data Science: A challenge of "Science Cloud"

    NASA Astrophysics Data System (ADS)

    Murata, Ken T.; Watanabe, Hidenobu

    2013-04-01

    Over the past 50 years, along with the appearance and development of high-performance computers (and supercomputers), numerical simulation has come to be considered a third methodology for science, following the theoretical (first) and experimental/observational (second) approaches. The variety of data yielded by the second approach has grown steadily, thanks to advances in experimental and observational technologies, and the amount of data generated by the third methodology has grown as well, owing to the tremendous development of supercomputers and their programming techniques. Most data files created by experiments/observations and numerical simulations are saved in digital formats and analyzed on computers. Researchers (domain experts) are interested not only in how to conduct experiments and observations or perform numerical simulations, but also in what information (new findings) to extract from the data. However, data do not usually announce their science; the science is implicitly hidden in the data, and researchers have to extract information from the data files to find it. This is the basic concept of data-intensive (data-oriented) science for Big Data. As the scales of experiments, observations, and numerical simulations grow, new techniques and facilities are required to extract information from the large numbers of data files. This technique is called informatics, a fourth methodology for new sciences. Every methodology needs its facility: in space science, for example, the space environment is observed via spacecraft and numerical simulations are performed on supercomputers. The facility of informatics, which deals with large-scale data, is a computational cloud system for science. This paper proposes such a cloud system for informatics, developed at NICT (National Institute of Information and Communications Technology), Japan. The NICT science cloud, which we named OneSpaceNet (OSN), is the first open cloud system for scientists who intend to carry out informatics for their own science. The science cloud is not for simple uses; many functions are expected of it, such as data standardization, data collection and crawling, large and distributed data storage, security and reliability, databases and meta-databases, data stewardship, long-term data preservation, data rescue and preservation, data mining, parallel processing, data publication and provision, the semantic web, 3D and 4D visualization, outreach and in-reach, and capacity building. A schematic figure (not shown here) illustrates the NICT science cloud. Both observational and simulation data are stored in the cloud's storage system. Note that observational data are of two types: data downloaded through the Internet to the cloud from archive sites outside it, and data from equipment directly connected to the science cloud, often called sensor clouds. In the present talk, we first introduce the NICT science cloud and then demonstrate its efficiency, showing several scientific results achieved with this cloud system. Through these discussions and demonstrations, the potential of the science cloud for any research field will be revealed.

  17. Maestro and Castro: Simulation Codes for Astrophysical Flows

    NASA Astrophysics Data System (ADS)

    Zingale, Michael; Almgren, Ann; Beckner, Vince; Bell, John; Friesen, Brian; Jacobs, Adam; Katz, Maximilian P.; Malone, Christopher; Nonaka, Andrew; Zhang, Weiqun

    2017-01-01

    Stellar explosions are multiphysics problems—modeling them requires the coordinated input of gravity solvers, reaction networks, radiation transport, and hydrodynamics together with microphysics recipes to describe the physics of matter under extreme conditions. Furthermore, these models involve following a wide range of spatial and temporal scales, which puts tough demands on simulation codes. We developed the codes Maestro and Castro to meet the computational challenges of these problems. Maestro uses a low Mach number formulation of the hydrodynamics to efficiently model convection. Castro solves the fully compressible radiation hydrodynamics equations to capture the explosive phases of stellar phenomena. Both codes are built upon the BoxLib adaptive mesh refinement library, which prepares them for next-generation exascale computers. Common microphysics shared between the codes allows us to transfer a problem from the low Mach number regime in Maestro to the explosive regime in Castro. Importantly, both codes are freely available (https://github.com/BoxLib-Codes). We will describe the design of the codes and some of their science applications, as well as future development directions. Support for development was provided by NSF award AST-1211563 and DOE/Office of Nuclear Physics grant DE-FG02-87ER40317 to Stony Brook and by the Applied Mathematics Program of the DOE Office of Advanced Scientific Computing Research under US DOE contract DE-AC02-05CH11231 to LBNL.

  18. Data Crosscutting Requirements Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kleese van Dam, Kerstin; Shoshani, Arie; Plata, Charity

    2013-04-01

    In April 2013, a diverse group of researchers from the U.S. Department of Energy (DOE) scientific community assembled to assess data requirements associated with DOE-sponsored scientific facilities and large-scale experiments. Participants in the review included facilities staff, program managers, and scientific experts from the offices of Basic Energy Sciences, Biological and Environmental Research, High Energy Physics, and Advanced Scientific Computing Research. As part of the meeting, review participants discussed key issues associated with three distinct aspects of the data challenge: 1) processing, 2) management, and 3) analysis. These discussions identified commonalities and differences among the needs of varied scientific communities. They also helped to articulate gaps between current approaches and future needs, as well as the research advances that will be required to close these gaps. Moreover, the review provided a rare opportunity for experts from across the Office of Science to learn about their collective expertise, challenges, and opportunities. The "Data Crosscutting Requirements Review" generated specific findings and recommendations for addressing large-scale data crosscutting requirements.

  19. Geological and geochemical aspects of uranium deposits. A selected, annotated bibliography. Vol. 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, M.B.; Garland, P.A.

    1977-10-01

    This bibliography was compiled by selecting 580 references from the Bibliographic Information Data Base of the Department of Energy's (DOE) National Uranium Resource Evaluation (NURE) Program. This data base and five others have been created by the Ecological Sciences Information Center to provide technical computer-retrievable data on various aspects of the nation's uranium resources. All fields of uranium geology are within the defined scope of the project, as are aerial surveying procedures, uranium reserves and resources, and universally applied uranium research. References used by DOE-NURE contractors in completing their aerial reconnaissance survey reports have been included at the request of the Grand Junction Office, DOE. The following indexes are provided to aid the user in locating references of interest: author, keyword, geographic location, quadrangle name, geoformational index, and taxonomic name.

  20. 76 FR 62050 - DOE/NSF Nuclear Science Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-06

    ... DEPARTMENT OF ENERGY DOE/NSF Nuclear Science Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION: Notice of renewal. SUMMARY: Pursuant to Section 14(a)(2)(A) of the Federal... Services Administration, notice is hereby given that the DOE/NSF Nuclear Science Advisory Committee (NSAC...

  1. GPU-Accelerated Large-Scale Electronic Structure Theory on Titan with a First-Principles All-Electron Code

    NASA Astrophysics Data System (ADS)

    Huhn, William Paul; Lange, Björn; Yu, Victor; Blum, Volker; Lee, Seyong; Yoon, Mina

    Density-functional theory has been well established as the dominant quantum-mechanical computational method in the materials community. Large, accurate simulations become very challenging on small to mid-scale computers and require high-performance compute platforms to succeed. GPU acceleration is one promising approach. In this talk, we present a first implementation of all-electron density-functional theory in the FHI-aims code for massively parallel GPU-based platforms. Special attention is paid to the update of the density and to the integration of the Hamiltonian and overlap matrices, realized in a domain decomposition scheme on non-uniform grids. The initial implementation scales well across nodes on ORNL's Titan Cray XK7 supercomputer (8 to 64 nodes, 16 MPI ranks/node) and shows an overall 1.4x runtime speedup from use of the K20X Tesla GPU on each Titan node, with the charge density update showing a 2x speedup. Further acceleration opportunities will be discussed. Work supported by the LDRD Program of ORNL, managed by UT-Battelle, LLC, for the U.S. DOE, and by the Oak Ridge Leadership Computing Facility, which is a DOE Office of Science User Facility supported under Contract DE-AC05-00OR22725.

  2. Final report for Conference Support Grant "From Computational Biophysics to Systems Biology - CBSB12"

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hansmann, Ulrich H.E.

    2012-07-02

    This report summarizes the outcome of the international workshop From Computational Biophysics to Systems Biology (CBSB12) which was held June 3-5, 2012, at the University of Tennessee Conference Center in Knoxville, TN, and supported by DOE through the Conference Support Grant 120174. The purpose of CBSB12 was to provide a forum for the interaction between a data-mining interested systems biology community and a simulation and first-principle oriented computational biophysics/biochemistry community. CBSB12 was the sixth in a series of workshops of the same name organized in recent years, and the second that has been held in the USA. As in previous years, it gave researchers from physics, biology, and computer science an opportunity to acquaint each other with current trends in computational biophysics and systems biology, to explore venues of cooperation, and to establish together a detailed understanding of cells at a molecular level. The conference grant of $10,000 was used to cover registration fees and provide travel fellowships to selected students and postdoctoral scientists. By educating graduate students and providing a forum for young scientists to perform research into the working of cells at a molecular level, the workshop adds to DOE's mission of paving the way to exploit the abilities of living systems to capture, store and utilize energy.

  3. Know Your Discipline: Teaching the Philosophy of Computer Science

    ERIC Educational Resources Information Center

    Tedre, Matti

    2007-01-01

    The diversity and interdisciplinarity of computer science and the multiplicity of its uses in other sciences make it hard to define computer science and to prescribe how computer science should be carried out. The diversity of computer science also causes friction between computer scientists from different branches. Computer science curricula, as…

  4. Strategies for Effective Implementation of Science Models into 6-9 Grade Classrooms on Climate, Weather, and Energy Topics

    NASA Astrophysics Data System (ADS)

    Yarker, M. B.; Stanier, C. O.; Forbes, C.; Park, S.

    2011-12-01

    As atmospheric scientists, we depend on Numerical Weather Prediction (NWP) models. We use them to predict weather patterns, to understand external forcing on the atmosphere, and as evidence to make claims about atmospheric phenomena. Therefore, it is important that we adequately prepare atmospheric science students to use computer models. However, the public should also be aware of what models are in order to understand scientific claims about atmospheric issues, such as climate change. Although familiar with weather forecasts on television and the Internet, the general public does not understand the process of using computer models to generate weather and climate forecasts. As a result, the public often misunderstands claims scientists make about their daily weather as well as about the state of climate change. Since computer models are the best method we have to forecast the future of our climate, scientific models and modeling should be a topic covered in K-12 classrooms as part of a comprehensive science curriculum. According to the National Science Education Standards, teachers are encouraged to incorporate science models into the classroom as a way to aid in the understanding of the nature of science. However, there is very little description of what constitutes a science model, so the term is often associated with scale models. Teachers therefore often use drawings or scale representations of physical entities, such as DNA, the solar system, or bacteria. In other words, models used in classrooms often serve as visual representations, while the purpose of science models is overlooked. The implementation of a model-based curriculum in the science classroom can be an effective way to prepare students to think critically, problem solve, and make informed decisions as contributing members of society. However, there are few resources available to help teachers implement science models into the science curriculum effectively. Therefore, this research project looks at strategies middle school science teachers use to implement science models in their classrooms. The teachers in this study took part in a week-long professional development program designed to orient them toward appropriate use of science models in a unit on weather, climate, and energy concepts. The goal of this project is to describe the professional development and how the teachers intend to incorporate science models into their individual classrooms.

  5. Developing Models for Predictive Climate Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Drake, John B; Jones, Philip W

    2007-01-01

    The Community Climate System Model results from a multi-agency collaboration designed to construct cutting-edge climate science simulation models for a broad research community. Predictive climate simulations are currently being prepared for the petascale computers of the near future. Modeling capabilities are continuously being improved in order to provide better answers to critical questions about Earth's climate. Climate change and its implications are front page news in today's world. Could global warming be responsible for the July 2006 heat waves in Europe and the United States? Should more resources be devoted to preparing for an increase in the frequency of strong tropical storms and hurricanes like Katrina? Will coastal cities be flooded due to a rise in sea level? The National Climatic Data Center (NCDC), which archives all weather data for the nation, reports that global surface temperatures have increased over the last century, and that the rate of increase is three times greater since 1976. Will temperatures continue to climb at this rate, will they decline again, or will the rate of increase become even steeper? To address such a flurry of questions, scientists must adopt a systematic approach and develop a predictive framework. With responsibility for advising on energy and technology strategies, the DOE is dedicated to advancing climate research in order to elucidate the causes of climate change, including the role of carbon loading from fossil fuel use. Thus, climate science--which by nature involves advanced computing technology and methods--has been the focus of a number of DOE's SciDAC research projects. Dr. John Drake (ORNL) and Dr. Philip Jones (LANL) served as principal investigators on the SciDAC project, 'Collaborative Design and Development of the Community Climate System Model for Terascale Computers.'
The Community Climate System Model (CCSM) is a fully-coupled global system that provides state-of-the-art computer simulations of the Earth's past, present, and future climate states. The collaborative SciDAC team--including over a dozen researchers at institutions around the country--developed, validated, documented, and optimized the performance of CCSM using the latest software engineering approaches, computational technology, and scientific knowledge. Many of the factors that must be accounted for in a comprehensive model of the climate system are illustrated in figure 1.

  6. Phase transitions and melting on the Hugoniot of Mg2SiO4 forsterite: new diffraction and temperature results

    NASA Astrophysics Data System (ADS)

    Asimow, P. D.; Akin, M. C.; Homel, M.; Crum, R. S.; Pagan, D.; Lind, J.; Bernier, J.; Mosenfelder, J. L.; Dillman, A. M.; Lavina, B.; Lee, S.; Fat'yanov, O. V.; Newman, M. G.

    2017-06-01

    The phase transitions of forsterite under shock were studied by x-ray diffraction and pyrometry. Samples of 2 mm thick, near-full density (>98% TMD) polycrystalline forsterite were characterized by EBSD and computed tomography and shock compressed to 50 and 75 GPa by two-stage gas gun at the Dynamic Compression Sector, Advanced Photon Source, with diffraction imaged during compression and release. Changes in diffraction confirm a phase transition by 75 GPa. In parallel, single-crystal forsterite shock temperatures were taken from 120 to 210 GPa with improved absolute calibration procedures on the Caltech 6-channel pyrometer and two-stage gun and used to examine the interpretation of superheating and P-T slope of the liquid Hugoniot. This work performed under the auspices of the U.S. Department of Energy (DOE) by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344, supported in part by LLNL's LDRD program under Grants 15-ERD-012 and 16-ERD-010. The Dynamic Compression Sector (35) is supported by DOE / National Nuclear Security Administration under Award Number DE-NA0002442. This research used resources of the Advanced Photon Source, a U.S. DOE Office of Science User Facility operated for the DOE Office of Science by Argonne National Laboratory under Contract No. DE-AC02-06CH11357. Caltech lab supported by NSF EAR-1426526.

  7. Assessing the role of mini-applications in predicting key performance characteristics of scientific and engineering applications

    DOE PAGES

    Barrett, R. F.; Crozier, P. S.; Doerfler, D. W.; ...

    2014-09-28

    Computational science and engineering application programs are typically large, complex, and dynamic, and are often constrained by distribution limitations. To make rapid exploration of scientific and engineering application programs tractable in the context of new, emerging, and future computing architectures, a suite of miniapps has been created to serve as proxies for full-scale applications. Each miniapp is designed to represent a key performance characteristic that does, or is expected to, significantly impact the runtime performance of an application program. In this paper we introduce a methodology for assessing the ability of these miniapps to effectively represent these performance issues. We applied this methodology to four miniapps, examining the linkage between them and an application they are intended to represent. Herein we evaluate the fidelity of that linkage. This work represents the initial steps required to begin to answer the question, "Under what conditions does a miniapp represent a key performance characteristic in a full app?"

  8. MaMR: High-performance MapReduce programming model for material cloud applications

    NASA Astrophysics Data System (ADS)

    Jing, Weipeng; Tong, Danyu; Wang, Yangang; Wang, Jingyuan; Liu, Yaqiu; Zhao, Peng

    2017-02-01

    With increasing data sizes in materials science, existing programming models no longer satisfy application requirements. MapReduce is a programming model that enables the easy development of scalable parallel applications to process big data on cloud computing systems. However, this model does not directly support the processing of multiple related datasets, and its processing performance does not reflect the advantages of cloud computing. To enhance the capability of workflow applications in material data processing, we defined a programming model for material cloud applications, called MaMR, that supports multiple different Map and Reduce functions running concurrently on a hybrid shared-memory BSP model. We also designed an optimized data-sharing strategy to supply shared data to the different Map and Reduce stages, and we added a new merge phase to MapReduce that can efficiently merge data from the map and reduce modules. Experiments showed that the model and framework deliver substantial performance improvements compared to previous work.
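
    The map, reduce, and merge phases described above can be sketched as a toy single-process program (an illustration of the general pattern only, not the MaMR framework or its API; the function and variable names are invented):

```python
from collections import defaultdict

def map_reduce_merge(records, map_fn, reduce_fn, merge_fn):
    """Toy single-process MapReduce with an explicit merge phase."""
    # Map phase: each record emits (key, value) pairs,
    # which are grouped by key (the "shuffle").
    groups = defaultdict(list)
    for record in records:
        for key, value in map_fn(record):
            groups[key].append(value)
    # Reduce phase: collapse each key's values independently.
    reduced = {key: reduce_fn(values) for key, values in groups.items()}
    # Merge phase: combine the per-key reduce outputs into one result,
    # the step added on top of classic two-phase MapReduce.
    return merge_fn(reduced)

# Word count whose merge phase also totals all per-word counts.
docs = ["big data", "material data", "big material"]
per_word, total = map_reduce_merge(
    docs,
    map_fn=lambda doc: [(word, 1) for word in doc.split()],
    reduce_fn=sum,
    merge_fn=lambda reduced: (reduced, sum(reduced.values())),
)
```

    In a cloud setting the map and reduce calls would run concurrently across nodes; the merge step is what lets separately reduced outputs be combined without a second full job.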

  9. Middle school children's game playing preferences: Case studies of children's experiences playing and critiquing science-related educational games

    NASA Astrophysics Data System (ADS)

    Joseph, Dolly Rebecca Doran

    The playing of computer games is one of the most popular non-school activities of children, particularly boys, and is often the entry point to greater facility with and use of other computer applications. Children are learning skills as they play, but what they learn often does not generalize beyond application to that and other similar games. Nevertheless, games have the potential to develop in students the knowledge and skills described by national and state educational standards. This study focuses upon middle-school aged children, and how they react to and respond to computer games designed for entertainment and educational purposes, within the context of science learning. Through qualitative, case study methodology, the game play, evaluation, and modification experiences of four diverse middle-school-aged students in summer camps are analyzed. The inquiry focused on determining the attributes of computer games that appeal to middle school students, the aspects of science that appeal to middle school children, and ultimately, how science games might be designed to appeal to middle school children. Qualitative data analysis led to the development of a method for describing players' activity modes during game play, rather than the conventional methods that describe game characteristics. These activity modes are used to describe the game design preferences of the participants. Recommendations are also made in the areas of functional, aesthetic, and character design and for the design of educational games. Middle school students may find the topical areas of forensics, medicine, and the environment to be of most interest; designing games in and across these topic areas has the potential for encouraging voluntary science-related play. Finally, when including children in game evaluation and game design activities, results suggest the value of providing multiple types of activities in order to encourage the full participation of all children.

  10. In Situ Fabrication of PtCo Alloy Embedded in Nitrogen-Doped Graphene Nanopores as Synergistic Catalyst for Oxygen Reduction Reaction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhong, Xing; Wang, Lei; Zhou, Hu

    A novel PtCo alloy in situ etched and embedded in graphene nanopores (PtCo/NPG) was reported as a high-performance catalyst for the oxygen reduction reaction (ORR). Graphene nanopores were fabricated in situ while forming PtCo nanoparticles that were uniformly embedded in them. Owing to the synergistic effect between the PtCo alloy and the nanopores, PtCo/NPG exhibited 11.5 times higher mass activity than the commercial Pt/C cathode electrocatalyst. DFT calculations indicated that the nanopores in NPG not only stabilize the PtCo nanoparticles but also modify their electronic structures, thereby changing their adsorption properties. This enhancement can lead to a favorable ORR reaction pathway on PtCo/NPG. This study showed that PtCo/NPG is a potential candidate for the next generation of Pt-based catalysts in fuel cells. It also offered a promising alternative strategy for fabricating various kinds of metal/graphene nanopore nanohybrids with potential applications in catalysis and in other technological devices. The authors acknowledge the financial support from the National Basic Research Program (973 program, No. 2013CB733501), the Zhejiang Provincial Education Department Research Program (Y201326554), and the National Natural Science Foundation of China (No. 21306169, 21101137, 21136001, 21176221 and 91334013). D. Mei acknowledges the support from the US Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences & Biosciences. Pacific Northwest National Laboratory (PNNL) is a multiprogram national laboratory operated for DOE by Battelle. Computing time was granted by the grand challenge of computational catalysis of the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) and by the National Energy Research Scientific Computing Center (NERSC).

  11. Remote control system for high-performance computer simulation of crystal growth by the PFC method

    NASA Astrophysics Data System (ADS)

    Pavlyuk, Evgeny; Starodumov, Ilya; Osipov, Sergei

    2017-04-01

    Modeling of the crystallization process by the phase field crystal (PFC) method is one of the important directions of modern computational materials science. In this paper, the practical side of computer simulation of the crystallization process by the PFC method is investigated. Solving problems with this method requires high-performance computing clusters, data storage systems, and other complex, often expensive computer systems. Access to such resources is often limited, unstable, and accompanied by various administrative problems. In addition, the variety of software and settings across different computing clusters sometimes prevents researchers from using unified program code; the code must be adapted to each configuration of the computing complex. The authors' practical experience has shown that a special control system for computations, usable remotely, can greatly simplify the running of simulations and increase the productivity of scientific research. In the current paper we present the principal idea of such a system and justify its efficiency.
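The cluster-adaptation problem described in this abstract can be illustrated with a minimal sketch: a launcher that renders one simulation run as a batch script for whichever scheduler a given cluster uses, so the same driver code works across configurations. The profiles, flags, and the `pfc_solver` command below are illustrative assumptions, not the authors' actual system.

```python
# Minimal sketch of a scheduler-agnostic job launcher, in the spirit of the
# remote control system described above. Cluster profiles are assumptions.

CLUSTER_PROFILES = {
    "slurm": {"submit": "sbatch", "directive": "#SBATCH",
              "cores_flag": "--ntasks={n}", "time_flag": "--time={t}"},
    "pbs":   {"submit": "qsub", "directive": "#PBS",
              "cores_flag": "-l procs={n}", "time_flag": "-l walltime={t}"},
}

def build_job_script(scheduler, n_cores, walltime, command):
    """Render one simulation run as a batch script for the given scheduler."""
    p = CLUSTER_PROFILES[scheduler]
    lines = [
        "#!/bin/bash",
        f"{p['directive']} {p['cores_flag'].format(n=n_cores)}",
        f"{p['directive']} {p['time_flag'].format(t=walltime)}",
        command,
    ]
    return "\n".join(lines)

# Same driver call, different target cluster: only the profile name changes.
script = build_job_script("slurm", 64, "02:00:00",
                          "mpirun ./pfc_solver input.cfg")
print(script)
```

A remote control system would then copy such a script to the cluster and invoke the profile's `submit` command over SSH; the point of the sketch is that per-cluster differences are confined to a small profile table rather than scattered through the program code.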

  12. Data management and its role in delivering science at DOE BES user facilities - Past, Present, and Future

    NASA Astrophysics Data System (ADS)

    Miller, Stephen D.; Herwig, Kenneth W.; Ren, Shelly; Vazhkudai, Sudharshan S.; Jemian, Pete R.; Luitz, Steffen; Salnikov, Andrei A.; Gaponenko, Igor; Proffen, Thomas; Lewis, Paul; Green, Mark L.

    2009-07-01

    The primary mission of user facilities operated by Basic Energy Sciences under the Department of Energy is to produce data for users in support of open science and basic research [1]. We trace back almost 30 years of history across selected user facilities, illustrating the evolution of facility data management practices and how these practices have related to performing scientific research. The facilities cover multiple techniques such as X-ray and neutron scattering, imaging, and tomography sciences. Over time, detector and data acquisition technologies have dramatically increased the ability to produce prolific volumes of data, challenging the traditional paradigm of users taking data home upon completion of their experiments to process and publish their results. During this time, computing capacity has also increased dramatically, though the size of the data has grown significantly faster than the capacity of one's laptop to manage and process this new facility-produced data. Trends indicate that this will continue to be the case for some time. Thus users face a quandary: how to manage today's data complexity and size when these may exceed the computing resources available to them. This same quandary can also stifle collaboration and sharing. Realizing this, some facilities are already providing web portal access to data and computing, thereby providing users access to the resources they need [2]. Portal-based computing is now driving researchers to think about how to use the data collected at multiple facilities in an integrated way to perform their research, and also how to collaborate and share data. In the future, inter-facility data management systems will enable next-tier, cross-instrument, cross-facility scientific research fuelled by smart applications residing on user computer resources.
We can learn from the medical imaging community, which has been working since the early 1990s to integrate data from across multiple modalities to achieve better diagnoses [3]; similarly, data fusion across BES facilities will lead to new scientific discoveries.

  13. 76 FR 10342 - Availability of Fiscal Years 2011-2016 Draft Strategic Plan and Request for Public Comment

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-24

    ... [email protected]hq.doe.gov. Michael Holland, Office of the Under Secretary for Science at (202) 586-0505, or e-mail [email protected]science.doe.gov. SUPPLEMENTARY INFORMATION: The DOE was established in... clear goals for DOE's four main business lines: nuclear security, environmental clean-up, science and...

  14. Biomedical informatics training at the University of Wisconsin-Madison.

    PubMed

    Severtson, D J; Pape, L; Page, C D; Shavlik, J W; Phillips, G N; Flatley Brennan, P

    2007-01-01

    The purpose of this paper is to describe biomedical informatics training at the University of Wisconsin-Madison (UW-Madison). We reviewed biomedical informatics training, research, and faculty/trainee participation at UW-Madison. There are three primary approaches to training: 1) The Computation & Informatics in Biology & Medicine Training Program, 2) formal biomedical informatics coursework offered by various campus departments, and 3) individualized programs. Training at UW-Madison embodies the features of effective biomedical informatics training recommended by the American College of Medical Informatics, delineated as: 1) curricula that integrate experiences among computational sciences and application domains, 2) individualized and interdisciplinary cross-training among a diverse cadre of trainees to develop key competencies that they do not initially possess, 3) participation in research and development activities, and 4) exposure to a range of basic informational and computational sciences. The three biomedical informatics training approaches immerse students in multidisciplinary training and education that is supported by faculty trainers who participate in collaborative research across departments. Training is provided across a range of disciplines and available at different training stages. Biomedical informatics training at UW-Madison illustrates how a large research university, with multiple departments across biological, computational, and health fields, can provide effective and productive biomedical informatics training via multiple approaches.

  15. Automated metadata--final project report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schissel, David

    This report summarizes the work of the Automated Metadata, Provenance Cataloging, and Navigable Interfaces: Ensuring the Usefulness of Extreme-Scale Data Project (MPO Project), funded by the United States Department of Energy (DOE), Offices of Advanced Scientific Computing Research and Fusion Energy Sciences. Initially funded for three years starting in 2012, it was extended for six months with additional funding. The project was a collaboration between scientists at General Atomics, Lawrence Berkeley National Laboratory (LBNL), and the Massachusetts Institute of Technology (MIT). The group leveraged existing computer science technology where possible, and extended or created new capabilities where required. The MPO project successfully created a suite of software tools that a scientific community can use to automatically document their scientific workflows. These tools were integrated into workflows for fusion energy and climate research, illustrating the general applicability of the project's toolkit. Feedback on the toolkit, and on the value of such automatic workflow documentation to the scientific endeavor, was very positive.
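The core idea of automatic workflow documentation can be sketched in a few lines: each workflow step is instrumented so that its name, inputs, output, and timing are recorded as provenance metadata without the scientist writing any documentation code per run. This is an illustrative sketch of the general technique, not the MPO project's actual API; the decorator and log names are hypothetical.

```python
import functools
import time
import uuid

# Shared provenance log; a real system would persist this to a catalog.
PROVENANCE_LOG = []

def provenance(step_name):
    """Decorator that records one workflow step's metadata automatically."""
    def wrap(fn):
        @functools.wraps(fn)
        def inner(*args, **kwargs):
            record = {"id": str(uuid.uuid4()), "step": step_name,
                      "inputs": repr((args, kwargs)), "start": time.time()}
            result = fn(*args, **kwargs)
            record["end"] = time.time()
            record["output"] = repr(result)
            PROVENANCE_LOG.append(record)  # document the step as a side effect
            return result
        return inner
    return wrap

@provenance("normalize")
def normalize(values):
    """Example workflow step: scale values so they sum to 1."""
    total = sum(values)
    return [v / total for v in values]

normalize([1.0, 3.0])
print(PROVENANCE_LOG[0]["step"], PROVENANCE_LOG[0]["output"])
```

The design point, consistent with the report's description, is that documentation happens as a side effect of running the workflow, so the provenance record stays complete even when researchers forget to log anything by hand.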

  16. A review of existing and emerging digital technologies to combat the global trade in fake medicines.

    PubMed

    Mackey, Tim K; Nayyar, Gaurvika

    2017-05-01

    The globalization of the pharmaceutical supply chain has introduced new challenges, chief among them, fighting the international criminal trade in fake medicines. As the manufacture, supply, and distribution of drugs becomes more complex, so does the need for innovative technology-based solutions to protect patients globally. Areas covered: We conducted a multidisciplinary review of the science/health, information technology, computer science, and general academic literature with the aim of identifying cutting-edge existing and emerging 'digital' solutions to combat fake medicines. Our review identified five distinct categories of technology including mobile, radio frequency identification, advanced computational methods, online verification, and blockchain technology. Expert opinion: Digital fake medicine solutions are unifying platforms that integrate different types of anti-counterfeiting technologies as complementary solutions, improve information sharing and data collection, and are designed to overcome existing barriers of adoption and implementation. Investment in this next generation technology is essential to ensure the future security and integrity of the global drug supply chain.

  17. Opening Comments: SciDAC 2008

    NASA Astrophysics Data System (ADS)

    Strayer, Michael

    2008-07-01

    Welcome to Seattle and the 2008 SciDAC Conference. This conference, the fourth in the series, is a continuation of the PI meetings we first began under SciDAC-1. I would like to start by thanking the organizing committee, and Rick Stevens in particular, for organizing this year's meeting. This morning I would like to look briefly at SciDAC, to give you a brief history of SciDAC and also look ahead to see where we plan to go over the next few years. I think the best description of SciDAC, at least the simulation part, comes from a quote from Dr Ray Orbach, DOE's Under Secretary for Science and Director of the Office of Science. In an interview that appeared in the SciDAC Review magazine, Dr Orbach said, `SciDAC is unique in the world. There isn't any other program like it anywhere else, and it has the remarkable ability to do science by bringing together physical scientists, mathematicians, applied mathematicians, and computer scientists who recognize that computation is not something you do at the end, but rather it needs to be built into the solution of the very problem that one is addressing'. Of course, that is extended not just to physical scientists, but also to biological scientists. This is a theme of computational science, this partnership among disciplines, which goes all the way back to the early 1980s and Ken Wilson. It's a unique thread within the Department of Energy. SciDAC-1, launched around the turn of the millennium, created a new generation of scientific simulation codes. It advocated building out mathematical and computing system software in support of science and a new collaboratory software environment for data. The original concept for SciDAC-1 had topical centers for the execution of the various science codes, but several corrections and adjustments were needed. The ASCR scientific computing infrastructure was also upgraded, providing the hardware facilities for the program. 
The computing facility that we had at that time was the big 3 teraflop/s center at NERSC and that had to be shared with the programmatic side supporting research across DOE. At the time, ESnet was just slightly over half a gig per sec of bandwidth; and the science being addressed was accelerator science, climate, chemistry, fusion, astrophysics, materials science, and QCD. We built out the national collaboratories from the ASCR office, and in addition we built Integrated Software Infrastructure Centers (ISICs). Of these, three were in applied mathematics, four in computer science (including a performance evaluation research center), and four were collaboratories or Grid projects having to do with data management. For science, there were remarkable breakthroughs in simulation, such as full 3D laboratory scale flame simulation. There were also significant improvements in application codes - from factors of almost 3 to more than 100 - and code improvement as people began to realize they had to integrate mathematics tools and computer science tools into their codes to take advantage of the parallelism of the day. The SciDAC data-mining tool, Sapphire, received a 2006 R&D 100 award. And the community as a whole worked well together and began building a publication record that was substantial. In 2006, we recompeted the program with similar goals - SciDAC-1 was very successful, and we wanted to continue that success and extend what was happening under SciDAC to the broader science community. We opened up the partnership to all of the Offices of Science and the NSF and the NNSA. The goal was to create comprehensive scientific computing software and the infrastructure for the software to enable scientific discovery in the physical, biological, and environmental sciences and take the simulations to an extreme scale, in this case petascale. We would also build out a new generation of data management tools. 
What we observed during SciDAC-1 was that the data and the data communities - both experimental data from large experimental facilities and observational data, along with simulation data - were expanding at a rate significantly faster than Moore's law. In the past few weeks, the FastBit indexing technology software tool for data analyses and data mining developed under SciDAC's Scientific Data Management project was recognized with an R&D 100 Award, selected by an independent judging panel and editors of R&D Magazine as one of the 100 most technologically significant products introduced into the marketplace over the past year. For SciDAC-2 we had nearly 250 proposals requesting a total of slightly over 1 billion in funding. Of course, we had nowhere near 1 billion. The facilities and the science we ended up with were not significantly different from what we had in SciDAC-1. But we had put in place substantially increased facilities for science. When SciDAC-1 was originally executed with the facilities at NERSC, there was significant impact on the resources at NERSC, because not only did we have an expanding portfolio of programmatic science, but we had the SciDAC projects that also needed to run at NERSC. Suddenly, NERSC was incredibly oversubscribed. With SciDAC-2, we had in place leadership-class computing facilities at Argonne with slightly more than half a petaflop and at Oak Ridge with slightly more than a quarter petaflop with an upgrade planned at the end of this year for a petaflop. And we increased the production computing capacity at NERSC to 104 teraflop/s just so that we would not impact the programmatic research and so that we would have a startup facility for SciDAC. At the end of the summer, NERSC will be at 360 teraflop/s. Both the Oak Ridge system and the principal resource at NERSC are Cray systems; Argonne has a different architecture, an IBM Blue Gene/P. 
At the same time, ESnet has been built out, and we are on a path where we will have dual rings around the country, from 10 to 40 gigabits per second - a factor of 20 to 80 over what was available during SciDAC-1. The science areas include accelerator science and simulation, astrophysics, climate modeling and simulation, computational biology, fusion science, high-energy physics, petabyte high-energy/nuclear physics, materials science and chemistry, nuclear physics, QCD, radiation transport, turbulence, and groundwater reactive transport modeling and simulation. They were supported by new enabling technology centers and university-based institutes to develop an educational thread for the SciDAC program. There were four mathematics projects and four computer science projects; and under data management, we see a significant difference in that we are bringing up new visualization projects to support and sustain data-intensive science. When we look at the budgets, we see growth in the budget from just under $60 million for SciDAC-1 to just over $80 million for SciDAC-2. Part of the growth is due to bringing in NSF and NNSA as new partners, and some of the growth is due to some program offices increasing their investment in SciDAC, while other program offices are constant or have decreased their investment. This is not a reflection of their priorities per se but, rather, a reflection of the budget process and the difficult times in Washington during the past two years. New activities are under way in SciDAC - the annual PI meeting has turned into what I would describe as the premier interdisciplinary computational science meeting, one of the best in the world. Doing interdisciplinary meetings is difficult because people tend to develop a focus for their particular subject area. But this is the fourth in the series; and since the first meeting in San Francisco, these conferences have been remarkably successful.
For SciDAC-2 we also created an outreach magazine, SciDAC Review, which highlights scientific discovery as well as high-performance computing. It's been very successful in telling the non-practitioners what SciDAC and computational science are all about. The other new instrument in SciDAC-2 is an outreach center. As we go from computing at the terascale to computing at the petascale, we face the problem of narrowing our research community. The number of people who are `literate' enough to compute at the terascale is more than the number of those who can compute at the petascale. To address this problem, we established the SciDAC Outreach Center to bring people into the fold and educate them as to how we do SciDAC, how the teams are composed, and what it really means to compute at scale. The resources I have mentioned don't come for free. As part of the HECRTF law of 2005, Congress mandated that the Secretary would ensure that leadership-class facilities would be open to everyone across all agencies. So we took Congress at its word, and INCITE is our instrument for making allocations at the leadership-class facilities at Argonne and Oak Ridge, as well as smaller allocations at NERSC. Therefore, the selected proposals are very large projects that are computationally intensive, that compute at scale, and that have a high science impact. An important feature is that INCITE is completely open to anyone - there is no requirement of DOE Office of Science funding, and proposals are rigorously reviewed for both the science and the computational readiness. In 2008, more than 100 proposals were received, requesting about 600 million processor-hours. We allocated just over a quarter of a billion processor-hours. Astrophysics, materials science, lattice gauge theory, and high energy and nuclear physics were the major areas. These were the teams that were computationally ready for the big machines and that had significant science they could identify. 
In 2009, there will be a significant increase in the amount of time to be allocated, over half a billion processor-hours. The deadline is August 11 for new proposals and September 12 for renewals. We anticipate a significant increase in the number of requests this year. We expect you - as successful SciDAC centers, institutes, or partnerships - to compete for and win INCITE program allocation awards. If you have a successful SciDAC proposal, we believe it will make you successful in the INCITE review. We have the expectation that you will be among those most prepared and most ready to use the machines and to compute at scale. Over the past 18 months, we have assembled a team to look across our computational science portfolio and to judge what the 10 most significant science accomplishments are. The ASCR office, as it goes forward with OMB, the new administration, and Congress, will be judged by the science we have accomplished. All of our proposals - such as for increasing SciDAC, increasing applied mathematics, and so on - are tied to what we have accomplished in science. And so these 10 big accomplishments are key to establishing credibility for new budget requests. Tony Mezzacappa, who chaired the committee, will also give a presentation on the ranking of these top 10, how they got there, and what the science is all about. Here is the list - numbers 2, 5, 6, 7, 9, and 10 are all SciDAC projects.
1. Modeling the Molecular Basis of Parkinson's Disease (Tsigelny)
2. Discovery of the Standing Accretion Shock Instability and Pulsar Birth Mechanism in a Core-Collapse Supernova Evolution and Explosion (Blondin)
3. Prediction and Design of Macromolecular Structures and Functions (Baker)
4. Understanding How Lifted Flame Stabilized in a Hot Coflow (Yoo)
5. New Insights from LCF-enabled Advanced Kinetic Simulations of Global Turbulence in Fusion Systems (Tang)
6. High Transition Temperature Superconductivity: A High-Temperature Superconductive State and a Pairing Mechanism in 2-D Hubbard Model (Scalapino)
7. PETSc: Providing the Solvers for DOE High-Performance Simulations (Smith)
8. Via Lactea II, A Billion Particle Simulation of the Dark Matter Halo of the Milky Way (Madau)
9. Probing the Properties of Water through Advanced Computing (Galli)
10. First Provably Scalable Maxwell Solver Enables Scalable Electromagnetic Simulations (Kovel)
So, what's the future going to look like for us? The office is putting together an initiative with the community, which we call the E3 Initiative. We're looking for a 10-year horizon for what's going to happen. Through the series of town hall meetings, which many of you participated in, we have produced a document on `Transforming Energy, the Environment and Science through simulations at the eXtreme Scale'; it can be found at http://www.science.doe.gov/ascr/ProgramDocuments/TownHall.pdf . We sometimes call it the Exascale initiative. Exascale computing is the gold-ring level of computing that seems just out of reach; but if we work hard and stretch, we just might be able to reach it. We envision that there will be a SciDAC-X, working at the extreme scale, with SciDAC teams that will perform and carry out science in the areas that will have a great societal impact, such as alternative fuels and transportation, combustion, climate, fusion science, high-energy physics, advanced fuel cycles, carbon management, and groundwater.
We envision institutes for applied mathematics and computer science that probably will segue into algorithms because, at the extreme scale, we see the applied math, the algorithm per se, and its implementation in computer science as inseparable. We envision an INCITE-X with multi-petaflop platforms, perhaps even exaflop computing resources. ESnet will be best in class - our 10-year plan calls for having 400 terabits per second capacity available in dual rings around the country, an enormously fast data communications network for moving large amounts of data. In looking at where we've been and where we are going, we can see that the gigaflops and teraflops era was a regime where we were following Moore's law through advances in clock speed. In the current regime, we're introducing massive parallelism, which I think is exemplified by Intel's announcement of their teraflop chip, where they envision more than a thousand cores on a chip. But in order to reach exascale, extrapolations talk about machines that require 100 megawatts of power in terms of current architectures. It's clearly going to require novel architectures, things we have perhaps not yet envisioned. It is of course an era of challenge. There will be an unpredictable evolution of hardware if we are to reach the exascale; and there will clearly be multilevel heterogeneous parallelism, including multilevel memory hierarchies. We have no idea right now as to the programming models needed to execute at such an extreme scale. We have been incredibly successful at the petascale - we know that already. Managing data and just getting communications to scale is an enormous challenge. And it's not just the extreme scaling. It's the rapid increase in complexity that represents the challenge. Let me end with a metaphor. In previous meetings we have talked about the road to petascale. Indeed, we have seen in hindsight that it was a road well traveled.
But perhaps the road to exascale is not a road at all. Perhaps the metaphor will be akin to scaling the south face of K2. That's clearly not something all of us will be able to do, and probably computing at the exascale is not something all of us will do. But if we achieve that goal, perhaps the words of Emily Dickinson will best summarize where we will be. Perhaps in her words, looking backward and down, you will say: I climb the `Hill of Science' I view the landscape o'er; Such transcendental prospect I ne'er beheld before!

  18. Final Report National Laboratory Professional Development Workshop for Underrepresented Participants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taylor, Valerie

    The 2013 CMD-IT National Laboratories Professional Development Workshop for Underrepresented Participants (CMD-IT NLPDev 2013) was held at the Oak Ridge National Laboratory campus in Oak Ridge, TN, from June 13-14, 2013. Sponsored by the Department of Energy (DOE) Advanced Scientific Computing Research Program, the primary goal of these workshops is to provide information about career opportunities in computational science at the various national laboratories and to mentor the underrepresented participants through community building and expert presentations focused on career success. This second annual workshop offered sessions to facilitate career advancement and, in particular, the strategies and resources needed to be successful at the national laboratories.

  19. Double photoionization of Be-like (Be-F5+) ions

    NASA Astrophysics Data System (ADS)

    Abdel Naby, Shahin; Pindzola, Michael; Colgan, James

    2015-04-01

    The time-dependent close-coupling method is used to study the single-photon double ionization of Be-like (Be-F5+) ions. Energy and angle differential cross sections are calculated to fully investigate the correlated motion of the two photoelectrons. Symmetric and antisymmetric amplitudes are presented along the isoelectronic sequence for different energy sharing of the emitted electrons. Our total double photoionization cross sections are in good agreement with available theoretical results and experimental measurements along the Be-like ions. This work was supported in part by grants from NSF and US DoE. Computational work was carried out at NERSC in Oakland, California and the National Institute for Computational Sciences in Knoxville, Tennessee.

  20. Neurobiological roots of language in primate audition: common computational properties.

    PubMed

    Bornkessel-Schlesewsky, Ina; Schlesewsky, Matthias; Small, Steven L; Rauschecker, Josef P

    2015-03-01

    Here, we present a new perspective on an old question: how does the neurobiology of human language relate to brain systems in nonhuman primates? We argue that higher-order language combinatorics, including sentence and discourse processing, can be situated in a unified, cross-species dorsal-ventral streams architecture for higher auditory processing, and that the functions of the dorsal and ventral streams in higher-order language processing can be grounded in their respective computational properties in primate audition. This view challenges an assumption, common in the cognitive sciences, that a nonhuman primate model forms an inherently inadequate basis for modeling higher-level language functions. Copyright © 2014 Elsevier Ltd. All rights reserved.

  1. A Bimetallic Nickel–Gallium Complex Catalyzes CO2 Hydrogenation via the Intermediacy of an Anionic d10 Nickel Hydride

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cammarota, Ryan C.; Vollmer, Matthew V.; Xie, Jing

    Large-scale CO2 hydrogenation could offer a renewable stream of industrially important C1 chemicals while reducing CO2 emissions. Critical to this opportunity is the requirement for inexpensive catalysts based on earth-abundant metals instead of precious metals. We report a nickel-gallium complex featuring a Ni(0)→Ga(III) bond that shows remarkable catalytic activity for hydrogenating CO2 to formate at ambient temperature (3150 turnovers, turnover frequency = 9700 h-1), compared with prior homogeneous Ni-centred catalysts. The Lewis acidic Ga(III) ion plays a pivotal role by stabilizing reactive catalytic intermediates, including a rare anionic d10 Ni hydride. The structure of this reactive intermediate shows a terminal Ni-H, for which the hydride donor strength rivals those of precious metal hydrides. Collectively, our experimental and computational results demonstrate that modulating a transition metal center via a direct interaction with a Lewis acidic support can be a powerful strategy for promoting new reactivity paradigms in base-metal catalysis. The work was supported as part of the Inorganometallic Catalysis Design Center, an Energy Frontier Research Center funded by the U.S. Department of Energy (DOE), Office of Science, Basic Energy Sciences under Award DE-SC0012702. R.C.C. and M.V.V. were supported by the DOE Office of Science Graduate Student Research and National Science Foundation Graduate Research Fellowship programs, respectively. J.C.L., S.A.B., and A.M.A. were supported by the U.S. Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences & Biosciences. Pacific Northwest National Laboratory is operated by Battelle for the U.S. Department of Energy.

  2. 10 CFR 727.2 - What are the definitions of the terms used in this part?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... information. Computer means desktop computers, portable computers, computer networks (including the DOE network and local area networks at or controlled by DOE organizations), network devices, automated.... DOE means the Department of Energy, including the National Nuclear Security Administration. DOE...

  3. 10 CFR 727.2 - What are the definitions of the terms used in this part?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... information. Computer means desktop computers, portable computers, computer networks (including the DOE network and local area networks at or controlled by DOE organizations), network devices, automated.... DOE means the Department of Energy, including the National Nuclear Security Administration. DOE...

  4. 10 CFR 727.2 - What are the definitions of the terms used in this part?

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... information. Computer means desktop computers, portable computers, computer networks (including the DOE network and local area networks at or controlled by DOE organizations), network devices, automated.... DOE means the Department of Energy, including the National Nuclear Security Administration. DOE...

  5. 10 CFR 727.2 - What are the definitions of the terms used in this part?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... information. Computer means desktop computers, portable computers, computer networks (including the DOE network and local area networks at or controlled by DOE organizations), network devices, automated.... DOE means the Department of Energy, including the National Nuclear Security Administration. DOE...

  6. 10 CFR 727.2 - What are the definitions of the terms used in this part?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... information. Computer means desktop computers, portable computers, computer networks (including the DOE network and local area networks at or controlled by DOE organizations), network devices, automated.... DOE means the Department of Energy, including the National Nuclear Security Administration. DOE...

  7. Institutional Computing Executive Group Review of Multi-programmatic & Institutional Computing, Fiscal Year 2005 and 2006

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Langer, S; Rotman, D; Schwegler, E

    The Institutional Computing Executive Group (ICEG) review of FY05-06 Multiprogrammatic and Institutional Computing (M and IC) activities is presented in the attached report. In summary, we find that the M and IC staff does an outstanding job of acquiring and supporting a wide range of institutional computing resources to meet the programmatic and scientific goals of LLNL. The responsiveness and high quality of support given to users and the programs investing in M and IC reflects the dedication and skill of the M and IC staff. M and IC has successfully managed serial capacity, parallel capacity, and capability computing resources. Serial capacity computing supports a wide range of scientific projects which require access to a few high performance processors within a shared memory computer. Parallel capacity computing supports scientific projects that require a moderate number of processors (up to roughly 1000) on a parallel computer. Capability computing supports parallel jobs that push the limits of simulation science. M and IC has worked closely with Stockpile Stewardship, and together they have made LLNL a premier institution for computational and simulation science. Such a standing is vital to the continued success of laboratory science programs and to the recruitment and retention of top scientists. This report provides recommendations to build on M and IC's accomplishments and improve simulation capabilities at LLNL.
We recommend that the institution fully fund (1) operation of the atlas cluster purchased in FY06 to support a few large projects; (2) operation of the thunder and zeus clusters to enable 'mid-range' parallel capacity simulations during normal operation and a limited number of large simulations during dedicated application time; (3) operation of the new yana cluster to support a wide range of serial capacity simulations; (4) improvements to the reliability and performance of the Lustre parallel file system; (5) support for the new GDO petabyte-class storage facility on the green network for use in data intensive external collaborations; and (6) continued support for visualization and other methods for analyzing large simulations. We also recommend that M and IC begin planning in FY07 for the next upgrade of its parallel clusters. LLNL investments in M and IC have resulted in a world-class simulation capability leading to innovative science. We thank the LLNL management for its continued support and thank the M and IC staff for its vision and dedicated efforts to make it all happen.

  8. BROOKHAVEN NATIONAL LABORATORY INSTITUTIONAL PLAN FY2003-2007.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    This document presents the vision for Brookhaven National Laboratory (BNL) for the next five years, and a roadmap for implementing that vision. Brookhaven is a multidisciplinary science-based laboratory operated for the U.S. Department of Energy (DOE), supported primarily by programs sponsored by the DOE's Office of Science. As the third-largest funding agency for science in the U.S., one of the DOE's goals is ''to advance basic research and the instruments of science that are the foundations for DOE's applied missions, a base for U.S. technology innovation, and a source of remarkable insights into our physical and biological world, and the nature of matter and energy'' (DOE Office of Science Strategic Plan, 2000 http://www.osti.gov/portfolio/science.htm). BNL shapes its vision according to this plan.

  9. DOE Radiation Research Program is floundering - NAS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lobsenz, G.

    1994-04-20

    The Energy Department's radiation health effects research program is floundering in a morass of administrative confusion due to an ill-considered 1990 joint management agreement between DOE and the Health and Human Services Department, a National Academy of Sciences panel says. The NAS panel said the "administrative difficulties" created by the DOE-HHS agreement appear to be "stifling creativity and efficiency within DOE's Epidemiology Research Program, delaying the completion and publication of research." The panel also expressed concern that DOE has failed to adequately fund or staff its health research office, and that the department had no master research plan to identify research needs or set forth uniform, scientifically rigorous data collection procedures. The panel said DOE's lack of commitment was particularly evident in its failure to set up an effective health surveillance program for its nuclear work force. In addition, the panel said DOE had fallen short on promises to create a comprehensive computer bank of health research data that would be continually updated with new information gleaned from an ongoing worker surveillance program. While recommending enhancements, the NAS panel emphasized that DOE's health research program would not be able to function effectively until the department revamped its joint management agreement with HHS.

  10. High Performance Computing of Meshless Time Domain Method on Multi-GPU Cluster

    NASA Astrophysics Data System (ADS)

    Ikuno, Soichiro; Nakata, Susumu; Hirokawa, Yuta; Itoh, Taku

    2015-01-01

    High-performance computing of the Meshless Time Domain Method (MTDM) on a multi-GPU cluster, using the supercomputer HA-PACS (Highly Accelerated Parallel Advanced system for Computational Sciences) at the University of Tsukuba, is investigated. Generally, the finite-difference time-domain (FDTD) method is adopted for numerical simulation of electromagnetic wave propagation phenomena. However, FDTD requires the numerical domain to be divided into rectangular meshes, which makes the method difficult to apply to problems in complex domains. MTDM, by contrast, adapts easily to such problems because it requires no meshes. In the present study, we implement MTDM on a multi-GPU cluster to speed up the method and numerically investigate its performance. To reduce the computation time, the communication between decomposed domains is hidden behind the perfectly matched layer (PML) calculation procedure. The results show that MTDM on 128 GPUs runs 173 times faster than a single-CPU calculation.
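    The rectangular-mesh update that ties FDTD to a regular grid, as contrasted with the meshless MTDM above, can be sketched in a few lines. The following is a hypothetical 1D Yee-scheme toy in normalized units (all names and parameters are illustrative, not from the paper):

```python
import math

def fdtd_1d(steps, n=200, src=100):
    """Advance a 1D Yee-grid FDTD simulation and return the E field.

    The staggered lists ez/hy mirror the integer and half-integer grid
    points of the Yee scheme; the nested loops are the 'rectangular mesh'
    structure the abstract refers to.
    """
    ez = [0.0] * n        # electric field at integer grid points
    hy = [0.0] * (n - 1)  # magnetic field at half-integer grid points
    c = 0.5               # Courant number dt/dx; stable for c <= 1
    for t in range(steps):
        for i in range(n - 1):            # update H from the curl of E
            hy[i] += c * (ez[i + 1] - ez[i])
        for i in range(1, n - 1):         # update E from the curl of H
            ez[i] += c * (hy[i] - hy[i - 1])
        ez[src] += math.exp(-((t - 30.0) / 10.0) ** 2)  # soft Gaussian source
    return ez
```

    A Gaussian pulse injected at the source point splits and propagates toward both boundaries; the fixed grid spacing is exactly what a meshless method avoids when the domain geometry is irregular.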

  11. Final Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kirkpatrick, R. James

    This document serves as the final report for United States Department of Energy Basic Energy Sciences Grant DE-FG02-08ER15929, “Computational and Spectroscopic Investigations of the Molecular Scale Structure and Dynamics of Geologically Important Fluids and Mineral-Fluid Interfaces” (R. James Kirkpatrick, P.I., A. O. Yazaydin, co-P.I.). The research under this grant was intimately tied to that supported by the parallel grant of the same title at Alfred (DOE DE-FG02-10ER16128; Geoffrey M. Bowers, P.I.).

  12. 78 FR 62609 - DOE/NSF Nuclear Science Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-22

    ... Secretariat, General Services Administration, notice is hereby given that the DOE/NSF Nuclear Science Advisory Committee (NSAC) will be renewed for a two-year period. The Committee will provide advice and... research. Additionally, the renewal of the DOE/NSF Nuclear Science Advisory Committee has been determined...

  13. Factors influencing exemplary science teachers' levels of computer use

    NASA Astrophysics Data System (ADS)

    Hakverdi, Meral

    This study examines exemplary science teachers' use of technology in science instruction, factors influencing their level of computer use, their level of knowledge/skills in using specific computer applications for science instruction, their use of computer-related applications/tools during their instruction, and their students' use of computer applications/tools in or for their science class. After a relevant review of the literature, certain variables were selected for analysis. These variables included personal self-efficacy in teaching with computers, outcome expectancy, pupil-control ideology, level of computer use, age, gender, teaching experience, personal computer use, professional computer use, and science teachers' level of knowledge/skills in using specific computer applications for science instruction. The sample for this study includes middle and high school science teachers who received the Presidential Award for Excellence in Science Teaching (sponsored by the White House and the National Science Foundation) between the years 1997 and 2003 from all 50 states and U.S. territories. Award-winning science teachers were contacted about the survey via e-mail or letter with an enclosed return envelope. Of the 334 award-winning science teachers, usable responses were received from 92, a response rate of 27.5%. Analysis of the survey responses indicated that exemplary science teachers have a variety of knowledge/skills in using computer-related applications/tools. The most commonly used computer applications/tools are information retrieval via the Internet, presentation tools, online communication, digital cameras, and data collection probes. Results of the study revealed that students' use of technology in their science classroom is highly correlated with the frequency of their science teachers' use of computer applications/tools.
The results of the multiple regression analysis revealed that personal self-efficacy was related to the exemplary science teachers' level of computer use, suggesting that computer use depends on perceived ability with computers. The teachers' use of computer-related applications/tools during class, together with their personal self-efficacy, age, and gender, was strongly related to their level of knowledge/skills in using specific computer applications for science instruction. That level of knowledge/skills, along with gender, was in turn related to the teachers' use of computer-related applications/tools during class and to the students' use of such tools in or for their science class. In conclusion, exemplary science teachers need assistance in learning and using computer-related applications/tools in their science classes.
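    The teacher-student correlation reported above is the standard Pearson product-moment coefficient. A minimal pure-Python sketch, using hypothetical weekly-use counts rather than the study's actual data:

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical data: teacher computer use vs. their students' use
teacher = [1, 3, 4, 6, 8]
student = [2, 5, 5, 9, 11]
```

    Values near +1 indicate the kind of strong positive association the survey found; a multiple regression as described in the abstract would extend this to several predictors at once.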

  14. Sandia National Laboratories: About Sandia: Community Involvement:

    Science.gov Websites

    DOE Regional Science Bowls - New Mexico DOE Regional Science Bowls - California Family Math Night Family Science Night Science, Technology, Engineering, and Math Programs About Education Programs a national concern. Encouraging students to pursue science, technology, engineering, and math (STEM

  15. 78 FR 12044 - DOE/NSF Nuclear Science Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-21

    ... DEPARTMENT OF ENERGY DOE/NSF Nuclear Science Advisory Committee AGENCY: Office of Science... Nuclear Science Advisory Committee (NSAC). The Federal Advisory Committee Act (Pub. L. 92-463, 86 Stat... Energy and the National Science Foundation on scientific priorities within the field of basic nuclear...

  16. Preface: SciDAC 2005

    NASA Astrophysics Data System (ADS)

    Mezzacappa, Anthony

    2005-01-01

    On 26-30 June 2005 at the Grand Hyatt on Union Square in San Francisco several hundred computational scientists from around the world came together for what can certainly be described as a celebration of computational science. Scientists from the SciDAC Program and scientists from other agencies and nations were joined by applied mathematicians and computer scientists to highlight the many successes in the past year where computation has led to scientific discovery in a variety of fields: lattice quantum chromodynamics, accelerator modeling, chemistry, biology, materials science, Earth and climate science, astrophysics, and combustion and fusion energy science. Also highlighted were the advances in numerical methods and computer science, and the multidisciplinary collaboration cutting across science, mathematics, and computer science that enabled these discoveries. The SciDAC Program was conceived and funded by the US Department of Energy Office of Science. It is the Office of Science's premier computational science program founded on what is arguably the perfect formula: the priority and focus is science and scientific discovery, with the understanding that the full arsenal of `enabling technologies' in applied mathematics and computer science must be brought to bear if we are to have any hope of attacking and ultimately solving today's computational Grand Challenge problems. The SciDAC Program has been in existence for four years, and many of the computational scientists funded by this program will tell you that the program has given them the hope of addressing their scientific problems in full realism for the very first time. Many of these scientists will also tell you that SciDAC has also fundamentally changed the way they do computational science. We begin this volume with one of DOE's great traditions, and core missions: energy research. As we will see, computation has been seminal to the critical advances that have been made in this arena. 
Of course, to understand our world, whether it is to understand its very nature or to understand it so as to control it for practical application, will require explorations on all of its scales. Computational science has been no less an important tool in this arena than it has been in the arena of energy research. From explorations of quantum chromodynamics, the fundamental theory that describes how quarks make up the protons and neutrons of which we are composed, to explorations of the complex biomolecules that are the building blocks of life, to explorations of some of the most violent phenomena in our universe and of the Universe itself, computation has provided not only significant insight, but often the only means by which we have been able to explore these complex, multicomponent systems and by which we have been able to achieve scientific discovery and understanding. While our ultimate target remains scientific discovery, it certainly can be said that at a fundamental level the world is mathematical. Equations ultimately govern the evolution of the systems of interest to us, be they physical, chemical, or biological systems. The development and choice of discretizations of these underlying equations is often a critical deciding factor in whether or not one is able to model such systems stably, faithfully, and practically, and in turn, the algorithms to solve the resultant discrete equations are the complementary, critical ingredient in the recipe to model the natural world. The use of parallel computing platforms, especially at the TeraScale, and the trend toward even larger numbers of processors, continue to present significant challenges in the development and implementation of these algorithms. Computational scientists often speak of their `workflows'. 
A workflow, as the name suggests, is the sum total of all complex and interlocking tasks, from simulation set up, execution, and I/O, to visualization and scientific discovery, through which the advancement in our understanding of the natural world is realized. For the computational scientist, enabling such workflows presents myriad significant challenges, and it is computer scientists that are called upon at such times to address these challenges. Simulations are currently generating data at the staggering rate of tens of TeraBytes per simulation, over the course of days. In the next few years, these data generation rates are expected to climb exponentially to hundreds of TeraBytes per simulation, performed over the course of months. The output, management, movement, analysis, and visualization of these data will be our key to unlocking the scientific discoveries buried within the data. And there is no hope of generating such data to begin with, or of scientific discovery, without stable computing platforms and a sufficiently high and sustained performance of scientific applications codes on them. Thus, scientific discovery in the realm of computational science at the TeraScale and beyond will occur at the intersection of science, applied mathematics, and computer science. The SciDAC Program was constructed to mirror this reality, and the pages that follow are a testament to the efficacy of such an approach. We would like to acknowledge the individuals on whose talents and efforts the success of SciDAC 2005 was based.
Special thanks go to Betsy Riley for her work on the SciDAC 2005 Web site and meeting agenda, for lining up our corporate sponsors, for coordinating all media communications, and for her efforts in processing the proceedings contributions, to Sherry Hempfling for coordinating the overall SciDAC 2005 meeting planning, for handling a significant share of its associated communications, and for coordinating with the ORNL Conference Center and Grand Hyatt, to Angela Harris for producing many of the documents and records on which our meeting planning was based and for her efforts in coordinating with ORNL Graphics Services, to Angie Beach of the ORNL Conference Center for her efforts in procurement and setting up and executing the contracts with the hotel, and to John Bui and John Smith for their superb wireless networking and A/V set up and support. We are grateful for the relentless efforts of all of these individuals, their remarkable talents, and for the joy of working with them during this past year. They were the cornerstones of SciDAC 2005. Thanks also go to Kymba A'Hearn and Patty Boyd for on-site registration, Brittany Hagen for administrative support, Bruce Johnston for netcast support, Tim Jones for help with the proceedings and Web site, Sherry Lamb for housing and registration, Cindy Lathum for Web site design, Carolyn Peters for on-site registration, and Dami Rich for graphic design. And we would like to express our appreciation to the Oak Ridge National Laboratory, especially Jeff Nichols, the Argonne National Laboratory, the Lawrence Berkeley National Laboratory, and to our corporate sponsors, Cray, IBM, Intel, and SGI, for their support. We would like to extend special thanks also to our plenary speakers, technical speakers, poster presenters, and panelists for all of their efforts on behalf of SciDAC 2005 and for their remarkable achievements and contributions. 
We would like to express our deep appreciation to Lali Chatterjee, Graham Douglas and Margaret Smith of Institute of Physics Publishing, who worked tirelessly in order to provide us with this finished volume within two months, which is nothing short of miraculous. Finally, we wish to express our heartfelt thanks to Michael Strayer, SciDAC Director, whose vision it was to focus SciDAC 2005 on scientific discovery, around which all of the excitement we experienced revolved, and to our DOE SciDAC program managers, especially Fred Johnson, for their support, input, and help throughout.

  17. Auditing for Veracity ``DUE-DILIGENCE" RIGOROUS-HONESTY!!!: Ethics??? Digits? For a Very-Long Time Giving us All the FINGER!!! does ``MEAN" Mean MEAN!!!???

    NASA Astrophysics Data System (ADS)

    Martin, Brian; Siegel, E.

    2010-03-01

    BAD-scienceS UNethics FLOOD: Pilkey[UseLESS-Arithmetic(2007)]-Park[Voodoo Science(2006)]-Dewdney[Yes We Have NO Neutrons(1997)]-LeVay[When Science Goes WRONG(2008)-LBNL/DOE IMAGINARY-element-``118")]Bell-Labs/Alcatel-Lucent/Thales-Group/France: L'AffairS: Jan Hendrik Schoen; Giant-Magnetoresistance: ``Fert"-``Gruenberg"[PRL(1988;1989)]Kern(KFZ)/Reed-Elsevier/Wallenbergs/Enskilda-Bank/InvestorAB/Sweden: LONG-AFTER Siegel[flickr.com, search on ``Giant-Magnetoresistance": find: ICMAO, Haifa(1977); J.Mag.Mag.Mtls.(JMMM)7,312(1978): 1978<<<1988: (1988-1978)=10 years = one full decade!!!-SANS CRUCIAL last-2-R(H)-Figures MISSING-ONLY SCANNED online(7/2008) conveniently 1/2-year AFTER 2007-Physics: Wolf/Japan/Nobel-prizes(12/2007)]-Revkin[dot.earth,NYT(8/2009)] ``Sea-level-Rise Predictions HALVED"(=50%-error: by coin-toss Bernoulli ``super-computer"(SC)!!!)-McNeil[NY(8/2009)]``H1N1-Flu (Langer-Carlson-Bak forest-fire SOC: Vespigniani-Germann) epidemiology-models predicted cases: ˜1,000-3,000 max. VS. same-week CDC-data 100,000-300,000"(=100%-error: by drunk dart-throws Newton F=ma ``SC"!!!)-Financials(2008)!!!: AD INFINITUM AD NAUSEUM!!! Statistical-lawS[Biostatistics(1998)]: ``TRUST, BUT VERIFY!!!": ABSOLUTELY MANDATORY!!!

  18. Haptic augmentation of science instruction: Does touch matter?

    NASA Astrophysics Data System (ADS)

    Jones, M. Gail; Minogue, James; Tretter, Thomas R.; Negishi, Atsuko; Taylor, Russell

    2006-01-01

    This study investigated the impact of haptic augmentation of a science inquiry program on students' learning about viruses and nanoscale science. The study assessed how the addition of different types of haptic feedback (active touch and kinesthetic feedback) combined with computer visualizations influenced middle and high school students' experiences. The influences of a PHANToM (a sophisticated haptic desktop device), a Sidewinder (a haptic gaming joystick), and a mouse (no haptic feedback) interface were compared. The levels of engagement in the instruction and students' attitudes about the instructional program were assessed using a combination of constructed response and Likert scale items. Potential cognitive differences were examined through an analysis of spontaneously generated analogies that appeared during student discourse. Results showed that the addition of haptic feedback from the haptic-gaming joystick and the PHANToM provided a more immersive learning environment that not only made the instruction more engaging but may also influence the way in which the students construct their understandings about abstract science concepts.

  19. The Complexity of Primary Care Psychology: Theoretical Foundations.

    PubMed

    Smit, E H; Derksen, J J L

    2015-07-01

    How does primary care psychology deal with organized complexity? Has it escaped Newtonian science? Has it, as Weaver (1991) suggests, found a way to 'manage problems with many interrelated factors that cannot be dealt with by statistical techniques'? Computer simulations and mathematical models in psychology are ongoing positive developments in the study of complex systems. However, the theoretical development of complex systems in psychology lags behind these advances. In this article we use complexity science to develop a theory on experienced complexity in the daily practice of primary care psychologists. We briefly answer the ontological question of what we see (from the perspective of primary care psychology) as reality, the epistemological question of what we can know, the methodological question of how to act, and the ethical question of what is good care. Following our empirical study, we conclude that complexity science can describe the experienced complexity of the psychologist and offer room for personalized client-centered care. Complexity science is slowly filling the gap between the dominant reductionist theory and complex daily practice.

  20. Minitrack on data and knowledge base issues in genomics at the 27th Hawaii International Conference on system sciences

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1995-05-01

    This report is a summary of the proceedings from the Minitrack on Data and Knowledge Base Issues in Genomics at the 27th Hawaii International Conference on System Science, January 4 - 7, 1994. The minitrack was organized by Dong-Guk Shin (University of Connecticut) and Francois Rechenmann (INRIA, France). Support was jointly provided by the NSF, NIH and DOE. The minitrack included, after rigorous review, ten full papers and four extended abstracts in the following five different research subareas of genome informatics: data modeling and management, sequence analysis, graphical user interface, interoperation in a heterogeneous computing environment, and system integration in a knowledge-based approach.

  1. Reasons for 2011 Release of the Evaluated Nuclear Data Library (ENDL2011.0)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, D.; Escher, J.; Hoffman, R.

    LLNL's Computational Nuclear Physics Group and Nuclear Theory and Modeling Group have collaborated to create the 2011 release of the Evaluated Nuclear Data Library (ENDL2011). ENDL2011 is designed to support LLNL's current and future nuclear data needs. This database is currently the most complete nuclear database for Monte Carlo and deterministic transport of neutrons and charged particles, surpassing ENDL2009.0 [1]. The ENDL2011 release [2] contains 918 transport-ready evaluations in the neutron sub-library alone. ENDL2011 was assembled with strong support from the ASC program, leveraged with support from NNSA science campaigns and the DOE/Office of Science US Nuclear Data Program.

  2. A DOE Perspective

    NASA Astrophysics Data System (ADS)

    Bennett, Kristin

    2004-03-01

    As one of the lead agencies for nanotechnology research and development, the Department of Energy (DOE) is revolutionizing the way we understand and manipulate materials at the nanoscale. As the Federal government's single largest supporter of basic research in the physical sciences in the United States, and overseeing the Nation's cross-cutting research programs in high-energy physics, nuclear physics, and fusion energy sciences, the DOE guides the grand challenges in nanomaterials research that will have an impact on everything from medicine, to energy production, to manufacturing. Within the DOE's Office of Science, the Office of Basic Energy Sciences (BES) leads research and development at the nanoscale, which supports the Department's missions of national security, energy, science, and the environment. The cornerstone of the program in nanoscience is the establishment and operation of five new Nanoscale Science Research Centers (NSRCs), which are under development at six DOE Laboratories. Throughout its history, DOE's Office of Science has designed, constructed and operated many of the nation's most advanced, large-scale research and development user facilities, of importance to all areas of science. These state-of-the art facilities are shared with the science community worldwide and contain technologies and instruments that are available nowhere else. Like all DOE national user facilities, the new NSRCs are designed to make novel state-of-the-art research tools available to the world, and to accelerate a broad scale national effort in basic nanoscience and nanotechnology. The NSRCs will be sited adjacent to or near existing DOE/BES major user facilities, and are designed to enable national user access to world-class capabilities for the synthesis, processing, fabrication, and analysis of materials at the nanoscale, and to transform the nation's approach to nanomaterials.

  3. Networking for large-scale science: infrastructure, provisioning, transport and application mapping

    NASA Astrophysics Data System (ADS)

    Rao, Nageswara S.; Carter, Steven M.; Wu, Qishi; Wing, William R.; Zhu, Mengxia; Mezzacappa, Anthony; Veeraraghavan, Malathi; Blondin, John M.

    2005-01-01

    Large-scale science computations and experiments require unprecedented network capabilities in the form of large bandwidth and dynamically stable connections to support data transfers, interactive visualizations, and monitoring and steering operations. A number of component technologies dealing with the infrastructure, provisioning, transport and application mappings must be developed and/or optimized to achieve these capabilities. We present a brief account of the following technologies that contribute toward achieving these network capabilities: (a) DOE UltraScienceNet and NSF CHEETAH network testbeds that provide on-demand and scheduled dedicated network connections; (b) experimental results on transport protocols that achieve close to 100% utilization on dedicated 1Gbps wide-area channels; (c) a scheme for optimally mapping a visualization pipeline onto a network to minimize the end-to-end delays; and (d) interconnect configuration and protocols that provide multiple Gbps flows from Cray X1 to external hosts.
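    The pipeline-mapping problem in (c) can be illustrated with a small dynamic program over stages and hosts. This is a hedged sketch of the general idea only; the cost matrices, function name, and delay model are hypothetical, not the authors' scheme:

```python
def map_pipeline(comp, link):
    """Minimum end-to-end delay for assigning pipeline stages to hosts.

    comp[s][h] -- processing delay of stage s when placed on host h
    link[a][b] -- transfer delay from host a to host b (0 when a == b)
    dp[h] holds the best total delay for the stages so far, ending on host h.
    """
    n_hosts = len(comp[0])
    dp = comp[0][:]  # stage 0 placed directly on each candidate host
    for stage in comp[1:]:
        # For each host h, pick the cheapest predecessor host p,
        # pay the link delay p -> h, then the stage's compute cost on h.
        dp = [min(dp[p] + link[p][h] for p in range(n_hosts)) + stage[h]
              for h in range(n_hosts)]
    return min(dp)
```

    With comp = [[1, 10], [10, 1]] and link = [[0, 2], [2, 0]], the optimum places stage 0 on host 0 and stage 1 on host 1, paying one transfer: total delay 1 + 2 + 1 = 4. The real scheme would additionally weigh bandwidth and data volumes, but the stage-by-stage minimization has this shape.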

  4. Role of Polymer-grafted Nanoparticle Interactions in Supercrystal Self-Assembly

    NASA Astrophysics Data System (ADS)

    Horst, Nathan; Waltmann, Curt; Travesset, Alex

    Many successful strategies are available for the programmable self-assembly of nanoparticle superlattices. In this talk, we discuss the case of nanoparticles with grafted polymer ligands. For very short polymers, the phase diagram is rationalized by borrowing results from hard-sphere packing models. Although a clear correlation exists between the maximum of the packing fraction of hard spheres and supercrystal equilibrium phases found experimentally, these systems are flexible, which leads to clear deviations from the sphere packing model. Using theoretical and computational models, we present an investigation of the interactions of polymer-grafted nanoparticles, focusing on the role of the rigidity of the chain, and how it affects the resulting two and three-dimensional superlattice structures. Comparison with an experimental system of gold nanoparticles grafted with polyethylene glycol is also presented. Supported by the U.S. Department of Energy (U.S. DOE), Office of Basic Energy Sciences, Division of Materials Sciences and Engineering. Ames Laboratory is operated for the U.S. DOE by Iowa State University under Contract No. DE-AC02-07CH11358.

  5. 1996 Site environmental report Sandia National Laboratories Albuquerque, New Mexico

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fink, C.H.; Duncan, D.; Sanchez, R.

    1997-08-01

    Sandia National Laboratories/New Mexico (SNL/NM) is operated in support of the U.S. Department of Energy (DOE) mission to provide weapon component technology and hardware for national security needs, and to conduct fundamental research and development (R&D) to advance technology in energy research, computer science, waste management, electronics, materials science, and transportation safety for hazardous and nuclear components. In support of this mission, the Environmental Safety and Health (ES&H) Center at SNL/NM conducts extensive environmental monitoring, surveillance, and compliance activities to assist SNL's line organizations in meeting all environmental regulations applicable to the site, including those governing radiological and nonradiological effluents and emissions. Also included herein is the status of environmental programs that direct and manage activities such as terrestrial surveillance; ambient air and meteorological monitoring; hazardous, radioactive, and solid waste management; pollution prevention and waste minimization; environmental restoration (ER); oil and chemical spill prevention; and National Environmental Policy Act (NEPA) documentation. This report has been prepared in compliance with DOE order 5400.1, General Environmental Protection.

  6. Additions and improvements to the high energy density physics capabilities in the FLASH code

    NASA Astrophysics Data System (ADS)

    Lamb, D.; Bogale, A.; Feister, S.; Flocke, N.; Graziani, C.; Khiar, B.; Laune, J.; Tzeferacos, P.; Walker, C.; Weide, K.

    2017-10-01

    FLASH is an open-source, finite-volume Eulerian, spatially-adaptive radiation magnetohydrodynamics code that has the capabilities to treat a broad range of physical processes. FLASH performs well on a wide range of computer architectures, and has a broad user base. Extensive high energy density physics (HEDP) capabilities exist in FLASH, which make it a powerful open toolset for the academic HEDP community. We summarize these capabilities, emphasizing recent additions and improvements. We describe several non-ideal MHD capabilities that are being added to FLASH, including the Hall and Nernst effects, implicit resistivity, and a circuit model, which will allow modeling of Z-pinch experiments. We showcase the ability of FLASH to simulate Thomson scattering polarimetry, which measures Faraday rotation due to the presence of magnetic fields, as well as proton radiography, proton self-emission, and Thomson scattering diagnostics. Finally, we describe several collaborations with the academic HEDP community in which FLASH simulations were used to design and interpret HEDP experiments. This work was supported in part at U. Chicago by DOE NNSA ASC through the Argonne Institute for Computing in Science under FWP 57789; DOE NNSA under NLUF Grant DE-NA0002724; DOE SC OFES Grant DE-SC0016566; and NSF Grant PHY-1619573.
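The Faraday rotation mentioned above scales with wavelength squared and with the line integral of electron density times the parallel magnetic field. A hedged sketch (not FLASH code) using the standard astrophysical unit convention, RM [rad m⁻²] ≈ 0.812 · n_e [cm⁻³] · B∥ [μG] · L [pc]; laboratory diagnostics use the same physics at very different scales:

```python
def faraday_rotation_angle(n_e_cm3, b_par_ugauss, path_pc, wavelength_m):
    """Rotation angle (radians) of linear polarization after traversing a
    magnetized plasma, for a uniform slab. The angle grows as wavelength
    squared, which is why polarimetry measures it at multiple wavelengths."""
    rotation_measure = 0.812 * n_e_cm3 * b_par_ugauss * path_pc  # rad / m^2
    return rotation_measure * wavelength_m ** 2
```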

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pruski, Marek; Sadow, Aaron D.; Slowing, Igor I.

    Catalysis research at the U.S. Department of Energy's (DOE's) National Laboratories covers a wide range of research topics in heterogeneous catalysis, homogeneous/molecular catalysis, biocatalysis, electrocatalysis, and surface science. Since much of the work at National Laboratories is funded by DOE, the research is largely focused on addressing DOE's mission to ensure America's security and prosperity by addressing its energy, environmental, and nuclear challenges through transformative science and technology solutions. The catalysis research carried out at the DOE National Laboratories ranges from very fundamental catalysis science, funded by DOE's Office of Basic Energy Sciences (BES), to applied research and development (R&D) in areas such as biomass conversion to fuels and chemicals, fuel cells, and vehicle emission control with primary funding from DOE's Office of Energy Efficiency and Renewable Energy.

  8. Key Lessons in Building "Data Commons": The Open Science Data Cloud Ecosystem

    NASA Astrophysics Data System (ADS)

    Patterson, M.; Grossman, R.; Heath, A.; Murphy, M.; Wells, W.

    2015-12-01

    Cloud computing technology has created a shift around data and data analysis by allowing researchers to push computation to data as opposed to having to pull data to an individual researcher's computer. Consequently, cloud-based resources can provide unique opportunities to capture computing environments used both to access raw data in its original form and also to create analysis products which may be the source of data for tables and figures presented in research publications. Since 2008, the Open Cloud Consortium (OCC) has operated the Open Science Data Cloud (OSDC), which provides scientific researchers with computational resources for storing, sharing, and analyzing large (terabyte- and petabyte-scale) scientific datasets. OSDC has provided compute and storage services to over 750 researchers in a wide variety of data-intensive disciplines. Recently, internal users have logged about 2 million core hours each month. The OSDC also serves the research community by colocating these resources with access to nearly a petabyte of public scientific datasets in a variety of fields, also accessible for download externally by the public. In our experience operating these resources, researchers are well served by "data commons," meaning cyberinfrastructure that colocates data archives, computing, and storage infrastructure and supports essential tools and services for working with scientific data. In addition to the OSDC public data commons, the OCC operates a data commons in collaboration with NASA and is developing a data commons for NOAA datasets. As cloud-based infrastructures for distributing and computing over data become more pervasive, we ask, "What does it mean to publish data in a data commons?" Here we present the OSDC perspective and discuss several services that are key in architecting data commons, including digital identifier services.
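The digital identifier services mentioned above give published datasets stable, resolvable names. A minimal content-addressed sketch of that idea; the `osdc:` prefix, class name, and in-memory registry are illustrative assumptions, not the OSDC's actual service:

```python
import hashlib

class IdentifierService:
    """Toy identifier service: mint stable IDs from dataset content and
    resolve them to a storage location (e.g. an object-store URL)."""

    def __init__(self, prefix="osdc"):
        self.prefix = prefix
        self.registry = {}          # identifier -> location

    def mint(self, payload: bytes, location: str) -> str:
        # Hashing the content makes the identifier deterministic and lets a
        # consumer verify that the bytes they fetched match the identifier.
        digest = hashlib.sha256(payload).hexdigest()[:16]
        ident = f"{self.prefix}:{digest}"
        self.registry[ident] = location
        return ident

    def resolve(self, ident: str) -> str:
        return self.registry[ident]
```

Production services would add persistence, access control, and metadata, but the mint/resolve split is the core of the design.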

  9. 10 CFR 727.5 - What acknowledgment and consent is required for access to information on DOE computers?

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... information on DOE computers? 727.5 Section 727.5 Energy DEPARTMENT OF ENERGY CONSENT FOR ACCESS TO INFORMATION ON DEPARTMENT OF ENERGY COMPUTERS § 727.5 What acknowledgment and consent is required for access to information on DOE computers? An individual may not be granted access to information on a DOE...

  10. 10 CFR 727.5 - What acknowledgment and consent is required for access to information on DOE computers?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... information on DOE computers? 727.5 Section 727.5 Energy DEPARTMENT OF ENERGY CONSENT FOR ACCESS TO INFORMATION ON DEPARTMENT OF ENERGY COMPUTERS § 727.5 What acknowledgment and consent is required for access to information on DOE computers? An individual may not be granted access to information on a DOE...

  11. 10 CFR 727.5 - What acknowledgment and consent is required for access to information on DOE computers?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... information on DOE computers? 727.5 Section 727.5 Energy DEPARTMENT OF ENERGY CONSENT FOR ACCESS TO INFORMATION ON DEPARTMENT OF ENERGY COMPUTERS § 727.5 What acknowledgment and consent is required for access to information on DOE computers? An individual may not be granted access to information on a DOE...

  12. 10 CFR 727.5 - What acknowledgment and consent is required for access to information on DOE computers?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... information on DOE computers? 727.5 Section 727.5 Energy DEPARTMENT OF ENERGY CONSENT FOR ACCESS TO INFORMATION ON DEPARTMENT OF ENERGY COMPUTERS § 727.5 What acknowledgment and consent is required for access to information on DOE computers? An individual may not be granted access to information on a DOE...

  13. 10 CFR 727.5 - What acknowledgment and consent is required for access to information on DOE computers?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... information on DOE computers? 727.5 Section 727.5 Energy DEPARTMENT OF ENERGY CONSENT FOR ACCESS TO INFORMATION ON DEPARTMENT OF ENERGY COMPUTERS § 727.5 What acknowledgment and consent is required for access to information on DOE computers? An individual may not be granted access to information on a DOE...

  14. 12 CFR 516.10 - How does OTS compute time periods under this part?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... APPLICATION PROCESSING PROCEDURES § 516.10 How does OTS compute time periods under this part? In computing time periods under this part, OTS does not include the day of the act or event that commences the time... 12 Banks and Banking 5 2010-01-01 2010-01-01 false How does OTS compute time periods under this...
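The day-counting rule quoted above (the day of the act or event that commences the period is not counted) can be sketched in a few lines. This is an illustration of the stated rule only; any weekend or holiday adjustments the full regulation may contain are omitted, and the function name is mine:

```python
from datetime import date, timedelta

def ots_deadline(act_date: date, period_days: int) -> date:
    # Day 1 of the period is the day AFTER the triggering act, so an
    # N-day period ends N calendar days after the act date.
    return act_date + timedelta(days=period_days)
```

For instance, a 30-day period triggered on January 1 runs from January 2 through January 31.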

  15. Promoting Pre-college Science Education

    NASA Astrophysics Data System (ADS)

    Taylor, P. L.; Lee, R. L.

    2000-10-01

    The Fusion Education Program, with continued support from DOE, has strengthened its interactions with educators in promoting pre-college science education for students. Projects aggressively pursued this year include an on-site, college credited, laboratory-based 10-day educator workshop on plasma and fusion science; completion of "Starpower", a fusion power plant simulation on interactive CD; expansion of scientist visits to classrooms; broadened participation in an internet-based science olympiad; and enhancements to the tours of the DIII-D Facility. In the workshop, twelve teachers used bench top devices to explore basic plasma physics. Also included were radiation experiments, computer aided drafting, techniques to integrate fusion science and technology in the classroom, and visits to a University Physics lab and the San Diego Supercomputer Center. Our "Scientist in a Classroom" program reached more than 2200 students at 20 schools. Our "Starpower" CD allows a range of interactive learning from the effects of electric and magnetic fields on charged particles to operation of a Tokamak-based power plant. Continuing tours of the DIII-D facility were attended by more than 800 students this past year.

  16. The melting temperature of liquid water with the effective fragment potential

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brorsen, Kurt R.; Willow, Soohaeng Y.; Xantheas, Sotiris S.

    2015-09-17

    Direct simulation of the solid-liquid water interface with the effective fragment potential (EFP) via the constant enthalpy and pressure (NPH) ensemble was used to estimate the melting temperature (Tm) of ice-Ih. Initial configurations and velocities, taken from equilibrated constant pressure and temperature (NPT) simulations at T = 300 K, 350 K and 400 K, respectively, yielded corresponding Tm values of 378±16 K, 382±14 K and 384±15 K. These estimates are consistently higher than experiment, albeit to the same degree as previously reported estimates using density functional theory (DFT)-based Born-Oppenheimer simulations with the Becke-Lee-Yang-Parr functional plus dispersion corrections (BLYP-D). KRB was supported by a Computational Science Graduate Fellowship from the Department of Energy. MSG was supported by a U.S. National Science Foundation Software Infrastructure (SI2) grant (ACI – 1047772). SSX acknowledges support from the US Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences and Biosciences. Pacific Northwest National Laboratory (PNNL) is a multiprogram national laboratory operated for DOE by Battelle.

  17. Argonne Leadership Computing Facility 2011 annual report : Shaping future supercomputing.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Papka, M.; Messina, P.; Coffey, R.

    The ALCF's Early Science Program aims to prepare key applications for the architecture and scale of Mira and to solidify libraries and infrastructure that will pave the way for other future production applications. Two billion core-hours have been allocated to 16 Early Science projects on Mira. The projects, in addition to promising delivery of exciting new science, are all based on state-of-the-art, petascale, parallel applications. The project teams, in collaboration with ALCF staff and IBM, have undertaken intensive efforts to adapt their software to take advantage of Mira's Blue Gene/Q architecture, which, in a number of ways, is a precursor to future high-performance-computing architecture. The Argonne Leadership Computing Facility (ALCF) enables transformative science that solves some of the most difficult challenges in biology, chemistry, energy, climate, materials, physics, and other scientific realms. Users partnering with ALCF staff have reached research milestones previously unattainable, due to the ALCF's world-class supercomputing resources and expertise in computational science. In 2011, the ALCF's commitment to providing outstanding science and leadership-class resources was honored with several prestigious awards. Research on multiscale brain blood flow simulations was named a Gordon Bell Prize finalist. Intrepid, the ALCF's BG/P system, ranked No. 1 on the Graph 500 list for the second consecutive year. The next-generation BG/Q prototype again topped the Green500 list. Skilled experts at the ALCF enable researchers to conduct breakthrough science on the Blue Gene system in key ways. The Catalyst Team matches project PIs with experienced computational scientists to maximize and accelerate research in their specific scientific domains. 
The Performance Engineering Team facilitates the effective use of applications on the Blue Gene system by assessing and improving the algorithms used by applications and the techniques used to implement those algorithms. The Data Analytics and Visualization Team lends expertise in tools and methods for high-performance, post-processing of large datasets, interactive data exploration, batch visualization, and production visualization. The Operations Team ensures that system hardware and software work reliably and optimally; system tools are matched to the unique system architectures and scale of ALCF resources; the entire system software stack works smoothly together; and I/O performance issues, bug fixes, and requests for system software are addressed. The User Services and Outreach Team offers frontline services and support to existing and potential ALCF users. The team also provides marketing and outreach to users, DOE, and the broader community.

  18. USRA/RIACS

    NASA Technical Reports Server (NTRS)

    Oliger, Joseph

    1992-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under a cooperative agreement with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. A flexible scientific staff is provided through a university faculty visitor program, a post doctoral program, and a student visitor program. Not only does this provide appropriate expertise but it also introduces scientists outside of NASA to NASA problems. A small group of core RIACS staff provides continuity and interacts with an ARC technical monitor and scientific advisory group to determine the RIACS mission. RIACS activities are reviewed and monitored by a USRA advisory council and ARC technical monitor. Research at RIACS is currently being done in the following areas: (1) parallel computing; (2) advanced methods for scientific computing; (3) learning systems; (4) high performance networks and technology; and (5) graphics, visualization, and virtual environments. In the past year, parallel compiler techniques and adaptive numerical methods for flows in complicated geometries were identified as important problems to investigate for ARC's involvement in the Computational Grand Challenges of the next decade. We concluded a summer student visitor program during these six months. We had six visiting graduate students who worked on projects over the summer and presented seminars on their work at the conclusion of their visits. 
RIACS technical reports are usually preprints of manuscripts that have been submitted to research journals or conference proceedings. A list of these reports for the period July 1, 1992 through December 31, 1992 is provided.

  19. 10 CFR 727.6 - What are the obligations of a DOE contractor?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... COMPUTERS § 727.6 What are the obligations of a DOE contractor? (a) A DOE contractor must ensure that... computer unless the DOE contractor has obtained a written acknowledgment and consent by each contractor or... violates the requirements of this section with regard to a DOE computer with Restricted Data or other...

  20. 10 CFR 727.3 - To whom does this part apply?

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... DEPARTMENT OF ENERGY CONSENT FOR ACCESS TO INFORMATION ON DEPARTMENT OF ENERGY COMPUTERS § 727.3 To whom does... subcontractor employees, and any other individual who has been granted access to a DOE computer or to information on a DOE computer. (b) Section 727.4 of this part also applies to any person who uses a DOE...

  1. 10 CFR 727.3 - To whom does this part apply?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... DEPARTMENT OF ENERGY CONSENT FOR ACCESS TO INFORMATION ON DEPARTMENT OF ENERGY COMPUTERS § 727.3 To whom does... subcontractor employees, and any other individual who has been granted access to a DOE computer or to information on a DOE computer. (b) Section 727.4 of this part also applies to any person who uses a DOE...

  2. 10 CFR 727.6 - What are the obligations of a DOE contractor?

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... COMPUTERS § 727.6 What are the obligations of a DOE contractor? (a) A DOE contractor must ensure that... computer unless the DOE contractor has obtained a written acknowledgment and consent by each contractor or... violates the requirements of this section with regard to a DOE computer with Restricted Data or other...

  3. 10 CFR 727.3 - To whom does this part apply?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... DEPARTMENT OF ENERGY CONSENT FOR ACCESS TO INFORMATION ON DEPARTMENT OF ENERGY COMPUTERS § 727.3 To whom does... subcontractor employees, and any other individual who has been granted access to a DOE computer or to information on a DOE computer. (b) Section 727.4 of this part also applies to any person who uses a DOE...

  4. 10 CFR 727.6 - What are the obligations of a DOE contractor?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... COMPUTERS § 727.6 What are the obligations of a DOE contractor? (a) A DOE contractor must ensure that... computer unless the DOE contractor has obtained a written acknowledgment and consent by each contractor or... violates the requirements of this section with regard to a DOE computer with Restricted Data or other...

  5. 10 CFR 727.3 - To whom does this part apply?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... DEPARTMENT OF ENERGY CONSENT FOR ACCESS TO INFORMATION ON DEPARTMENT OF ENERGY COMPUTERS § 727.3 To whom does... subcontractor employees, and any other individual who has been granted access to a DOE computer or to information on a DOE computer. (b) Section 727.4 of this part also applies to any person who uses a DOE...

  6. 10 CFR 727.6 - What are the obligations of a DOE contractor?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... COMPUTERS § 727.6 What are the obligations of a DOE contractor? (a) A DOE contractor must ensure that... computer unless the DOE contractor has obtained a written acknowledgment and consent by each contractor or... violates the requirements of this section with regard to a DOE computer with Restricted Data or other...

  7. 10 CFR 727.6 - What are the obligations of a DOE contractor?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... COMPUTERS § 727.6 What are the obligations of a DOE contractor? (a) A DOE contractor must ensure that... computer unless the DOE contractor has obtained a written acknowledgment and consent by each contractor or... violates the requirements of this section with regard to a DOE computer with Restricted Data or other...

  8. 10 CFR 727.3 - To whom does this part apply?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... DEPARTMENT OF ENERGY CONSENT FOR ACCESS TO INFORMATION ON DEPARTMENT OF ENERGY COMPUTERS § 727.3 To whom does... subcontractor employees, and any other individual who has been granted access to a DOE computer or to information on a DOE computer. (b) Section 727.4 of this part also applies to any person who uses a DOE...

  9. Gender differences in the use of computers, programming, and peer interactions in computer science classrooms

    NASA Astrophysics Data System (ADS)

    Stoilescu, Dorian; Egodawatte, Gunawardena

    2010-12-01

    Research shows that female and male students in undergraduate computer science programs view computer culture differently. Female students are interested more in the use of computers than in doing programming, whereas male students see computer science mainly as a programming activity. The overall purpose of our research was not to find new definitions for computer science culture but to see how male and female students see themselves involved in computer science practices, how they see computer science as a successful career, and what they like and dislike about current computer science practices. The study took place in a mid-sized university in Ontario. Sixteen students and two instructors were interviewed to get their views. We found that male and female views are different on computer use, programming, and the pattern of student interactions. Female and male students did not have any major issues in using computers. In computer programming, female students were less involved, whereas male students were heavily involved. As for the opinions about successful computer science professionals, both female and male students emphasized hard work, detail-oriented approaches, and enjoyment of working with computers. The myth of the geek as a typical profile of successful computer science students was not found to be true.

  10. Software on diffractive optics and computer-generated holograms

    NASA Astrophysics Data System (ADS)

    Doskolovich, Leonid L.; Golub, Michael A.; Kazanskiy, Nikolay L.; Khramov, Alexander G.; Pavelyev, Vladimir S.; Seraphimovich, P. G.; Soifer, Victor A.; Volotovskiy, S. G.

    1995-01-01

    The "Quick-DOE" software for an IBM PC-compatible computer is aimed at calculating the masks of diffractive optical elements (DOEs) and computer-generated holograms, computer simulation of DOEs, and executing a number of auxiliary functions. In particular, among the auxiliary functions are file format conversions, mask visualization on a display from a file, implementation of fast Fourier transforms, and arranging and preparation of composite images for output on a photoplotter. The software is intended for opticians, DOE designers, and programmers developing software for DOE computation.
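The abstract mentions Fourier transforms in DOE mask computation. One widely used approach to designing a phase-only DOE mask is Gerchberg-Saxton-style iteration between the mask plane and the far field; the sketch below illustrates that general technique in 1D, not Quick-DOE's actual algorithm, and uses a naive DFT to stay dependency-free (real designs would use an FFT over 2D grids):

```python
import cmath

def dft(x, inverse=False):
    """Naive O(n^2) discrete Fourier transform of a list of complex numbers."""
    n = len(x)
    sign = 1 if inverse else -1
    out = []
    for k in range(n):
        s = sum(x[j] * cmath.exp(sign * 2j * cmath.pi * j * k / n) for j in range(n))
        out.append(s / n if inverse else s)
    return out

def phase_mask(target_amp, iters=50):
    """Iteratively find a phase-only mask whose far-field amplitude
    approximates target_amp (Gerchberg-Saxton-style projections)."""
    n = len(target_amp)
    field = [1.0 + 0j] * n                       # uniform illumination, zero phase
    for _ in range(iters):
        far = dft(field)
        # impose the desired amplitude in the far field, keep the phase
        far = [abs(t) * cmath.exp(1j * cmath.phase(f))
               for t, f in zip(target_amp, far)]
        near = dft(far, inverse=True)
        # back in the mask plane: keep phase only (phase-only element)
        field = [cmath.exp(1j * cmath.phase(v)) for v in near]
    return [cmath.phase(v) for v in field]       # phase profile in radians
```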

  11. How to get the Nobel Prize in physics

    NASA Astrophysics Data System (ADS)

    Nordling, Carl

    1995-01-01

    Every year, on the 10th of December, one piece is added to the history of science. This is the day when the Nobel Prizes are awarded to those scientists who "during the preceding year have conferred the greatest benefit on mankind". The Nobel Prize carries the highest prestige and fame of all distinctions in the world of science. There have been many speculations regarding the prize: What effect does it have on its recipients? Does it boost their research activities or does it kill them? Is it merely an after-the-fact recognition of important steps in the history of science, or does it also create history by changing the directions along which science develops? What role does it play in the sociology of science? Is it a prize for leaders of big research teams or is there a preference for the genius working completely on his own? Are there equal opportunities for men and women, for Swedes and Russians, for black and white? Where does one find the track that leads to Stockholm?

  12. 75 FR 6651 - DOE/NSF Nuclear Science Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-10

    ... DEPARTMENT OF ENERGY DOE/NSF Nuclear Science Advisory Committee AGENCY: Department of Energy.../NSF Nuclear Science Advisory Committee (NSAC). Federal Advisory Committee Act (Pub. L. 92- 463, 86... on scientific priorities within the field of basic nuclear science research. Tentative Agenda: Agenda...

  13. Geocomputation over Hybrid Computer Architecture and Systems: Prior Works and On-going Initiatives at UARK

    NASA Astrophysics Data System (ADS)

    Shi, X.

    2015-12-01

    As NSF indicated - "Theory and experimentation have for centuries been regarded as two fundamental pillars of science. It is now widely recognized that computational and data-enabled science forms a critical third pillar." Geocomputation is the third pillar of GIScience and geosciences. With the exponential growth of geodata, the challenge of scalable, high-performance computing for big-data analytics becomes urgent, because many research activities are constrained by the inability of existing software and tools to complete the computation process. Heterogeneous geodata integration and analytics further magnify the complexity and operational time frame. Many large-scale geospatial problems may not be processable at all if the computer system does not have sufficient memory or computational power. Emerging computer architectures, such as Intel's Many Integrated Core (MIC) Architecture and Graphics Processing Units (GPUs), and advanced computing technologies provide promising solutions to employ massive parallelism and hardware resources to achieve scalability and high performance for data-intensive computing over large spatiotemporal and social media data. Exploring novel algorithms and deploying the solutions in massively parallel computing environments to achieve scalable data processing and analytics over large-scale, complex, and heterogeneous geodata with consistent quality and high performance has been the central theme of our research team in the Department of Geosciences at the University of Arkansas (UARK). New multi-core architectures combined with application accelerators hold the promise to achieve scalability and high performance by exploiting task and data levels of parallelism that are not supported by conventional computing systems. 
Such a parallel or distributed computing environment is particularly suitable for large-scale geocomputation over big data, as demonstrated by our prior work, although the potential of such advanced infrastructure remains unexplored in this domain. Within this presentation, our prior and on-going initiatives will be summarized to exemplify how we exploit multicore CPUs, GPUs, and MICs, and clusters of CPUs, GPUs and MICs, to accelerate geocomputation in different applications.
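The data-parallel pattern described above can be sketched as partitioning a large dataset into tiles and mapping an analysis kernel over them concurrently. The tiling and kernel below are illustrative stand-ins, not UARK's actual code; production geocomputation would dispatch to processes, GPUs, or MICs rather than a thread pool:

```python
from concurrent.futures import ThreadPoolExecutor

def kernel(tile):
    # stand-in for a per-tile geospatial computation (here, a local mean)
    return sum(tile) / len(tile)

def parallel_map_tiles(data, tile_size, workers=4):
    """Split `data` into fixed-size tiles and apply `kernel` to each tile
    concurrently, preserving tile order in the results."""
    tiles = [data[i:i + tile_size] for i in range(0, len(data), tile_size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(kernel, tiles))
```

The same structure scales from one node to a cluster: only the executor and the kernel change, not the decomposition.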

  14. The role of broken symmetry in solvation of a spherical cavity in classical and quantum water models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Remsing, Richard C.; Baer, Marcel D.; Schenter, Gregory K.

    2014-08-21

    Insertion of a hard sphere cavity in liquid water breaks translational symmetry and generates an electrostatic potential difference between the region near the cavity and the bulk. Here, we clarify the physical interpretation of this potential and its calculation. We also show that the electrostatic potential in the center of small, medium, and large cavities depends very sensitively on the form of the assumed molecular interactions for different classical simple point-charge models and quantum mechanical DFT-based interaction potentials, as reflected in their description of donor and acceptor hydrogen bonds near the cavity. These differences can significantly affect the magnitude of the scalar electrostatic potential. We argue that the result of these studies will have direct consequences for our understanding of the thermodynamics of ion solvation through the cavity charging process. JDW and RCR are supported by the National Science Foundation (Grants CHE0848574 and CHE1300993). CJM and GKS are supported by the U.S. Department of Energy's (DOE) Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences and Biosciences. Pacific Northwest National Laboratory (PNNL) is operated for the Department of Energy by Battelle. MDB is grateful for the support of the Linus Pauling Distinguished Postdoctoral Fellowship Program at PNNL. We acknowledge illuminating discussions and sharing of ideas and preprints with Dr. Shawn M. Kathmann and Prof. Tom Beck. The DFT simulations used resources of the National Energy Research Scientific Computing Center, which is supported by the Office of Science of the U.S. Department of Energy under Contract No. DE-AC02-05CH11231. Additional computing resources were generously allocated by PNNL's Institutional Computing program.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, Ann E; Barker, Ashley D; Bland, Arthur S Buddy

    Oak Ridge National Laboratory's Leadership Computing Facility (OLCF) continues to deliver the most powerful resources in the U.S. for open science. At 2.33 petaflops peak performance, the Cray XT Jaguar delivered more than 1.4 billion core hours in calendar year (CY) 2011 to researchers around the world for computational simulations relevant to national and energy security; advancing the frontiers of knowledge in physical sciences and areas of biological, medical, environmental, and computer sciences; and providing world-class research facilities for the nation's science enterprise. Users reported more than 670 publications this year arising from their use of OLCF resources. Of these, we report the 300 in this review that are consistent with guidance provided. Scientific achievements by OLCF users span all scales, from atomic to molecular to large-scale structures. At the atomic scale, researchers discovered that the anomalously long half-life of Carbon-14 can be explained by calculating, for the first time, the very complex three-body interactions between all the neutrons and protons in the nucleus. At the molecular scale, researchers combined experimental results from LBL's light source and simulations on Jaguar to discover how DNA replication continues past a damaged site so a mutation can be repaired later. Other researchers combined experimental results from ORNL's Spallation Neutron Source and simulations on Jaguar to reveal the molecular structure of ligno-cellulosic material used in bioethanol production. This year, Jaguar has been used to perform billion-cell CFD calculations to develop shock wave compression turbo machinery as a means to meet DOE goals for reducing carbon sequestration costs. General Electric used Jaguar to calculate the unsteady flow through turbo machinery to learn what efficiencies the traditional steady flow assumption is hiding from designers. 
Even a 1% improvement in turbine design can save the nation billions of gallons of fuel.« less

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dahlburg, Jill; Corones, James; Batchelor, Donald

    Fusion is potentially an inexhaustible energy source whose exploitation requires a basic understanding of high-temperature plasmas. The development of a science-based predictive capability for fusion-relevant plasmas is a challenge central to fusion energy science, in which numerical modeling has played a vital role for more than four decades. A combination of the very wide range in temporal and spatial scales, extreme anisotropy, the importance of geometric detail, and the requirement of causality which makes it impossible to parallelize over time, makes this problem one of the most challenging in computational physics. Sophisticated computational models are under development for many individual features of magnetically confined plasmas and increases in the scope and reliability of feasible simulations have been enabled by increased scientific understanding and improvements in computer technology. However, full predictive modeling of fusion plasmas will require qualitative improvements and innovations to enable cross coupling of a wider variety of physical processes and to allow solution over a larger range of space and time scales. The exponential growth of computer speed, coupled with the high cost of large-scale experimental facilities, makes an integrated fusion simulation initiative a timely and cost-effective opportunity. Worldwide progress in laboratory fusion experiments provides the basis for a recent FESAC recommendation to proceed with a burning plasma experiment (see FESAC Review of Burning Plasma Physics Report, September 2001). Such an experiment, at the frontier of the physics of complex systems, would be a huge step in establishing the potential of magnetic fusion energy to contribute to the world’s energy security. An integrated simulation capability would dramatically enhance the utilization of such a facility and lead to optimization of toroidal fusion plasmas in general.
This science-based predictive capability, which was cited in the FESAC integrated planning document (IPPA, 2000), represents a significant opportunity for the DOE Office of Science to further the understanding of fusion plasmas to a level unparalleled worldwide.

  17. What's statistical about learning? Insights from modelling statistical learning as a set of memory processes

    PubMed Central

    2017-01-01

    Statistical learning has been studied in a variety of different tasks, including word segmentation, object identification, category learning, artificial grammar learning and serial reaction time tasks (e.g. Saffran et al. 1996 Science 274, 1926–1928; Orban et al. 2008 Proceedings of the National Academy of Sciences 105, 2745–2750; Thiessen & Yee 2010 Child Development 81, 1287–1303; Saffran 2002 Journal of Memory and Language 47, 172–196; Misyak & Christiansen 2012 Language Learning 62, 302–331). The difference among these tasks raises questions about whether they all depend on the same kinds of underlying processes and computations, or whether they are tapping into different underlying mechanisms. Prior theoretical approaches to statistical learning have often tried to explain or model learning in a single task. However, in many cases these approaches appear inadequate to explain performance in multiple tasks. For example, explaining word segmentation via the computation of sequential statistics (such as transitional probability) provides little insight into the nature of sensitivity to regularities among simultaneously presented features. In this article, we will present a formal computational approach that we believe is a good candidate to provide a unifying framework to explore and explain learning in a wide variety of statistical learning tasks. This framework suggests that statistical learning arises from a set of processes that are inherent in memory systems, including activation, interference, integration of information and forgetting (e.g. Perruchet & Vinter 1998 Journal of Memory and Language 39, 246–263; Thiessen et al. 2013 Psychological Bulletin 139, 792–814). 
From this perspective, statistical learning does not involve explicit computation of statistics, but rather the extraction of elements of the input into memory traces, and subsequent integration across those memory traces that emphasize consistent information (Thiessen and Pavlik 2013 Cognitive Science 37, 310–343). This article is part of the themed issue ‘New frontiers for statistical learning in the cognitive sciences'. PMID:27872374
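
    The sequential statistic this record contrasts with memory-based accounts, forward transitional probability P(next | current), can be computed directly from a syllable stream. The sketch below is a generic illustration, not code from the article; the syllable stream and function name are hypothetical, loosely echoing the golabu/tupiro-style stimuli of the segmentation literature.

    ```python
    from collections import Counter

    def transitional_probabilities(syllables):
        """Forward transitional probability P(next | current) for each
        adjacent syllable pair in a continuous stream."""
        pair_counts = Counter(zip(syllables, syllables[1:]))
        first_counts = Counter(syllables[:-1])
        return {(a, b): n / first_counts[a] for (a, b), n in pair_counts.items()}

    # Hypothetical stream: two "words" (go-la-bu, tu-pi-ro) concatenated without pauses.
    stream = "go la bu tu pi ro go la bu go la bu tu pi ro".split()
    tps = transitional_probabilities(stream)

    # Within-word transitions are deterministic (TP = 1.0); transitions that
    # span a word boundary (e.g. "bu" -> "tu") have lower TP, which is the
    # statistical cue a segmenter could exploit.
    print(tps[("go", "la")])  # 1.0
    print(tps[("la", "bu")])  # 1.0
    ```

    A learner (or model) segmenting at local dips in TP recovers the word boundaries without any explicit lexicon, which is exactly the kind of computation the memory-based framework reinterprets as integration across traces.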

  18. What's statistical about learning? Insights from modelling statistical learning as a set of memory processes.

    PubMed

    Thiessen, Erik D

    2017-01-05

    Statistical learning has been studied in a variety of different tasks, including word segmentation, object identification, category learning, artificial grammar learning and serial reaction time tasks (e.g. Saffran et al. 1996 Science 274, 1926-1928; Orban et al. 2008 Proceedings of the National Academy of Sciences 105, 2745-2750; Thiessen & Yee 2010 Child Development 81, 1287-1303; Saffran 2002 Journal of Memory and Language 47, 172-196; Misyak & Christiansen 2012 Language Learning 62, 302-331). The difference among these tasks raises questions about whether they all depend on the same kinds of underlying processes and computations, or whether they are tapping into different underlying mechanisms. Prior theoretical approaches to statistical learning have often tried to explain or model learning in a single task. However, in many cases these approaches appear inadequate to explain performance in multiple tasks. For example, explaining word segmentation via the computation of sequential statistics (such as transitional probability) provides little insight into the nature of sensitivity to regularities among simultaneously presented features. In this article, we will present a formal computational approach that we believe is a good candidate to provide a unifying framework to explore and explain learning in a wide variety of statistical learning tasks. This framework suggests that statistical learning arises from a set of processes that are inherent in memory systems, including activation, interference, integration of information and forgetting (e.g. Perruchet & Vinter 1998 Journal of Memory and Language 39, 246-263; Thiessen et al. 2013 Psychological Bulletin 139, 792-814). From this perspective, statistical learning does not involve explicit computation of statistics, but rather the extraction of elements of the input into memory traces, and subsequent integration across those memory traces that emphasize consistent information (Thiessen and Pavlik 2013 Cognitive Science 37, 310-343). This article is part of the themed issue 'New frontiers for statistical learning in the cognitive sciences'. © 2016 The Author(s).

  19. Exploring the Relationships between Self-Efficacy and Preference for Teacher Authority among Computer Science Majors

    ERIC Educational Resources Information Center

    Lin, Che-Li; Liang, Jyh-Chong; Su, Yi-Ching; Tsai, Chin-Chung

    2013-01-01

    Teacher-centered instruction has been widely adopted in college computer science classrooms and has some benefits in training computer science undergraduates. Meanwhile, student-centered contexts have been advocated to promote computer science education. How computer science learners respond to or prefer the two types of teacher authority,…

  20. 77 FR 51791 - DOE/NSF Nuclear Science Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-27

    ... DEPARTMENT OF ENERGY DOE/NSF Nuclear Science Advisory Committee AGENCY: Department of Energy.../NSF Nuclear Science Advisory Committee (NSAC). The Federal Advisory Committee Act (Pub. L. 92-463, 86... on scientific priorities within the field of basic nuclear science research. Tentative Agenda: Agenda...

  1. 76 FR 31945 - DOE/NSF Nuclear Science Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-02

    ... DEPARTMENT OF ENERGY DOE/NSF Nuclear Science Advisory Committee AGENCY: Department of Energy.../NSF Nuclear Science Advisory Committee (NSAC). The Federal Advisory Committee Act (Pub. L. 92-463, 86... the field of basic nuclear science research. Tentative Agenda: Agenda will include discussions of the...

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    East, D. R.; Sexton, J.

    This was a collaborative effort between Lawrence Livermore National Security, LLC as manager and operator of Lawrence Livermore National Laboratory (LLNL) and IBM TJ Watson Research Center to research, assess feasibility and develop an implementation plan for a High Performance Computing Innovation Center (HPCIC) in the Livermore Valley Open Campus (LVOC). The ultimate goal of this work was to help advance the State of California and U.S. commercial competitiveness in the arena of High Performance Computing (HPC) by accelerating the adoption of computational science solutions, consistent with recent DOE strategy directives. The desired result of this CRADA was a well-researched, carefully analyzed market evaluation that would identify those firms in core sectors of the US economy seeking to adopt or expand their use of HPC to become more competitive globally, and to define how those firms could be helped by the HPCIC with IBM as an integral partner.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Slattery, Stuart R

    ExaMPM is a mini-application for the Material Point Method (MPM) for studying the application of MPM to future exascale computing systems. MPM is a general method for computational mechanics and fluids and is used in a wide variety of science and engineering disciplines to study problems with large deformations, phase change, fracture, and other phenomena. ExaMPM provides a reference implementation of MPM as described in the 1994 work of Sulsky et al. (Sulsky, Deborah, Zhen Chen, and Howard L. Schreyer. "A particle method for history-dependent materials." Computer Methods in Applied Mechanics and Engineering 118.1-2 (1994): 179-196.). The software can solve basic MPM problems in solid mechanics using the original algorithm of Sulsky with explicit time integration, basic geometries, and free-slip and no-slip boundary conditions as described in the reference. ExaMPM is intended to be used as a starting point to design new parallel algorithms for the next generation of DOE supercomputers.

  4. Coherence and Divergence of Megatrends in Science and Engineering

    NASA Astrophysics Data System (ADS)

    Roco, M. C.

    2002-04-01

    Scientific discoveries and technological innovations are at the core of human endeavor, and it is estimated that their role will only increase in time. Such advancements evolve in coherence, with areas of confluence and temporary divergences, which bring synergism and stimulate further developments that follow, on average, an exponential growth. Six increasingly interconnected megatrends are perceived as dominating the scene for the next decades: (a) information and computing, (b) nanoscale science and engineering (S&E), (c) biology and bio-environmental approaches, (d) medical sciences and enhancing human physical capabilities, (e) cognitive sciences and enhancing intellectual abilities, and (f) collective behavior and system approach. This paper presents a perspective on the process of identification, planning and program implementation of S&E megatrends, with illustration for the US research initiative on nanoscale science, engineering, and technology. The interplay between coherence and divergence, leading to unifying science and converging technologies, develops not only among simultaneous scientific trends but also over time and across geopolitical boundaries. There is no single path for the development of S&E, hence the role of visionary measures. Scientists concerned with societal implications need to be involved from the conceptual phase of any program responding to an S&E megatrend.

  5. Strengthening LLNL Missions through Laboratory Directed Research and Development in High Performance Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Willis, D. K.

    2016-12-01

    High performance computing (HPC) has been a defining strength of Lawrence Livermore National Laboratory (LLNL) since its founding. Livermore scientists have designed and used some of the world’s most powerful computers to drive breakthroughs in nearly every mission area. Today, the Laboratory is recognized as a world leader in the application of HPC to complex science, technology, and engineering challenges. Most importantly, HPC has been integral to the National Nuclear Security Administration’s (NNSA’s) Stockpile Stewardship Program—designed to ensure the safety, security, and reliability of our nuclear deterrent without nuclear testing. A critical factor behind Lawrence Livermore’s preeminence in HPC is the ongoing investments made by the Laboratory Directed Research and Development (LDRD) Program in cutting-edge concepts to enable efficient utilization of these powerful machines. Congress established the LDRD Program in 1991 to maintain the technical vitality of the Department of Energy (DOE) national laboratories. Since then, LDRD has been, and continues to be, an essential tool for exploring anticipated needs that lie beyond the planning horizon of our programs and for attracting the next generation of talented visionaries. Through LDRD, Livermore researchers can examine future challenges, propose and explore innovative solutions, and deliver creative approaches to support our missions. The present scientific and technical strengths of the Laboratory are, in large part, a product of past LDRD investments in HPC. Here, we provide seven examples of LDRD projects from the past decade that have played a critical role in building LLNL’s HPC, computer science, mathematics, and data science research capabilities, and describe how they have impacted LLNL’s mission.

  6. Academic computer science and gender: A naturalistic study investigating the causes of attrition

    NASA Astrophysics Data System (ADS)

    Declue, Timothy Hall

    Far fewer women than men take computer science classes in high school, enroll in computer science programs in college, or complete advanced degrees in computer science. The computer science pipeline begins to shrink for women even before they enter college, but it is at the college level that the "brain drain" is most evident numerically, especially in the first class taken by most computer science majors, called "Computer Science 1" or CS-I. The result is a pronounced gender disparity in both academic and industrial computer science. The study revealed the existence of several factors influencing success in CS-I. First, and most clearly, the effect of attribution processes seemed to be quite strong. These processes tend to work against success for females and in favor of success for males. Likewise, evidence was discovered which strengthens theories related to prior experience and the perception that computer science has a culture which is hostile to females. Two unanticipated themes emerged, relating to the motivation and persistence of successful computer science majors. The findings did not support the belief that females have greater logistical problems in computer science than males, or that females tend to have a different programming style than males which adversely affects the females' ability to succeed in CS-I.

  7. 10 CFR 727.4 - Is there any expectation of privacy applicable to a DOE computer?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Communications Privacy Act of 1986), no user of a DOE computer shall have any expectation of privacy in the use... computer? 727.4 Section 727.4 Energy DEPARTMENT OF ENERGY CONSENT FOR ACCESS TO INFORMATION ON DEPARTMENT OF ENERGY COMPUTERS § 727.4 Is there any expectation of privacy applicable to a DOE computer...

  8. 10 CFR 727.4 - Is there any expectation of privacy applicable to a DOE computer?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Communications Privacy Act of 1986), no user of a DOE computer shall have any expectation of privacy in the use... computer? 727.4 Section 727.4 Energy DEPARTMENT OF ENERGY CONSENT FOR ACCESS TO INFORMATION ON DEPARTMENT OF ENERGY COMPUTERS § 727.4 Is there any expectation of privacy applicable to a DOE computer...

  9. 10 CFR 727.4 - Is there any expectation of privacy applicable to a DOE computer?

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Communications Privacy Act of 1986), no user of a DOE computer shall have any expectation of privacy in the use... computer? 727.4 Section 727.4 Energy DEPARTMENT OF ENERGY CONSENT FOR ACCESS TO INFORMATION ON DEPARTMENT OF ENERGY COMPUTERS § 727.4 Is there any expectation of privacy applicable to a DOE computer...

  10. 10 CFR 727.4 - Is there any expectation of privacy applicable to a DOE computer?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Communications Privacy Act of 1986), no user of a DOE computer shall have any expectation of privacy in the use... computer? 727.4 Section 727.4 Energy DEPARTMENT OF ENERGY CONSENT FOR ACCESS TO INFORMATION ON DEPARTMENT OF ENERGY COMPUTERS § 727.4 Is there any expectation of privacy applicable to a DOE computer...

  11. 10 CFR 727.4 - Is there any expectation of privacy applicable to a DOE computer?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Communications Privacy Act of 1986), no user of a DOE computer shall have any expectation of privacy in the use... computer? 727.4 Section 727.4 Energy DEPARTMENT OF ENERGY CONSENT FOR ACCESS TO INFORMATION ON DEPARTMENT OF ENERGY COMPUTERS § 727.4 Is there any expectation of privacy applicable to a DOE computer...

  12. Computer-Game Construction: A Gender-Neutral Attractor to Computing Science

    ERIC Educational Resources Information Center

    Carbonaro, Mike; Szafron, Duane; Cutumisu, Maria; Schaeffer, Jonathan

    2010-01-01

    Enrollment in Computing Science university programs is at a dangerously low level. A major reason for this is the general lack of interest in Computing Science by females. In this paper, we discuss our experience with using a computer game construction environment as a vehicle to encourage female participation in Computing Science. Experiments…

  13. High Performance Computing Facility Operational Assessment, FY 2011 Oak Ridge Leadership Computing Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, Ann E; Bland, Arthur S Buddy; Hack, James J

    Oak Ridge National Laboratory's Leadership Computing Facility (OLCF) continues to deliver the most powerful resources in the U.S. for open science. At 2.33 petaflops peak performance, the Cray XT Jaguar delivered more than 1.5 billion core hours in calendar year (CY) 2010 to researchers around the world for computational simulations relevant to national and energy security; advancing the frontiers of knowledge in physical sciences and areas of biological, medical, environmental, and computer sciences; and providing world-class research facilities for the nation's science enterprise. Scientific achievements by OLCF users range from collaboration with university experimentalists to produce a working supercapacitor that uses atom-thick sheets of carbon materials to finely determining the resolution requirements for simulations of coal gasifiers and their components, thus laying the foundation for development of commercial-scale gasifiers. OLCF users are pushing the boundaries with software applications sustaining more than one petaflop of performance in the quest to illuminate the fundamental nature of electronic devices. Other teams of researchers are working to resolve predictive capabilities of climate models, to refine and validate genome sequencing, and to explore the most fundamental materials in nature - quarks and gluons - and their unique properties. Details of these scientific endeavors - not possible without access to leadership-class computing resources - are detailed in Section 4 of this report and in the INCITE in Review. Effective operations of the OLCF play a key role in the scientific missions and accomplishments of its users. This Operational Assessment Report (OAR) will delineate the policies, procedures, and innovations implemented by the OLCF to continue delivering a petaflop-scale resource for cutting-edge research.
The 2010 operational assessment of the OLCF yielded recommendations that have been addressed (Reference Section 1) and, where appropriate, changes in Center metrics were introduced. This report covers CY 2010 and CY 2011 Year to Date (YTD) that, unless otherwise specified, denotes January 1, 2011 through June 30, 2011. User Support remains an important element of the OLCF operations, with the philosophy 'whatever it takes' to enable successful research. Impact of this center-wide activity is reflected by the user survey results that show users are 'very satisfied.' The OLCF continues to aggressively pursue outreach and training activities to promote awareness - and effective use - of U.S. leadership-class resources (Reference Section 2). The OLCF continues to meet and in many cases exceed DOE metrics for capability usage (35% target in CY 2010, delivered 39%; 40% target in CY 2011, 54% January 1, 2011 through June 30, 2011). The Schedule Availability (SA) and Overall Availability (OA) for Jaguar were exceeded in CY2010. Given the solution to the VRM problem, the SA and OA for Jaguar in CY 2011 are expected to exceed the target metrics of 95% and 90%, respectively (Reference Section 3). Numerous and wide-ranging research accomplishments, scientific support, and technological innovations are more fully described in Sections 4 and 6 and reflect OLCF leadership in enabling high-impact science solutions and vision in creating an exascale-ready center. Financial Management (Section 5) and Risk Management (Section 7) are carried out using best practices approved by DOE. The OLCF has a valid cyber security plan and Authority to Operate (Section 8). The proposed metrics for 2012 are reflected in Section 9.

  14. Advancing Water Science through Improved Cyberinfrastructure

    NASA Astrophysics Data System (ADS)

    Koch, B. J.; Miles, B.; Rai, A.; Ahalt, S.; Band, L. E.; Minsker, B.; Palmer, M.; Williams, M. R.; Idaszak, R.; Whitton, M. C.

    2012-12-01

    Major scientific advances are needed to help address impacts of climate change and increasing human-mediated environmental modification on the water cycle at global and local scales. However, such advances within the water sciences are limited in part by inadequate information infrastructures. For example, cyberinfrastructure (CI) includes the integrated computer hardware, software, networks, sensors, data, and human capital that enable scientific workflows to be carried out within and among individual research efforts and across varied disciplines. A coordinated transformation of existing CI and development of new CI could accelerate the productivity of water science by enabling greater discovery, access, and interoperability of data and models, and by freeing scientists to do science rather than create and manage technological tools. To elucidate specific ways in which improved CI could advance water science, three challenges confronting the water science community were evaluated: 1) How does ecohydrologic patch structure affect nitrogen transport and fate in watersheds?, 2) How can human-modified environments emulate natural water and nutrient cycling to enhance both human and ecosystem well-being?, 3) How do changes in climate affect water availability to support biodiversity and human needs? We assessed the approaches used by researchers to address components of these challenges, identified barriers imposed by limitations of current CI, and interviewed leaders in various water science subdisciplines to determine the most recent CI tools employed. Our preliminary findings revealed four areas where CI improvements are likely to stimulate scientific advances: 1) sensor networks, 2) data quality assurance/quality control, 3) data and modeling standards, 4) high performance computing. In addition, the full potential of a re-envisioned water science CI cannot be realized without a substantial training component. 
In light of these findings, we suggest that industry-proven CI practices such as open-source community architecture, agile development methodologies, and sound software engineering methods offer a promising pathway to a transformed water science CI capable of meeting the demands of both individual scientists and community-wide research initiatives.

  15. A quantitative study of the summer slide in science of elementary school students

    NASA Astrophysics Data System (ADS)

    Donovan, Giovanna Guadagno

    Concerned parents and educators agree children learn best when the rhythm of instruction is continuous with practice and application of skills. Long summer breaks may interrupt the flow of formal school learning, leading some students to forget previous instruction. A review of the previous school work is generally required in the fall upon return from the summer vacation. Investigating summer vacation and equity issues, Jamar (1994) noted that more affluent students may "return to school in the fall with a considerable educational advantage over their less advantaged peers as a result of either additional school-related learning, or lower levels of forgetting, over the summer months" (p. 1). A population of 402 fifth-grade students from a suburban New England school district participated in this study. The district administered the science subtest of the TerraNova 2 (TN2) assessment in late May 2007 (pre-test data) and in September 2007 (post-test data). These archived data, including gender and student socioeconomic status (SES) levels (as referenced by free or reduced lunch status), were analyzed in an ex post facto causal-comparative study to identify the phenomenon of summer slide in science among fifth graders enrolled in six elementary schools. An ANOVA model was used to calculate the repeated-measures factor of time (pre/post summer vacation) on the science content area. Subsequent two-way ANOVAs, with one repeated-measures factor (time of testing), explored the existence of similar or different patterns by gender and by SES level. Two questions guided this study. First, does the summer slide phenomenon exist in science education? Second, if it does, does SES impact it, and does it differ between genders? Findings suggest that the summer slide phenomenon exists in science; SES and gender do not affect the overall science test scores. However, SES impacts the summer slide phenomenon in science, while gender does not. Furthermore, school does not statistically impact the summer slide phenomenon, and its impact does not differ across SES levels or genders.

  16. Phytozome Comparative Plant Genomics Portal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goodstein, David; Batra, Sajeev; Carlson, Joseph

    2014-09-09

    The Dept. of Energy Joint Genome Institute is a genomics user facility supporting DOE mission science in the areas of Bioenergy, Carbon Cycling, and Biogeochemistry. The Plant Program at the JGI applies genomic, analytical, computational and informatics platforms and methods to: 1. Understand and accelerate the improvement (domestication) of bioenergy crops 2. Characterize and moderate plant response to climate change 3. Use comparative genomics to identify constrained elements and infer gene function 4. Build high quality genomic resource platforms of JGI Plant Flagship genomes for functional and experimental work 5. Expand functional genomic resources for Plant Flagship genomes

  17. On the role of the plasmodial cytoskeleton in facilitating intelligent behavior in slime mold Physarum polycephalum

    PubMed Central

    Mayne, Richard; Adamatzky, Andrew; Jones, Jeff

    2015-01-01

    The plasmodium of slime mold Physarum polycephalum behaves as an amorphous reaction-diffusion computing substrate and is capable of apparently ‘intelligent’ behavior. But how does intelligence emerge in an acellular organism? Through a range of laboratory experiments, we visualize the plasmodial cytoskeleton—a ubiquitous cellular protein scaffold whose functions are manifold and essential to life—and discuss its putative role as a network for transducing, transmitting and structuring data streams within the plasmodium. Through a range of computer modeling techniques, we demonstrate how emergent behavior, and hence computational intelligence, may occur in cytoskeletal communications networks. Specifically, we model the topology of both the actin and tubulin cytoskeletal networks and discuss how computation may occur therein. Furthermore, we present bespoke cellular automata and particle swarm models for the computational process within the cytoskeleton and observe the incidence of emergent patterns in both. Our work grants unique insight into the origins of natural intelligence; the results presented here are therefore readily transferable to the fields of natural computation, cell biology and biomedical science. We conclude by discussing how our results may alter our biological, computational and philosophical understanding of intelligence and consciousness. PMID:26478782
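
    The bespoke cellular automata the authors built are not specified in this record, but the underlying idea, that simple local update rules produce emergent global patterns, can be illustrated with a minimal elementary cellular automaton. This is a generic sketch under that assumption, not the authors' model; the rule number and grid size are arbitrary.

    ```python
    def step(cells, rule=90):
        """One synchronous update of an elementary cellular automaton
        (wrap-around edges): each cell's next state is the rule-table bit
        indexed by its 3-cell neighborhood."""
        n = len(cells)
        return [
            (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
            for i in range(n)
        ]

    width, steps = 31, 15
    row = [0] * width
    row[width // 2] = 1  # single seed cell

    for _ in range(steps):
        print("".join(".#"[c] for c in row))
        row = step(row)  # Rule 90 grows a Sierpinski-triangle pattern from the seed
    ```

    Despite the trivial local rule, the global pattern is self-similar and non-obvious from the rule table, which is the sense in which "computation" and emergent structure can arise in a network of locally interacting elements such as a cytoskeletal lattice.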

  18. On the role of the plasmodial cytoskeleton in facilitating intelligent behavior in slime mold Physarum polycephalum.

    PubMed

    Mayne, Richard; Adamatzky, Andrew; Jones, Jeff

    2015-01-01

    The plasmodium of slime mold Physarum polycephalum behaves as an amorphous reaction-diffusion computing substrate and is capable of apparently 'intelligent' behavior. But how does intelligence emerge in an acellular organism? Through a range of laboratory experiments, we visualize the plasmodial cytoskeleton (a ubiquitous cellular protein scaffold whose functions are manifold and essential to life) and discuss its putative role as a network for transducing, transmitting and structuring data streams within the plasmodium. Through a range of computer modeling techniques, we demonstrate how emergent behavior, and hence computational intelligence, may occur in cytoskeletal communications networks. Specifically, we model the topology of both the actin and tubulin cytoskeletal networks and discuss how computation may occur therein. Furthermore, we present bespoke cellular automata and particle swarm models for the computational process within the cytoskeleton and observe the incidence of emergent patterns in both. Our work grants unique insight into the origins of natural intelligence; the results presented here are therefore readily transferable to the fields of natural computation, cell biology and biomedical science. We conclude by discussing how our results may alter our biological, computational and philosophical understanding of intelligence and consciousness.

  19. Discovery of Sound in the Sea (DOSITS) Website Development

    DTIC Science & Technology

    2013-03-04

    • Science of Sound > Sounds in the Sea > How does marine life affect ocean sound levels?
    • Science of Sound > Sounds in the Sea > How will ocean acidification affect ocean sound levels?
    • Science of Sound > Sounds in the Sea > How does shipping affect ocean sound levels?

  20. The Effectiveness of Computer-Assisted Instruction to Teach Physical Examination to Students and Trainees in the Health Sciences Professions: A Systematic Review and Meta-Analysis.

    PubMed

    Tomesko, Jennifer; Touger-Decker, Riva; Dreker, Margaret; Zelig, Rena; Parrott, James Scott

    2017-01-01

    To explore knowledge and skill acquisition outcomes related to learning physical examination (PE) through computer-assisted instruction (CAI) compared with a face-to-face (F2F) approach. A systematic literature review and meta-analysis of studies published between January 2001 and December 2016 was conducted. Databases searched included Medline, Cochrane, CINAHL, ERIC, Ebsco, Scopus, and Web of Science. Studies were synthesized by study design, intervention, and outcomes. Statistical analyses included a DerSimonian-Laird random-effects model. In total, 7 studies were included in the review, and 5 in the meta-analysis. There were no statistically significant differences for knowledge (mean difference [MD] = 5.39, 95% confidence interval [CI]: -2.05 to 12.84) or skill acquisition (MD = 0.35, 95% CI: -5.30 to 6.01). The evidence does not suggest a strong consistent preference for either CAI or F2F instruction to teach students/trainees PE. Further research is needed to identify the conditions under which knowledge and skill acquisition outcomes favor one mode of instruction over the other.
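
    The pooled mean differences above come from a DerSimonian-Laird random-effects model; a minimal sketch of that estimator follows (the study data here are hypothetical, not the studies from this review):

```python
import math

def dersimonian_laird(effects, variances):
    """DerSimonian-Laird random-effects pooling.

    effects: per-study effect sizes (e.g. mean differences)
    variances: per-study within-study variances
    Returns (pooled effect, 95% CI, tau^2 between-study variance).
    """
    w = [1.0 / v for v in variances]                       # fixed-effect weights
    fe = sum(wi * ei for wi, ei in zip(w, effects)) / sum(w)
    q = sum(wi * (ei - fe) ** 2 for wi, ei in zip(w, effects))  # Cochran's Q
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                          # method-of-moments estimate
    w_re = [1.0 / (v + tau2) for v in variances]           # random-effects weights
    pooled = sum(wi * ei for wi, ei in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se), tau2
```

When the studies disagree more than their within-study variances explain (Q > df), tau^2 becomes positive and the confidence interval widens, which is why heterogeneous study sets, as here, can yield intervals spanning zero.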

  1. RF Antenna Design for a Helicon Plasma Source

    NASA Astrophysics Data System (ADS)

    Godden, Katarina; Stassel, Brendan; Warta, Daniel; Yep, Isaac; Hicks, Nathaniel; Munk, Jens

    2017-10-01

    A helicon plasma source is under development for the new Plasma Science and Engineering Laboratory at the University of Alaska Anchorage. The helicon source is of a type comprising Pyrex and stainless steel cylindrical sections, joined to an ultrahigh vacuum chamber. A radio frequency (RF) helical antenna and DC solenoidal magnetic field coils surround the Pyrex chamber. This presentation focuses on the design of the RF helical antenna and RF matching network, such that helicon wave power is coupled to argon plasma with minimal reflected power to the RF amplifier. The amplifier output is selectable between 2 and 30 MHz, with forward c.w. power up to 1.5 kW. Details and computer simulation of the antenna geometry, materials, and power matching will be presented, as well as the matching network of RF transmission line, tuning capacitors, and cooling system. An initial computational study of power coupling to the plasma will also be described. Supported by U.S. NSF/DOE Partnership in Basic Plasma Science and Engineering Grant PHY-1619615, by the Alaska Space Grant Program, and by UAA Innovate 2017.
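
    The abstract does not specify the matching topology, but a first-cut sizing of a low-pass L-network that matches a resistive load up to a 50 Ω amplifier at a spot frequency in the 2-30 MHz band can be sketched with textbook design equations. The 5 Ω load value below is hypothetical; a real plasma-loaded antenna presents a complex, power-dependent impedance, so this is illustration only.

```python
import math

def l_match(r_source, r_load, f_hz):
    """Size a low-pass L-network (series L at the load, shunt C at the source)
    matching a smaller resistive load to a larger source resistance.

    Textbook equations: Q = sqrt(Rs/Rl - 1), Xseries = Q*Rl, Xshunt = Rs/Q.
    Returns (L in henries, C in farads).
    """
    assert r_load < r_source
    q = math.sqrt(r_source / r_load - 1.0)
    x_series = q * r_load              # series inductive reactance (ohms)
    x_shunt = r_source / q             # shunt capacitive reactance (ohms)
    return x_series / (2 * math.pi * f_hz), 1.0 / (2 * math.pi * f_hz * x_shunt)

# e.g. a hypothetical 5-ohm plasma-loaded antenna driven at 13.56 MHz:
# l_match(50.0, 5.0, 13.56e6)
```

Combining the computed L and C with the load reproduces a purely resistive 50 Ω input impedance at the design frequency, which is the zero-reflected-power condition the abstract describes.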

  2. Central Computer Science Concepts to Research-Based Teacher Training in Computer Science: An Experimental Study

    ERIC Educational Resources Information Center

    Zendler, Andreas; Klaudt, Dieter

    2012-01-01

    The significance of computer science for economics and society is undisputed. In particular, computer science is acknowledged to play a key role in schools (e.g., by opening multiple career paths). The provision of effective computer science education in schools is dependent on teachers who are able to properly represent the discipline and whose…

  3. Final report: Prototyping a combustion corridor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rutland, Christopher J.; Leach, Joshua

    2001-12-15

    The Combustion Corridor is a concept in which researchers in combustion and thermal sciences have unimpeded access to large volumes of remote computational results. This will enable remote, collaborative analysis and visualization of state-of-the-art combustion science results. The Engine Research Center (ERC) at the University of Wisconsin - Madison partnered with Lawrence Berkeley National Laboratory, Argonne National Laboratory, Sandia National Laboratories, and several other universities to build and test the first stages of a combustion corridor. The ERC served two important functions in this partnership. First, we work extensively with combustion simulations, so we were able to provide real-world research data sets for testing the Corridor concepts. Second, the ERC was part of an extension of the high-bandwidth DOE National Laboratory connections to universities.

  4. Investigation of flow-induced numerical instability in a mixed semi-implicit, implicit leapfrog time discretization

    NASA Astrophysics Data System (ADS)

    King, Jacob; Kruger, Scott

    2017-10-01

    Flow can impact the stability and nonlinear evolution of a range of instabilities (e.g., RWMs, NTMs, sawteeth, locked modes, PBMs, and high-k turbulence), and thus robust numerical algorithms for simulations with flow are essential. Recent simulations of DIII-D QH-mode [King et al., Phys. Plasmas and Nucl. Fus. 2017] with flow have been restricted to smaller time-step sizes than corresponding computations without flow. These computations use a mixed semi-implicit, implicit leapfrog time discretization as implemented in the NIMROD code [Sovinec et al., JCP 2004]. While prior analysis has shown that this algorithm is unconditionally stable with respect to the effect of large flows on the MHD waves in slab geometry [Sovinec et al., JCP 2010], our present von Neumann stability analysis shows that a flow-induced numerical instability may arise when ad hoc cylindrical curvature is included. Computations with the NIMROD code in cylindrical geometry with rigid rotation and without free-energy drive from current or pressure gradients qualitatively confirm this analysis. We explore potential methods to circumvent this flow-induced numerical instability, such as using a semi-Lagrangian formulation instead of time-centered implicit advection and/or modifying the semi-implicit operator. This work is supported by the DOE Office of Science (Office of Fusion Energy Sciences).
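
    The cylindrical analysis from the abstract is not reproduced here, but the von Neumann procedure itself can be sketched for plain leapfrog advection, u_t + a u_x = 0: substituting u_j^n = g^n exp(i k j dx) into the leapfrog/centered-difference scheme gives the amplification polynomial g^2 + 2i sigma sin(k dx) g - 1 = 0 with Courant number sigma = a dt/dx, and |g| = 1 (neutral stability) exactly when |sigma| <= 1.

```python
import cmath
import math

def leapfrog_amplification(sigma, k_dx):
    """Roots of g**2 + 2j*sigma*sin(k_dx)*g - 1 = 0, the von Neumann
    amplification polynomial for leapfrog advection with Courant number sigma."""
    b = 2j * sigma * math.sin(k_dx)
    disc = cmath.sqrt(b * b + 4.0)
    return (-b + disc) / 2.0, (-b - disc) / 2.0

def max_growth(sigma, samples=181):
    """Largest |g| over sampled wavenumbers; 1.0 means neutrally stable."""
    return max(abs(g)
               for m in range(samples)
               for g in leapfrog_amplification(sigma, math.pi * m / (samples - 1)))
```

For sigma <= 1 both roots sit on the unit circle (|g| = 1 for every wavenumber), while for sigma > 1 one root grows as sigma + sqrt(sigma^2 - 1), the signature of the kind of flow-induced time-step restriction the abstract investigates.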

  5. 5 CFR 591.218 - How does OPM compute price indexes?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    § 591.218 How does OPM compute price indexes? Except for shelter and energy utilities, OPM averages by item the prices collected in each survey area. For the Washington, DC, area, OPM computes a...

  6. Climate Science's Globally Distributed Infrastructure

    NASA Astrophysics Data System (ADS)

    Williams, D. N.

    2016-12-01

    The Earth System Grid Federation (ESGF) is primarily funded by the Department of Energy's (DOE's) Office of Science (the Office of Biological and Environmental Research [BER] Climate Data Informatics Program and the Office of Advanced Scientific Computing Research Next Generation Network for Science Program), the National Oceanic and Atmospheric Administration (NOAA), the National Aeronautics and Space Administration (NASA), and the National Science Foundation (NSF), as well as the European Infrastructure for the European Network for Earth System Modeling (IS-ENES) and the Australian National University (ANU). Support also comes from other U.S. federal and international agencies. The federation works across multiple worldwide data centers and spans seven international network organizations to provide users with the ability to access, analyze, and visualize data using a globally federated collection of networks, computers, and software. Its architecture employs a series of geographically distributed peer nodes that are independently administered and united by common federation protocols and application programming interfaces (APIs). The full ESGF infrastructure has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the Coupled Model Intercomparison Project (CMIP; output used by the Intergovernmental Panel on Climate Change assessment reports), multiple model intercomparison projects (MIPs; endorsed by the World Climate Research Programme [WCRP]), and the Accelerated Climate Modeling for Energy (ACME; ESGF is included in the overarching ACME workflow process to store model output). ESGF is a successful example of integration of disparate open-source technologies into a cohesive functional system that serves the needs of the global climate science community. Data served by ESGF includes not only model output but also observational data from satellites and instruments, reanalysis, and generated images.
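
    Client interaction with the federation's common APIs can be sketched as query construction against an index node's search endpoint. The endpoint path and facet names below follow common ESGF search-API usage but are assumptions here, and the host is a placeholder; verify both against a node's current documentation. No network call is made.

```python
from urllib.parse import urlencode

def esgf_search_url(node, **facets):
    """Build a query URL for an ESGF-style index node's search endpoint.

    `node` is a hypothetical index-node host; facet names such as
    `project` and `variable` follow common ESGF usage but should be
    checked against the node's API documentation before use.
    """
    params = {"format": "application/solr+json", "limit": 10}
    params.update(facets)
    # Sort for a deterministic query string.
    return f"https://{node}/esg-search/search?{urlencode(sorted(params.items()))}"

# e.g. esgf_search_url("esgf-node.example.org", project="CMIP6", variable="tas")
```

Because every peer node exposes the same protocol, the same URL construction works against any node in the federation, which is the practical payoff of the peer-node architecture described above.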

  7. Effect of Graphene with Nanopores on Metal Clusters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Hu; Chen, Xianlang; Wang, Lei

    Porous graphene, which is a novel type of defective graphene, shows excellent potential as a support material for metal clusters. In this work, the stability and electronic structures of metal clusters (Pd, Ir, Rh) supported on pristine graphene and graphene with different sizes of nanopore were investigated by first-principles density functional theory (DFT) calculations. Thereafter, CO adsorption and oxidation reaction on the Pd-graphene system were chosen to evaluate its catalytic performance. Graphene with a nanopore can strongly stabilize the metal clusters and cause a substantial downshift of the d-band center of the metal clusters, thus decreasing CO adsorption. All binding energies, d-band centers, and adsorption energies change linearly with the size of the nanopore: a bigger nanopore corresponds to stronger bonding of the metal clusters to the graphene, a smaller downshift of the d-band center, and weaker CO adsorption. By using a suitably sized nanopore, supported Pd clusters on the graphene will have similar CO and O2 adsorption ability, thus leading to superior CO tolerance. The DFT-calculated reaction energy barriers show that graphene with a nanopore is a superior catalyst for the CO oxidation reaction. These properties can play an important role in guiding graphene-supported metal catalyst preparation to prevent the diffusion or agglomeration of metal clusters and enhance catalytic performance. This work was supported by the National Basic Research Program of China (973 Program) (2013CB733501) and the National Natural Science Foundation of China (NSFC-21176221, 21136001, 21101137, 21306169, and 91334013). D. Mei acknowledges the support from the US Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences & Biosciences. Pacific Northwest National Laboratory (PNNL) is a multiprogram national laboratory operated for DOE by Battelle. Computing time was granted by the grand challenge of computational catalysis of the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) and by the National Energy Research Scientific Computing Center (NERSC).
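
    The d-band center that the abstract tracks is the first moment of the d-projected density of states, eps_d = Int E rho(E) dE / Int rho(E) dE. A minimal numerical sketch on a synthetic DOS (illustrative only, not the paper's data):

```python
def d_band_center(energies, dos):
    """First moment of a (projected) density of states via the trapezoidal rule.

    energies: sampled energy grid (eV, e.g. relative to the Fermi level)
    dos: d-projected DOS values at those energies
    """
    def trapz(y):
        return sum(0.5 * (y[i] + y[i + 1]) * (energies[i + 1] - energies[i])
                   for i in range(len(energies) - 1))
    return trapz([e * d for e, d in zip(energies, dos)]) / trapz(dos)
```

A downshift of this quantity (a more negative eps_d relative to the Fermi level) is the standard d-band-model indicator of weakened adsorbate binding, which is how the abstract connects nanopore size to CO adsorption strength.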

  8. Local Aqueous Solvation Structure Around Ca2+ During Ca2+–Cl– Pair Formation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baer, Marcel D.; Mundy, Christopher J.

    2016-03-03

    The molecular details of single-ion solvation around Ca2+ and ion pairing of Ca2+–Cl– are investigated using ab initio molecular dynamics. The use of empirical dispersion corrections to the BLYP functional is investigated by comparison to experimentally available extended X-ray absorption fine structure (EXAFS) measurements, which probe the first solvation shell in great detail. Besides differences between the quantum and classical descriptions of interaction in the free energy of ion pairing and in the coordination number of ion solvation, important differences were found between dispersion-corrected and uncorrected density functional theory (DFT). Specifically, we show significantly different free-energy landscapes for both the coordination number of Ca2+ and its ion pairing with Cl–, depending on the DFT simulation protocol. Our findings produce a self-consistent treatment of the short-range solvent response to the ion and the intermediate- to long-range collective response of the electrostatics of the ion-ion interaction, yielding a detailed picture of ion pairing that is consistent with experiment. MDB is supported by the MS3 (Materials Synthesis and Simulation Across Scales) Initiative at Pacific Northwest National Laboratory. It was conducted under the Laboratory Directed Research and Development Program at PNNL, a multiprogram national laboratory operated by Battelle for the U.S. Department of Energy. CJM acknowledges support from the US Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences & Biosciences. This research used resources of the National Energy Research Scientific Computing Center, a DOE Office of Science User Facility supported by the Office of Science of the U.S. Department of Energy under Contract No. DE-AC02-05CH11231. Additional computing resources were generously allocated by PNNL's Institutional Computing program. The authors thank Prof. Tom Beck for discussions regarding QCT, and Drs. Greg Schenter and Shawn Kathmann for insightful comments.
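
    The coordination number discussed above is conventionally obtained by integrating the ion-water radial distribution function g(r) out to its first minimum, N = 4 pi rho Int_0^{r_min} g(r) r^2 dr. A sketch of that integral on a synthetic g(r) (illustrative numbers, not the paper's data):

```python
import math

def coordination_number(r, g, rho, r_min):
    """Running coordination number N = 4*pi*rho * integral_0^{r_min} g(r) r^2 dr,
    computed with the trapezoidal rule.

    r: radial grid (Angstrom), g: radial distribution function values,
    rho: bulk number density (Angstrom^-3), r_min: first minimum of g(r).
    """
    total = 0.0
    for i in range(len(r) - 1):
        if r[i + 1] > r_min:
            break  # integrate only up to the first minimum
        y0 = g[i] * r[i] ** 2
        y1 = g[i + 1] * r[i + 1] ** 2
        total += 0.5 * (y0 + y1) * (r[i + 1] - r[i])
    return 4.0 * math.pi * rho * total
```

Applied to an MD trajectory's Ca2+-O g(r), this is the quantity whose free-energy landscape the abstract reports as protocol-dependent.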

  9. Using spatial principles to optimize distributed computing for enabling the physical science discoveries

    PubMed Central

    Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing

    2011-01-01

    Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century. PMID:21444779

  10. Using spatial principles to optimize distributed computing for enabling the physical science discoveries.

    PubMed

    Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing

    2011-04-05

    Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century.

  11. Should Science Teaching Involve the History of Science? An Assessment of Kuhn's View

    ERIC Educational Resources Information Center

    Kindi, Vasso

    2005-01-01

    Thomas Kuhn draws the distinction between textbook history of science and history of science proper. The question addressed in the paper is whether Kuhn recommends the inclusion of distortive textbook history in science education. It is argued, pace Fuller, that Kuhn does not make normative suggestions. He does not urge the teaching of bad history…

  12. A Financial Technology Entrepreneurship Program for Computer Science Students

    ERIC Educational Resources Information Center

    Lawler, James P.; Joseph, Anthony

    2011-01-01

    Education in entrepreneurship is becoming a critical area of curricula for computer science students. Few schools of computer science have a concentration in entrepreneurship in the computing curricula. The paper presents Technology Entrepreneurship in the curricula at a leading school of computer science and information systems, in which students…

  13. Computer Science Teacher Professional Development in the United States: A Review of Studies Published between 2004 and 2014

    ERIC Educational Resources Information Center

    Menekse, Muhsin

    2015-01-01

    While there has been a remarkable interest to make computer science a core K-12 academic subject in the United States, there is a shortage of K-12 computer science teachers to successfully implement computer sciences courses in schools. In order to enhance computer science teacher capacity, training programs have been offered through teacher…

  14. LANL continuity of operations plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Senutovitch, Diane M

    2010-12-22

    The Los Alamos National Laboratory (LANL) is a premier national security research institution, delivering scientific and engineering solutions for the nation's most crucial and complex problems. Our primary responsibility is to ensure the safety, security, and reliability of the nation's nuclear stockpile. LANL emphasizes worker safety, effective operational safeguards and security, and environmental stewardship; outstanding science remains the foundation of work at the Laboratory. In addition to supporting the Laboratory's core national security mission, our work advances the bioscience, chemistry, computer science, earth and environmental sciences, materials science, and physics disciplines. To accomplish LANL's mission, we must ensure that the Laboratory's essential functions (EFs) continue to be performed during a continuity event, including localized acts of nature, accidents, technological or attack-related emergencies, and pandemic or epidemic events. The LANL Continuity of Operations (COOP) Plan documents the overall LANL COOP Program and provides the operational framework to implement continuity policies, requirements, and responsibilities at LANL, as required by DOE O 150.1, Continuity Programs, May 2008. LANL must maintain its ability to perform the nation's PMEFs, which are: (1) maintain the safety and security of nuclear materials in the DOE Complex at fixed sites and in transit; (2) respond to a nuclear incident, both domestically and internationally, caused by terrorist activity, natural disaster, or accident, including mobilizing the resources to support these efforts; and (3) support the nation's energy infrastructure. This plan supports Continuity of Operations for Los Alamos National Laboratory (LANL). It issues LANL policy as directed by DOE O 150.1, Continuity Programs, and provides direction for the orderly continuation of LANL EFs for 30 days of closure, or 60 days for a pandemic/epidemic event. Initiation of COOP operations may be required to support an all-hazards event, including a national security emergency, major fire, catastrophic natural disaster, man-made disaster, terrorism event, or technological disaster that renders LANL buildings, infrastructure, or Technical Areas unsafe, temporarily unusable, or inaccessible.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dykstra, Dave; Garzoglio, Gabriele; Kim, Hyunwoo

    As of 2012, a number of US Department of Energy (DOE) National Laboratories have access to a 100 Gb/s wide-area network backbone. The ESnet Advanced Networking Initiative (ANI) project is intended to develop a prototype network based on emerging 100 Gb/s Ethernet technology. The ANI network will support DOE's science research programs. A 100 Gb/s network test bed is a key component of the ANI project. The test bed offers the opportunity for early evaluation of 100 Gb/s network infrastructure for supporting the high-impact data movement typical of science collaborations and experiments. In order to make effective use of this advanced infrastructure, the applications and middleware currently used by the distributed computing systems of large-scale science need to be adapted and tested within the new environment, with gaps in functionality identified and corrected. As a user of the ANI test bed, Fermilab aims to study the issues related to end-to-end integration and use of 100 Gb/s networks for the event simulation and analysis applications of physics experiments. In this paper we discuss our findings from evaluating existing HEP physics middleware and application components, including GridFTP, Globus Online, etc., in the high-speed environment. These include possible recommendations to system administrators and to application and middleware developers on changes that would enable production use of the 100 Gb/s networks, including data storage, caching, and wide-area access.
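
    A first sanity check when adapting middleware to such a link is the bandwidth-delay product, which sets the amount of data a single stream must keep in flight (and hence the TCP buffer it needs) to fill the pipe. A quick arithmetic sketch (the 50 ms round-trip time is a hypothetical cross-country value, not a figure from the paper):

```python
def bandwidth_delay_product(gbps, rtt_ms):
    """Bytes in flight needed to keep a link full: bandwidth * round-trip time."""
    return gbps * 1e9 / 8 * (rtt_ms / 1e3)

def transfer_seconds(terabytes, gbps, efficiency=1.0):
    """Ideal wall-clock time to move `terabytes` over a `gbps` link.

    `efficiency` < 1 models protocol overhead or a partially filled pipe.
    """
    return terabytes * 8e12 / (gbps * 1e9 * efficiency)

# e.g. a 100 Gb/s path with a 50 ms RTT needs ~625 MB in flight per stream,
# and a 1 TB dataset takes ~80 s at line rate.
```

Buffers that large are impractical for one TCP stream, which is one reason tools like GridFTP use parallel streams and why caching and storage must keep pace with the network.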

  16. Minimizing the formation of coke and methane on Co nanoparticles in steam reforming of biomass-derived oxygenates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Junming; Mei, Donghai; Karim, Ayman M.

    2013-06-01

    Fundamental understanding and control of chemical transformations are essential to the development of technically feasible and economically viable catalytic processes for efficient conversion of biomass to fuels and chemicals. Using an integrated experimental and theoretical approach, we report high hydrogen selectivity and catalyst durability of acetone steam reforming (ASR) on inert carbon-supported Co nanoparticles. The observed catalytic performance is further elucidated on the basis of comprehensive first-principles calculations. Instead of being considered an undesired intermediate prone to catalyst deactivation during bioethanol steam reforming (ESR), acetone is suggested as a key and desired intermediate in the proposed two-stage ESR process that leads to high hydrogen selectivity and low methane formation on Co-based catalysts. The present work also sheds light on controlling the chemical transformations of key intermediates, such as ketones, in biomass conversion. We gratefully acknowledge the financial support from the U.S. Department of Energy (DOE), Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences, and Biosciences, and the Laboratory Directed Research and Development (LDRD) project of Pacific Northwest National Laboratory (PNNL). Computing time was granted by the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL). The EMSL is a U.S. DOE national scientific user facility located at PNNL and sponsored by the U.S. DOE's Office of Biological and Environmental Research.
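
    The hydrogen-selectivity ceiling for ASR follows from the overall stoichiometry of reforming plus water-gas shift, CH3COCH3 + 5 H2O -> 3 CO2 + 8 H2 (a textbook balance, not the paper's mechanism). A quick element-balance check of that equation:

```python
# Element balance for overall acetone steam reforming:
#   C3H6O + 5 H2O -> 3 CO2 + 8 H2
def element_count(terms):
    """Sum element counts over (stoichiometric coefficient, {element: count}) terms."""
    totals = {}
    for coeff, atoms in terms:
        for el, n in atoms.items():
            totals[el] = totals.get(el, 0) + coeff * n
    return totals

reactants = [(1, {"C": 3, "H": 6, "O": 1}), (5, {"H": 2, "O": 1})]
products = [(3, {"C": 1, "O": 2}), (8, {"H": 2})]
assert element_count(reactants) == element_count(products)  # balanced
h2_per_acetone = 8  # theoretical maximum H2 yield per acetone
```

Every CH4 formed instead of being reformed removes two of those eight H2 equivalents, which is why suppressing methanation on the Co nanoparticles matters for selectivity.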

  17. Dehydration pathways of 1-propanol on HZSM-5 in the presence and absence of water

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhi, Yuchun; Shi, Hui; Mu, Linyu

    The Brønsted acid-catalyzed gas-phase dehydration of 1-propanol (0.075-4 kPa) was studied on zeolite H-MFI (Si/Al = 26, containing minimal amounts of extraframework Al moieties) in the absence and presence of co-fed water (0-2.5 kPa) at 413-443 K. It is shown that propene can be formed from monomeric and dimeric adsorbed 1-propanol. The stronger adsorption of 1-propanol relative to water indicates that the reduced dehydration rates in the presence of water are not a consequence of competitive adsorption between 1-propanol and water. Instead, the deleterious effect is related to the different extents of stabilization of adsorbed intermediates and the relevant elimination/substitution transition states by water. Water stabilizes the adsorbed 1-propanol monomer significantly more than the elimination transition state, leading to a higher activation barrier and a greater entropy gain for the rate-limiting step, which eventually leads to propene. In a similar manner, an excess of 1-propanol stabilizes the adsorbed state of 1-propanol more than the elimination transition state. In comparison with the monomer-mediated pathway, the adsorbed dimer and the relevant transition states for propene and ether formation are similarly, although less effectively, stabilized by intrazeolite water molecules. This work was supported by the US Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences, and Biosciences, and was performed in part using the Molecular Sciences Computing Facility (MSCF) in the William R. Wiley Environmental Molecular Sciences Laboratory, a DOE national scientific user facility sponsored by the DOE's Office of Biological and Environmental Research and located at the Pacific Northwest National Laboratory (PNNL). PNNL is operated by Battelle for DOE.
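
    The kinetic consequence of a water-induced barrier increase can be illustrated with the Arrhenius ratio k2/k1 = exp(-dEa/RT) at fixed temperature. The 10 kJ/mol barrier increase below is a hypothetical number for illustration, not a value from the study, and the entropy gain that the abstract says accompanies the higher barrier (which partially offsets it) is ignored here.

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1

def rate_ratio(delta_ea_kj, t_kelvin):
    """Factor by which a rate drops when the apparent barrier rises by
    delta_ea_kj (kJ/mol) at fixed temperature, from k ~ exp(-Ea/RT).
    Pre-exponential (entropic) changes are deliberately ignored."""
    return math.exp(-delta_ea_kj * 1e3 / (R * t_kelvin))

# e.g. a hypothetical 10 kJ/mol barrier increase at 413 K slows the
# elimination step by roughly a factor of 18.
```

Evaluating the same ratio at 443 K gives a smaller slowdown, consistent with enthalpy-driven inhibition weakening as temperature rises across the study's 413-443 K window.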

  18. Computer Science | Classification | College of Engineering & Applied

    Science.gov Websites

    EMS 1011; Adrian Dumitrescu, Ph.D., Professor, Computer Science, (414) 229-4265, Eng & Math @uwm.edu, Eng & Math Sciences 919; Hossein Hosseini, Ph.D., Professor, Computer Science, (414) 229-5184, hosseini@uwm.edu, Eng & Math Sciences 1091; Amol Mali, Ph.D., Associate Professor, Computer

  19. Computers in Science Education: Can They Go Far Enough? Have We Gone Too Far?

    ERIC Educational Resources Information Center

    Schrock, John Richard

    1984-01-01

    Indicates that although computers may churn out creative research, science is still dependent on science education, and that science education consists of increasing human experience. Also considers uses and misuses of computers in the science classroom, examining Edgar Dale's "cone of experience" related to laboratory computer and "extended…

  20. 2K09 and thereafter : the coming era of integrative bioinformatics, systems biology and intelligent computing for functional genomics and personalized medicine research.

    PubMed

    Yang, Jack Y; Niemierko, Andrzej; Bajcsy, Ruzena; Xu, Dong; Athey, Brian D; Zhang, Aidong; Ersoy, Okan K; Li, Guo-Zheng; Borodovsky, Mark; Zhang, Joe C; Arabnia, Hamid R; Deng, Youping; Dunker, A Keith; Liu, Yunlong; Ghafoor, Arif

    2010-12-01

    Significant interest exists in establishing synergistic research in bioinformatics, systems biology and intelligent computing. Supported by the United States National Science Foundation (NSF), International Society of Intelligent Biological Medicine (http://www.ISIBM.org), International Journal of Computational Biology and Drug Design (IJCBDD) and International Journal of Functional Informatics and Personalized Medicine, the ISIBM International Joint Conferences on Bioinformatics, Systems Biology and Intelligent Computing (ISIBM IJCBS 2009) attracted more than 300 papers and 400 researchers and medical doctors world-wide. It was the only inter/multidisciplinary conference aimed to promote synergistic research and education in bioinformatics, systems biology and intelligent computing. The conference committee was very grateful for the valuable advice and suggestions from honorary chairs, steering committee members and scientific leaders including Dr. Michael S. Waterman (USC, Member of United States National Academy of Sciences), Dr. Chih-Ming Ho (UCLA, Member of United States National Academy of Engineering and Academician of Academia Sinica), Dr. Wing H. Wong (Stanford, Member of United States National Academy of Sciences), Dr. Ruzena Bajcsy (UC Berkeley, Member of United States National Academy of Engineering and Member of United States Institute of Medicine of the National Academies), Dr. Mary Qu Yang (United States National Institutes of Health and Oak Ridge, DOE), Dr. Andrzej Niemierko (Harvard), Dr. A. Keith Dunker (Indiana), Dr. Brian D. Athey (Michigan), Dr. Weida Tong (FDA, United States Department of Health and Human Services), Dr. Cathy H. Wu (Georgetown), Dr. Dong Xu (Missouri), Drs. Arif Ghafoor and Okan K Ersoy (Purdue), Dr. Mark Borodovsky (Georgia Tech, President of ISIBM), Dr. Hamid R. Arabnia (UGA, Vice-President of ISIBM), and other scientific leaders. The committee presented the 2009 ISIBM Outstanding Achievement Awards to Dr. 
Joydeep Ghosh (UT Austin), Dr. Aidong Zhang (Buffalo) and Dr. Zhi-Hua Zhou (Nanjing) for their significant contributions to the field of intelligent biological medicine.

  1. 2K09 and thereafter : the coming era of integrative bioinformatics, systems biology and intelligent computing for functional genomics and personalized medicine research

    PubMed Central

    2010-01-01

    Significant interest exists in establishing synergistic research in bioinformatics, systems biology and intelligent computing. Supported by the United States National Science Foundation (NSF), International Society of Intelligent Biological Medicine (http://www.ISIBM.org), International Journal of Computational Biology and Drug Design (IJCBDD) and International Journal of Functional Informatics and Personalized Medicine, the ISIBM International Joint Conferences on Bioinformatics, Systems Biology and Intelligent Computing (ISIBM IJCBS 2009) attracted more than 300 papers and 400 researchers and medical doctors world-wide. It was the only inter/multidisciplinary conference aimed to promote synergistic research and education in bioinformatics, systems biology and intelligent computing. The conference committee was very grateful for the valuable advice and suggestions from honorary chairs, steering committee members and scientific leaders including Dr. Michael S. Waterman (USC, Member of United States National Academy of Sciences), Dr. Chih-Ming Ho (UCLA, Member of United States National Academy of Engineering and Academician of Academia Sinica), Dr. Wing H. Wong (Stanford, Member of United States National Academy of Sciences), Dr. Ruzena Bajcsy (UC Berkeley, Member of United States National Academy of Engineering and Member of United States Institute of Medicine of the National Academies), Dr. Mary Qu Yang (United States National Institutes of Health and Oak Ridge, DOE), Dr. Andrzej Niemierko (Harvard), Dr. A. Keith Dunker (Indiana), Dr. Brian D. Athey (Michigan), Dr. Weida Tong (FDA, United States Department of Health and Human Services), Dr. Cathy H. Wu (Georgetown), Dr. Dong Xu (Missouri), Drs. Arif Ghafoor and Okan K Ersoy (Purdue), Dr. Mark Borodovsky (Georgia Tech, President of ISIBM), Dr. Hamid R. Arabnia (UGA, Vice-President of ISIBM), and other scientific leaders. The committee presented the 2009 ISIBM Outstanding Achievement Awards to Dr. 
Joydeep Ghosh (UT Austin), Dr. Aidong Zhang (Buffalo) and Dr. Zhi-Hua Zhou (Nanjing) for their significant contributions to the field of intelligent biological medicine. PMID:21143775

  2. Science in the Eyes of Preschool Children: Findings from an Innovative Research Tool

    NASA Astrophysics Data System (ADS)

    Dubosarsky, Mia D.

How do young children view science? Do these views reflect cultural stereotypes? When do these views develop? These fundamental questions in the field of science education have rarely been studied with the population of preschool children. One main reason is the lack of an appropriate research instrument that addresses preschool children's developmental competencies. An extensive body of research has pointed to the significance of early childhood experiences in developing positive attitudes and interests toward learning in general and the learning of science in particular. Theoretical and empirical research suggests that stereotypical views of science may be replaced by authentic views following inquiry science experience. However, no preschool science intervention program could be designed without a reliable instrument that provides baseline information about preschool children's current views of science. The current study presents preschool children's views of science as gathered from a pioneering research tool. This tool, in the form of a computer "game," does not require reading, writing, or expressive language skills and is operated by the children. The program engages children in several simple tasks involving picture recognition and yes/no answers in order to reveal their views about science. The study was conducted with 120 preschool children in two phases and found that by the age of 4 years, participants possess an emergent concept of science. Gender and school differences were detected. Findings from this interdisciplinary study will contribute to the fields of early childhood, science education, learning technologies, program evaluation, and early childhood curriculum development.

  3. Nuclear-Recoil Differential Cross Sections for the Two Photon Double Ionization of Helium

    NASA Astrophysics Data System (ADS)

    Abdel Naby, Shahin; Ciappina, M. F.; Lee, T. G.; Pindzola, M. S.; Colgan, J.

    2013-05-01

In support of the reaction microscope measurements at the free-electron laser facility at Hamburg (FLASH), we use the time-dependent close-coupling (TDCC) method to calculate fully differential nuclear-recoil cross sections for the two-photon double ionization of He at a photon energy of 44 eV. The total cross section for the double ionization is in good agreement with previous calculations. The nuclear-recoil distribution is in good agreement with the experimental measurements. In contrast to the single-photon double ionization, the maximum nuclear-recoil triple-differential cross section is obtained at small nuclear momenta. This work was supported in part by grants from NSF and US DoE. Computational work was carried out at NERSC in Oakland, California and the National Institute for Computational Sciences in Knoxville, Tennessee.

  4. Hybrid Ultra-Microporous Materials for Selective Xenon Adsorption and Separation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mohamed, Mona H.; Elsaidi, Sameh K.; Pham, Tony

The demand for Xe/Kr separation continues to grow due to the industrial significance of high-purity Xe gas. Current separation processes rely on energy-intensive cryogenic distillation. Therefore, there is a need to develop less energy-intensive alternatives such as physisorptive separation using porous materials. Here we show that an underexplored class of porous materials called hybrid ultramicroporous materials (HUMs), based upon inorganic and organic building blocks, affords new benchmark selectivity for Xe separation from Xe/Kr mixtures. The isostructural materials, CROFOUR-1-Ni and CROFOUR-2-Ni, are coordination networks that exhibit coordinatively saturated metal centres and two distinct types of micropores, one of which is lined by CrO₄²⁻ (CROFOUR) anions and the other is decorated by the functionalized organic linker. These nets offer unprecedented selectivity towards Xe, and also address processing and stability limitations of existing porous materials. Modelling experiments indicate that the extraordinary selectivity of these nets is tailored by synergy between the pore size, which is just above the kinetic diameter of Xe, and the strong electrostatics afforded by the CrO₄²⁻ anions. Column breakthrough experiments demonstrate the potential for practical use of these materials in Xe/Kr separation at low concentrations at the levels relevant to Xe capture from air and in nuclear fuel reprocessing. B.S. acknowledges the National Science Foundation (Award No. CHE-1152362), including support from the Major Research Instrumentation Program (Award No. CHE-1531590), the computational resources that were made available by an XSEDE Grant (No. TG-DMR090028), and the use of the services provided by Research Computing at the University of South Florida. We (P.K.T.) thank the US Department of Energy (DOE), Office of Nuclear Energy for adsorption and breakthrough measurements. We (P.K.T.) particularly thank J. Bresee, Kimberly Gray, T. Todd (Idaho National Laboratory), John Vienna (PNNL), B. Jubin (Oak Ridge National Laboratory) and D.M. Strachan (Strachan LLC) for providing programmatic support and guidance. Pacific Northwest National Laboratory is a multi-program national laboratory operated for the US Department of Energy by Battelle Memorial Institute under Contract DE-AC05-76RL01830. M.J.Z. gratefully acknowledges Science Foundation Ireland (Award 13/RP/B2549) for support. This research used Beamline 17-BM of the Advanced Photon Source, a U.S. Department of Energy (DOE) Office of Science User Facility operated for the DOE Office of Science by Argonne National Laboratory under Contract No. DE-AC02-06CH11357.

  5. Computational Materials Science and Chemistry: Accelerating Discovery and Innovation through Simulation-Based Engineering and Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crabtree, George; Glotzer, Sharon; McCurdy, Bill

This report is based on an SC Workshop on Computational Materials Science and Chemistry for Innovation, held on July 26-27, 2010, to assess the potential of state-of-the-art computer simulations to accelerate understanding and discovery in materials science and chemistry, with a focus on potential impacts in energy technologies and innovation. The urgent demand for new energy technologies has greatly exceeded the capabilities of today's materials and chemical processes. To convert sunlight to fuel, efficiently store energy, or enable a new generation of energy production and utilization technologies requires the development of new materials and processes of unprecedented functionality and performance. New materials and processes are critical pacing elements for progress in advanced energy systems and virtually all industrial technologies. Over the past two decades, the United States has developed and deployed the world's most powerful collection of tools for the synthesis, processing, characterization, and simulation and modeling of materials and chemical systems at the nanoscale, dimensions of a few atoms to a few hundred atoms across. These tools, which include world-leading x-ray and neutron sources, nanoscale science facilities, and high-performance computers, provide an unprecedented view of the atomic-scale structure and dynamics of materials and the molecular-scale basis of chemical processes. For the first time in history, we are able to synthesize, characterize, and model materials and chemical behavior at the length scale where this behavior is controlled. This ability is transformational for the discovery process and, as a result, confers a significant competitive advantage. Perhaps the most spectacular increase in capability has been demonstrated in high performance computing. Over the past decade, computational power has increased by a factor of a million due to advances in hardware and software.
This rate of improvement, which shows no sign of abating, has enabled the development of computer simulations and models of unprecedented fidelity. We are at the threshold of a new era where the integrated synthesis, characterization, and modeling of complex materials and chemical processes will transform our ability to understand and design new materials and chemistries with predictive power. In turn, this predictive capability will transform technological innovation by accelerating the development and deployment of new materials and processes in products and manufacturing. Harnessing the potential of computational science and engineering for the discovery and development of materials and chemical processes is essential to maintaining leadership in these foundational fields that underpin energy technologies and industrial competitiveness. Capitalizing on the opportunities presented by simulation-based engineering and science in materials and chemistry will require an integration of experimental capabilities with theoretical and computational modeling; the development of a robust and sustainable infrastructure to support the development and deployment of advanced computational models; and the assembly of a community of scientists and engineers to implement this integration and infrastructure. This community must extend to industry, where incorporating predictive materials science and chemistry into design tools can accelerate the product development cycle and drive economic competitiveness. The confluence of new theories, new materials synthesis capabilities, and new computer platforms has created an unprecedented opportunity to implement a "materials-by-design" paradigm with wide-ranging benefits in technological innovation and scientific discovery. The Workshop on Computational Materials Science and Chemistry for Innovation was convened in Bethesda, Maryland, on July 26-27, 2010. 
Sponsored by the Department of Energy (DOE) Offices of Advanced Scientific Computing Research and Basic Energy Sciences, the workshop brought together 160 experts in materials science, chemistry, and computational science representing more than 65 universities, laboratories, and industries, and four agencies. The workshop examined seven foundational challenge areas in materials science and chemistry: materials for extreme conditions, self-assembly, light harvesting, chemical reactions, designer fluids, thin films and interfaces, and electronic structure. Each of these challenge areas is critical to the development of advanced energy systems, and each can be accelerated by the integrated application of predictive capability with theory and experiment. The workshop concluded that emerging capabilities in predictive modeling and simulation have the potential to revolutionize the development of new materials and chemical processes. Coupled with world-leading materials characterization and nanoscale science facilities, this predictive capability provides the foundation for an innovation ecosystem that can accelerate the discovery, development, and deployment of new technologies, including advanced energy systems. Delivering on the promise of this innovation ecosystem requires the following: Integration of synthesis, processing, characterization, theory, and simulation and modeling. Many of the newly established Energy Frontier Research Centers and Energy Hubs are exploiting this integration. Achieving/strengthening predictive capability in foundational challenge areas. Predictive capability in the seven foundational challenge areas described in this report is critical to the development of advanced energy technologies. Developing validated computational approaches that span vast differences in time and length scales. This fundamental computational challenge crosscuts all of the foundational challenge areas. 
Similarly challenging is coupling of analytical data from multiple instruments and techniques that are required to link these length and time scales. Experimental validation and quantification of uncertainty in simulation and modeling. Uncertainty quantification becomes increasingly challenging as simulations become more complex. Robust and sustainable computational infrastructure, including software and applications. For modeling and simulation, software equals infrastructure. To validate the computational tools, software is critical infrastructure that effectively translates huge arrays of experimental data into useful scientific understanding. An integrated approach for managing this infrastructure is essential. Efficient transfer and incorporation of simulation-based engineering and science in industry. Strategies for bridging the gap between research and industrial applications and for widespread industry adoption of integrated computational materials engineering are needed.

  6. Community Petascale Project for Accelerator Science and Simulation: Advancing Computational Science for Future Accelerators and Accelerator Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spentzouris, P.; /Fermilab; Cary, J.

The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modelling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single-physics-process modeling (covered under SciDAC-1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors. ComPASS is in the first year of executing its plan to develop the next-generation HPC accelerator modeling tools. ComPASS aims to develop an integrated simulation environment that will utilize existing and new accelerator physics modules with petascale capabilities, by employing modern computing and solver technologies. The ComPASS vision is to deliver to accelerator scientists a virtual accelerator and virtual prototyping modeling environment, with the necessary multiphysics, multiscale capabilities. The plan for this development includes delivering accelerator modeling applications appropriate for each stage of the ComPASS software evolution. Such applications are already being used to address challenging problems in accelerator design and optimization.
The ComPASS organization for software development and applications accounts for the natural domain areas (beam dynamics, electromagnetics, and advanced acceleration), and all areas depend on the enabling technologies activities, such as solvers and component technology, to deliver the desired performance and integrated simulation environment. The ComPASS applications focus on computationally challenging problems important for design or performance optimization at all major HEP, NP, and BES accelerator facilities. With the cost and complexity of particle accelerators rising, the use of computation to optimize their designs and find improved operating regimes becomes essential, potentially leading to significant cost savings with modest investment.

  7. The Future of Pharmaceutical Manufacturing Sciences

    PubMed Central

    2015-01-01

The entire pharmaceutical sector is in urgent need of both innovative technological solutions and fundamental scientific work, enabling the production of highly engineered drug products. Commercial‐scale manufacturing of complex drug delivery systems (DDSs) using the existing technologies is challenging. This review covers important elements of manufacturing sciences, beginning with risk management strategies and design of experiments (DoE) techniques. Experimental techniques should, where possible, be supported by computational approaches. In that regard, state‐of‐the‐art mechanistic process modeling techniques are described in detail. Implementation of materials science tools paves the way to molecular‐based processing of future DDSs. A snapshot of some of the existing tools is presented. Additionally, general engineering principles are discussed, covering process measurement and process control solutions. The last part of the review addresses future manufacturing solutions, covering continuous processing and, specifically, hot‐melt processing and printing‐based technologies. Finally, challenges related to implementing these technologies as a part of future health care systems are discussed. © 2015 The Authors. Journal of Pharmaceutical Sciences published by Wiley Periodicals, Inc. and the American Pharmacists Association J Pharm Sci 104:3612–3638, 2015 PMID:26280993
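
    The design of experiments (DoE) techniques mentioned in the abstract can be sketched with a two-level full factorial design and main-effect estimation, a common starting point for process characterization. The factor names and response values below are purely illustrative, not taken from the review:

    ```python
    from itertools import product

    # Hypothetical two-level (coded -1/+1) full factorial design for
    # three process factors; names are illustrative only.
    factors = ["temperature", "screw_speed", "feed_rate"]
    design = list(product([-1, +1], repeat=len(factors)))  # 2^3 = 8 runs

    # Illustrative measured responses, one per run, in design order.
    response = [60, 72, 58, 75, 61, 74, 57, 76]

    def main_effect(i):
        """Main effect of factor i: mean response at the high level
        minus mean response at the low level."""
        hi = [y for run, y in zip(design, response) if run[i] == +1]
        lo = [y for run, y in zip(design, response) if run[i] == -1]
        return sum(hi) / len(hi) - sum(lo) / len(lo)

    effects = {f: main_effect(i) for i, f in enumerate(factors)}
    ```

    With these illustrative numbers, the "feed_rate" effect dominates, which is how a screening DoE flags the factors worth carrying into a response-surface study.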

  8. The Future of Pharmaceutical Manufacturing Sciences.

    PubMed

    Rantanen, Jukka; Khinast, Johannes

    2015-11-01

The entire pharmaceutical sector is in urgent need of both innovative technological solutions and fundamental scientific work, enabling the production of highly engineered drug products. Commercial-scale manufacturing of complex drug delivery systems (DDSs) using the existing technologies is challenging. This review covers important elements of manufacturing sciences, beginning with risk management strategies and design of experiments (DoE) techniques. Experimental techniques should, where possible, be supported by computational approaches. In that regard, state-of-the-art mechanistic process modeling techniques are described in detail. Implementation of materials science tools paves the way to molecular-based processing of future DDSs. A snapshot of some of the existing tools is presented. Additionally, general engineering principles are discussed, covering process measurement and process control solutions. The last part of the review addresses future manufacturing solutions, covering continuous processing and, specifically, hot-melt processing and printing-based technologies. Finally, challenges related to implementing these technologies as a part of future health care systems are discussed. © 2015 The Authors. Journal of Pharmaceutical Sciences published by Wiley Periodicals, Inc. and the American Pharmacists Association.

  9. Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science

    NASA Technical Reports Server (NTRS)

    1987-01-01

Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April 1, 1986 through September 30, 1986 is summarized.

  10. 78 FR 10180 - Annual Computational Science Symposium; Conference

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-13

    ...] Annual Computational Science Symposium; Conference AGENCY: Food and Drug Administration, HHS. ACTION... Computational Science Symposium.'' The purpose of the conference is to help the broader community align and share experiences to advance computational science. At the conference, which will bring together FDA...

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hules, John

This 1998 annual report from the National Energy Research Scientific Computing Center (NERSC) presents the year in review in the following categories: Computational Science; Computer Science and Applied Mathematics; and Systems and Services. Also presented are science highlights in the following categories: Basic Energy Sciences; Biological and Environmental Research; Fusion Energy Sciences; High Energy and Nuclear Physics; and Advanced Scientific Computing Research and Other Projects.

  12. The ASCI Network for SC 2000: Gigabyte Per Second Networking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    PRATT, THOMAS J.; NAEGLE, JOHN H.; MARTINEZ JR., LUIS G.

    2001-11-01

This document highlights the DISCOM Distance Computing and Communication team's activities at the SC 2000 supercomputing conference in Dallas, Texas, sponsored by the IEEE and ACM. Sandia's participation in the conference has now spanned a decade; for the last five years, Sandia National Laboratories, Los Alamos National Laboratory, and Lawrence Livermore National Laboratory have come together at the conference under the rubric of the DOE's Accelerated Strategic Computing Initiative (ASCI) program to demonstrate ASCI's emerging capabilities in computational science and their combined expertise in high-performance computer science and communication networking. DISCOM2 uses this forum to demonstrate and focus communication developments within the program. At SC 2000, DISCOM demonstrated a pre-standard implementation of 10 Gigabit Ethernet, the first gigabyte-per-second IP data transfer application, and VPN technology that enabled a remote Distributed Resource Management tools demonstration. Additionally, a national OC48 POS network was constructed to support applications running between the show floor and home facilities. This network created the opportunity to test PSE's Parallel File Transfer Protocol (PFTP) across a network with speeds and distances similar to the then-proposed DISCOM WAN. SCinet at SC 2000 showcased wireless networking, and the networking team had the opportunity to explore this emerging technology while on the booth. The team also supported the production networking needs of the convention exhibit floor. This paper documents those accomplishments, discusses the details of their implementation, and describes how these demonstrations support DISCOM's overall strategies in high-performance computing networking.

  13. Scalable data management, analysis and visualization (SDAV) Institute. Final Scientific/Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Geveci, Berk

The purpose of the SDAV institute is to provide tools and expertise in scientific data management, analysis, and visualization to DOE’s application scientists. Our goal is to actively work with application teams to assist them in achieving breakthrough science, and to provide technical solutions in the data management, analysis, and visualization regimes that are broadly used by the computational science community. Over the last 5 years, members of our institute worked directly with application scientists and DOE leadership-class facilities to assist them by applying the best tools and technologies at our disposal. We also enhanced our tools based on input from scientists on their needs. Many of the applications we have been working with are based on connections with scientists established in previous years. However, we contacted additional scientists through our outreach activities, as well as engaging application teams running on leading DOE computing systems. Our approach is to employ an evolutionary development and deployment process: first considering the application of existing tools, followed by the customization necessary for each particular application, and then the deployment in real frameworks and infrastructures. The institute is organized into three areas, each with area leaders, who keep track of progress, engagement of application scientists, and results. The areas are: (1) Data Management, (2) Data Analysis, and (3) Visualization. Kitware has been involved in the Visualization area. This report covers Kitware’s contributions over the last 5 years (February 2012 – February 2017). For details on the work performed by the SDAV institute as a whole, please see the SDAV final report.

  14. 34 CFR 637.32 - What selection criteria does the Secretary use?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... POSTSECONDARY EDUCATION, DEPARTMENT OF EDUCATION MINORITY SCIENCE AND ENGINEERING IMPROVEMENT PROGRAM How Does... for enhancing the institution's capacity for improving and maintaining quality science education for... science education improvement plans will be developed with the technical assistance provided under the...

  15. 34 CFR 637.32 - What selection criteria does the Secretary use?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... POSTSECONDARY EDUCATION, DEPARTMENT OF EDUCATION MINORITY SCIENCE AND ENGINEERING IMPROVEMENT PROGRAM How Does... for enhancing the institution's capacity for improving and maintaining quality science education for... science education improvement plans will be developed with the technical assistance provided under the...

  16. 34 CFR 637.32 - What selection criteria does the Secretary use?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... POSTSECONDARY EDUCATION, DEPARTMENT OF EDUCATION MINORITY SCIENCE AND ENGINEERING IMPROVEMENT PROGRAM How Does... for enhancing the institution's capacity for improving and maintaining quality science education for... science education improvement plans will be developed with the technical assistance provided under the...

  17. 34 CFR 637.32 - What selection criteria does the Secretary use?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... POSTSECONDARY EDUCATION, DEPARTMENT OF EDUCATION MINORITY SCIENCE AND ENGINEERING IMPROVEMENT PROGRAM How Does... for enhancing the institution's capacity for improving and maintaining quality science education for... science education improvement plans will be developed with the technical assistance provided under the...

  18. Enduring Influence of Stereotypical Computer Science Role Models on Women's Academic Aspirations

    ERIC Educational Resources Information Center

    Cheryan, Sapna; Drury, Benjamin J.; Vichayapai, Marissa

    2013-01-01

    The current work examines whether a brief exposure to a computer science role model who fits stereotypes of computer scientists has a lasting influence on women's interest in the field. One-hundred undergraduate women who were not computer science majors met a female or male peer role model who embodied computer science stereotypes in appearance…

  19. A Web of Resources for Introductory Computer Science.

    ERIC Educational Resources Information Center

    Rebelsky, Samuel A.

    As the field of Computer Science has grown, the syllabus of the introductory Computer Science course has changed significantly. No longer is it a simple introduction to programming or a tutorial on computer concepts and applications. Rather, it has become a survey of the field of Computer Science, touching on a wide variety of topics from digital…

  20. Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science

    NASA Technical Reports Server (NTRS)

    1988-01-01

This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April 1, 1988 through September 30, 1988.

  1. Summary of research in applied mathematics, numerical analysis and computer science at the Institute for Computer Applications in Science and Engineering

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science during the period October 1, 1983 through March 31, 1984 is summarized.

  2. Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 1, 1986 through March 31, 1987 is summarized.

  3. Computational Cellular Dynamics Based on the Chemical Master Equation: A Challenge for Understanding Complexity

    PubMed Central

    Liang, Jie; Qian, Hong

    2010-01-01

Modern molecular biology has always been a great source of inspiration for computational science. Half a century ago, the challenge from understanding macromolecular dynamics has led the way for computations to be part of the tool set to study molecular biology. Twenty-five years ago, the demand from genome science has inspired an entire generation of computer scientists with an interest in discrete mathematics to join the field that is now called bioinformatics. In this paper, we shall lay out a new mathematical theory for dynamics of biochemical reaction systems in a small volume (i.e., mesoscopic) in terms of a stochastic, discrete-state continuous-time formulation, called the chemical master equation (CME). Similar to the wavefunction in quantum mechanics, the dynamically changing probability landscape associated with the state space provides a fundamental characterization of the biochemical reaction system. The stochastic trajectories of the dynamics are best known through the simulations using the Gillespie algorithm. In contrast to the Metropolis algorithm, this Monte Carlo sampling technique does not follow a process with detailed balance. We shall show several examples of how CMEs are used to model cellular biochemical systems. We shall also illustrate the computational challenges involved: multiscale phenomena, the interplay between stochasticity and nonlinearity, and how macroscopic determinism arises from mesoscopic dynamics. We point out recent advances in computing solutions to the CME, including exact solution of the steady state landscape and stochastic differential equations that offer alternatives to the Gillespie algorithm. We argue that the CME is an ideal system from which one can learn to understand “complex behavior” and complexity theory, and from which important biological insight can be gained. PMID:24999297
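
    The Gillespie algorithm described in the abstract, in which the waiting time to the next reaction is exponentially distributed with rate equal to the total propensity and the firing reaction is chosen with probability proportional to its propensity, can be illustrated with a minimal sketch. The birth-death system and rate constants below are illustrative choices, not from the paper:

    ```python
    import random

    def gillespie_ssa(propensities, updates, x0, t_max, seed=0):
        """Exact stochastic simulation of a discrete-state
        continuous-time reaction system (Gillespie's direct method)."""
        rng = random.Random(seed)
        t, x = 0.0, list(x0)
        trajectory = [(t, tuple(x))]
        while t < t_max:
            a = propensities(x)      # a_j(x): rate of each reaction channel
            a0 = sum(a)
            if a0 == 0:              # no reaction can fire; state is absorbing
                break
            # Waiting time to the next reaction: exponential with rate a0.
            t += rng.expovariate(a0)
            # Choose which reaction fires, with probability a_j / a0.
            r, j = rng.random() * a0, 0
            while r > a[j]:
                r -= a[j]
                j += 1
            x = [xi + dx for xi, dx in zip(x, updates[j])]
            trajectory.append((t, tuple(x)))
        return trajectory

    # Birth-death process: 0 -> X at constant rate k; X -> 0 at rate g*X.
    k, g = 10.0, 1.0
    traj = gillespie_ssa(lambda x: [k, g * x[0]], updates=[[+1], [-1]],
                         x0=[0], t_max=50.0)
    ```

    This birth-death process is one of the few CMEs with a known exact steady-state solution (a Poisson distribution with mean k/g), which makes it a convenient test case for the solution methods the paper surveys.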

  4. Computational Cellular Dynamics Based on the Chemical Master Equation: A Challenge for Understanding Complexity.

    PubMed

    Liang, Jie; Qian, Hong

    2010-01-01

Modern molecular biology has always been a great source of inspiration for computational science. Half a century ago, the challenge from understanding macromolecular dynamics has led the way for computations to be part of the tool set to study molecular biology. Twenty-five years ago, the demand from genome science has inspired an entire generation of computer scientists with an interest in discrete mathematics to join the field that is now called bioinformatics. In this paper, we shall lay out a new mathematical theory for dynamics of biochemical reaction systems in a small volume (i.e., mesoscopic) in terms of a stochastic, discrete-state continuous-time formulation, called the chemical master equation (CME). Similar to the wavefunction in quantum mechanics, the dynamically changing probability landscape associated with the state space provides a fundamental characterization of the biochemical reaction system. The stochastic trajectories of the dynamics are best known through the simulations using the Gillespie algorithm. In contrast to the Metropolis algorithm, this Monte Carlo sampling technique does not follow a process with detailed balance. We shall show several examples of how CMEs are used to model cellular biochemical systems. We shall also illustrate the computational challenges involved: multiscale phenomena, the interplay between stochasticity and nonlinearity, and how macroscopic determinism arises from mesoscopic dynamics. We point out recent advances in computing solutions to the CME, including exact solution of the steady state landscape and stochastic differential equations that offer alternatives to the Gillespie algorithm. We argue that the CME is an ideal system from which one can learn to understand "complex behavior" and complexity theory, and from which important biological insight can be gained.

  5. High school computer science education paves the way for higher education: the Israeli case

    NASA Astrophysics Data System (ADS)

    Armoni, Michal; Gal-Ezer, Judith

    2014-07-01

    The gap between enrollments in higher education computing programs and the high-tech industry's demands is widely reported, and is especially prominent for women. Increasing the availability of computer science education in high school is one of the strategies suggested to address this gap. We look at the connection between exposure to computer science in high school and pursuing computing in higher education. We also examine the gender gap in the context of high school computer science education. We show that in Israel, students who took the high-level computer science matriculation exam were more likely to pursue computing in higher education. Regarding gender, we show that in Israel the difference between males and females who take computer science in high school is generally small, and that a larger, though still modest, difference exists only at the highest exam level. In addition, exposing females to high-level computer science in high school has more relative impact on pursuing higher education in computing.

  6. Defining Computational Thinking for Mathematics and Science Classrooms

    NASA Astrophysics Data System (ADS)

    Weintrop, David; Beheshti, Elham; Horn, Michael; Orton, Kai; Jona, Kemi; Trouille, Laura; Wilensky, Uri

    2016-02-01

    Science and mathematics are becoming computational endeavors. This fact is reflected in the recently released Next Generation Science Standards and the decision to include "computational thinking" as a core scientific practice. With this addition, and the increased presence of computation in mathematics and scientific contexts, a new urgency has come to the challenge of defining computational thinking and providing a theoretical grounding for what form it should take in school science and mathematics classrooms. This paper presents a response to this challenge by proposing a definition of computational thinking for mathematics and science in the form of a taxonomy consisting of four main categories: data practices, modeling and simulation practices, computational problem solving practices, and systems thinking practices. In formulating this taxonomy, we draw on the existing computational thinking literature, interviews with mathematicians and scientists, and exemplary computational thinking instructional materials. This work was undertaken as part of a larger effort to infuse computational thinking into high school science and mathematics curricular materials. In this paper, we argue for the approach of embedding computational thinking in mathematics and science contexts, present the taxonomy, and discuss how we envision the taxonomy being used to bring current educational efforts in line with the increasingly computational nature of modern science and mathematics.

  7. Nicholas Brunhart-Lupo | NREL

    Science.gov Websites

    Nicholas Brunhart-Lupo, Computational Science, Nicholas.Brunhart-Lupo@nrel.gov. Education: Ph.D., Computer Science, Colorado School of Mines; M.S., Computer Science, University of Queensland; B.S., Computer Science, Colorado School of Mines.

  8. The Need for Computer Science

    ERIC Educational Resources Information Center

    Margolis, Jane; Goode, Joanna; Bernier, David

    2011-01-01

    Broadening computer science learning to include more students is a crucial item on the United States' education agenda, these authors say. Although policymakers advocate more computer science expertise, computer science offerings in high schools are few--and actually shrinking. In addition, poorly resourced schools with a high percentage of…

  9. Summary of research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 1, 1988 through March 31, 1989 is summarized.

  10. 45 CFR 630.105 - Does this part apply to me?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ....105 Public Welfare Regulations Relating to Public Welfare (Continued) NATIONAL SCIENCE FOUNDATION GOVERNMENTWIDE REQUIREMENTS FOR DRUG-FREE WORKPLACE (FINANCIAL ASSISTANCE) Purpose and Coverage § 630.105 Does... assistance award from the National Science Foundation; or (2) A(n) National Science Foundation awarding...

  11. Curricular Influences on Female Afterschool Facilitators' Computer Science Interests and Career Choices

    NASA Astrophysics Data System (ADS)

    Koch, Melissa; Gorges, Torie

    2016-10-01

    Underrepresented populations such as women, African-Americans, and Latinos/as often come to STEM (science, technology, engineering, and mathematics) careers by less traditional paths than White and Asian males. To better understand how and why women might shift toward STEM, particularly computer science, careers, we investigated the education and career direction of afterschool facilitators, primarily women of color in their twenties and thirties, who taught Build IT, an afterschool computer science curriculum for middle school girls. Many of these women indicated that implementing Build IT had influenced their own interest in technology and computer science and in some cases had resulted in their intent to pursue technology and computer science education. We wanted to explore the role that teaching Build IT may have played in activating or reactivating interest in careers in computer science and to see whether in the years following implementation of Build IT, these women pursued STEM education and/or careers. We reached nine facilitators who implemented the program in 2011-12 or shortly after. Many indicated that while facilitating Build IT, they learned along with the participants, increasing their interest in and confidence with technology and computer science. Seven of the nine participants pursued further STEM or computer science learning or modified their career paths to include more of a STEM or computer science focus. Through interviews, we explored what aspects of Build IT influenced these facilitators' interest and confidence in STEM and when relevant their pursuit of technology and computer science education and careers.

  12. The NASA computer science research program plan

    NASA Technical Reports Server (NTRS)

    1983-01-01

    A taxonomy of computer science is included, and the state of the art in each of the major computer science categories is summarized. A functional breakdown of NASA programs under Aeronautics R and D, space R and T, and institutional support is also included. These areas were assessed against the computer science categories. Concurrent processing, highly reliable computing, and information management are identified.

  13. US Department of Energy education programs catalog

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1992-07-01

    Missions assigned to DOE by Congress include fundamental scientific research, research and development of energy technologies, energy conservation, strategic weapons development and production, energy regulation, energy data collection and analysis, federal power marketing, and education in science and technology. Contributing to mathematics and science education initiatives are nine DOE national laboratories and more than 30 additional specialized research facilities. Within their walls, some of the most exciting research in contemporary science is conducted. The Synchrotron Light Source at Brookhaven National Laboratory, the Intense Pulsed Neutron Source at Argonne National Laboratory, lasers, electron microscopes, advanced robotics and supercomputers are examples of some of the unique tools that DOE employs in exploring research frontiers. Nobel laureates and other eminent scientists employed by DOE laboratories have accomplished landmark work in physics, chemistry, biology, materials science, and other disciplines. The Department oversees an unparalleled collection of scientific and technical facilities and equipment with extraordinary potential for kindling in students and the general public a sense of excitement about science and increasing public science literacy. During 1991, programs funded by DOE and its contractors reached more than one million students and educators. This document is a catalog of these education programs.

  15. On teaching computer ethics within a computer science department.

    PubMed

    Quinn, Michael J

    2006-04-01

    The author has surveyed a quarter of the accredited undergraduate computer science programs in the United States. More than half of these programs offer a 'social and ethical implications of computing' course taught by a computer science faculty member, and there appears to be a trend toward teaching ethics classes within computer science departments. Although the decision to create an 'in house' computer ethics course may sometimes be a pragmatic response to pressure from the accreditation agency, this paper argues that teaching ethics within a computer science department can provide students and faculty members with numerous benefits. The paper lists topics that can be covered in a computer ethics course and offers some practical suggestions for making the course successful.

  16. DOE's Computer Incident Advisory Capability (CIAC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schultz, E.

    1990-09-01

    Computer security is essential in maintaining quality in the computing environment. Computer security incidents, however, are becoming more sophisticated. The DOE Computer Incident Advisory Capability (CIAC) team was formed primarily to assist DOE sites in responding to computer security incidents. Among CIAC's other responsibilities are gathering and distributing information to DOE sites, providing training workshops, coordinating with other agencies, response teams, and vendors, creating guidelines for incident handling, and developing software tools. CIAC has already provided considerable assistance to DOE sites faced with virus infections and worm and hacker attacks, has issued over 40 information bulletins, and has developed and presented a workshop on incident handling. CIAC's experience in helping sites has produced several lessons learned, including the need to follow effective procedures to avoid virus infections in small systems and the need for sound password management and system administration in networked systems. CIAC's activity and scope will expand in the future. 4 refs.

  17. Computational Science News | Computational Science | NREL

    Science.gov Websites

    February 28, 2018: NREL Launches New Website for High-Performance Computing System Users. The National Renewable Energy Laboratory (NREL) Computational Science Center has launched a revamped website for users of the lab's high-performance computing (HPC) system.

  18. Stability Analysis of Finite Difference Approximations to Hyperbolic Systems, and Problems in Applied and Computational Matrix Theory

    DTIC Science & Technology

    1988-07-08

    An Introduction to Pascal and Precalculus (with Marcus and C. Baczynski), Computer Science Press, Rockville, Maryland, 1986.

  19. Empirical Determination of Competence Areas to Computer Science Education

    ERIC Educational Resources Information Center

    Zendler, Andreas; Klaudt, Dieter; Seitz, Cornelia

    2014-01-01

    The authors discuss empirically determined competence areas to K-12 computer science education, emphasizing the cognitive level of competence. The results of a questionnaire with 120 professors of computer science serve as a database. By using multi-dimensional scaling and cluster analysis, four competence areas to computer science education…

  20. Factors Influencing Exemplary Science Teachers' Levels of Computer Use

    ERIC Educational Resources Information Center

    Hakverdi, Meral; Dana, Thomas M.; Swain, Colleen

    2011-01-01

    The purpose of this study was to examine exemplary science teachers' use of technology in science instruction, factors influencing their level of computer use, their level of knowledge/skills in using specific computer applications for science instruction, their use of computer-related applications/tools during their instruction, and their…

  1. Preparing Future Secondary Computer Science Educators

    ERIC Educational Resources Information Center

    Ajwa, Iyad

    2007-01-01

    Although nearly every college offers a major in computer science, many computer science teachers at the secondary level have received little formal training. This paper presents details of a project that could make a significant contribution to national efforts to improve computer science education by combining teacher education and professional…

  2. Enabling Wide-Scale Computer Science Education through Improved Automated Assessment Tools

    NASA Astrophysics Data System (ADS)

    Boe, Bryce A.

    There is a proliferating demand for newly trained computer scientists as the number of computer science related jobs continues to increase. University programs will only be able to train enough new computer scientists to meet this demand when two things happen: when there are more primary and secondary school students interested in computer science, and when university departments have the resources to handle the resulting increase in enrollment. To meet these goals, significant effort is being made to both incorporate computational thinking into existing primary school education, and to support larger university computer science class sizes. We contribute to this effort through the creation and use of improved automated assessment tools. To enable wide-scale computer science education we do two things. First, we create a framework called Hairball to support the static analysis of Scratch programs targeted for fourth, fifth, and sixth grade students. Scratch is a popular building-block language utilized to pique interest in and teach the basics of computer science. We observe that Hairball allows for rapid curriculum alterations and thus contributes to wide-scale deployment of computer science curriculum. Second, we create a real-time feedback and assessment system utilized in university computer science classes to provide better feedback to students while reducing assessment time. Insights from our analysis of student submission data show that modifications to the system configuration support the way students learn and progress through course material, making it possible for instructors to tailor assignments to optimize learning in growing computer science classes.

  3. How are the energy waves blocked on the way from hot to cold?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bai, Xianming; He, Lingfeng; Khafizov, Marat

    Representing the Center for Materials Science of Nuclear Fuel (CMSNF), this document is one of the entries in the Ten Hundred and One Word Challenge. As part of the challenge, the 46 Energy Frontier Research Centers were invited to represent their science in images, cartoons, photos, words and original paintings, but any descriptions or words could only use the 1000 most commonly used words in the English language, with the addition of one word important to each of the EFRCs and the mission of DOE energy. The mission of CMSNF is to develop an experimentally validated multi-scale computational capability for the predictive understanding of the impact of microstructure on thermal transport in nuclear fuel under irradiation, with ultimate application to UO2 as a model system.

  4. Programmers, professors, and parasites: credit and co-authorship in computer science.

    PubMed

    Solomon, Justin

    2009-12-01

    This article presents an in-depth analysis of past and present publishing practices in academic computer science to suggest the establishment of a more consistent publishing standard. Historical precedent for academic publishing in computer science is established through the study of anecdotes as well as statistics collected from databases of published computer science papers. After examining these facts alongside information about analogous publishing situations and standards in other scientific fields, the article concludes with a list of basic principles that should be adopted in any computer science publishing standard. These principles would contribute to the reliability and scientific nature of academic publications in computer science and would allow for more straightforward discourse in future publications.

  5. On The Computational Capabilities of Physical Systems. Part 2; Relationship With Conventional Computer Science

    NASA Technical Reports Server (NTRS)

    Wolpert, David H.; Koga, Dennis (Technical Monitor)

    2000-01-01

    In the first of this pair of papers, it was proven that there cannot be a physical computer to which one can properly pose any and all computational tasks concerning the physical universe. It was then further proven that no physical computer C can correctly carry out all computational tasks that can be posed to C. As a particular example, this result means that no physical computer can, for any physical system external to that computer, take the specification of that external system's state as input and then correctly predict its future state before that future state actually occurs; one cannot build a physical computer that can be assured of correctly "processing information faster than the universe does". These results do not rely on systems that are infinite, and/or non-classical, and/or obey chaotic dynamics. They also hold even if one uses an infinitely fast, infinitely dense computer, with computational powers greater than that of a Turing Machine. This generality is a direct consequence of the fact that a novel definition of computation - "physical computation" - is needed to address the issues considered in these papers, which concern real physical computers. While this novel definition does not fit into the traditional Chomsky hierarchy, the mathematical structure and impossibility results associated with it have parallels in the mathematics of the Chomsky hierarchy. This second paper of the pair presents a preliminary exploration of some of this mathematical structure. Analogues of Chomskian results concerning universal Turing Machines and the Halting theorem are derived, as are results concerning the (im)possibility of certain kinds of error-correcting codes. In addition, an analogue of algorithmic information complexity, "prediction complexity", is elaborated. A task-independent bound is derived on how much the prediction complexity of a computational task can differ for two different reference universal physical computers used to solve that task, a bound similar to the "encoding" bound governing how much the algorithmic information complexity of a Turing machine calculation can differ for two reference universal Turing machines. Finally, it is proven that either the Hamiltonian of our universe proscribes a certain type of computation, or prediction complexity is unique (unlike algorithmic information complexity), in that there is one and only one version of it that can be applicable throughout our universe.

  6. Increasing Diversity in Computer Science: Acknowledging, yet Moving Beyond, Gender

    NASA Astrophysics Data System (ADS)

    Larsen, Elizabeth A.; Stubbs, Margaret L.

    Lack of diversity within the computer science field has, thus far, been examined most fully through the lens of gender. This article is based on a follow-on to Margolis and Fisher's (2002) study and includes interviews with 33 Carnegie Mellon University students from the undergraduate senior class of 2002 in the School of Computer Science. We found evidence of similarities among the perceptions of these women and men on definitions of computer science, explanations for the notoriously low proportion of women in the field, characterizations of a typical computer science student, impressions of recent curricular changes, a sense of the atmosphere/culture in the program, views of the Women@SCS campus organization, and suggestions for attracting and retaining well-rounded students in computer science. We conclude that efforts to increase diversity in the computer science field will benefit from a more broad-based approach that considers, but is not limited to, notions of gender difference.

  7. Democratizing Computer Science

    ERIC Educational Resources Information Center

    Margolis, Jane; Goode, Joanna; Ryoo, Jean J.

    2015-01-01

    Computer science programs are too often identified with a narrow stratum of the student population, often white or Asian boys who have access to computers at home. But because computers play such a huge role in our world today, all students can benefit from the study of computer science and the opportunity to build skills related to computing. The…

  8. Implementing an Affordable High-Performance Computing for Teaching-Oriented Computer Science Curriculum

    ERIC Educational Resources Information Center

    Abuzaghleh, Omar; Goldschmidt, Kathleen; Elleithy, Yasser; Lee, Jeongkyu

    2013-01-01

    With the advances in computing power, high-performance computing (HPC) platforms have had an impact on not only scientific research in advanced organizations but also computer science curriculum in the educational community. For example, multicore programming and parallel systems are highly desired courses in the computer science major. However,…

  9. Computer Science and the Liberal Arts

    ERIC Educational Resources Information Center

    Shannon, Christine

    2010-01-01

    Computer science and the liberal arts have much to offer each other. Yet liberal arts colleges, in particular, have been slow to recognize the opportunity that the study of computer science provides for achieving the goals of a liberal education. After the precipitous drop in computer science enrollments during the first decade of this century,…

  10. Marrying Content and Process in Computer Science Education

    ERIC Educational Resources Information Center

    Zendler, A.; Spannagel, C.; Klaudt, D.

    2011-01-01

    Constructivist approaches to computer science education emphasize that as well as knowledge, thinking skills and processes are involved in active knowledge construction. K-12 computer science curricula must not be based on fashions and trends, but on contents and processes that are observable in various domains of computer science, that can be…

  11. Computing Whether She Belongs: Stereotypes Undermine Girls' Interest and Sense of Belonging in Computer Science

    ERIC Educational Resources Information Center

    Master, Allison; Cheryan, Sapna; Meltzoff, Andrew N.

    2016-01-01

    Computer science has one of the largest gender disparities in science, technology, engineering, and mathematics. An important reason for this disparity is that girls are less likely than boys to enroll in necessary "pipeline courses," such as introductory computer science. Two experiments investigated whether high-school girls' lower…

  12. Approaching Gender Parity: Women in Computer Science at Afghanistan's Kabul University

    ERIC Educational Resources Information Center

    Plane, Jandelyn

    2010-01-01

    This study explores the representation of women in computer science at the tertiary level through data collected about undergraduate computer science education at Kabul University in Afghanistan. Previous studies have theorized reasons for underrepresentation of women in computer science, and while many of these reasons are indeed present in…

  13. Some Hail 'Computational Science' as Biggest Advance Since Newton, Galileo.

    ERIC Educational Resources Information Center

    Turner, Judith Axler

    1987-01-01

    Computational science is defined as science done on a computer. A computer can serve as a laboratory for researchers who cannot experiment with their subjects, and as a calculator for those who otherwise might need centuries to solve some problems mathematically. The National Science Foundation's support of supercomputers is discussed. (MLW)

  14. African-American males in computer science---Examining the pipeline for clogs

    NASA Astrophysics Data System (ADS)

    Stone, Daryl Bryant

    The literature on African-American males (AAM) begins with a statement to the effect that "Today young Black men are more likely to be killed or sent to prison than to graduate from college." Why are the numbers of African-American male college graduates decreasing? Why are those enrolled in college not majoring in the science, technology, engineering, and mathematics (STEM) disciplines? This research explored why African-American males are not filling the well-recognized industry need for Computer Scientist/Technologists by choosing college tracks to these careers. The literature on STEM disciplines focuses largely on women in STEM, as opposed to minorities, and within minorities, there is a noticeable research gap in addressing the needs and opportunities available to African-American males. The primary goal of this study was therefore to examine the computer science "pipeline" from the African-American male perspective. The method called for a "Computer Science Degree Self-Efficacy Scale" to be distributed to five groups of African-American male students: (1) fourth graders, (2) eighth graders, (3) eleventh graders, (4) underclass undergraduate computer science majors, and (5) upperclass undergraduate computer science majors. In addition to a 30-question self-efficacy test, subjects from each group were asked to participate in a group discussion about "African-American males in computer science." The audio record of each group meeting provides qualitative data for the study. The hypotheses include the following: (1) There is no significant difference in "Computer Science Degree" self-efficacy between fourth and eighth graders. (2) There is no significant difference in "Computer Science Degree" self-efficacy between eighth and eleventh graders. (3) There is no significant difference in "Computer Science Degree" self-efficacy between eleventh graders and lower-level computer science majors.
(4) There is no significant difference in "Computer Science Degree" self-efficacy between lower-level computer science majors and upper-level computer science majors. (5) There is no significant difference in "Computer Science Degree" self-efficacy between each of the five groups of students. Finally, the researcher selected African-American male students attending six primary schools, including the predominately African-American elementary, middle and high school that the researcher attended during his own academic career. Additionally, a racially mixed elementary, middle and high school was selected from the same county in Maryland. Bowie State University provided both the underclass and upperclass computer science majors surveyed in this study. Of the five hypotheses, the sample provided enough evidence to support the claim that there are significant differences in the "Computer Science Degree" self-efficacy between each of the five groups of students. ANOVA analysis by question and total self-efficacy scores provided more results of statistical significance. Additionally, factor analysis and review of the qualitative data provide more insightful results. Overall, the data suggest a 'clog' may exist at the middle school level, and students attending racially mixed schools were more confident in their computer, math and science skills. African-American males admit to spending a great deal of time on social networking websites and email, but are unaware of the skills and knowledge needed to study in the computing disciplines. The majority of the subjects knew few, if any, AAMs in the 'computing discipline pipeline'. The collegian African-American males in this study agree that computer programming is a difficult area and serves as a 'major clog in the pipeline'.

  15. 34 CFR 637.32 - What selection criteria does the Secretary use?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... POSTSECONDARY EDUCATION, DEPARTMENT OF EDUCATION MINORITY SCIENCE AND ENGINEERING IMPROVEMENT PROGRAM How Does... project; (iii) A clear description of how the objectives of the project relate to the purpose of the... specific needs in science; and (iii) Involvement of appropriate individuals, especially science faculty, in...

  16. Girls in computer science: A female only introduction class in high school

    NASA Astrophysics Data System (ADS)

    Drobnis, Ann W.

    This study examined the impact of an all girls' classroom environment in a high school introductory computer science class on the student's attitudes towards computer science and their thoughts on future involvement with computer science. It was determined that an all girls' introductory class could impact the declining female enrollment and female students' efficacy towards computer science. This research was conducted in a summer school program through a regional magnet school for science and technology which these students attend during the school year. Three different groupings of students were examined for the research: female students in an all girls' class, female students in mixed-gender classes and male students in mixed-gender classes. A survey, Attitudes about Computers and Computer Science (ACCS), was designed to obtain an understanding of the students' thoughts, preconceptions, attitude, knowledge of computer science, and future intentions around computer science, both in education and career. Students in all three groups were administered the ACCS prior to taking the class and upon completion of the class. In addition, students in the all girls' class wrote in a journal throughout the course, and some of those students were also interviewed upon completion of the course. The data was analyzed using quantitative and qualitative techniques. While there were no major differences found in the quantitative data, it was determined that girls in the all girls' class were truly excited by what they had learned and were more open to the idea of computer science being a part of their future.

  17. Opening Remarks: SciDAC 2007

    NASA Astrophysics Data System (ADS)

    Strayer, Michael

    2007-09-01

    Good morning. Welcome to Boston, the home of the Red Sox, Celtics and Bruins, baked beans, tea parties, Robert Parker, and SciDAC 2007. A year ago I stood before you to share the legacy of the first SciDAC program and identify the challenges that we must address on the road to petascale computing—a road E. E. Cummings described as `. . . never traveled, gladly beyond any experience.' Today, I want to explore the preparations for the rapidly approaching extreme scale (X-scale) generation. These preparations are the first step propelling us along the road of burgeoning scientific discovery enabled by the application of X-scale computing. We look to petascale computing and beyond to open up a world of discovery that cuts across scientific fields and leads us to a greater understanding of not only our world, but our universe. As part of the President's American Competitiveness Initiative, the ASCR Office has been preparing a ten-year vision for computing. As part of this planning, LBNL, together with ORNL and ANL, hosted three town hall meetings on Simulation and Modeling at the Exascale for Energy, Ecological Sustainability and Global Security (E3). The proposed E3 initiative is organized around four programmatic themes: engaging our top scientists, engineers, computer scientists and applied mathematicians; investing in pioneering large-scale science; developing scalable analysis algorithms and storage architectures to accelerate discovery; and accelerating the build-out and future development of the DOE open computing facilities. It is clear that we have only just started down the path to extreme scale computing. Plan to attend Thursday's session on the out-briefing and discussion of these meetings. The road to the petascale has been at best rocky. In FY07, the continuing resolution provided 12% less money for Advanced Scientific Computing than any of the President's, the Senate's, or the House's proposals. 
As a consequence, many of you had to absorb a no cost extension for your SciDAC work. I am pleased that the President's FY08 budget restores the funding for SciDAC. Quoting from Advanced Scientific Computing Research description in the House Energy and Water Development Appropriations Bill for FY08, "Perhaps no other area of research at the Department is so critical to sustaining U.S. leadership in science and technology, revolutionizing the way science is done and improving research productivity." As a society we need to revolutionize our approaches to energy, environmental and global security challenges. As we go forward along the road to the X-scale generation, the use of computation will continue to be a critical tool along with theory and experiment in understanding the behavior of the fundamental components of nature as well as for fundamental discovery and exploration of the behavior of complex systems. The foundation to overcome these societal challenges will build from the experiences and knowledge gained as you, members of our SciDAC research teams, work together to attack problems at the tera- and peta- scale. If SciDAC is viewed as an experiment for revolutionizing scientific methodology, then a strategic goal of ASCR program must be to broaden the intellectual base prepared to address the challenges of the new X-scale generation of computing. We must focus our computational science experiences gained over the past five years on the opportunities introduced with extreme scale computing. Our facilities are on a path to provide the resources needed to undertake the first part of our journey. Using the newly upgraded 119 teraflop Cray XT system at the Leadership Computing Facility, SciDAC research teams have in three days performed a 100-year study of the time evolution of the atmospheric CO2 concentration originating from the land surface. 
    The simulation of the El Niño/Southern Oscillation that was part of this study has been characterized as `the most impressive new result in ten years'. SciDAC researchers also gained new insight into the behavior of superheated ionic gas in the ITER reactor as a result of an AORSA run on 22,500 processors that achieved over 87 trillion calculations per second (87 teraflops), 74% of the system's theoretical peak. Tomorrow, Argonne and IBM will announce that the first IBM Blue Gene/P, a 100 teraflop system, will be shipped to the Argonne Leadership Computing Facility later this fiscal year. By the end of FY2007, ASCR high performance and leadership computing resources will include the 114 teraflop IBM Blue Gene/P; a 102 teraflop Cray XT4 at NERSC; and a 119 teraflop Cray XT system at Oak Ridge. Before ringing in the New Year, Oak Ridge will upgrade to 250 teraflops by replacing dual core processors with quad core processors, Argonne will upgrade to between 250 and 500 teraflops, and next year a petascale Cray Baker system is scheduled for delivery at Oak Ridge. The multidisciplinary teams in our SciDAC Centers for Enabling Technologies and our SciDAC Institutes must continue to work with our Scientific Application teams to overcome the barriers that prevent effective use of these new systems. These challenges include: the need for new algorithms, operating system and runtime software, and tools that scale to parallel systems composed of hundreds of thousands of processors; program development environments and tools that scale effectively and provide ease of use for developers and scientific end users; and visualization and data management systems that support moving, storing, analyzing, manipulating and visualizing multi-petabytes of scientific data and objects. 
    The SciDAC Centers, located primarily at our DOE national laboratories, will take the lead in ensuring that critical computer science and applied mathematics issues are addressed in a timely and comprehensive fashion, and will address issues associated with the research software lifecycle. In contrast, the SciDAC Institutes, which are university-led centers of excellence, will have more flexibility to pursue new research topics through a range of research collaborations. The Institutes will also work to broaden the intellectual and researcher base, conducting short courses and summer schools to take advantage of new high performance computing capabilities. The SciDAC Outreach Center at Lawrence Berkeley National Laboratory complements the outreach efforts of the SciDAC Institutes. The Outreach Center is our clearinghouse for SciDAC activities and resources and will communicate with the high performance computing community in part to understand their needs for workshops, summer schools and institutes. SciDAC is not ASCR's only effort to broaden the computational science community needed to meet the challenges of the new X-scale generation. I hope that you were able to attend the Computational Science Graduate Fellowship poster session last night. ASCR developed the fellowship in 1991 to meet the nation's growing need for scientists and technology professionals with advanced computer skills. CSGF, now jointly funded by ASCR and NNSA, is more than a traditional academic fellowship. It has provided more than 200 of the best and brightest graduate students with guidance, support and community in preparing them as computational scientists. Today CSGF alumni are bringing their diverse top-level skills and knowledge to research teams at DOE laboratories and at companies such as Procter & Gamble, Lockheed Martin and Intel. At universities they are working to train the next generation of computational scientists. 
    To build on this success, we intend to develop a wholly new Early Career Principal Investigator (ECPI) program. Our objective is to stimulate academic research in scientific areas within ASCR's purview, especially among faculty in the early stages of their academic careers. Last February, we lost Ken Kennedy, one of the leading lights of our community. As we move forward into the extreme computing generation, his vision and insight will be greatly missed. In memory of Ken Kennedy, we shall designate the ECPI grants to beginning faculty in Computer Science as the Ken Kennedy Fellowship. Watch the ASCR website for more information about ECPI and other early career programs in the computational sciences. We look to you, our scientists, researchers, and visionaries, to take X-scale computing and use it to accelerate scientific discovery in your fields. We at SciDAC will work to ensure that this tool is the sharpest, most precise and most efficient instrument to carve away the unknown and reveal the most exciting secrets and stimulating scientific discoveries of our time. The partnership between research and computing is the marriage that will spur greater discovery, and as Spenser said to Susan in Robert Parker's novel `Sudden Mischief', `We stick together long enough, and we may get as smart as hell.' Michael Strayer

  18. Get immersed in the Soil Sciences: the first community of avatars in the EGU Assembly 2015!

    NASA Astrophysics Data System (ADS)

    Castillo, Sebastian; Alarcón, Purificación; Beato, Mamen; Emilio Guerrero, José; José Martínez, Juan; Pérez, Cristina; Ortiz, Leovigilda; Taguas, Encarnación V.

    2015-04-01

    Virtual reality and immersive worlds are artificial computer-generated environments with which users act and interact, as in a familiar environment, through figurative virtual individuals (avatars). Virtual environments will be the technology of the early twenty-first century that will most dramatically change the way we live, particularly in the areas of training and education, product development and entertainment (Schmorrow, 2009). The usefulness of immersive worlds has been proven in different fields. They reduce geographic and social barriers between different stakeholders and create virtual social spaces which can positively impact learning and discussion outcomes (Lorenzo et al. 2012). In this work we present a series of interactive meetings in a virtual building, held to celebrate the International Year of Soils and to promote the importance of soil functions and soil conservation. In a virtual room, the avatars of senior researchers will meet young scientists' avatars to talk about: 1) what remains to be done in the Soil Sciences; 2) the main current limitations and difficulties; and 3) the future hot research lines. Interactive participation does not require physical attendance at the EGU Assembly 2015. In addition, this virtual building inspired by the Soil Sciences can be completed with teaching resources from different locations around the world, and it will be used to improve the learning of the Soil Sciences in a multicultural context. REFERENCES: Lorenzo C.M., Sicilia, M.A., Sánchez S. 2012. Studying the effectiveness of multi-user immersive environments for collaborative evaluation tasks. Computers & Education 59 (2012) 1361-1376. Schmorrow D.D. 2009. "Why virtual?" Theoretical Issues in Ergonomics Science 10(3): 279-282.

  19. Reaction Rate Theory in Coordination Number Space: An Application to Ion Solvation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roy, Santanu; Baer, Marcel D.; Mundy, Christopher J.

    2016-04-14

    Understanding reaction mechanisms in many chemical and biological processes requires the application of rare event theories. In these theories, an effective choice of reaction coordinate to describe a reaction pathway is essential. To this end, we study ion solvation in water using molecular dynamics simulations and explore the utility of the coordination number (n = number of water molecules in the first solvation shell) as the reaction coordinate. Here we compute the potential of mean force W(n) using umbrella sampling, predicting multiple metastable n-states for both cations and anions. We find that with increasing ionic size these states become more stable and structured for cations when compared to anions. We have extended transition state theory (TST) to calculate transition rates between n-states. TST overestimates the rate constant due to solvent-induced barrier recrossings that are not accounted for. We correct the TST rates by calculating transmission coefficients using the reactive flux method. This approach enables a new way of understanding rare events involving coordination complexes. We gratefully acknowledge Liem Dang and Panos Stinis for useful discussions. This research used resources of the National Energy Research Scientific Computing Center, a DOE Office of Science User Facility supported by the Office of Science of the U.S. Department of Energy under Contract No. DE-AC02-05CH11231. SR, CJM, and GKS were supported by the U.S. Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences, and Biosciences. MDB was supported by the MS3 (Materials Synthesis and Simulation Across Scales) Initiative, a Laboratory Directed Research and Development Program at Pacific Northwest National Laboratory (PNNL). PNNL is a multiprogram national laboratory operated by Battelle for the U.S. Department of Energy.
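    The TST-with-recrossing-correction workflow the abstract describes can be illustrated with a minimal one-dimensional sketch: estimate a rate constant from a potential of mean force W(n), then scale it by a transmission coefficient. Everything below is invented for illustration (the double-well PMF, the kBT value, and the transmission coefficient kappa = 0.4 are assumptions, not values from the paper).

```python
import numpy as np

kBT = 2.494  # kJ/mol at ~300 K (assumed)

def tst_rate(n, W, n_barrier, prefactor=1.0):
    """Crude 1-D transition-state-theory rate across a PMF barrier.

    n: grid of coordination numbers; W: PMF in kJ/mol;
    n_barrier: location of the dividing surface between n-states.
    Returns prefactor * exp(-beta*W_barrier) / (Boltzmann-weighted
    population of the reactant side, n < n_barrier).
    """
    beta = 1.0 / kBT
    ib = int(np.argmin(np.abs(n - n_barrier)))
    dn = n[1] - n[0]
    z_reactant = np.sum(np.exp(-beta * W[:ib + 1])) * dn
    return prefactor * np.exp(-beta * W[ib]) / z_reactant

# Invented double-well PMF: minima near n = 5.25 and 5.75, barrier at n = 5.5.
n = np.linspace(5.0, 6.0, 401)
W = 10.0 * np.cos(2 * np.pi * (n - 5.0)) ** 2

k_tst = tst_rate(n, W, n_barrier=5.5)
kappa = 0.4                    # transmission coefficient (assumed value)
k_corrected = kappa * k_tst    # recrossing-corrected rate, k = kappa * k_TST
```

    In the paper's approach the transmission coefficient would come from a reactive flux calculation rather than being fixed by hand; the sketch only shows how it enters as a multiplicative correction to the TST estimate.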

  20. Use of handheld computers in clinical practice: a systematic review.

    PubMed

    Mickan, Sharon; Atherton, Helen; Roberts, Nia Wyn; Heneghan, Carl; Tilson, Julie K

    2014-07-06

    Many healthcare professionals use smartphones and tablets to inform patient care. Contemporary research suggests that handheld computers may support aspects of clinical diagnosis and management. This systematic review was designed to synthesise high quality evidence to answer the question: Does healthcare professionals' use of handheld computers improve their access to information and support clinical decision making at the point of care? A detailed search was conducted using Cochrane, MEDLINE, EMBASE, PsycINFO, and the Science and Social Science Citation Indices since 2001. Interventions promoting healthcare professionals seeking information or making clinical decisions using handheld computers were included. Classroom learning and the use of laptop computers were excluded. Two authors independently selected studies, assessed quality using the Cochrane Risk of Bias tool and extracted data. High levels of data heterogeneity negated statistical synthesis. Instead, evidence for effectiveness was summarised narratively, according to each study's aim for assessing the impact of handheld computer use. We included seven randomised trials investigating medical or nursing staff's use of Personal Digital Assistants. Effectiveness was demonstrated across three distinct functions that emerged from the data: accessing information for clinical knowledge, adherence to guidelines and diagnostic decision making. When healthcare professionals used handheld computers to access clinical information, their knowledge improved significantly more than that of peers who used paper resources. When clinical guideline recommendations were presented on handheld computers, clinicians made significantly safer prescribing decisions and adhered more closely to recommendations than peers using paper resources. 
Finally, healthcare professionals made significantly more appropriate diagnostic decisions using clinical decision making tools on handheld computers compared to colleagues who did not have access to these tools. For these clinical decisions, the numbers needed to test/screen were all less than 11. Healthcare professionals' use of handheld computers may improve their information seeking, adherence to guidelines and clinical decision making. Handheld computers can provide real-time access to and analysis of clinical information. The integration of clinical decision support systems within handheld computers offers clinicians the highest level of synthesised evidence at the point of care. Future research is needed to replicate these early results and to identify beneficial clinical outcomes.

  1. Use of handheld computers in clinical practice: a systematic review

    PubMed Central

    2014-01-01

    Background Many healthcare professionals use smartphones and tablets to inform patient care. Contemporary research suggests that handheld computers may support aspects of clinical diagnosis and management. This systematic review was designed to synthesise high quality evidence to answer the question: Does healthcare professionals’ use of handheld computers improve their access to information and support clinical decision making at the point of care? Methods A detailed search was conducted using Cochrane, MEDLINE, EMBASE, PsycINFO, and the Science and Social Science Citation Indices since 2001. Interventions promoting healthcare professionals seeking information or making clinical decisions using handheld computers were included. Classroom learning and the use of laptop computers were excluded. Two authors independently selected studies, assessed quality using the Cochrane Risk of Bias tool and extracted data. High levels of data heterogeneity negated statistical synthesis. Instead, evidence for effectiveness was summarised narratively, according to each study’s aim for assessing the impact of handheld computer use. Results We included seven randomised trials investigating medical or nursing staff’s use of Personal Digital Assistants. Effectiveness was demonstrated across three distinct functions that emerged from the data: accessing information for clinical knowledge, adherence to guidelines and diagnostic decision making. When healthcare professionals used handheld computers to access clinical information, their knowledge improved significantly more than that of peers who used paper resources. When clinical guideline recommendations were presented on handheld computers, clinicians made significantly safer prescribing decisions and adhered more closely to recommendations than peers using paper resources. 
Finally, healthcare professionals made significantly more appropriate diagnostic decisions using clinical decision making tools on handheld computers compared to colleagues who did not have access to these tools. For these clinical decisions, the numbers needed to test/screen were all less than 11. Conclusion Healthcare professionals’ use of handheld computers may improve their information seeking, adherence to guidelines and clinical decision making. Handheld computers can provide real-time access to and analysis of clinical information. The integration of clinical decision support systems within handheld computers offers clinicians the highest level of synthesised evidence at the point of care. Future research is needed to replicate these early results and to identify beneficial clinical outcomes. PMID:24998515

  2. Bringing computational science to the public.

    PubMed

    McDonagh, James L; Barker, Daniel; Alderson, Rosanna G

    2016-01-01

    The increasing use of computers in science allows for the scientific analyses of large datasets at an increasing pace. We provided examples and interactive demonstrations at Dundee Science Centre as part of the 2015 Women in Science festival, to present aspects of computational science to the general public. We used low-cost Raspberry Pi computers to provide hands-on experience in computer programming and demonstrated the application of computers to biology. Computer games were used as a means to introduce computers to younger visitors. The success of the event was evaluated by voluntary feedback forms completed by visitors, in conjunction with our own self-evaluation. This work builds on the original work of the 4273π bioinformatics education program of Barker et al. (2013, BMC Bioinform. 14:243). 4273π provides open source education materials in bioinformatics. This work looks at the potential to adapt similar materials for public engagement events. It appears, at least in our small sample of visitors (n = 13), that basic computational science can be conveyed to people of all ages by means of interactive demonstrations. Children as young as five were able to successfully edit simple computer programs with supervision. This was, in many cases, their first experience of computer programming. The feedback was predominantly positive, showing strong support for improving computational science education, but also included suggestions for improvement. Our conclusions are necessarily preliminary. However, feedback forms suggest the methods were generally well received among the participants: "Easy to follow. Clear explanation" and "Very easy. Demonstrators were very informative." Our event, held at a local Science Centre in Dundee, demonstrates that computer games and programming activities suitable for young children can be performed alongside a more specialised and applied introduction to computational science for older visitors.

  3. Computer Science and Telecommunications Board summary of activities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blumenthal, M.S.

    1992-03-27

    The Computer Science and Telecommunications Board (CSTB) considers technical and policy issues pertaining to computer science, telecommunications, and associated technologies. CSTB actively disseminates the results of its completed projects to those in a position to help implement their recommendations or otherwise use their insights. It provides a forum for the exchange of information on computer science, computing technology, and telecommunications. This report discusses the major accomplishments of CSTB.

  4. Hispanic women overcoming deterrents to computer science: A phenomenological study

    NASA Astrophysics Data System (ADS)

    Herling, Lourdes

    The products of computer science are important to all aspects of society and are tools in the solution of the world's problems. It is, therefore, troubling that the United States faces a shortage of qualified graduates in computer science. The proportion of women and minorities in computer science is significantly lower than the percentage of the U.S. population which they represent. Overall enrollment in computer science programs has continued to decline, with the enrollment of women declining at a higher rate than that of men. This study addressed three aspects of underrepresentation about which there has been little previous research: it addressed the computing disciplines specifically rather than embedding them within the STEM disciplines; it asked what attracts women and minorities to computer science; and it addressed race/ethnicity and gender in conjunction rather than in isolation. Since women of underrepresented ethnicities are more severely underrepresented than women in general, it is important to consider whether race and ethnicity play a role in addition to gender, as has been suggested by previous research. Therefore, this study examined what attracted Hispanic women to computer science specifically. The study determined whether being subjected to multiple marginalizations, female and Hispanic, played a role in the experiences of Hispanic women currently in computer science. The study found five emergent themes within the experiences of Hispanic women in computer science. Encouragement and role models strongly influenced not only the participants' choice to major in the field, but their persistence as well. Most of the participants experienced a negative atmosphere and feelings of not fitting in while in college and industry. The interdisciplinary nature of computer science was the most common aspect that attracted the participants to computer science. 
The aptitudes participants commonly believed are needed for success in computer science are the twenty-first-century skills of problem solving, creativity, and critical thinking. While not all of the participants had experience with computers or programming prior to attending college, prior experience played a role in the self-confidence of those who did.

  5. Research in applied mathematics, numerical analysis, and computer science

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.

  6. ENergy and Power Evaluation Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1996-11-01

    In the late 1970s, national and international attention began to focus on energy issues. Efforts were initiated to design and test analytical tools that could be used to assist energy planners in evaluating energy systems, particularly in developing countries. In 1984, the United States Department of Energy (DOE) commissioned Argonne National Laboratory's Decision and Information Sciences Division (DIS) to incorporate a set of analytical tools into a personal computer-based package for distribution in developing countries. The package developed by DIS staff, the ENergy and Power Evaluation Program (ENPEP), covers the range of issues that energy planners must face: economic development, energy demand projections, supply-and-demand balancing, energy system expansion, and environmental impact analysis. Following the original DOE-supported development effort, the International Atomic Energy Agency (IAEA), with assistance from the US Department of State (DOS) and the US Department of Energy (DOE), provided ENPEP training, distribution, and technical support to many countries. ENPEP is now in use in over 60 countries and is an international standard for energy planning tools. More than 500 energy experts have been trained in the use of the entire ENPEP package or some of its modules during the international training courses organized by the IAEA in collaboration with Argonne's Decision and Information Sciences (DIS) Division and the Division of Educational Programs (DEP). This report contains the ENPEP program, which can be downloaded from the internet. The report describes the ENPEP program and provides news, forums, online support, and contacts.

  7. Gender Differences in the Use of Computers, Programming, and Peer Interactions in Computer Science Classrooms

    ERIC Educational Resources Information Center

    Stoilescu, Dorian; Egodawatte, Gunawardena

    2010-01-01

    Research shows that female and male students in undergraduate computer science programs view computer culture differently. Female students are interested more in the use of computers than in doing programming, whereas male students see computer science mainly as a programming activity. The overall purpose of our research was not to find new…

  8. Modeling Reality - How Computers Mirror Life

    NASA Astrophysics Data System (ADS)

    Bialynicki-Birula, Iwo; Bialynicka-Birula, Iwona

    2005-01-01

    The book Modeling Reality covers a wide range of fascinating subjects, accessible to anyone who wants to learn about the use of computer modeling to solve a diverse range of problems but who does not possess specialized training in mathematics or computer science. The material presented is pitched at the level of high-school graduates, even though it covers some advanced topics (cellular automata, Shannon's measure of information, deterministic chaos, fractals, game theory, neural networks, genetic algorithms, and Turing machines). These advanced topics are explained in terms of well-known simple concepts: cellular automata via the Game of Life, Shannon's formula via the game of twenty questions, game theory via a television quiz, and so on. The book is unique in explaining in a straightforward, yet complete, fashion many important ideas related to various models of reality and their applications. Twenty-five programs, written especially for this book, are provided on an accompanying CD. They greatly enhance its pedagogical value and make learning even the more complex topics a pleasure.
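    Of the advanced topics listed above, cellular automata are the easiest to reproduce in a few lines. A minimal sketch of Conway's Game of Life (the "Game of Life" the book uses as its entry point); the toroidal boundary and the blinker example are our choices for the sketch, not taken from the book's CD programs:

```python
import numpy as np

def life_step(grid):
    """One synchronous update of Conway's Game of Life on a toroidal grid."""
    # Count the eight neighbours of every cell via periodic shifts.
    neighbours = sum(
        np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
        for dy in (-1, 0, 1) for dx in (-1, 0, 1)
        if (dy, dx) != (0, 0)
    )
    alive = grid.astype(bool)
    # Birth with exactly 3 neighbours; survival with 2 or 3.
    return ((neighbours == 3) | (alive & (neighbours == 2))).astype(np.uint8)

# A "blinker": three live cells in a row oscillate with period 2.
grid = np.zeros((5, 5), dtype=np.uint8)
grid[2, 1:4] = 1
```

    Applying life_step twice returns the blinker to its starting configuration, which is the period-2 oscillation the rules predict.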

  9. Computer-aided discovery of a metal-organic framework with superior oxygen uptake.

    PubMed

    Moghadam, Peyman Z; Islamoglu, Timur; Goswami, Subhadip; Exley, Jason; Fantham, Marcus; Kaminski, Clemens F; Snurr, Randall Q; Farha, Omar K; Fairen-Jimenez, David

    2018-04-11

    Current advances in materials science have resulted in the rapid emergence of thousands of functional adsorbent materials in recent years. This clearly creates multiple opportunities for their potential application, but it also creates the following challenge: how does one identify the most promising structures, among the thousands of possibilities, for a particular application? Here, we present a case of computer-aided material discovery, in which we complete the full cycle from computational screening of metal-organic framework materials for oxygen storage, to identification, synthesis and measurement of oxygen adsorption in the top-ranked structure. We introduce an interactive visualization concept to analyze over 1000 unique structure-property plots in five dimensions and delimit the relationships between structural properties and oxygen adsorption performance at different pressures for 2932 already-synthesized structures. We also report a world-record-holding material for oxygen storage, UMCM-152, which delivers 22.5% more oxygen than the best previously known material, to the best of our knowledge.
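    The screen-then-rank workflow the abstract describes can be sketched generically: tabulate structural descriptors for every candidate, filter by thresholds, and rank the survivors by the target property. Everything below is invented for illustration; the descriptor names, thresholds, and synthetic capacity model are assumptions, and only the count of 2932 structures comes from the abstract:

```python
import numpy as np

rng = np.random.default_rng(0)
n_structures = 2932  # size of the screened set in the study

# Hypothetical descriptor table, one row per structure:
# void fraction, gravimetric surface area, pore diameter, and a
# synthetic "deliverable O2 capacity" standing in for simulation output.
void_frac = rng.uniform(0.2, 0.9, n_structures)
surface_area = rng.uniform(500.0, 6000.0, n_structures)   # m^2/g
pore_d = rng.uniform(4.0, 20.0, n_structures)             # angstrom
capacity = 10.0 * void_frac + 0.002 * surface_area \
    + rng.normal(0.0, 1.0, n_structures)                  # arbitrary units

# Screen: keep structures whose geometry passes simple thresholds,
# then rank the survivors by predicted deliverable capacity.
mask = (void_frac > 0.5) & (surface_area > 2000.0) & (pore_d > 6.0)
candidates = np.flatnonzero(mask)
ranked = candidates[np.argsort(capacity[candidates])[::-1]]
print(f"{mask.sum()} structures pass screening; top candidate index: {ranked[0]}")
```

    In the actual study the capacity values would come from grand canonical Monte Carlo adsorption simulations and the analysis would span five descriptor dimensions interactively; the sketch only shows the filter-and-rank skeleton.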

  10. Opportunities for Computational Discovery in Basic Energy Sciences

    NASA Astrophysics Data System (ADS)

    Pederson, Mark

    2011-03-01

    An overview of the broad-ranging support of computational physics and computational science within the Department of Energy Office of Science will be provided. Computation, as the third branch of physics, is supported by all six offices (Advanced Scientific Computing, Basic Energy, Biological and Environmental, Fusion Energy, High-Energy Physics, and Nuclear Physics). Support focuses on hardware, software and applications. Most opportunities within the fields of condensed-matter physics, chemical physics and materials sciences are supported by the Office of Basic Energy Sciences (BES) or through partnerships between BES and the Office of Advanced Scientific Computing. Activities include radiation sciences, catalysis, combustion, materials in extreme environments, energy-storage materials, light harvesting and photovoltaics, solid-state lighting and superconductivity. A summary of two recent reports by the computational materials and chemical communities on the role of computation during the next decade will be provided. In addition to materials and chemistry challenges specific to energy sciences, the issues identified include a focus on the role of the domain scientist in integrating, expanding and sustaining applications-oriented capabilities on evolving high-performance computing platforms, and on the role of computation in accelerating the development of innovative technologies.

  11. Research | Computational Science | NREL

    Science.gov Websites

    Research: NREL's computational science experts use advanced high-performance computing (HPC) technologies, thereby accelerating the transformation of our nation's energy system. Enabling High-Impact Research: NREL's computational science capabilities enable high-impact research. Some recent examples

  12. Science beyond fiction. A revolution of knowledge transfer in research, education, and practice is on the horizon.

    PubMed

    Ammann, Alexander

    2016-01-01

    "Digitality" (as opposed to "digitalization"--the conversion from the analog domain to the digital domain) will open up a whole new world that does not originate from the analog world. Contemporary research in the field of neural concepts and neuromorphic computing systems will lead to convergences between the world of digitality and the world of neuronality, giving the theme "Knowledge and Culture" a new meaning. The simulation of virtual multidimensional and contextual spaces will transform the transfer of knowledge from a uni- and bidirectional process into an interactive experience. We will learn to learn in a ubiquitous computing environment and will abandon conventional curriculum organization principles. The adaptation of individualized ontologies will result in the emergence of a new world of knowledge in which knowledge evolves from a cultural heritage into a commodity.

  13. 77 FR 45345 - DOE/Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-31

    ... Recompetition results for Scientific Discovery through Advanced Computing (SciDAC) applications Co-design Public... DEPARTMENT OF ENERGY DOE/Advanced Scientific Computing Advisory Committee AGENCY: Office of... the Advanced Scientific Computing Advisory Committee (ASCAC). The Federal Advisory Committee Act (Pub...

  14. 75 FR 64720 - DOE/Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-20

    ... DEPARTMENT OF ENERGY DOE/Advanced Scientific Computing Advisory Committee AGENCY: Department of... the Advanced Scientific Computing Advisory Committee (ASCAC). Federal Advisory Committee Act (Pub. L.... FOR FURTHER INFORMATION CONTACT: Melea Baker, Office of Advanced Scientific Computing Research; SC-21...

  15. Student science enrichment training program. Progress report, June 1, 1991--May 31, 1992

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sandhu, S.S.

    1992-04-21

    The Historically Black Colleges and Universities wing of the United States Department of Energy (DOE) provided funds to Claflin College, Orangeburg, S.C., to conduct a student Science Enrichment Training Program for a period of six weeks during the summer of 1991. Thirty participants were selected from a pool of applicants generated by the High School Seniors and Juniors and the Freshmen class of 1990-1991 at Claflin College. The program primarily focused on high-ability students with potential for Science, Mathematics and Engineering careers. The major objectives of the program were (I) to increase the pool of well-qualified, college-entering minority students who will elect to go into Physical Sciences and Engineering and (II) to increase the enrollment in Chemistry and preprofessional (Pre-Med, Pre-Dent, etc.) majors at Claflin College by encouraging Claflin students to participate in the summer academic program. The summer academic program consisted of Chemistry and Computer Science training, with emphasis upon laboratory experience and research. Visits to scientific and industrial laboratories were arranged. Guest speakers, drawn from academia, industry and several federal agencies, addressed the participants on the future role of Science in the industrial growth of the United States of America and also acted as role models for the participants. Several videos and films emphasizing the role of Science in human life were also screened.

  16. Artificial neural networks in evaluation and optimization of modified release solid dosage forms.

    PubMed

    Ibrić, Svetlana; Djuriš, Jelena; Parojčić, Jelena; Djurić, Zorica

    2012-10-18

    Implementation of the Quality by Design (QbD) approach in pharmaceutical development has compelled researchers in the pharmaceutical industry to employ Design of Experiments (DoE) as a statistical tool in product development. Among all DoE techniques, response surface methodology (RSM) is the one most frequently used. Progress in computer science has had an impact on pharmaceutical development as well. Simultaneously with the implementation of statistical methods, machine learning tools took an important place in drug formulation. Twenty years ago, the first papers describing the application of artificial neural networks in optimization of modified release products appeared. Since then, a lot of work has been done towards implementation of new techniques, especially Artificial Neural Networks (ANN), in modeling of production, drug release and drug stability of modified release solid dosage forms. The aim of this paper is to review artificial neural networks in evaluation and optimization of modified release solid dosage forms.
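
    As an illustrative aside (not drawn from the review itself), the second-order polynomial fit at the heart of response surface methodology can be sketched in a few lines; the factor names and data below are hypothetical:

```python
# Minimal response-surface (RSM) sketch: fit a second-order model
# y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
# to Design-of-Experiments data. Factors and response are hypothetical.
import numpy as np

# Hypothetical central-composite-style design in coded factor levels
# (x1 = polymer content, x2 = compression force; illustrative only)
x1 = np.array([-1, -1, 1, 1, 0, 0, 0, -1.41, 1.41, 0, 0], float)
x2 = np.array([-1, 1, -1, 1, 0, 0, 0, 0, 0, -1.41, 1.41], float)

# Synthetic response (e.g. % drug released at 8 h) generated from a
# known quadratic surface so the fit can be checked against it
true = lambda a, b: 50 + 8*a - 5*b - 3*a**2 + 2*b**2 + 1.5*a*b
y = true(x1, x2)

# Design matrix for the full second-order model, solved by least squares
X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1*x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(coef, 3))  # recovers [50, 8, -5, -3, 2, 1.5]
```

    With real DoE data the fitted coefficients would be tested for significance and the surface optimized over the factor ranges.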

  17. Artificial Neural Networks in Evaluation and Optimization of Modified Release Solid Dosage Forms

    PubMed Central

    Ibrić, Svetlana; Djuriš, Jelena; Parojčić, Jelena; Djurić, Zorica

    2012-01-01

    Implementation of the Quality by Design (QbD) approach in pharmaceutical development has compelled researchers in the pharmaceutical industry to employ Design of Experiments (DoE) as a statistical tool in product development. Among all DoE techniques, response surface methodology (RSM) is the one most frequently used. Progress in computer science has had an impact on pharmaceutical development as well. Simultaneously with the implementation of statistical methods, machine learning tools took an important place in drug formulation. Twenty years ago, the first papers describing the application of artificial neural networks in optimization of modified release products appeared. Since then, a lot of work has been done towards implementation of new techniques, especially Artificial Neural Networks (ANN), in modeling of production, drug release and drug stability of modified release solid dosage forms. The aim of this paper is to review artificial neural networks in evaluation and optimization of modified release solid dosage forms. PMID:24300369

  18. First-principles Study of Phenol Hydrogenation on Pt and Ni Catalysts in Aqueous Phase

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoon, Yeohoon; Rousseau, Roger J.; Weber, Robert S.

    2014-07-23

    The effects of the aqueous phase on the reactivity of phenol hydrogenation over Pt and Ni catalysts were investigated using density functional theory based ab initio molecular dynamics (AIMD) calculations. The adsorption of phenol and the first hydrogenation steps via three carbon positions (ortho, meta and para) with respect to the phenolic OH group were studied in both vacuum and liquid phase conditions. To gain insight into how the aqueous phase affects the metal catalyst surface, water environments of increasing size, including a singly adsorbed water molecule, a monolayer (9 water molecules), a double layer (24 water molecules), and bulk liquid water (52 water molecules), were modeled on the Pt(111) and Ni(111) surfaces. Compared to the vacuum/metal interfaces, AIMD simulation results suggest that the aqueous Pt(111) and Ni(111) interfaces have a metal work function that is lower by 0.8-0.9 eV, making the metals in aqueous phase stronger reducing agents and poorer oxidizing agents. Phenol adsorption from the aqueous phase is found to be slightly weaker than from the vapor phase. The first hydrogenation step of phenol at the ortho position of the phenolic ring is slightly favored over the other two positions. The polarization induced by the surrounding water molecules and the solvation effect play important roles in stabilizing the transition states associated with phenol hydrogenation, lowering the barriers by 0.1-0.4 eV. The detailed discussion of interfacial electrostatics in the current study provides a useful basis for understanding the nature of a broader class of metal-catalyzed reactions in the liquid solution phase. This work was supported by the US Department of Energy (DOE), Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences & Biosciences and Office of Energy Efficiency and Renewable Energy. Pacific Northwest National Laboratory (PNNL) is a multiprogram national laboratory operated for DOE by Battelle. Computing time was granted by the grand challenge of computational catalysis of the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) and by the National Energy Research Scientific Computing Center (NERSC). EMSL is a national scientific user facility located at Pacific Northwest National Laboratory (PNNL) and sponsored by DOE's Office of Biological and Environmental Research.

  19. Delivering The Benefits of Chemical-Biological Integration in ...

    EPA Pesticide Factsheets

    Abstract: Researchers at the EPA's National Center for Computational Toxicology integrate advances in biology, chemistry, and computer science to examine the toxicity of chemicals and help prioritize chemicals for further research based on potential human health risks. The intention of this research program is to quickly evaluate thousands of chemicals for potential risk, but at much reduced cost relative to historical approaches. This work involves computational and data-driven approaches including high-throughput screening, modeling, text-mining and the integration of chemistry, exposure and biological data. We have developed a number of databases and applications that are delivering on the vision of developing a deeper understanding of chemicals and their effects on exposure and biological processes, and that are supporting a large community of scientists in their research efforts. This presentation will provide an overview of our work to bring together diverse large-scale data from the chemical and biological domains, our approaches to integrating and disseminating these data, and the delivery of models supporting computational toxicology. This abstract does not reflect U.S. EPA policy. Presentation at ACS TOXI session on Computational Chemistry and Toxicology in Chemical Discovery and Assessment (QSARs).

  20. Report on Project Action Sheet PP05 task 3 between the U.S. Department of Energy and the Republic of Korea Ministry of Education, Science, and Technology (MEST).

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Snell, Mark Kamerer

    2013-01-01

    This report documents the results of Task 3 of Project Action Sheet PP05 between the United States Department of Energy (DOE) and the Republic of Korea (ROK) Ministry of Education, Science, and Technology (MEST) for support with review of an ROK risk evaluation process. This task was for Sandia National Laboratories to collaborate with the Korea Institute of Nuclear Nonproliferation and Control (KINAC) on several activities concerning how to determine the Probability of Neutralization, PN, and the Probability of System Effectiveness, PE, including: providing descriptions of how combat simulations are used to determine PN and PE; comparing the strengths and weaknesses of two neutralization models (the Neutralization.xls spreadsheet model versus the Brief Adversary Threat-Loss Estimator (BATLE) software); and demonstrating how computer simulations can be used to determine PN. Note that the computer simulation used for the demonstration was the Scenario Toolkit And Generation Environment (STAGE) simulation, which is a stand-alone synthetic tactical simulation sold by Presagis Canada Incorporated. The demonstration is provided in a separate Audio Video Interleave (.AVI) file.
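
    As background (a textbook relation from physical-security vulnerability assessment, not taken from the report itself), system effectiveness is commonly estimated from the probability of interruption and the probability of neutralization; a minimal sketch with hypothetical values:

```python
# Illustrative relation: system effectiveness P_E = P_I * P_N, where
# P_I is the probability of interruption (the response force arrives
# in time) and P_N the probability of neutralization. Values below
# are hypothetical, not the report's results.
def system_effectiveness(p_interruption, p_neutralization):
    # Both inputs are probabilities in [0, 1]
    return p_interruption * p_neutralization

print(round(system_effectiveness(0.9, 0.7), 2))  # 0.63
```

    Combat simulations such as those discussed in the task estimate the P_N input to this product from many engagement outcomes.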

  1. The Effectiveness of Computer-Assisted Instruction to Teach Physical Examination to Students and Trainees in the Health Sciences Professions: A Systematic Review and Meta-Analysis

    PubMed Central

    Tomesko, Jennifer; Touger-Decker, Riva; Dreker, Margaret; Zelig, Rena; Parrott, James Scott

    2017-01-01

    Purpose: To explore knowledge and skill acquisition outcomes related to learning physical examination (PE) through computer-assisted instruction (CAI) compared with a face-to-face (F2F) approach. Method: A systematic review and meta-analysis of literature published between January 2001 and December 2016 was conducted. Databases searched included Medline, Cochrane, CINAHL, ERIC, Ebsco, Scopus, and Web of Science. Studies were synthesized by study design, intervention, and outcomes. Statistical analyses included a DerSimonian-Laird random-effects model. Results: In total, 7 studies were included in the review, and 5 in the meta-analysis. There were no statistically significant differences for knowledge (mean difference [MD] = 5.39, 95% confidence interval [CI]: −2.05 to 12.84) or skill acquisition (MD = 0.35, 95% CI: −5.30 to 6.01). Conclusions: The evidence does not suggest a strong consistent preference for either CAI or F2F instruction to teach students/trainees PE. Further research is needed to identify conditions under which knowledge and skill acquisition outcomes favor one mode of instruction over the other. PMID:29349338
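
    For context, the DerSimonian-Laird random-effects model named in the Method can be sketched directly from its defining formulas; the study effects and variances below are made up, not the review's data:

```python
# DerSimonian-Laird random-effects pooling: estimate between-study
# variance tau^2 from Cochran's Q, then pool with inverse-variance
# weights 1/(v_i + tau^2). Inputs below are hypothetical.
import math

def dersimonian_laird(effects, variances):
    k = len(effects)
    w = [1.0 / v for v in variances]                       # fixed-effect weights
    mu_fe = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - mu_fe) ** 2 for wi, yi in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)                     # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]           # random-effects weights
    mu = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return mu, (mu - 1.96 * se, mu + 1.96 * se), tau2

# Hypothetical per-study mean differences (CAI minus F2F) and variances
mu, ci, tau2 = dersimonian_laird([4.0, 7.5, -1.0, 9.0, 3.5],
                                 [4.0, 6.0, 5.0, 8.0, 3.0])
print(round(mu, 2), [round(x, 2) for x in ci])
```

    A 95% CI that crosses zero, as in both of the review's pooled estimates, indicates no statistically significant difference between instruction modes.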

  2. Scientific Argumentation and Deliberative Democracy: An Incompatible Mix in School Science?

    ERIC Educational Resources Information Center

    Erduran, Sibel; Kaya, Ebru

    2016-01-01

    The article investigates how deliberative democracy is related to argumentation in school science. We use examples of political models of deliberative democracy to synthesize implications for argumentation in science teaching and learning. Some key questions guided our approach: How does democratic deliberation work and how does it relate to…

  3. Science and Engineering Indicators 2010. NSB 10-01

    ERIC Educational Resources Information Center

    Lehming, Rolf F.; Alt, Martha Naomi; Chen, Xianglei; Hall, Leslie; Burton, Lawrence; Burrelli, Joan S.; Kannankutty, Nirmala; Proudfoot, Steven; Regets, Mark C.; Boroush, Mark; Moris, Francisco A.; Wolfe, Raymond M.; Britt, Ronda; Christovich, Leslie; Hill, Derek; Falkenheim, Jaquelina C.; Dunnigan, Paula C.

    2010-01-01

    "Science and Engineering Indicators" (SEI) is first and foremost a volume of record comprising the major high-quality quantitative data on the U.S. and international science and engineering enterprise. SEI is factual and policy neutral. It does not offer policy options, and it does not make policy recommendations. SEI employs a variety…

  4. Science and Engineering Indicators 2012. NSB 12-01

    ERIC Educational Resources Information Center

    National Science Foundation, 2012

    2012-01-01

    Science and Engineering Indicators (SEI) is first and foremost a volume of record comprising the major high-quality quantitative data on the U.S. and international science and engineering enterprise. SEI is factual and policy neutral. It does not offer policy options, and it does not make policy recommendations. SEI employs a variety of…

  5. Towards a unified picture of the water self-ions at the air-water interface: a density functional theory perspective

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baer, Marcel D.; Kuo, I-F W.; Tobias, Douglas J.

    2014-07-17

    The propensities of the water self-ions, H3O+ and OH-, for the air-water interface have implications for interfacial acid-base chemistry. Despite numerous experimental and computational studies, no consensus has been reached on the question of whether or not H3O+ and/or OH- prefer to be at the water surface or in the bulk. Here we report a molecular dynamics simulation study of the bulk vs. interfacial behavior of H3O+ and OH- that employs forces derived from density functional theory with a generalized gradient approximation exchange-correlation functional (specifically, BLYP) and empirical dispersion corrections. We computed the potential of mean force (PMF) for H3O+ as a function of the position of the ion in a 215-molecule water slab. The PMF is flat, suggesting that H3O+ has equal propensity for the air-water interface and the bulk. We compare the PMF for H3O+ to our previously computed PMF for OH- adsorption, which contains a shallow minimum at the interface, and we explore how differences in solvation of each ion at the interface vs. the bulk are connected with interfacial propensity. We find that the solvation shell of H3O+ is only slightly dependent on its position in the water slab, while OH- partially desolvates as it approaches the interface, and we examine how this difference in solvation behavior is manifested in the electronic structure and chemistry of the two ions. DJT was supported by National Science Foundation grant CHE-0909227. CJM was supported by the U.S. Department of Energy's (DOE) Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences and Biosciences. Pacific Northwest National Laboratory (PNNL) is operated for the Department of Energy by Battelle. The potential of mean force calculations required resources of the Oak Ridge Leadership Computing Facility at the Oak Ridge National Laboratory, which is supported by the Office of Science of the U.S. Department of Energy under Contract No. DE-AC05-00OR22725. The remaining simulations and analysis used resources of the National Energy Research Scientific Computing Center at Lawrence Berkeley National Laboratory, which is supported by the Office of Science of the U.S. Department of Energy under Contract No. DE-AC02-05CH11231. MDB is grateful for the support of the Linus Pauling Distinguished Postdoctoral Fellowship Program at PNNL.
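
    For illustration (a standard statistical-mechanics relation, not the paper's actual workflow), a PMF along the slab normal follows from the ion's sampled probability profile via W(z) = -kB T ln[P(z)/P_ref]; the histogram here is synthetic:

```python
# How a potential of mean force (PMF) along the slab normal relates to
# an ion's sampled probability distribution: W(z) = -kB*T*ln[P(z)/P_ref].
# The bin probabilities below are synthetic stand-ins for AIMD sampling.
import math

KB = 0.0019872041  # Boltzmann constant in kcal/(mol K)
T = 300.0          # temperature in K

# Hypothetical relative probabilities of finding the ion in z-bins,
# ordered from bulk (left) toward the interface (right)
p = [1.00, 1.05, 0.95, 1.10, 1.40, 0.60]

p_ref = p[0]  # reference the PMF to the bulk bin, so W(bulk) = 0
pmf = [-KB * T * math.log(pi / p_ref) for pi in p]
print([round(w, 3) for w in pmf])  # kcal/mol; zero in the bulk by construction
```

    A flat profile would correspond to equal interface/bulk propensity (as reported for H3O+), while a dip just below the surface would mark a shallow interfacial minimum (as reported for OH-).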

  6. NASA's computer science research program

    NASA Technical Reports Server (NTRS)

    Larsen, R. L.

    1983-01-01

    Following a major assessment of NASA's computing technology needs, a new program of computer science research has been initiated by the Agency. The program includes work in concurrent processing, management of large scale scientific databases, software engineering, reliable computing, and artificial intelligence. The program is driven by applications requirements in computational fluid dynamics, image processing, sensor data management, real-time mission control and autonomous systems. It consists of university research, in-house NASA research, and NASA's Research Institute for Advanced Computer Science (RIACS) and Institute for Computer Applications in Science and Engineering (ICASE). The overall goal is to provide the technical foundation within NASA to exploit advancing computing technology in aerospace applications.

  7. Girls Save the World through Computer Science

    ERIC Educational Resources Information Center

    Murakami, Christine

    2011-01-01

    It's no secret that fewer and fewer women are entering computer science fields. Attracting high school girls to computer science is only part of the solution. Retaining them while they are in higher education or the workforce is also a challenge. To solve this, there is a need to show girls that computer science is a wide-open field that offers…

  8. The Assessment of Taiwanese College Students' Conceptions of and Approaches to Learning Computer Science and Their Relationships

    ERIC Educational Resources Information Center

    Liang, Jyh-Chong; Su, Yi-Ching; Tsai, Chin-Chung

    2015-01-01

    The aim of this study was to explore Taiwanese college students' conceptions of and approaches to learning computer science and then explore the relationships between the two. Two surveys, Conceptions of Learning Computer Science (COLCS) and Approaches to Learning Computer Science (ALCS), were administered to 421 college students majoring in…

  9. Hispanic Women Overcoming Deterrents to Computer Science: A Phenomenological Study

    ERIC Educational Resources Information Center

    Herling, Lourdes

    2011-01-01

    The products of computer science are important to all aspects of society and are tools in the solution of the world's problems. It is, therefore, troubling that the United States faces a shortage in qualified graduates in computer science. The number of women and minorities in computer science is significantly lower than the percentage of the…

  10. The Effects of Integrating Service Learning into Computer Science: An Inter-Institutional Longitudinal Study

    ERIC Educational Resources Information Center

    Payton, Jamie; Barnes, Tiffany; Buch, Kim; Rorrer, Audrey; Zuo, Huifang

    2015-01-01

    This study is a follow-up to one published in computer science education in 2010 that reported preliminary results showing a positive impact of service learning on student attitudes associated with success and retention in computer science. That paper described how service learning was incorporated into a computer science course in the context of…

  11. Non-Determinism: An Abstract Concept in Computer Science Studies

    ERIC Educational Resources Information Center

    Armoni, Michal; Gal-Ezer, Judith

    2007-01-01

    Non-determinism is one of the most important, yet abstract, recurring concepts of Computer Science. It plays an important role in Computer Science areas such as formal language theory, computability theory, distributed computing, and operating systems. We conducted a series of studies on the perception of non-determinism. In the current research,…

  12. An Investigation of Primary School Science Teachers' Use of Computer Applications

    ERIC Educational Resources Information Center

    Ocak, Mehmet Akif; Akdemir, Omur

    2008-01-01

    This study investigated the level and frequency of science teachers' use of computer applications as an instructional tool in the classroom. The manner and frequency of science teachers' use of computer, their perceptions about integration of computer applications, and other factors contributed to changes in their computer literacy are…

  13. Methodical Approaches to Teaching of Computer Modeling in Computer Science Course

    ERIC Educational Resources Information Center

    Rakhimzhanova, B. Lyazzat; Issabayeva, N. Darazha; Khakimova, Tiyshtik; Bolyskhanova, J. Madina

    2015-01-01

    The purpose of this study was to justify the technique for forming a representation of modeling methodology in computer science lessons. The necessity of studying computer modeling is that the current trends of strengthening the general education and worldview functions of computer science define the necessity of additional research of the…

  14. Climate Modeling Computing Needs Assessment

    NASA Astrophysics Data System (ADS)

    Petraska, K. E.; McCabe, J. D.

    2011-12-01

    This paper discusses early findings of an assessment of computing needs for NASA science, engineering and flight communities. The purpose of this assessment is to document a comprehensive set of computing needs that will allow us to better evaluate whether our computing assets are adequately structured to meet evolving demand. The early results are interesting, already pointing out improvements we can make today to get more out of the computing capacity we have, as well as potential game changing innovations for the future in how we apply information technology to science computing. Our objective is to learn how to leverage our resources in the best way possible to do more science for less money. Our approach in this assessment is threefold: Development of use case studies for science workflows; Creating a taxonomy and structure for describing science computing requirements; and characterizing agency computing, analysis, and visualization resources. As projects evolve, science data sets increase in a number of ways: in size, scope, timelines, complexity, and fidelity. Generating, processing, moving, and analyzing these data sets places distinct and discernable requirements on underlying computing, analysis, storage, and visualization systems. The initial focus group for this assessment is the Earth Science modeling community within NASA's Science Mission Directorate (SMD). As the assessment evolves, this focus will expand to other science communities across the agency. We will discuss our use cases, our framework for requirements and our characterizations, as well as our interview process, what we learned and how we plan to improve our materials after using them in the first round of interviews in the Earth Science Modeling community. We will describe our plans for how to expand this assessment, first into the Earth Science data analysis and remote sensing communities, and then throughout the full community of science, engineering and flight at NASA.

  15. Aqueous Cation-Amide Binding: Free Energies and IR Spectral Signatures by Ab Initio Molecular Dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pluharova, Eva; Baer, Marcel D.; Mundy, Christopher J.

    2014-07-03

    Understanding specific ion effects on proteins remains a considerable challenge. N-methylacetamide serves as a useful proxy for the protein backbone that can be well characterized both experimentally and theoretically. The spectroscopic signatures in the amide I band reflecting the strength of the interaction of alkali cations and alkaline earth dications with the carbonyl group remain difficult to assign and controversial to interpret. Herein, we directly compute the IR shifts corresponding to the binding of either sodium or calcium to aqueous N-methylacetamide using ab initio molecular dynamics simulations. We show that the two cations interact with aqueous N-methylacetamide with different affinities and in different geometries. Since sodium exhibits a weak interaction with the carbonyl group, the resulting amide I band is similar to that of an unperturbed carbonyl group undergoing aqueous solvation. In contrast, the stronger calcium binding results in a clear IR shift with respect to N-methylacetamide in pure water. Support from the Czech Ministry of Education (grant LH12001) is gratefully acknowledged. EP thanks the International Max Planck Research School for support and the Alternative Sponsored Fellowship program at Pacific Northwest National Laboratory (PNNL). PJ acknowledges the Praemium Academiae award from the Academy of Sciences. Calculations of the free energy profiles were made possible through a generous allocation of computer time from the North-German Supercomputing Alliance (HLRN). Calculations of vibrational spectra were performed in part using the computational resources of the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory. This work was supported by National Science Foundation grant CHE-0431312. CJM is supported by the U.S. Department of Energy's (DOE) Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences and Biosciences. PNNL is operated for the Department of Energy by Battelle. MDB is grateful for the support of the Linus Pauling Distinguished Postdoctoral Fellowship Program at PNNL.

  16. Research Projects, Technical Reports and Publications

    NASA Technical Reports Server (NTRS)

    Oliger, Joseph

    1996-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under contract with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. A flexible scientific staff is provided through a university faculty visitor program, a post-doctoral program, and a student visitor program. Not only does this provide appropriate expertise, it also introduces scientists outside of NASA to NASA problems. A small group of core RIACS staff provides continuity and interacts with an ARC technical monitor and scientific advisory group to determine the RIACS mission. RIACS activities are reviewed and monitored by a USRA advisory council and the ARC technical monitor. Research at RIACS is currently being done in the following areas: Advanced Methods for Scientific Computing; High Performance Networks. During this report period, Professor Antony Jameson of Princeton University, Professor Wei-Pai Tang of the University of Waterloo, Professor Marsha Berger of New York University, Professor Tony Chan of UCLA, Associate Professor David Zingg of the University of Toronto, Canada, and Assistant Professor Andrew Sohn of the New Jersey Institute of Technology visited RIACS. From January 1, 1996 through September 30, 1996, RIACS had three staff scientists, four visiting scientists, one post-doctoral scientist, three consultants, two research associates and one research assistant. RIACS held a joint workshop with Code 1 on 29-30 July 1996. The workshop was held to discuss needs and opportunities in basic research in computer science in and for NASA applications. There were 14 talks given by NASA, industry and university scientists, and three open discussion sessions. There were approximately fifty participants. A proceedings is being prepared. It is planned to have similar workshops on an annual basis. RIACS technical reports are usually preprints of manuscripts that have been submitted to research journals or conference proceedings. A list of these reports for the period January 1, 1996 through September 30, 1996 is in the Reports and Abstracts section of this report.

  17. Preface: SciDAC 2007

    NASA Astrophysics Data System (ADS)

    Keyes, David E.

    2007-09-01

    It takes a village to perform a petascale computation—domain scientists, applied mathematicians, computer scientists, computer system vendors, program managers, and support staff—and the village was assembled during 24-28 June 2007 in Boston's Westin Copley Place for the third annual Scientific Discovery through Advanced Computing (SciDAC) 2007 Conference. Over 300 registered participants networked around 76 posters, focused on achievements and challenges in 36 plenary talks, and brainstormed in two panels. In addition, with an eye to spreading the vision for simulation at the petascale and to growing the workforce, 115 participants—mostly doctoral students and post-docs complementary to the conferees—were gathered on 29 June 2007 in classrooms of the Massachusetts Institute of Technology for a full day of tutorials on the use of SciDAC software. Eleven SciDAC-sponsored research groups presented their software at an introductory level, in both lecture and hands-on formats that included live runs on a local BlueGene/L. Computation has always been about garnering insight into the behavior of systems too complex to explore satisfactorily by theoretical means alone. Today, however, computation is about much more: scientists and decision makers expect quantitatively reliable predictions from simulations ranging in scale from that of the Earth's climate, down to quarks, and out to colliding black holes. Predictive simulation lies at the heart of policy choices in energy and environment affecting billions of lives and expenditures of trillions of dollars. It is also at the heart of scientific debates on the nature of matter and the origin of the universe. The petascale is barely adequate for such demands and we are barely established at the levels of resolution and throughput that this new scale of computation affords. However, no scientific agenda worldwide is pushing the petascale frontier on all its fronts as vigorously as SciDAC. 
The breadth of this conference archive reflects the philosophy of the SciDAC program, which was introduced as a collaboration of all of the program offices in the Office of Science of the U.S. Department of Energy (DOE) in Fall 2001 and was renewed for a second period of five years in Fall 2006, with additional support in certain areas from the DOE's National Nuclear Security Administration (NNSA) and the U.S. National Science Foundation (NSF). All of the projects in the SciDAC portfolio were represented at the conference and most are captured in this volume. In addition, the Organizing Committee incorporated into the technical program a number of computational science highlights from outside of SciDAC, and, indeed, from outside of the United States. As implied by the title, scientific discovery is the driving deliverable of the SciDAC program, spanning the full range of the DOE Office of Science: accelerator design, astrophysics, chemistry and materials science, climate science, combustion, life science, nuclear physics, plasma physics, and subsurface physics. As articulated in the eponymous report that launched SciDAC, the computational challenges of these diverse areas are remarkably common. Each is profoundly multiscale in space and time and therefore continues to benefit at any margin from access to the largest and fastest computers available. Optimality of representation and execution requires adaptive, scalable mathematical algorithms in both continuous (geometrically complex domain) and discrete (mesh and graph) aspects. Programmability and performance optimality require software environments that both manage the intricate details of the underlying hardware and abstract them for scientific users. Running effectively on remote specialized hardware requires transparent workflow systems. Comprehending the petascale data sets generated in such simulations requires automated tools for data exploration and visualization. 
Archiving and sharing access to this data within the inevitably distributed community of leading scientists requires networked collaborative environments. Each of these elements is a research and development project in its own right. SciDAC does not replace theoretical programs oriented towards long-term basic research, but harvests them for contemporary, complementary state-of-the-art computational campaigns. By clustering researchers from applications and enabling technologies into coordinated, mission-driven projects, SciDAC accomplishes two ends with remarkable effectiveness: (1) it enriches the scientific perspective of both applications and enabling communities through mutual interaction and (2) it leverages, across applications, solutions and effort encapsulated in software. Though SciDAC is unique, its objective of multiscale science at extreme computational scale is shared and approached through different programmatic mechanisms, notably NNSA's ASC program, NSF's Cyberinfrastructure program, and DoD's CREATE program in the U.S., and RIKEN's computational simulation programs in Japan. Representatives of each of these programs were given the podium at SciDAC 2007, and communication occurred that will be valuable towards the ends of complementarity, leverage, and promulgation of best practices. The 2007 conference was graced with additional welcome program announcements. Michael Strayer announced a new program of postdoctoral research fellowships in the enabling technologies. (The computer science post-docs will be named after the late Professor Ken Kennedy, who briefly led the SciDAC project Center for Scalable Application Development Software (CScADS) until his untimely death in February 2007.) IBM announced its petascale BlueGene/P system on June 26. Meanwhile, at ISC07 in Dresden, the semi-annual posting of a revised Top 500 list on June 27 showed several new Top 10 systems accessible to various SciDAC participants.
While SciDAC is dominated in 2007 by the classical scientific pursuit of understanding through reduction to components and isolation of causes and effects, simulation at scale is beginning to offer something even more tantalizing: synthesis and integration of multiple interacting phenomena in complex systems. Indeed, the design-oriented elements of SciDAC, such as accelerator and tokamak modeling, are already emphasizing multiphysics coupling, and climate science has been doing so for years in the coupling of models of the ocean, atmosphere, ice, and land. In one of the panels at SciDAC 2007, leaders of a three-stage `progressive workshop' on exascale simulation for energy and environment (E3) considered prospects for whole-system modeling in a variety of scientific areas within the domain of DOE related to energy, environmental, and global security. Computer vendors were invited to comment on the prospects for delivering exascale computing systems in another panel. The daunting nature of this challenge is summarized with the observation that the peak processing power of the entire Top 500 list of June 2007 is only 0.0052 exaflop/s. It takes the combined power of most of the computers on the internet today worldwide to reach 1 exaflop/s, or 10^18 floating point operations per second. The program of SciDAC 2007 followed a template honed by its predecessor meetings in San Francisco in 2005 and Denver in 2006. The Boston venue permitted outreach to a number of universities in the immediate region and throughout southern New England, including the SciDAC campuses of Boston University, Harvard, and MIT, and a dozen others including most of the Ivy League. Altogether 55 universities, 20 laboratories, 14 private companies, 5 agencies, and 4 countries were represented among the conference and tutorial workshop participants.
Approximately 47% of the conference participants were from government laboratories, 37% from universities, 9% from federal program offices, and 7% from industry. Keys to the success of SciDAC 2007 were the informal poster receptions, coffee breaks, working breakfasts and lunches, and even the `Right-brain Night' featuring artistic statements, both reverent and irreverent, by computational scientists, inspired by their work. The organizers thank the sponsors for their generosity in attracting participants to these informal occasions with sumptuous snacks and beverages: AMD, Cray, DataDirect, IBM, SGI, SiCortex, and the Institute of Physics. A conference as logistically complex as SciDAC 2007 cannot possibly, and should not, be executed primarily by the scientists themselves. It is a great pleasure to acknowledge the many talented staff who contributed to a productive time for all participants and near-perfect adherence to schedule. Chief among them is Betsy Riley, currently detailed from ORNL to the program office in Germantown, with degrees in mathematics and computer science, but a passion for organizing interdisciplinary scientific programs. Betsy staffed the organizing committee during the year of telecon meetings leading up to the conference and masterminded sponsorship, invitations, and the compilation of the proceedings. Assisting her from ORNL in managing the program were Daniel Pack, Angela Beach, and Angela Fincher. Cynthia Latham of ORNL performed admirably in website and graphic design for all aspects of the online and printed materials of the meeting. John Bui, John Smith, and Missy Smith of ORNL ran their customary tight ship with respect to audio-visual execution and capture, assisted by Eric Ecklund and Keith Quinn of the Westin. Pamelia Nixon-Hartje of Ambassador Services was personally invaluable in getting the most out of the hotel and its staff. We thank Jeff Nichols of ORNL for managing the primary subcontract for the meeting.
The SciDAC tutorial program was a joint effort of Professor John Negele of MIT, David Skinner, PI of the SciDAC Outreach Center, and the SciDAC 2007 Chair. Sponsorship from the Outreach Center in the form of travel scholarships for students, and of the local area SciDAC university delegation of BU, Harvard, and MIT for food and facilities, is gratefully acknowledged. Of course, the archival success of a scientific meeting rests with the willingness of the presenters to make the extra effort to package their field-leading science in a form suitable for interaction with colleagues from other disciplines rather than fellow specialists. This goal, oft-stated in the run-up to the meeting, was achieved to an admirable degree, both in the live presentations and in these proceedings. This effort is its own reward, since it leads to enhanced communication and accelerated scientific progress. Our greatest thanks are reserved for Michael Strayer, Associate Director for OASCR and the Director of SciDAC, for envisioning this celebratory meeting three years ago, and sustaining it with his own enthusiasm, in order to provide a highly visible manifestation of the fruits of SciDAC. He and the other Office of Science program managers in attendance and working in Washington, DC to communicate the opportunities afforded by SciDAC deserve the gratitude of a new virtual scientific village created and cemented under the vision of scientific discovery through advanced computing.

David E. Keyes
Fu Foundation Professor of Applied Mathematics

  18. Sampling Approaches for Multi-Domain Internet Performance Measurement Infrastructures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Calyam, Prasad

    2014-09-15

The next generation of high-performance networks being developed in DOE communities is critical for supporting current and emerging data-intensive science applications. The goal of this project is to investigate multi-domain network status sampling techniques and tools to measure/analyze performance, and thereby provide “network awareness” to end-users and network operators in DOE communities. We leverage the infrastructure and datasets available through perfSONAR, which is a multi-domain measurement framework that has been widely deployed in high-performance computing and networking communities; the DOE community is a core developer and the largest adopter of perfSONAR. Our investigations include development of semantic scheduling algorithms, measurement federation policies, and tools to sample multi-domain and multi-layer network status within perfSONAR deployments. We validate our algorithms and policies with end-to-end measurement analysis tools for various monitoring objectives such as network weather forecasting, anomaly detection, and fault-diagnosis. In addition, we develop a multi-domain architecture for an enterprise-specific perfSONAR deployment that can implement monitoring-objective-based sampling and that adheres to any domain-specific measurement policies.
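The monitoring-objective-based sampling described in this abstract can be illustrated with a toy scheduler. The path names, weights, and aging rule below are invented for illustration and are not part of perfSONAR's actual API; a real deployment would derive priorities from the monitoring objective (e.g. forecasting vs. fault-diagnosis) and enforce federation policies per domain.

```python
def select_paths(priorities, budget):
    """Pick the `budget` highest-priority paths to probe this cycle.

    priorities: dict mapping path id -> weight set by the monitoring
    objective. Ties are broken alphabetically for determinism.
    """
    ranked = sorted(priorities, key=lambda p: (-priorities[p], p))
    return ranked[:budget]


def age_priorities(priorities, selected, boost=1.0):
    """Boost every path skipped this cycle so that no path starves
    under a tight measurement budget."""
    return {p: (w if p in selected else w + boost)
            for p, w in priorities.items()}


# One scheduling cycle over four hypothetical inter-domain paths,
# with a budget of two concurrent measurements:
prio = {"anl-ornl": 3.0, "anl-lbl": 1.0, "ornl-pnnl": 2.0, "lbl-bnl": 1.0}
picked = select_paths(prio, 2)           # top two weights win
prio = age_priorities(prio, set(picked)) # skipped paths gain weight
```

The aging step is the simplest way to reconcile objective-driven bias with fairness: heavily weighted paths are sampled often, but low-priority paths are still visited eventually.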

  19. Kenny Gruchalla | NREL

    Science.gov Websites

feature extraction, human-computer interaction, and physics-based modeling. Ph.D., computer science, University of Colorado at Boulder; M.S., computer science, University of Colorado at Boulder; B.S., computer science, New Mexico Institute of Mining and Technology

  20. 41 CFR 102-3.185 - What does this subpart require agencies to do?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... National Academy of Sciences or the National Academy of Public Administration? § 102-3.185 What does this... 41 Public Contracts and Property Management 3 2014-01-01 2014-01-01 false What does this subpart... recommendation provided to an agency by the National Academy of Sciences (NAS) or the National Academy of Public...

  1. 41 CFR 102-3.185 - What does this subpart require agencies to do?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... National Academy of Sciences or the National Academy of Public Administration? § 102-3.185 What does this... 41 Public Contracts and Property Management 3 2012-01-01 2012-01-01 false What does this subpart... recommendation provided to an agency by the National Academy of Sciences (NAS) or the National Academy of Public...

  2. 41 CFR 102-3.185 - What does this subpart require agencies to do?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... National Academy of Sciences or the National Academy of Public Administration? § 102-3.185 What does this... 41 Public Contracts and Property Management 3 2011-01-01 2011-01-01 false What does this subpart... recommendation provided to an agency by the National Academy of Sciences (NAS) or the National Academy of Public...

  3. 41 CFR 102-3.185 - What does this subpart require agencies to do?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... National Academy of Sciences or the National Academy of Public Administration? § 102-3.185 What does this... 41 Public Contracts and Property Management 3 2013-07-01 2013-07-01 false What does this subpart... recommendation provided to an agency by the National Academy of Sciences (NAS) or the National Academy of Public...

  4. Computer-aided design and computer science technology

    NASA Technical Reports Server (NTRS)

    Fulton, R. E.; Voigt, S. J.

    1976-01-01

    A description is presented of computer-aided design requirements and the resulting computer science advances needed to support aerospace design. The aerospace design environment is examined, taking into account problems of data handling and aspects of computer hardware and software. The interactive terminal is normally the primary interface between the computer system and the engineering designer. Attention is given to user aids, interactive design, interactive computations, the characteristics of design information, data management requirements, hardware advancements, and computer science developments.

  5. The effects of integrating service learning into computer science: an inter-institutional longitudinal study

    NASA Astrophysics Data System (ADS)

    Payton, Jamie; Barnes, Tiffany; Buch, Kim; Rorrer, Audrey; Zuo, Huifang

    2015-07-01

This study is a follow-up to one published in Computer Science Education in 2010 that reported preliminary results showing a positive impact of service learning on student attitudes associated with success and retention in computer science. That paper described how service learning was incorporated into a computer science course in the context of the Students & Technology in Academia, Research, and Service (STARS) Alliance, an NSF-supported broadening participation in computing initiative that aims to diversify the computer science pipeline through innovative pedagogy and inter-institutional partnerships. The current paper describes how the STARS Alliance has expanded to diverse institutions, all using service learning as a vehicle for broadening participation in computing and enhancing attitudes and behaviors associated with student success. Results supported the STARS model of service learning for enhancing computing efficacy and computing commitment and for providing diverse students with many personal and professional development benefits.

  6. ICASE Computer Science Program

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The Institute for Computer Applications in Science and Engineering computer science program is discussed in outline form. Information is given on such topics as problem decomposition, algorithm development, programming languages, and parallel architectures.

  7. Final report and recommendations of the ESnet Authentication Pilot Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, G.R.; Moore, J.P.; Athey, C.L.

    1995-01-01

To conduct their work, U.S. Department of Energy (DOE) researchers require access to a wide range of computing systems and information resources outside of their respective laboratories. Electronically communicating with peers using the global Internet has become a necessity to effective collaboration with university, industrial, and other government partners. DOE's Energy Sciences Network (ESnet) needs to be engineered to facilitate this "collaboratory" while ensuring the protection of government computing resources from unauthorized use. Sensitive information and intellectual properties must be protected from unauthorized disclosure, modification, or destruction. In August 1993, DOE funded four ESnet sites (Argonne National Laboratory, Lawrence Livermore National Laboratory, the National Energy Research Supercomputer Center, and Pacific Northwest Laboratory) to begin implementing and evaluating authenticated ESnet services using the advanced Kerberos Version 5. The purpose of this project was to identify, understand, and resolve the technical, procedural, cultural, and policy issues surrounding peer-to-peer authentication in an inter-organization internet. The investigators have concluded that, with certain conditions, Kerberos Version 5 is a suitable technology to enable ESnet users to freely share resources and information without compromising the integrity of their systems and data. The pilot project has demonstrated that Kerberos Version 5 is capable of supporting trusted third-party authentication across an inter-organization internet and that Kerberos Version 5 would be practical to implement across the ESnet community within the U.S. The investigators made several modifications to the Kerberos Version 5 system that are necessary for operation in the current Internet environment and have documented other technical shortcomings that must be addressed before large-scale deployment is attempted.

  8. Next-Generation Climate Modeling Science Challenges for Simulation, Workflow and Analysis Systems

    NASA Astrophysics Data System (ADS)

    Koch, D. M.; Anantharaj, V. G.; Bader, D. C.; Krishnan, H.; Leung, L. R.; Ringler, T.; Taylor, M.; Wehner, M. F.; Williams, D. N.

    2016-12-01

    We will present two examples of current and future high-resolution climate-modeling research that are challenging existing simulation run-time I/O, model-data movement, storage and publishing, and analysis. In each case, we will consider lessons learned as current workflow systems are broken by these large-data science challenges, as well as strategies to repair or rebuild the systems. First we consider the science and workflow challenges to be posed by the CMIP6 multi-model HighResMIP, involving around a dozen modeling groups performing quarter-degree simulations, in 3-member ensembles for 100 years, with high-frequency (1-6 hourly) diagnostics, which is expected to generate over 4PB of data. An example of science derived from these experiments will be to study how resolution affects the ability of models to capture extreme-events such as hurricanes or atmospheric rivers. Expected methods to transfer (using parallel Globus) and analyze (using parallel "TECA" software tools) HighResMIP data for such feature-tracking by the DOE CASCADE project will be presented. A second example will be from the Accelerated Climate Modeling for Energy (ACME) project, which is currently addressing challenges involving multiple century-scale coupled high resolution (quarter-degree) climate simulations on DOE Leadership Class computers. ACME is anticipating production of over 5PB of data during the next 2 years of simulations, in order to investigate the drivers of water cycle changes, sea-level-rise, and carbon cycle evolution. The ACME workflow, from simulation to data transfer, storage, analysis and publication will be presented. Current and planned methods to accelerate the workflow, including implementing run-time diagnostics, and implementing server-side analysis to avoid moving large datasets will be presented.

  9. Applications of Out-of-Domain Knowledge in Students' Reasoning about Computer Program State

    ERIC Educational Resources Information Center

    Lewis, Colleen Marie

    2012-01-01

    To meet a growing demand and a projected deficit in the supply of computer professionals (NCWIT, 2009), it is of vital importance to expand students' access to computer science. However, many researchers in the computer science education community unproductively assume that some students lack an innate ability for computer science and…

  10. Focused Science Delivery makes science make sense.

    Treesearch

    Rachel W. Scheuering; Jamie Barbour

    2004-01-01

    Science does not exist in a vacuum, but reading scientific publications might make you think it does. Although the policy and management implications of their findings could often touch a much wider audience, many scientists write only for the few people in the world who share their area of expertise. In addition, most scientific publications provide information that...

  11. A Cognitive Model for Problem Solving in Computer Science

    ERIC Educational Resources Information Center

    Parham, Jennifer R.

    2009-01-01

    According to industry representatives, computer science education needs to emphasize the processes involved in solving computing problems rather than their solutions. Most of the current assessment tools used by universities and computer science departments analyze student answers to problems rather than investigating the processes involved in…

  12. Approaches to Classroom-Based Computational Science.

    ERIC Educational Resources Information Center

    Guzdial, Mark

    Computational science includes the use of computer-based modeling and simulation to define and test theories about scientific phenomena. The challenge for educators is to develop techniques for implementing computational science in the classroom. This paper reviews some previous work on the use of simulation alone (without modeling), modeling…

  13. Defining Computational Thinking for Mathematics and Science Classrooms

    ERIC Educational Resources Information Center

    Weintrop, David; Beheshti, Elham; Horn, Michael; Orton, Kai; Jona, Kemi; Trouille, Laura; Wilensky, Uri

    2016-01-01

    Science and mathematics are becoming computational endeavors. This fact is reflected in the recently released Next Generation Science Standards and the decision to include "computational thinking" as a core scientific practice. With this addition, and the increased presence of computation in mathematics and scientific contexts, a new…

  14. New evaporator station for the center for accelerator target science

    NASA Astrophysics Data System (ADS)

    Greene, John P.; Labib, Mina

    2018-05-01

As part of an equipment grant provided by DOE-NP for the Center for Accelerator Target Science (CATS) initiative, the procurement of a new, electron beam, high-vacuum deposition system was identified as a priority to ensure reliable and continued availability of high-purity targets. The apparatus is designed to contain two electron beam guns: a standard 4-pocket 270° geometry source as well as an electron bombardment source. The acquisition of this new system allows for the replacement of two outdated and aging vacuum evaporators. Also included is an additional thermal boat source, enhancing our capability within this deposition unit. Recommended specifications for this system included an automated, high-vacuum pumping station and a deposition chamber with a rotating and heated substrate holder for uniform coating capabilities, incorporating computer-controlled state-of-the-art thin film technologies. Design specifications, enhanced capabilities, and the necessary mechanical modifications for our target work are discussed.

  15. A personal perspective on modelling the climate system.

    PubMed

    Palmer, T N

    2016-04-01

    Given their increasing relevance for society, I suggest that the climate science community itself does not treat the development of error-free ab initio models of the climate system with sufficient urgency. With increasing levels of difficulty, I discuss a number of proposals for speeding up such development. Firstly, I believe that climate science should make better use of the pool of post-PhD talent in mathematics and physics, for developing next-generation climate models. Secondly, I believe there is more scope for the development of modelling systems which link weather and climate prediction more seamlessly. Finally, here in Europe, I call for a new European Programme on Extreme Computing and Climate to advance our ability to simulate climate extremes, and understand the drivers of such extremes. A key goal for such a programme is the development of a 1 km global climate system model to run on the first exascale supercomputers in the early 2020s.

  16. ESnet authentication services and trust federations

    NASA Astrophysics Data System (ADS)

    Muruganantham, Dhivakaran; Helm, Mike; Genovese, Tony

    2005-01-01

    ESnet provides authentication services and trust federation support for SciDAC projects, collaboratories, and other distributed computing applications. The ESnet ATF team operates the DOEGrids Certificate Authority, available to all DOE Office of Science programs, plus several custom CAs, including one for the National Fusion Collaboratory and one for NERSC. The secure hardware and software environment developed to support CAs is suitable for supporting additional custom authentication and authorization applications that your program might require. Seamless, secure interoperation across organizational and international boundaries is vital to collaborative science. We are fostering the development of international PKI federations by founding the TAGPMA, the American regional PMA, and the worldwide IGTF Policy Management Authority (PMA), as well as participating in European and Asian regional PMAs. We are investigating and prototyping distributed authentication technology that will allow us to support the "roaming scientist" (distributed wireless via eduroam), as well as more secure authentication methods (one-time password tokens).

  17. Can computational goals inform theories of vision?

    PubMed

    Anderson, Barton L

    2015-04-01

One of the most lasting contributions of Marr's posthumous book is his articulation of the different "levels of analysis" that are needed to understand vision. Although a variety of work has examined how these different levels are related, there is comparatively little examination of the assumptions on which his proposed levels rest, or the plausibility of the approach Marr articulated given those assumptions. Marr placed particular significance on computational level theory, which specifies the "goal" of a computation, its appropriateness for solving a particular problem, and the logic by which it can be carried out. The structure of computational level theory is inherently teleological: What the brain does is described in terms of its purpose. I argue that computational level theory, and the reverse-engineering approach it inspires, requires understanding the historical trajectory that gave rise to functional capacities that can be meaningfully attributed with some sense of purpose or goal, that is, a reconstruction of the fitness function on which natural selection acted in shaping our visual abilities. I argue that this reconstruction is required to distinguish abilities shaped by natural selection ("natural tasks") from evolutionary "by-products" (spandrels, co-optations, and exaptations), rather than merely demonstrating that computational goals can be embedded in a Bayesian model that renders a particular behavior or process rational. Copyright © 2015 Cognitive Science Society, Inc.

  18. Electrostatic solvation free energies of charged hard spheres using molecular dynamics with density functional theory interactions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duignan, Timothy T.; Baer, Marcel D.; Schenter, Gregory K.

Determining the solvation free energies of single ions in water is one of the most fundamental problems in physical chemistry and yet many unresolved questions remain. In particular, the ability to decompose the solvation free energy into simple and intuitive contributions will have important implications for coarse-grained models of electrolyte solution. Here, we provide rigorous definitions of the various types of single ion solvation free energies based on different simulation protocols. We calculate solvation free energies of charged hard spheres using density functional theory interaction potentials with molecular dynamics simulation (DFT-MD) and isolate the effects of charge and cavitation, comparing to the Born (linear response) model. We show that using uncorrected Ewald summation leads to highly unphysical values for the solvation free energy and that charging free energies for cations are approximately linear as a function of charge but that there is a small non-linearity for small anions. The charge hydration asymmetry (CHA) for hard spheres, determined with quantum mechanics, is much larger than for the analogous real ions. This suggests that real ions, particularly anions, are significantly more complex than simple charged hard spheres, a commonly employed representation. We would like to thank Thomas Beck, Shawn Kathmann, Richard Remsing and John Weeks for helpful discussions. Computing resources were generously allocated by PNNL's Institutional Computing program. This research also used resources of the National Energy Research Scientific Computing Center, a DOE Office of Science User Facility supported by the Office of Science of the U.S. Department of Energy under Contract No. DE-AC02-05CH11231. TTD, GKS, and CJM were supported by the U.S. Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences, and Biosciences.
MDB was supported by MS3 (Materials Synthesis and Simulation Across Scales) Initiative, a Laboratory Directed Research and Development Program at Pacific Northwest National Laboratory (PNNL). PNNL is a multi-program national laboratory operated by Battelle for the U.S. Department of Energy.
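For reference, the Born (linear response) model against which the DFT-MD charging free energies above are compared gives, for a sphere of charge q and radius a in a continuum of relative permittivity ε_r:

```latex
\Delta G_{\mathrm{Born}} = -\frac{q^{2}}{8\pi\varepsilon_{0}\,a}\left(1 - \frac{1}{\varepsilon_{r}}\right)
```

Its strictly quadratic dependence on q is the "linear response" behavior the computed charging free energies are tested against; by construction, the Born model is symmetric under q → −q, so any charge hydration asymmetry observed in simulation is a departure from it.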

  19. The origin of the reactivity of the Criegee intermediate: implications for atmospheric particle growth

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miliordos, Evangelos; Xantheas, Sotiris S.

    2016-01-18

The electronic structure of the simplest Criegee intermediate (H₂COO) is practically that of a closed shell. On the biradical scale (β) from 0 (pure closed shell) to 1 (pure biradical) it registers a mere β=0.10, suggesting that a Lewis structure of an H₂C=O^(δ+)-O^(δ-) zwitterion best describes its ground electronic state. However, this picture of a nearly inert closed shell contradicts its rich atmospheric reactivity. It is the mixing of its ground with the first triplet excited state, which is a pure biradical state of the type H₂C•-O-O•, that is responsible for the formation of strongly bound products during reactions inducing atmospheric particle growth. This work was supported by the US Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences and Biosciences. Pacific Northwest National Laboratory (PNNL) is a multiprogram national laboratory operated for DOE by Battelle. This research also used resources of the National Energy Research Scientific Computing Center, which is supported by the Office of Science of the U.S. Department of Energy under Contract No. DE-AC02-05CH11231.

  20. Does Geophysics Need "A new kind of Science"?

    NASA Astrophysics Data System (ADS)

    Turcotte, D. L.; Rundle, J. B.

    2002-12-01

Stephen Wolfram's book "A New Kind of Science" has received a great deal of attention in the last six months, both positive and negative. The theme of the book is that "cellular automata", which arise from spatial and temporal coarse-graining of equations of motion, provide the foundations for a new nonlinear science of "complexity". The old science is the science of partial differential equations. Some of the major contributions of this old science have been in geophysics, e.g. gravity, magnetics, seismic waves, heat flow. The basis of the new science is the use of massive computing and numerical simulations. The new science is motivated by the observations that many physical systems display a vast multiplicity of space and time scales, and have hidden dynamics that in many cases are impossible to directly observe. An example would be molecular dynamics. Statistical physics derives continuum equations from the discrete interactions between atoms and molecules; in the modern world the continuum equations are then discretized using finite differences, finite elements, etc. in order to obtain numerical solutions. Examples of widely used cellular automata models include diffusion limited aggregation and site percolation. Also included is the class of models said to exhibit self-organized criticality: the sand-pile model, the slider-block model, and the forest-fire model. Applications of these models include drainage networks, seismicity, distributions of minerals, and the evolution of landforms and coastlines. Simple cellular automata models generate deterministic chaos, e.g. the logistic map.
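The logistic map cited in this abstract is the standard minimal example of a simple deterministic iteration producing chaos. A short sketch (r = 4.0, the fully chaotic regime; the starting values are arbitrary):

```python
def logistic_orbit(x0, r, n):
    """Iterate x_{k+1} = r * x_k * (1 - x_k), returning the n+1 points."""
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs


# Sensitive dependence on initial conditions: two orbits launched 1e-9
# apart become macroscopically different within a few dozen iterations,
# since at r = 4 nearby orbits separate by roughly a factor of 2 per step.
a = logistic_orbit(0.2, 4.0, 40)
b = logistic_orbit(0.2 + 1e-9, 4.0, 40)
```

For r = 4 and x0 in [0, 1] the orbit stays in [0, 1] (the map's maximum is r/4 = 1), yet no simple closed-form prediction of a[k] from x0 survives rounding error, which is the sense in which such simple rules exhibit "hidden dynamics."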

  1. NASA Center for Computational Sciences: History and Resources

    NASA Technical Reports Server (NTRS)

    2000-01-01

The NASA Center for Computational Sciences (NCCS) has been a leading capacity computing facility, providing a production environment and support resources to address the challenges facing the Earth and space sciences research community.

  2. Institute for Computer Applications in Science and Engineering (ICASE)

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science during the period April 1, 1983 through September 30, 1983 is summarized.

  3. Computers in Science: Thinking Outside the Discipline.

    ERIC Educational Resources Information Center

    Hamilton, Todd M.

    2003-01-01

    Describes the Computers in Science course which integrates computer-related techniques into the science disciplines of chemistry, physics, biology, and Earth science. Uses a team teaching approach and teaches students how to solve chemistry problems with spreadsheets, identify minerals with X-rays, and chemical and force analysis. (Contains 14…

  4. 78 FR 64255 - Advisory Committee for Computer and Information Science and Engineering; Cancellation of Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-28

    ... NATIONAL SCIENCE FOUNDATION Advisory Committee for Computer and Information Science and Engineering; Cancellation of Meeting SUMMARY: As a result of the impact of the recent government shutdown, the... Committee for Computer and Information Science and Engineering meeting. The public notice for this committee...

  5. Exemplary Science Teachers' Use of Technology

    ERIC Educational Resources Information Center

    Hakverdi-Can, Meral; Dana, Thomas M.

    2012-01-01

    The purpose of this study is to examine exemplary science teachers' level of computer use, their knowledge/skills in using specific computer applications for science instruction, their use of computer-related applications/tools during their instruction, how often they required their students to use those applications in or for their science class…

  6. A philosophical examination of Mead's pragmatist constructivism as a referent for adult science education

    NASA Astrophysics Data System (ADS)

    Furbish, Dean Russel

    The purpose of this study is to examine pragmatist constructivism as a science education referent for adult learners. Specifically, this study seeks to determine whether George Herbert Mead's doctrine, which conflates pragmatist learning theory and philosophy of natural science, might facilitate (a) scientific concept acquisition, (b) learning scientific methods, and (c) preparation of learners for careers in science and science-related areas. A philosophical examination of Mead's doctrine in light of these three criteria has determined that pragmatist constructivism is not a viable science education referent for adult learners. Mead's pragmatist constructivism does not portray scientific knowledge or scientific methods as they are understood by practicing scientists themselves, that is, according to scientific realism. Thus, employment of pragmatist constructivism does not adequately prepare future practitioners for careers in science-related areas. Mead's metaphysics does not allow him to commit to the existence of the unobservable objects of science such as molecular cellulose or mosquito-borne malarial parasites. Mead's anti-realist metaphysics also affects his conception of scientific methods. Because Mead does not commit existentially to the unobservable objects of realist science, Mead's science does not seek to determine what causal role, if any, the hypothetical objects that scientists routinely posit while theorizing might play in observable phenomena. Instead, constructivist pragmatism promotes subjective epistemology and instrumental methods. The implication for learning science is that students are encouraged to derive scientific concepts based on a combination of personal experience and personal meaningfulness. Contrary to pragmatist constructivism, however, scientific concepts do not arise inductively from subjective experience driven by personal interests.
The broader implication of this study for adult education is that the philosophically laden claims of constructivist learning theories need to be identified and assessed independently of any empirical support that these learning theories might enjoy. This in turn calls for educational experiences for graduate students of education that incorporate philosophical understanding such that future educators might be able to recognize and weigh the philosophically laden claims of adult learning theories.

  7. Software Reviews.

    ERIC Educational Resources Information Center

    Science and Children, 1990

    1990-01-01

    Reviewed are seven computer software packages for IBM and/or Apple Computers. Included are "Windows on Science: Volume 1--Physical Science"; "Science Probe--Physical Science"; "Wildlife Adventures--Grizzly Bears"; "Science Skills--Development Programs"; "The Clean Machine"; "Rock Doctor";…

  8. Citizenship and Primary Science: The Trainee's Perspective

    ERIC Educational Resources Information Center

    Hartley, Karen

    2005-01-01

    What does citizenship mean to a 6-year-old? To a 10-year-old? What does it mean to you? Is it merely about looking after other people? Would you agree that keeping a hamster in a cage in a primary classroom is cruel? How does a subject leader for science persuade colleagues that the non-statutory guidance for personal, social and health education…

  9. Light. Physical Science in Action[TM]. Schlessinger Science Library. [Videotape].

    ERIC Educational Resources Information Center

    2000

    Why does light behave the way it does? How does it travel from its source to the objects it illuminates and then to human eyes? Students will learn about waves, including the concepts of reflection, absorption, refraction and how light is related to the colors that can be seen. With a hands-on activity and real-life examples, these concepts are…

  10. Science Outreach for the Thousands: Coe College's Playground of Science

    NASA Astrophysics Data System (ADS)

    Watson, D. E.; Franke, M.; Affatigato, M.; Feller, S.

    2011-12-01

    Coe College is a private liberal arts college nestled in the northeast quadrant of Cedar Rapids, IA. Coe takes pride in the outreach it does in the local community. The sciences at Coe find enjoyment in educating the children and families of this community through a diverse set of venues, from performing science demonstrations for children at Cedar Rapids' Fourth of July Freedom Festival to hosting summer forums and talks to invigorate the minds of its more mature audiences. Among these events, the signature event of the year is the Coe Playground of Science. On the last Thursday of October, before Halloween, the science departments at Coe invite nearly two thousand children, from pre-elementary to high school ages, along with their parents, to participate in a night filled with science demos, haunted halls, and trick-or-treating for more than just candy. The demonstrations are performed by professors and students alike from a raft of cooperative departments including physics, chemistry, biology, math, computer science, nursing, ROTC, and psychology. This event greatly strengthens the relationships between institution members and community members. The sciences at Coe understand the importance of imparting the thrill and hunger for exploration and discovery to future generations. More importantly, they recognize that this cannot start and end at the collegiate level; the American public must be reached at younger ages and continue to be encouraged beyond the college experience. The Playground of Science unites these two groups under the common goal of elevating scientific interest in the American people.

  11. An Overview of NASA's Intelligent Systems Program

    NASA Technical Reports Server (NTRS)

    Cooke, Daniel E.; Norvig, Peter (Technical Monitor)

    2001-01-01

    NASA and the computer science research community are poised to enter a critical era, one in which, it seems, each needs the other. Market forces, driven by the immediate economic viability of computer science research results, place computer science in a relatively novel position. These forces impact how research is done and could, in the worst case, drive the field away from significant innovation, opting instead for incremental advances that yield greater stability in the marketplace. NASA, however, requires significant advances in computer science research in order to accomplish the exploration and science agenda it has set out for itself. NASA may indeed be poised to advance computer science research in this century much the way it advanced aero-based research in the last.

  12. A Review of Models for Teacher Preparation Programs for Precollege Computer Science Education.

    ERIC Educational Resources Information Center

    Deek, Fadi P.; Kimmel, Howard

    2002-01-01

    Discusses the need for adequate precollege computer science education and focuses on the issues of teacher preparation programs and requirements needed to teach high school computer science. Presents models of teacher preparation programs and compares state requirements with Association for Computing Machinery (ACM) recommendations. (Author/LRW)

  13. A DDC Bibliography on Computers in Information Sciences. Volume II. Information Sciences Series.

    ERIC Educational Resources Information Center

    Defense Documentation Center, Alexandria, VA.

    The unclassified and unlimited bibliography compiles references dealing specifically with the role of computers in information sciences. The volume contains 239 annotated references grouped under three major headings: Artificial and Programming Languages, Computer Processing of Analog Data, and Computer Processing of Digital Data. The references…

  14. Making Advanced Computer Science Topics More Accessible through Interactive Technologies

    ERIC Educational Resources Information Center

    Shao, Kun; Maher, Peter

    2012-01-01

    Purpose: Teaching advanced technical concepts in a computer science program to students of different technical backgrounds presents many challenges. The purpose of this paper is to present a detailed experimental pedagogy in teaching advanced computer science topics, such as computer networking, telecommunications and data structures using…

  15. 48 CFR 252.204-7014 - Limitations on the Use or Disclosure of Information by Litigation Support Contractors.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    .... Computer software does not include computer data bases or computer software documentation. Litigation... includes technical data and computer software, but does not include information that is lawfully, publicly available without restriction. Technical data means recorded information, regardless of the form or method...

  16. 48 CFR 52.227-14 - Rights in Data-General.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... database or database means a collection of recorded information in a form capable of, and for the purpose... enable the computer program to be produced, created, or compiled. (2) Does not include computer databases... databases and computer software documentation). This term does not include computer software or financial...

  17. 48 CFR 52.227-14 - Rights in Data-General.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... database or database means a collection of recorded information in a form capable of, and for the purpose... enable the computer program to be produced, created, or compiled. (2) Does not include computer databases... databases and computer software documentation). This term does not include computer software or financial...

  18. 48 CFR 52.227-14 - Rights in Data-General.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... database or database means a collection of recorded information in a form capable of, and for the purpose... enable the computer program to be produced, created, or compiled. (2) Does not include computer databases... databases and computer software documentation). This term does not include computer software or financial...

  19. 48 CFR 52.227-14 - Rights in Data-General.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... database or database means a collection of recorded information in a form capable of, and for the purpose... enable the computer program to be produced, created, or compiled. (2) Does not include computer databases... databases and computer software documentation). This term does not include computer software or financial...

  20. BIOCOMPUTATION: some history and prospects.

    PubMed

    Cull, Paul

    2013-06-01

    At first glance, biology and computer science are diametrically opposed sciences. Biology deals with carbon based life forms shaped by evolution and natural selection. Computer Science deals with electronic machines designed by engineers and guided by mathematical algorithms. In this brief paper, we review biologically inspired computing. We discuss several models of computation which have arisen from various biological studies. We show what these have in common, and conjecture how biology can still suggest answers and models for the next generation of computing problems. We discuss computation and argue that these biologically inspired models do not extend the theoretical limits on computation. We suggest that, in practice, biological models may give more succinct representations of various problems, and we mention a few cases in which biological models have proved useful. We also discuss the reciprocal impact of computer science on biology and cite a few significant contributions to biological science. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  1. A Case Study of the Introduction of Computer Science in NZ Schools

    ERIC Educational Resources Information Center

    Bell, Tim; Andreae, Peter; Robins, Anthony

    2014-01-01

    For many years computing in New Zealand schools was focused on teaching students how to use computers, and there was little opportunity for students to learn about programming and computer science as formal subjects. In this article we review a series of initiatives that occurred from 2007 to 2009 that led to programming and computer science being…

  2. Nonambipolar Transport and Torque in Perturbed Equilibria

    NASA Astrophysics Data System (ADS)

    Logan, N. C.; Park, J.-K.; Wang, Z. R.; Berkery, J. W.; Kim, K.; Menard, J. E.

    2013-10-01

    A new Perturbed Equilibrium Nonambipolar Transport (PENT) code has been developed to calculate the neoclassical toroidal torque from radial current composed of both passing and trapped particles in perturbed equilibria. This presentation outlines the physics approach used in the development of the PENT code, with emphasis on the effects of retaining general aspect-ratio geometric effects. First, nonambipolar transport coefficients and corresponding neoclassical toroidal viscous (NTV) torque in perturbed equilibria are re-derived from the first order gyro-drift-kinetic equation in the ``combined-NTV'' PENT formalism. The equivalence of NTV torque and change in potential energy due to kinetic effects [J-K. Park, Phys. Plas., 2011] is then used to showcase computational challenges shared between PENT and stability codes MISK and MARS-K. Extensive comparisons to a reduced model, which makes numerous large aspect ratio approximations, are used throughout to emphasize geometry dependent physics such as pitch angle resonances. These applications make extensive use of the PENT code's native interfacing with the Ideal Perturbed Equilibrium Code (IPEC), and the combination of these codes is a key step towards an iterative solver for self-consistent perturbed equilibrium torque. Supported by US DOE contract #DE-AC02-09CH11466 and the DOE Office of Science Graduate Fellowship administered by the Oak Ridge Institute for Science & Education under contract #DE-AC05-06OR23100.

  3. Research in Applied Mathematics, Fluid Mechanics and Computer Science

    NASA Technical Reports Server (NTRS)

    1999-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period October 1, 1998 through March 31, 1999.

  4. [Research activities in applied mathematics, fluid mechanics, and computer science

    NASA Technical Reports Server (NTRS)

    1995-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period April 1, 1995 through September 30, 1995.

  5. Activities of the Institute for Computer Applications in Science and Engineering

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April 1, 1985 through October 2, 1985 is summarized.

  6. Synchrotron X-Ray Diffraction Studies on the New Generation Ferromagnetic Semiconductor Li(Zn,Mn)As under High Pressure

    NASA Astrophysics Data System (ADS)

    Sun, Fei; Xu, Cong; Yu, Shuang; Chen, Bi-Juan; Zhao, Guo-Qiang; Deng, Zheng; Yang, Wen-Ge; Jin, Chang-Qing

    2017-06-01

    Abstract not available. Supported by the National Natural Science Foundation and the Ministry of Science and Technology of China, the National Natural Science Foundation of China under Grant No. U1530402, the U.S. Department of Energy Office of Science under Grant No. DE-AC02-06CH11357, the DOE-NNSA under Grant No. DE-NA0001974, the DOE-BES under Grant No. DE-FG02-99ER45775, and the Instrumentation Funding of the National Science Foundation.

  7. A Quantitative Model for Assessing Visual Simulation Software Architecture

    DTIC Science & Technology

    2011-09-01

    Software Engineering Arnold Buss Research Associate Professor of MOVES LtCol Jeff Boleng, PhD Associate Professor of Computer Science U.S. Air Force Academy... science (operating and programming systems series). New York, NY, USA: Elsevier Science Ltd. Henry, S., & Kafura, D. (1984). The evaluation of software...Rudy Darken Professor of Computer Science Dissertation Supervisor Ted Lewis Professor of Computer Science Richard Riehle Professor of Practice

  8. K-16 Computationally Rich Science Education: A Ten-Year Review of the "Journal of Science Education and Technology" (1998-2008)

    ERIC Educational Resources Information Center

    Wofford, Jennifer

    2009-01-01

    Computing is anticipated to have an increasingly expansive impact on the sciences overall, becoming the third, crucial component of a "golden triangle" that includes mathematics and experimental and theoretical science. However, even more true with computing than with math and science, we are not preparing our students for this new reality. It is…

  9. Interactive Synthesis of Code Level Security Rules

    DTIC Science & Technology

    2017-04-01

    Interactive Synthesis of Code-Level Security Rules A Thesis Presented by Leo St. Amour to The Department of Computer Science in partial fulfillment...of the requirements for the degree of Master of Science in Computer Science Northeastern University Boston, Massachusetts April 2017 DISTRIBUTION...Abstract of the Thesis Interactive Synthesis of Code-Level Security Rules by Leo St. Amour Master of Science in Computer Science Northeastern University

  10. Approaching gender parity: Women in computer science at Afghanistan's Kabul University

    NASA Astrophysics Data System (ADS)

    Plane, Jandelyn

    This study explores the representation of women in computer science at the tertiary level through data collected about undergraduate computer science education at Kabul University in Afghanistan. Previous studies have theorized reasons for underrepresentation of women in computer science, and while many of these reasons are indeed present in Afghanistan, they appear to hinder degree completion to a lesser extent. Women comprise at least 36% of each graduating class from KU's Computer Science Department; however, in 2007 women were 25% of the university population. In the US, women comprise over 50% of university populations, while undergraduate computer science programs graduate on average only 25% women. Representation of women in computer science in the US is 50% below the university rate, but at KU, it is 50% above the university rate. This mixed methods study of KU was conducted in the following three stages: setting up focus groups with women computer science students, distributing surveys to all students in the CS department, and conducting a series of 22 individual interviews with fourth year CS students. The analysis of the data collected and its comparison to literature on university/department retention in Science, Technology, Engineering and Mathematics gender representation and on women's education in underdeveloped Islamic countries illuminates KU's uncharacteristic representation of women in its Computer Science Department. The retention of women in STEM through the education pipeline has several characteristics in Afghanistan that differ from countries often studied in available literature. Few Afghan students have computers in their homes, and few have training beyond secretarial applications before considering studying CS at university.
University students in Afghanistan are selected based on placement exams and are then assigned to an area of study, and financially supported throughout their academic career, resulting in a low attrition rate from the program. Gender and STEM literature identifies parental encouragement, stereotypes and employment perceptions as influential characteristics. Afghan women in computer science received significant parental encouragement even from parents with no computer background. They do not seem to be influenced by any negative "geek" stereotypes, but they do perceive limitations when considering employment after graduation.

  11. High Energy Physics Exascale Requirements Review. An Office of Science review sponsored jointly by Advanced Scientific Computing Research and High Energy Physics, June 10-12, 2015, Bethesda, Maryland

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Habib, Salman; Roser, Robert; Gerber, Richard

    The U.S. Department of Energy (DOE) Office of Science (SC) Offices of High Energy Physics (HEP) and Advanced Scientific Computing Research (ASCR) convened a programmatic Exascale Requirements Review on June 10–12, 2015, in Bethesda, Maryland. This report summarizes the findings, results, and recommendations derived from that meeting. The high-level findings and observations are as follows. Larger, more capable computing and data facilities are needed to support HEP science goals in all three frontiers: Energy, Intensity, and Cosmic. The expected scale of the demand at the 2025 timescale is at least two orders of magnitude — and in some cases greater — than that available currently. The growth rate of data produced by simulations is overwhelming the current ability of both facilities and researchers to store and analyze it. Additional resources and new techniques for data analysis are urgently needed. Data rates and volumes from experimental facilities are also straining the current HEP infrastructure in its ability to store and analyze large and complex data volumes. Appropriately configured leadership-class facilities can play a transformational role in enabling scientific discovery from these datasets. A close integration of high-performance computing (HPC) simulation and data analysis will greatly aid in interpreting the results of HEP experiments. Such an integration will minimize data movement and facilitate interdependent workflows. Long-range planning between HEP and ASCR will be required to meet HEP’s research needs. 
To best use ASCR HPC resources, the experimental HEP program needs (1) an established, long-term plan for access to ASCR computational and data resources, (2) the ability to map workflows to HPC resources, (3) the ability for ASCR facilities to accommodate workflows run by collaborations potentially comprising thousands of individual members, (4) to transition codes to the next-generation HPC platforms that will be available at ASCR facilities, (5) to build up and train a workforce capable of developing and using simulations and analysis to support HEP scientific research on next-generation systems.

  12. Science-Technology Coupling: The Case of Mathematical Logic and Computer Science.

    ERIC Educational Resources Information Center

    Wagner-Dobler, Roland

    1997-01-01

    In the history of science, there have often been periods of sudden rapprochements between pure science and technology-oriented branches of science. Mathematical logic as pure science and computer science as technology-oriented science have experienced such a rapprochement, which is studied in this article in a bibliometric manner. (Author)

  13. Cognitive computing and eScience in health and life science research: artificial intelligence and obesity intervention programs.

    PubMed

    Marshall, Thomas; Champagne-Langabeer, Tiffiany; Castelli, Darla; Hoelscher, Deanna

    2017-12-01

    To present research models based on artificial intelligence and discuss the concept of cognitive computing and eScience as disruptive factors in health and life science research methodologies. The paper identifies big data as a catalyst to innovation and the development of artificial intelligence, presents a framework for computer-supported human problem solving and describes a transformation of research support models. This framework includes traditional computer support; federated cognition using machine learning and cognitive agents to augment human intelligence; and a semi-autonomous/autonomous cognitive model, based on deep machine learning, which supports eScience. The paper provides a forward view of the impact of artificial intelligence on our human-computer support and research methods in health and life science research. By augmenting or amplifying human task performance with artificial intelligence, cognitive computing and eScience research models are discussed as novel and innovative systems for developing more effective adaptive obesity intervention programs.

  14. Science 101: Why Does It Take Longer to Boil Potatoes at High Altitudes?

    ERIC Educational Resources Information Center

    Robertson, Bill

    2017-01-01

    Why Does It Take Longer to Boil Potatoes at High Altitudes? This column provides background science information for elementary teachers. This month's issue looks at why water boils at different temperatures at different altitudes.

  15. Website Policies / Important Links | DOepatents

    Science.gov Websites

    Science Resources: SciTech Connect, DOE PAGES, and more DOE Collections; U.S. / Global Science Resources.

  16. 45 CFR 660.7 - How does the Director communicate with state and local officials concerning the Foundation's...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Regulations Relating to Public Welfare (Continued) NATIONAL SCIENCE FOUNDATION INTERGOVERNMENTAL REVIEW OF THE NATIONAL SCIENCE FOUNDATION PROGRAMS AND ACTIVITIES § 660.7 How does the Director communicate with state...

  17. 45 CFR 660.7 - How does the Director communicate with state and local officials concerning the Foundation's...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Regulations Relating to Public Welfare (Continued) NATIONAL SCIENCE FOUNDATION INTERGOVERNMENTAL REVIEW OF THE NATIONAL SCIENCE FOUNDATION PROGRAMS AND ACTIVITIES § 660.7 How does the Director communicate with state...

  18. 45 CFR 660.7 - How does the Director communicate with state and local officials concerning the Foundation's...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Regulations Relating to Public Welfare (Continued) NATIONAL SCIENCE FOUNDATION INTERGOVERNMENTAL REVIEW OF THE NATIONAL SCIENCE FOUNDATION PROGRAMS AND ACTIVITIES § 660.7 How does the Director communicate with state...

  19. 45 CFR 660.7 - How does the Director communicate with state and local officials concerning the Foundation's...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Regulations Relating to Public Welfare (Continued) NATIONAL SCIENCE FOUNDATION INTERGOVERNMENTAL REVIEW OF THE NATIONAL SCIENCE FOUNDATION PROGRAMS AND ACTIVITIES § 660.7 How does the Director communicate with state...

  20. 45 CFR 660.7 - How does the Director communicate with state and local officials concerning the Foundation's...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Regulations Relating to Public Welfare (Continued) NATIONAL SCIENCE FOUNDATION INTERGOVERNMENTAL REVIEW OF THE NATIONAL SCIENCE FOUNDATION PROGRAMS AND ACTIVITIES § 660.7 How does the Director communicate with state...

Top