Center for Technology for Advanced Scientific Component Software (TASCS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Damevski, Kostadin
A resounding success of the Scientific Discovery through Advanced Computing (SciDAC) program is that high-performance computational science is now universally recognized as a critical aspect of scientific discovery [71], complementing both theoretical and experimental research. As scientific communities prepare to exploit unprecedented computing capabilities of emerging leadership-class machines for multi-model simulations at the extreme scale [72], it is more important than ever to address the technical and social challenges of geographically distributed teams that combine expertise in domain science, applied mathematics, and computer science to build robust and flexible codes that can incorporate changes over time. The Center for Technology for Advanced Scientific Component Software (TASCS) tackles these issues by exploiting component-based software development to facilitate collaborative high-performance scientific computing.
Join the Center for Applied Scientific Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gamblin, Todd; Bremer, Timo; Van Essen, Brian
The Center for Applied Scientific Computing serves as Livermore Lab’s window to the broader computer science, computational physics, applied mathematics, and data science research communities. In collaboration with academic, industrial, and other government laboratory partners, we conduct world-class scientific research and development on problems critical to national security. CASC applies the power of high-performance computing and the efficiency of modern computational methods to the realms of stockpile stewardship, cyber and energy security, and knowledge discovery for intelligence applications.
75 FR 65639 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-26
...: Computational Biology Special Emphasis Panel A. Date: October 29, 2010. Time: 2 p.m. to 3:30 p.m. Agenda: To.... Name of Committee: Center for Scientific Review Special Emphasis Panel; Member Conflict: Computational...
77 FR 11139 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-24
...: Center for Scientific Review Special Emphasis Panel; ``Genetics and Epigenetics of Disease.'' Date: March... Scientific Review Special Emphasis Panel; Small Business: Cell, Computational, and Molecular Biology. Date...
Evolution of the Virtualized HPC Infrastructure of Novosibirsk Scientific Center
NASA Astrophysics Data System (ADS)
Adakin, A.; Anisenkov, A.; Belov, S.; Chubarov, D.; Kalyuzhny, V.; Kaplin, V.; Korol, A.; Kuchin, N.; Lomakin, S.; Nikultsev, V.; Skovpen, K.; Sukharev, A.; Zaytsev, A.
2012-12-01
Novosibirsk Scientific Center (NSC), also known worldwide as Akademgorodok, is one of the largest Russian scientific centers hosting Novosibirsk State University (NSU) and more than 35 research organizations of the Siberian Branch of Russian Academy of Sciences including Budker Institute of Nuclear Physics (BINP), Institute of Computational Technologies, and Institute of Computational Mathematics and Mathematical Geophysics (ICM&MG). Since each institute has specific requirements on the architecture of computing farms involved in its research field, currently we've got several computing facilities hosted by NSC institutes, each optimized for a particular set of tasks, of which the largest are the NSU Supercomputer Center, Siberian Supercomputer Center (ICM&MG), and a Grid Computing Facility of BINP. A dedicated optical network with the initial bandwidth of 10 Gb/s connecting these three facilities was built in order to make it possible to share the computing resources among the research communities, thus increasing the efficiency of operating the existing computing facilities and offering a common platform for building the computing infrastructure for future scientific projects. Unification of the computing infrastructure is achieved by extensive use of virtualization technology based on XEN and KVM platforms. This contribution gives a thorough review of the present status and future development prospects for the NSC virtualized computing infrastructure and the experience gained while using it for running production data analysis jobs related to HEP experiments being carried out at BINP, especially the KEDR detector experiment at the VEPP-4M electron-positron collider.
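As a minimal sketch of the kind of KVM-based virtualization layer described above, the following Python snippet enumerates guests on a hypervisor through the libvirt bindings. It assumes the libvirt Python package and a local qemu:///system connection; the URI and the reported fields are illustrative and are not taken from the NSC configuration.

```python
# Minimal sketch (assumptions noted above): list KVM guests via libvirt.
# Requires the libvirt Python bindings; URI and fields are illustrative only.
import libvirt

def list_guests(uri="qemu:///system"):
    conn = libvirt.open(uri)  # raises libvirt.libvirtError if the hypervisor is unreachable
    try:
        guests = []
        for dom in conn.listAllDomains():
            state, _reason = dom.state()
            info = dom.info()  # [state, maxMem (KiB), memory (KiB), nrVirtCpu, cpuTime (ns)]
            guests.append({
                "name": dom.name(),
                "running": state == libvirt.VIR_DOMAIN_RUNNING,
                "vcpus": info[3],
                "max_mem_mib": info[1] // 1024,
            })
        return guests
    finally:
        conn.close()

if __name__ == "__main__":
    for guest in list_guests():
        print(guest)
```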
ERIC Educational Resources Information Center
Cottrell, William B.; And Others
The Nuclear Safety Information Center (NSIC) is a highly sophisticated scientific information center operated at Oak Ridge National Laboratory (ORNL) for the U.S. Atomic Energy Commission. Its information file, which consists of both data and bibliographic information, is computer stored and numerous programs have been developed to facilitate the…
EPA uses high-end scientific computing, geospatial services, and remote sensing/imagery analysis to support its mission. The Center for Environmental Computing (CEC) assists the Agency's program offices and regions in meeting staff needs in these areas.
78 FR 68462 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-14
... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel; Brain Injury and... Methodologies Integrated Review Group; Biomedical Computing and Health Informatics Study Section. Date: December...
76 FR 24036 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-29
... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel, Brain Disorders... Integrated Review Group, Biomedical Computing and Health Informatics Study Section. Date: June 7-8, 2011...
NASA Technical Reports Server (NTRS)
VanZandt, John
1994-01-01
The usage model of supercomputers for scientific applications, such as computational fluid dynamics (CFD), has changed over the years. Scientific visualization has moved scientists away from looking at numbers to looking at three-dimensional images, which capture the meaning of the data. This change has impacted the system models for computing. This report details the model which is used by scientists at NASA's research centers.
Ames Research Center Publications: A Continuing Bibliography
NASA Technical Reports Server (NTRS)
1981-01-01
The Ames Research Center Publications: A Continuing Bibliography contains the research output of the Center indexed during 1981 in Scientific and Technical Aerospace Reports (STAR), Limited Scientific and Technical Aerospace Reports (LSTAR), International Aerospace Abstracts (IAA), and Computer Program Abstracts (CPA). This bibliography is published annually in an attempt to effect greater awareness and distribution of the Center's research output.
The role of dedicated data computing centers in the age of cloud computing
NASA Astrophysics Data System (ADS)
Caramarcu, Costin; Hollowell, Christopher; Strecker-Kellogg, William; Wong, Antonio; Zaytsev, Alexandr
2017-10-01
Brookhaven National Laboratory (BNL) anticipates significant growth in scientific programs with large computing and data storage needs in the near future and has recently reorganized support for scientific computing to meet these needs. A key component is the enhanced role of the RHIC-ATLAS Computing Facility (RACF) in support of high-throughput and high-performance computing (HTC and HPC) at BNL. This presentation discusses the evolving role of the RACF at BNL, in light of its growing portfolio of responsibilities and its increasing integration with cloud (academic and for-profit) computing activities. We also discuss BNL’s plan to build a new computing center to support the new responsibilities of the RACF and present a summary of the cost benefit analysis done, including the types of computing activities that benefit most from a local data center vs. cloud computing. This analysis is partly based on an updated cost comparison of Amazon EC2 computing services and the RACF, which was originally conducted in 2012.
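The local-versus-cloud cost comparison mentioned in the abstract can be illustrated with a toy amortization model; all rates, lifetimes, and utilization figures below are invented placeholders, not numbers from the BNL/RACF or Amazon EC2 analysis.

```python
# Toy cost model (illustrative assumptions only) contrasting an on-premises
# farm with on-demand cloud capacity on a per-core-hour basis.

HOURS_PER_YEAR = 8760.0

def on_prem_core_hour(capital_per_core, lifetime_years, opex_per_core_year, utilization):
    """Amortized cost of one delivered core-hour on a local cluster."""
    yearly = capital_per_core / lifetime_years + opex_per_core_year
    return yearly / (HOURS_PER_YEAR * utilization)

def cloud_core_hour(instance_price_per_hour, cores_per_instance):
    """On-demand cloud cost of one core-hour (storage and egress excluded)."""
    return instance_price_per_hour / cores_per_instance

if __name__ == "__main__":
    local = on_prem_core_hour(capital_per_core=250.0, lifetime_years=4,
                              opex_per_core_year=40.0, utilization=0.9)
    cloud = cloud_core_hour(instance_price_per_hour=0.68, cores_per_instance=16)
    print(f"local ~${local:.3f}/core-hour, cloud ~${cloud:.3f}/core-hour")
```

In a model of this shape, steady, highly utilized workloads tend to favor the local data center, while bursty workloads shift the balance toward on-demand cloud capacity.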
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hules, John
This 1998 annual report from the National Energy Research Scientific Computing Center (NERSC) presents the year in review of the following categories: Computational Science; Computer Science and Applied Mathematics; and Systems and Services. Also presented are science highlights in the following categories: Basic Energy Sciences; Biological and Environmental Research; Fusion Energy Sciences; High Energy and Nuclear Physics; and Advanced Scientific Computing Research and Other Projects.
Ames Research Center publications: A continuing bibliography, 1980
NASA Technical Reports Server (NTRS)
1981-01-01
This bibliography lists formal NASA publications, journal articles, books, chapters of books, patents, contractor reports, and computer programs that were issued by Ames Research Center and indexed by Scientific and Technical Aerospace Reports, Limited Scientific and Technical Aerospace Reports, International Aerospace Abstracts, and Computer Program Abstracts in 1980. Citations are arranged by directorate, type of publication, and NASA accession numbers. Subject, personal author, corporate source, contract number, and report/accession number indexes are provided.
Scientific and technical information output of the Langley Research Center
NASA Technical Reports Server (NTRS)
1984-01-01
Scientific and technical information that the Langley Research Center produced during the calendar year 1983 is compiled. Included are citations for Formal Reports, Quick-Release Technical Memorandums, Contractor Reports, Journal Articles and other Publications, Meeting Presentations, Technical Talks, Computer Programs, Tech Briefs, and Patents.
Argonne Research Library | Argonne National Laboratory
ASCR Cybersecurity for Scientific Computing Integrity - Research Pathways and Ideas Workshop
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peisert, Sean; Potok, Thomas E.; Jones, Todd
At the request of the U.S. Department of Energy's (DOE) Office of Science (SC) Advanced Scientific Computing Research (ASCR) program office, a workshop was held June 2-3, 2015, in Gaithersburg, MD, to identify potential long term (10 to +20 year) cybersecurity fundamental basic research and development challenges, strategies and roadmap facing future high performance computing (HPC), networks, data centers, and extreme-scale scientific user facilities. This workshop was a follow-on to the workshop held January 7-9, 2015, in Rockville, MD, that examined higher level ideas about scientific computing integrity specific to the mission of the DOE Office of Science. Issues included research computation and simulation that takes place on ASCR computing facilities and networks, as well as network-connected scientific instruments, such as those run by various DOE Office of Science programs. Workshop participants included researchers and operational staff from DOE national laboratories, as well as academic researchers and industry experts. Participants were selected based on the submission of abstracts relating to the topics discussed in the previous workshop report [1] and also from other ASCR reports, including "Abstract Machine Models and Proxy Architectures for Exascale Computing" [27], the DOE "Preliminary Conceptual Design for an Exascale Computing Initiative" [28], and the January 2015 machine learning workshop [29]. The workshop was also attended by several observers from DOE and other government agencies. The workshop was divided into three topic areas: (1) Trustworthy Supercomputing, (2) Extreme-Scale Data, Knowledge, and Analytics for Understanding and Improving Cybersecurity, and (3) Trust within High-end Networking and Data Centers. Participants were divided into three corresponding teams based on the category of their abstracts. The workshop began with a series of talks from the program manager and workshop chair, followed by the leaders for each of the three topics and a representative of each of the four major DOE Office of Science Advanced Scientific Computing Research Facilities: the Argonne Leadership Computing Facility (ALCF), the Energy Sciences Network (ESnet), the National Energy Research Scientific Computing Center (NERSC), and the Oak Ridge Leadership Computing Facility (OLCF). The rest of the workshop consisted of topical breakout discussions and focused writing periods that produced much of this report.
NASA Astrophysics Data System (ADS)
Bogdanov, A. V.; Iuzhanin, N. V.; Zolotarev, V. I.; Ezhakova, T. R.
2017-12-01
In this article the problem of scientific projects support throughout their lifecycle in the computer center is considered in every aspect of support. Configuration Management system plays a connecting role in processes related to the provision and support of services of a computer center. In view of strong integration of IT infrastructure components with the use of virtualization, control of infrastructure becomes even more critical to the support of research projects, which means higher requirements for the Configuration Management system. For every aspect of research projects support, the influence of the Configuration Management system is being reviewed and development of the corresponding elements of the system is being described in the present paper.
Singularity: Scientific containers for mobility of compute.
Kurtzer, Gregory M; Sochat, Vanessa; Bauer, Michael W
2017-01-01
Here we present Singularity, software developed to bring containers and reproducibility to scientific computing. Using Singularity containers, developers can work in reproducible environments of their choosing and design, and these complete environments can easily be copied and executed on other platforms. Singularity is an open source initiative that harnesses the expertise of system and software engineers and researchers alike, and integrates seamlessly into common workflows for both of these groups. As its primary use case, Singularity brings mobility of computing to both users and HPC centers, providing a secure means to capture and distribute software and compute environments. This ability to create and deploy reproducible environments across these centers, a previously unmet need, makes Singularity a game changing development for computational science.
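As an illustration of the container workflow the abstract describes, the sketch below drives the singularity pull and exec subcommands from Python; it assumes a Singularity (or Apptainer-compatible) CLI is installed, and the image URI and test command are placeholders rather than examples from the paper.

```python
# Hedged sketch: fetch a container image and run a command inside it for
# reproducibility. Assumes a `singularity`-compatible CLI on PATH; the image
# URI and the test command are placeholders.
import subprocess

def pull_image(uri, dest="container.sif"):
    """Fetch the image once so later runs are reproducible and offline-capable."""
    subprocess.run(["singularity", "pull", "--force", dest, uri], check=True)
    return dest

def run_in_container(image, argv):
    """Execute a command inside the container and return its stdout."""
    result = subprocess.run(["singularity", "exec", image, *argv],
                            check=True, capture_output=True, text=True)
    return result.stdout

if __name__ == "__main__":
    sif = pull_image("docker://python:3.11-slim")
    print(run_in_container(sif, ["python3", "-c", "print('hello from inside the container')"]))
```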
Space and Earth Sciences, Computer Systems, and Scientific Data Analysis Support, Volume 1
NASA Technical Reports Server (NTRS)
Estes, Ronald H. (Editor)
1993-01-01
This Final Progress Report covers the specific technical activities of Hughes STX Corporation for the last contract triannual period of 1 June through 30 Sep. 1993, in support of assigned task activities at Goddard Space Flight Center (GSFC). It also provides a brief summary of work throughout the contract period of performance on each active task. Technical activity is presented in Volume 1, while financial and level-of-effort data is presented in Volume 2. Technical support was provided to all Division and Laboratories of Goddard's Space Sciences and Earth Sciences Directorates. Types of support include: scientific programming, systems programming, computer management, mission planning, scientific investigation, data analysis, data processing, data base creation and maintenance, instrumentation development, and management services. Mission and instruments supported include: ROSAT, Astro-D, BBXRT, XTE, AXAF, GRO, COBE, WIND, UIT, SMM, STIS, HEIDI, DE, URAP, CRRES, Voyagers, ISEE, San Marco, LAGEOS, TOPEX/Poseidon, Pioneer-Venus, Galileo, Cassini, Nimbus-7/TOMS, Meteor-3/TOMS, FIFE, BOREAS, TRMM, AVHRR, and Landsat. Accomplishments include: development of computing programs for mission science and data analysis, supercomputer applications support, computer network support, computational upgrades for data archival and analysis centers, end-to-end management for mission data flow, scientific modeling and results in the fields of space and Earth physics, planning and design of GSFC VO DAAC and VO IMS, fabrication, assembly, and testing of mission instrumentation, and design of mission operations center.
Activities at the Lunar and Planetary Institute
NASA Technical Reports Server (NTRS)
1985-01-01
The activities of the Lunar and Planetary Institute for the period July to December 1984 are discussed. Functions of its departments and projects are summarized. These include: planetary image center; library information center; computer center; production services; scientific staff; visitors program; scientific projects; conferences; workshops; seminars; publications and communications; panels, teams, committees and working groups; NASA-AMES vertical gun range (AVGR); and lunar and planetary science council.
Introduction to the LaRC central scientific computing complex
NASA Technical Reports Server (NTRS)
Shoosmith, John N.
1993-01-01
The computers and associated equipment that make up the Central Scientific Computing Complex of the Langley Research Center are briefly described. The electronic networks that provide access to the various components of the complex and a number of areas that can be used by Langley and contractor staff for special applications (scientific visualization, image processing, software engineering, and grid generation) are also described. Flight simulation facilities that use the central computers are described. Management of the complex, procedures for its use, and available services and resources are discussed. This document is intended for new users of the complex, for current users who wish to keep apprised of changes, and for visitors who need to understand the role of central scientific computers at Langley.
Scientific and technical information output of the Langley Research Center for Calendar Year 1985
NASA Technical Reports Server (NTRS)
1986-01-01
A compilation of the scientific and technical information that the Langley Research Center has produced during the calendar year 1985 is presented. Included are citations for Formal Reports, Quick-Release Technical Memorandums, Contractor Reports, Journal Articles and Other Publications, Meeting Presentations, Technical Talks, Computer Programs, Tech Briefs, and Patents.
Scientific and technical information output of the Langley Research Center for calendar year 1984
NASA Technical Reports Server (NTRS)
1985-01-01
The scientific and technical information that the Langley Research Center produced during the calendar year 1984 is compiled. Approximately 1650 citations are included comprising formal reports, quick-release technical memorandums, contractor reports, journal articles and other publications, meeting presentations, technical talks, computer programs, tech briefs, and patents.
NASA Technical Reports Server (NTRS)
Denning, P. J.; Adams, G. B., III; Brown, R. L.; Kanerva, P.; Leiner, B. M.; Raugh, M. R.
1986-01-01
Large, complex computer systems require many years of development. It is recognized that large scale systems are unlikely to be delivered in useful condition unless users are intimately involved throughout the design process. A mechanism is described that will involve users in the design of advanced computing systems and will accelerate the insertion of new systems into scientific research. This mechanism is embodied in a facility called the Center for Advanced Architectures (CAA). CAA would be a division of RIACS (Research Institute for Advanced Computer Science) and would receive its technical direction from a Scientific Advisory Board established by RIACS. The CAA described here is a possible implementation of a center envisaged in a proposed cooperation between NASA and DARPA.
Computer network access to scientific information systems for minority universities
NASA Astrophysics Data System (ADS)
Thomas, Valerie L.; Wakim, Nagi T.
1993-08-01
The evolution of computer networking technology has led to the establishment of a massive networking infrastructure which interconnects various types of computing resources at many government, academic, and corporate institutions. A large segment of this infrastructure has been developed to facilitate information exchange and resource sharing within the scientific community. The National Aeronautics and Space Administration (NASA) supports both the development and the application of computer networks which provide its community with access to many valuable multi-disciplinary scientific information systems and on-line databases. Recognizing the need to extend the benefits of this advanced networking technology to the under-represented community, the National Space Science Data Center (NSSDC) in the Space Data and Computing Division at the Goddard Space Flight Center has developed the Minority University-Space Interdisciplinary Network (MU-SPIN) Program: a major networking and education initiative for Historically Black Colleges and Universities (HBCUs) and Minority Universities (MUs). In this paper, we will briefly explain the various components of the MU-SPIN Program while highlighting how, by providing access to scientific information systems and on-line data, it promotes a higher level of collaboration among faculty and students and NASA scientists.
The Petascale Data Storage Institute
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gibson, Garth; Long, Darrell; Honeyman, Peter
2013-07-01
Petascale computing infrastructures for scientific discovery make petascale demands on information storage capacity, performance, concurrency, reliability, availability, and manageability. The Petascale Data Storage Institute focuses on the data storage problems found in petascale scientific computing environments, with special attention to community issues such as interoperability, community buy-in, and shared tools. The Petascale Data Storage Institute is a collaboration between researchers at Carnegie Mellon University, National Energy Research Scientific Computing Center, Pacific Northwest National Laboratory, Oak Ridge National Laboratory, Sandia National Laboratory, Los Alamos National Laboratory, University of Michigan, and the University of California at Santa Cruz.
76 FR 7868 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-11
... Special Emphasis Panel, Small Business: Computational Biology, Image Processing and Data Mining. Date... for Scientific Review Special Emphasis Panel, Quick Trial on Imaging and Image-Guided Intervention...
Scientific and technical information output of the Langley Research Center for calendar year 1980
NASA Technical Reports Server (NTRS)
1981-01-01
This document is a compilation of the scientific and technical information that the Langley Research Center has produced during the calendar year 1980. Approximately 1400 citations are given. Formal reports, quick-release technical memorandums, contractor reports, journal articles, meeting/conference papers, computer programs, tech briefs, patents, and unpublished research are included.
78 FR 64968 - Center for Scientific Review; Amended Notice of Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-30
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review; Amended Notice of Meeting Notice is hereby given of a change in the meeting of the Genomics, Computational Biology and Technology Study Section, October 16, 2013, 8:30 a.m. to October 17, 2013, 1:00 p.m., Avenue...
Scientific and technical information output of the Langley Research Center for calendar year 1986
NASA Technical Reports Server (NTRS)
1987-01-01
This document is a compilation of the scientific and technical information that the Langley Research Center has produced during the calendar year 1986. Included are citations for Formal Reports, Quick-Release Technical Memorandums, Contractor Reports, Journal Articles and Other Publications, Meeting Presentations, Technical Talks, Computer Programs, Tech Briefs, and Patents.
EOS MLS Science Data Processing System: A Description of Architecture and Capabilities
NASA Technical Reports Server (NTRS)
Cuddy, David T.; Echeverri, Mark D.; Wagner, Paul A.; Hanzel, Audrey T.; Fuller, Ryan A.
2006-01-01
This paper describes the architecture and capabilities of the Science Data Processing System (SDPS) for the EOS MLS. The SDPS consists of two major components--the Science Computing Facility and the Science Investigator-led Processing System. The Science Computing Facility provides the facilities for the EOS MLS Science Team to perform the functions of scientific algorithm development, processing software development, quality control of data products, and scientific analyses. The Science Investigator-led Processing System processes and reprocesses the science data for the entire mission and delivers the data products to the Science Computing Facility and to the Goddard Space Flight Center Earth Science Distributed Active Archive Center, which archives and distributes the standard science products.
Computational Science: A Research Methodology for the 21st Century
NASA Astrophysics Data System (ADS)
Orbach, Raymond L.
2004-03-01
Computational simulation - a means of scientific discovery that employs computer systems to simulate a physical system according to laws derived from theory and experiment - has attained peer status with theory and experiment. Important advances in basic science are accomplished by a new "sociology" for ultrascale scientific computing capability (USSCC), a fusion of sustained advances in scientific models, mathematical algorithms, computer architecture, and scientific software engineering. Expansion of current capabilities by factors of 100 - 1000 opens up new vistas for scientific discovery: long-term climatic variability and change, macroscopic material design from correlated behavior at the nanoscale, design and optimization of magnetic confinement fusion reactors, strong interactions on a computational lattice through quantum chromodynamics, and stellar explosions and element production. The "virtual prototype," made possible by this expansion, can markedly reduce time-to-market for industrial applications such as jet engines and safer, more fuel-efficient, cleaner cars. In order to develop USSCC, the National Energy Research Scientific Computing Center (NERSC) announced the competition "Innovative and Novel Computational Impact on Theory and Experiment" (INCITE), with no requirement for current DOE sponsorship. Fifty-nine proposals for grand challenge scientific problems were submitted for a small number of awards. The successful grants, and their preliminary progress, will be described.
75 FR 33816 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-15
... Scientific Review Special Emphasis Panel; Small Business: Computational Biology, Image Processing, and Data Mining. Date: July 21, 2010. Time: 8 a.m. to 6 p.m. Agenda: To review and evaluate grant applications...
A Queue Simulation Tool for a High Performance Scientific Computing Center
NASA Technical Reports Server (NTRS)
Spear, Carrie; McGalliard, James
2007-01-01
The NASA Center for Computational Sciences (NCCS) at the Goddard Space Flight Center provides high-performance, highly parallel processors, mass storage, and supporting infrastructure to a community of computational Earth and space scientists. Long-running (days) and highly parallel (hundreds of CPUs) jobs are common in the workload. NCCS management structures batch queues and allocates resources to optimize system use and prioritize workloads. NCCS technical staff use a locally developed discrete event simulation tool to model the impacts of evolving workloads, potential system upgrades, alternative queue structures, and resource allocation policies.
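A toy discrete-event simulation in the same spirit, with Poisson arrivals, a fixed CPU pool, and a FIFO policy, is sketched below; the arrival rate, job shapes, and scheduling rule are invented for illustration and do not reflect the NCCS tool or its workloads.

```python
# Hedged sketch of a discrete-event simulation of a batch queue feeding a fixed
# CPU pool. Arrival rate, job shapes, and the FIFO policy are illustrative only.
import heapq
import random

def simulate(total_cpus=512, n_jobs=200, mean_interarrival=0.5, seed=1):
    random.seed(seed)
    free = total_cpus
    pending = []       # FIFO queue of (submit_time, cpus, runtime)
    completions = []   # min-heap of (finish_time, cpus)
    waits = []

    # Pre-generate the job arrival stream.
    arrivals, next_arrival = [], 0.0
    for _ in range(n_jobs):
        arrivals.append((next_arrival,
                         random.choice([16, 64, 128]),   # requested CPUs
                         random.uniform(1.0, 12.0)))     # runtime (hours)
        next_arrival += random.expovariate(1.0 / mean_interarrival)

    i = 0
    while i < len(arrivals) or pending or completions:
        # Advance to the next event: a job completion or the next arrival.
        if completions and (i >= len(arrivals) or completions[0][0] <= arrivals[i][0]):
            t, cpus = heapq.heappop(completions)
            free += cpus
        else:
            t, cpus, runtime = arrivals[i]
            pending.append((t, cpus, runtime))
            i += 1
        # Start queued jobs in FIFO order while resources allow.
        while pending and pending[0][1] <= free:
            submit_t, cpus, runtime = pending.pop(0)
            free -= cpus
            waits.append(t - submit_t)
            heapq.heappush(completions, (t + runtime, cpus))
    return sum(waits) / len(waits)

if __name__ == "__main__":
    print(f"mean queue wait: {simulate():.2f} hours")
```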
Virtual Environments in Scientific Visualization
NASA Technical Reports Server (NTRS)
Bryson, Steve; Lisinski, T. A. (Technical Monitor)
1994-01-01
Virtual environment technology is a new way of approaching the interface between computers and humans. Emphasizing display and user control that conforms to the user's natural ways of perceiving and thinking about space, virtual environment technologies enhance the ability to perceive and interact with computer-generated graphic information. This enhancement potentially has a major effect on the field of scientific visualization. Current examples of this technology include the Virtual Windtunnel being developed at NASA Ames Research Center. Other major institutions such as the National Center for Supercomputing Applications and SRI International are also exploring this technology. This talk will describe several implementations of virtual environments for use in scientific visualization. Examples include the visualization of unsteady fluid flows (the virtual windtunnel), the visualization of geodesics in curved spacetime, surface manipulation, and examples developed at various laboratories.
78 FR 11659 - Center For Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-19
..., Computational, and Molecular Biology. Date: March 12, 2013. Time: 8:00 a.m. to 6:00 p.m. Agenda: To review and... Scientific Review Special Emphasis Panel; Member Conflict: Genetics, Informatics and Vision Studies. Date...
Argonne's Magellan Cloud Computing Research Project
Beckman, Pete
2017-12-11
Pete Beckman, head of Argonne's Leadership Computing Facility (ALCF), discusses the Department of Energy's new $32-million Magellan project, which is designed to test how cloud computing can be used for scientific research. More information: http://www.anl.gov/Media_Center/News/2009/news091014a.html
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhattacharya, Papri; Prokopchuk, Demyan E.; Mock, Michael T.
2017-03-01
This review examines the synthesis and acid reactivity of transition metal dinitrogen complexes bearing diphosphine ligands containing pendant amine groups in the second coordination sphere. This manuscript is a review of the work performed in the Center for Molecular Electrocatalysis. This work was supported as part of the Center for Molecular Electrocatalysis, an Energy Frontier Research Center funded by the U.S. Department of Energy (U.S. DOE), Office of Science, Office of Basic Energy Sciences. EPR studies on Fe were performed using EMSL, a national scientific user facility sponsored by the DOE's Office of Biological and Environmental Research and located at PNNL. Computational resources were provided by the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory. Pacific Northwest National Laboratory is operated by Battelle for the U.S. DOE.
Energy 101: Energy Efficient Data Centers
None
2018-04-16
Data centers provide mission-critical computing functions vital to the daily operation of top U.S. economic, scientific, and technological organizations. These data centers consume large amounts of energy to run and maintain their computer systems, servers, and associated high-performance components; up to 3% of all U.S. electricity powers data centers. And as more information comes online, data centers will consume even more energy. Data centers can become more energy efficient by incorporating features like power-saving "stand-by" modes, energy monitoring software, and efficient cooling systems instead of energy-intensive air conditioners. These and other efficiency improvements to data centers can produce significant energy savings, reduce the load on the electric grid, and help protect the nation by increasing the reliability of critical computer operations.
New project to support scientific collaboration electronically
NASA Astrophysics Data System (ADS)
Clauer, C. R.; Rasmussen, C. E.; Niciejewski, R. J.; Killeen, T. L.; Kelly, J. D.; Zambre, Y.; Rosenberg, T. J.; Stauning, P.; Friis-Christensen, E.; Mende, S. B.; Weymouth, T. E.; Prakash, A.; McDaniel, S. E.; Olson, G. M.; Finholt, T. A.; Atkins, D. E.
A new multidisciplinary effort is linking research in the upper atmospheric and space, computer, and behavioral sciences to develop a prototype electronic environment for conducting team science worldwide. A real-world electronic collaboration testbed has been established to support scientific work centered around the experimental operations being conducted with instruments from the Sondrestrom Upper Atmospheric Research Facility in Kangerlussuaq, Greenland. Such group computing environments will become an important component of the National Information Infrastructure initiative, which is envisioned as the high-performance communications infrastructure to support national scientific research.
Selected Mechanized Scientific and Technical Information Systems.
ERIC Educational Resources Information Center
Ackerman, Lynn, Ed.; And Others
The publication describes the following thirteen computer-based, operational systems designed primarily for the announcement, storage, retrieval and secondary distribution of scientific and technical reports: Defense Documentation Center; Highway Research Board; National Aeronautics and Space Administration; National Library of Medicine; U.S.…
The Center for Nanophase Materials Sciences
NASA Astrophysics Data System (ADS)
Lowndes, Douglas
2005-03-01
The Center for Nanophase Materials Sciences (CNMS) located at Oak Ridge National Laboratory (ORNL) will be the first DOE Nanoscale Science Research Center to begin operation, with construction to be completed in April 2005 and initial operations in October 2005. The CNMS' scientific program has been developed through workshops with the national community, with the goal of creating a highly collaborative research environment to accelerate discovery and drive technological advances. Research at the CNMS is organized under seven Scientific Themes selected to address challenges to understanding and to exploit particular ORNL strengths (see http://cnms.ornl.gov). These include extensive synthesis and characterization capabilities for soft, hard, nanostructured, magnetic and catalytic materials and their composites; neutron scattering at the Spallation Neutron Source and High Flux Isotope Reactor; computational nanoscience in the CNMS' Nanomaterials Theory Institute and utilizing facilities and expertise of the Center for Computational Sciences and the new Leadership Scientific Computing Facility at ORNL; a new CNMS Nanofabrication Research Laboratory; and a suite of unique and state-of-the-art instruments to be made reliably available to the national community for imaging, manipulation, and properties measurements on nanoscale materials in controlled environments. The new research facilities will be described together with the planned operation of the user research program, the latter illustrated by the current "jump start" user program that utilizes existing ORNL/CNMS facilities.
Data Intensive Scientific Workflows on a Federated Cloud: CRADA Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garzoglio, Gabriele
The Fermilab Scientific Computing Division and the KISTI Global Science Experimental Data Hub Center have built a prototypical large-scale infrastructure to handle scientific workflows of stakeholders to run on multiple cloud resources. The demonstrations have been in the areas of (a) Data-Intensive Scientific Workflows on Federated Clouds, (b) Interoperability and Federation of Cloud Resources, and (c) Virtual Infrastructure Automation to enable On-Demand Services.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Srinath Vadlamani; Scott Kruger; Travis Austin
Extended magnetohydrodynamic (MHD) codes are used to model the large, slow-growing instabilities that are projected to limit the performance of the International Thermonuclear Experimental Reactor (ITER). The multiscale nature of the extended MHD equations requires an implicit approach. The current linear solvers needed for the implicit algorithm scale poorly because the resultant matrices are so ill-conditioned. A new solver is needed, especially one that scales to the petascale. The most successful scalable parallel processor solvers to date are multigrid solvers. Applying multigrid techniques to a set of equations whose fundamental modes are dispersive waves is a promising solution to CEMM problems. For Phase 1, we implemented multigrid preconditioners from the HYPRE project of the Center for Applied Scientific Computing at LLNL via PETSc of the DOE SciDAC TOPS for the real matrix systems of the extended MHD code NIMROD, which is one of the primary modeling codes of the OFES-funded Center for Extended Magnetohydrodynamic Modeling (CEMM) SciDAC. We successfully implemented the multigrid solvers on the fusion test problem that allows for real matrix systems, and in the process learned about the details of NIMROD data structures and the difficulties of inverting NIMROD operators. The further success of this project will allow for efficient usage of future petascale computers at the National Leadership Facilities: Oak Ridge National Laboratory, Argonne National Laboratory, and National Energy Research Scientific Computing Center. The project will be a collaborative effort between computational plasma physicists and applied mathematicians at Tech-X Corporation, applied mathematicians at Front Range Scientific Computations, Inc. (who are collaborators on the HYPRE project), and other computational plasma physicists involved with the CEMM project.
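A minimal petsc4py sketch of the solver configuration described above (a Krylov method preconditioned by HYPRE's BoomerAMG) is given below for a toy 1-D Laplacian; it assumes PETSc was built with HYPRE support, and the matrix, sizes, and tolerances are placeholders, not NIMROD operators.

```python
# Hedged sketch: GMRES with a HYPRE BoomerAMG multigrid preconditioner via
# petsc4py, applied to a toy 1-D Laplacian. Requires PETSc configured with
# --with-hypre; sizes and tolerances are illustrative only.
from petsc4py import PETSc

n = 1000
A = PETSc.Mat().createAIJ([n, n], nnz=3)        # tridiagonal preallocation
rstart, rend = A.getOwnershipRange()
for i in range(rstart, rend):
    if i > 0:
        A.setValue(i, i - 1, -1.0)
    A.setValue(i, i, 2.0)
    if i < n - 1:
        A.setValue(i, i + 1, -1.0)
A.assemble()

b = A.createVecRight()
b.set(1.0)
x = A.createVecLeft()

ksp = PETSc.KSP().create()
ksp.setOperators(A)
ksp.setType("gmres")
ksp.getPC().setType("hypre")                    # multigrid preconditioner
PETSc.Options()["pc_hypre_type"] = "boomeramg"  # explicit; boomeramg is the default hypre type
ksp.setTolerances(rtol=1e-8)
ksp.setFromOptions()
ksp.solve(b, x)
print("iterations:", ksp.getIterationNumber())
```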
Water resources scientific information center
Cardin, C. William; Campbell, J.T.
1986-01-01
The Water Resources Scientific Information Center (WRSIC) acquires, abstracts and indexes the major water resources related literature of the world, and makes information available to the water resources community and the public. A component of the Water Resources Division of the US Geological Survey, the Center maintains a searchable computerized bibliographic data base, and publishes a monthly journal of abstracts. Through its services, the Center is able to provide reliable scientific and technical information about the most recent water resources developments, as well as long-term trends and changes. WRSIC was established in 1966 by the Secretary of the Interior to further the objectives of the Water Resources Research Act of 1964--legislation that encouraged research in water resources and the prevention of needless duplication of research efforts. It was determined that WRSIC should be the national center for information on water resources, covering research reports, scientific journals, and other water resources literature of the world. WRSIC would evaluate all water resources literature, catalog selected articles, and make the information available in publications or by computer access. In this way WRSIC would increase the availability and awareness of water related scientific and technical information. (Lantz-PTT)
Analysis of the flight dynamics of the Solar Maximum Mission (SMM) off-sun scientific pointing
NASA Technical Reports Server (NTRS)
Pitone, D. S.; Klein, J. R.
1989-01-01
Algorithms are presented which were created and implemented by the Goddard Space Flight Center's (GSFC's) Solar Maximum Mission (SMM) attitude operations team to support large-angle spacecraft pointing at scientific objectives. The mission objective of the post-repair SMM satellite was to study solar phenomena. However, because the scientific instruments, such as the Coronagraph/Polarimeter (CP) and the Hard X ray Burst Spectrometer (HXRBS), were able to view objects other than the Sun, attitude operations support for attitude pointing at large angles from the nominal solar-pointing attitudes was required. Subsequently, attitude support for SMM was provided for scientific objectives such as Comet Halley, Supernova 1987A, Cygnus X-1, and the Crab Nebula. In addition, the analysis was extended to include the reverse problem, computing the right ascension and declination of a body given the off-Sun angles. This analysis led to the computation of the orbits of seven new solar comets seen in the field-of-view (FOV) of the CP. The activities necessary to meet these large-angle attitude-pointing sequences, such as slew sequence planning, viewing-period prediction, and tracking-bias computation are described. Analysis is presented for the computation of maneuvers and pointing parameters relative to the SMM-unique, Sun-centered reference frame. Finally, science data and independent attitude solutions are used to evaluate the large-angle pointing performance.
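The off-Sun pointing geometry referred to here boils down to an angular-separation calculation on the celestial sphere; the sketch below illustrates that single step with the spherical law of cosines, using made-up coordinates rather than mission values, and is not the SMM operational algorithm.

```python
# Hedged illustration of the off-Sun angle mentioned above: the angular
# separation between the Sun and a celestial target, given right ascension and
# declination for both. Coordinates are placeholders, not mission values.
import math

def angular_separation_deg(ra1_deg, dec1_deg, ra2_deg, dec2_deg):
    """Great-circle separation via the spherical law of cosines."""
    ra1, dec1, ra2, dec2 = map(math.radians, (ra1_deg, dec1_deg, ra2_deg, dec2_deg))
    cos_sep = (math.sin(dec1) * math.sin(dec2)
               + math.cos(dec1) * math.cos(dec2) * math.cos(ra1 - ra2))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_sep))))

if __name__ == "__main__":
    # Illustrative only: Sun near RA 150 deg, Dec +12 deg; target at RA 83.8 deg, Dec +22 deg.
    off_sun = angular_separation_deg(150.0, 12.0, 83.8, 22.0)
    print(f"off-Sun angle: {off_sun:.1f} degrees")
```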
Analysis of the flight dynamics of the Solar Maximum Mission (SMM) off-sun scientific pointing
NASA Technical Reports Server (NTRS)
Pitone, D. S.; Klein, J. R.; Twambly, B. J.
1990-01-01
Algorithms are presented which were created and implemented by the Goddard Space Flight Center's (GSFC's) Solar Maximum Mission (SMM) attitude operations team to support large-angle spacecraft pointing at scientific objectives. The mission objective of the post-repair SMM satellite was to study solar phenomena. However, because the scientific instruments, such as the Coronagraph/Polarimeter (CP) and the Hard X-ray Burst Spectrometer (HXRBS), were able to view objects other than the Sun, attitude operations support for attitude pointing at large angles from the nominal solar-pointing attitudes was required. Subsequently, attitude support for SMM was provided for scientific objectives such as Comet Halley, Supernova 1987A, Cygnus X-1, and the Crab Nebula. In addition, the analysis was extended to include the reverse problem, computing the right ascension and declination of a body given the off-Sun angles. This analysis led to the computation of the orbits of seven new solar comets seen in the field-of-view (FOV) of the CP. The activities necessary to meet these large-angle attitude-pointing sequences, such as slew sequence planning, viewing-period prediction, and tracking-bias computation are described. Analysis is presented for the computation of maneuvers and pointing parameters relative to the SMM-unique, Sun-centered reference frame. Finally, science data and independent attitude solutions are used to evaluate the large-angle pointing performance.
Scientific computations section monthly report, November 1993
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buckner, M.R.
1993-12-30
This progress report from the Savannah River Technology Center contains abstracts of papers from the computational modeling, applied statistics, applied physics, experimental thermal hydraulics, and packaging and transportation groups. Specific topics covered include: engineering modeling and process simulation, criticality methods and analysis, and plutonium disposition.
Final Technical Report - Center for Technology for Advanced Scientific Component Software (TASCS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sussman, Alan
2014-10-21
This is a final technical report for the University of Maryland work in the SciDAC Center for Technology for Advanced Scientific Component Software (TASCS). The Maryland work focused on software tools for coupling parallel software components built using the Common Component Architecture (CCA) APIs. Those tools are based on the Maryland InterComm software framework that has been used in multiple computational science applications to build large-scale simulations of complex physical systems that employ multiple separately developed codes.
Educational and Commercial Utilization of a Chemical Information Center, Four Year Summary.
ERIC Educational Resources Information Center
Williams, Martha E.; And Others
The major objective of the IITRI Computer Search Center is to educate and link industry, academia, and government institutions to chemical and other scientific information systems and sources. The Center was developed to meet this objective and is in full operation providing services to users from a variety of machine-readable data bases with…
Mass storage system experiences and future needs at the National Center for Atmospheric Research
NASA Technical Reports Server (NTRS)
Olear, Bernard T.
1991-01-01
A summary and viewgraphs of a discussion presented at the National Space Science Data Center (NSSDC) Mass Storage Workshop are included. Some of the experiences of the Scientific Computing Division at the National Center for Atmospheric Research (NCAR) dealing with the 'data problem' are discussed. A brief history and a development of some basic mass storage system (MSS) principles are given. An attempt is made to show how these principles apply to the integration of various components into NCAR's MSS. MSS needs for future computing environments are discussed.
NASA Technical Reports Server (NTRS)
Oliger, Joseph
1992-01-01
The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on 6 June 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under a cooperative agreement with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. A flexible scientific staff is provided through a university faculty visitor program, a post doctoral program, and a student visitor program. Not only does this provide appropriate expertise but it also introduces scientists outside of NASA to NASA problems. A small group of core RIACS staff provides continuity and interacts with an ARC technical monitor and scientific advisory group to determine the RIACS mission. RIACS activities are reviewed and monitored by a USRA advisory council and ARC technical monitor. Research at RIACS is currently being done in the following areas: Parallel Computing; Advanced Methods for Scientific Computing; Learning Systems; High Performance Networks and Technology; Graphics, Visualization, and Virtual Environments.
3-D Imaging In Virtual Environment: A Scientific Clinical and Teaching Tool
NASA Technical Reports Server (NTRS)
Ross, Muriel D.; DeVincenzi, Donald L. (Technical Monitor)
1996-01-01
The advent of powerful graphics workstations and computers has led to the advancement of scientific knowledge through three-dimensional (3-D) reconstruction and imaging of biological cells and tissues. The Biocomputation Center at NASA Ames Research Center pioneered the effort to produce an entirely computerized method for reconstruction of objects from serial sections studied in a transmission electron microscope (TEM). The software developed, ROSS (Reconstruction of Serial Sections), is now being distributed to users across the United States through Space Act Agreements. The software is used in widely disparate fields such as geology, botany, biology, and medicine. In the Biocomputation Center, ROSS serves as the basis for development of virtual environment technologies for scientific and medical use. This report will describe the Virtual Surgery Workstation Project that is ongoing with clinicians at Stanford University Medical Center, and the role of the Visible Human data in the project.
NASA Langley scientific and technical information output: 1994, volume 1
NASA Technical Reports Server (NTRS)
Phillips, Marilou S. (Compiler); Stewart, Susan H. (Compiler)
1995-01-01
This document is a compilation of the scientific and technical information that the Langley Research Center has produced during the calendar year 1994. Included are citations for Formal Reports, High-Numbered Conference Publications, High-Numbered Technical Memorandums, Contractor Reports, Journal Articles and Other Publications, Meeting Presentations, Computer Programs, Tech Briefs, and Patents.
NASA Langley Scientific and Technical Information Output: 1996
NASA Technical Reports Server (NTRS)
Stewart, Susan H. (Compiler); Phillips, Marilou S. (Compiler)
1997-01-01
This document is a compilation of the scientific and technical information that the Langley Research Center has produced during the calendar year 1996. Included are citations for Formal Reports, High-Numbered Conference Publications, High-Numbered Technical Memorandums, Contractor Reports, Journal Articles and Other Publications, Meeting Presentations, Technical Talks, Computer Programs, Tech Briefs, and Patents.
Activities of the Research Institute for Advanced Computer Science
NASA Technical Reports Server (NTRS)
Oliger, Joseph
1994-01-01
The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under contract with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. Research at RIACS is currently being done in the following areas: (1) parallel computing; (2) advanced methods for scientific computing; (3) high performance networks; and (4) learning systems. RIACS technical reports are usually preprints of manuscripts that have been submitted to research journals or conference proceedings. A list of these reports for the period January 1, 1994 through December 31, 1994 is in the Reports and Abstracts section of this report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Svetlana Shasharina
The goal of the Center for Technology for Advanced Scientific Component Software is to fundamentally change the way scientific software is developed and used by bringing component-based software development technologies to high-performance scientific and engineering computing. The role of Tech-X's work in the TASCS project is to provide outreach to accelerator physics and fusion applications by introducing TASCS tools into applications, testing the tools in those applications, and modifying the tools to be more usable.
NASA Astrophysics Data System (ADS)
Schulthess, Thomas C.
2013-03-01
The continued thousand-fold improvement in sustained application performance per decade on modern supercomputers keeps opening new opportunities for scientific simulations. But supercomputers have become very complex machines, built with thousands or tens of thousands of complex nodes consisting of multiple CPU cores or, most recently, a combination of CPU and GPU processors. Efficient simulations on such high-end computing systems require tailored algorithms that optimally map numerical methods to particular architectures. These intricacies will be illustrated with simulations of strongly correlated electron systems, where the development of quantum cluster methods, Monte Carlo techniques, as well as their optimal implementation by means of algorithms with improved data locality and high arithmetic density have gone hand in hand with evolving computer architectures. The present work would not have been possible without continued access to computing resources at the National Center for Computational Science of Oak Ridge National Laboratory, which is funded by the Facilities Division of the Office of Advanced Scientific Computing Research, and the Swiss National Supercomputing Center (CSCS) that is funded by ETH Zurich.
NASA Langley Scientific and Technical Information Output, 1995. Volume 1
NASA Technical Reports Server (NTRS)
Stewart, Susan H. (Compiler); Phillips, Marilou S. (Compiler)
1996-01-01
This document is a compilation of the scientific and technical information that the Langley Research Center has produced during the calendar year 1995. Included are citations for formal reports, high-numbered conference publications, high-numbered technical memorandums, contractor reports, journal articles and other publications, meeting presentations, technical talks, computer programs, tech briefs, and patents.
Data Processing Center of Radioastron Project: 3 years of operation.
NASA Astrophysics Data System (ADS)
Shatskaya, Marina
The ASC Data Processing Center (DPC) of the Radioastron Project is a fail-safe, centralized complex of interconnected software/hardware components together with organizational procedures. The tasks facing the scientific data processing center are organization of service information exchange, collection of scientific data, storage of all scientific data, and science-oriented data processing. The DPC takes part in the information exchange with two tracking stations in Pushchino (Russia) and Green Bank (USA), about 30 ground telescopes, the ballistic center, tracking headquarters, and the session scheduling center. Enormous flows of information arrive at the Astro Space Center. To handle these data volumes, we developed a specialized network infrastructure, Internet channels, and storage. The computer complex has been designed at the Astro Space Center (ASC) of the Lebedev Physical Institute and includes: 800 TB of on-line storage; a 2000 TB hard drive archive; a backup system on magnetic tapes (2000 TB); 24 TB of redundant storage at the Pushchino Radio Astronomy Observatory; Web and FTP servers; and DPC management and data transmission networks. The structure and functions of the ASC Data Processing Center are fully adequate to the data processing requirements of the Radioastron Mission and have been successfully confirmed during the Fringe Search, the Early Science Program, and the first year of the Key Science Program.
Institute for scientific computing research;fiscal year 1999 annual report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keyes, D
2000-03-28
Large-scale scientific computation, and all of the disciplines that support it and help to validate it, have been placed at the focus of Lawrence Livermore National Laboratory by the Accelerated Strategic Computing Initiative (ASCI). The Laboratory operates the computer with the highest peak performance in the world and has undertaken some of the largest and most compute-intensive simulations ever performed. Computers at the architectural extremes, however, are notoriously difficult to use efficiently. Even such successes as the Laboratory's two Bell Prizes awarded in November 1999 only emphasize the need for much better ways of interacting with the results of large-scale simulations. Advances in scientific computing research have, therefore, never been more vital to the core missions of the Laboratory than at present. Computational science is evolving so rapidly along every one of its research fronts that to remain on the leading edge, the Laboratory must engage researchers at many academic centers of excellence. In FY 1999, the Institute for Scientific Computing Research (ISCR) has expanded the Laboratory's bridge to the academic community in the form of collaborative subcontracts, visiting faculty, student internships, a workshop, and a very active seminar series. ISCR research participants are integrated almost seamlessly with the Laboratory's Center for Applied Scientific Computing (CASC), which, in turn, addresses computational challenges arising throughout the Laboratory. Administratively, the ISCR flourishes under the Laboratory's University Relations Program (URP). Together with the other four Institutes of the URP, it must navigate a course that allows the Laboratory to benefit from academic exchanges while preserving national security. Although FY 1999 brought more than its share of challenges to the operation of an academic-like research enterprise within the context of a national security laboratory, the results declare the challenges well met and well worth the continued effort. A change of administration for the ISCR occurred during FY 1999. Acting Director John Fitzgerald retired from LLNL in August after 35 years of service, including the last two at the helm of the ISCR. David Keyes, who has been a regular visitor in conjunction with ASCI scalable algorithms research since October 1997, overlapped with John for three months and serves half-time as the new Acting Director.
Haidar, Azzam; Jagode, Heike; Vaccaro, Phil; ...
2018-03-22
The emergence of power efficiency as a primary constraint in processor and system design poses new challenges concerning power and energy awareness for numerical libraries and scientific applications. Power consumption also plays a major role in the design of data centers, which may house petascale or exascale-level computing systems. At these extreme scales, understanding and improving the energy efficiency of numerical libraries and their related applications becomes a crucial part of the successful implementation and operation of the computing system. In this paper, we study and investigate the practice of controlling a compute system's power usage, and we explore how different power caps affect the performance of numerical algorithms with different computational intensities. Further, we determine the impact, in terms of performance and energy usage, that these caps have on a system running scientific applications. This analysis will enable us to characterize the types of algorithms that benefit most from these power management schemes. Our experiments are performed using a set of representative kernels and several popular scientific benchmarks. Lastly, we quantify a number of power and performance measurements and draw observations and conclusions that can be viewed as a roadmap to achieving energy efficiency in the design and execution of scientific algorithms.
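The study described above compares kernels of different computational intensity under several power caps. The following is a minimal sketch of that kind of experiment, assuming a Linux host that exposes the intel-rapl powercap sysfs files (the paths and the cap values below are assumptions and may differ per machine; writing the cap requires root). It is an illustration of the measurement loop, not the paper's actual harness.

```python
"""Sketch: energy-to-solution of a compute-bound vs. a memory-bound kernel
under a CPU power cap.  Assumes the Linux intel-rapl powercap interface;
paths and cap values are illustrative assumptions."""
import time
import numpy as np

RAPL = "/sys/class/powercap/intel-rapl:0"          # package 0 (assumed path)
ENERGY_FILE = f"{RAPL}/energy_uj"                  # cumulative energy, microjoules
CAP_FILE = f"{RAPL}/constraint_0_power_limit_uw"   # long-term power cap, microwatts


def read_energy_uj() -> int:
    with open(ENERGY_FILE) as f:
        return int(f.read())


def set_power_cap_watts(watts: float) -> None:
    with open(CAP_FILE, "w") as f:                 # requires root privileges
        f.write(str(int(watts * 1e6)))


def measure(kernel) -> tuple:
    """Return (seconds, joules) for one kernel invocation."""
    e0, t0 = read_energy_uj(), time.perf_counter()
    kernel()
    t1, e1 = time.perf_counter(), read_energy_uj()
    return t1 - t0, (e1 - e0) / 1e6


n = 4096
a, b = np.random.rand(n, n), np.random.rand(n, n)
x = np.random.rand(200_000_000)

compute_bound = lambda: a @ b            # high arithmetic intensity (DGEMM-like)
memory_bound = lambda: x * 1.0001 + 2.0  # streaming, bandwidth-limited

for cap in (150, 100, 60):               # illustrative caps in watts
    set_power_cap_watts(cap)
    for name, kern in (("dgemm", compute_bound), ("stream", memory_bound)):
        secs, joules = measure(kern)
        print(f"cap={cap:4d} W  {name:6s}  {secs:6.2f} s  {joules:8.1f} J")
```

The expectation illustrated here is the one the abstract describes: the memory-bound kernel tends to lose little performance under a tighter cap, while the compute-bound kernel slows in rough proportion to the cap.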
NASA Technical Reports Server (NTRS)
Oliger, Joseph
1993-01-01
The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on 6 June 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under contract with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. A flexible scientific staff is provided through a university faculty visitor program, a postdoctoral program, and a student visitor program. Not only does this provide appropriate expertise but it also introduces scientists outside of NASA to NASA problems. A small group of core RIACS staff provides continuity and interacts with an ARC technical monitor and scientific advisory group to determine the RIACS mission. RIACS activities are reviewed and monitored by a USRA advisory council and ARC technical monitor. Research at RIACS is currently being done in the following areas: Parallel Computing, Advanced Methods for Scientific Computing, High Performance Networks and Technology, and Learning Systems. Parallel compiler techniques, adaptive numerical methods for flows in complicated geometries, and optimization were identified as important problems to investigate for ARC's involvement in the Computational Grand Challenges of the next decade.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beckman, P.; Martin, D.; Drugan, C.
2010-11-23
This year the Argonne Leadership Computing Facility (ALCF) delivered nearly 900 million core hours of science. The research conducted at their leadership-class facility touched our lives in both minute and massive ways - whether it was studying the catalytic properties of gold nanoparticles, predicting protein structures, or unearthing the secrets of exploding stars. The authors remained true to their vision to act as the forefront computational center in extending science frontiers by solving pressing problems for our nation. Our success in this endeavor was due mainly to the Department of Energy's (DOE) INCITE (Innovative and Novel Computational Impact on Theory and Experiment) program. The program awards significant amounts of computing time to computationally intensive, unclassified research projects that can make high-impact scientific advances. This year, DOE allocated 400 million hours of time to 28 research projects at the ALCF. Scientists from around the world conducted the research, representing such esteemed institutions as the Princeton Plasma Physics Laboratory, National Institute of Standards and Technology, and European Center for Research and Advanced Training in Scientific Computation. Argonne also provided Director's Discretionary allocations for research challenges, addressing such issues as reducing aerodynamic noise, critical for next-generation 'green' energy systems. Intrepid - the ALCF's 557-teraflops IBM Blue Gene/P supercomputer - enabled astounding scientific solutions and discoveries. Intrepid went into full production five months ahead of schedule. As a result, the ALCF nearly doubled the days of production computing available to the DOE Office of Science, INCITE awardees, and Argonne projects. One of the fastest supercomputers in the world for open science, the energy-efficient system uses about one-third as much electricity as a machine of comparable size built with more conventional parts. In October 2009, President Barack Obama recognized the excellence of the entire Blue Gene series by awarding it the National Medal of Technology and Innovation. Other noteworthy achievements included the ALCF's collaboration with the National Energy Research Scientific Computing Center (NERSC) to examine cloud computing as a potential new computing paradigm for scientists. Named Magellan, the DOE-funded initiative will explore which science application programming models work well within the cloud, as well as evaluate the challenges that come with this new paradigm. The ALCF obtained approval for its next-generation machine, a 10-petaflops system to be delivered in 2012. This system will allow us to resolve ever more pressing problems, even more expeditiously through breakthrough science in the years to come.
Scientific Computing Strategic Plan for the Idaho National Laboratory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whiting, Eric Todd
Scientific computing is a critical foundation of modern science. Without innovations in the field of computational science, the essential missions of the Department of Energy (DOE) would go unrealized. Taking a leadership role in such innovations is Idaho National Laboratory's (INL's) challenge and charge, and is central to INL's ongoing success. Computing is an essential part of INL's future. DOE science and technology missions rely firmly on computing capabilities in various forms. Modeling and simulation, fueled by innovations in computational science and validated through experiment, are a critical foundation of science and engineering. Big data analytics from an increasing number of widely varied sources is opening new windows of insight and discovery. Computing is a critical tool in education, science, engineering, and experiments. Advanced computing capabilities in the form of people, tools, computers, and facilities will position INL competitively to deliver results and solutions on important national science and engineering challenges. A computing strategy must include much more than simply computers. The foundational enabling component of computing at many DOE national laboratories is the combination of a showcase-like data center facility coupled with a very capable supercomputer. In addition, network connectivity, disk storage systems, and visualization hardware are critical and generally tightly coupled to the computer system and co-located in the same facility. The existence of these resources in a single data center facility opens the doors to many opportunities that would not otherwise be possible.
ISCR Annual Report: Fiscal Year 2004
DOE Office of Scientific and Technical Information (OSTI.GOV)
McGraw, J R
2005-03-03
Large-scale scientific computation and all of the disciplines that support and help to validate it have been placed at the focus of Lawrence Livermore National Laboratory (LLNL) by the Advanced Simulation and Computing (ASC) program of the National Nuclear Security Administration (NNSA) and the Scientific Discovery through Advanced Computing (SciDAC) initiative of the Office of Science of the Department of Energy (DOE). The maturation of computational simulation as a tool of scientific and engineering research is underscored in the November 2004 statement of the Secretary of Energy that, ''high performance computing is the backbone of the nation's science and technology enterprise''. LLNL operates several of the world's most powerful computers--including today's single most powerful--and has undertaken some of the largest and most compute-intensive simulations ever performed. Ultrascale simulation has been identified as one of the highest priorities in DOE's facilities planning for the next two decades. However, computers at architectural extremes are notoriously difficult to use efficiently. Furthermore, each successful terascale simulation only points out the need for much better ways of interacting with the resulting avalanche of data. Advances in scientific computing research have, therefore, never been more vital to LLNL's core missions than at present. Computational science is evolving so rapidly along every one of its research fronts that to remain on the leading edge, LLNL must engage researchers at many academic centers of excellence. In Fiscal Year 2004, the Institute for Scientific Computing Research (ISCR) served as one of LLNL's main bridges to the academic community with a program of collaborative subcontracts, visiting faculty, student internships, workshops, and an active seminar series. The ISCR identifies researchers from the academic community for computer science and computational science collaborations with LLNL and hosts them for short- and long-term visits with the aim of encouraging long-term academic research agendas that address LLNL's research priorities. Through such collaborations, ideas and software flow in both directions, and LLNL cultivates its future workforce. The Institute strives to be LLNL's ''eyes and ears'' in the computer and information sciences, keeping the Laboratory aware of and connected to important external advances. It also attempts to be the ''feet and hands'' that carry those advances into the Laboratory and incorporate them into practice. ISCR research participants are integrated into LLNL's Computing and Applied Research (CAR) Department, especially into its Center for Applied Scientific Computing (CASC). In turn, these organizations address computational challenges arising throughout the rest of the Laboratory. Administratively, the ISCR flourishes under LLNL's University Relations Program (URP). Together with the other five institutes of the URP, it navigates a course that allows LLNL to benefit from academic exchanges while preserving national security. While it is difficult to operate an academic-like research enterprise within the context of a national security laboratory, the results declare the challenges well met and worth the continued effort.
Four-Year Summary, Educational and Commercial Utilization of a Chemical Information Center, Part II.
ERIC Educational Resources Information Center
Schipma, Peter B., Ed.
The major objective of the Illinois Institute of Technology Research Institute (IITRI) Computer Search Center (CSC) is to educate and link industry, academia, and government institutions to chemical and other scientific information systems and sources. The CSC is in full operation providing services to users from a variety of machine-readable…
Four-Year Summary, Educational and Commercial Utilization of a Chemical Information Center. Part I.
ERIC Educational Resources Information Center
Schipma, Peter B., Ed.
The major objective of the Illinois Institute of Technology (IIT) Computer Search Center (CSC) is to educate and link industry, academia, and government institutions to chemical and other scientific information systems and sources. The CSC is in full operation providing services to users from a variety of machine-readable data bases with minimal…
Multicore: Fallout from a Computing Evolution
Yelick, Kathy [Director, NERSC]
2017-12-09
July 22, 2008 Berkeley Lab lecture: Parallel computing used to be reserved for big science and engineering projects, but in two years that's all changed. Even laptops and hand-helds use parallel processors. Unfortunately, the software hasn't kept pace. Kathy Yelick, Director of the National Energy Research Scientific Computing Center at Berkeley Lab, describes the resulting chaos and the computing community's efforts to develop exciting applications that take advantage of tens or hundreds of processors on a single chip.
Postdoctoral Fellow | Center for Cancer Research
The Neuro-Oncology Branch (NOB), Center for Cancer Research (CCR), National Cancer Institute (NCI) of the National Institutes of Health (NIH) is seeking outstanding postdoctoral candidates interested in studying metabolic and cell signaling pathways in the context of brain cancers through construction of computational models amenable to formal computational analysis and simulation. The ability to closely collaborate with the modern metabolomics center developed at CCR provides a unique opportunity for a postdoctoral candidate with a strong theoretical background and interest in demonstrating the incredible potential of computational approaches to solve problems from scientific disciplines and improve lives. The candidate will be given the opportunity to both construct data-driven models, as well as biologically validate the models by demonstrating the ability to predict the effects of altering tumor metabolism in laboratory and clinical settings.
Computer networking at SLR stations
NASA Technical Reports Server (NTRS)
Novotny, Antonin
1993-01-01
There are several existing communication methods to deliver data from the satellite laser ranging (SLR) station to the SLR data center and back: telephone modem, telex, and computer networks. The SLR scientific community has been exploiting mainly INTERNET, BITNET/EARN, and SPAN. A total of 56 countries are connected to INTERNET and the number of nodes is growing exponentially. The computer networks mentioned above and others are connected through the e-mail protocol. The scientific progress of SLR requires an increase in communication speed and in the amount of transmitted data. The TOPEX/POSEIDON test campaign required delivery of Quick Look data (1.7 kB/pass) from an SLR site to the SLR data center within 8 hours and full rate data (up to 500 kB/pass) within 24 hours. We developed networking for the remote SLR station in Helwan, Egypt. The reliable scheme for data delivery consists of compression of the MERIT2 format (up to 89 percent), encoding to ASCII files, and e-mail sending from the SLR station, followed by e-mail receiving, decoding, and decompression at the center. We propose to use the ZIP method for compression/decompression and the UUCODE method for ASCII encoding/decoding. This method will be useful for stations connected via telephone modems or commercial networks. Electronic delivery could solve the problem of late receipt of the full rate (FR) data by the SLR data center.
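The delivery scheme above is essentially compress, ASCII-encode, and mail. The sketch below shows that pipeline with the Python standard library; zipfile and the MIME layer's base64 encoding stand in for the ZIP and UUCODE tools named in the abstract, and the file name, SMTP host, and addresses are placeholders, not real project endpoints.

```python
"""Sketch of the pass-data delivery scheme: compress, ASCII-encode, e-mail."""
import smtplib
import zipfile
from email.message import EmailMessage
from pathlib import Path


def compress(pass_file: Path) -> Path:
    """ZIP-compress one MERIT2-format pass file (often dramatically smaller)."""
    archive = pass_file.with_suffix(".zip")
    with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.write(pass_file, arcname=pass_file.name)
    return archive


def send_to_center(archive: Path, smtp_host: str, sender: str, recipient: str) -> None:
    """Mail the compressed pass; the MIME layer encodes it to plain ASCII for transit."""
    msg = EmailMessage()
    msg["Subject"] = f"SLR quick-look data: {archive.name}"
    msg["From"], msg["To"] = sender, recipient
    msg.add_attachment(archive.read_bytes(),
                       maintype="application", subtype="zip",
                       filename=archive.name)
    with smtplib.SMTP(smtp_host) as smtp:
        smtp.send_message(msg)


if __name__ == "__main__":
    zipped = compress(Path("pass_qldata.m2"))                 # hypothetical pass file
    send_to_center(zipped, "mail.example.org",                # placeholder SMTP relay
                   "slr-station@example.org", "datacenter@example.org")
```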
Computer networking at SLR stations
NASA Astrophysics Data System (ADS)
Novotny, Antonin
1993-06-01
There are several existing communication methods to deliver data from the satellite laser ranging (SLR) station to the SLR data center and back: telephone modem, telex, and computer networks. The SLR scientific community has been exploiting mainly INTERNET, BITNET/EARN, and SPAN. A total of 56 countries are connected to INTERNET and the number of nodes is growing exponentially. The computer networks mentioned above and others are connected through the e-mail protocol. The scientific progress of SLR requires an increase in communication speed and in the amount of transmitted data. The TOPEX/POSEIDON test campaign required delivery of Quick Look data (1.7 kB/pass) from an SLR site to the SLR data center within 8 hours and full rate data (up to 500 kB/pass) within 24 hours. We developed networking for the remote SLR station in Helwan, Egypt. The reliable scheme for data delivery consists of compression of the MERIT2 format (up to 89 percent), encoding to ASCII files, and e-mail sending from the SLR station, followed by e-mail receiving, decoding, and decompression at the center. We propose to use the ZIP method for compression/decompression and the UUCODE method for ASCII encoding/decoding. This method will be useful for stations connected via telephone modems or commercial networks. Electronic delivery could solve the problem of late receipt of the full rate (FR) data by the SLR data center.
Accelerating Science with the NERSC Burst Buffer Early User Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhimji, Wahid; Bard, Debbie; Romanus, Melissa
NVRAM-based Burst Buffers are an important part of the emerging HPC storage landscape. The National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory recently installed one of the first Burst Buffer systems as part of its new Cori supercomputer, collaborating with Cray on the development of the DataWarp software. NERSC has a diverse user base comprised of over 6500 users in 700 different projects spanning a wide variety of scientific computing applications. The use-cases of the Burst Buffer at NERSC are therefore also considerable and diverse. We describe here performance measurements and lessons learned from the Burst Buffer Early User Program at NERSC, which selected a number of research projects to gain early access to the Burst Buffer and exercise its capability to enable new scientific advancements. To the best of our knowledge this is the first time a Burst Buffer has been stressed at scale by diverse, real user workloads and therefore these lessons will be of considerable benefit to shaping the developing use of Burst Buffers at HPC centers.
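A common early-user exercise of this kind is comparing checkpoint bandwidth on the burst buffer against the parallel file system. The sketch below assumes the burst-buffer mount is exposed through an environment variable (DW_JOB_STRIPED on Cray DataWarp systems) and that SCRATCH points at the parallel file system; both are assumptions for illustration, not NERSC's benchmark code.

```python
"""Sketch: checkpoint write bandwidth on the burst buffer vs. scratch."""
import os
import time


def write_checkpoint(directory: str, size_mb: int = 1024) -> float:
    """Write size_mb of data to `directory` and return achieved MB/s."""
    path = os.path.join(directory, "ckpt.bin")
    block = os.urandom(1 << 20)                 # 1 MiB block of random data
    t0 = time.perf_counter()
    with open(path, "wb") as f:
        for _ in range(size_mb):
            f.write(block)
        f.flush()
        os.fsync(f.fileno())                    # include flush-to-device time
    elapsed = time.perf_counter() - t0
    os.remove(path)
    return size_mb / elapsed


for label, env in (("burst buffer", "DW_JOB_STRIPED"), ("scratch", "SCRATCH")):
    target = os.environ.get(env)
    if target:
        print(f"{label:13s} {write_checkpoint(target):8.1f} MB/s")
    else:
        print(f"{label:13s} not available in this job")
```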
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bailey, David H.
The NAS Parallel Benchmarks (NPB) are a suite of parallel computer performance benchmarks. They were originally developed at the NASA Ames Research Center in 1991 to assess high-end parallel supercomputers. Although they are no longer used as widely as they once were for comparing high-end system performance, they continue to be studied and analyzed a great deal in the high-performance computing community. The acronym 'NAS' originally stood for the Numerical Aeronautical Simulation Program at NASA Ames. The name of this organization was subsequently changed to the Numerical Aerospace Simulation Program, and more recently to the NASA Advanced Supercomputing Center, although the acronym remains 'NAS.' The developers of the original NPB suite were David H. Bailey, Eric Barszcz, John Barton, David Browning, Russell Carter, Leo Dagum, Rod Fatoohi, Samuel Fineberg, Paul Frederickson, Thomas Lasinski, Rob Schreiber, Horst Simon, V. Venkatakrishnan and Sisira Weeratunga. The original NAS Parallel Benchmarks consisted of eight individual benchmark problems, each of which focused on some aspect of scientific computing. The principal focus was in computational aerophysics, although most of these benchmarks have much broader relevance, since in a much larger sense they are typical of many real-world scientific computing applications. The NPB suite grew out of the need for a more rational procedure to select new supercomputers for acquisition by NASA. The emergence of commercially available highly parallel computer systems in the late 1980s offered an attractive alternative to parallel vector supercomputers that had been the mainstay of high-end scientific computing. However, the introduction of highly parallel systems was accompanied by a regrettable level of hype, not only on the part of the commercial vendors but even, in some cases, by scientists using the systems. As a result, it was difficult to discern whether the new systems offered any fundamental performance advantage over vector supercomputers, and, if so, which of the parallel offerings would be most useful in real-world scientific computation. In part to draw attention to some of the performance reporting abuses prevalent at the time, the present author wrote a humorous essay 'Twelve Ways to Fool the Masses,' which described in a light-hearted way a number of the questionable ways in which both vendor marketing people and scientists were inflating and distorting their performance results. All of this underscored the need for an objective and scientifically defensible measure to compare performance on these systems.
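To give a flavor of what one of the eight benchmark problems looks like, the sketch below is a simplified kernel in the spirit of NPB's "embarrassingly parallel" (EP) problem: generate uniform random pairs, keep those inside the unit circle, and turn them into Gaussian deviates with the Marsaglia polar method. It is only an illustration, not the official NPB code or its verification scheme.

```python
"""Simplified EP-style kernel: Gaussian deviates via the Marsaglia polar method."""
import numpy as np


def ep_kernel(n_pairs: int, seed: int = 271828183) -> tuple:
    rng = np.random.default_rng(seed)
    x = 2.0 * rng.random(n_pairs) - 1.0
    y = 2.0 * rng.random(n_pairs) - 1.0
    t = x * x + y * y
    accept = (t > 0.0) & (t <= 1.0)              # keep points inside the unit circle
    x, y, t = x[accept], y[accept], t[accept]
    factor = np.sqrt(-2.0 * np.log(t) / t)       # Marsaglia polar transform
    gx, gy = x * factor, y * factor              # independent Gaussian deviates
    return gx.sum(), gy.sum()                    # simple checksums of the output


sx, sy = ep_kernel(10_000_000)
print(f"sums of Gaussian deviates: {sx:.5f}, {sy:.5f}")
```

Because each pair is independent, the work can be split across processors with essentially no communication, which is why EP serves as a baseline for peak arithmetic throughput.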
The Magellan Final Report on Cloud Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
,; Coghlan, Susan; Yelick, Katherine
The goal of Magellan, a project funded through the U.S. Department of Energy (DOE) Office of Advanced Scientific Computing Research (ASCR), was to investigate the potential role of cloud computing in addressing the computing needs for the DOE Office of Science (SC), particularly related to serving the needs of mid-range computing and future data-intensive computing workloads. A set of research questions was formed to probe various aspects of cloud computing in terms of performance, usability, and cost. To address these questions, a distributed testbed infrastructure was deployed at the Argonne Leadership Computing Facility (ALCF) and the National Energy Research Scientific Computing Center (NERSC). The testbed was designed to be flexible and capable enough to explore a variety of computing models and hardware design points in order to understand the impact for various scientific applications. During the project, the testbed also served as a valuable resource to application scientists. Applications from a diverse set of projects such as MG-RAST (a metagenomics analysis server), the Joint Genome Institute, the STAR experiment at the Relativistic Heavy Ion Collider, and the Laser Interferometer Gravitational Wave Observatory (LIGO), were used by the Magellan project for benchmarking within the cloud, but the project teams were also able to accomplish important production science utilizing the Magellan cloud resources.
Multicore: Fallout From a Computing Evolution (LBNL Summer Lecture Series)
Yelick, Kathy [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). National Energy Research Scientific Computing Center (NERSC)]
2018-05-07
Summer Lecture Series 2008: Parallel computing used to be reserved for big science and engineering projects, but in two years that's all changed. Even laptops and hand-helds use parallel processors. Unfortunately, the software hasn't kept pace. Kathy Yelick, Director of the National Energy Research Scientific Computing Center at Berkeley Lab, describes the resulting chaos and the computing community's efforts to develop exciting applications that take advantage of tens or hundreds of processors on a single chip.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keyes, D E; McGraw, J R
2006-02-02
Large-scale scientific computation and all of the disciplines that support and help validate it have been placed at the focus of Lawrence Livermore National Laboratory (LLNL) by the Advanced Simulation and Computing (ASC) program of the National Nuclear Security Administration (NNSA) and the Scientific Discovery through Advanced Computing (SciDAC) initiative of the Office of Science of the Department of Energy (DOE). The maturation of simulation as a fundamental tool of scientific and engineering research is underscored in the President's Information Technology Advisory Committee (PITAC) June 2005 finding that ''computational science has become critical to scientific leadership, economic competitiveness, and national security''. LLNL operates several of the world's most powerful computers--including today's single most powerful--and has undertaken some of the largest and most compute-intensive simulations ever performed, most notably the molecular dynamics simulation that sustained more than 100 Teraflop/s and won the 2005 Gordon Bell Prize. Ultrascale simulation has been identified as one of the highest priorities in DOE's facilities planning for the next two decades. However, computers at architectural extremes are notoriously difficult to use in an efficient manner. Furthermore, each successful terascale simulation only points out the need for much better ways of interacting with the resulting avalanche of data. Advances in scientific computing research have, therefore, never been more vital to the core missions of LLNL than at present. Computational science is evolving so rapidly along every one of its research fronts that to remain on the leading edge, LLNL must engage researchers at many academic centers of excellence. In FY 2005, the Institute for Scientific Computing Research (ISCR) served as one of LLNL's main bridges to the academic community with a program of collaborative subcontracts, visiting faculty, student internships, workshops, and an active seminar series. The ISCR identifies researchers from the academic community for computer science and computational science collaborations with LLNL and hosts them for both brief and extended visits with the aim of encouraging long-term academic research agendas that address LLNL research priorities. Through these collaborations, ideas and software flow in both directions, and LLNL cultivates its future workforce. The Institute strives to be LLNL's ''eyes and ears'' in the computer and information sciences, keeping the Laboratory aware of and connected to important external advances. It also attempts to be the ''hands and feet'' that carry those advances into the Laboratory and incorporate them into practice. ISCR research participants are integrated into LLNL's Computing Applications and Research (CAR) Department, especially into its Center for Applied Scientific Computing (CASC). In turn, these organizations address computational challenges arising throughout the rest of the Laboratory. Administratively, the ISCR flourishes under LLNL's University Relations Program (URP). Together with the other four institutes of the URP, the ISCR navigates a course that allows LLNL to benefit from academic exchanges while preserving national security. While it is difficult to operate an academic-like research enterprise within the context of a national security laboratory, the results declare the challenges well met and worth the continued effort.
The pages of this annual report summarize the activities of the faculty members, postdoctoral researchers, students, and guests from industry and other laboratories who participated in LLNL's computational mission under the auspices of the ISCR during FY 2005.
SDC DOCUMENTS APPLICABLE TO STATE AND LOCAL GOVERNMENT PROBLEMS.
Public administration, Urban and regional planning, The administration of justice, Bio-medical systems, Educational systems, Computer program systems, The development and management of computer-based systems, Information retrieval, Simulation. AD numbers are provided for those documents which can be obtained from the Defense Documentation Center or the Department of Commerce’s Clearinghouse for Federal Scientific and Technical Information.
USSR Report, Kommunist, No. 13, September 1986.
1987-01-07
all-union) program for specialization of NPO and industrial enterprises and their scientific research institutes and design bureaus could play a major...machine tools with numerical programming (ChPU), processing centers, automatic machines and groups of automatic machines controlled by computers, and...automatic lines, computer- controlled groups of equipment, comprehensively automated shops and sections) is the most important feature of high technical
77 FR 59933 - Center for Scientific Review; Notice of Closed Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-01
...; Biomedical Computing and Health Informatics Study Section. Date: October 11, 2012. Time: 12:00 p.m. to 4:00 p... funding cycle. (Catalogue of Federal Domestic Assistance Program Nos. 93.306, Comparative Medicine; 93.333...
NASA Technical Reports Server (NTRS)
Treinish, Lloyd A.; Gough, Michael L.; Wildenhain, W. David
1987-01-01
A capability was developed for rapidly producing visual representations of large, complex, multi-dimensional space and Earth sciences data sets by implementing computer graphics modeling techniques on the Massively Parallel Processor (MPP), employing techniques recently developed for typically non-scientific applications. Such capabilities can provide a new and valuable tool for the understanding of complex scientific data, and a new application of parallel computing via the MPP. A prototype system with such capabilities was developed and integrated into the National Space Science Data Center's (NSSDC) Pilot Climate Data System (PCDS) data-independent environment for computer graphics data display to provide easy access to users. While developing these capabilities, several problems had to be solved independently of the actual use of the MPP, all of which are outlined.
The Center for Computational Biology: resources, achievements, and challenges
Dinov, Ivo D; Thompson, Paul M; Woods, Roger P; Van Horn, John D; Shattuck, David W; Parker, D Stott
2011-01-01
The Center for Computational Biology (CCB) is a multidisciplinary program where biomedical scientists, engineers, and clinicians work jointly to combine modern mathematical and computational techniques, to perform phenotypic and genotypic studies of biological structure, function, and physiology in health and disease. CCB has developed a computational framework built around the Manifold Atlas, an integrated biomedical computing environment that enables statistical inference on biological manifolds. These manifolds model biological structures, features, shapes, and flows, and support sophisticated morphometric and statistical analyses. The Manifold Atlas includes tools, workflows, and services for multimodal population-based modeling and analysis of biological manifolds. The broad spectrum of biomedical topics explored by CCB investigators include the study of normal and pathological brain development, maturation and aging, discovery of associations between neuroimaging and genetic biomarkers, and the modeling, analysis, and visualization of biological shape, form, and size. CCB supports a wide range of short-term and long-term collaborations with outside investigators, which drive the center's computational developments and focus the validation and dissemination of CCB resources to new areas and scientific domains. PMID:22081221
The Center for Computational Biology: resources, achievements, and challenges.
Toga, Arthur W; Dinov, Ivo D; Thompson, Paul M; Woods, Roger P; Van Horn, John D; Shattuck, David W; Parker, D Stott
2012-01-01
The Center for Computational Biology (CCB) is a multidisciplinary program where biomedical scientists, engineers, and clinicians work jointly to combine modern mathematical and computational techniques, to perform phenotypic and genotypic studies of biological structure, function, and physiology in health and disease. CCB has developed a computational framework built around the Manifold Atlas, an integrated biomedical computing environment that enables statistical inference on biological manifolds. These manifolds model biological structures, features, shapes, and flows, and support sophisticated morphometric and statistical analyses. The Manifold Atlas includes tools, workflows, and services for multimodal population-based modeling and analysis of biological manifolds. The broad spectrum of biomedical topics explored by CCB investigators include the study of normal and pathological brain development, maturation and aging, discovery of associations between neuroimaging and genetic biomarkers, and the modeling, analysis, and visualization of biological shape, form, and size. CCB supports a wide range of short-term and long-term collaborations with outside investigators, which drive the center's computational developments and focus the validation and dissemination of CCB resources to new areas and scientific domains.
Scientific and technical information output of the Langley Research Center for Calendar year 1981
NASA Technical Reports Server (NTRS)
1982-01-01
Included are citations for formal reports, quick-release technical memorandums, contractor reports, journal articles and periodical literature, meeting/conference papers, and computer programs. Tech briefs, patents, and oral presentations to conferences/workshops are also included.
ODISEES Availability and Feedback Request
Atmospheric Science Data Center
2014-09-06
... As a follow-up Action from the Atmospheric Science Data Center (ASDC) User Working Group (UWG) held on 24-25 June, we are ... for a common language to describe scientific terms so that a computer can scour the internet, automatically discover relevant information ...
76 FR 64359 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-18
... Brain, Neurotransmission and Aging Special Emphasis Panel. Date: November 1, 2011. Time: 8 a.m. to 5 p.m...: Cell, Computational and Molecular Biology. Date: November 9, 2011. Time: 10 a.m. to 6 p.m. Agenda: To...
Reduced-Order Modeling for Optimization and Control of Complex Flows
2010-11-30
Statistics Colloquium, Auburn, AL, (January 2009). 16. University of Pittsburgh, Mathematics Colloquium, Pittsburgh, PA, (February 2009). 17. Goethe ...Center for Scientific Computing, Goethe University Frankfurt am Main, Germany, (June 2009). 18. Air Force Institute of Technology, Wright-Patterson
77 FR 57571 - Center For Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-18
...: Genes, Genomes, and Genetics Integrated Review Group; Genomics, Computational Biology and Technology... Reproductive Sciences Integrated Review Group; Cellular, Molecular and Integrative Reproduction Study Section...: Immunology Integrated Review Group; Cellular and Molecular Immunology--B Study Section. [[Page 57572
Scientific work environments in the next decade
NASA Technical Reports Server (NTRS)
Gomez, Julian E.
1989-01-01
The applications of contemporary computer graphics to scientific visualization is described, with emphasis on the nonintuitive problems. A radically different approach is proposed which centers on the idea of the scientist being in the simulation display space rather than observing it on a screen. Interaction is performed with nonstandard input devices to preserve the feeling of being immersed in the three-dimensional display space. Construction of such a system could begin now with currently available technology.
77 FR 59938 - Center for Scientific Review Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-01
... Panel; Program Project: Drug Addiction. Date: October 30-31, 2012. Time: 8:00 a.m. to 8:30 p.m. Agenda... Biomedical Computational Science and Technology Initiative. Date: October 30, 2012. Time: 3:00 p.m. to 4:00 p...
Scientific programming and high performance computing Research Interests Wind and solar resource assessment, Department of Geography and Environmental Sciences, Denver, CO Research Assistant, National Center for Atmospheric Research (NCAR), Boulder, CO Graduate Instructor and Research Assistant, University of Colorado
Network and computing infrastructure for scientific applications in Georgia
NASA Astrophysics Data System (ADS)
Kvatadze, R.; Modebadze, Z.
2016-09-01
Status of network and computing infrastructure and available services for research and education community of Georgia are presented. Research and Educational Networking Association - GRENA provides the following network services: Internet connectivity, network services, cyber security, technical support, etc. Computing resources used by the research teams are located at GRENA and at major state universities. GE-01-GRENA site is included in European Grid infrastructure. Paper also contains information about programs of Learning Center and research and development projects in which GRENA is participating.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Houston, Johnny L; Geter, Kerry
This report covers the Project's third year of implementation in 2007-2008, the final year, as designated by Elizabeth City State University (ECSU), in cooperation with the National Association of Mathematicians (NAM) Inc., in an effort to promote research and research training programs in computational science and scientific visualization (CSSV). A major goal of the Project was to attract energetic and productive faculty, graduate, and upper-division undergraduate students of diverse ethnicities to a program that investigates science and computational science issues of long-term interest to the Department of Energy (DoE) and the nation. The breadth and depth of computational science and scientific visualization, and the magnitude of resources available, are enormous, permitting a variety of research activities. ECSU's Computational Science-Scientific Visualization Center will serve as a conduit for directing users to these enormous resources.
NASA Astrophysics Data System (ADS)
Samios, Nicholas
2014-09-01
Since its inception in 1997, the RIKEN BNL Research Center (RBRC) has been a major force in the realms of Spin Physics, Relativistic Heavy Ion Physics, large scale Computing Physics and the training of a new generation of extremely talented physicists. This has been accomplished through the recruitment of an outstanding non-permanent staff of Fellows and Research associates in theory and experiment. RBRC is now a mature organization that has reached a steady level in the size of scientific and support staff while at the same time retaining its vibrant youth. A brief history of the scientific accomplishments and contributions of the RBRC physicists will be presented as well as a discussion of the unique RBRC management structure.
Contention Bounds for Combinations of Computation Graphs and Network Topologies
2014-08-08
member of STARnet, a Semiconductor Research Corporation program sponsored by MARCO and DARPA, and ASPIRE Lab industrial sponsors and affiliates Intel...Google, Nokia, NVIDIA, Oracle, MathWorks and Samsung. Also funded by U.S. DOE Office of Science, Office of Advanced Scientific Computing Research...DARPA Award Number HR0011-12-2-0016, the Center for Future Architecture Research, a member of STARnet, a Semiconductor Research Corporation
Ammonia Oxidation by Abstraction of Three Hydrogen Atoms from a Mo–NH 3 Complex
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhattacharya, Papri; Heiden, Zachariah M.; Wiedner, Eric S.
We report ammonia oxidation by homolytic cleavage of all three H atoms from a Mo-15NH3 complex using the 2,4,6-tri-tert-butylphenoxyl radical to afford a Mo-alkylimido (Mo=15NR) complex (R = 2,4,6-tri-t-butylcyclohexa-2,5-dien-1-one). Reductive cleavage of Mo=15NR generates a terminal Mo≡N nitride, and a [Mo-15NH]+ complex is formed by protonation. Computational analysis describes the energetic profile for the stepwise removal of three H atoms from the Mo-15NH3 complex and the formation of Mo=15NR. Acknowledgment. This work was supported as part of the Center for Molecular Electrocatalysis, an Energy Frontier Research Center funded by the U.S. Department of Energy (U.S. DOE), Office of Science, Office of Basic Energy Sciences. EPR and mass spectrometry experiments were performed using EMSL, a national scientific user facility sponsored by the DOE’s Office of Biological and Environmental Research and located at PNNL. The authors thank Dr. Eric D. Walter and Dr. Rosalie Chu for assistance in performing EPR and mass spectroscopy analysis, respectively. Computational resources provided by the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory. Pacific Northwest National Laboratory is operated by Battelle for the U.S. DOE.
Scientific and technical information output of the Langley Research Center for calendar year 1982
NASA Technical Reports Server (NTRS)
1983-01-01
Citations are presented for 1380 items: formal reports; quick-release technical memorandums; contractor reports; journal articles and periodical literature; technical talks and meeting presentations; computer programs; tech briefs; and patents produced during 1982. An author index is provided.
A BIOINFORMATIC STRATEGY TO RAPIDLY CHARACTERIZE CDNA LIBRARIES
A Bioinformatic Strategy to Rapidly Characterize cDNA Libraries
G. Charles Ostermeier1, David J. Dix2 and Stephen A. Krawetz1.
1Departments of Obstetrics and Gynecology, Center for Molecular Medicine and Genetics, & Institute for Scientific Computing, Wayne State Univer...
SPERMATOZOAL RNA PROFILES OF NORMAL FERTILE MEN
What Constitutes the Normal Fertile Male?
G. Charles Ostermeier1, David J. Dix2, David Miller3, Purvesh Khatri4, and Stephen A. Krawetz1.
1Departments of Obstetrics and Gynaecology, Center for Molecular Medicine and Genetics, & Institute for Scientific Computing, Wa...
DOE Office of Scientific and Technical Information (OSTI.GOV)
National Energy Research Supercomputing Center; He, Yun; Kramer, William T.C.
2008-05-07
The newest workhorse of the National Energy Research Scientific Computing Center is a Cray XT4 with 9,736 dual-core nodes. This paper summarizes Franklin user experiences from the friendly early user period through the production period. Selected successful user stories, along with the top issues affecting user experiences, are presented.
LaRC local area networks to support distributed computing
NASA Technical Reports Server (NTRS)
Riddle, E. P.
1984-01-01
The Langley Research Center's (LaRC) Local Area Network (LAN) effort is discussed. LaRC initiated the development of a LAN to support a growing distributed computing environment at the Center. The purpose of the network is to provide an improved capability (over interactive and RJE terminal access) for sharing multivendor computer resources. Specifically, the network will provide a data highway for the transfer of files between mainframe computers, minicomputers, work stations, and personal computers. An important influence on the overall network design was the vital need of LaRC researchers to efficiently utilize the large CDC mainframe computers in the central scientific computing facility. Although there was a steady migration from a centralized to a distributed computing environment at LaRC in recent years, the workload on the central resources increased. Major emphasis in the network design was on communication with the central resources within the distributed environment. The network to be implemented will allow researchers to utilize the central resources, distributed minicomputers, work stations, and personal computers to obtain the proper level of computing power to efficiently perform their jobs.
File-System Workload on a Scientific Multiprocessor
NASA Technical Reports Server (NTRS)
Kotz, David; Nieuwejaar, Nils
1995-01-01
Many scientific applications have intense computational and I/O requirements. Although multiprocessors have permitted astounding increases in computational performance, the formidable I/O needs of these applications cannot be met by current multiprocessors and their I/O subsystems. To prevent I/O subsystems from forever bottlenecking multiprocessors and limiting the range of feasible applications, new I/O subsystems must be designed. The successful design of computer systems (both hardware and software) depends on a thorough understanding of their intended use. A system designer optimizes the policies and mechanisms for the cases expected to be most common in the user's workload. In the case of multiprocessor file systems, however, designers have been forced to build file systems based only on speculation about how they would be used, extrapolating from file-system characterizations of general-purpose workloads on uniprocessor and distributed systems or scientific workloads on vector supercomputers (see sidebar on related work). To help these system designers, in June 1993 we began the Charisma Project, so named because the project sought to characterize I/O in scientific multiprocessor applications from a variety of production parallel computing platforms and sites. The Charisma project is unique in recording individual read and write requests in live, multiprogramming, parallel workloads (rather than from selected or nonparallel applications). In this article, we present the first results from the project: a characterization of the file-system workload on an iPSC/860 multiprocessor running production, parallel scientific applications at NASA's Ames Research Center.
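A workload characterization of this kind boils down to summarizing individual read and write requests from a trace. The sketch below shows the general idea; the whitespace-separated trace format (node, operation, size in bytes, offset) is an assumption made for illustration and is not the Charisma project's actual trace format.

```python
"""Sketch: summarize per-request I/O behavior from a simple trace file."""
import sys
from collections import Counter


def characterize(trace_path: str) -> None:
    sizes = {"read": [], "write": []}
    per_node = Counter()
    with open(trace_path) as trace:
        for line in trace:
            node, op, size, _offset = line.split()   # assumed 4-column format
            sizes[op].append(int(size))
            per_node[node] += 1
    for op, s in sizes.items():
        if s:
            print(f"{op:5s}: {len(s)} requests, median size "
                  f"{sorted(s)[len(s) // 2]} bytes, total {sum(s)} bytes")
    busiest = per_node.most_common(1)
    if busiest:
        print("busiest compute node:", busiest[0])


if __name__ == "__main__":
    characterize(sys.argv[1])
```

Distributions like these (many tiny, strided requests versus a few large sequential ones) are exactly what drives the file-system design decisions the abstract argues for.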
NASA Technical Reports Server (NTRS)
Matthews, Christine G.; Posenau, Mary-Anne; Leonard, Desiree M.; Avis, Elizabeth L.; Debure, Kelly R.; Stacy, Kathryn; Vonofenheim, Bill
1992-01-01
The intent is to provide an introduction to the image processing capabilities available at the Langley Research Center (LaRC) Central Scientific Computing Complex (CSCC). Various image processing software components are described. Information is given concerning the use of these components in the Data Visualization and Animation Laboratory at LaRC.
Scientific Visualization in High Speed Network Environments
NASA Technical Reports Server (NTRS)
Vaziri, Arsi; Kutler, Paul (Technical Monitor)
1997-01-01
In several cases, new visualization techniques have vastly increased the researcher's ability to analyze and comprehend data. Similarly, the role of networks in providing an efficient supercomputing environment has become more critical and continues to grow at a faster rate than the increase in the processing capabilities of supercomputers. A close relationship between scientific visualization and high-speed networks in providing an important link to support efficient supercomputing is identified. The two technologies are driven by the increasing complexities and volume of supercomputer data. The interaction of scientific visualization and high-speed networks in a Computational Fluid Dynamics simulation/visualization environment is given. Current capabilities supported by high speed networks, supercomputers, and high-performance graphics workstations at the Numerical Aerodynamic Simulation Facility (NAS) at NASA Ames Research Center are described. Applied research in providing a supercomputer visualization environment to support future computational requirements is summarized.
NASA Astrophysics Data System (ADS)
Ellins, K. K.; Eriksson, S. C.; Samsel, F.; Lavier, L.
2017-12-01
A new undergraduate, upper level geoscience course was developed and taught by faculty and staff of the UT Austin Jackson School of Geosciences, the Center for Agile Technology, and the Texas Advanced Computing Center. The course examined the role of the visual arts in placing the scientific process and knowledge in a broader context and introduced students to innovations in the visual arts that promote scientific investigation through collaboration between geoscientists and artists. The course addressed (1) the role of the visual arts in teaching geoscience concepts and promoting geoscience learning; (2) the application of innovative visualization and artistic techniques to large volumes of geoscience data to enhance scientific understanding and to move scientific investigation forward; and (3) the illustrative power of art to communicate geoscience to the public. In-class activities and discussions, computer lab instruction on the application of Paraview software, reading assignments, lectures, and group projects with presentations comprised the two-credit, semester-long "special topics" course, which was taken by geoscience, computer science, and engineering students. Assessment of student learning was carried out by the instructors and course evaluation was done by an external evaluator using rubrics, Likert-scale surveys and focus groups. The course achieved its goal of having students learn the concepts and techniques of the visual arts. The final projects demonstrated this, along with the communication of geologic concepts using what they had learned in the course. The basic skill of sketching for learning and using best practices in visual communication were used extensively and, in most cases, very effectively. The use of an advanced visualization tool, Paraview, was received with mixed reviews because of the lack of time to really learn the tool and the fact that it is not a tool used routinely in geoscience. Those senior students with advanced computer skills saw the importance of this tool. Students worked in teams, more or less effectively, and made suggestions for improving future offerings of the course.
Modeling Subsurface Reactive Flows Using Leadership-Class Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mills, Richard T; Hammond, Glenn; Lichtner, Peter
2009-01-01
We describe our experiences running PFLOTRAN - a code for simulation of coupled hydro-thermal-chemical processes in variably saturated, non-isothermal, porous media - on leadership-class supercomputers, including initial experiences running on the petaflop incarnation of Jaguar, the Cray XT5 at the National Center for Computational Sciences at Oak Ridge National Laboratory. PFLOTRAN utilizes fully implicit time-stepping and is built on top of the Portable, Extensible Toolkit for Scientific Computation (PETSc). We discuss some of the hurdles to 'at scale' performance with PFLOTRAN and the progress we have made in overcoming them on leadership-class computer architectures.
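PFLOTRAN's fully implicit time stepping solves a nonlinear system at every step, delegating the Newton iteration to PETSc. The toy sketch below applies the same backward Euler plus Newton structure to a single nonlinear ODE, using plain Python instead of PETSc so it is self-contained; the equation and step size are illustrative stand-ins, and none of this is PFLOTRAN code.

```python
"""Toy fully implicit (backward Euler) time stepping with a Newton solve per step."""


def rhs(u: float) -> float:
    # Illustrative nonlinear right-hand side: du/dt = -u^3 + 1
    return -u**3 + 1.0


def rhs_jac(u: float) -> float:
    return -3.0 * u**2


def backward_euler_step(u_old: float, dt: float, tol: float = 1e-12) -> float:
    """Solve the residual r(u) = u - u_old - dt*rhs(u) = 0 with Newton's method."""
    u = u_old                                   # initial guess: previous state
    for _ in range(50):
        r = u - u_old - dt * rhs(u)
        if abs(r) < tol:
            break
        dr_du = 1.0 - dt * rhs_jac(u)           # Jacobian of the residual
        u -= r / dr_du
    return u


u, dt = 5.0, 0.5                                # a step size this large stays stable implicitly
for step in range(10):
    u = backward_euler_step(u, dt)
    print(f"t = {(step + 1) * dt:4.1f}   u = {u:.6f}")
```

The payoff of the implicit approach, here as in the subsurface code, is that the time step is limited by accuracy rather than by stability, at the cost of a nonlinear solve per step.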
1977-08-24
exceeded a million rubles. POLAND SOME METHODOLOGICAL REMARKS RELATING TO THE FORECASTING MODEL OF COMPUTER DEVELOPMENT Warsaw INFORMATYKA in...PROCESSING SYSTEMS Warsaw INFORMATYKA in Polish Vol 11 No 10, Oct 76 pp 19-20 SEKULA, ZOFIA, Wroclaw [Abstract] The author presents critical remarks...TO ODRA 1300 SYSTEM Warsaw INFORMATYKA in Polish Vol 11 No 9, Sep 76 pp 1-4 BZDULA, CZESLAW, Research and Development Center of MERA-ELWRO Digital
NASA Technical Reports Server (NTRS)
Pratt, Terrence W.
1987-01-01
PISCES 2 is a programming environment and set of extensions to Fortran 77 for parallel programming. It is intended to provide a basis for writing programs for scientific and engineering applications on parallel computers in a way that is relatively independent of the particular details of the underlying computer architecture. This user's manual provides a complete description of the PISCES 2 system as it is currently implemented on the 20 processor Flexible FLEX/32 at NASA Langley Research Center.
Mass storage system experiences and future needs at the National Center for Atmospheric Research
NASA Technical Reports Server (NTRS)
Olear, Bernard T.
1992-01-01
This presentation is designed to relate some of the experiences of the Scientific Computing Division at NCAR dealing with the 'data problem'. A brief history and a development of some basic Mass Storage System (MSS) principles are given. An attempt is made to show how these principles apply to the integration of various components into NCAR's MSS. There is discussion of future MSS needs for future computing environments.
Harnessing the power of emerging petascale platforms
NASA Astrophysics Data System (ADS)
Mellor-Crummey, John
2007-07-01
As part of the US Department of Energy's Scientific Discovery through Advanced Computing (SciDAC-2) program, science teams are tackling problems that require computational simulation and modeling at the petascale. A grand challenge for computer science is to develop software technology that makes it easier to harness the power of these systems to aid scientific discovery. As part of its activities, the SciDAC-2 Center for Scalable Application Development Software (CScADS) is building open source software tools to support efficient scientific computing on the emerging leadership-class platforms. In this paper, we describe two tools for performance analysis and tuning that are being developed as part of CScADS: a tool for analyzing scalability and performance, and a tool for optimizing loop nests for better node performance. We motivate these tools by showing how they apply to S3D, a turbulent combustion code under development at Sandia National Laboratory. For S3D, our node performance analysis tool helped uncover several performance bottlenecks. Using our loop nest optimization tool, we transformed S3D's most costly loop nest to reduce execution time by a factor of 2.94 for a processor working on a 50³ domain.
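The abstract does not reproduce the actual S3D transformation, so the sketch below illustrates one common loop-nest optimization of the kind such a tool applies: cache blocking (tiling) of a transpose-and-accumulate nest. Python is used only for readability; the point is the restructuring of the loops, not absolute speed, and the array sizes are arbitrary.

```python
"""Illustration of cache blocking (loop tiling) of a transpose-accumulate loop nest."""
import numpy as np

N, TILE = 1024, 64


def transpose_add_naive(a: np.ndarray, b: np.ndarray) -> None:
    # Column-wise walks through b give poor spatial locality.
    for i in range(N):
        for j in range(N):
            a[i, j] += b[j, i]


def transpose_add_tiled(a: np.ndarray, b: np.ndarray) -> None:
    # Work on TILE x TILE blocks so the touched parts of a and b stay cache-resident.
    for ii in range(0, N, TILE):
        for jj in range(0, N, TILE):
            for i in range(ii, min(ii + TILE, N)):
                for j in range(jj, min(jj + TILE, N)):
                    a[i, j] += b[j, i]


a = np.zeros((N, N))
b = np.random.rand(N, N)
transpose_add_tiled(a, b)        # same arithmetic as the naive nest, better locality
assert np.allclose(a, b.T)       # result is unchanged by the transformation
```

In a compiled language the tiled variant performs the same floating-point work but far fewer cache misses, which is the kind of node-level gain the CScADS loop-nest tool targets.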
Visions of the Future - the Changing Role of Actors in Data-Intensive Science
NASA Astrophysics Data System (ADS)
Schäfer, L.; Klump, J. F.
2013-12-01
Around the world scientific disciplines are increasingly facing the challenge of a burgeoning volume of research data. This data avalanche consists of a stream of information generated from sensors and scientific instruments, digital recordings, social-science surveys or drawn from the World Wide Web. All areas of the scientific economy are affected by this rapid growth in data, from the logging of digs in Archaeology, telescope data with observations of distant galaxies in Astrophysics or data from polls and surveys in the Social Sciences. The challenge for science is not only to process the data through analysis, reduction and visualization, but also to set up infrastructures for provisioning and storing the data. The rise of new technologies and developments also poses new challenges for the actors in the area of research data infrastructures. Libraries, as one of the actors, enable access to digital media and support the publication of research data and its long-term archiving. Digital media and research data, however, introduce new aspects into the libraries' range of activities. How are we to imagine the library of the future? The library as an interface to the computer centers? Will library and computer center fuse into a new service unit? What role will scientific publishers play in future? Currently the traditional form of publication still carries greater weight - articles for conferences and journals. But will this still be the case in future? New forms of publication are already making their presence felt. The tasks of the computer centers may also change. Yesterday their remit was provisioning of fast hardware, whereas now everything revolves around the topic of data and services. Finally, how about the researchers themselves? Not such a long time ago, Geoscience was not necessarily seen as linked to Computer Science. Nowadays, modern Geoscience relies heavily on IT and its techniques. Thus, to what extent will the profile of the modern geoscientist change? This gives rise to the question of what tools are required to locate and pursue the correct course in a networked world. One tool from the area of innovation management is the scenario technique. This poster will outline visions of the future as possible developments of the scientific world in 2020 (or later). The scenarios presented will show possible developments - both positive and negative. It is then up to the actors themselves to define their own position in this context, to rethink it and consider steps that can achieve a positive development for the future.
Template Interfaces for Agile Parallel Data-Intensive Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ramakrishnan, Lavanya; Gunter, Daniel; Pastorello, Gilberto Z.
Tigres provides a programming library to compose and execute large-scale data-intensive scientific workflows from desktops to supercomputers. DOE User Facilities and large science collaborations are increasingly generating large enough data sets that it is no longer practical to download them to a desktop to operate on them. They are instead stored at centralized compute and storage resources such as high performance computing (HPC) centers. Analysis of this data requires an ability to run on these facilities, but with current technologies, scaling an analysis to an HPC center and to a large data set is difficult even for experts. Tigres is addressing the challenge of enabling collaborative analysis of DOE Science data through a new concept of reusable "templates" that enable scientists to easily compose, run and manage collaborative computational tasks. These templates define common computation patterns used in analyzing a data set.
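The "template" idea (reusable patterns such as running tasks in sequence, or applying one task to many inputs in parallel) can be illustrated with a short sketch. This is a hypothetical illustration only; the real Tigres library defines its own API, which is not reproduced here.

```python
from concurrent.futures import ProcessPoolExecutor

# Hypothetical illustration of workflow "templates": a sequence template
# chains tasks, a parallel template applies one task to many inputs.
# Not the actual Tigres API.

def sequence(data, *tasks):
    """Template: run tasks one after another, feeding each output forward."""
    for task in tasks:
        data = task(data)
    return data

def parallel(items, task, workers=4):
    """Template: apply the same task to many inputs concurrently."""
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(task, items))

def clean(values):
    return [v for v in values if v is not None]

def mean(values):
    return sum(values) / len(values)

def analyze(dataset):
    return sequence(dataset, clean, mean)

if __name__ == "__main__":
    datasets = [[1, 2, None, 4], [5, None, 7], [8, 9]]
    print(parallel(datasets, analyze))
```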
Utilizing the Web in the Classroom: Linking Student Scientists with Professional Data.
ERIC Educational Resources Information Center
Seitz, Kristine; Leake, Devin
1999-01-01
Describes how information gathered from a computer database can be used as a springboard to scientific discovery. Specifies directions for studying the homeobox gene PAX-6 using GenBank, a database maintained by the National Center for Biotechnology Information (NCBI). Contains 16 references. (WRM)
National Energy Research Scientific Computing Center
75 FR 1064 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-08
... 20892, 301-435- 1033, [email protected] . Name of Committee: Genes, Genomes, and Genetics Integrated Review Group; Molecular Genetics B Study Section. Date: February 3-4, 2010. Time: 7 p.m. to 6 p.m. Agenda... Committee: Genes, Genomes, and Genetics Integrated Review Group; Genomics, Computational Biology and...
Kidspiration[R] for Inquiry-Centered Activities
ERIC Educational Resources Information Center
Shaw, Edward L., Jr.; Baggett, Paige V.; Salyer, Barbara
2004-01-01
Computer technology can be integrated into science inquiry activities to increase student motivation and enhance and expand scientific thinking. Fifth-grade students used the visual thinking tools in the Kidspiration[R] software program to generate and represent a web of hypotheses around the question, "What affects the distance a marble rolls?"…
Three Essays on the Economics of Information Systems
ERIC Educational Resources Information Center
Jian, Lian
2010-01-01
My dissertation contains three studies centering on the question: how to motivate people to share high quality information on online information aggregation systems, also known as social computing systems? I take a social scientific approach to "identify" the strategic behavior of individuals in information systems, and "analyze" how non-monetary…
NASA Technical Reports Server (NTRS)
2004-01-01
Since its founding in 1992, Global Science & Technology, Inc. (GST), of Greenbelt, Maryland, has been developing technologies and providing services in support of NASA scientific research. GST specialties include scientific analysis, science data and information systems, data visualization, communications, networking and Web technologies, computer science, and software system engineering. As a longtime contractor to Goddard Space Flight Center's Earth Science Directorate, GST scientific, engineering, and information technology staff have extensive qualifications with the synthesis of satellite, in situ, and Earth science data for weather- and climate-related projects. GST's experience in this arena is end-to-end, from building satellite ground receiving systems and science data systems, to product generation and research and analysis.
Data management and analysis for the Earth System Grid
NASA Astrophysics Data System (ADS)
Williams, D. N.; Ananthakrishnan, R.; Bernholdt, D. E.; Bharathi, S.; Brown, D.; Chen, M.; Chervenak, A. L.; Cinquini, L.; Drach, R.; Foster, I. T.; Fox, P.; Hankin, S.; Henson, V. E.; Jones, P.; Middleton, D. E.; Schwidder, J.; Schweitzer, R.; Schuler, R.; Shoshani, A.; Siebenlist, F.; Sim, A.; Strand, W. G.; Wilhelmi, N.; Su, M.
2008-07-01
The international climate community is expected to generate hundreds of petabytes of simulation data within the next five to seven years. This data must be accessed and analyzed by thousands of analysts worldwide in order to provide accurate and timely estimates of the likely impact of climate change on physical, biological, and human systems. Climate change is thus not only a scientific challenge of the first order but also a major technological challenge. In order to address this technological challenge, the Earth System Grid Center for Enabling Technologies (ESG-CET) has been established within the U.S. Department of Energy's Scientific Discovery through Advanced Computing (SciDAC)-2 program, with support from the offices of Advanced Scientific Computing Research and Biological and Environmental Research. ESG-CET's mission is to provide climate researchers worldwide with access to the data, information, models, analysis tools, and computational capabilities required to make sense of enormous climate simulation datasets. Its specific goals are to (1) make data more useful to climate researchers by developing Grid technology that enhances data usability; (2) meet specific distributed database, data access, and data movement needs of national and international climate projects; (3) provide a universal and secure web-based data access portal for broad multi-model data collections; and (4) provide a wide-range of Grid-enabled climate data analysis tools and diagnostic methods to international climate centers and U.S. government agencies. Building on the successes of the previous Earth System Grid (ESG) project, which has enabled thousands of researchers to access tens of terabytes of data from a small number of ESG sites, ESG-CET is working to integrate a far larger number of distributed data providers, high-bandwidth wide-area networks, and remote computers in a highly collaborative problem-solving environment.
Computer Science Research at Langley
NASA Technical Reports Server (NTRS)
Voigt, S. J. (Editor)
1982-01-01
A workshop was held at Langley Research Center, November 2-5, 1981, to highlight ongoing computer science research at Langley and to identify additional areas of research based upon the computer user requirements. A panel discussion was held in each of nine application areas, and these are summarized in the proceedings. Slides presented by the invited speakers are also included. A survey of scientific, business, data reduction, and microprocessor computer users helped identify areas of focus for the workshop. Several areas of computer science which are of most concern to the Langley computer users were identified during the workshop discussions. These include graphics, distributed processing, programmer support systems and tools, database management, and numerical methods.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gerber, Richard; Allcock, William; Beggio, Chris
2014-10-17
U.S. Department of Energy (DOE) High Performance Computing (HPC) facilities are on the verge of a paradigm shift in the way they deliver systems and services to science and engineering teams. Research projects are producing a wide variety of data at unprecedented scale and level of complexity, with community-specific services that are part of the data collection and analysis workflow. On June 18-19, 2014 representatives from six DOE HPC centers met in Oakland, CA at the DOE High Performance Operational Review (HPCOR) to discuss how they can best provide facilities and services to enable large-scale data-driven scientific discovery at the DOE national laboratories. The report contains findings from that review.
Next Generation Workload Management System For Big Data on Heterogeneous Distributed Computing
NASA Astrophysics Data System (ADS)
Klimentov, A.; Buncic, P.; De, K.; Jha, S.; Maeno, T.; Mount, R.; Nilsson, P.; Oleynik, D.; Panitkin, S.; Petrosyan, A.; Porter, R. J.; Read, K. F.; Vaniachine, A.; Wells, J. C.; Wenaus, T.
2015-05-01
The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe, and were recently credited with the discovery of a Higgs boson. ATLAS and ALICE are the largest collaborations ever assembled in the sciences and are at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, both experiments rely on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses PanDA (Production and Data Analysis) Workload Management System (WMS) for managing the workflow for all data processing on hundreds of data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. The scale is demonstrated by the following numbers: PanDA manages O(10²) sites, O(10⁵) cores, O(10⁸) jobs per year, O(10³) users, and ATLAS data volume is O(10¹⁷) bytes. In 2013 we started an ambitious program to expand PanDA to all available computing resources, including opportunistic use of commercial and academic clouds and Leadership Computing Facilities (LCF). The project titled ‘Next Generation Workload Management and Analysis System for Big Data’ (BigPanDA) is funded by DOE ASCR and HEP. Extending PanDA to clouds and LCF presents new challenges in managing heterogeneity and supporting workflow. The BigPanDA project is underway to set up and tailor PanDA at the Oak Ridge Leadership Computing Facility (OLCF) and at the National Research Center "Kurchatov Institute" together with ALICE distributed computing and ORNL computing professionals. Our approach to integration of HPC platforms at the OLCF and elsewhere is to reuse, as much as possible, existing components of the PanDA system. We will present our current accomplishments with running the PanDA WMS at OLCF and other supercomputers and demonstrate our ability to use PanDA as a portal independent of the computing facilities infrastructure for High Energy and Nuclear Physics as well as other data-intensive science applications.
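PanDA's ability to present physically scattered data centers as a single facility rests on a pilot-style pull model, in which lightweight jobs at each site fetch payloads from a central queue. The sketch below is a generic, hypothetical illustration of that pattern; it is not PanDA's actual components, protocol, or API.

```python
import queue
import threading
import time

# Generic pilot-style pull model: workers ("pilots") at many sites
# repeatedly fetch tasks from one central queue, so the central service
# needs no advance knowledge of site details. Schematic only.

central_queue = queue.Queue()
for job_id in range(20):
    central_queue.put(job_id)

def pilot(site_name):
    while True:
        try:
            job = central_queue.get(timeout=1)
        except queue.Empty:
            return                      # no more work; pilot exits
        time.sleep(0.01)                # stand-in for running the payload
        print(f"{site_name} finished job {job}")
        central_queue.task_done()

threads = [threading.Thread(target=pilot, args=(f"site-{i}",)) for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```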
Discovery of the Kalman filter as a practical tool for aerospace and industry
NASA Technical Reports Server (NTRS)
Mcgee, L. A.; Schmidt, S. F.
1985-01-01
The sequence of events which led the researchers at Ames Research Center to the early discovery of the Kalman filter shortly after its introduction into the literature is recounted. The scientific breakthroughs and reformulations that were necessary to transform Kalman's work into a useful tool for a specific aerospace application are described. The resulting extended Kalman filter, as it is now known, is often still referred to simply as the Kalman filter. As the filter's use gained in popularity in the scientific community, the problems of implementation on small spaceborne and airborne computers led to a square-root formulation of the filter to overcome numerical difficulties associated with computer word length. The work that led to this new formulation is also discussed, including the first airborne computer implementation and flight test. Since then the applications of the extended and square-root formulations of the Kalman filter have grown rapidly throughout the aerospace industry.
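For readers new to the topic, the discrete-time Kalman filter alternates a predict and an update step; the textbook form is shown below. The extended and square-root variants discussed in this report modify these equations to handle nonlinear dynamics and finite word length, respectively; the exact Ames formulations are not reproduced here.

```latex
\begin{aligned}
\text{Predict:}\quad
\hat{x}_{k|k-1} &= F_k\,\hat{x}_{k-1|k-1} + B_k u_k \\
P_{k|k-1}       &= F_k P_{k-1|k-1} F_k^{\mathsf T} + Q_k \\[4pt]
\text{Update:}\quad
K_k             &= P_{k|k-1} H_k^{\mathsf T}\left(H_k P_{k|k-1} H_k^{\mathsf T} + R_k\right)^{-1} \\
\hat{x}_{k|k}   &= \hat{x}_{k|k-1} + K_k\left(z_k - H_k\,\hat{x}_{k|k-1}\right) \\
P_{k|k}         &= \left(I - K_k H_k\right) P_{k|k-1}
\end{aligned}
```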
NASA Technical Reports Server (NTRS)
Bennett, Jerome (Technical Monitor)
2002-01-01
The NASA Center for Computational Sciences (NCCS) is a high-performance scientific computing facility operated, maintained and managed by the Earth and Space Data Computing Division (ESDCD) of NASA Goddard Space Flight Center's (GSFC) Earth Sciences Directorate. The mission of the NCCS is to advance leading-edge science by providing the best people, computers, and data storage systems to NASA's Earth and space sciences programs and those of other U.S. Government agencies, universities, and private institutions. Among the many computationally demanding Earth science research efforts supported by the NCCS in Fiscal Year 1999 (FY99) are the NASA Seasonal-to-Interannual Prediction Project, the NASA Search and Rescue Mission, Earth gravitational model development efforts, the National Weather Service's North American Observing System program, Data Assimilation Office studies, a NASA-sponsored project at the Center for Ocean-Land-Atmosphere Studies, a NASA-sponsored microgravity project conducted by researchers at the City University of New York and the University of Pennsylvania, the completion of a satellite-derived global climate data set, simulations of a new geodynamo model, and studies of Earth's torque. This document presents highlights of these research efforts and an overview of the NCCS, its facilities, and its people.
Proceedings of RIKEN BNL Research Center Workshop
DOE Office of Scientific and Technical Information (OSTI.GOV)
Samios, Nicholas P.
The twelfth evaluation of the RIKEN BNL Research Center (RBRC) took place on November 6 – 8, 2012 at Brookhaven National Laboratory. The members of the Scientific Review Committee (SRC), present at the meeting, were: Prof. Wit Busza, Prof. Miklos Gyulassy, Prof. Kenichi Imai, Prof. Richard Milner (Chair), Prof. Alfred Mueller, Prof. Charles Young Prescott, and Prof. Akira Ukawa. We are pleased that Dr. Hideto En’yo, the Director of the Nishina Institute of RIKEN, Japan, participated in this meeting both in informing the committee of the activities of the RIKEN Nishina Center for Accelerator-Based Science and the role of RBRC, and as an observer of this review. In order to illustrate the breadth and scope of the RBRC program, each member of the Center made a presentation on his/her research efforts. This encompassed three major areas of investigation: theoretical, experimental and computational physics. In addition, the committee met privately with the fellows and postdocs to ascertain their opinions and concerns. Although the main purpose of this review is a report to RIKEN management on the health, scientific value, management and future prospects of the Center, the RBRC management felt that a compendium of the scientific presentations is of sufficient quality and interest that it warrants a wider distribution. Therefore we have made this compilation and present it to the community for its information and enlightenment.
NASA Astrophysics Data System (ADS)
Landgrebe, Anton J.
1987-03-01
An overview of research activities at the United Technologies Research Center (UTRC) in the area of Computational Fluid Dynamics (CFD) is presented. The requirement and use of various levels of computers, including supercomputers, for the CFD activities is described. Examples of CFD directed toward applications to helicopters, turbomachinery, heat exchangers, and the National Aerospace Plane are included. Helicopter rotor codes for the prediction of rotor and fuselage flow fields and airloads were developed with emphasis on rotor wake modeling. Airflow and airload predictions and comparisons with experimental data are presented. Examples are presented of recent parabolized Navier-Stokes and full Navier-Stokes solutions for hypersonic shock-wave/boundary layer interaction, and hydrogen/air supersonic combustion. In addition, other examples of CFD efforts in turbomachinery Navier-Stokes methodology and separated flow modeling are presented. A brief discussion of the 3-tier scientific computing environment is also presented, in which the researcher has access to workstations, mid-size computers, and supercomputers.
NASA Technical Reports Server (NTRS)
Landgrebe, Anton J.
1987-01-01
An overview of research activities at the United Technologies Research Center (UTRC) in the area of Computational Fluid Dynamics (CFD) is presented. The requirement and use of various levels of computers, including supercomputers, for the CFD activities is described. Examples of CFD directed toward applications to helicopters, turbomachinery, heat exchangers, and the National Aerospace Plane are included. Helicopter rotor codes for the prediction of rotor and fuselage flow fields and airloads were developed with emphasis on rotor wake modeling. Airflow and airload predictions and comparisons with experimental data are presented. Examples are presented of recent parabolized Navier-Stokes and full Navier-Stokes solutions for hypersonic shock-wave/boundary layer interaction, and hydrogen/air supersonic combustion. In addition, other examples of CFD efforts in turbomachinery Navier-Stokes methodology and separated flow modeling are presented. A brief discussion of the 3-tier scientific computing environment is also presented, in which the researcher has access to workstations, mid-size computers, and supercomputers.
Enabling campus grids with open science grid technology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weitzel, Derek; Bockelman, Brian; Swanson, David
2011-01-01
The Open Science Grid is a recognized key component of the US national cyber-infrastructure enabling scientific discovery through advanced high throughput computing. The principles and techniques that underlie the Open Science Grid can also be applied to Campus Grids since many of the requirements are the same, even if the implementation technologies differ. We find five requirements for a campus grid: trust relationships, job submission, resource independence, accounting, and data management. The Holland Computing Center's campus grid at the University of Nebraska-Lincoln was designed to fulfill the requirements of a campus grid. A bridging daemon was designed to bring non-Condor clusters into a grid managed by Condor. Condor features which make it possible to bridge Condor sites into a multi-campus grid have been exploited at the Holland Computing Center as well.
The NOAA Scientific Computing System Data Assembly Center
NASA Astrophysics Data System (ADS)
Suchdeve, K. L.; Smith, S. R.; Van Waes, M.
2016-02-01
The Scientific Computing System (SCS) Data Assembly Center (DAC) was established in 2014 by the Office of Marine and Aviation Operations (OMAO) to evaluate the quality of full-resolution (sampling on the order of once per second) data collected by SCS onboard NOAA-operated research vessels. The SCS data are nominally transferred from the vessel to the National Centers for Environmental Information (NCEI) soon after the completion of each cruise and are complemented with detailed cruise metadata from OMAO. The authors will describe tools developed by the SCS DAC to monitor the timeliness of SCS data delivery to NCEI and the completeness of the SCS packages received by NCEI (ensuring the package contains data for all enabled sensors on a given cruise). Feedback to OMAO and NCEI regarding the timeliness and data completeness will be outlined along with challenges encountered by the DAC as it works to develop automated quality assessment of the SCS data packages. Data collected by SCS on NOAA vessels represent a significant investment by the American taxpayer. The mission of the SCS DAC is to ensure that archived SCS data at NCEI are a complete record of the observations made on NOAA research cruises. Archival of complete SCS datasets at NCEI ensures these data are preserved for future generations of scientists, policy makers, and the public.
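The completeness check described above (does the package contain data for every enabled sensor on the cruise?) can be sketched as follows. The directory layout, file naming, and sensor names are hypothetical, not the actual SCS or NCEI conventions.

```python
from pathlib import Path

# Hypothetical package-completeness check: compare the sensors listed as
# enabled in the cruise metadata against the data files actually present
# in the delivered package. Paths and naming conventions are assumed.

def check_package(package_dir, enabled_sensors):
    present = {p.stem.lower() for p in Path(package_dir).glob("*.csv")}
    return [s for s in enabled_sensors if s.lower() not in present]

if __name__ == "__main__":
    enabled = ["TSG", "MET", "GPS", "EK60"]          # assumed sensor list
    missing = check_package("cruise_package", enabled)
    if missing:
        print("Incomplete package, missing sensors:", ", ".join(missing))
    else:
        print("Package complete.")
```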
Using Generic and Context-Specific Scaffolding to Support Authentic Science Inquiry
ERIC Educational Resources Information Center
Belland, Brian R.; Gu, Jiangyue; Armbrust, Sara; Cook, Brant
2013-01-01
In this conceptual paper, we propose an heuristic to balance context-specific and generic scaffolding, as well as computer-based and teacher scaffolding, during instruction centered on authentic, scientific problems. This paper is novel in that many researchers ask a dichotomous question of whether generic or context-specific scaffolding is best,…
76 FR 1442 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-10
... Group; Macromolecular Structure and Function D Study Section. Date: February 8-9, 2011. Time: 8 a.m. to...; Biomedical Computing and Health Informatics Study Section. Date: February 8, 2011. Time: 8 a.m. to 5 p.m... Skin Sciences Integrated Review Group; Skeletal Muscle and Exercise Physiology Study Section. Date...
Using Scenarios to Design Complex Technology-Enhanced Learning Environments
ERIC Educational Resources Information Center
de Jong, Ton; Weinberger, Armin; Girault, Isabelle; Kluge, Anders; Lazonder, Ard W.; Pedaste, Margus; Ludvigsen, Sten; Ney, Muriel; Wasson, Barbara; Wichmann, Astrid; Geraedts, Caspar; Giemza, Adam; Hovardas, Tasos; Julien, Rachel; van Joolingen, Wouter R.; Lejeune, Anne; Manoli, Constantinos C.; Matteman, Yuri; Sarapuu, Tago; Verkade, Alex; Vold, Vibeke; Zacharia, Zacharias C.
2012-01-01
Science Created by You (SCY) learning environments are computer-based environments in which students learn about science topics in the context of addressing a socio-scientific problem. Along their way to a solution for this problem students produce many types of intermediate products or learning objects. SCY learning environments center the entire…
NASA Astrophysics Data System (ADS)
Mills, R. T.; Rupp, K.; Smith, B. F.; Brown, J.; Knepley, M.; Zhang, H.; Adams, M.; Hammond, G. E.
2017-12-01
As the high-performance computing community pushes towards the exascale horizon, power and heat considerations have driven the increasing importance and prevalence of fine-grained parallelism in new computer architectures. High-performance computing centers have become increasingly reliant on GPGPU accelerators and "manycore" processors such as the Intel Xeon Phi line, and 512-bit SIMD registers have even been introduced in the latest generation of Intel's mainstream Xeon server processors. The high degree of fine-grained parallelism and more complicated memory hierarchy considerations of such "manycore" processors present several challenges to existing scientific software. Here, we consider how the massively parallel, open-source hydrologic flow and reactive transport code PFLOTRAN - and the underlying Portable, Extensible Toolkit for Scientific Computation (PETSc) library on which it is built - can best take advantage of such architectures. We will discuss some key features of these novel architectures and our code optimizations and algorithmic developments targeted at them, and present experiences drawn from working with a wide range of PFLOTRAN benchmark problems on these architectures.
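The fine-grained parallelism discussed here ultimately has to be expressed as long, vectorizable inner loops. The sketch below is illustrative only (it is neither PFLOTRAN nor PETSc code): it contrasts an element-at-a-time loop with an array-at-a-time formulation of the same update, the style that compilers and numerical libraries can map onto wide SIMD units and manycore hardware.

```python
import numpy as np

# Illustrative only: the same first-order decay update written as a
# scalar loop and as a whole-array (vectorizable) operation.

def decay_scalar(c, k, dt):
    out = np.empty_like(c)
    for i in range(c.size):              # one element per iteration
        out[i] = c[i] * (1.0 - k * dt)
    return out

def decay_vectorized(c, k, dt):
    return c * (1.0 - k * dt)            # whole-array operation

c = np.random.rand(1_000_000)
assert np.allclose(decay_scalar(c, 1e-3, 0.5), decay_vectorized(c, 1e-3, 0.5))
```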
On-demand provisioning of HEP compute resources on cloud sites and shared HPC centers
NASA Astrophysics Data System (ADS)
Erli, G.; Fischer, F.; Fleig, G.; Giffels, M.; Hauth, T.; Quast, G.; Schnepf, M.; Heese, J.; Leppert, K.; Arnaez de Pedro, J.; Sträter, R.
2017-10-01
This contribution reports on solutions, experiences and recent developments with the dynamic, on-demand provisioning of remote computing resources for analysis and simulation workflows. Local resources of a physics institute are extended by private and commercial cloud sites, ranging from the inclusion of desktop clusters over institute clusters to HPC centers. Rather than relying on dedicated HEP computing centers, it is nowadays more reasonable and flexible to utilize remote computing capacity via virtualization techniques or container concepts. We report on recent experience from incorporating a remote HPC center (NEMO Cluster, Freiburg University) and resources dynamically requested from the commercial provider 1&1 Internet SE into our institute’s computing infrastructure. The Freiburg HPC resources are requested via the standard batch system, allowing HPC and HEP applications to be executed simultaneously, such that regular batch jobs run side by side with virtual machines managed via OpenStack [1]. For the inclusion of the 1&1 commercial resources, a Python API and SDK as well as the possibility to upload images were available. Large scale tests prove the capability to serve the scientific use case in the European 1&1 datacenters. The described environment at the Institute of Experimental Nuclear Physics (IEKP) at KIT serves the needs of researchers participating in the CMS and Belle II experiments. In total, resources exceeding half a million CPU hours have been provided by remote sites.
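The elastic pattern described above (extend a local batch system with remote resources when demand rises, release them when it falls) can be sketched as a simple monitoring loop. The thresholds and provisioning calls below are hypothetical placeholders, not the actual OpenStack, batch-system, or 1&1 interfaces used at IEKP.

```python
import random
import time

# Schematic scale-up/scale-down loop for on-demand provisioning: when too
# many jobs are waiting, request extra virtual worker nodes; when the
# queue drains, release them. All calls below are placeholders.

def pending_jobs():
    return random.randint(0, 200)        # stand-in for a batch-system query

def boot_worker():
    print("requesting one additional virtual worker node")

def drain_worker():
    print("releasing one idle virtual worker node")

active_workers = 0
for _ in range(5):                        # a few monitoring cycles
    backlog = pending_jobs()
    if backlog > 100 and active_workers < 10:
        boot_worker()
        active_workers += 1
    elif backlog < 10 and active_workers > 0:
        drain_worker()
        active_workers -= 1
    time.sleep(0.1)
```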
NASA Technical Reports Server (NTRS)
Oliger, Joseph
1992-01-01
The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under a cooperative agreement with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. A flexible scientific staff is provided through a university faculty visitor program, a postdoctoral program, and a student visitor program. Not only does this provide appropriate expertise but it also introduces scientists outside of NASA to NASA problems. A small group of core RIACS staff provides continuity and interacts with an ARC technical monitor and scientific advisory group to determine the RIACS mission. RIACS activities are reviewed and monitored by a USRA advisory council and ARC technical monitor. Research at RIACS is currently being done in the following areas: (1) parallel computing; (2) advanced methods for scientific computing; (3) learning systems; (4) high performance networks and technology; and (5) graphics, visualization, and virtual environments. In the past year, parallel compiler techniques and adaptive numerical methods for flows in complicated geometries were identified as important problems to investigate for ARC's involvement in the Computational Grand Challenges of the next decade. We concluded a summer student visitor program during these six months. We had six visiting graduate students who worked on projects over the summer and presented seminars on their work at the conclusion of their visits. RIACS technical reports are usually preprints of manuscripts that have been submitted to research journals or conference proceedings. A list of these reports for the period July 1, 1992 through December 31, 1992 is provided.
77 FR 8266 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-14
..., (Telephone Conference Call). Contact Person: Michael M. Sveda, Ph.D., Scientific Review Officer, Center for..., MD, Ph.D., Scientific Review Officer, Center for Scientific Review, National Institutes of Health...). Contact Person: Richard Panniers, Ph.D., Scientific Review Officer, Center for Scientific Review, National...
Software for Planning Scientific Activities on Mars
NASA Technical Reports Server (NTRS)
Ai-Chang, Mitchell; Bresina, John; Jonsson, Ari; Hsu, Jennifer; Kanefsky, Bob; Morris, Paul; Rajan, Kanna; Yglesias, Jeffrey; Charest, Len; Maldague, Pierre
2003-01-01
Mixed-Initiative Activity Plan Generator (MAPGEN) is a ground-based computer program for planning and scheduling the scientific activities of instrumented exploratory robotic vehicles, within the limitations of available resources onboard the vehicle. MAPGEN is a combination of two prior software systems: (1) an activity-planning program, APGEN, developed at NASA's Jet Propulsion Laboratory and (2) the Europa planner/scheduler from NASA Ames Research Center. MAPGEN performs all of the following functions: Automatic generation of plans and schedules for scientific and engineering activities; Testing of hypotheses (or what-if analyses of various scenarios); Editing of plans; Computation and analysis of resources; and Enforcement and maintenance of constraints, including resolution of temporal and resource conflicts among planned activities. MAPGEN can be used in either of two modes: one in which the planner/scheduler is turned off and only the basic APGEN functionality is utilized, or one in which both component programs are used to obtain the full planning, scheduling, and constraint-maintenance functionality.
The National Center for Biomedical Ontology
Noy, Natalya F; Shah, Nigam H; Whetzel, Patricia L; Chute, Christopher G; Story, Margaret-Anne; Smith, Barry
2011-01-01
The National Center for Biomedical Ontology is now in its seventh year. The goals of this National Center for Biomedical Computing are to: create and maintain a repository of biomedical ontologies and terminologies; build tools and web services to enable the use of ontologies and terminologies in clinical and translational research; educate their trainees and the scientific community broadly about biomedical ontology and ontology-based technology and best practices; and collaborate with a variety of groups who develop and use ontologies and terminologies in biomedicine. The centerpiece of the National Center for Biomedical Ontology is a web-based resource known as BioPortal. BioPortal makes available for research in computationally useful forms more than 270 of the world's biomedical ontologies and terminologies, and supports a wide range of web services that enable investigators to use the ontologies to annotate and retrieve data, to generate value sets and special-purpose lexicons, and to perform advanced analytics on a wide range of biomedical data. PMID:22081220
Uses of the Drupal CMS Collaborative Framework in the Woods Hole Scientific Community (Invited)
NASA Astrophysics Data System (ADS)
Maffei, A. R.; Chandler, C. L.; Work, T. T.; Shorthouse, D.; Furfey, J.; Miller, H.
2010-12-01
Organizations that comprise the Woods Hole scientific community (Woods Hole Oceanographic Institution, Marine Biological Laboratory, USGS Woods Hole Coastal and Marine Science Center, Woods Hole Research Center, NOAA NMFS Northeast Fisheries Science Center, SEA Education Association) have a long history of collaborative activity regarding computing, computer network and information technologies that support common, inter-disciplinary science needs. Over the past several years there has been growing interest in the use of the Drupal Content Management System (CMS) playing a variety of roles in support of research projects resident at several of these organizations. Many of these projects are part of science programs that are national and international in scope. Here we survey the current uses of Drupal within the Woods Hole scientific community and examine reasons it has been adopted. The promise of emerging semantic features in the Drupal framework is examined and projections of how pre-existing Drupal-based websites might benefit are made. Closer examination of Drupal software design exposes it as more than simply a content management system. The flexibility of its architecture; the power of its taxonomy module; the care taken in nurturing the open-source developer community that surrounds it (including organized and often well-attended code sprints); the ability to bind emerging software technologies as Drupal modules; the careful selection process used in adopting core functionality; multi-site hosting and cross-site deployment of updates and a recent trend towards development of use-case inspired Drupal distributions casts Drupal as a general-purpose application deployment framework. Recent work in the semantic arena casts Drupal as an emerging RDF framework as well. Examples of roles played by Drupal-based websites within the Woods Hole scientific community that will be discussed include: science data metadata database, organization main website, biological taxonomy development, bibliographic database, physical media data archive inventory manager, disaster-response website development framework, science project task management, science conference planning, and spreadsheet-to-database converter.
75 FR 35075 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-21
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review... unwarranted invasion of personal privacy. Name of Committee: Center for Scientific Review Special Emphasis..., PhD, Scientific Review Officer, Center for Scientific Review, National Institutes of Health, 6701...
78 FR 9064 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-07
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review... unwarranted invasion of personal privacy. Name of Committee: Center for Scientific Review Special Emphasis..., Ph.D., Scientific Review Officer, Center for Scientific Review, National Institutes of Health, 6701...
76 FR 35454 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-17
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel, PAR10-074...D, Scientific Review Officer, Center for Scientific Review, National Institutes of Health, 6701...
77 FR 33476 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-06
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel; Small Business... Person: John Firrell, Ph.D., Scientific Review Officer, Center for Scientific Review, National Institutes...
78 FR 66026 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-04
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel; Member Conflict...: Julius Cinque, MS, Scientific Review Officer, Center for Scientific Review, National Institutes of Health...
NASA Astrophysics Data System (ADS)
Strayer, Michael
2007-09-01
Good morning. Welcome to Boston, the home of the Red Sox, Celtics and Bruins, baked beans, tea parties, Robert Parker, and SciDAC 2007. A year ago I stood before you to share the legacy of the first SciDAC program and identify the challenges that we must address on the road to petascale computing—a road E. E. Cummings described as `. . . never traveled, gladly beyond any experience.' Today, I want to explore the preparations for the rapidly approaching extreme scale (X-scale) generation. These preparations are the first step propelling us along the road of burgeoning scientific discovery enabled by the application of X-scale computing. We look to petascale computing and beyond to open up a world of discovery that cuts across scientific fields and leads us to a greater understanding of not only our world, but our universe. As part of the President's American Competitiveness Initiative, the ASCR Office has been preparing a ten-year vision for computing. As part of this planning, LBNL, together with ORNL and ANL, hosted three town hall meetings on Simulation and Modeling at the Exascale for Energy, Ecological Sustainability and Global Security (E3). The proposed E3 initiative is organized around four programmatic themes: Engaging our top scientists, engineers, computer scientists and applied mathematicians; investing in pioneering large-scale science; developing scalable analysis algorithms and storage architectures to accelerate discovery; and accelerating the build-out and future development of the DOE open computing facilities. It is clear that we have only just started down the path to extreme scale computing. Plan to attend Thursday's session on the out-briefing and discussion of these meetings. The road to the petascale has been at best rocky. In FY07, the continuing resolution provided 12% less money for Advanced Scientific Computing than either the President, the Senate, or the House had proposed. As a consequence, many of you had to absorb a no cost extension for your SciDAC work. I am pleased that the President's FY08 budget restores the funding for SciDAC. Quoting from the Advanced Scientific Computing Research description in the House Energy and Water Development Appropriations Bill for FY08, "Perhaps no other area of research at the Department is so critical to sustaining U.S. leadership in science and technology, revolutionizing the way science is done and improving research productivity." As a society we need to revolutionize our approaches to energy, environmental and global security challenges. As we go forward along the road to the X-scale generation, the use of computation will continue to be a critical tool along with theory and experiment in understanding the behavior of the fundamental components of nature as well as for fundamental discovery and exploration of the behavior of complex systems. The foundation to overcome these societal challenges will build from the experiences and knowledge gained as you, members of our SciDAC research teams, work together to attack problems at the tera- and petascale. If SciDAC is viewed as an experiment for revolutionizing scientific methodology, then a strategic goal of the ASCR program must be to broaden the intellectual base prepared to address the challenges of the new X-scale generation of computing. We must focus our computational science experiences gained over the past five years on the opportunities introduced with extreme scale computing. Our facilities are on a path to provide the resources needed to undertake the first part of our journey.
Using the newly upgraded 119 teraflop Cray XT system at the Leadership Computing Facility, SciDAC research teams have in three days performed a 100-year study of the time evolution of the atmospheric CO2 concentration originating from the land surface. The simulation of the El Nino/Southern Oscillation, which was part of this study, has been characterized as `the most impressive new result in ten years'. SciDAC researchers also gained new insight into the behavior of superheated ionic gas in the ITER reactor as a result of an AORSA run on 22,500 processors that achieved over 87 trillion calculations per second (87 teraflops), 74% of the system's theoretical peak. Tomorrow, Argonne and IBM will announce that the first IBM Blue Gene/P, a 100 teraflop system, will be shipped to the Argonne Leadership Computing Facility later this fiscal year. By the end of FY2007 ASCR high performance and leadership computing resources will include the 114 teraflop IBM Blue Gene/P; a 102 teraflop Cray XT4 at NERSC and a 119 teraflop Cray XT system at Oak Ridge. Before ringing in the New Year, Oak Ridge will upgrade to 250 teraflops with the replacement of the dual-core processors with quad-core processors and Argonne will upgrade to between 250 and 500 teraflops, and next year, a petascale Cray Baker system is scheduled for delivery at Oak Ridge. The multidisciplinary teams in our SciDAC Centers for Enabling Technologies and our SciDAC Institutes must continue to work with our Scientific Application teams to overcome the barriers that prevent effective use of these new systems. These challenges include: the need for new algorithms as well as operating system and runtime software and tools which scale to parallel systems composed of hundreds of thousands of processors; program development environments and tools which scale effectively and provide ease of use for developers and scientific end users; and visualization and data management systems that support moving, storing, analyzing, manipulating and visualizing multi-petabytes of scientific data and objects. The SciDAC Centers, located primarily at our DOE national laboratories, will take the lead in ensuring that critical computer science and applied mathematics issues are addressed in a timely and comprehensive fashion and in addressing issues associated with the research software lifecycle. In contrast, the SciDAC Institutes, which are university-led centers of excellence, will have more flexibility to pursue new research topics through a range of research collaborations. The Institutes will also work to broaden the intellectual and researcher base—conducting short courses and summer schools to take advantage of new high performance computing capabilities. The SciDAC Outreach Center at Lawrence Berkeley National Laboratory complements the outreach efforts of the SciDAC Institutes. The Outreach Center is our clearinghouse for SciDAC activities and resources and will communicate with the high performance computing community in part to understand their needs for workshops, summer schools and institutes. SciDAC is not ASCR's only effort to broaden the computational science community needed to meet the challenges of the new X-scale generation. I hope that you were able to attend the Computational Science Graduate Fellowship poster session last night. ASCR developed the fellowship in 1991 to meet the nation's growing need for scientists and technology professionals with advanced computer skills. CSGF, now jointly funded by ASCR and NNSA, is more than a traditional academic fellowship.
It has provided more than 200 of the best and brightest graduate students with guidance, support and community in preparing them as computational scientists. Today CSGF alumni are bringing their diverse top-level skills and knowledge to research teams at DOE laboratories and in industry at companies such as Procter & Gamble, Lockheed Martin and Intel. At universities they are working to train the next generation of computational scientists. To build on this success, we intend to develop a wholly new Early Career Principal Investigator (ECPI) program. Our objective is to stimulate academic research in scientific areas within ASCR's purview, especially among faculty in the early stages of their academic careers. Last February, we lost Ken Kennedy, one of the leading lights of our community. As we move forward into the extreme computing generation, his vision and insight will be greatly missed. In memory of Ken Kennedy, we shall designate the ECPI grants to beginning faculty in Computer Science as the Ken Kennedy Fellowship. Watch the ASCR website for more information about ECPI and other early career programs in the computational sciences. We look to you, our scientists, researchers, and visionaries, to take X-scale computing and use it to explode scientific discovery in your fields. We at SciDAC will work to ensure that this tool is the sharpest and most precise and efficient instrument to carve away the unknown and reveal the most exciting secrets and stimulating scientific discoveries of our time. The partnership between research and computing is the marriage that will spur greater discovery, and as Spenser said to Susan in Robert Parker's novel, `Sudden Mischief', `We stick together long enough, and we may get as smart as hell'. Michael Strayer
Johnson, R Jeremy
2014-01-01
HIV protease has served as a model protein for understanding protein structure, enzyme kinetics, structure-based drug design, and protein evolution. Inhibitors of HIV protease are also an essential part of effective HIV/AIDS treatment and have provided great societal benefits. The broad applications for HIV protease and its inhibitors make it a perfect framework for integrating foundational topics in biochemistry around a big picture scientific and societal issue. Herein, I describe a series of classroom exercises that integrate foundational topics in biochemistry around the structure, biology, and therapeutic inhibition of HIV protease. These exercises center on foundational topics in biochemistry including thermodynamics, acid/base properties, protein structure, ligand binding, and enzymatic catalysis. The exercises also incorporate regular student practice of scientific skills including analysis of primary literature, evaluation of scientific data, and presentation of technical scientific arguments. Through the exercises, students also gain experience accessing computational biochemical resources such as the protein data bank, Proteopedia, and protein visualization software. As these HIV centered exercises cover foundational topics common to all first semester biochemistry courses, these exercises should appeal to a broad audience of undergraduate students and should be readily integrated into a variety of teaching styles and classroom sizes. © 2014 The International Union of Biochemistry and Molecular Biology.
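The thermodynamics and ligand-binding threads of these exercises are commonly connected through the relationship between an inhibitor's inhibition constant and its binding free energy. The worked example below is illustrative only and is not taken from the paper; the numbers are assumed.

```latex
\Delta G^{\circ}_{\mathrm{bind}} = RT \ln K_i
\qquad\text{e.g.}\quad
K_i = 1\ \mathrm{nM},\ T = 298\ \mathrm{K}
\;\Rightarrow\;
\Delta G^{\circ}_{\mathrm{bind}} \approx (8.314\ \mathrm{J\,mol^{-1}\,K^{-1}})(298\ \mathrm{K})\ln(10^{-9})
\approx -51\ \mathrm{kJ\,mol^{-1}}
```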
Kepler Science Operations Center Architecture
NASA Technical Reports Server (NTRS)
Middour, Christopher; Klaus, Todd; Jenkins, Jon; Pletcher, David; Cote, Miles; Chandrasekaran, Hema; Wohler, Bill; Girouard, Forrest; Gunter, Jay P.; Uddin, Kamal;
2010-01-01
We give an overview of the operational concepts and architecture of the Kepler Science Data Pipeline. Designed, developed, operated, and maintained by the Science Operations Center (SOC) at NASA Ames Research Center, the Kepler Science Data Pipeline is a central element of the Kepler Ground Data System. The SOC charter is to analyze stellar photometric data from the Kepler spacecraft and report results to the Kepler Science Office for further analysis. We describe how this is accomplished via the Kepler Science Data Pipeline, including the hardware infrastructure, scientific algorithms, and operational procedures. The SOC consists of an office at Ames Research Center, software development and operations departments, and a data center that hosts the computers required to perform data analysis. We discuss the high-performance, parallel computing software modules of the Kepler Science Data Pipeline that perform transit photometry, pixel-level calibration, systematic error-correction, attitude determination, stellar target management, and instrument characterization. We explain how data processing environments are divided to support operational processing and test needs. We explain the operational timelines for data processing and the data constructs that flow into the Kepler Science Data Pipeline.
NASA Astrophysics Data System (ADS)
Michaelis, A.; Wang, W.; Melton, F. S.; Votava, P.; Milesi, C.; Hashimoto, H.; Nemani, R. R.; Hiatt, S. H.
2009-12-01
As the length and diversity of the global earth observation data records grow, modeling and analyses of biospheric conditions increasingly require multiple terabytes of data from a diversity of models and sensors. With network bandwidth beginning to flatten, transmission of these data from centralized data archives presents an increasing challenge, and costs associated with local storage and management of data and compute resources are often significant for individual research and application development efforts. Sharing community valued intermediary data sets, results and codes from individual efforts with others that are not in direct funded collaboration can also be a challenge with respect to time, cost and expertise. We propose a modeling, data and knowledge center that houses NASA satellite data, climate data and ancillary data where a focused community may come together to share modeling and analysis codes, scientific results, knowledge and expertise on a centralized platform, named Ecosystem Modeling Center (EMC). With the recent development of new technologies for secure hardware virtualization, an opportunity exists to create specific modeling, analysis and compute environments that are customizable, “archivable” and transferable. Allowing users to instantiate such environments on large compute infrastructures that are directly connected to large data archives may significantly reduce costs and time associated with scientific efforts by alleviating users from redundantly retrieving and integrating data sets and building modeling analysis codes. The EMC platform also provides the possibility for users to receive indirect assistance from experts through prefabricated compute environments, potentially reducing study “ramp up” times.
The Practical Obstacles of Data Transfer: Why researchers still love scp
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nam, Hai Ah; Hill, Jason J; Parete-Koon, Suzanne T
The importance of computing facilities is heralded every six months with the announcement of the new Top500 list, showcasing the world's fastest supercomputers. Unfortunately, with great computing capability does not come great long-term data storage capacity, which often means users must move their data to their local site archive, to remote sites where they may be doing future computation or analysis, or back to their home institution, else face the dreaded data purge that most HPC centers employ to keep utilization of large parallel filesystems low to manage performance and capacity. At HPC centers, data transfer is crucial to the scientific workflow and will increase in importance as computing systems grow in size. The Energy Sciences Network (ESnet) recently launched its fifth generation network, a 100 Gbps high-performance, unclassified national network connecting more than 40 DOE research sites to support scientific research and collaboration. Despite the tenfold increase in bandwidth to DOE research sites amenable to multiple data transfer streams and high throughput, in practice, researchers often under-utilize the network and resort to painfully slow single stream transfer methods such as scp to avoid the complexity of using multiple stream tools such as GridFTP and bbcp, and contend with frustration from the lack of consistency of available tools between sites. In this study we survey and assess the data transfer methods provided at several DOE supported computing facilities, including both leadership-computing facilities, connected through ESnet. We present observed transfer rates, suggested optimizations, and discuss the obstacles the tools must overcome to receive widespread adoption over scp.
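A rough back-of-the-envelope calculation shows why single-stream transfers frustrate users of a 100 Gbps network. The rates below are assumed, illustrative values, not measurements from the study.

```python
# Illustrative arithmetic only: assumed rates, not measurements.

data_tb = 10                              # size of a data set in terabytes
single_stream_mb_s = 50                   # assumed single-stream scp-style rate
aggregate_mb_s = 4000                     # assumed multi-stream rate over a fast WAN

def hours(tb, mb_per_s):
    return tb * 1e6 / mb_per_s / 3600     # 1 TB = 1e6 MB (decimal units)

print(f"single stream: {hours(data_tb, single_stream_mb_s):6.1f} h")   # ~55.6 h
print(f"multi stream : {hours(data_tb, aggregate_mb_s):6.1f} h")       # ~0.7 h
```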
76 FR 32221 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-03
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel, Fellowship: Cell... Person: Ross D Shonat, PH.D, Scientific Review Officer, Center for Scientific Review, National Institutes...
78 FR 13363 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-27
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel; Small Business...). Contact Person: Bukhtiar H Shah, DVM, Ph.D., Scientific Review Officer, Center for Scientific Review...
77 FR 71429 - Center For Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-30
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center For Scientific Review... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel Cancer Therapy... Conference Call). Contact Person: Lilia Topol, Ph.D., Scientific Review Officer, Center for Scientific Review...
Report on Computing and Networking in the Space Science Laboratory by the SSL Computer Committee
NASA Technical Reports Server (NTRS)
Gallagher, D. L. (Editor)
1993-01-01
The Space Science Laboratory (SSL) at Marshall Space Flight Center is a multiprogram facility. Scientific research is conducted in four discipline areas: earth science and applications, solar-terrestrial physics, astrophysics, and microgravity science and applications. Representatives from each of these discipline areas participate in a Laboratory computer requirements committee, which developed this document. The purpose is to establish and discuss Laboratory objectives for computing and networking in support of science. The purpose is also to lay the foundation for a collective, multiprogram approach to providing these services. Special recognition is given to the importance of the national and international efforts of our research communities toward the development of interoperable, network-based computer applications.
Integrating Grid Services into the Cray XT4 Environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
NERSC; Cholia, Shreyas; Lin, Hwa-Chun Wendy
2009-05-01
The 38,640-core Cray XT4 "Franklin" system at the National Energy Research Scientific Computing Center (NERSC) is a massively parallel resource available to Department of Energy researchers that also provides on-demand grid computing to the Open Science Grid. The integration of grid services on Franklin presented various challenges, including fundamental differences between the interactive and compute nodes, a stripped down compute-node operating system without dynamic library support, a shared-root environment and idiosyncratic application launching. In our work, we describe how we resolved these challenges on a running, general-purpose production system to provide on-demand compute, storage, accounting and monitoring services through generic grid interfaces that mask the underlying system-specific details for the end user.
75 FR 32956 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-10
... Center, 5701 Marinelli Road, Bethesda, MD 20852. Contact Person: Joanna M. Pyper, PhD, Scientific Review...: Yi-Hsin Liu, PhD, Scientific Review Officer, Center for Scientific Review, National Institutes of... . Name of Committee: Center for Scientific Review Special Emphasis Panel, RFA-DA-10-017: Seek, Test, and...
75 FR 21641 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-26
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel; OBT IRG Member... Call). Contact Person: Angela Y. Ng, MBA, PhD, Scientific Review Officer, Center for Scientific Review...
NASA Astrophysics Data System (ADS)
Corrie, Brian; Zimmerman, Todd
Scientific research is fundamentally collaborative in nature, and many of today's complex scientific problems require domain expertise in a wide range of disciplines. In order to create research groups that can effectively explore such problems, research collaborations are often formed that involve colleagues at many institutions, sometimes spanning a country and often spanning the world. An increasingly common manifestation of such a collaboration is the collaboratory (Bos et al., 2007), a “…center without walls in which the nation's researchers can perform research without regard to geographical location — interacting with colleagues, accessing instrumentation, sharing data and computational resources, and accessing information from digital libraries.” In order to bring groups together on such a scale, a wide range of components need to be available to researchers, including distributed computer systems, remote instrumentation, data storage, collaboration tools, and the financial and human resources to operate and run such a system (National Research Council, 1993). Media Spaces, as both a technology and a social facilitator, have the potential to meet many of these needs. In this chapter, we focus on the use of scientific media spaces (SMS) as a tool for supporting collaboration in scientific research. In particular, we discuss the design, deployment, and use of a set of SMS environments deployed by WestGrid and one of its collaborating organizations, the Centre for Interdisciplinary Research in the Mathematical and Computational Sciences (IRMACS) over a 5-year period.
Tropical Ocean and Global Atmosphere (TOGA) heat exchange project: A summary report
NASA Technical Reports Server (NTRS)
Liu, W. T.; Niiler, P. P.
1985-01-01
A pilot data center to compute ocean-atmosphere heat exchange over the tropical ocean is proposed at the Jet Propulsion Laboratory (JPL) in response to the scientific needs of the Tropical Ocean and Global Atmosphere (TOGA) Program. Optimal methods will be used to estimate sea surface temperature (SST), surface wind speed, and humidity from spaceborne observations. A monthly summary of these parameters will be used to compute ocean-atmosphere latent heat exchanges. Monthly fields of surface heat flux over tropical oceans will be constructed using estimates of latent heat exchanges and shortwave radiation from satellite data. Verification of all satellite data sets with in situ measurements at a few locations will be provided. The data center will be an experimental active archive where the quality and quantity of data required for TOGA flux computation are managed. The center is essential to facilitate the construction of composite data sets from global measurements taken from different sensors on various satellites. It will provide efficient utilization and easy access to the large volume of satellite data available for studies of ocean-atmosphere energy exchanges.
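The latent heat exchange described above is conventionally estimated with a bulk aerodynamic formula from exactly the fields named in the abstract (SST, surface wind speed, near-surface humidity). The sketch below shows that standard formula only; the coefficient values and the helper for saturation humidity are generic textbook approximations, not the TOGA center's actual algorithms.

```python
import math

def saturation_specific_humidity(sst_c, pressure_hpa=1013.25):
    """Approximate saturation specific humidity (kg/kg) at the sea surface,
    using the Magnus formula for saturation vapor pressure (hPa)."""
    e_sat = 6.112 * math.exp(17.67 * sst_c / (sst_c + 243.5))
    return 0.622 * e_sat / (pressure_hpa - 0.378 * e_sat)

def latent_heat_flux(sst_c, wind_speed, q_air,
                     rho_air=1.2, L_v=2.5e6, C_E=1.3e-3):
    """Bulk formula: LE = rho_air * L_v * C_E * U * (q_s - q_a), in W/m^2."""
    q_s = saturation_specific_humidity(sst_c)
    return rho_air * L_v * C_E * wind_speed * (q_s - q_air)

# Example: 28 degC SST, 7 m/s wind, 0.016 kg/kg air humidity (illustrative values).
print(round(latent_heat_flux(28.0, 7.0, 0.016), 1))
```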
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gerber, Richard; Hack, James; Riley, Katherine
The mission of the U.S. Department of Energy Office of Science (DOE SC) is the delivery of scientific discoveries and major scientific tools to transform our understanding of nature and to advance the energy, economic, and national security missions of the United States. To achieve these goals in today’s world requires investments in not only the traditional scientific endeavors of theory and experiment, but also in computational science and the facilities that support large-scale simulation and data analysis. The Advanced Scientific Computing Research (ASCR) program addresses these challenges in the Office of Science. ASCR’s mission is to discover, develop, and deploy computational and networking capabilities to analyze, model, simulate, and predict complex phenomena important to DOE. ASCR supports research in computational science, three high-performance computing (HPC) facilities — the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory and Leadership Computing Facilities at Argonne (ALCF) and Oak Ridge (OLCF) National Laboratories — and the Energy Sciences Network (ESnet) at Berkeley Lab. ASCR is guided by science needs as it develops research programs, computers, and networks at the leading edge of technologies. As we approach the era of exascale computing, technology changes are creating challenges for science programs in SC for those who need to use high performance computing and data systems effectively. Numerous significant modifications to today’s tools and techniques will be needed to realize the full potential of emerging computing systems and other novel computing architectures. To assess these needs and challenges, ASCR held a series of Exascale Requirements Reviews in 2015–2017, one with each of the six SC program offices, and a subsequent Crosscut Review that sought to integrate the findings from each. Participants at the reviews were drawn from the communities of leading domain scientists, experts in computer science and applied mathematics, ASCR facility staff, and DOE program managers in ASCR and the respective program offices. The purpose of these reviews was to identify mission-critical scientific problems within the DOE Office of Science (including experimental facilities) and determine the requirements for the exascale ecosystem that would be needed to address those challenges. The exascale ecosystem includes exascale computing systems, high-end data capabilities, efficient software at scale, libraries, tools, and other capabilities. This effort will contribute to the development of a strategic roadmap for ASCR compute and data facility investments and will help the ASCR Facility Division establish partnerships with Office of Science stakeholders. It will also inform the Office of Science research needs and agenda. The results of the six reviews have been published in reports available on the web at http://exascaleage.org/. This report presents a summary of the individual reports and of common and crosscutting findings, and it identifies opportunities for productive collaborations among the DOE SC program offices.
NASA Astrophysics Data System (ADS)
Gusev, A.; Trudkova, N.
2017-09-01
Center "GeoNa" will enable scientists and teachers of the Russian universities to join to advanced achievements of a science, information technologies; to establish scientific communications with foreign colleagues in sphere of the high technology, educational projects and Intellectual-Cognitive Tourism. The Project "Kazan - Moon - 2020+" is directed on the decision of fundamental problems of celestial mechanics, selenodesy and geophysics of the Moon(s) connected to carrying out of complex theoretical researches and computer modelling.
Next Generation Workload Management System For Big Data on Heterogeneous Distributed Computing
Klimentov, A.; Buncic, P.; De, K.; ...
2015-05-22
The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe, and were recently credited for the discovery of a Higgs boson. ATLAS and ALICE are the largest collaborations ever assembled in the sciences and are at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, both experiments rely on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses PanDA (Production and Data Analysis) Workload Management System (WMS) for managing the workflow for all data processing on hundreds of data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. The scale is demonstrated by the following numbers: PanDA manages O(10^2) sites, O(10^5) cores, O(10^8) jobs per year, and O(10^3) users, and the ATLAS data volume is O(10^17) bytes. In 2013 we started an ambitious program to expand PanDA to all available computing resources, including opportunistic use of commercial and academic clouds and Leadership Computing Facilities (LCF). The project titled 'Next Generation Workload Management and Analysis System for Big Data' (BigPanDA) is funded by DOE ASCR and HEP. Extending PanDA to clouds and LCF presents new challenges in managing heterogeneity and supporting workflow. The BigPanDA project is underway to set up and tailor PanDA at the Oak Ridge Leadership Computing Facility (OLCF) and at the National Research Center "Kurchatov Institute" together with ALICE distributed computing and ORNL computing professionals. Our approach to integration of HPC platforms at the OLCF and elsewhere is to reuse, as much as possible, existing components of the PanDA system. Finally, we will present our current accomplishments with running the PanDA WMS at OLCF and other supercomputers and demonstrate our ability to use PanDA as a portal independent of the computing facilities infrastructure for High Energy and Nuclear Physics as well as other data-intensive science applications.
A uniform approach for programming distributed heterogeneous computing systems
Grasso, Ivan; Pellegrini, Simone; Cosenza, Biagio; Fahringer, Thomas
2014-01-01
Large-scale compute clusters of heterogeneous nodes equipped with multi-core CPUs and GPUs are getting increasingly popular in the scientific community. However, such systems require a combination of different programming paradigms making application development very challenging. In this article we introduce libWater, a library-based extension of the OpenCL programming model that simplifies the development of heterogeneous distributed applications. libWater consists of a simple interface, which is a transparent abstraction of the underlying distributed architecture, offering advanced features such as inter-context and inter-node device synchronization. It provides a runtime system which tracks dependency information enforced by event synchronization to dynamically build a DAG of commands, on which we automatically apply two optimizations: collective communication pattern detection and device-host-device copy removal. We assess libWater’s performance in three compute clusters available from the Vienna Scientific Cluster, the Barcelona Supercomputing Center and the University of Innsbruck, demonstrating improved performance and scaling with different test applications and configurations. PMID:25844015
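The abstract above describes a runtime that uses event dependencies to build a DAG of commands before optimizing and executing them. The sketch below illustrates only that dependency-tracking idea in plain Python; the classes and command names are illustrative and are not libWater's actual (OpenCL-based) interface.

```python
# Hedged sketch: build a DAG of device commands from event dependencies and
# order them so each command runs only after the events it waits on.
from collections import defaultdict, deque

class Command:
    def __init__(self, name, waits_on=()):
        self.name = name                 # e.g. "write_buf", "kernel", "read_buf"
        self.waits_on = list(waits_on)   # commands (events) this one waits for

def topological_order(commands):
    """Kahn's algorithm over the dependency DAG."""
    indegree = {c: len(c.waits_on) for c in commands}
    children = defaultdict(list)
    for c in commands:
        for dep in c.waits_on:
            children[dep].append(c)
    ready = deque(c for c in commands if indegree[c] == 0)
    order = []
    while ready:
        c = ready.popleft()
        order.append(c)
        for child in children[c]:
            indegree[child] -= 1
            if indegree[child] == 0:
                ready.append(child)
    return order

# A host-to-device copy, a kernel depending on it, and a device-to-host read.
h2d = Command("write_buf")
krn = Command("kernel", waits_on=[h2d])
d2h = Command("read_buf", waits_on=[krn])
print([c.name for c in topological_order([d2h, krn, h2d])])
```

An optimizer such as the one described above could walk this DAG to spot patterns (for example, a device-to-host copy immediately feeding a host-to-device copy) before dispatching the commands.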
DOE Office of Scientific and Technical Information (OSTI.GOV)
Computational Research Division, Lawrence Berkeley National Laboratory; NERSC, Lawrence Berkeley National Laboratory; Computer Science Department, University of California, Berkeley
2009-05-04
We apply auto-tuning to a hybrid MPI-pthreads lattice Boltzmann computation running on the Cray XT4 at the National Energy Research Scientific Computing Center (NERSC). Previous work showed that multicore-specific auto-tuning can improve the performance of lattice Boltzmann magnetohydrodynamics (LBMHD) by a factor of 4x when running on dual- and quad-core Opteron dual-socket SMPs. We extend these studies to the distributed memory arena via a hybrid MPI/pthreads implementation. In addition to conventional auto-tuning at the local SMP node, we tune at the message-passing level to determine the optimal aspect ratio as well as the correct balance between MPI tasks and threads per MPI task. Our study presents a detailed performance analysis when moving along an isocurve of constant hardware usage: fixed total memory, total cores, and total nodes. Overall, our work points to approaches for improving intra- and inter-node efficiency on large-scale multicore systems for demanding scientific applications.
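One element of the tuning described above is searching over the split between MPI tasks and threads per task while holding the total core count fixed. The sketch below shows a minimal version of that search loop under the assumption that a benchmark callback measures one split at a time; the toy benchmark is a stand-in, not the LBMHD kernel.

```python
# Hedged sketch: enumerate (MPI tasks, threads per task) splits of a fixed
# core allocation and keep the best-performing one.
def candidate_splits(total_cores):
    """All (mpi_tasks, threads_per_task) pairs with tasks * threads == cores."""
    return [(t, total_cores // t) for t in range(1, total_cores + 1)
            if total_cores % t == 0]

def autotune(total_cores, benchmark):
    """Return the split with the highest measured performance (e.g. GFLOP/s)."""
    best_split, best_perf = None, float("-inf")
    for tasks, threads in candidate_splits(total_cores):
        perf = benchmark(tasks, threads)   # run the hybrid code at this split
        if perf > best_perf:
            best_split, best_perf = (tasks, threads), perf
    return best_split, best_perf

# Toy benchmark (hypothetical): prefers a moderate number of threads per task.
toy = lambda tasks, threads: tasks * threads - abs(threads - 4)
print(autotune(16, toy))
```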
75 FR 38111 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-01
..., (Virtual Meeting). Contact Person: Arnold Revzin, PhD, Scientific Review Officer, Center for Scientific..., (Virtual Meeting). Contact Person: Jerry L. Taylor, PhD, Scientific Review Officer, Center for Scientific... Rockledge Drive, Bethesda, MD 20892, (Virtual Meeting). Contact Person: Richard Ingraham, PhD, Scientific...
Department of Defense In-House RDT and E Activities
1976-10-30
BALLISTIC TESTS. FACILITY AVAILABLE FOR TESTS OF SPECIAL ELECTRONIC FIRE CONTROL EQUIPMENT AND RELATED SYSTEMS AND COMPONENTS, 35 INSTALLATION: MEDICAL BIOENGINEERING R&D LABORATORY...ANALYSIS OF CHEMICAL AND METALLOGRAPHIC EFFECTS, MICROBIOLOGICAL EFFECTS, CLIMATIC ENVIRONMENTAL EFFECTS. TEST AND EVALUATE WARHEADS AND SPECIAL...COMMUNICATION SYSTEM INSTRUMENTED DROP ZONES ENGINEERING TEST FACILITY INSTRUMENTATION CALIBRATION FACILITY SCIENTIFIC COMPUTER CENTER ENVIRONMENTAL TEST
Documentation of the data analysis system for the gamma ray monitor aboard OSO-H
NASA Technical Reports Server (NTRS)
Croteau, S.; Buck, A.; Higbie, P.; Kantauskis, J.; Foss, S.; Chupp, D.; Forrest, D. J.; Suri, A.; Gleske, I.
1973-01-01
The programming system developed to prepare the data from the gamma ray monitor on OSO-7 for scientific analysis is presented. The detector, data, and objectives are described in detail. Programs presented include: FEEDER, PASS-1, CAL1, CAL2, PASS-3, the Van Allen Belt Predict Program, the Computation Center Plot Routine, and the Response Function Programs.
78 FR 6127 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-29
..., Computational Biology and Technology Study Section. Date: February 20-21, 2013. Time: 11:00 a.m. to 6:00 p.m... Hematology Integrated Review Group Vascular Cell and Molecular Biology Study Section. Date: February 21-22... Rehabilitation Study Section. Date: February 22, 2013. Time: 8:00 p.m. to 9:00 p.m. Agenda: To review and...
77 FR 27470 - Center for Scientific Review Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-10
..., Prevention and Intervention for Addictions Study Section. Date: June 7-8, 2012. Time: 8:00 a.m. to 5:00 p.m...: Bioengineering Sciences & Technologies Integrated Review Group; Nanotechnology Study Section. Date: June 7-8..., Computational Biology and Technology Study Section. Date: June 7-8, 2012. Time: 8:30 a.m. to 6:00 p.m. Agenda...
75 FR 10491 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-08
...: Computational Biology, Image Processing, and Data Mining. Date: March 18, 2010. Time: 8 a.m. to 6 p.m. Agenda... Science. Date: March 24, 2010. Time: 12 p.m. to 3:30 p.m. Agenda: To review and evaluate grant...; Fellowship: Biophysical and Biochemical Sciences. Date: March 25-26, 2010. Time: 8 a.m. to 5 p.m. Agenda: To...
Early Citability of Data vs Peer-Review like Data Publishing Procedures
NASA Astrophysics Data System (ADS)
Stockhause, Martina; Höck, Heinke; Toussaint, Frank; Lautenschlager, Michael
2014-05-01
The World Data Center for Climate (WDCC), hosted at the German Climate Computing Center (DKRZ), was one of the first data centers to establish a peer-review-like data publication procedure resulting in DataCite DOIs. Data in the long-term archive (LTA) is diligently reviewed by data managers and data authors to ensure high quality and wide reusability of the published data. This traditional data publication procedure for LTA data bearing DOIs is very time consuming, especially for WDCC's high volumes of climate model data on the order of multiple TBytes. Data is shared with project members and selected scientists months before it is long-term archived. The scientific community analyses and thus reviews the data, leading to data quality improvements. Scientists wish to cite these unstable data in scientific publications before the long-term archiving and the thorough data review process are finalized. A concept for early preprint DOIs for shared but not yet long-term archived data is presented. Requirements on data documentation, persistence, and quality, and use cases for preprint DOIs within the data life cycle, are discussed, as are questions of how to document the differences between the two DOI types and how to relate them to each other, with the recommendation to use LTA DOIs in citations. WDCC wants to offer an additional user service for early citation of data of basic quality without compromising the LTA DOIs, i.e. WDCC's standard DOIs, as a trustworthy indicator of high-quality data. Referencing Links: World Data Center for Climate (WDCC): http://www.wdc-climate.de German Climate Computing Center (DKRZ): http://www.dkrz.de DataCite: http://datacite.org
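One way to express the relationship between a preprint DOI and the later LTA DOI is through DataCite-style related identifiers. The sketch below is only an illustration of that idea, assuming the DataCite relation type "IsPreviousVersionOf"; the DOIs and the helper function are placeholders, not WDCC's actual metadata workflow.

```python
# Hedged sketch: a preprint DOI record pointing at the final LTA DOI, plus a
# helper that echoes the recommendation to cite the LTA DOI once it exists.
preprint_record = {
    "doi": "10.0000/wdcc.example.preprint",               # hypothetical
    "titles": [{"title": "Climate model output, shared pre-archive version"}],
    "relatedIdentifiers": [{
        "relatedIdentifier": "10.0000/wdcc.example.lta",  # hypothetical LTA DOI
        "relatedIdentifierType": "DOI",
        "relationType": "IsPreviousVersionOf",            # preprint precedes the LTA version
    }],
}

def citation_hint(record):
    """Recommend citing the LTA DOI when a successor version is recorded."""
    for rel in record.get("relatedIdentifiers", []):
        if rel["relationType"] == "IsPreviousVersionOf":
            return f"Please cite the long-term archive version: doi:{rel['relatedIdentifier']}"
    return f"Cite: doi:{record['doi']}"

print(citation_hint(preprint_record))
```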
Python in the NERSC Exascale Science Applications Program for Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ronaghi, Zahra; Thomas, Rollin; Deslippe, Jack
We describe a new effort at the National Energy Research Scientific Computing Center (NERSC) in performance analysis and optimization of scientific Python applications targeting the Intel Xeon Phi (Knights Landing, KNL) many-core architecture. The Python-centered work outlined here is part of a larger effort called the NERSC Exascale Science Applications Program (NESAP) for Data. NESAP for Data focuses on applications that process and analyze high-volume, high-velocity data sets from experimental/observational science (EOS) facilities supported by the US Department of Energy Office of Science. We present three case study applications from NESAP for Data that use Python. These codes vary in terms of “Python purity” from applications developed in pure Python to ones that use Python mainly as a convenience layer for scientists without expertise in lower-level programming languages like C, C++ or Fortran. The science case, requirements, constraints, algorithms, and initial performance optimizations for each code are discussed. Our goal with this paper is to contribute to the larger conversation around the role of Python in high-performance computing today and tomorrow, highlighting areas for future work and emerging best practices.
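One common class of optimization for scientific Python on many-core nodes, and a typical first step in the kind of analysis described above, is replacing interpreted per-element loops with vectorized NumPy expressions that dispatch to compiled libraries. The sketch below is a generic illustration of that pattern, not code taken from the NESAP case studies; the image-processing example is hypothetical.

```python
# Hedged sketch: compare a pure-Python per-pixel loop with a vectorized
# NumPy equivalent for a simple background-subtraction step.
import time
import numpy as np

def background_subtract_loop(image, background):
    out = np.empty_like(image)
    for i in range(image.shape[0]):
        for j in range(image.shape[1]):
            out[i, j] = max(image[i, j] - background[i, j], 0.0)
    return out

def background_subtract_vec(image, background):
    # Same arithmetic, executed in compiled NumPy kernels.
    return np.clip(image - background, 0.0, None)

img = np.random.rand(1000, 1000)
bkg = np.random.rand(1000, 1000)
for fn in (background_subtract_loop, background_subtract_vec):
    t0 = time.perf_counter()
    fn(img, bkg)
    print(fn.__name__, round(time.perf_counter() - t0, 3), "s")
```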
Educational NASA Computational and Scientific Studies (enCOMPASS)
NASA Technical Reports Server (NTRS)
Memarsadeghi, Nargess
2013-01-01
Educational NASA Computational and Scientific Studies (enCOMPASS) is an educational project of NASA Goddard Space Flight Center aimed at bridging the gap between computational objectives and needs of NASA's scientific research, missions, and projects, and academia's latest advances in applied mathematics and computer science. enCOMPASS achieves this goal via bidirectional collaboration and communication between NASA and academia. Using developed NASA Computational Case Studies in university computer science/engineering and applied mathematics classes is a way of addressing NASA's goals of contributing to the Science, Technology, Education, and Math (STEM) National Objective. The enCOMPASS Web site at http://encompass.gsfc.nasa.gov provides additional information. There are currently nine enCOMPASS case studies developed in areas of earth sciences, planetary sciences, and astrophysics. Some of these case studies have been published in AIP and IEEE's Computing in Science and Engineering magazines. A few university professors have used enCOMPASS case studies in their computational classes and contributed their findings to NASA scientists. In these case studies, after introducing the science area, the specific problem, and related NASA missions, students are first asked to solve a known problem using NASA data and past approaches used and often published in a scientific/research paper. Then, after learning about the NASA application and related computational tools and approaches for solving the proposed problem, students are given a harder problem as a challenge for them to research and develop solutions for. This project provides a model for NASA scientists and engineers on one side, and university students, faculty, and researchers in computer science and applied mathematics on the other side, to learn from each other's areas of work, computational needs and solutions, and the latest advances in research and development. This innovation takes NASA science and engineering applications to computer science and applied mathematics university classes, and makes NASA objectives part of the university curricula. There is great potential for growth and return on investment of this program to the point where every major university in the U.S. would use at least one of these case studies in one of their computational courses, and where every NASA scientist and engineer facing a computational challenge (without having resources or expertise to solve it) would use enCOMPASS to formulate the problem as a case study, provide it to a university, and get back their solutions and ideas.
Integration of Panda Workload Management System with supercomputers
NASA Astrophysics Data System (ADS)
De, K.; Jha, S.; Klimentov, A.; Maeno, T.; Mashinistov, R.; Nilsson, P.; Novikov, A.; Oleynik, D.; Panitkin, S.; Poyda, A.; Read, K. F.; Ryabinkin, E.; Teslyuk, A.; Velikhov, V.; Wells, J. C.; Wenaus, T.
2016-09-01
The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe, and were recently credited for the discovery of a Higgs boson. ATLAS, one of the largest collaborations ever assembled in the sciences, is at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, the ATLAS experiment is relying on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses PanDA (Production and Data Analysis) Workload Management System for managing the workflow for all data processing on over 140 data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. While PanDA currently uses more than 250,000 cores with a peak performance of 0.3+ petaFLOPS, the next LHC data-taking runs will require more resources than Grid computing can possibly provide. To alleviate these challenges, LHC experiments are engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We will describe a project aimed at integration of PanDA WMS with supercomputers in the United States, Europe and Russia (in particular with the Titan supercomputer at Oak Ridge Leadership Computing Facility (OLCF), the supercomputer at the National Research Center "Kurchatov Institute", IT4 in Ostrava, and others). The current approach utilizes a modified PanDA pilot framework for job submission to the supercomputers' batch queues and local data management, with light-weight MPI wrappers to run single-threaded workloads in parallel on Titan's multi-core worker nodes. This implementation was tested with a variety of Monte-Carlo workloads on several supercomputing platforms. We will present our current accomplishments in running PanDA WMS at supercomputers and demonstrate our ability to use PanDA as a portal independent of the computing facility's infrastructure for High Energy and Nuclear Physics, as well as other data-intensive science applications, such as bioinformatics and astro-particle physics.
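The "light-weight MPI wrapper" idea described above, running many independent single-threaded payloads inside one batch allocation, can be illustrated with a few lines of mpi4py: each rank launches its own payload on its core. This is a minimal sketch of the general pattern, not the PanDA pilot's actual implementation; the payload command and input-file naming are hypothetical.

```python
# Hedged sketch: one MPI rank per core, each running a single-threaded payload.
import subprocess
import sys
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

# One input file per rank (hypothetical naming scheme).
input_file = f"event_chunk_{rank:05d}.dat"

# Run the single-threaded payload; stdout/stderr are kept per rank.
with open(f"payload_{rank:05d}.log", "w") as log:
    ret = subprocess.call(["./simulate_events", input_file],
                          stdout=log, stderr=subprocess.STDOUT)

# Gather return codes on rank 0 so the wrapper can report overall success.
codes = comm.gather(ret, root=0)
if rank == 0 and any(codes):
    sys.exit(1)
```

Launched under the batch system's MPI runner, a single allocation of N cores then processes N independent chunks in parallel without any changes to the payload itself.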
DOE Office of Scientific and Technical Information (OSTI.GOV)
Samios, N.P.
The ninth evaluation of the RIKEN BNL Research Center (RBRC) took place on Nov. 17-18, 2008, at Brookhaven National Laboratory. The members of the Scientific Review Committee (SRC) were Dr. Wit Busza (Chair), Dr. Miklos Gyulassy, Dr. Akira Masaike, Dr. Richard Milner, Dr. Alfred Mueller, and Dr. Akira Ukawa. We are pleased that Dr. Yasushige Yano, the Director of the Nishina Institute of RIKEN, Japan, participated in this meeting, both informing the committee of the activities of the Nishina Institute and the role of RBRC and serving as an observer of this review. In order to illustrate the breadth and scope of the RBRC program, each member of the Center made a presentation on his/her research efforts. This encompassed three major areas of investigation: theoretical, experimental, and computational physics. In addition, the committee met privately with the fellows and postdocs to ascertain their opinions and concerns. Although the main purpose of this review is a report to RIKEN Management (Dr. Ryoji Noyori, RIKEN President) on the health, scientific value, management, and future prospects of the Center, the RBRC management felt that a compendium of the scientific presentations is of sufficient quality and interest to warrant a wider distribution. Therefore we have made this compilation and present it to the community for its information and enlightenment.
77 FR 68138 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-15
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel; Member Conflict..., (Telephone Conference Call). Contact Person: Richard Panniers, Ph.D., Scientific Review Officer, Center for...
78 FR 61377 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-03
... . Name of Committee: Center for Scientific Review Special Emphasis Panel; Small Business: Drug Discovery... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel; Members Conflicts...
75 FR 61767 - Center For Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-06
...: Center for Scientific Review Special Emphasis Panel, Special: Signal Transduction and Drug Discovery in... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center For Scientific Review... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel, Member Conflict...
78 FR 33427 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-04
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review..., Center for Scientific Review, National Institutes of Health, 6701 Rockledge Drive, Room 2182, MSC 7818... Committee: Center for Scientific Review Special Emphasis Panel; Member Conflict: Stroke, Spinal Cord Injury...
77 FR 62246 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-12
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel Clinical and.... Contact Person: Eileen W. Bradley, DSC, Chief, SBIB IRG, Center for Scientific Review, National Institutes...
77 FR 59198 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-26
....gov . Name of Committee: Center for Scientific Review Special Emphasis Panel; Drug Discovery for the... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel; Program Projects...
Real science at the petascale.
Saksena, Radhika S; Boghosian, Bruce; Fazendeiro, Luis; Kenway, Owain A; Manos, Steven; Mazzeo, Marco D; Sadiq, S Kashif; Suter, James L; Wright, David; Coveney, Peter V
2009-06-28
We describe computational science research that uses petascale resources to achieve scientific results at unprecedented scales and resolution. The applications span a wide range of domains, from investigation of fundamental problems in turbulence through computational materials science research to biomedical applications at the forefront of HIV/AIDS research and cerebrovascular haemodynamics. This work was mainly performed on the US TeraGrid 'petascale' resource, Ranger, at Texas Advanced Computing Center, in the first half of 2008 when it was the largest computing system in the world available for open scientific research. We have sought to use this petascale supercomputer optimally across application domains and scales, exploiting the excellent parallel scaling performance found on up to at least 32 768 cores for certain of our codes in the so-called 'capability computing' category as well as high-throughput intermediate-scale jobs for ensemble simulations in the 32-512 core range. Furthermore, this activity provides evidence that conventional parallel programming with MPI should be successful at the petascale in the short to medium term. We also report on the parallel performance of some of our codes on up to 65 536 cores on the IBM Blue Gene/P system at the Argonne Leadership Computing Facility, which has recently been named the fastest supercomputer in the world for open science.
Methodical and technological aspects of creation of interactive computer learning systems
NASA Astrophysics Data System (ADS)
Vishtak, N. M.; Frolov, D. A.
2017-01-01
The article presents a methodology for the development of an interactive computer training system for power plant personnel. The methods used in this work are a generalization of the content of scientific and methodological sources on the use of computer-based training systems in vocational education, methods of system analysis, and methods of structural and object-oriented modeling of information systems. The relevance of developing interactive computer training systems for personnel preparation in educational and training centers is demonstrated. The development stages of computer training systems are identified, and the factors for efficient use of an interactive computer training system are analysed. An algorithm for the work performed at each development stage of the interactive computer training system, which makes it possible to optimize the time, financial, and labor expenditure of creating the system, is proposed.
Job Superscheduler Architecture and Performance in Computational Grid Environments
NASA Technical Reports Server (NTRS)
Shan, Hongzhang; Oliker, Leonid; Biswas, Rupak
2003-01-01
Computational grids hold great promise in utilizing geographically separated heterogeneous resources to solve large-scale complex scientific problems. However, a number of major technical hurdles, including distributed resource management and effective job scheduling, stand in the way of realizing these gains. In this paper, we propose a novel grid superscheduler architecture and three distributed job migration algorithms. We also model the critical interaction between the superscheduler and autonomous local schedulers. Extensive performance comparisons with ideal, central, and local schemes using real workloads from leading computational centers are conducted in a simulation environment. Additionally, synthetic workloads are used to perform a detailed sensitivity analysis of our superscheduler. Several key metrics demonstrate that substantial performance gains can be achieved via smart superscheduling in distributed computational grids.
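The abstract above centers on distributed job migration policies coordinated by a superscheduler. The sketch below shows one very simple migration heuristic under stated assumptions (migrate when the estimated local wait exceeds a threshold, penalizing remote sites by a fixed data-transfer cost); it is an illustration of the general idea, not one of the paper's three algorithms.

```python
# Hedged sketch: pick a site for a job by comparing rough wait-time estimates.
def estimated_wait(site):
    """Very rough wait estimate: queued work divided by site capacity (hours)."""
    return site["queued_core_hours"] / site["cores"]

def choose_site(job, local, remotes, threshold_hours=2.0, transfer_hours=0.5):
    local_wait = estimated_wait(local)
    if local_wait <= threshold_hours:
        return local["name"]                       # local queue is acceptable
    best = min(remotes, key=lambda s: estimated_wait(s) + transfer_hours)
    if estimated_wait(best) + transfer_hours < local_wait:
        return best["name"]                        # migrate to the better site
    return local["name"]

local = {"name": "siteA", "cores": 1024, "queued_core_hours": 8192}
remotes = [{"name": "siteB", "cores": 2048, "queued_core_hours": 2048},
           {"name": "siteC", "cores": 512, "queued_core_hours": 4096}]
print(choose_site({"cores": 64}, local, remotes))
```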
75 FR 15715 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-30
... Conference Call.) Contact Person: Daniel F. McDonald, PhD, Scientific Review Officer, Center for Scientific.... (301) 435-1215. mcdonald@csr.nih.gov . Name of Committee: Center for Scientific Review Special Emphasis...
77 FR 74675 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-17
... Rockledge Drive, Bethesda, MD 20892, (Telephone Conference Call). Contact Person: Rajiv Kumar, Ph.D., Chief... Chaudhari, Ph.D., Scientific Review Officer, Center for Scientific Review, National Institutes of Health... Conference Call). Contact Person: Syed M Quadri, Ph.D., Scientific Review Officer, Center for Scientific...
NASA Astrophysics Data System (ADS)
Khodachenko, Maxim; Miller, Steven; Stoeckler, Robert; Topf, Florian
2010-05-01
Computational modeling and observational data analysis are two major aspects of modern scientific research. Both are nowadays under extensive development and application. Many of the scientific goals of planetary space missions require robust models of planetary objects and environments as well as efficient data analysis algorithms, to predict conditions for mission planning and to interpret the experimental data. Europe has great strength in these areas, but it is insufficiently coordinated; individual groups, models, techniques and algorithms need to be coupled and integrated. The existing level of scientific cooperation and the technical capabilities for operative communication allow considerable progress in the development of a distributed international Research Infrastructure (RI) based on the computational modelling and data analysis centers already existing in Europe, providing the scientific community with dedicated services in the fields of their computational and data analysis expertise. These services will appear as a product of the collaborative communication and joint research efforts of the numerical and data analysis experts together with planetary scientists. The major goal of the EUROPLANET-RI / EMDAF is to make computational models and data analysis algorithms associated with particular national RIs and teams, as well as their outputs, more readily available to their potential user community and more tailored to scientific user requirements, without compromising front-line specialized research on model and data analysis algorithm development and software implementation. This objective will be met through four key subdivisions/tasks of EMDAF: 1) an Interactive Catalogue of Planetary Models; 2) a Distributed Planetary Modelling Laboratory; 3) a Distributed Data Analysis Laboratory, and 4) enabling Models and Routines for High Performance Computing Grids. Using the advantages of the coordinated operation and efficient communication between the involved computational modelling, research and data analysis expert teams and their related research infrastructures, EMDAF will provide a 1) flexible, 2) scientific-user-oriented, 3) continuously developing and fast-upgrading computational and data analysis service to support and intensify European planetary scientific research. At the beginning, EMDAF will create a set of demonstrators and operational tests of this service in key areas of European planetary science. This work will aim at the following objectives: (a) Development and implementation of tools for distant interactive communication between the planetary scientists and computing experts (including related RIs); (b) Development of standard routine packages, and user-friendly interfaces for operation of the existing numerical codes and data analysis algorithms by the specialized planetary scientists; (c) Development of a prototype of numerical modelling services "on demand" for space missions and planetary researchers; (d) Development of a prototype of data analysis services "on demand" for space missions and planetary researchers; (e) Development of a prototype of coordinated interconnected simulations of planetary phenomena and objects (global multi-model simulators); (f) Providing the demonstrators of a coordinated use of high performance computing facilities (super-computer networks), done in cooperation with the European HPC Grid DEISA.
European Scientific Notes. Volume 35, Number 6.
1981-06-30
center where the operator can relationships are promoted in four ways: regulate the test conditions (air storage 211 +* , FSN 35-6 (1981) tanks, model ...tunnel models , for itself asymmetric aileron deflection and post- and its clients, using computer-controlled stall aerodynamics are in progress...those attending the Discussion nonaqueous solutions, (6) new theoretical was somewhat reduced by the fact that models , (7) Fermi-level concepts in solu
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garzoglio, Gabriele
The Fermilab Grid and Cloud Computing Department and the KISTI Global Science experimental Data hub Center propose a joint project. The goals are to enable scientific workflows of stakeholders to run on multiple cloud resources by use of (a) Virtual Infrastructure Automation and Provisioning, (b) Interoperability and Federation of Cloud Resources, and (c) High-Throughput Fabric Virtualization. This is a matching-fund project in which Fermilab and KISTI will contribute equal resources.
Format Guide for Scientific and Technical Reports.
1984-01-01
supported by the discussion. Graphic Services The Graphic Services Section (Code 2632) provides a variety of layout and design services. Camera-ready artwork...complex typography, elaborate graphic elements, extensive computer printouts, and other unusual materials that explain the project. With few exceptions...2630 Publications Branch Office 222/253 72379 S Publications Control Center 222/253 73508 Editorial 222/253 72782 Graphic Services 222/234 72756 73989
ERIC Educational Resources Information Center
Yarker, Morgan Brown
2013-01-01
Research suggests that scientific models and modeling should be topics covered in K-12 classrooms as part of a comprehensive science curriculum. It is especially important when talking about topics in weather and climate, where computer and forecast models are the center of attention. There are several approaches to model based inquiry, but it can…
76 FR 41234 - Advanced Scientific Computing Advisory Committee Charter Renewal
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-13
... Secretariat, General Services Administration, notice is hereby given that the Advanced Scientific Computing... advice and recommendations concerning the Advanced Scientific Computing program in response only to... Advanced Scientific Computing Research program and recommendations based thereon; --Advice on the computing...
NASA Technical Reports Server (NTRS)
Kramer, Williams T. C.; Simon, Horst D.
1994-01-01
This tutorial aims to be a practical guide for the uninitiated to the main topics and themes of high-performance computing (HPC), with particular emphasis on distributed computing. The intent is first to provide some guidance and directions in the rapidly growing field of scientific computing using both massively parallel and traditional supercomputers. Because of their considerable potential computational power, loosely or tightly coupled clusters of workstations are increasingly considered a third alternative to both the more conventional supercomputers based on a small number of powerful vector processors and massively parallel processors. Even though many research issues concerning the effective use of workstation clusters and their integration into a large-scale production facility are still unresolved, such clusters are already used for production computing. In this tutorial we will draw on the unique experience gained at the NAS facility at NASA Ames Research Center. Over the last five years at NAS, massively parallel supercomputers such as the Connection Machines CM-2 and CM-5 from Thinking Machines Corporation and the iPSC/860 (Touchstone Gamma Machine) and Paragon Machines from Intel were used in a production supercomputer center alongside traditional vector supercomputers such as the Cray Y-MP and C90.
DOE Office of Scientific and Technical Information (OSTI.GOV)
De, K; Jha, S; Klimentov, A
2016-01-01
The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe, and were recently credited for the discovery of a Higgs boson. ATLAS, one of the largest collaborations ever assembled in the sciences, is at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, the ATLAS experiment is relying on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses PanDA (Production and Data Analysis) Workload Management System for managing the workflow for all data processing on over 150 data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. While PanDA currently uses more than 250,000 cores with a peak performance of 0.3 petaFLOPS, LHC data taking runs require more resources than Grid computing can possibly provide. To alleviate these challenges, LHC experiments are engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We will describe a project aimed at integration of PanDA WMS with supercomputers in the United States, Europe and Russia (in particular with the Titan supercomputer at Oak Ridge Leadership Computing Facility (OLCF), the MIRA supercomputer at the Argonne Leadership Computing Facility (ALCF), the supercomputer at the National Research Center "Kurchatov Institute", IT4 in Ostrava, and others). The current approach utilizes a modified PanDA pilot framework for job submission to the supercomputers' batch queues and local data management, with light-weight MPI wrappers to run single-threaded workloads in parallel on the LCFs' multi-core worker nodes. This implementation was tested with a variety of Monte-Carlo workloads on several supercomputing platforms for the ALICE and ATLAS experiments and has been in full production for the ATLAS experiment since September 2015. We will present our current accomplishments with running PanDA WMS at supercomputers and demonstrate our ability to use PanDA as a portal independent of the computing facilities infrastructure for High Energy and Nuclear Physics as well as other data-intensive science applications, such as bioinformatics and astro-particle physics.
Tools for 3D scientific visualization in computational aerodynamics
NASA Technical Reports Server (NTRS)
Bancroft, Gordon; Plessel, Todd; Merritt, Fergus; Watson, Val
1989-01-01
The purpose is to describe the tools and techniques in use at the NASA Ames Research Center for performing visualization of computational aerodynamics, for example, visualization of flow fields from computer simulations of fluid dynamics about vehicles such as the Space Shuttle. The hardware used for visualization is a high-performance graphics workstation connected to a supercomputer with a high-speed channel. At present, the workstation is a Silicon Graphics IRIS 3130, the supercomputer is a CRAY2, and the high-speed channel is a hyperchannel. The three techniques used for visualization are post-processing, tracking, and steering. Post-processing analysis is done after the simulation. Tracking analysis is done during a simulation but is not interactive, whereas steering analysis involves modifying the simulation interactively during the simulation. Using post-processing methods, a flow simulation is executed on a supercomputer and, after the simulation is complete, the results of the simulation are processed for viewing. The software in use and under development at NASA Ames Research Center for performing these types of tasks in computational aerodynamics is described. Workstation performance issues, benchmarking, and high-performance networks for this purpose are also discussed, along with descriptions of other hardware for digital video and film recording.
Techniques for animation of CFD results. [computational fluid dynamics
NASA Technical Reports Server (NTRS)
Horowitz, Jay; Hanson, Jeffery C.
1992-01-01
Video animation is becoming increasingly vital to the computational fluid dynamics researcher, not just for presentation, but for recording and comparing dynamic visualizations that are beyond the current capabilities of even the most powerful graphic workstation. To meet these needs, Lewis Research Center has recently established a facility to provide users with easy access to advanced video animation capabilities. However, producing animation that is both visually effective and scientifically accurate involves various technological and aesthetic considerations that must be understood both by the researcher and those supporting the visualization process. These considerations include: scan conversion, color conversion, and spatial ambiguities.
Prediction and characterization of application power use in a high-performance computing environment
Bugbee, Bruce; Phillips, Caleb; Egan, Hilary; ...
2017-02-27
Power use in data centers and high-performance computing (HPC) facilities has grown in tandem with increases in the size and number of these facilities. Substantial innovation is needed to enable meaningful reduction in energy footprints in leadership-class HPC systems. In this paper, we focus on characterizing and investigating application-level power usage. We demonstrate potential methods for predicting power usage based on a priori and in situ characteristics. Lastly, we highlight a potential use case of this method through a simulated power-aware scheduler using historical jobs from a real scientific HPC system.
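The abstract above describes predicting application power from a priori job characteristics and feeding such estimates to a power-aware scheduler. The sketch below shows only the general shape of that idea, fitting a plain least-squares model to historical job features; the feature names, the numbers, and the use of ordinary least squares are illustrative assumptions, not the paper's actual model or data.

```python
# Hedged sketch: least-squares power prediction from a priori job features.
import numpy as np

# Historical jobs: [nodes, requested_hours, is_gpu_job] -> mean power (kW).
X = np.array([[16, 2.0, 0],
              [64, 8.0, 0],
              [32, 4.0, 1],
              [128, 12.0, 1]], dtype=float)
y = np.array([9.5, 38.0, 30.0, 118.0])

# Add an intercept column and fit by least squares.
A = np.hstack([X, np.ones((X.shape[0], 1))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict_power_kw(nodes, hours, is_gpu):
    features = np.array([nodes, hours, is_gpu, 1.0])
    return float(features @ coef)

# A power-aware scheduler could use such estimates to stay under a site cap.
print(round(predict_power_kw(48, 6.0, 1), 1))
```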
77 FR 65570 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-29
... Conference Call). Contact Person: Carol Hamelink, Ph.D., Scientific Review Officer, Center for Scientific...--Convention Center, 900 10th Street NW., Washington, DC 20001. Contact Person: Mark P Rubert, Ph.D... for Scientific Review Special Emphasis Panel; Fellowships: Chemistry, Biochemistry and Biophysics...
78 FR 39298 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-01
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review...., Scientific Review Officer, Center for Scientific Review, National Institutes of Health, 6701 Rockledge Drive...: National Institutes of Health, 6701 Rockledge Drive, Bethesda, MD 20892, (Virtual Meeting). Contact Person...
76 FR 35225 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-16
... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel, Drug Discovery... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review... Drive, Bethesda, MD 20892. (Telephone Conference Call) Contact Person: Guangyong Ji, PhD, Scientific...
77 FR 74675 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-17
... Scientific Review Special Emphasis Panel Member Conflict: Healthcare Delivery and Methodologies. Date... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review... Road NW., Washington, DC 20015. Contact Person: Bo Hong, Ph.D., Scientific Review Officer, Center for...
NASA Astrophysics Data System (ADS)
Doyle, Paul; Mtenzi, Fred; Smith, Niall; Collins, Adrian; O'Shea, Brendan
2012-09-01
The scientific community is in the midst of a data analysis crisis. The increasing capacity of scientific CCD instrumentation and its falling cost are contributing to an explosive generation of raw photometric data. This data must go through a process of cleaning and reduction before it can be used for high-precision photometric analysis. Many existing data processing pipelines either assume a relatively small dataset or are batch processed by a High Performance Computing centre. A radical overhaul of these processing pipelines is required to allow reduction and cleaning rates to process terabyte-sized datasets at near capture rates using an elastic processing architecture. The ability to access computing resources and to allow them to grow and shrink as demand fluctuates is essential, as is exploiting the parallel nature of the datasets. A distributed data processing pipeline is required. It should incorporate lossless data compression, allow for data segmentation, and support processing of data segments in parallel. Academic institutes can collaborate and provide an elastic computing model without the requirement for large centralized high performance computing data centers. This paper demonstrates how a base-10 order-of-magnitude improvement in overall processing time has been achieved using the "ACN pipeline", a distributed pipeline spanning multiple academic institutes.
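The pipeline requirements listed above (lossless compression, data segmentation, and parallel processing of segments) can be illustrated on a single node with a process pool. The sketch below is a generic illustration under those assumptions, not the ACN pipeline's actual code; the file names are hypothetical and the "reduction" step is a compression placeholder standing in for real calibration logic.

```python
# Hedged sketch: segment a set of CCD frames and process segments in parallel.
import gzip
from multiprocessing import Pool

def reduce_segment(frame_paths):
    """Process one segment of frames; here we just losslessly compress each
    frame as a placeholder for bias/dark/flat correction and cleaning."""
    results = []
    for path in frame_paths:
        with open(path, "rb") as raw, gzip.open(path + ".gz", "wb") as out:
            out.write(raw.read())
        results.append(path + ".gz")
    return results

def run_pipeline(all_frames, segment_size=100, workers=8):
    segments = [all_frames[i:i + segment_size]
                for i in range(0, len(all_frames), segment_size)]
    with Pool(processes=workers) as pool:
        return pool.map(reduce_segment, segments)

if __name__ == "__main__":
    frames = [f"frame_{i:06d}.fits" for i in range(1000)]  # hypothetical paths
    # run_pipeline(frames)  # uncomment once real frame files exist on disk
```

In a multi-institute deployment, the same segment-level parallelism would be spread over pools of workers at each participating site rather than local processes.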
Updated Panel-Method Computer Program
NASA Technical Reports Server (NTRS)
Ashby, Dale L.
1995-01-01
Panel code PMARC_12 (Panel Method Ames Research Center, version 12) computes potential-flow fields around complex three-dimensional bodies such as complete aircraft models. It contains several advanced features, including internal mathematical modeling of the flow, a time-stepping wake model for simulating either steady or unsteady motions, Trefftz-plane computation of induced drag, computation of off-body and on-body streamlines, and computation of boundary-layer parameters by a two-dimensional integral boundary-layer method along surface streamlines. Investigators interested in visual representations of phenomena may want to consider obtaining the program GVS (ARC-13361), General Visualization System. GVS is a Silicon Graphics IRIS program created to support the scientific-visualization needs of PMARC_12. GVS is available separately from COSMIC. PMARC_12 is written in standard FORTRAN 77, with the exception of the NAMELIST extension used for input.
Computer graphics and the graphic artist
NASA Technical Reports Server (NTRS)
Taylor, N. L.; Fedors, E. G.; Pinelli, T. E.
1985-01-01
A centralized computer graphics system is being developed at the NASA Langley Research Center. This system was required to satisfy multiuser needs, ranging from presentation quality graphics prepared by a graphic artist to 16-mm movie simulations generated by engineers and scientists. While the major thrust of the central graphics system was directed toward engineering and scientific applications, hardware and software capabilities to support the graphic artists were integrated into the design. This paper briefly discusses the importance of computer graphics in research; the central graphics system in terms of systems, software, and hardware requirements; the application of computer graphics to graphic arts, discussed in terms of the requirements for a graphic arts workstation; and the problems encountered in applying computer graphics to the graphic arts. The paper concludes by presenting the status of the central graphics system.
76 FR 31945 - Advanced Scientific Computing Advisory Committee
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-02
... DEPARTMENT OF ENERGY Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy... teleconference meeting of the Advanced Scientific Computing Advisory Committee (ASCAC). The Federal [email protected] . FOR FURTHER INFORMATION CONTACT: Melea Baker, Office of Advanced Scientific Computing...
76 FR 63314 - Center for Scientific Review Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-12
..., (Telephone Conference Call). Contact Person: Suzan Nadi, PhD, Scientific Review Officer, Center for... Mandarin Oriental, 1330 Maryland Avenue, SW., Washington, DC 20024. Contact Person: Mark Lindner, PhD.... Contact Person: Lilia Topol, PhD, Scientific Review Officer, Center for Scientific Review, National...
75 FR 63494 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-15
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel, Member Conflict..., (Telephone Conference Call). Contact Person: Joseph G. Rudolph, PhD, Chief and Scientific Review Officer...
From cosmos to connectomes: the evolution of data-intensive science.
Burns, Randal; Vogelstein, Joshua T; Szalay, Alexander S
2014-09-17
The analysis of data requires computation: originally by hand and more recently by computers. Different models of computing are designed and optimized for different kinds of data. In data-intensive science, the scale and complexity of data exceeds the comfort zone of local data stores on scientific workstations. Thus, cloud computing emerges as the preeminent model, utilizing data centers and high-performance clusters, enabling remote users to access and query subsets of the data efficiently. We examine how data-intensive computational systems originally built for cosmology, the Sloan Digital Sky Survey (SDSS), are now being used in connectomics, at the Open Connectome Project. We list lessons learned and outline the top challenges we expect to face. Success in computational connectomics would drastically reduce the time between idea and discovery, as SDSS did in cosmology.
78 FR 4419 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-22
...: Center for Scientific Review Special Emphasis Panel, Biomedical Imaging and Engineering Area Review. Date... . Name of Committee: Center for Scientific Review Special Emphasis Panel, Member Conflict: Nanotechnology...
77 FR 26771 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-07
[email protected] . Name of Committee: Center for Scientific Review Special Emphasis Panel; Motor... . Name of Committee: Center for Scientific Review Special Emphasis Panel; Member Conflict: Obesity...
75 FR 32958 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-10
...: Center for Scientific Review Special Emphasis Panel; Member Conflicts: Topics in Infectious Diseases and....gov . Name of Committee: Center for Scientific Review Special Emphasis Panel; Fellowships: Immunology...
75 FR 9887 - Advanced Scientific Computing Advisory Committee
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-04
... DEPARTMENT OF ENERGY Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy... Advanced Scientific Computing Advisory Committee (ASCAC). Federal Advisory Committee Act (Pub. L. 92-463... INFORMATION CONTACT: Melea Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building...
76 FR 9765 - Advanced Scientific Computing Advisory Committee
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-22
... DEPARTMENT OF ENERGY Advanced Scientific Computing Advisory Committee AGENCY: Office of Science... Advanced Scientific Computing Advisory Committee (ASCAC). The Federal Advisory Committee Act (Pub. L. 92... INFORMATION CONTACT: Melea Baker, Office of Advanced Scientific Computing Research, SC-21/Germantown Building...
77 FR 45345 - DOE/Advanced Scientific Computing Advisory Committee
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-31
... Recompetition results for Scientific Discovery through Advanced Computing (SciDAC) applications Co-design Public... DEPARTMENT OF ENERGY DOE/Advanced Scientific Computing Advisory Committee AGENCY: Office of... the Advanced Scientific Computing Advisory Committee (ASCAC). The Federal Advisory Committee Act (Pub...
75 FR 64720 - DOE/Advanced Scientific Computing Advisory Committee
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-20
... DEPARTMENT OF ENERGY DOE/Advanced Scientific Computing Advisory Committee AGENCY: Department of... the Advanced Scientific Computing Advisory Committee (ASCAC). Federal Advisory Committee Act (Pub. L.... FOR FURTHER INFORMATION CONTACT: Melea Baker, Office of Advanced Scientific Computing Research; SC-21...
Computing through Scientific Abstractions in SysBioPSE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chin, George; Stephan, Eric G.; Gracio, Deborah K.
2004-10-13
Today, biologists and bioinformaticists have a tremendous amount of computational power at their disposal. With the availability of supercomputers, burgeoning scientific databases and digital libraries such as GenBank and PubMed, and pervasive computational environments such as the Grid, biologists have access to a wealth of computational capabilities and scientific data at hand. Yet, the rapid development of computational technologies has far exceeded the typical biologist’s ability to effectively apply the technology in their research. Computational sciences research and development efforts such as the Biology Workbench, BioSPICE (Biological Simulation Program for Intra-Cellular Evaluation), and BioCoRE (Biological Collaborative Research Environment) are important in connecting biologists and their scientific problems to computational infrastructures. On the Computational Cell Environment and Heuristic Entity-Relationship Building Environment projects at the Pacific Northwest National Laboratory, we are jointly developing a new breed of scientific problem solving environment called SysBioPSE that will allow biologists to access and apply computational resources in the scientific research context. In contrast to other computational science environments, SysBioPSE operates as an abstraction layer above a computational infrastructure. The goal of SysBioPSE is to allow biologists to apply computational resources in the context of the scientific problems they are addressing and the scientific perspectives from which they conduct their research. More specifically, SysBioPSE allows biologists to capture and represent scientific concepts and theories and experimental processes, and to link these views to scientific applications, data repositories, and computer systems.
Software Innovations Speed Scientific Computing
NASA Technical Reports Server (NTRS)
2012-01-01
To help reduce the time needed to analyze data from missions like those studying the Sun, Goddard Space Flight Center awarded SBIR funding to Tech-X Corporation of Boulder, Colorado. That work led to commercial technologies that help scientists accelerate their data analysis tasks. Thanks to its NASA work, the company doubled its number of headquarters employees to 70 and generated about $190,000 in revenue from its NASA-derived products.
The Voyager Spacecraft. [Jupiter-Saturn mission investigations]
NASA Technical Reports Server (NTRS)
1979-01-01
The configuration of the Voyager spacecraft is described as well as the subsystems for power, temperature control, attitude control, and propulsion. Major features of Jupiter and Saturn including their atmospheres, surfaces, and natural satellites are discussed. The 13 onboard experiments and their scientific objectives are explained. Other aspects covered include tracking, data acquisition, and the mission control and computing center. Members of the Voyager team and subcontractors are listed.
Nonlinear Feedback Controllers and Compensators: A State-Dependent Riccati Equation Approach
2003-01-01
H. T. Banks, B. M. Lewis, H. T. Tran, Department of Mathematics, Center for Research in Scientific Computation, North Carolina State University, Raleigh, NC 27695. Abstract: State-dependent Riccati equation ... estimating the solution of the Hamilton-Jacobi-Bellman (HJB) equation can be found in a comprehensive review article [5].
NASA Technical Reports Server (NTRS)
Evans, A. B.; Lee, L. L.
1985-01-01
This User Guide provides a general introduction to the structure, use, and handling of magnetic tapes at Langley Research Center (LaRC). The topics covered are tape terminology, physical characteristics, error prevention and detection, and creating, using, and maintaining tapes. Supplementary documentation is referenced where it might be helpful. The documentation is included for the tape utility programs, BLOCK, UNBLOCK, and TAPEDMP, which are available at the Central Scientific Computing Complex at LaRC.
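As a rough illustration of the record blocking and deblocking those tape utilities perform (a generic sketch, not the actual BLOCK/UNBLOCK/TAPEDMP code from the Central Scientific Computing Complex), fixed-length logical records can be packed into larger physical blocks before writing and unpacked on the way back:

```python
def block_records(records, records_per_block):
    """Pack fixed-length logical records into larger physical blocks."""
    blocks = []
    for i in range(0, len(records), records_per_block):
        blocks.append(b"".join(records[i:i + records_per_block]))
    return blocks

def unblock_records(blocks, record_length):
    """Split physical blocks back into fixed-length logical records."""
    records = []
    for block in blocks:
        for offset in range(0, len(block), record_length):
            records.append(block[offset:offset + record_length])
    return records

# Example: 80-byte card-image records, 10 records per physical block.
cards = [f"{i:080d}".encode() for i in range(25)]
blocks = block_records(cards, records_per_block=10)
assert unblock_records(blocks, record_length=80) == cards
```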
Fourier spectroscopy with a one-million-point transformation
NASA Technical Reports Server (NTRS)
Connes, J.; Delouis, H.; Connes, P.; Guelachvili, G.; Maillard, J.; Michel, G.
1972-01-01
A new type of interferometer for use in Fourier spectroscopy has been devised at the Aime Cotton Laboratory of the National Center for Scientific Research (CNRS), Orsay, France. With this interferometer and newly developed computational techniques, interferograms comprising as many as one million samples can now be transformed. The techniques are described, and examples of spectra of thorium and holmium, derived from one million-point interferograms, are presented.
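As a modern-day illustration of the transformation step described above (not the CNRS techniques themselves), a one-million-sample interferogram can be turned into a spectrum with a fast Fourier transform; the sampling interval and line positions below are arbitrary:

```python
import numpy as np

n_samples = 1_000_000          # one-million-point interferogram
dx = 5.0e-5                    # arbitrary optical path step (cm)
x = np.arange(n_samples) * dx

# Synthetic interferogram: a few cosine components standing in for spectral lines.
wavenumbers = [5000.0, 5200.0, 5350.0]   # arbitrary line positions (cm^-1)
interferogram = sum(np.cos(2 * np.pi * k * x) for k in wavenumbers)
interferogram *= np.hanning(n_samples)   # apodization before transforming

# The spectrum is the magnitude of the real Fourier transform of the interferogram.
spectrum = np.abs(np.fft.rfft(interferogram))
sigma = np.fft.rfftfreq(n_samples, d=dx)  # wavenumber axis (cm^-1)

peak = sigma[np.argmax(spectrum)]
print(f"strongest recovered line near {peak:.1f} cm^-1")
```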
What’s Wrong With Automatic Speech Recognition (ASR) and How Can We Fix It?
2013-03-01
Jordan Cohen, International Computer Science Institute, 1947 Center Street, Suite 600, Berkeley, CA 94704. March 2013, Final Report. ... This report was cleared for public release by the 88th Air Base Wing Public Affairs Office and is available to the general public, including foreign ... 711th Human Performance Wing, Air Force Research Laboratory. This report is published in the interest of scientific and technical ...
[Activities of Research Institute for Advanced Computer Science
NASA Technical Reports Server (NTRS)
Gross, Anthony R. (Technical Monitor); Leiner, Barry M.
2001-01-01
The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center, Moffett Field, California. RIACS research focuses on the three cornerstones of IT research necessary to meet the future challenges of NASA missions: 1. Automated Reasoning for Autonomous Systems: Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. 2. Human-Centered Computing: Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities. 3. High Performance Computing and Networking: Advances in the performance of computing and networking continue to have a major impact on a variety of NASA endeavors, ranging from modeling and simulation to analysis of large scientific datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply IT research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, visiting scientist programs and student summer programs, designed to encourage and facilitate collaboration between the university and NASA IT research communities.
Comparative case study of two biomedical research collaboratories.
Schleyer, Titus K L; Teasley, Stephanie D; Bhatnagar, Rishi
2005-10-25
Working together efficiently and effectively presents a significant challenge in large-scale, complex, interdisciplinary research projects. Collaboratories are a nascent method to help meet this challenge. However, formal collaboratories in biomedical research centers are the exception rather than the rule. The main purpose of this paper is to compare and describe two collaboratories that used off-the-shelf tools and relatively modest resources to support the scientific activity of two biomedical research centers. The two centers were the Great Lakes Regional Center for AIDS Research (HIV/AIDS Center) and the New York University Oral Cancer Research for Adolescent and Adult Health Promotion Center (Oral Cancer Center). In each collaboratory, we used semistructured interviews, surveys, and contextual inquiry to assess user needs and define the technology requirements. We evaluated and selected commercial software applications by comparing their feature sets with requirements and then pilot-testing the applications. Local and remote support staff cooperated in the implementation and end user training for the collaborative tools. Collaboratory staff evaluated each implementation by analyzing utilization data, administering user surveys, and functioning as participant observers. The HIV/AIDS Center primarily required real-time interaction for developing projects and attracting new participants to the center; the Oral Cancer Center, on the other hand, mainly needed tools to support distributed and asynchronous work in small research groups. The HIV/AIDS Center's collaboratory included a center-wide website that also served as the launch point for collaboratory applications, such as NetMeeting, Timbuktu Conference, PlaceWare Auditorium, and iVisit. The collaboratory of the Oral Cancer Center used Groove and Genesys Web conferencing. The HIV/AIDS Center was successful in attracting new scientists to HIV/AIDS research, and members used the collaboratory for developing and implementing new research studies. The Oral Cancer Center successfully supported highly distributed and asynchronous research, and the collaboratory facilitated real-time interaction for analyzing data and preparing publications. The two collaboratory implementations demonstrated the feasibility of supporting biomedical research centers using off-the-shelf commercial tools, but they also identified several barriers to successful collaboration. These barriers included computing platform incompatibilities, network infrastructure complexity, variable availability of local versus remote IT support, low computer and collaborative software literacy, and insufficient maturity of available collaborative software. Factors enabling collaboratory use included collaboration incentives through funding mechanism, a collaborative versus competitive relationship of researchers, leadership by example, and tools well matched to tasks and technical progress. Integrating electronic collaborative tools into routine scientific practice can be successful but requires further research on the technical, social, and behavioral factors influencing the adoption and use of collaboratories.
75 FR 43518 - Advanced Scientific Computing Advisory Committee; Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-26
... DEPARTMENT OF ENERGY Advanced Scientific Computing Advisory Committee; Meeting AGENCY: Office of... Scientific Computing Advisory Committee (ASCAC). Federal Advisory Committee Act (Pub. L. 92-463, 86 Stat. 770...: Melea Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U. S...
78 FR 7791 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-04
... . Name of Committee: Center for Scientific Review Special Emphasis Panel; Fellowship: Immunology. Date...: Center for Scientific Review Special Emphasis Panel; AREA: Immunology. Date: March 1, 2013. Time: 10:00 a...
78 FR 12071 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-21
[email protected] . Name of Committee: Center for Scientific Review Special Emphasis Panel; Stem Cell...: Center for Scientific Review Special Emphasis Panel; Small Business: Skeletal Muscle. Date: March 14...
76 FR 9354 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-17
...: Center for Scientific Review Special Emphasis Panel; Member Conflict: Cancer Biology and Therapy. Date... (Telephone Conference Call). Contact Person: Fouad A. El-Zaatari, PhD, Scientific Review Officer, Center for...
75 FR 39548 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-09
...: Center for Scientific Review Special Emphasis Panel; Program Project: NeuroAIDS. Date: August 4-5, 2010... Committee: Center for Scientific Review Special Emphasis Panel; Member Conflict: AIDS Molecular Biology and...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-12
... DEPARTMENT OF HEALTH AND HUMAN SERVICES Centers for Disease Control and Prevention World Trade... Prevention (CDC), announces the establishment of the World Trade Center (WTC) Health Program Scientific..., Designated Federal Officer, World Trade Center Health Program Scientific/Technical Advisory Committee...
Towards prediction of correlated material properties using quantum Monte Carlo methods
NASA Astrophysics Data System (ADS)
Wagner, Lucas
Correlated electron systems offer a richness of physics far beyond noninteracting systems. If we would like to pursue the dream of designer correlated materials, or, even to set a more modest goal, to explain in detail the properties and effective physics of known materials, then accurate simulation methods are required. Using modern computational resources, quantum Monte Carlo (QMC) techniques offer a way to directly simulate electron correlations. I will show some recent results on a few extremely challenging materials including the metal-insulator transition of VO2, the ground state of the doped cuprates, and the pressure dependence of magnetic properties in FeSe. By using a relatively simple implementation of QMC, at least some properties of these materials can be described truly from first principles, without any adjustable parameters. Using the QMC platform, we have developed a way of systematically deriving effective lattice models from the simulation. This procedure is particularly attractive for correlated electron systems because the QMC methods treat the one-body and many-body components of the wave function and Hamiltonian on completely equal footing. I will show some examples of using this downfolding technique and the high accuracy of QMC to connect our intuitive ideas about interacting electron systems with high fidelity simulations. The work in this presentation was supported in part by NSF DMR 1206242, the U.S. Department of Energy, Office of Science, Office of Advanced Scientific Computing Research, Scientific Discovery through Advanced Computing (SciDAC) program under Award Number FG02-12ER46875, and the Center for Emergent Superconductivity, Department of Energy Frontier Research Center under Grant No. DEAC0298CH1088. Computing resources were provided by a Blue Waters Illinois grant and INCITE PhotSuper and SuperMatSim allocations.
NASA Astrophysics Data System (ADS)
Memon, Shahbaz; Vallot, Dorothée; Zwinger, Thomas; Neukirchen, Helmut
2017-04-01
Scientific communities generate complex simulations through orchestration of semi-structured analysis pipelines, which involves the execution of large workflows on multiple, distributed and heterogeneous computing and data resources. Modeling the ice dynamics of glaciers requires workflows consisting of many non-trivial, computationally expensive processing tasks which are coupled to each other. From this domain, we present an e-Science use case, a workflow, which requires the execution of a continuum ice flow model and a discrete-element-based calving model in an iterative manner. Apart from the execution, this workflow also contains data format conversion tasks that support the execution of ice flow and calving by means of transition through sequential, nested and iterative steps. Thus, the management and monitoring of all the processing tasks, including data management and transfer, becomes more complex. From the implementation perspective, this workflow model was initially developed as a set of scripts using static data input and output references. As more scripts and modifications were introduced to meet user requirements, debugging and validation of results became increasingly cumbersome. To address these problems, we identified the need for a high-level scientific workflow tool through which all of the above processes can be managed in an efficient and usable manner. We decided to make use of the e-Science middleware UNICORE (Uniform Interface to Computing Resources), which allows seamless and automated access to heterogeneous and distributed resources and is supported by a scientific workflow engine. Based on this, we developed a high-level scientific workflow model for coupling massively parallel High-Performance Computing (HPC) jobs: a continuum ice sheet model (Elmer/Ice) and a discrete element calving and crevassing model (HiDEM). In our talk we present how the use of high-level scientific workflow middleware makes reproducibility of results more convenient and also provides a reusable and portable workflow template that can be deployed across different computing infrastructures. Acknowledgements: This work was kindly supported by NordForsk as part of the Nordic Center of Excellence (NCoE) eSTICC (eScience Tools for Investigating Climate Change at High Northern Latitudes) and the Top-level Research Initiative NCoE SVALI (Stability and Variation of Arctic Land Ice).
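To make the coupling pattern concrete, the following is a minimal sketch (not the authors' UNICORE workflow) of the iterative driver such a pipeline implies: run the continuum ice-flow step, convert its output for the calving model, run the calving step, convert back, and repeat. The executable names, file names, and conversion helpers are hypothetical placeholders.

```python
import subprocess
from pathlib import Path

def run(cmd):
    """Run one workflow task and fail loudly if it does not succeed."""
    subprocess.run(cmd, check=True)

def couple_ice_flow_and_calving(n_cycles, workdir="run"):
    """Drive the iterative ice-flow / calving coupling for n_cycles."""
    work = Path(workdir)
    work.mkdir(exist_ok=True)
    geometry = work / "initial_geometry.nc"          # assumed starting input
    for cycle in range(n_cycles):
        flow_out = work / f"flow_{cycle}.result"
        calving_in = work / f"calving_{cycle}.in"
        calving_out = work / f"calving_{cycle}.out"
        # 1. Continuum ice-flow step (placeholder command line).
        run(["ice_flow_model", "--geometry", str(geometry), "--out", str(flow_out)])
        # 2. Convert continuum output to the discrete-element input format.
        run(["convert_flow_to_dem", str(flow_out), str(calving_in)])
        # 3. Discrete-element calving/crevassing step (placeholder).
        run(["calving_model", str(calving_in), "--out", str(calving_out)])
        # 4. Convert the calved geometry back for the next flow step.
        geometry = work / f"geometry_{cycle + 1}.nc"
        run(["convert_dem_to_flow", str(calving_out), str(geometry)])
    return geometry

if __name__ == "__main__":
    couple_ice_flow_and_calving(n_cycles=3)
```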
NASA Astrophysics Data System (ADS)
Callaghan, S.; Maechling, P. J.; Juve, G.; Vahi, K.; Deelman, E.; Jordan, T. H.
2015-12-01
The CyberShake computational platform, developed by the Southern California Earthquake Center (SCEC), is an integrated collection of scientific software and middleware that performs 3D physics-based probabilistic seismic hazard analysis (PSHA) for Southern California. CyberShake integrates large-scale and high-throughput research codes to produce probabilistic seismic hazard curves for individual locations of interest and hazard maps for an entire region. A recent CyberShake calculation produced about 500,000 two-component seismograms for each of 336 locations, resulting in over 300 million synthetic seismograms in a Los Angeles-area probabilistic seismic hazard model. CyberShake calculations require a series of scientific software programs. Early computational stages produce data used as inputs by later stages, so we describe CyberShake calculations using a workflow definition language. Scientific workflow tools automate and manage the input and output data and enable remote job execution on large-scale HPC systems. To satisfy the requests of broad impact users of CyberShake data, such as seismologists, utility companies, and building code engineers, we successfully completed CyberShake Study 15.4 in April and May 2015, calculating a 1 Hz urban seismic hazard map for Los Angeles. We distributed the calculation between the NSF Track 1 system NCSA Blue Waters, the DOE Leadership-class system OLCF Titan, and USC's Center for High Performance Computing. This study ran for over 5 weeks, burning about 1.1 million node-hours and producing over half a petabyte of data. The CyberShake Study 15.4 results doubled the maximum simulated seismic frequency from 0.5 Hz to 1.0 Hz as compared to previous studies, representing a factor of 16 increase in computational complexity. We will describe how our workflow tools supported splitting the calculation across multiple systems. We will explain how we modified CyberShake software components, including GPU implementations and migrating from file-based communication to MPI messaging, to greatly reduce the I/O demands and node-hour requirements of CyberShake. We will also present performance metrics from CyberShake Study 15.4, and discuss challenges that producers of Big Data on open-science HPC resources face moving forward.
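To illustrate what a probabilistic seismic hazard curve is (a generic sketch, not the CyberShake codes), per-rupture ground-motion exceedance probabilities can be combined with rupture rates to get the annual rate at which each shaking level is exceeded at a site; the rupture list and lognormal ground-motion parameters below are made up.

```python
import numpy as np
from scipy.stats import lognorm

# Hypothetical rupture set: (annual rate, median spectral acceleration in g, log-std).
ruptures = [
    (0.010, 0.05, 0.6),
    (0.004, 0.15, 0.6),
    (0.001, 0.40, 0.7),
]

# Shaking levels (g) at which to evaluate the hazard curve.
levels = np.logspace(-2, 0, 50)

# Annual exceedance rate: sum over ruptures of rate * P(SA > level | rupture).
annual_rate = np.zeros_like(levels)
for rate, median, sigma in ruptures:
    exceed_prob = lognorm.sf(levels, s=sigma, scale=median)
    annual_rate += rate * exceed_prob

# Probability of exceedance in 50 years, assuming a Poisson occurrence model.
poe_50yr = 1.0 - np.exp(-annual_rate * 50.0)
for lvl, p in zip(levels[::10], poe_50yr[::10]):
    print(f"SA {lvl:6.3f} g : {p:.3%} chance of exceedance in 50 years")
```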
Large Scale Computing and Storage Requirements for High Energy Physics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gerber, Richard A.; Wasserman, Harvey
2010-11-24
The National Energy Research Scientific Computing Center (NERSC) is the leading scientific computing facility for the Department of Energy's Office of Science, providing high-performance computing (HPC) resources to more than 3,000 researchers working on about 400 projects. NERSC provides large-scale computing resources and, crucially, the support and expertise needed for scientists to make effective use of them. In November 2009, NERSC, DOE's Office of Advanced Scientific Computing Research (ASCR), and DOE's Office of High Energy Physics (HEP) held a workshop to characterize the HPC resources needed at NERSC to support HEP research through the next three to five years. The effort is part of NERSC's legacy of anticipating users' needs and deploying resources to meet those demands. The workshop revealed several key points, in addition to achieving its goal of collecting and characterizing computing requirements. The chief findings: (1) Science teams need access to a significant increase in computational resources to meet their research goals; (2) Research teams need to be able to read, write, transfer, store online, archive, analyze, and share huge volumes of data; (3) Science teams need guidance and support to implement their codes on future architectures; and (4) Projects need predictable, rapid turnaround of their computational jobs to meet mission-critical time constraints. This report expands upon these key points and includes others. It also presents a number of case studies as representative of the research conducted within HEP. Workshop participants were asked to codify their requirements in this case study format, summarizing their science goals, methods of solution, current and three-to-five year computing requirements, and software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, multi-core environment that is expected to dominate HPC architectures over the next few years. The report includes a section that describes efforts already underway or planned at NERSC that address requirements collected at the workshop. NERSC has many initiatives in progress that address key workshop findings and are aligned with NERSC's strategic plans.
NASA Astrophysics Data System (ADS)
de Groot, R.
2008-12-01
The Southern California Earthquake Center (SCEC) has been developing groundbreaking computer modeling capabilities for studying earthquakes. These visualizations were initially shared within the scientific community but have recently gained visibility via television news coverage in Southern California. Computers have opened up a whole new world for scientists working with large data sets, and students can benefit from the same opportunities (Libarkin & Brick, 2002). For example, The Great Southern California ShakeOut was based on a potential magnitude 7.8 earthquake on the southern San Andreas fault. The visualization created for the ShakeOut was a key scientific and communication tool for the earthquake drill. This presentation will also feature SCEC Virtual Display of Objects visualization software developed by SCEC Undergraduate Studies in Earthquake Information Technology interns. According to Gordin and Pea (1995), theoretically visualization should make science accessible, provide means for authentic inquiry, and lay the groundwork to understand and critique scientific issues. This presentation will discuss how the new SCEC visualizations and other earthquake imagery achieve these results, how they fit within the context of major themes and study areas in science communication, and how the efficacy of these tools can be improved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zunger, Alex
"Inverse Design: Playing 'Jeopardy' in Materials Science" was submitted by the Center for Inverse Design (CID) to the "Life at the Frontiers of Energy Research" video contest at the 2011 Science for Our Nation's Energy Future: Energy Frontier Research Centers (EFRCs) Summit and Forum. Twenty-six EFRCs created short videos to highlight their mission and their work. CID, an EFRC directed by Bill Tumas at the National Renewable Energy Laboratory is a partnership of scientists from six institutions: NREL (lead), Northwestern University, University of Colorado, Colorado School of Mines, Stanford University, and Oregon State University. The Office of Basic Energy Sciencesmore » in the U.S. Department of Energy's Office of Science established the 46 Energy Frontier Research Centers (EFRCs) in 2009. These collaboratively-organized centers conduct fundamental research focused on 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The overall purpose is to accelerate scientific progress toward meeting the nation's critical energy challenges. The mission of the Center for Inverse Design is 'to replace trial-and-error methods used in the development of materials for solar energy conversion with an inverse design approach powered by theory and computation.' Research topics are: solar photovoltaic, photonic, metamaterial, defects, spin dynamics, matter by design, novel materials synthesis, and defect tolerant materials.« less
76 FR 25701 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-05
... Person: David L Williams, PhD, Scientific Review Officer, Center for Scientific Review, National... Drive, Bethesda, MD 20892 (Virtual Meeting). Contact Person: Christine L. Melchior, PhD, Scientific...
Web Services Provide Access to SCEC Scientific Research Application Software
NASA Astrophysics Data System (ADS)
Gupta, N.; Gupta, V.; Okaya, D.; Kamb, L.; Maechling, P.
2003-12-01
Web services offer scientific communities a new paradigm for sharing research codes and communicating results. While there are formal technical definitions of what constitutes a web service, for a user community such as the Southern California Earthquake Center (SCEC), we may conceptually consider a web service to be functionality provided on-demand by an application which is run on a remote computer located elsewhere on the Internet. The value of a web service is that it can (1) run a scientific code without the user needing to install and learn the intricacies of running the code; (2) provide the technical framework which allows a user's computer to talk to the remote computer which performs the service; (3) provide the computational resources to run the code; and (4) bundle several analysis steps and provide the end results in digital or (post-processed) graphical form. Within an NSF-sponsored ITR project coordinated by SCEC, we are constructing web services using architectural protocols and programming languages (e.g., Java). However, because the SCEC community has a rich pool of scientific research software (written in traditional languages such as C and FORTRAN), we also emphasize making existing scientific codes available by constructing web service frameworks which wrap around and directly run these codes. In doing so we attempt to broaden community usage of these codes. Web service wrapping of a scientific code can be done using a "web servlet" construction or by using a SOAP/WSDL-based framework. This latter approach is widely adopted in IT circles although it is subject to rapid evolution. Our wrapping framework attempts to "honor" the original codes with as little modification as is possible. For versatility we identify three methods of user access: (A) a web-based GUI (written in HTML and/or Java applets); (B) a Linux/OSX/UNIX command line "initiator" utility (shell-scriptable); and (C) direct access from within any Java application (and with the correct API interface from within C++ and/or C/Fortran). This poster presentation will provide descriptions of the following selected web services and their origin as scientific application codes: 3D community velocity models for Southern California, geocoordinate conversions (latitude/longitude to UTM), execution of GMT graphical scripts, data format conversions (Gocad to Matlab format), and implementation of Seismic Hazard Analysis application programs that calculate hazard curve and hazard map data sets.
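As a minimal illustration of the "wrapping" idea (a generic sketch using only the Python standard library, not the SCEC framework), an existing command-line scientific code can be exposed over HTTP so a remote user never has to install or run it locally; the executable name and its parameters are hypothetical:

```python
import json
import subprocess
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

class CodeWrapper(BaseHTTPRequestHandler):
    """Expose a command-line scientific code as a simple web service."""

    def do_GET(self):
        query = parse_qs(urlparse(self.path).query)
        lat = query.get("lat", ["34.05"])[0]
        lon = query.get("lon", ["-118.25"])[0]
        try:
            # Run the existing (hypothetical) scientific code with user parameters.
            result = subprocess.run(
                ["velocity_model", "--lat", lat, "--lon", lon],
                capture_output=True, text=True,
            )
            payload = {"returncode": result.returncode,
                       "stdout": result.stdout, "stderr": result.stderr}
        except FileNotFoundError:
            payload = {"error": "scientific code not installed on this server"}
        body = json.dumps(payload).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # A client would then call e.g. http://host:8000/?lat=34.05&lon=-118.25
    HTTPServer(("", 8000), CodeWrapper).serve_forever()
```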
Education through the prism of computation
NASA Astrophysics Data System (ADS)
Kaurov, Vitaliy
2014-03-01
With the rapid development of technology, computation claims its irrevocable place among the research components of modern science. Thus, to foster a successful future scientist, engineer or educator, we need to add computation to the foundations of scientific education. We will discuss what type of paradigm shifts it brings to these foundations using the example of the Wolfram Science Summer School. It is one of the most advanced computational outreach programs run by the Wolfram Foundation, welcoming participants of almost all ages and backgrounds. Centered on complexity science and physics, it also covers numerous adjacent and interdisciplinary fields such as finance, biology, medicine and even music. We will talk about educational and research experiences in this program during the 12 years of its existence. We will review statistics and outputs the program has produced. Among these are interactive electronic publications at the Wolfram Demonstrations Project and contributions to the computational knowledge engine Wolfram|Alpha.
Transmedia Storytelling in Science Communication: One Subject, Multiple Media, Multiple Stories
NASA Astrophysics Data System (ADS)
Unger, M.; Moloney, K.
2012-12-01
Each communication medium has particular storytelling strengths. For example, video is particularly good at illustrating a progression of events, text at background and context, and games at describing systems. In what USC's Prof. Henry Jenkins described as "transmedia storytelling," multiple media are used simultaneously, in an expansive rather than repetitive way, to better tell a single, complex story. The audience is given multiple entry points to the story, and the story is exposed to diverse and dispersed audiences, ultimately engaging a broader public. We will examine the effectiveness of a transmedia approach to communicating scientific and other complex concepts to a broad and diverse audience. Using the recently developed Educational Visitor Center at the NCAR-Wyoming Supercomputing Center as a case study, we will evaluate the reach of various means of presenting information about the geosciences, climate change and computational science. These will include an assessment of video, mechanical and digital interactive elements, animated movie segments, web-based content, photography, scientific visualizations, printed material and docent-led activities.
View From Camera Not Used During Curiosity's First Six Months on Mars
2017-12-08
This view of Curiosity's left-front and left-center wheels and of marks made by wheels on the ground in the "Yellowknife Bay" area comes from one of six cameras used on Mars for the first time more than six months after the rover landed. The left Navigation Camera (Navcam) linked to Curiosity's B-side computer took this image during the 223rd Martian day, or sol, of Curiosity's work on Mars (March 22, 2013). The wheels are 20 inches (50 centimeters) in diameter. Curiosity carries a pair of main computers, redundant to each other, in order to have a backup available if one fails. Each of the computers, A-side and B-side, also has other redundant subsystems linked to just that computer. Curiosity operated on its A-side from before the August 2012 landing until Feb. 28, when engineers commanded a switch to the B-side in response to a memory glitch on the A-side. One set of activities after switching to the B-side computer has been to check the six engineering cameras that are hard-linked to that computer. The rover's science instruments, including five science cameras, can each be operated by either the A-side or B-side computer, whichever is active. However, each of Curiosity's 12 engineering cameras is linked to just one of the computers. The engineering cameras are the Navigation Camera (Navcam), the Front Hazard-Avoidance Camera (Front Hazcam) and Rear Hazard-Avoidance Camera (Rear Hazcam). Each of those three named cameras has four cameras as part of it: two stereo pairs of cameras, with one pair linked to each computer. Only the pairs linked to the active computer can be used, and the A-side computer was active from before landing, in August, until Feb. 28. All six of the B-side engineering cameras have been used during March 2013 and checked out OK. Image Credit: NASA/JPL-Caltech
Computer Center CDC Libraries/NSRDC (Subprograms).
1981-02-01
... Transform," Comm. of the ACM, Vol. 10, No. 10, October 1967. 3. System/360 Scientific Subroutine Package, IBM Technical Publications Department, 1967. ... (3) up to 9 dependent variables per plot. Functional categories: J5. Language: FORTRAN IV. Usage: COMMON /PLO/ NRUN, NPLOT, ITP(6), ITY(6), ITX(6), where NRUN = number of this run (default: 1), NPLOT = number of plot (default: 1), ITP = page title (default: blank), ITY = Y title (default: blank), ITX = X title ...
77 FR 25487 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-30
... (Virtual Meeting). Contact Person: Ai-Ping Zou, MD, Ph.D., Scientific Review Officer, Center for Scientific... Scientific Review Special Emphasis Panel; Cancer Therapeutics AREA Grant Applications. Date: May 24, 2012...
Deelman, E.; Callaghan, S.; Field, E.; Francoeur, H.; Graves, R.; Gupta, N.; Gupta, V.; Jordan, T.H.; Kesselman, C.; Maechling, P.; Mehringer, J.; Mehta, G.; Okaya, D.; Vahi, K.; Zhao, L.
2006-01-01
This paper discusses the process of building an environment where large-scale, complex, scientific analysis can be scheduled onto a heterogeneous collection of computational and storage resources. The example application is the Southern California Earthquake Center (SCEC) CyberShake project, an analysis designed to compute probabilistic seismic hazard curves for sites in the Los Angeles area. We explain which software tools were used to build the system and describe their functionality and interactions. We show the results of running the CyberShake analysis that included over 250,000 jobs using resources available through SCEC and the TeraGrid. © 2006 IEEE.
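As a schematic of what such workflow tools automate (a toy sketch, not Pegasus or the actual CyberShake system), tasks and their data dependencies can be expressed as a directed acyclic graph and executed in dependency order; the task names below are illustrative.

```python
from graphlib import TopologicalSorter

# Toy CyberShake-like dependency graph: each task lists the tasks it needs first.
workflow = {
    "velocity_mesh": [],
    "strain_green_tensors": ["velocity_mesh"],
    "synthetic_seismograms": ["strain_green_tensors"],
    "intensity_measures": ["synthetic_seismograms"],
    "hazard_curve": ["intensity_measures"],
}

def run_task(name):
    # Stand-in for submitting the real job to a cluster or grid resource.
    print(f"running {name}")

# Execute every task after all of its prerequisites have completed.
for task in TopologicalSorter(workflow).static_order():
    run_task(task)
```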
2017-12-08
Oct. 29, 2012 – A day before landfall, Sandy intensified into a Category 2 superstorm nearly 1,000 miles wide. Credit: NASA's Goddard Space Flight Center and NASA Center for Climate Simulation Video and images courtesy of NASA/GSFC/William Putman -- A NASA computer model simulates the astonishing track and forceful winds of Hurricane Sandy. Hurricane Sandy pummeled the East Coast late in 2012’s Atlantic hurricane season, causing 159 deaths and $70 billion in damages. Days before landfall, forecasts of its trajectory were still being made. Some computer models showed that a trough in the jet stream would kick the monster storm away from land and out to sea. Among the earliest to predict its true course was NASA’s GEOS-5 global atmosphere model. The model works by dividing Earth’s atmosphere into a virtual grid of stacked boxes. A supercomputer then solves mathematical equations inside each box to create a weather forecast predicting Sandy’s structure, path and other traits. The NASA model not only produced an accurate track of Sandy, but also captured fine-scale details of the storm’s changing intensity and winds. Watch the video to see it for yourself. For more information, please visit: gmao.gsfc.nasa.gov/research/atmosphericassim/tracking_hur...
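The "grid of stacked boxes" idea can be illustrated with a toy one-dimensional advection step (purely illustrative, with no relation to the actual GEOS-5 equations or resolution): each grid box is updated from its neighbors at every time step.

```python
import numpy as np

nx = 100                 # number of grid boxes along one line of the domain
dx = 1.0                 # box width (arbitrary units)
dt = 0.5                 # time step chosen so that u * dt / dx < 1 for stability
u = 1.0                  # constant wind speed

# Initial "weather" field: a single smooth bump of, say, humidity.
x = np.arange(nx) * dx
q = np.exp(-0.5 * ((x - 20.0) / 3.0) ** 2)

# Upwind finite-difference advection: every box is updated from its upwind neighbor.
for step in range(100):
    q = q - u * dt / dx * (q - np.roll(q, 1))

print("bump has moved to x =", x[np.argmax(q)])
```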
National Laboratory for Advanced Scientific Visualization at UNAM - Mexico
NASA Astrophysics Data System (ADS)
Manea, Marina; Constantin Manea, Vlad; Varela, Alfredo
2016-04-01
In 2015, the National Autonomous University of Mexico (UNAM) joined the family of universities and research centers where advanced visualization and computing play a key role in promoting and advancing missions in research, education, community outreach, and business-oriented consulting. This initiative provides access to a great variety of advanced hardware and software resources and offers a range of consulting services that spans a variety of areas related to scientific visualization, among which are: neuroanatomy, embryonic development, genome-related studies, geosciences, geography, and physics- and mathematics-related disciplines. The National Laboratory for Advanced Scientific Visualization delivers services through three main infrastructure environments: the 3D fully immersive display system Cave, the high-resolution parallel visualization system Powerwall, and the high-resolution spherical display Earth Simulator. The entire visualization infrastructure is interconnected to a high-performance computing cluster (HPCC) called ADA, in honor of Ada Lovelace, considered to be the first computer programmer. The Cave is an extra-large room, 3.6 m wide, with images projected on the front, left, and right walls as well as the floor. Specialized LCD-shutter glasses provide strong stereo depth perception, and a variety of tracking devices allow software to track the position of a user's hand, head, and wand. The Powerwall is designed to bring large amounts of complex data together through parallel computing for team interaction and collaboration. This system is composed of 24 (6x4) high-resolution, ultra-thin (2 mm) bezel monitors connected to a high-performance GPU cluster. The Earth Simulator is a large (60") high-resolution spherical display used for visualizing global-scale data such as geophysical, meteorological, climate, and ecological data. The HPCC ADA is a 1000+ computing-core system that offers parallel computing resources to applications requiring large amounts of memory as well as large and fast parallel storage systems. The temperature of the entire system is controlled by an energy- and space-efficient cooling solution based on large rear-door liquid-cooled heat exchangers. This state-of-the-art infrastructure will boost research activities in the region, offer a powerful scientific tool for teaching at undergraduate and graduate levels, and enhance association and cooperation with business-oriented organizations.
77 FR 50703 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-22
... . Name of Committee: Bioengineering Sciences & Technologies Integrated Review Group; Nanotechnology Study... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel; Dermatology and...
76 FR 35227 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-16
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review... commercial property such as patentable material, and personal information concerning individuals associated... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel; Translational...
75 FR 57969 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-23
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review... commercial property such as patentable material, and personal information concerning individuals associated... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel; Small Business...
78 FR 14097 - Center for Scientific Review; Notice of Closed Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-04
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review... commercial property such as patentable material, and personal information concerning individuals associated... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel; Small Business...
78 FR 64224 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-28
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review... commercial property such as patentable material, and personal information concerning individuals associated... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel; Small Business...
76 FR 66075 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-25
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review... commercial property such as patentable material, and personal information concerning individuals associated... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel, Small Business...
75 FR 29771 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-27
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review... commercial property such as patentable material, and personal information concerning individuals associated... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel; Small Business...
NASA Astrophysics Data System (ADS)
Nugraha, Muhamad Gina; Kaniawati, Ida; Rusdiana, Dadi; Kirana, Kartika Hajar
2016-02-01
Among the purposes of physics learning in high school are mastering physics concepts, cultivating a scientific attitude (including a critical attitude), and developing inductive and deductive reasoning skills. According to Ennis et al., inductive and deductive reasoning skills are part of critical thinking. Preliminary studies show that both competencies are poorly achieved: student learning outcomes are low, and learning processes are not conducive to cultivating critical thinking (teacher-centered learning). One learning model predicted to increase mastery of concepts and train critical thinking skills (CTS) is the inquiry learning model aided by computer simulations. In this model, students are given the opportunity to be actively involved in the experiment and also receive a good explanation through the computer simulations. From research with a randomized control-group pretest-posttest design, we found that the inquiry learning model aided by computer simulations improves students' mastery of concepts significantly more than the conventional (teacher-centered) method. With the inquiry learning model aided by computer simulations, 20% of students showed high CTS, 63.3% medium, and 16.7% low. CTS contributes greatly to students' mastery of concepts, with a correlation coefficient of 0.697, and contributes moderately to the enhancement of mastery of concepts, with a correlation coefficient of 0.603.
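As a worked illustration of the reported relationship (with made-up scores, not the study's data), a correlation coefficient between critical thinking skill scores and concept-mastery scores can be computed as follows:

```python
import numpy as np

# Hypothetical paired scores for ten students: CTS score and mastery-of-concepts score.
cts = np.array([55, 60, 62, 68, 70, 74, 78, 81, 85, 90], dtype=float)
mastery = np.array([50, 58, 57, 66, 65, 72, 75, 80, 83, 88], dtype=float)

# Pearson correlation coefficient, the same statistic reported in the abstract.
r = np.corrcoef(cts, mastery)[0, 1]
print(f"correlation between CTS and mastery of concepts: r = {r:.3f}")
```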
The Brazilian Science Data Center (BSDC)
NASA Astrophysics Data System (ADS)
de Almeida, Ulisses Barres; Bodmann, Benno; Giommi, Paolo; Brandt, Carlos H.
Astrophysics and Space Science are becoming increasingly characterised by what is now known as “big data”, the bottlenecks for progress partly shifting from data acquisition to “data mining”. The truth is that the amount and rate of data accumulation in many fields already surpasses the local capabilities for its processing and exploitation, and the efficient conversion of scientific data into knowledge is everywhere a challenge. The result is that, to a large extent, isolated data archives risk being progressively likened to “data graveyards”, where the information stored is not reused for scientific work. Responsible and efficient use of these large data-sets means democratising access and extracting the most science possible from them, which in turn signifies improving data accessibility and integration. Improving data processing capabilities is another important issue specific to researchers and computer scientists of each field. The project presented here wishes to exploit the enormous potential opened up by information technology in our age to advance a model for a science data center in astronomy which aims to expand data accessibility and integration to the largest possible extent and with the greatest efficiency for scientific and educational use. Greater access to data means more people producing and benefiting from information, whereas larger integration of related data from different origins means a greater research potential and increased scientific impact. The project of the BSDC is preoccupied, primarily, with providing tools and solutions for the Brazilian astronomical community. It nevertheless capitalizes on extensive international experience, and is developed in full cooperation with the ASI Science Data Center (ASDC), from the Italian Space Agency, granting it an essential ingredient of internationalisation. The BSDC is Virtual Observatory-compliant and part of the “Open Universe”, a global initiative built under the auspices of the United Nations.
Astronomy in the Russian Scientific-Educational Project: "KAZAN-GEONA-2010"
NASA Astrophysics Data System (ADS)
Gusev, A.; Kitiashvili, I.
2006-08-01
The European Union promotes the Sixth Framework Programme. One of the goals of the EU Programme is to open up national research and training programs. A special role in the history of Kazan University was played by the great mathematician Nikolai Lobachevsky, the founder of non-Euclidean geometry (1826). Historically, the thousand-year-old city of Kazan and the two-hundred-year-old Kazan University have carried out the role of the scientific, organizational, and cultural educational center of the Volga region. For the continued successful development of educational and scientific-educational activity in the Russian Federation and the Republic of Tatarstan, Kazan was offered the national project: the International Center of the Sciences and Internet Technologies "GeoNa" (Geometry of Nature - GeoNa - is wisdom, enthusiasm, pride, grandeur). This is a modern complex of conference halls including the Center for Internet Technologies, a 3D Planetarium - development of the Moon, PhysicsLand, an active museum of natural sciences, an oceanarium, and a training complex "Spheres of Knowledge". Center GeoNa promotes a direct and effective channel of cooperation with scientific centers around the world. GeoNa will host conferences, congresses, fundamental scientific research sessions on the Moon and planets, and scientific-educational actions: presentation of the international scientific programs on lunar research and modern lunar databases. A more intensive program of exchange between scientific centers and organizations is proposed, for better knowledge and planning of their astronomical curricula and for the introduction of the teaching of astronomy. Center GeoNa will enable scientists and teachers of Russian universities with advanced achievements in science and information technologies to join together and establish scientific communication with foreign colleagues in the sphere of high-technology and educational projects with world scientific centers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Maxine D.; Leigh, Jason
2014-02-17
The Blaze high-performance visual computing system serves the high-performance computing research and education needs of the University of Illinois at Chicago (UIC). Blaze consists of a state-of-the-art, networked, computer cluster and ultra-high-resolution visualization system called CAVE2(TM) that is currently not available anywhere else in Illinois. This system is connected via a high-speed 100-Gigabit network to the State of Illinois' I-WIRE optical network, as well as to national and international high speed networks, such as Internet2 and the Global Lambda Integrated Facility. This enables Blaze to serve as an on-ramp to national cyberinfrastructure, such as the National Science Foundation’s Blue Waters petascale computer at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign and the Department of Energy’s Argonne Leadership Computing Facility (ALCF) at Argonne National Laboratory. DOE award # DE-SC005067, leveraged with NSF award #CNS-0959053 for “Development of the Next-Generation CAVE Virtual Environment (NG-CAVE),” enabled us to create a first-of-its-kind high-performance visual computing system. The UIC Electronic Visualization Laboratory (EVL) worked with two U.S. companies to advance their commercial products and maintain U.S. leadership in the global information technology economy. New applications are being enabled with the CAVE2/Blaze visual computing system that is advancing scientific research and education in the U.S. and globally, helping to train the next-generation workforce.
78 FR 75572 - Center For Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-12
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center For Scientific Review... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel; Obesity, Insulin... and evaluate grant applications. Place: National Institutes of Health, 6701 Rockledge Drive, Bethesda...
77 FR 59935 - Center for Scientific Review Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-01
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review....m. Agenda: To review and evaluate grant applications. Place: National Institutes of Health, 6701... Review Officer, Center for Scientific, Review National Institutes of Health, 6701 Rockledge Drive, Room...
75 FR 8979 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-26
... Review Special Emphasis Panel, Biomaterials, Delivery Systems, and Nanotechnology. Date: March 15-16... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel, PAR08-130...
78 FR 66022 - Center for Scientific Review; Notice of Closed Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-04
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review... unwarranted invasion of personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel; Small Business: Biological Chemistry, Biophysics and Drug Discovery. Date: November 4, 2013. Time...
77 FR 75638 - Center for Scientific Review; Notice of Closed Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-21
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel: Stress, Pain and... review and evaluate grant applications. Place: National Institutes of Health, 6701 Rockledge Drive...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garzoglio, Gabriele
The Fermilab Grid and Cloud Computing Department and the KISTI Global Science experimental Data hub Center are working on a multi-year Collaborative Research and Development Agreement. The first year produced knowledge of how to provision and manage a federation of virtual machines through cloud management systems. In this second year, we expanded the work on provisioning and federation, increasing both the scale and the diversity of solutions, and we started to build on-demand services on the established fabric, introducing the paradigm of Platform as a Service to assist with the execution of scientific workflows. We have enabled scientific workflows of stakeholders to run on multiple cloud resources at the scale of 1,000 concurrent machines. The demonstrations have been in the areas of (a) Virtual Infrastructure Automation and Provisioning, (b) Interoperability and Federation of Cloud Resources, and (c) On-demand Services for Scientific Workflows.
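A rough sketch of the fan-out pattern behind provisioning 1,000 concurrent machines (generic and hypothetical; the real work used cloud management systems, and provision_vm here is a stand-in for whatever API those expose):

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def provision_vm(index, image="scientific-worker", flavor="m1.large"):
    """Stand-in for a real cloud API call that boots one virtual machine."""
    # A real implementation would call the provider's API here and return its ID.
    return {"name": f"worker-{index:04d}", "image": image, "flavor": flavor}

def provision_pool(count, max_in_flight=50):
    """Provision `count` VMs, keeping at most `max_in_flight` requests active."""
    machines = []
    with ThreadPoolExecutor(max_workers=max_in_flight) as pool:
        futures = [pool.submit(provision_vm, i) for i in range(count)]
        for future in as_completed(futures):
            machines.append(future.result())
    return machines

if __name__ == "__main__":
    fleet = provision_pool(count=1000)
    print(f"provisioned {len(fleet)} virtual machines")
```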
The Science DMZ: A Network Design Pattern for Data-Intensive Science
Dart, Eli; Rotman, Lauren; Tierney, Brian; ...
2014-01-01
The ever-increasing scale of scientific data has become a significant challenge for researchers who rely on networks to interact with remote computing systems and transfer results to collaborators worldwide. Despite the availability of high-capacity connections, scientists struggle with inadequate cyberinfrastructure that cripples data transfer performance and impedes scientific progress. The Science DMZ paradigm comprises a proven set of network design patterns that collectively address these problems for scientists. We explain the Science DMZ model, including network architecture, system configuration, cybersecurity, and performance tools, which creates an optimized network environment for science. We describe use cases from universities, supercomputing centers, and research laboratories, highlighting the effectiveness of the Science DMZ model in diverse operational settings. In all, the Science DMZ model is a solid platform that supports any science workflow and flexibly accommodates emerging network technologies. As a result, the Science DMZ vastly improves collaboration, accelerating scientific discovery.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baker, Ann E; Bland, Arthur S Buddy; Hack, James J
Oak Ridge National Laboratory's Leadership Computing Facility (OLCF) continues to deliver the most powerful resources in the U.S. for open science. At 2.33 petaflops peak performance, the Cray XT Jaguar delivered more than 1.5 billion core hours in calendar year (CY) 2010 to researchers around the world for computational simulations relevant to national and energy security; advancing the frontiers of knowledge in physical sciences and areas of biological, medical, environmental, and computer sciences; and providing world-class research facilities for the nation's science enterprise. Scientific achievements by OLCF users range from collaboration with university experimentalists to produce a working supercapacitor that uses atom-thick sheets of carbon materials to finely determining the resolution requirements for simulations of coal gasifiers and their components, thus laying the foundation for development of commercial-scale gasifiers. OLCF users are pushing the boundaries with software applications sustaining more than one petaflop of performance in the quest to illuminate the fundamental nature of electronic devices. Other teams of researchers are working to resolve predictive capabilities of climate models, to refine and validate genome sequencing, and to explore the most fundamental materials in nature - quarks and gluons - and their unique properties. Details of these scientific endeavors - not possible without access to leadership-class computing resources - are presented in Section 4 of this report and in the INCITE in Review. Effective operations of the OLCF play a key role in the scientific missions and accomplishments of its users. This Operational Assessment Report (OAR) will delineate the policies, procedures, and innovations implemented by the OLCF to continue delivering a petaflop-scale resource for cutting-edge research. The 2010 operational assessment of the OLCF yielded recommendations that have been addressed (Reference Section 1) and, where appropriate, changes in Center metrics were introduced. This report covers CY 2010 and CY 2011 Year to Date (YTD), which, unless otherwise specified, denotes January 1, 2011 through June 30, 2011. User Support remains an important element of the OLCF operations, with the philosophy 'whatever it takes' to enable successful research. Impact of this center-wide activity is reflected by the user survey results that show users are 'very satisfied.' The OLCF continues to aggressively pursue outreach and training activities to promote awareness - and effective use - of U.S. leadership-class resources (Reference Section 2). The OLCF continues to meet and in many cases exceed DOE metrics for capability usage (35% target in CY 2010, delivered 39%; 40% target in CY 2011, 54% from January 1, 2011 through June 30, 2011). The Schedule Availability (SA) and Overall Availability (OA) for Jaguar were exceeded in CY 2010. Given the solution to the VRM problem, the SA and OA for Jaguar in CY 2011 are expected to exceed the target metrics of 95% and 90%, respectively (Reference Section 3). Numerous and wide-ranging research accomplishments, scientific support, and technological innovations are more fully described in Sections 4 and 6 and reflect OLCF leadership in enabling high-impact science solutions and vision in creating an exascale-ready center. Financial Management (Section 5) and Risk Management (Section 7) are carried out using best practices approved by DOE. The OLCF has a valid cyber security plan and Authority to Operate (Section 8).
The proposed metrics for 2012 are reflected in Section 9.« less
78 FR 21616 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-11
..., 6701 Rockledge Drive, Bethesda, MD 20892, (Telephone Conference Call). Contact Person: Samuel C Edwards, Ph.D., IRG CHIEF, Center for Scientific Review, National Institutes of Health, 6701 Rockledge Drive...: Center for Scientific Review Special Emphasis Panel; PAR Panel: Pregnancy in Women with Disabilities...
Translations on Eastern Europe, Scientific Affairs, Number 563
1977-12-11
...Information Science Association (at the request of the ZSM [Minicomputer System Works] MERA Research and Development Center). The software...the packaging industry will grow at an average rate of 12 percent/year and in Romania will continue to be significant (35 percent in 1975, 30.3
2002-07-01
Knowledge From Data ... HIGH-CONFIDENCE SOFTWARE AND SYSTEMS: Reliability, Security, and Safety for...NOAA's Cessna Citation flew over the 16-acre World Trade Center site, scanning with an Optech ALSM unit. The system recorded data points from 33,000...provide the data storage and compute power for intelligence analysis, high-performance national defense systems, and critical scientific research • Large
75 FR 52356 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-25
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review; Notice of Closed Meetings Pursuant to section 10(d) of the Federal Advisory Committee Act, as amended (5... Committee: Center for Scientific Review Special Emphasis Panel, Member Conflict: Reproduction and Metabolism...
78 FR 52552 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-23
... & Technologies Integrated Review Group, Nanotechnology Study Section. Date: September 26-27, 2013. Time: 8:00 a.m... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel, Member Conflict...
75 FR 41212 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-15
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review; Notice of Closed Meetings Pursuant to section 10(d) of the Federal Advisory Committee Act, as amended (5... Committee: Center for Scientific Review Special Emphasis Panel; Early Embryo Development Pluripotency. Date...
78 FR 5468 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-25
....gov . Name of Committee: Center for Scientific Review Special Emphasis Panel; Small Business: Sensory....nih.gov . Name of Committee: Center for Scientific Review Special Emphasis Panel; Small Business...; Small Business: Risk, Prevention and Health Behavior. Date: February 21-22, 2013. Time: 8:00 a.m. to 5...
76 FR 36555 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-22
... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel, Molecular Genetics. Date: July 6, 2011. Time: 2 p.m. to 4 p.m. Agenda: To review and evaluate grant applications... Committee: Center for Scientific Review Special Emphasis Panel, Statistical Genetics. Date: July 28, 2011...
75 FR 11894 - Center For Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-12
... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel; Cancer Diagnostics and Therapeutics SBIR/STTR. Date: March 18, 2010. Time: 2 p.m. to 5 p.m. Agenda: To review and... of Committee: Center for Scientific Review Special Emphasis Panel; Member Conflict: Cancer Therapy...
77 FR 24725 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-25
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel; Healthcare Delivery Methodologies Member Applications. Date: May 7, 2012. Time: 12 p.m. to 2 p.m. Agenda: To review...
75 FR 31796 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-04
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review; Notice of Closed Meetings Pursuant to section 10(d) of the Federal Advisory Committee Act, as amended (5...: Center for Scientific Review Special Emphasis Panel; Tooth Development, Mobility and Mineralization. Date...
75 FR 56554 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-16
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review; Notice of Closed Meetings Pursuant to section 10(d) of the Federal Advisory Committee Act, as amended (5... Committee: Center for Scientific Review Special Emphasis Panel; Member Conflict: Memory and Cognition. Date...
78 FR 65345 - Center for Scientific Review; Amended Notice of Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-31
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review; Amended Notice of Meeting Notice is hereby given of a change in the meeting of the Center for Scientific Review Special Emphasis Panel, Community-Level Health Promotion: Prevention and Intervention, October 22...
Energy Consumption Management of Virtual Cloud Computing Platform
NASA Astrophysics Data System (ADS)
Li, Lin
2017-11-01
Research on energy consumption management for virtual cloud computing platforms requires a deeper understanding of how energy is consumed by virtual machines and by the cloud platform as a whole; only then can the problems facing energy consumption management be solved. The key difficulty lies in data centers with high energy consumption, which calls for new scientific techniques. Virtualization technology and cloud computing have become powerful tools in everyday life, work, and production because of their many advantages; they are developing rapidly and achieve very high resource utilization, making them essential in the continually developing information age. This paper summarizes, explains, and further analyzes the energy consumption management questions of the virtual cloud computing platform, giving readers a clearer understanding of the topic and offering help to many aspects of daily life and work.
Unified Performance and Power Modeling of Scientific Workloads
DOE Office of Scientific and Technical Information (OSTI.GOV)
Song, Shuaiwen; Barker, Kevin J.; Kerbyson, Darren J.
2013-11-17
It is expected that scientific applications executing on future large-scale HPC systems must be optimized not only in terms of performance, but also in terms of power consumption. As power and energy become increasingly constrained resources, researchers and developers must have access to tools that allow for accurate prediction of both performance and power consumption. Reasoning about performance and power consumption in concert will be critical for achieving maximum utilization of limited resources on future HPC systems. To this end, we present a unified performance and power model for the Nek-Bone mini-application developed as part of the DOE's CESAR Exascale Co-Design Center. Our models consider the impact of computation, point-to-point communication, and collective communication.
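The abstract does not reproduce the model itself; as a rough illustration of the general shape such unified models take, the Python sketch below (all variable names, terms, and coefficients are assumptions, not the authors' formulation) sums a computation term, a point-to-point term, and a log-scaling collective term, and converts phase times to energy using per-phase power draws plus an idle baseline.

    import math

    def unified_model(n_ranks, work_per_rank, flops, msg_bytes, latency,
                      bandwidth, p_compute, p_network, p_idle):
        """Toy unified performance/power model (illustrative only).

        Time is the sum of a computation term, a point-to-point term, and a
        log-scaling collective term; energy charges each phase at a
        phase-specific power plus a constant idle/baseline power.
        """
        t_comp = work_per_rank / flops
        t_p2p = latency + msg_bytes / bandwidth
        t_coll = math.ceil(math.log2(max(n_ranks, 2))) * (latency + msg_bytes / bandwidth)
        t_total = t_comp + t_p2p + t_coll
        energy = (p_compute * t_comp
                  + p_network * (t_p2p + t_coll)
                  + p_idle * t_total)
        return t_total, energy

    # Example: 1024 ranks, 1 GFLOP of work each on 10 GFLOP/s cores.
    t, e = unified_model(1024, 1e9, 1e10, 1e6, 5e-6, 1e9, 80.0, 15.0, 40.0)
    print(f"time = {t:.4f} s, energy = {e:.1f} J")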
Microgravity sciences application visiting scientist program
NASA Technical Reports Server (NTRS)
Glicksman, Martin; Vanalstine, James
1995-01-01
Marshall Space Flight Center pursues scientific research in the area of low-gravity effects on materials and processes. To facilitate these Government-performed research responsibilities, a number of supplementary research tasks were accomplished by a group of specialized visiting scientists. They participated in work on contemporary research problems with specific objectives related to current or future space flight experiments, and they defined and established independent programs of research based on scientific peer review and on the relevance of the proposed research to NASA's microgravity program, thereby implementing a portion of the national program. The programs included research in the following areas: protein crystal growth; X-ray crystallography and computer analysis of protein crystal structure; optimization and analysis of protein crystal growth techniques; and design and testing of flight hardware.
NASA Astrophysics Data System (ADS)
Turinsky, Paul J.; Martin, William R.
2017-04-01
In this special issue of the Journal of Computational Physics, the research and development completed at the time of manuscript submission by the Consortium for Advanced Simulation of Light Water Reactors (CASL) is presented. CASL is the first of several Energy Innovation Hubs that have been created by the Department of Energy. The Hubs are modeled after the strong scientific management characteristics of the Manhattan Project and AT&T Bell Laboratories, and function as integrated research centers that combine basic and applied research with engineering to accelerate scientific discovery that addresses critical energy issues. The lifetime of a Hub is expected to be five or ten years depending upon performance; CASL has been granted a ten-year lifetime.
Preparation for microgravity - The role of the Microgravity Material Science Laboratory
NASA Technical Reports Server (NTRS)
Johnston, J. Christopher; Rosenthal, Bruce N.; Meyer, Maryjo B.; Glasgow, Thomas K.
1988-01-01
Experiments at the NASA Lewis Research Center's Microgravity Material Science Laboratory using physical and mathematical models to delineate the effects of gravity on processes of scientific and commercial interest are discussed. Where possible, transparent model systems are used to visually track convection, settling, crystal growth, phase separation, agglomeration, vapor transport, diffusive flow, and polymer reactions. Materials studied include metals, alloys, salts, glasses, ceramics, and polymers. Specific technologies discussed include the General Purpose furnace used in the study of metals and crystal growth, the isothermal dendrite growth apparatus, the electromagnetic levitator/instrumented drop tube, the high temperature directional solidification furnace, the ceramics and polymer laboratories and the center's computing facilities.
First-Principles Thermodynamics Study of Spinel MgAl2O4 Surface Stability
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cai, Qiuxia; Wang, Jian-guo; Wang, Yong
The surface stability of all possible terminations for three low-index (111, 110, 100) structures of the spinel MgAl2O4 has been studied using a first-principles-based thermodynamic approach. The surface Gibbs free energy results indicate that the 100_AlO2 termination is the most stable surface structure under ultra-high vacuum at T = 1100 K, regardless of whether the environment is Al-poor or Al-rich. With increasing oxygen pressure, the 111_O2(Al) termination becomes the most stable surface in the Al-rich environment. Oxygen vacancy formation is thermodynamically favorable over the 100_AlO2, 111_O2(Al), and Mg/O-connected (111) terminations. On the basis of surface Gibbs free energies for both perfect and defective surface terminations, the 100_AlO2 and 111_O2(Al) are the most dominant surfaces in the Al-rich environment under atmospheric conditions. This is also consistent with our previously reported experimental observations. This work was supported by a Laboratory Directed Research and Development (LDRD) project of the Pacific Northwest National Laboratory (PNNL). The computing time was granted by the National Energy Research Scientific Computing Center (NERSC). Part of the computing time was also granted by a scientific theme user proposal in the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL), which is a U.S. Department of Energy national scientific user facility located at PNNL in Richland, Washington.
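The working equations are not given in the abstract; in the standard ab initio atomistic thermodynamics framework that such surface-stability studies generally follow (the notation below is the conventional one, not taken from the paper), the surface Gibbs free energy of a symmetric slab with surface area A per face is evaluated as

    \gamma(T, p_{\mathrm{O_2}}) = \frac{1}{2A}\left[ E_{\mathrm{slab}}
        - N_{\mathrm{Mg}}\,\mu_{\mathrm{Mg}}
        - N_{\mathrm{Al}}\,\mu_{\mathrm{Al}}
        - N_{\mathrm{O}}\,\mu_{\mathrm{O}}(T, p_{\mathrm{O_2}}) \right],
    \qquad
    \mu_{\mathrm{O}}(T, p_{\mathrm{O_2}}) = \tfrac{1}{2} E_{\mathrm{O_2}}
        + \Delta\mu_{\mathrm{O}}(T, p^{\circ})
        + \tfrac{1}{2} k_B T \ln\frac{p_{\mathrm{O_2}}}{p^{\circ}}

with the cation chemical potentials constrained by equilibrium with the bulk spinel, \mu_{\mathrm{Mg}} + 2\mu_{\mathrm{Al}} + 4\mu_{\mathrm{O}} = g^{\mathrm{bulk}}_{\mathrm{MgAl_2O_4}}, which is how the Al-poor and Al-rich limits enter the analysis.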
NASA Technical Reports Server (NTRS)
1987-01-01
The Research Institute for Advanced Computer Science (RIACS) was established at the NASA Ames Research Center in June of 1983. RIACS is privately operated by the Universities Space Research Association (USRA), a consortium of 64 universities with graduate programs in the aerospace sciences, under several Cooperative Agreements with NASA. RIACS's goal is to provide preeminent leadership in basic and applied computer science research as partners in support of NASA's goals and missions. In pursuit of this goal, RIACS contributes to several of the grand challenges in science and engineering facing NASA: flying an airplane inside a computer; determining the chemical properties of materials under hostile conditions in the atmospheres of earth and the planets; sending intelligent machines on unmanned space missions; creating a one-world network that makes all scientific resources, including those in space, accessible to all the world's scientists; providing intelligent computational support to all stages of the process of scientific investigation from problem formulation to results dissemination; and developing accurate global models for climatic behavior throughout the world. In working with these challenges, we seek novel architectures, and novel ways to use them, that exploit the potential of parallel and distributed computation and make possible new functions that are beyond the current reach of computing machines. The investigation includes pattern computers as well as the more familiar numeric and symbolic computers, and it includes networked systems of resources distributed around the world. We believe that successful computer science research is interdisciplinary: it is driven by (and drives) important problems in other disciplines. We believe that research should be guided by a clear long-term vision with planned milestones. And we believe that our environment must foster and exploit innovation. Our activities and accomplishments for the calendar year 1987 and our plans for 1988 are reported.
PREFACE: New trends in Computer Simulations in Physics and not only in physics
NASA Astrophysics Data System (ADS)
Shchur, Lev N.; Krashakov, Serge A.
2016-02-01
In this volume we have collected papers based on the presentations given at the International Conference on Computer Simulations in Physics and beyond (CSP2015), held in Moscow, September 6-10, 2015. We hope that this volume will be helpful and scientifically interesting for readers. The Conference was organized for the first time through the combined efforts of the Moscow Institute for Electronics and Mathematics (MIEM) of the National Research University Higher School of Economics, the Landau Institute for Theoretical Physics, and the Science Center in Chernogolovka. The name of the Conference emphasizes the multidisciplinary nature of computational physics, whose methods are applied to a broad range of current research in science and society. The choice of venue was motivated by the multidisciplinary character of the MIEM, a formerly independent university that has recently become part of the National Research University Higher School of Economics. The Conference on Computer Simulations in Physics and beyond (CSP) is planned to be organized biannually. This year's Conference featured 99 presentations, including 21 plenary and invited talks ranging from the analysis of Irish myths with recent methods of statistical physics to computing with the novel D-Wave and D-Wave2 quantum computers. This volume covers various areas of computational physics and emerging subjects within the computational physics community. Each section was preceded by invited talks presenting the latest algorithms and methods in computational physics, as well as new scientific results. Both parallel and poster sessions paid special attention to numerical methods, applications, and results. For all the abstracts presented at the conference, please follow the link http://csp2015.ac.ru/files/book5x.pdf
NASA Astrophysics Data System (ADS)
Johnson, R.; Foster, S.
2005-12-01
The National Center for Atmospheric Research (NCAR) in Boulder, Colorado, is a leading institution in scientific research, education and service associated with exploring and understanding our atmosphere and its interactions with the Sun, the oceans, the biosphere, and human society. NCAR draws thousands of public and scientific visitors from around the world to its Mesa Laboratory facility annually for educational as well as research purposes. Public visitors include adult visitors, clubs, and families on an informal visit to NCAR and its exhibits, as well as classroom and summer camp groups. Additionally, NCAR provides extensive computational and visualization services, which can be used not only for scientific, but also public informational purposes. As such, NCAR's audience provides an opportunity to address both formal and informal education through the programs that we offer. The University Corporation for Atmospheric Research (UCAR) Office of Education and Outreach works with NCAR to develop and implement a highly-integrated strategy for reaching both formal and informal audiences through programs that range from events and exhibits to professional development (for scientists and educators) and bilingual distance learning. The hallmarks of our program include close collaboration with scientists, multi-purposing resources where appropriate for maximum efficiency, and a commitment to engage populations historically underrepresented in science in the geosciences.
76 FR 5182 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-28
... Scientific Review Special Emphasis Panel, PAR-10-074: Technology Development for High-Throughput Structural...: Center for Scientific Review Special Emphasis Panel, Risk Prevention and Intervention Addictions...
Gpu Implementation of a Viscous Flow Solver on Unstructured Grids
NASA Astrophysics Data System (ADS)
Xu, Tianhao; Chen, Long
2016-06-01
Graphics processing units have gained popularity in scientific computing over the past several years owing to their outstanding parallel computing capability. Computational fluid dynamics applications involve large amounts of calculation, so a recent GPU card, whose peak computing performance and memory bandwidth far exceed those of a contemporary high-end CPU, is preferable. We herein focus on the detailed implementation of our GPU-targeted Reynolds-averaged Navier-Stokes solver based on the finite-volume method. The solver employs a vertex-centered scheme on unstructured grids so that it can handle complex topologies. Multiple optimizations are carried out to improve memory-access performance and kernel utilization. Both steady and unsteady flow simulation cases are carried out using an explicit Runge-Kutta scheme. The GPU-accelerated solver is demonstrated to have competitive advantages over its CPU-targeted counterpart.
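The paper's GPU kernels are not reproduced in the abstract; for orientation, the following minimal CPU-side Python sketch (the function name, Runge-Kutta coefficients, and placeholder residual are illustrative assumptions) shows the kind of low-storage multi-stage explicit update that vertex-centered finite-volume solvers typically march in time.

    import numpy as np

    def rk_explicit(u, residual, dt, alphas=(0.25, 0.333, 0.5, 1.0)):
        """Advance vertex-centered states one time step with a low-storage
        explicit multi-stage Runge-Kutta scheme.

        u        : (n_vertices, n_vars) array of conserved variables
        residual : callable returning the spatial residual R(u), same shape as u
                   (in a real solver this is the flux assembly over the dual mesh)
        dt       : (n_vertices,) array of local time steps
        """
        u0 = u.copy()
        for alpha in alphas:
            # Each stage restarts from the initial state u0 (low-storage form).
            u = u0 - alpha * dt[:, None] * residual(u)
        return u

    # Example with a trivial linear "residual" standing in for flux assembly.
    state = np.random.rand(1000, 5)
    damp = lambda q: 0.1 * q          # placeholder residual
    steps = np.full(1000, 1e-2)
    state = rk_explicit(state, damp, steps)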
Job Scheduling in a Heterogeneous Grid Environment
NASA Technical Reports Server (NTRS)
Shan, Hong-Zhang; Smith, Warren; Oliker, Leonid; Biswas, Rupak
2004-01-01
Computational grids have the potential for solving large-scale scientific problems using heterogeneous and geographically distributed resources. However, a number of major technical hurdles must be overcome before this potential can be realized. One problem that is critical to effective utilization of computational grids is the efficient scheduling of jobs. This work addresses this problem by describing and evaluating a grid scheduling architecture and three job migration algorithms. The architecture is scalable and does not assume control of local site resources. The job migration policies use the availability and performance of computer systems, the network bandwidth available between systems, and the volume of input and output data associated with each job. An extensive performance comparison is presented using real workloads from leading computational centers. The results, based on several key metrics, demonstrate that the performance of our distributed migration algorithms is significantly greater than that of a local scheduling framework and comparable to a non-scalable global scheduling approach.
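The migration policies themselves are described only at a high level in the abstract; a minimal illustrative sketch of a cost-based site-selection rule in that spirit (all class names, fields, and the turnaround-time model are assumptions, not the paper's algorithms) might look like this in Python.

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Site:
        name: str
        flops: float        # usable compute rate at this site (GFLOP/s)
        queue_wait: float   # estimated queue wait (s)
        bandwidth: float    # bandwidth from the submitting site (GB/s)

    @dataclass
    class Job:
        work: float         # total work (GFLOP)
        data_gb: float      # input plus output data volume (GB)

    def best_site(job: Job, sites: List[Site]) -> Site:
        """Pick the site minimizing estimated turnaround time:
        queue wait + data transfer + compute."""
        def turnaround(s: Site) -> float:
            return s.queue_wait + job.data_gb / s.bandwidth + job.work / s.flops
        return min(sites, key=turnaround)

    # Example: a fast but bandwidth-poor remote site vs. a slower local one.
    job = Job(work=5e6, data_gb=200.0)
    sites = [Site("local", 2e3, 600.0, 10.0), Site("remote", 8e3, 3600.0, 0.5)]
    print(best_site(job, sites).name)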
Research Projects, Technical Reports and Publications
NASA Technical Reports Server (NTRS)
Oliger, Joseph
1996-01-01
The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under contract with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. A flexible scientific staff is provided through a university faculty visitor program, a post doctoral program, and a student visitor program. Not only does this provide appropriate expertise but it also introduces scientists outside of NASA to NASA problems. A small group of core RIACS staff provides continuity and interacts with an ARC technical monitor and scientific advisory group to determine the RIACS mission. RIACS activities are reviewed and monitored by a USRA advisory council and the ARC technical monitor. Research at RIACS is currently being done in the following areas: Advanced Methods for Scientific Computing and High Performance Networks. During this report period, Professor Antony Jameson of Princeton University, Professor Wei-Pai Tang of the University of Waterloo, Professor Marsha Berger of New York University, Professor Tony Chan of UCLA, Associate Professor David Zingg of the University of Toronto, Canada, and Assistant Professor Andrew Sohn of the New Jersey Institute of Technology have been visiting RIACS. From January 1, 1996 through September 30, 1996, RIACS had three staff scientists, four visiting scientists, one post-doctoral scientist, three consultants, two research associates, and one research assistant. RIACS held a joint workshop with Code 1 on 29-30 July 1996. The workshop was held to discuss needs and opportunities in basic research in computer science in and for NASA applications. There were 14 talks given by NASA, industry, and university scientists, and three open discussion sessions. There were approximately fifty participants. A proceedings is being prepared. It is planned to have similar workshops on an annual basis. RIACS technical reports are usually preprints of manuscripts that have been submitted to research journals or conference proceedings. A list of these reports for the period January 1, 1996 through September 30, 1996 is in the Reports and Abstracts section of this report.
Comparative Case Study of Two Biomedical Research Collaboratories
Teasley, Stephanie D; Bhatnagar, Rishi
2005-01-01
Background Working together efficiently and effectively presents a significant challenge in large-scale, complex, interdisciplinary research projects. Collaboratories are a nascent method to help meet this challenge. However, formal collaboratories in biomedical research centers are the exception rather than the rule. Objective The main purpose of this paper is to compare and describe two collaboratories that used off-the-shelf tools and relatively modest resources to support the scientific activity of two biomedical research centers. The two centers were the Great Lakes Regional Center for AIDS Research (HIV/AIDS Center) and the New York University Oral Cancer Research for Adolescent and Adult Health Promotion Center (Oral Cancer Center). Methods In each collaboratory, we used semistructured interviews, surveys, and contextual inquiry to assess user needs and define the technology requirements. We evaluated and selected commercial software applications by comparing their feature sets with requirements and then pilot-testing the applications. Local and remote support staff cooperated in the implementation and end user training for the collaborative tools. Collaboratory staff evaluated each implementation by analyzing utilization data, administering user surveys, and functioning as participant observers. Results The HIV/AIDS Center primarily required real-time interaction for developing projects and attracting new participants to the center; the Oral Cancer Center, on the other hand, mainly needed tools to support distributed and asynchronous work in small research groups. The HIV/AIDS Center’s collaboratory included a center-wide website that also served as the launch point for collaboratory applications, such as NetMeeting, Timbuktu Conference, PlaceWare Auditorium, and iVisit. The collaboratory of the Oral Cancer Center used Groove and Genesys Web conferencing. The HIV/AIDS Center was successful in attracting new scientists to HIV/AIDS research, and members used the collaboratory for developing and implementing new research studies. The Oral Cancer Center successfully supported highly distributed and asynchronous research, and the collaboratory facilitated real-time interaction for analyzing data and preparing publications. Conclusions The two collaboratory implementations demonstrated the feasibility of supporting biomedical research centers using off-the-shelf commercial tools, but they also identified several barriers to successful collaboration. These barriers included computing platform incompatibilities, network infrastructure complexity, variable availability of local versus remote IT support, low computer and collaborative software literacy, and insufficient maturity of available collaborative software. Factors enabling collaboratory use included collaboration incentives through funding mechanism, a collaborative versus competitive relationship of researchers, leadership by example, and tools well matched to tasks and technical progress. Integrating electronic collaborative tools into routine scientific practice can be successful but requires further research on the technical, social, and behavioral factors influencing the adoption and use of collaboratories. PMID:16403717
78 FR 66028 - Center for Scientific Review; Cancellation of Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-04
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review; Cancellation of Meeting Notice is hereby given of the cancellation of the Center for Scientific Review Special Emphasis Panel, October 04, 2013, 11:00 a.m. to October 04, 2013, 2:00 p.m., Washington Hilton, 1919...
77 FR 66623 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-06
... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel; Program Project: Mass Spectrometry Resource. Date: November 26-28, 2012. Time: 7:00 p.m. to 12:00 p.m. Agenda: To review... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review...
78 FR 65343 - Center for Scientific Review; Amended Notice of Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-31
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review; Amended Notice of Meeting Notice is hereby given of a change in the meeting of the Center for Scientific Review Special Emphasis Panel, Genetic Epidemiology, October 29, 2013, 09:30 a.m. to October 29, 2013, 10...
78 FR 68857 - Center for Scientific Review Amended; Notice of Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-15
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review Amended; Notice of Meeting Notice is hereby given of a change in the meeting of the Center for Scientific Review Special Emphasis Panel, Synthetic and Biological Chemistry A, October 21, 2013, 12:00 p.m. to...
78 FR 40487 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-05
... Committee: Center for Scientific Review Special Emphasis Panel; Biomedical Technology Research Center: A Biomedical- Informatics Research Network for Big Data. Date: July 30-August 1, 2013. Time: 6:00 p.m. to 1:00... Scientific Review Special Emphasis Panel; Gene Therapy Member Conflicts. Date: July 30, 2013. Time: 3:00 p.m...
76 FR 22111 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-20
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel, Member Conflict: Healthcare Delivery and Methodologies. Date: May 17-18, 2011. Time: 9 a.m. to 5 p.m. Agenda: To review and...
78 FR 50426 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-19
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel; Member Conflict: Healthcare Delivery and Methodology Early. Date: September 12, 2013. Time: 8:00 a.m. to 8:00 p.m. Agenda: To...
77 FR 40624 - Center for Scientific Review; Notice of Closed Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-10
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel Member Conflit: Healthcare Delivery and Methodologies. Date: July 16, 2012. Time: 11:15 a.m. to 1 p.m. Agenda: To review and...
75 FR 14174 - Center for Scientific Review; Amended Notice of Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-24
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review; Amended Notice of Meeting Notice is hereby given of a change in the meeting of the Center for Scientific... location remains the same. The meeting title has been changed to ``Member Conflict: Health and Behavior...
78 FR 65349 - Center for Scientific Review; Amended Notice of Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-31
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review; Amended Notice of Meeting Notice is hereby given of a change in the meeting of the Center for Scientific Review Special Emphasis Panel, Skeletal Biology and Regeneration Overflow, October 9, 2013, 06:30 a.m. to...
78 FR 67178 - Center for Scientific Review; Amended Notice of Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-08
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review; Amended Notice of Meeting Notice is hereby given of a change in the meeting of the Center for Scientific Review Special Emphasis Panel, Member Conflict: Alcohol and Glucose, October 16, 2013, 01:00 p.m. to...
76 FR 6805 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-08
... Committee: Center for Scientific Review Special Emphasis Panel; Small Business: Immunology. Date: March 10... Scientific Review Special Emphasis Panel; Member Conflict: Topics on Infectious Diseases and Microbiology...
[AERA. Dream machines and computing practices at the Mathematical Center].
Alberts, Gerard; De Beer, Huub T
2008-01-01
Dream machines may be just as effective as the ones materialised. Their symbolic thrust can be quite powerful. The Amsterdam 'Mathematisch Centrum' (Mathematical Center), founded February 11, 1946, created a Computing Department in an effort to realise its goal of serving society. When Aad van Wijngaarden was appointed as head of the Computing Department, however, he claimed space for scientific research and computer construction, next to computing as a service. Still, the computing service following the five-stage style of Hartree's numerical analysis remained a dominant characteristic of the work of the Computing Department. The high level of ambition held by Aad van Wijngaarden led to ever renewed projections of big automatic computers, symbolised by the never-built AERA. Even a machine that was actually constructed, the ARRA, which followed A.D. Booth's design of the ARC, never made it into real operation. It did serve Van Wijngaarden to bluff his way into the computer age by midsummer 1952. Not until January 1954 did the Computing Department have a working stored-program computer, which for reasons of policy went under the same name: ARRA. After just one other machine, the ARMAC, had been produced, a separate company, Electrologica, was set up for the manufacture of computers, which produced the rather successful X1 computer. The combination of ambition and the absence of a working machine led to a high level of work on programming, way beyond the usual ideas of libraries of subroutines. Edsger W. Dijkstra in particular led the way to an emphasis on the duties of the programmer within the pattern of numerical analysis. Programs generating programs, known elsewhere as autocoding systems, were at the 'Mathematisch Centrum' called 'superprograms'. Practical examples were usually called a 'complex' in Dutch, where in English one might say 'system'. Historically, this is where software begins. Dekker's matrix complex, Dijkstra's interrupt system, and Dijkstra and Zonneveld's ALGOL compiler - which for housekeeping contained 'the complex' - were actual examples of such superprograms. In 1960 this compiler gave the Mathematical Center a leading edge in the early development of software.
78 FR 7438 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-01
..., 6701 Rockledge Drive, Bethesda, MD 20892, (Telephone Conference Call). Contact Person: Lisa Steele, Ph..., CA 90405. Contact Person: Valerie Durrant, Ph.D., Scientific Review Officer, Center for Scientific...
78 FR 27975 - Center for Scientific Review Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-13
... Scientific Review Special Emphasis Panel; Systemic Injury By Environmental Exposure. Date: June 11, 2013... . Name of Committee: Center for Scientific Review Special Emphasis Panel; Member conflict: Lung Diseases...
XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Mid-year report FY17 Q2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moreland, Kenneth D.; Pugmire, David; Rogers, David
The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.
XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Year-end report FY17.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moreland, Kenneth D.; Pugmire, David; Rogers, David
The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.
XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem. Mid-year report FY16 Q2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moreland, Kenneth D.; Sewell, Christopher; Childs, Hank
The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.
XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Year-end report FY15 Q4.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moreland, Kenneth D.; Sewell, Christopher; Childs, Hank
The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.
SCEAPI: A unified Restful Web API for High-Performance Computing
NASA Astrophysics Data System (ADS)
Rongqiang, Cao; Haili, Xiao; Shasha, Lu; Yining, Zhao; Xiaoning, Wang; Xuebin, Chi
2017-10-01
The development of scientific computing is increasingly moving to collaborative web and mobile applications. All these applications need a high-quality programming interface for accessing heterogeneous computing resources consisting of clusters, grid computing or cloud computing. In this paper, we introduce our high-performance computing environment that integrates computing resources from 16 HPC centers across China. Then we present a bundle of web services called SCEAPI and describe how it can be used to access HPC resources over the HTTP or HTTPS protocols. We discuss SCEAPI from several aspects, including architecture, implementation and security, and address specific challenges in designing compatible interfaces and protecting sensitive data. We describe the functions of SCEAPI, including authentication, file transfer and job management (creating, submitting and monitoring jobs), and illustrate how SCEAPI can be used in an easy-to-use way. Finally, we discuss how to exploit more HPC resources quickly for the ATLAS experiment by implementing a custom ARC compute element based on SCEAPI, and our work shows that SCEAPI is an easy-to-use and effective solution for extending opportunistic HPC resources.
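The abstract lists authentication, file transfer, and job management as SCEAPI's core functions but does not document concrete routes or payloads; the sketch below is therefore purely hypothetical - the base URL, endpoint paths, and field names are invented for illustration - and only shows the general shape of a RESTful submit-and-poll workflow using Python's requests library.

    import requests

    BASE = "https://sceapi.example.org/api"   # hypothetical base URL, not the real service

    def submit_job(username, password, input_path, script_name):
        """Authenticate, upload an input file, and submit a job.
        All routes and JSON fields are illustrative assumptions."""
        s = requests.Session()
        # 1. Obtain a token (hypothetical route and response field).
        r = s.post(f"{BASE}/auth/tokens", json={"user": username, "password": password})
        r.raise_for_status()
        s.headers["Authorization"] = "Bearer " + r.json()["token"]
        # 2. Upload the job's input file.
        with open(input_path, "rb") as f:
            s.post(f"{BASE}/files", files={"file": f}).raise_for_status()
        # 3. Submit the job and return its identifier for later polling.
        r = s.post(f"{BASE}/jobs", json={"script": script_name})
        r.raise_for_status()
        return s, r.json()["id"]

    # Usage sketch:
    # session, job_id = submit_job("alice", "secret", "input.dat", "run_case.sh")
    # state = session.get(f"{BASE}/jobs/{job_id}").json()["state"]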
NAS Technical Summaries, March 1993 - February 1994
NASA Technical Reports Server (NTRS)
1995-01-01
NASA created the Numerical Aerodynamic Simulation (NAS) Program in 1987 to focus resources on solving critical problems in aeroscience and related disciplines by utilizing the power of the most advanced supercomputers available. The NAS Program provides scientists with the necessary computing power to solve today's most demanding computational fluid dynamics problems and serves as a pathfinder in integrating leading-edge supercomputing technologies, thus benefitting other supercomputer centers in government and industry. The 1993-94 operational year concluded with 448 high-speed processor projects and 95 parallel projects representing NASA, the Department of Defense, other government agencies, private industry, and universities. This document provides a glimpse at some of the significant scientific results for the year.
NAS technical summaries. Numerical aerodynamic simulation program, March 1992 - February 1993
NASA Technical Reports Server (NTRS)
1994-01-01
NASA created the Numerical Aerodynamic Simulation (NAS) Program in 1987 to focus resources on solving critical problems in aeroscience and related disciplines by utilizing the power of the most advanced supercomputers available. The NAS Program provides scientists with the necessary computing power to solve today's most demanding computational fluid dynamics problems and serves as a pathfinder in integrating leading-edge supercomputing technologies, thus benefitting other supercomputer centers in government and industry. The 1992-93 operational year concluded with 399 high-speed processor projects and 91 parallel projects representing NASA, the Department of Defense, other government agencies, private industry, and universities. This document provides a glimpse at some of the significant scientific results for the year.
Scientific Grid activities and PKI deployment in the Cybermedia Center, Osaka University.
Akiyama, Toyokazu; Teranishi, Yuuichi; Nozaki, Kazunori; Kato, Seiichi; Shimojo, Shinji; Peltier, Steven T; Lin, Abel; Molina, Tomas; Yang, George; Lee, David; Ellisman, Mark; Naito, Sei; Koike, Atsushi; Matsumoto, Shuichi; Yoshida, Kiyokazu; Mori, Hirotaro
2005-10-01
The Cybermedia Center (CMC), Osaka University, is a research institution that offers knowledge and technology resources obtained from advanced research in the areas of large-scale computation, information and communication, multimedia content, and education. Currently, CMC is involved in Japanese national Grid projects such as JGN II (Japan Gigabit Network), NAREGI and BioGrid. Not limited to Japan, CMC also actively takes part in international activities such as PRAGMA. In these projects and international collaborations, CMC has developed a Grid system that allows scientists to perform their analysis by remote-controlling the world's largest ultra-high voltage electron microscope located in Osaka University. In another undertaking, CMC has assumed a leadership role in BioGrid by sharing its experiences and knowledge on system development for the area of biology. In this paper, we will give an overview of the BioGrid project and introduce the progress of the Telescience unit, which collaborates with the Telescience Project led by the National Center for Microscopy and Imaging Research (NCMIR). Furthermore, CMC collaborates with seven Computing Centers in Japan, NAREGI, and the National Institute of Informatics to deploy a PKI-based authentication infrastructure. The current status of this project and future collaboration with Grid projects are delineated in this paper.
75 FR 4090 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-26
... El Camino Real, San Diego, CA 92130. Contact Person: William A. Greenberg, PhD., Scientific Review... (Virtual Meeting). Contact Person: Fouad A. El-Zaatari, PhD., Scientific Review Officer, Center for... for Scientific Review Special Emphasis Panel, Small Business: Experimental Cancer Therapeutics. Date...
76 FR 59707 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-27
..., (Virtual Meeting). Contact Person: Rebecca Henry, PhD, Scientific Review Officer, Center for Scientific... Rockledge Drive, Bethesda, MD 20892, (Virtual Meeting). Contact Person: George M Barnas, PhD, Scientific... Institutes of Health, 6701 Rockledge Drive, Bethesda, MD 20892, (Virtual Meeting). Contact Person: George M...
77 FR 8269 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-14
.... Place: Mayflower Park Hotel, 405 Olive Way, Seattle, WA 98101. Contact Person: Rebecca Henry, Ph.D... 20814. Contact Person: Melinda Jenkins, Ph.D., Scientific Review Officer, Center for Scientific Review...
78 FR 42969 - Center For Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-18
... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel;, Molecular and... for Scientific Review Special Emphasis Panel; Genetics of Cell Regulation. Date: August 1, 2013. Time...
Automated real-time software development
NASA Technical Reports Server (NTRS)
Jones, Denise R.; Walker, Carrie K.; Turkovich, John J.
1993-01-01
A Computer-Aided Software Engineering (CASE) system has been developed at the Charles Stark Draper Laboratory (CSDL) under the direction of the NASA Langley Research Center. The CSDL CASE tool provides an automated method of generating source code and hard copy documentation from functional application engineering specifications. The goal is to significantly reduce the cost of developing and maintaining real-time scientific and engineering software while increasing system reliability. This paper describes CSDL CASE and discusses demonstrations that used the tool to automatically generate real-time application code.
The scientific research potential of virtual worlds.
Bainbridge, William Sims
2007-07-27
Online virtual worlds, electronic environments where people can work and interact in a somewhat realistic manner, have great potential as sites for research in the social, behavioral, and economic sciences, as well as in human-centered computer science. This article uses Second Life and World of Warcraft as two very different examples of current virtual worlds that foreshadow future developments, introducing a number of research methodologies that scientists are now exploring, including formal experimentation, observational ethnography, and quantitative analysis of economic markets or social networks.
75 FR 8371 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-24
... Rockledge Drive, Room 3139, Bethesda, MD 20892, (301) 435-1712, [email protected]csr.nih.gov . Name of Committee: AIDS..., [email protected]csr.nih.gov . Name of Committee: Center for Scientific Review Special Emphasis Panel, Urology...-435- 1501, [email protected]csr.nih.gov . Name of Committee: Center for Scientific Review Special Emphasis...
76 FR 51379 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-18
... Drive, Bethesda, MD 20892, (Virtual Meeting). Contact Person: John Bleasdale, PhD, Scientific Review.... Contact Person: Reed A Graves, PhD, Scientific Review Officer, Center for Scientific Review, National... Group Biochemistry and Biophysics of Membranes Study Section. Date: September 26-27, 2011. Time: 8 a.m...
78 FR 31951 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-28
..., Bethesda, MD 20892, (Virtual Meeting). Contact Person: Nuria E Assa-Munt, Ph.D., Scientific Review Officer...; Small Business: Education, Psychology, and Biology in Health Behavior. Date: June 24-25, 2013. Time: 8..., (Virtual Meeting). Contact Person: David R Filpula, Ph.D., Scientific Review Officer, Center for Scientific...
76 FR 81953 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-29
... Person: John Firrell, Ph.D., Scientific Review Officer, Center for Scientific Review, National Institutes...: Biological Chemistry and Macromolecular Biophysics. Date: January 19-20, 2012. Time: 11 a.m. to 10 p.m... Drive, Bethesda, MD 20892 (Virtual Meeting). Contact Person: Donald L. Schneider, Ph.D., Scientific...
76 FR 77544 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-13
... 20892 (Virtual Meeting). Contact Person: John Firrell, Ph.D., Scientific Review Officer, Center for..., Bethesda, MD 20892 (Virtual Meeting). Contact Person: Aruna K Behera, Ph.D., Scientific Review Officer... Wharf, 2620 Jones Street, San Francisco, CA 94133. Contact Person: Joseph D Mosca, Ph.D., Scientific...
The Moon in the Russian scientific-educational project: Kazan-GeoNa-2010
NASA Astrophysics Data System (ADS)
Gusev, A.; Kitiashvili, I.; Petrova, N.
Historically, the thousand-year-old city of Kazan and the two-hundred-year-old Kazan University have played the role of the scientific-organizational and cultural-educational center of the Volga region of Russia. For the further development of educational and scientific activity in the Russian Federation and the Republic of Tatarstan, Kazan proposes a national project: the International Center of Science and Internet Technologies GeoNa (Geometry of Nature) - wisdom, enthusiasm, pride, grandeur. GeoNa includes a modern complex of conference halls seating up to 4,000, an Internet technologies center, a 3D planetarium devoted to exploration of the Moon, PhysicsLand, an active museum of natural sciences, an oceanarium, the training complex Spheres of Knowledge, and botanical and landscape oases. The GeoNa center will host conferences and congresses, fundamental scientific research on the Moon, scientific-educational events, presentations of international scientific programs on lunar research, modern lunar databases, exhibitions of high-tech equipment, and extensive cultural-educational, tourist, and cognitive programs. GeoNa will enable scientists and teachers of Russian universities to engage with advanced achievements in science and information technology and to establish scientific connections with foreign colleagues in high-technology and educational projects with the world's space centers.
76 FR 22113 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-20
...: Cathleen L Cooper, PhD, Scientific Review Officer, Center for Scientific Review, National Institutes of... Alexandria Old Town, 1767 King Street, Alexandria, VA 22314. Contact Person: Cathleen L Cooper, PhD...
1981-01-01
Spacelab was a versatile laboratory carried in the Space Shuttle's cargo bay for special research flights. Its various elements could be combined to accommodate the many types of scientific research that could best be performed in space. Spacelab consisted of an enclosed, pressurized laboratory module and open U-shaped pallets located at the rear of the laboratory module. The laboratory module contained utilities, computers, work benches, and instrument racks to conduct scientific experiments in astronomy, physics, chemistry, biology, medicine, and engineering. Equipment, such as telescopes, antennas, and sensors, was mounted on pallets for direct exposure to space. A 1-meter (3.3-ft.) diameter aluminum tunnel, resembling a z-shaped tube, connected the crew compartment (mid deck) to the module. The reusable Spacelab allowed scientists to bring experiment samples back to Earth for post-flight analysis. Spacelab was a cooperative venture of the European Space Agency (ESA) and NASA. ESA was responsible for funding, developing, and building Spacelab, while NASA was responsible for the launch and operational use of Spacelab. Spacelab missions were cooperative efforts between scientists and engineers from around the world. Teams from NASA centers, universities, private industry, government agencies and international space organizations designed the experiments. The Marshall Space Flight Center was NASA's lead center for monitoring the development of Spacelab and managing the program.
Scientists Uncover Origins of the Sun’s Swirling Spicules
2017-12-08
At any given moment, as many as 10 million wild jets of solar material burst from the sun’s surface. They erupt as fast as 60 miles per second, and can reach lengths of 6,000 miles before collapsing. These are spicules, and despite their grass-like abundance, scientists didn’t understand how they form. Now, for the first time, a computer simulation - so detailed it took a full year to run - shows how spicules form, helping scientists understand how spicules can break free of the sun’s surface and surge upward so quickly. Watch here and more at: go.nasa.gov/2t3toMx Credits: NASA’s Goddard Space Flight Center/Joy Ng, producer. NASA Goddard Space Flight Center enables NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA’s accomplishments by contributing compelling scientific knowledge to advance the Agency’s mission.
Supporting Scientific Experimentation and Reasoning in Young Elementary School Students
NASA Astrophysics Data System (ADS)
Varma, Keisha
2014-06-01
Researchers from multiple perspectives have shown that young students can engage in the scientific reasoning involved in science experimentation. However, there is little research on how well these young students learn in inquiry-based learning environments that focus on using scientific experimentation strategies to learn new scientific information. This work investigates young children's science concept learning via inquiry-based instruction on the thermodynamics system in a developmentally appropriate, technology-supported learning environment. First- and third-grade students participate in three sets of guided experimentation activities that involve using handheld computers to measure change in temperature given different types of insulation materials. Findings from pre- and post-comparisons show that students at both grade levels are able to learn about the thermodynamics system through engaging in the guided experiment activities. The instruction groups outperformed the control groups on multiple measures of thermodynamics knowledge, and the older children outperformed the younger children. Knowledge gains are discussed in the context of mental models of the thermodynamics system that include the individual concepts mentioned above and the relationships between them. This work suggests that young students can benefit from science instruction centered on experimentation activities. It shows the benefits of presenting complex scientific information in authentic contexts and the importance of providing the necessary scaffolding for meaningful scientific inquiry and experimentation.
77 FR 13134 - Center for Scientific Review ; Notice of Closed Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-05
..., (Virtual Meeting) Contact Person: Ai-Ping Zou, MD, Ph.D., Scientific Review Officer, Center for Scientific... Assistance Program Nos. 93.306, Comparative Medicine; 93.333, Clinical Research, 93.306, 93.333, 93.337, 93...
Collaborative Group Learning Approaches for Teaching Comparative Planetology
NASA Astrophysics Data System (ADS)
Slater, S. J.; Slater, T. F.
2013-12-01
Modern science education reform documents propose that the teaching of contemporary students should focus on doing science, rather than simply memorizing science. Duschl, Schweingruber, and Shouse (2007) eloquently argue for four science proficiencies for students. Students should: (i) Know, use, and interpret scientific explanations of the natural world; (ii) Generate and evaluate scientific evidence and explanations; (iii) Understand the nature and development of scientific knowledge; and (iv) Participate productively in scientific practices and discourse. In response, scholars with the CAPER Center for Astronomy & Physics Education Research have created and field-tested two separate instructional approaches. The first of these is a series of computer-mediated, inquiry learning experiences for non-science majoring undergraduates based upon an inquiry-oriented teaching approach framed by the notions of backwards faded-scaffolding as an overarching theme for instruction. Backwards faded-scaffolding is a strategy where the conventional and rigidly linear scientific method is turned on its head and students are first taught how to create conclusions based on evidence, then how experimental design creates evidence, and only at the end are students introduced to the most challenging part of inquiry - inventing scientifically appropriate questions. Planetary science databases and virtual environments used by students to conduct scientific investigations include the NASA and JPL Solar System Simulator and Eyes on the Solar System as well as the USGS Moon and Mars Global GIS Viewers. The second of these is known widely as a Lecture-Tutorial approach. Lecture-Tutorials are self-contained, collaborative group activities. The materials are designed specifically to be easily integrated into the lecture course and directly address the needs of busy and heavily-loaded teaching faculty for effective, student-centered, classroom-ready materials that do not require a drastic course revision for implementation. Students are asked to reason about difficult concepts, while working in pairs, and to discuss their ideas openly. Extensive evaluation results consistently suggest that both the backwards faded-scaffolding and the Lecture-Tutorials approaches are successful at engaging students in self-directed scientific discourse as measured by the Views on Scientific Inquiry (VOSI) as well as increasing their knowledge of science as measured by the Test Of Astronomy STandards (TOAST).
Performance Assessment Institute-NV
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lombardo, Joseph
2012-12-31
The National Supercomputing Center for Energy and the Environment’s intention is to purchase a multi-purpose computer cluster in support of the Performance Assessment Institute (PA Institute). The PA Institute will serve as a research consortium located in Las Vegas, Nevada, with membership that includes: national laboratories, universities, industry partners, and domestic and international governments. This center will provide a one-of-a-kind centralized facility for the accumulation of information for use by Institutions of Higher Learning, the U.S. Government, Regulatory Agencies, and approved users. This initiative will enhance and extend High Performance Computing (HPC) resources in Nevada to support critical national and international needs in "scientific confirmation". The PA Institute will be promoted as the leading Modeling, Learning and Research Center worldwide. The program proposes to utilize the existing supercomputing capabilities and alliances of the University of Nevada Las Vegas as a base, and to extend these resources and capabilities through a collaborative relationship with its membership. The PA Institute will provide an academic setting for interactive sharing, learning, mentoring and monitoring of multi-disciplinary performance assessment and performance confirmation information. The role of the PA Institute is to facilitate research, knowledge-increase, and knowledge-sharing among users.
Tools for 3D scientific visualization in computational aerodynamics at NASA Ames Research Center
NASA Technical Reports Server (NTRS)
Bancroft, Gordon; Plessel, Todd; Merritt, Fergus; Watson, Val
1989-01-01
Hardware, software, and techniques used by the Fluid Dynamics Division (NASA) for performing visualization of computational aerodynamics, which can be applied to the visualization of flow fields from computer simulations of fluid dynamics about the Space Shuttle, are discussed. The three visualization techniques applied (post-processing, tracking, and steering) are described, as well as the post-processing software packages used: PLOT3D, SURF (Surface Modeller), GAS (Graphical Animation System), and FAST (Flow Analysis Software Toolkit). Using post-processing methods, a flow simulation was executed on a supercomputer and, after the simulation was complete, the results were processed for viewing. It is shown that the high-resolution, high-performance three-dimensional workstation combined with specially developed display and animation software provides a good tool for analyzing flow field solutions obtained from supercomputers.
Scientific Services on the Cloud
NASA Astrophysics Data System (ADS)
Chapman, David; Joshi, Karuna P.; Yesha, Yelena; Halem, Milt; Yesha, Yaacov; Nguyen, Phuong
Scientific computing was one of the first ever applications of parallel and distributed computation. To this day, scientific applications remain some of the most compute-intensive, and have inspired the creation of petaflop compute infrastructure such as the Oak Ridge Jaguar and the Los Alamos RoadRunner. Large dedicated hardware infrastructure has become both a blessing and a curse to the scientific community. Scientists are interested in cloud computing for much the same reason as businesses and other professionals. The hardware is provided, maintained, and administered by a third party. Software abstraction and virtualization provide reliability and fault tolerance. Graduated fees allow for multi-scale prototyping and execution. Cloud computing resources are only a few clicks away, and are by far the easiest high-performance distributed platform to gain access to. There may still be dedicated infrastructure for ultra-scale science, but the cloud can easily play a major part in the scientific computing initiative.
ERIC Educational Resources Information Center
Kömek, Emre; Yagiz, Dursun; Kurt, Murat
2015-01-01
The purpose of this study is to analyze scientific literacy levels relevant to science and technology classes among gifted students that participate in scientific activities at science and art centers. This study investigated whether there was a significant difference in scientific literacy levels among gifted students according to the areas of…
78 FR 52206 - Center For Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-22
...: Biophysics, Biochemistry and Chemistry. Date: September 18-19, 2013. Time: 8:00 a.m. to 3:00 p.m. Agenda: To..., Bethesda, MD 20892 (Virtual Meeting). Contact Person: John L Bowers, Ph.D., Scientific Review Officer..., Ph.D., Scientific Review Officer, Center for Scientific Review, National Institutes of Health, 6701...
76 FR 58522 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-21
...: Ai-Ping Zou, MD, PhD, Scientific Review Officer, Center for Scientific Review, National Institutes of... Nos. 93.306, Comparative Medicine; 93.333, Clinical Research, 93.306, 93.333, 93.337, 93.393-93.396...
Twelve Scientific Specialists of the Peenemuende Team
NASA Technical Reports Server (NTRS)
2004-01-01
Twelve scientific specialists of the Peenemuende team at the front of Building 4488, Redstone Arsenal, Huntsville, Alabama. They led the Army's space efforts at ABMA before transfer of the team to the National Aeronautics and Space Administration (NASA), George C. Marshall Space Flight Center (MSFC). (Left to right) Dr. Ernst Stuhlinger, Director, Research Projects Office; Dr. Helmut Hoelzer, Director, Computation Laboratory; Karl L. Heimburg, Director, Test Laboratory; Dr. Ernst Geissler, Director, Aeroballistics Laboratory; Erich W. Neubert, Director, Systems Analysis Reliability Laboratory; Dr. Walter Haeussermann, Director, Guidance and Control Laboratory; Dr. Wernher von Braun, Director, Development Operations Division; William A. Mrazek, Director, Structures and Mechanics Laboratory; Hans Hueter, Director, System Support Equipment Laboratory; Eberhard Rees, Deputy Director, Development Operations Division; Dr. Kurt Debus, Director, Missile Firing Laboratory; Hans H. Maus, Director, Fabrication and Assembly Engineering Laboratory.
NASA Technical Reports Server (NTRS)
1994-01-01
Charge Coupled Devices (CCDs) are high technology silicon chips that convert light directly into electronic or digital images, which can be manipulated or enhanced by computers. When Goddard Space Flight Center (GSFC) scientists realized that existing CCD technology could not meet scientific requirements for the Hubble Space Telescope Imaging Spectrograph, GSFC contracted with Scientific Imaging Technologies, Inc. (SITe) to develop an advanced CCD. SITe then applied many of the NASA-driven enhancements to the manufacture of CCDs for digital mammography. The resulting device images breast tissue more clearly and efficiently. The LORAD Stereo Guide Breast Biopsy system incorporates SITe's CCD as part of a digital camera system that is replacing surgical biopsy in many cases. Known as stereotactic needle biopsy, the procedure is performed under local anesthesia with a needle and saves women time, pain, scarring, radiation exposure and money.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Geveci, Berk; Maynard, Robert
The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. The XVis project brought together collaborators from predominant DOE projects for visualization on accelerators and combined their respective features into a new visualization toolkit called VTK-m.
78 FR 17411 - Board of Scientific Counselors, National Center for Health Statistics
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-21
... DEPARTMENT OF HEALTH AND HUMAN SERVICES Centers for Disease Control and Prevention Board of Scientific Counselors, National Center for Health Statistics In accordance with section 10(a)(2) of the...), National Center for Health Statistics (NCHS) announces the following meeting of the aforementioned...
78 FR 48438 - Board of Scientific Counselors, National Center for Health Statistics
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-08
... DEPARTMENT OF HEALTH AND HUMAN SERVICES Centers for Disease Control and Prevention Board of Scientific Counselors, National Center for Health Statistics In accordance with section 10(a)(2) of the...), National Center for Health Statistics (NCHS) announces the following meeting of the aforementioned...
Interfacing with in-Situ Data Networks during the Arctic Boreal Vulnerability Experiment (ABoVE)
NASA Astrophysics Data System (ADS)
McInerney, M.; Griffith, P. C.; Duffy, D.; Hoy, E.; Schnase, J. L.; Sinno, S.; Thompson, J. H.
2014-12-01
The Arctic Boreal Vulnerability Experiment (ABoVE) is designed to improve understanding of the causes and impacts of ecological changes in Arctic/boreal regions, and will integrate field-based studies, modeling, and data from airborne and satellite remote sensing. ABoVE will result in a fuller understanding of ecosystem vulnerability and resilience to environmental change in the Arctic and boreal regions of western North America, and provide scientific information required to develop options for societal responses to the impacts of these changes. The studies sponsored by NASA during ABoVE will be coordinated with research and in-situ monitoring activities being sponsored by a number of national and international partners. The NASA Center for Climate Simulation at the Goddard Space Flight Center has partnered with the NASA Carbon Cycle & Ecosystems Office to create a science cloud designed for this field campaign - the ABoVE Science Cloud (ASC). The ASC combines high performance computing with emerging technologies to create an environment specifically designed for large-scale modeling, analysis of remote sensing data, copious disk storage with integrated data management, and integration of core variables from in-situ networks identified by the ABoVE Science Definition Team. In this talk, we will present the scientific requirements driving the development of the ABoVE Science Cloud, discuss the necessary interfaces, both computational and human, with in-situ monitoring networks, and show examples of how the ASC is being used to meet the needs of the ABoVE campaign.
77 FR 51033 - Center For Scientific Review; Notice of Closed Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-23
...: Biochemistry. Date: August 27, 2012. Time: 10:45 a.m. to 12:15 p.m. Agenda: To review and evaluate grant... Conference Call). Contact Person: Nuria E. Assa-Munt, Ph.D., Scientific Review Officer, Center for Scientific...
75 FR 22607 - Board of Scientific Counselors, Coordinating Center for Infectious Diseases (CCID)
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-29
... DEPARTMENT OF HEALTH AND HUMAN SERVICES Centers for Disease Control and Prevention Board of Scientific Counselors, Coordinating Center for Infectious Diseases (CCID) In accordance with section 10(a)(2) of the Federal Advisory Committee Act (Pub. L. 92-463), the Centers for Disease Control and...
78 FR 38983 - World Trade Center Health Program Scientific/Technical Advisory Committee (WTCHP-STAC)
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-28
... DEPARTMENT OF HEALTH AND HUMAN SERVICES Centers for Disease Control and Prevention World Trade Center Health Program Scientific/Technical Advisory Committee (WTCHP-STAC) Correction: This notice was... and Control, (BSC, NCIPC) and the name of the Committee should read World Trade Center Health Program...
78 FR 41046 - Advanced Scientific Computing Advisory Committee
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-09
... Services Administration, notice is hereby given that the Advanced Scientific Computing Advisory Committee will be renewed for a two-year period beginning on July 1, 2013. The Committee will provide advice to the Director, Office of Science (DOE), on the Advanced Scientific Computing Research Program managed...
Institutional shared resources and translational cancer research
De Paoli, Paolo
2009-01-01
The development and maintenance of adequate shared infrastructures is considered a major goal for academic centers promoting translational research programs. Among infrastructures favoring translational research, centralized facilities characterized by shared, multidisciplinary use of expensive laboratory instrumentation, or by complex computer hardware and software and/or by high professional skills are necessary to maintain or improve institutional scientific competitiveness. The success or failure of a shared resource program also depends on the choice of appropriate institutional policies and requires an effective institutional governance regarding decisions on staffing, existence and composition of advisory committees, policies and of defined mechanisms of reporting, budgeting and financial support of each resource. Shared Resources represent a widely diffused model to sustain cancer research; in fact, web sites from an impressive number of research Institutes and Universities in the U.S. contain pages dedicated to the SR that have been established in each Center, making a complete view of the situation impossible. However, a nation-wide overview of how Cancer Centers develop SR programs is available on the web site for NCI-designated Cancer Centers in the U.S., while in Europe, information is available for individual Cancer centers. This article will briefly summarize the institutional policies, the organizational needs, the characteristics, scientific aims, and future developments of SRs necessary to develop effective translational research programs in oncology. In fact, the physical build-up of SRs per se is not sufficient for the successful translation of biomedical research. Appropriate policies to improve the academic culture in collaboration, the availability of educational programs for translational investigators, the existence of administrative facilitations for translational research and an efficient organization supporting clinical trial recruitment and management represent essential tools, providing solutions to overcome existing barriers in the development of translational research in biomedical research centers. PMID:19563639
DePaolo, Donald J. (Director, Center for Nanoscale Control of Geologic CO2); NCGC Staff
2017-12-09
'Carbon in Underland' was submitted by the Center for Nanoscale Control of Geologic CO2 (NCGC) to the 'Life at the Frontiers of Energy Research' video contest at the 2011 Science for Our Nation's Energy Future: Energy Frontier Research Centers (EFRCs) Summit and Forum. Twenty-six EFRCs created short videos to highlight their mission and their work. This video was selected as one of five winners by a distinguished panel of judges for its 'entertaining animation and engaging explanations of carbon sequestration'. NCGC, an EFRC directed by Donald J. DePaolo at Lawrence Berkeley National Laboratory, is a partnership of scientists from seven institutions: LBNL (lead), Massachusetts Institute of Technology, Lawrence Livermore National Laboratory, Oak Ridge National Laboratory, University of California, Davis, Ohio State University, and Washington University in St. Louis. The Office of Basic Energy Sciences in the U.S. Department of Energy's Office of Science established the 46 Energy Frontier Research Centers (EFRCs) in 2009. These collaboratively-organized centers conduct fundamental research focused on 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The overall purpose is to accelerate scientific progress toward meeting the nation's critical energy challenges. The mission of the Center for Nanoscale Control of Geologic CO2 is 'to use new investigative tools, combined with experiments and computer simulations, to build a fundamental understanding of molecular-to-pore-scale processes in fluid-rock systems, and to demonstrate the ability to control critical aspects of flow, transport, and mineralization in porous rock media as applied to geologic sequestration of CO2'. Research topics are: bio-inspired, CO2 (store), greenhouse gas, and interfacial characterization.
Developing a Scientific Virtue-Based Approach to Science Ethics Training.
Pennock, Robert T; O'Rourke, Michael
2017-02-01
Responsible conduct of research training typically includes only a subset of the issues that ought to be included in science ethics and sometimes makes ethics appear to be a set of externally imposed rules rather than something intrinsic to scientific practice. A new approach to science ethics training based upon Pennock's notion of the scientific virtues may help avoid such problems. This paper motivates and describes three implementations (theory-centered, exemplar-centered, and concept-centered) that we have developed in courses and workshops to introduce students to this scientific virtue-based approach.
IYA Resources From The Harvard Smithsonian Center For Astrophysics
NASA Astrophysics Data System (ADS)
Reinfeld, Erika L.; Dussault, M. E.; Gould, R. R.; Steel, S. J.; Schneps, M. H.; Grainger, C. A.; Griswold, A.
2008-05-01
From museum exhibitions to professional development videos, the Science Education Department at the Harvard-Smithsonian Center for Astrophysics (CfA) has a long tradition of producing high quality education resources for students, teachers, and the public. This poster highlights new resources available to astronomers of all ages and backgrounds during the International Year of Astronomy. The MicroObservatory online telescope center will allow anyone with an email address to recapture the observations of Galileo on their own personal computers. The Beyond the Solar System professional development project follows in the footsteps of "A Private Universe" and "Minds of Our Own," providing new resources developed with the latest in scientific and educational research. And, in 2009, we will open a new traveling museum exhibition about black holes, featuring innovative new technologies, visualizations, and components designed with input from youth centers across the country. Learn more about these projects as the CfA continues to open the universe to new observers.
NASA Astrophysics Data System (ADS)
Altomare, Albino; Cesario, Eugenio; Mastroianni, Carlo
2016-10-01
The opportunity of using Cloud resources on a pay-as-you-go basis and the availability of powerful data centers and high bandwidth connections are speeding up the success and popularity of Cloud systems, making on-demand computing a common practice for enterprises and scientific communities. The reasons for this success include natural business distribution, the need for high availability and disaster tolerance, the sheer size of their computational infrastructure, and/or the desire to provide uniform access times to the infrastructure from widely distributed client sites. Nevertheless, the expansion of large data centers is resulting in a huge rise in the electrical power consumed by hardware facilities and cooling systems. The geographical distribution of data centers is becoming an opportunity: the variability of electricity prices, environmental conditions and client requests, both from site to site and over time, makes it possible to intelligently and dynamically (re)distribute the computational workload and achieve business goals as diverse as the reduction of costs, energy consumption and carbon emissions; the satisfaction of performance constraints; the adherence to Service Level Agreements established with users; and so on. This paper proposes an approach that helps to achieve the business goals established by the data center administrators. The workload distribution is driven by a fitness function, evaluated for each data center, which weighs some key parameters related to business objectives, among them the price of electricity, the carbon emission rate, and the balance of load among the data centers. For example, energy costs can be reduced by using a "follow the moon" approach, e.g. by migrating the workload to data centers where the price of electricity is lower at that time. Our approach uses data about historical usage of the data centers and data about environmental conditions to predict, with the help of regressive models, the values of the parameters of the fitness function, and then to appropriately tune the weights assigned to the parameters in accordance with the business goals. Preliminary experimental results, presented in this paper, show encouraging benefits.
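The fitness-function idea above lends itself to a compact illustration. The Python sketch below is a hypothetical, minimal rendering of the approach: the parameter names, weights, and sample data-center values are invented for illustration and are not the authors' implementation, which additionally predicts the parameter values with regressive models.

```python
# Minimal sketch of fitness-based workload placement across data centers.
# All names, weights, and sample values are hypothetical illustrations of
# the approach described in the abstract, not the authors' implementation.
from dataclasses import dataclass

@dataclass
class DataCenter:
    name: str
    electricity_price: float   # $/kWh (lower is better)
    carbon_rate: float         # kg CO2 per kWh (lower is better)
    load_fraction: float       # current utilization in [0, 1] (lower is better)

def fitness(dc: DataCenter, weights=(0.5, 0.3, 0.2)) -> float:
    """Weighted score of a data center; lower means a better placement target."""
    w_price, w_carbon, w_load = weights
    return (w_price * dc.electricity_price
            + w_carbon * dc.carbon_rate
            + w_load * dc.load_fraction)

def choose_target(centers: list[DataCenter]) -> DataCenter:
    """Pick the data center minimizing the fitness score ("follow the moon")."""
    return min(centers, key=fitness)

if __name__ == "__main__":
    centers = [
        DataCenter("eu-night", electricity_price=0.08, carbon_rate=0.25, load_fraction=0.40),
        DataCenter("us-day",   electricity_price=0.15, carbon_rate=0.45, load_fraction=0.30),
    ]
    print("Migrate workload to:", choose_target(centers).name)
```

Tuning the weight tuple is where the business goals enter: emphasizing the carbon term steers placement toward greener sites, while emphasizing the load term favors balancing over cost.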
76 FR 46308 - Center for Scientific Review; Notice of Closed Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-02
... Person: Ai-Ping Zou, MD, PHD, Scientific Review Officer,Center for Scientific Review, National Institutes..., Clinical Research, 93.306, 93.333, 93.337, 93.393-93.396, 93.837-93.844, 93.846-93.878, 93.892, 93.893...
78 FR 22893 - Center for Scientific Review; Notice of Closed Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-17
...). Contact Person: Ai-Ping Zou, M.D., Ph.D., Scientific Review Officer, Center for Scientific Review...; 93.333, Clinical Research, 93.306, 93.333, 93.337, 93.393-93.396, 93.837-93.844, 93.846-93.878, 93...
77 FR 16845 - Center for Scientific Review; Notice of Closed Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-22
... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel; Center: Mass Spectrometry Resource for Biology and Medicine. Date: April 1-3, 2012. Time: 8 p.m. to 12 p.m. Agenda: To...
NASA Astrophysics Data System (ADS)
Klimentov, A.; De, K.; Jha, S.; Maeno, T.; Nilsson, P.; Oleynik, D.; Panitkin, S.; Wells, J.; Wenaus, T.
2016-10-01
The LHC, operating at CERN, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe. ATLAS, one of the largest collaborations ever assembled in the sciences, is at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, the ATLAS experiment is relying on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses the PanDA (Production and Data Analysis) Workload Management System for managing the workflow for all data processing on over 150 data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. While PanDA currently uses more than 250,000 cores with a peak performance of 0.3 petaFLOPS, LHC data taking runs require more resources than the grid can possibly provide. To alleviate these challenges, LHC experiments are engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We will describe a project aimed at the integration of PanDA WMS with supercomputers in the United States, in particular with the Titan supercomputer at the Oak Ridge Leadership Computing Facility. The current approach utilizes a modified PanDA pilot framework for job submission to the supercomputers' batch queues and local data management, with light-weight MPI wrappers to run single-threaded workloads in parallel on the LCFs' multi-core worker nodes. This implementation was tested with a variety of Monte-Carlo workloads on several supercomputing platforms for the ALICE and ATLAS experiments and it has been in full production for ATLAS since September 2015. We will present our current accomplishments with running PanDA at supercomputers and demonstrate our ability to use PanDA as a portal independent of the computing facilities infrastructure for High Energy and Nuclear Physics as well as other data-intensive science applications, such as bioinformatics and astro-particle physics.
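A light-weight MPI wrapper of the kind described above can be sketched with mpi4py: each MPI rank launches one single-threaded payload as a subprocess, so many serial jobs share a single batch allocation on a supercomputer's multi-core nodes. The payload command, script name, and reporting logic below are hypothetical and are not the actual PanDA pilot code.

```python
# Minimal sketch of a light-weight MPI wrapper that fans independent,
# single-threaded payload commands out to MPI ranks. Command names are
# hypothetical; this is only an illustration of the pattern.
import subprocess
import sys
from mpi4py import MPI  # requires mpi4py and an MPI runtime

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

# One payload command per rank; hypothetical serial event-generation jobs.
payloads = [f"./run_payload.sh --seed {i}" for i in range(size)]

# Each rank runs exactly one single-threaded payload on its own core.
result = subprocess.run(payloads[rank], shell=True)

# Gather exit codes on rank 0 so the wrapper can report overall success.
codes = comm.gather(result.returncode, root=0)
if rank == 0:
    failed = [i for i, c in enumerate(codes) if c != 0]
    print(f"{size - len(failed)}/{size} payloads succeeded; failed ranks: {failed}")
    sys.exit(1 if failed else 0)
```

Submitted as one MPI job to the machine's batch queue, a wrapper like this lets the scheduler see a single large allocation instead of thousands of serial tasks, which is the point of the approach described in the abstract.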
The Swiss Data Science Center on a mission to empower reproducible, traceable and reusable science
NASA Astrophysics Data System (ADS)
Schymanski, Stanislaus; Bouillet, Eric; Verscheure, Olivier
2017-04-01
Our abilities to collect, store and analyse scientific data have sky-rocketed in the past decades, but at the same time, a disconnect between data scientists, domain experts and data providers has begun to emerge. Data scientists are developing more and more powerful algorithms for data mining and analysis, while data providers are making more and more data publicly available, and yet many, if not most, discoveries are based on specific data and/or algorithms that "are available from the authors upon request". In the strong belief that scientific progress would be much faster if reproduction and re-use of such data and algorithms was made easier, the Swiss Data Science Center (SDSC) has committed to provide an open framework for the handling and tracking of scientific data and algorithms, from raw data and first principle equations to final data products and visualisations, modular simulation models and benchmark evaluation algorithms. Led jointly by EPFL and ETH Zurich, the SDSC is composed of a distributed multi-disciplinary team of data scientists and experts in select domains. The center aims to federate data providers, data and computer scientists, and subject-matter experts around a cutting-edge analytics platform offering user-friendly tooling and services to help with the adoption of Open Science, fostering research productivity and excellence. In this presentation, we will discuss our vision of a highly scalable, open but secure community-based platform for sharing, accessing, exploring, and analyzing scientific data in easily reproducible workflows, augmented by automated provenance and impact tracking, knowledge graphs, fine-grained access rights and digital rights management, and a variety of domain-specific software tools. For maximum interoperability, transparency and ease of use, we plan to utilize notebook interfaces wherever possible, such as Apache Zeppelin and Jupyter. Feedback and suggestions from the audience will be gratefully considered.
78 FR 48880 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-12
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review... grant applications. Place: National Institutes of Health, 6701 Rockledge Drive, Bethesda, MD 20892... Scientific Review, National Institutes of Health, 6701 Rockledge Drive, Room 5194, MSC 7846, Bethesda, MD...
78 FR 38999 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-28
..., (Virtual Meeting). Contact Person: Elena Smirnova, Ph.D., Scientific Review Officer, Center for Scientific... of Health, 6701 Rockledge Drive, Bethesda, MD 20892, (Virtual Meeting). Contact Person: Kenneth A... Institutes of Health, 6701 Rockledge Drive, Bethesda, MD 20892, (Virtual Meeting). Contact Person: Krish...
77 FR 16248 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-20
... (Virtual Meeting). Contact Person: Kenneth A Roebuck, Ph.D., Scientific Review Officer, Center for... Rockledge Drive, Bethesda, MD 20892 (Virtual Meeting). Contact Person: Robert Freund, Ph.D., Scientific.... Place: National Institutes of Health, 6701 Rockledge Drive, Bethesda, MD 20892 (Virtual Meeting...
75 FR 4830 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-29
... for Scientific Review Special Emphasis Panel, Applications in Mechanisms of Emotion, Stress and Health... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review... applications. Place: National Institutes of Health, 6701 Rockledge Drive, Bethesda, MD 20892, (Telephone...
75 FR 16815 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-02
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review... grant applications. Place: National Institutes of Health, 6701 Rockledge Drive, Bethesda, MD 20892... Scientific Review, National Institutes of Health, 6701 Rockledge Drive, Room 6200, MSC 7804 (For courier...
Whole earth modeling: developing and disseminating scientific software for computational geophysics.
NASA Astrophysics Data System (ADS)
Kellogg, L. H.
2016-12-01
Historically, a great deal of specialized scientific software for modeling and data analysis has been developed by individual researchers or small groups of scientists working on their own specific research problems. As the magnitude of available data and computer power has increased, so has the complexity of scientific problems addressed by computational methods, creating both a need to sustain existing scientific software, and expand its development to take advantage of new algorithms, new software approaches, and new computational hardware. To that end, communities like the Computational Infrastructure for Geodynamics (CIG) have been established to support the use of best practices in scientific computing for solid earth geophysics research and teaching. Working as a scientific community enables computational geophysicists to take advantage of technological developments, improve the accuracy and performance of software, build on prior software development, and collaborate more readily. The CIG community, and others, have adopted an open-source development model, in which code is developed and disseminated by the community in an open fashion, using version control and software repositories like Git. One emerging issue is how to adequately identify and credit the intellectual contributions involved in creating open source scientific software. The traditional method of disseminating scientific ideas, peer reviewed publication, was not designed for reviewing or crediting scientific software, although emerging publication strategies such as software journals are attempting to address the need. We are piloting an integrated approach in which authors are identified and credited as scientific software is developed and run. Successful software citation requires integration with the scholarly publication and indexing mechanisms as well, to assign credit, ensure discoverability, and provide provenance for software.
Integrating Data Base into the Elementary School Science Program.
ERIC Educational Resources Information Center
Schlenker, Richard M.
This document describes seven science activities that combine scientific principles and computers. The objectives for the activities are to show students how the computer can be used as a tool to store and arrange scientific data, provide students with experience using the computer as a tool to manage scientific data, and provide students with…
78 FR 2681 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-14
...-435-1212, [email protected] . Name of Committee: Immunology Integrated Review Group; Innate Immunity... Scientific Review Special Emphasis Panel; Member Conflicts: Pain and Hearing Date: February 12-13, 2013. Time... Committee: Center for Scientific Review Special Emphasis Panel; Member Conflict: Radiation Oncology. Date...
77 FR 55851 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-11
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review... grant applications. Place: National Institutes of Health, 6701 Rockledge Drive, Bethesda, MD 20892... Scientific Review, National Institutes of Health, 6701 Rockledge Drive, Room 5217A, MSC 7846, Bethesda, MD...
78 FR 32670 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-31
... Scientific Review Special Emphasis Panel; Fellowships: Brain Disorders, Language, Communication, and Related...: Center for Scientific Review Special Emphasis Panel; PAR Panel: Brain Disorders in the Developing World... Review Special Emphasis Panel; Small Business: Health Informatics. Date: June 28, 2013. Time: 8:30 a.m...
76 FR 60058 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-28
... for Scientific Review Special Emphasis Panel, RFA Panel: Understanding and Promoting Health Literacy... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review... Review, National Institutes of Health, 6701 Rockledge Drive, Room 4140, MSC 7814, Bethesda, MD 20892, 301...
77 FR 35415 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-13
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review... Review, National Institutes of Health, 6701 Rockledge Drive, Room 5182, MSC 7844, Bethesda, MD 20892... for Scientific Review, National Institutes of Health, 6701 Rockledge Drive, Room 5216, MSC 7852...
77 FR 48527 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-14
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review... Scientific Review, National Institutes of Health, 6701 Rockledge Drive, Room 4150, MSC 7806, Bethesda, MD... Institutes of Health, 6701 Rockledge Drive, Bethesda, MD 20892, (Telephone Conference Call). Contact Person...
78 FR 40756 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-08
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review... evaluate grant applications. Place: National Institutes of Health, 6701 Rockledge Drive, Bethesda, MD 20892... Scientific Review, National Institutes of Health, 6701 Rockledge Drive, Room 5186, MSC 7846, Bethesda, MD...
Spin-Off Successes of SETI Research at Berkeley
NASA Astrophysics Data System (ADS)
Douglas, K. A.; Anderson, D. P.; Bankay, R.; Chen, H.; Cobb, J.; Korpela, E. J.; Lebofsky, M.; Parsons, A.; von Korff, J.; Werthimer, D.
2009-12-01
Our group contributes to the Search for Extra-Terrestrial Intelligence (SETI) by developing and using world-class signal processing computers to analyze data collected on the Arecibo telescope. Although no patterned signal of extra-terrestrial origin has yet been detected, and the immediate prospects for making such a detection are highly uncertain, the SETI@home project has nonetheless proven the value of pursuing such research through its impact on the fields of distributed computing, real-time signal processing, and radio astronomy. The SETI@home project has spun off the Center for Astronomy Signal Processing and Electronics Research (CASPER) and the Berkeley Open Infrastructure for Networked Computing (BOINC), both of which are responsible for catalyzing a smorgasbord of new research in scientific disciplines in countries around the world. Furthermore, the data collected and archived for the SETI@home project is proving valuable in data-mining experiments for mapping neutral galactic hydrogen and for detecting black-hole evaporation.
A high performance scientific cloud computing environment for materials simulations
NASA Astrophysics Data System (ADS)
Jorissen, K.; Vila, F. D.; Rehr, J. J.
2012-09-01
We describe the development of a scientific cloud computing (SCC) platform that offers high performance computation capability. The platform consists of a scientific virtual machine prototype containing a UNIX operating system and several materials science codes, together with essential interface tools (an SCC toolset) that offers functionality comparable to local compute clusters. In particular, our SCC toolset provides automatic creation of virtual clusters for parallel computing, including tools for execution and monitoring performance, as well as efficient I/O utilities that enable seamless connections to and from the cloud. Our SCC platform is optimized for the Amazon Elastic Compute Cloud (EC2). We present benchmarks for prototypical scientific applications and demonstrate performance comparable to local compute clusters. To facilitate code execution and provide user-friendly access, we have also integrated cloud computing capability in a JAVA-based GUI. Our SCC platform may be an alternative to traditional HPC resources for materials science or quantum chemistry applications.
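As a rough illustration of the kind of automatic virtual-cluster creation an SCC toolset provides, the boto3 sketch below launches a handful of EC2 instances from a machine image. The AMI ID, key pair, instance type, and follow-on steps are placeholders; this is not the authors' toolset, only a sketch of the provisioning step under those assumptions.

```python
# Minimal sketch of provisioning a small virtual cluster on EC2.
# The AMI ID, key pair, and instance type are hypothetical placeholders.
import boto3

ec2 = boto3.resource("ec2", region_name="us-east-1")

instances = ec2.create_instances(
    ImageId="ami-0123456789abcdef0",   # hypothetical scientific virtual machine image
    InstanceType="c5.xlarge",
    MinCount=4, MaxCount=4,            # a 4-node virtual cluster
    KeyName="scc-keypair",             # hypothetical SSH key pair
)

for inst in instances:
    inst.wait_until_running()
    inst.reload()
    print(inst.id, inst.private_ip_address)

# A follow-on step (not shown) would write an MPI hostfile from the private
# IPs and start the parallel materials-science code across the new nodes.
```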
Leveraging Python Interoperability Tools to Improve Sapphire's Usability
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gezahegne, A; Love, N S
2007-12-10
The Sapphire project at the Center for Applied Scientific Computing (CASC) develops and applies an extensive set of data mining algorithms for the analysis of large data sets. Sapphire's algorithms are currently available as a set of C++ libraries. However many users prefer higher level scripting languages such as Python for their ease of use and flexibility. In this report, we evaluate four interoperability tools for the purpose of wrapping Sapphire's core functionality with Python. Exposing Sapphire's functionality through a Python interface would increase its usability and connect its algorithms to existing Python tools.
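For context, one of the lighter-weight routes for exposing C++ functionality to Python is ctypes from the standard library. The sketch below wraps a hypothetical extern "C" entry point; the library name and function signature are invented and do not correspond to Sapphire's actual C++ API, so this only illustrates the wrapping pattern such an evaluation would weigh against generator-based tools.

```python
# Minimal sketch of calling a C++ shared library from Python via ctypes.
# The library and function below are hypothetical, not Sapphire's API.
import ctypes

# Assume the C++ side exports an extern "C" entry point such as:
#   extern "C" double mean_of(const double* values, int n);
lib = ctypes.CDLL("./libexample_datamining.so")   # hypothetical shared library
lib.mean_of.argtypes = [ctypes.POINTER(ctypes.c_double), ctypes.c_int]
lib.mean_of.restype = ctypes.c_double

def mean_of(values):
    """Python-friendly wrapper that marshals a list into a C double array."""
    arr = (ctypes.c_double * len(values))(*values)
    return lib.mean_of(arr, len(values))

print(mean_of([1.0, 2.0, 3.0]))   # expected: 2.0, given the assumed C function
```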
Spacelab data analysis and interactive control study
NASA Technical Reports Server (NTRS)
Tarbell, T. D.; Drake, J. F.
1980-01-01
The study consisted of two main tasks, a series of interviews of Spacelab users and a survey of data processing and display equipment. Findings from the user interviews on questions of interactive control, downlink data formats, and Spacelab computer software development are presented. Equipment for quick look processing and display of scientific data in the Spacelab Payload Operations Control Center (POCC) was surveyed. Results of this survey effort are discussed in detail, along with recommendations for NASA development of several specific display systems which meet common requirements of many Spacelab experiments.
Katz, Ralph V; Kegeles, S Stephen; Green, B Lee; Kressin, Nancy R; James, Sherman A; Claudio, Cristina
2003-01-01
This article is intended to provide a relatively complete picture of how a pilot study--conceived and initiated within an NIDCR-funded RRCMOH--matured into a solid line of investigation within that center and "with legs" into a fully funded study within the next generation of NIDCR centers on this topic of health disparities, the Centers for Research to Reduce Oral Health Disparities. It highlights the natural opportunity that these centers provide for multicenter, cross-disciplinary research and for research career pipelining for college and dental school students; with a focus, in this case, on minority students. Furthermore, this series of events demonstrates the rich potential that these types of research centers have to contribute in ways that far exceed the scientific outcomes that form their core. In this instance, the NMOHRC played a central--and critical, if unanticipated--role in contributing to two events of national significance, namely the presidential apology to the African American community for the research abuses of the USPHS--Tuskegee syphilis study and the establishment of the National Center for Bioethics in Research and Health Care at Tuskegee University. Research Centers supported by the NIH are fully intended to create a vortex of scientific activity that goes well beyond the direct scientific aims of the studies initially funded within those centers. The maxim is that the whole should be greater than the sum of its initial constituent studies or parts. We believe that NMOHRC did indeed achieve that maxim--even extending "the whole" to include broad societal impact, well beyond the scope of important, but mere, scientific outcomes--all within the concept and appropriate functions of a scientific NIH-funded research center.
Optimizing CyberShake Seismic Hazard Workflows for Large HPC Resources
NASA Astrophysics Data System (ADS)
Callaghan, S.; Maechling, P. J.; Juve, G.; Vahi, K.; Deelman, E.; Jordan, T. H.
2014-12-01
The CyberShake computational platform is a well-integrated collection of scientific software and middleware that calculates 3D simulation-based probabilistic seismic hazard curves and hazard maps for the Los Angeles region. Currently each CyberShake model comprises about 235 million synthetic seismograms from about 415,000 rupture variations computed at 286 sites. CyberShake integrates large-scale parallel and high-throughput serial seismological research codes into a processing framework in which early stages produce files used as inputs by later stages. Scientific workflow tools are used to manage the jobs, data, and metadata. The Southern California Earthquake Center (SCEC) developed the CyberShake platform using USC High Performance Computing and Communications systems and open-science NSF resources. CyberShake calculations were migrated to the NSF Track 1 system NCSA Blue Waters when it became operational in 2013, via an interdisciplinary team approach including domain scientists, computer scientists, and middleware developers. Due to the excellent performance of Blue Waters and CyberShake software optimizations, we reduced the makespan (a measure of wallclock time-to-solution) of a CyberShake study from 1467 to 342 hours. We will describe the technical enhancements behind this improvement, including judicious introduction of new GPU software, improved scientific software components, increased workflow-based automation, and Blue Waters-specific workflow optimizations. Our CyberShake performance improvements highlight the benefits of scientific workflow tools. The CyberShake workflow software stack includes the Pegasus Workflow Management System (Pegasus-WMS, which includes Condor DAGMan), HTCondor, and Globus GRAM, with Pegasus-mpi-cluster managing the high-throughput tasks on the HPC resources. The workflow tools handle data management, automatically transferring about 13 TB back to SCEC storage. We will present performance metrics from the most recent CyberShake study, executed on Blue Waters. We will compare the performance of CPU and GPU versions of our large-scale parallel wave propagation code, AWP-ODC-SGT. Finally, we will discuss how these enhancements have enabled SCEC to move forward with plans to increase the CyberShake simulation frequency to 1.0 Hz.
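The essence of the workflow approach, in which early stages produce files consumed by later stages, can be sketched as a topological sort over a dependency graph. The stage names below are simplified stand-ins for CyberShake's actual jobs; a production engine such as Pegasus-WMS additionally handles data staging, retries, and task clustering.

```python
# Minimal sketch of ordering CyberShake-like stages by their dependencies.
# Stage names are simplified illustrations, not the platform's actual job names.
from graphlib import TopologicalSorter

# Each stage lists the stages whose outputs it consumes.
stages = {
    "velocity_model": set(),
    "sgt_simulation": {"velocity_model"},          # large parallel wave propagation job
    "seismogram_synthesis": {"sgt_simulation"},    # high-throughput serial tasks
    "hazard_curve": {"seismogram_synthesis"},
    "hazard_map": {"hazard_curve"},
}

for stage in TopologicalSorter(stages).static_order():
    print("run:", stage)   # a real workflow engine would also stage files and retry failures
```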
Dehydration of 1-octadecanol over H-BEA: A combined experimental and computational study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Song, Wenji; Liu, Yuanshuai; Barath, Eszter
Liquid phase dehydration of 1-octadecanol, which is intermediately formed during the hydrodeoxygenation of microalgae oil, has been explored in a combined experimental and computational study. The alkyl chain of the C18 alcohol interacts with acid sites during diffusion inside the zeolite pores, resulting in an inefficient utilization of the Brønsted acid sites for samples with high acid site concentrations. The parallel intra- and intermolecular dehydration pathways having different activation energies pass through alternative reaction intermediates. Formation of surface-bound alkoxide species is the rate-limiting step during intramolecular dehydration, whereas intermolecular dehydration proceeds via a bulky dimer intermediate. Octadecene is the primary dehydration product over H-BEA at 533 K. Despite the main contribution of Brønsted acid sites towards both dehydration pathways, Lewis acid sites are also active in the formation of dioctadecyl ether. The intramolecular dehydration to octadecene and cleavage of the intermediately formed ether, however, require strong BAS. L. Wang, D. Mei and J. A. Lercher acknowledge the partial support from the US Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences & Biosciences. Pacific Northwest National Laboratory (PNNL) is a multiprogram national laboratory operated for DOE by Battelle. Computing time was granted by the grand challenge of computational catalysis of the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) and by the National Energy Research Scientific Computing Center (NERSC). EMSL is a national scientific user facility located at Pacific Northwest National Laboratory (PNNL) and sponsored by DOE’s Office of Biological and Environmental Research.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-08
... Scientific Counselors, Office of Public Health Preparedness and Response, Board of Scientific Counselors (BSC... Director, Centers for Disease Control and Prevention (CDC), and the Director, Office of Public Health... Public Health Practice Executive Assistant, Centers for Disease Control and Prevention, 1600 Clifton Road...
78 FR 51735 - Center For Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-21
...: Biological Chemistry and Macromolecular Biophysics. Date: August 26, 2013. Time: 11:00 a.m. to 12:00 p.m... Drive, Bethesda, MD 20892 (Telephone Conference Call). Contact Person: Nitsa Rosenzweig, Ph.D... Conference Call). Contact Person: Aftab A Ansari, Ph.D., Scientific Review Officer, Center for Scientific...
76 FR 38405 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-30
..., 6701 Rockledge Drive, Bethesda, MD 20892 (Virtual Meeting). Contact Person: Michael H Chaitin, PhD... (Virtual Meeting). Contact Person: David L Williams, PhD, Scientific Review Officer, Center for Scientific... Health, 6701 Rockledge Drive, Bethesda, MD 20817 (Virtual Meeting). Contact Person: David L Williams, PhD...
76 FR 35900 - Center For Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-20
...: Child Psychopathology and Developmental Disabilities. Date: July 5, 2011. Time: 1 p.m. to 3:30 p.m... Drive, Bethesda, MD 20892, (Telephone Conference Call). Contact Person: Melissa Gerald, PhD, Scientific Review Officer, Center for Scientific Review, National Institutes of Health, 6701 Rockledge Drive, Room...
75 FR 65020 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-21
...: Psychopathology, Developmental Disabilities, Stress and Aging. Date: November 12, 2010. Time: 8 a.m. to 5 p.m... Scientific Review, National Institutes of Health, 6701 Rockledge Drive, Room 3182, MSC 7759, Bethesda, MD..., Center for Scientific Review, National Institutes of Health, 6701 Rockledge Drive, Room 5208, MSC 7852...
Sim, Ida; Tu, Samson W.; Carini, Simona; Lehmann, Harold P.; Pollock, Brad H.; Peleg, Mor; Wittkowski, Knut M.
2013-01-01
To date, the scientific process for generating, interpreting, and applying knowledge has received less informatics attention than operational processes for conducting clinical studies. The activities of these scientific processes — the science of clinical research — are centered on the study protocol, which is the abstract representation of the scientific design of a clinical study. The Ontology of Clinical Research (OCRe) is an OWL 2 model of the entities and relationships of study design protocols for the purpose of computationally supporting the design and analysis of human studies. OCRe’s modeling is independent of any specific study design or clinical domain. It includes a study design typology and a specialized module called ERGO Annotation for capturing the meaning of eligibility criteria. In this paper, we describe the key informatics use cases of each phase of a study’s scientific lifecycle, present OCRe and the principles behind its modeling, and describe applications of OCRe and associated technologies to a range of clinical research use cases. OCRe captures the central semantics that underlies the scientific processes of clinical research and can serve as an informatics foundation for supporting the entire range of knowledge activities that constitute the science of clinical research. PMID:24239612
Scientific Visualization & Modeling for Earth Systems Science Education
NASA Technical Reports Server (NTRS)
Chaudhury, S. Raj; Rodriguez, Waldo J.
2003-01-01
Providing research experiences for undergraduate students in Earth Systems Science (ESS) poses several challenges at smaller academic institutions that might lack dedicated resources for this area of study. This paper describes the development of an innovative model that involves students with majors in diverse scientific disciplines in authentic ESS research. In studying global climate change, experts typically use scientific visualization techniques applied to remote sensing data collected by satellites. In particular, many problems related to environmental phenomena can be quantitatively addressed by investigations based on datasets related to the scientific endeavours such as the Earth Radiation Budget Experiment (ERBE). Working with data products stored at NASA's Distributed Active Archive Centers, visualization software specifically designed for students and an advanced, immersive Virtual Reality (VR) environment, students engage in guided research projects during a structured 6-week summer program. Over the 5-year span, this program has afforded the opportunity for students majoring in biology, chemistry, mathematics, computer science, physics, engineering and science education to work collaboratively in teams on research projects that emphasize the use of scientific visualization in studying the environment. Recently, a hands-on component has been added through science student partnerships with school-teachers in data collection and reporting for the GLOBE Program (GLobal Observations to Benefit the Environment).
Constructing Scientific Arguments Using Evidence from Dynamic Computational Climate Models
ERIC Educational Resources Information Center
Pallant, Amy; Lee, Hee-Sun
2015-01-01
Modeling and argumentation are two important scientific practices students need to develop throughout school years. In this paper, we investigated how middle and high school students (N = 512) construct a scientific argument based on evidence from computational models with which they simulated climate change. We designed scientific argumentation…
Hera - The HEASARC's New Data Analysis Service
NASA Technical Reports Server (NTRS)
Pence, William
2006-01-01
Hera is the new computer service provided by the HEASARC at the NASA Goddard Space Flight Center that enables qualified student and professional astronomical researchers to immediately begin analyzing scientific data from high-energy astrophysics missions. All the necessary resources needed to do the data analysis are freely provided by Hera, including: * the latest version of the hundreds of scientific analysis programs in the HEASARC's HEASOFT package, as well as most of the programs in the Chandra CIAO package and the XMM-Newton SAS package. * high speed access to the terabytes of data in the HEASARC's high energy astrophysics Browse data archive. * a cluster of fast Linux workstations to run the software. * ample local disk space to temporarily store the data and results. Some of the many features and different modes of using Hera are illustrated in this poster presentation.
Parallel processing for scientific computations
NASA Technical Reports Server (NTRS)
Alkhatib, Hasan S.
1995-01-01
The scope of this project dealt with the investigation of the requirements to support distributed computing of scientific computations over a cluster of cooperative workstations. Various experiments on computations for the solution of simultaneous linear equations were performed in the early phase of the project to gain experience in the general nature and requirements of scientific applications. A specification of a distributed integrated computing environment, DICE, based on a distributed shared memory communication paradigm has been developed and evaluated. The distributed shared memory model facilitates porting existing parallel algorithms that have been designed for shared memory multiprocessor systems to the new environment. The potential of this new environment is to provide supercomputing capability through the utilization of the aggregate power of workstations cooperating in a cluster interconnected via a local area network. Workstations, generally, do not have the computing power to tackle complex scientific applications, making them primarily useful for visualization, data reduction, and filtering as far as complex scientific applications are concerned. There is a tremendous amount of computing power that is left unused in a network of workstations. Very often a workstation is simply sitting idle on a desk. A set of tools can be developed to take advantage of this potential computing power to create a platform suitable for large scientific computations. The integration of several workstations into a logical cluster of distributed, cooperative, computing stations presents an alternative to shared memory multiprocessor systems. In this project we designed and evaluated such a system.
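The abstract above predates today's toolchains, but the underlying idea of farming row blocks of a linear-system solve out to cooperating workers can be sketched briefly. The following is a minimal illustration (not the DICE system) using Python's multiprocessing module to run a Jacobi iteration in parallel; the matrix size and worker count are arbitrary placeholders.

```python
import numpy as np
from multiprocessing import Pool

def update_block(args):
    """Jacobi update for one block of rows: x_i = (b_i - sum_{j!=i} A_ij x_j) / A_ii."""
    A_blk, b_blk, x, rows = args
    diag = A_blk[np.arange(len(rows)), rows]
    return (b_blk - A_blk @ x + diag * x[rows]) / diag

def parallel_jacobi(A, b, n_workers=4, iters=200):
    n = len(b)
    blocks = np.array_split(np.arange(n), n_workers)   # one row block per worker
    x = np.zeros(n)
    with Pool(n_workers) as pool:
        for _ in range(iters):
            tasks = [(A[r], b[r], x, r) for r in blocks]
            x = np.concatenate(pool.map(update_block, tasks))
    return x

if __name__ == "__main__":
    n = 400
    A = np.random.rand(n, n) + n * np.eye(n)   # diagonally dominant, so Jacobi converges
    b = np.random.rand(n)
    x = parallel_jacobi(A, b)
    print(np.allclose(A @ x, b, atol=1e-6))
```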
NASA Astrophysics Data System (ADS)
Schnase, J. L.; Duffy, D. Q.; Tamkin, G. S.; Strong, S.; Ripley, D.; Gill, R.; Sinno, S. S.; Shen, Y.; Carriere, L. E.; Brieger, L.; Moore, R.; Rajasekar, A.; Schroeder, W.; Wan, M.
2011-12-01
Scientific data services are becoming an important part of the NASA Center for Climate Simulation's mission. Our technological response to this expanding role is built around the concept of specialized virtual climate data servers, repetitive cloud provisioning, image-based deployment and distribution, and virtualization-as-a-service. A virtual climate data server (vCDS) is an OAIS-compliant, iRODS-based data server designed to support a particular type of scientific data collection. iRODS is data grid middleware that provides policy-based control over collection-building, managing, querying, accessing, and preserving large scientific data sets. We have developed prototype vCDSs to manage NetCDF, HDF, and GeoTIFF data products. We use RPM scripts to build vCDS images in our local computing environment, our local Virtual Machine Environment, NASA's Nebula Cloud Services, and Amazon's Elastic Compute Cloud. Once provisioned into these virtualized resources, multiple vCDSs can use iRODS's federation and realized object capabilities to create an integrated ecosystem of data servers that can scale and adapt to changing requirements. This approach enables platform- or software-as-a-service deployment of the vCDSs and allows the NCCS to offer virtualization-as-a-service, a capacity to respond in an agile way to new customer requests for data services, and a path for migrating existing services into the cloud. We have registered MODIS Atmosphere data products in a vCDS that contains 54 million registered files, 630 TB of data, and over 300 million metadata values. We are now assembling IPCC AR5 data into a production vCDS that will provide the platform upon which NCCS's Earth System Grid (ESG) node publishes to the extended science community. In this talk, we describe our approach, experiences, lessons learned, and plans for the future.
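As a rough illustration of the iRODS collection-building workflow described above, the sketch below uses the python-irodsclient package to create a collection, register a file, and attach metadata. The hostname, credentials, zone, and file names are placeholders, this is not the NCCS vCDS provisioning code, and method details may vary across client versions.

```python
from irods.session import iRODSSession

# All hostnames, credentials, and paths below are placeholders.
with iRODSSession(host="vcds.example.org", port=1247,
                  user="rods", password="secret", zone="tempZone") as session:
    coll = session.collections.create("/tempZone/home/rods/modis_atmosphere")

    # Register a local NetCDF granule into the collection and attach searchable metadata.
    local_file = "MOD08_D3.A2011001.nc"            # hypothetical granule name
    obj_path = f"{coll.path}/{local_file}"
    session.data_objects.put(local_file, obj_path)
    obj = session.data_objects.get(obj_path)
    obj.metadata.add("product", "MODIS Atmosphere Daily L3")
    obj.metadata.add("format", "NetCDF")
```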
Astronauts Working in Spacelab
NASA Technical Reports Server (NTRS)
1999-01-01
This QuickTime movie captures astronaut Jan Davis and her fellow crew members working in the Spacelab, a versatile laboratory carried in the Space Shuttle's cargo bay for special research flights. Its various elements could be combined to accommodate the many types of scientific research that can best be performed in space. Spacelab consisted of an enclosed, pressurized laboratory module and open U-shaped pallets located at the rear of the laboratory module. The laboratory module contained utilities, computers, work benches, and instrument racks to conduct scientific experiments in astronomy, physics, chemistry, biology, medicine, and engineering. Equipment, such as telescopes, antennas, and sensors, was mounted on pallets for direct exposure to space. A 1-meter (3.3-ft) diameter aluminum tunnel, resembling a z-shaped tube, connected the crew compartment (mid deck) to the module. The reusable Spacelab allowed scientists to bring experiment samples back to Earth for post-flight analysis. Spacelab was a cooperative venture of the European Space Agency (ESA) and NASA. ESA was responsible for funding, developing, and building Spacelab, while NASA was responsible for the launch and operational use of Spacelab. Spacelab missions were cooperative efforts between scientists and engineers from around the world. Teams from NASA centers, universities, private industry, government agencies, and international space organizations designed the experiments. The Marshall Space Flight Center was NASA's lead center for monitoring the development of Spacelab and managing the program.
Gigantic Wave Discovered in Perseus Galaxy Cluster
2017-12-08
Combining data from NASA's Chandra X-ray Observatory with radio observations and computer simulations, an international team of scientists has discovered a vast wave of hot gas in the nearby Perseus galaxy cluster. Spanning some 200,000 light-years, the wave is about twice the size of our own Milky Way galaxy. The researchers say the wave formed billions of years ago, after a small galaxy cluster grazed Perseus and caused its vast supply of gas to slosh around an enormous volume of space. "Perseus is one of the most massive nearby clusters and the brightest one in X-rays, so Chandra data provide us with unparalleled detail," said lead scientist Stephen Walker at NASA's Goddard Space Flight Center in Greenbelt, Maryland. "The wave we've identified is associated with the flyby of a smaller cluster, which shows that the merger activity that produced these giant structures is still ongoing." Credit: NASA's Goddard Space Flight Center/Stephen Walker
NASA Astrophysics Data System (ADS)
Cox, S. J.; Wyborn, L. A.; Fraser, R.; Rankine, T.; Woodcock, R.; Vote, J.; Evans, B.
2012-12-01
The Virtual Geophysics Laboratory (VGL) is a web portal that provides geoscientists with an integrated online environment that: seamlessly accesses geophysical and geoscience data services from the AuScope national geoscience information infrastructure; loosely couples these data to a variety of geoscience software tools; and provides large-scale processing facilities via cloud computing. VGL is a collaboration between CSIRO, Geoscience Australia, National Computational Infrastructure, Monash University, Australian National University, and the University of Queensland. The VGL provides a distributed system whereby a user can enter an online virtual laboratory to seamlessly connect to OGC web services for geoscience data. The data are supplied in open standards formats using international standards like GeoSciML. A VGL user uses a web mapping interface to discover and filter the data sources, applying spatial and attribute filters to define a subset. Once the data are selected, the user is not required to download them. VGL collates the service query information for later in the processing workflow, where it will be staged directly to the computing facilities. The combination of deferring data download and access to cloud computing enables VGL users to access their data at higher resolutions and to undertake larger-scale inversions, more complex models, and simulations than their own local computing facilities might allow. Inside the Virtual Geophysics Laboratory, the user has access to a library of existing models, complete with exemplar workflows for specific scientific problems based on those models. For example, the user can load a geological model published by Geoscience Australia, apply a basic deformation workflow provided by a CSIRO scientist, and have it run in a scientific code from Monash. Finally, the user can publish these results to share with a colleague or cite in a paper. This opens new opportunities for access and collaboration, as all the resources (models, code, data, processing) are shared in the one virtual laboratory. VGL provides end users with access to an intuitive, user-centered interface that leverages cloud storage and cloud and cluster processing from both research communities and commercial suppliers (e.g., Amazon). As the underlying data and information services are agnostic of the scientific domain, they can support many other data types. This fundamental characteristic results in a highly reusable virtual laboratory infrastructure that could also be used for, for example, natural hazards, satellite processing, soil geochemistry, climate modeling, and agricultural crop modeling.
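The deferred-download pattern described above amounts to composing an OGC WFS GetFeature query and staging the query, rather than the data, to the processing step. A minimal sketch of such a request follows; the endpoint URL and GeoSciML feature type are placeholders, not VGL's actual AuScope services.

```python
import requests

# Placeholder endpoint and feature type; the actual AuScope/VGL service URLs and
# GeoSciML type names differ. The request is only prepared, not sent, to show its shape.
params = {
    "service": "WFS",
    "version": "1.1.0",
    "request": "GetFeature",
    "typeName": "gsml:MappedFeature",              # a GeoSciML feature type
    "bbox": "130.0,-32.0,135.0,-28.0,EPSG:4326",   # spatial filter
    "maxFeatures": "50",
}
req = requests.Request("GET", "https://geology.example.org/wfs", params=params).prepare()

# In a VGL-style workflow this query string, not the downloaded data, is what gets
# staged to the cloud processing step.
print(req.url)
```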
Origin of Marshall Space Flight Center (MSFC)
2004-04-15
Twelve scientific specialists of the Peenemuende team at the front of Building 4488, Redstone Arsenal, Huntsville, Alabama. They led the Army's space efforts at ABMA before transfer of the team to the National Aeronautics and Space Administration (NASA), George C. Marshall Space Flight Center (MSFC). (Left to right) Dr. Ernst Stuhlinger, Director, Research Projects Office; Dr. Helmut Hoelzer, Director, Computation Laboratory; Karl L. Heimburg, Director, Test Laboratory; Dr. Ernst Geissler, Director, Aeroballistics Laboratory; Erich W. Neubert, Director, Systems Analysis Reliability Laboratory; Dr. Walter Haeussermann, Director, Guidance and Control Laboratory; Dr. Wernher von Braun, Director, Development Operations Division; William A. Mrazek, Director, Structures and Mechanics Laboratory; Hans Hueter, Director, System Support Equipment Laboratory; Eberhard Rees, Deputy Director, Development Operations Division; Dr. Kurt Debus, Director, Missile Firing Laboratory; Hans H. Maus, Director, Fabrication and Assembly Engineering Laboratory
Relativistic Collisions of Highly-Charged Ions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ionescu, Dorin; Belkacem, Ali
1998-11-19
The physics of elementary atomic processes in relativistic collisions between highly-charged ions and atoms or other ions is briefly discussed, and some recent theoretical and experimental results in this field are summarized. They include excitation, capture, ionization, and electron-positron pair creation. The numerical solution of the two-center Dirac equation in momentum space is shown to be a powerful nonperturbative method for describing atomic processes in relativistic collisions involving heavy and highly-charged ions. By propagating negative-energy wave packets in time, the evolution of the QED vacuum around heavy ions in relativistic motion is investigated. Recent results obtained from numerical calculations using massively parallel processing on the Cray T3E supercomputer of the National Energy Research Scientific Computing Center (NERSC) at Berkeley National Laboratory are presented.
Implementation and Challenges of the Tsunami Warning System in the Western Mediterranean
NASA Astrophysics Data System (ADS)
Schindelé, F.; Gailler, A.; Hébert, H.; Loevenbruck, A.; Gutierrez, E.; Monnier, A.; Roudil, P.; Reymond, D.; Rivera, L.
2015-03-01
The French Tsunami Warning Center (CENALT) has been in operation since 2012. It is contributing to the North-eastern Atlantic and Mediterranean (NEAM) tsunami warning and mitigation system coordinated by the United Nations Educational, Scientific, and Cultural Organization, and benefits from data exchange with several foreign institutes. This center is supported by the French Government and provides French civil-protection authorities and member states of the NEAM region with relevant messages for assessing potential tsunami risk when an earthquake has occurred in the Western Mediterranean Sea or the Northeastern Atlantic Ocean. To achieve its objectives, CENALT has developed a series of innovative techniques based on recent research results in seismology for early tsunami warning, monitoring of sea level variations and detection capability, and effective numerical computation of ongoing tsunamis.
78 FR 21959 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-12
... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel; Member Conflict... Panel; RFA DA13-003: Tobacco Centers of Regulatory Science for Research, Relevant to the Family Smoking... Special Emphasis Panel; Member Conflict: Behavioral Interventions to Address Multiple Chronic Health...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-08
... DEPARTMENT OF HEALTH AND HUMAN SERVICES Centers for Disease Control and Prevention Board of Scientific Counselors, National Center for Injury Prevention and Control, (BSC, NCIPC) In accordance with Section 10(a)(2) of the Federal Advisory Committee Act (Pub. L. 92-463), the Centers for Disease Control and Prevention (CDC) announces, the followin...
NASA Technical Reports Server (NTRS)
1998-01-01
This report highlights the challenging work accomplished during fiscal year 1997 by Ames research scientists and engineers. The work is divided into accomplishments that support the goals of NASA's four Strategic Enterprises: Aeronautics and Space Transportation Technology, Space Science, Human Exploration and Development of Space (HEDS), and Earth Science. NASA Ames Research Center's research effort in the Space, Earth, and HEDS Enterprises is focused in large part on supporting Ames' lead role for Astrobiology, which, broadly defined, is the scientific study of the origin, distribution, and future of life in the universe. This NASA initiative in Astrobiology is a broad science effort embracing basic research, technology development, and flight missions. Ames' contributions to the Space Science Enterprise are focused in the areas of exobiology, planetary systems, astrophysics, and space technology. Ames supports the Earth Science Enterprise by conducting research and by developing technology with the objective of expanding our knowledge of the Earth's atmosphere and ecosystems. Finally, Ames supports the HEDS Enterprise by conducting research, managing spaceflight projects, and developing technologies. A key objective is to understand the phenomena surrounding the effects of gravity on living things. Ames has also been designated the Agency's Center of Excellence for Information Technology. The three cornerstones of Information Technology research at Ames are automated reasoning, human-centered computing, and high performance computing and networking.
Ultra-low-energy analog straintronics using multiferroic composites
NASA Astrophysics Data System (ADS)
Roy, Kuntal
2014-03-01
Multiferroic devices, i.e., a magnetostrictive nanomagnet strain-coupled with a piezoelectric layer, are promising as binary switches for ultra-low-energy digital computing in the beyond-Moore's-law era [Roy, K. Appl. Phys. Lett. 103, 173110 (2013), Roy, K. et al. Appl. Phys. Lett. 99, 063108 (2011), Phys. Rev. B 83, 224412 (2011), Scientific Reports (Nature Publishing Group) 3, 3038 (2013), J. Appl. Phys. 112, 023914 (2012)]. We show here that such multiferroic devices, apart from performing digital computation, can also be utilized for analog computing purposes, e.g., voltage amplification and filtering. The analog computing capability arises because the magnetization's mean orientation shifts gradually even though the nanomagnet's potential minima change abruptly. Using a tunneling magnetoresistance (TMR) measurement, a continuous output voltage can be produced while varying the input voltage. The stochastic Landau-Lifshitz-Gilbert (LLG) equation in the presence of room-temperature (300 K) thermal fluctuations is solved to demonstrate the analog computing capability of such multiferroic devices. This work was supported in part by FAME, one of six centers of STARnet, a Semiconductor Research Corporation program sponsored by MARCO and DARPA.
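To make the modeling approach concrete, the sketch below integrates a stochastic Landau-Lifshitz-Gilbert equation for a single macrospin in reduced units. It is a generic toy model, not the paper's device parameters; an explicit Euler step with renormalization is used, whereas a Heun scheme would be the more careful choice for the stochastic term.

```python
import numpy as np

def llg_rhs(m, h_eff, alpha=0.1, gamma=1.0):
    """Landau-Lifshitz form of the LLG right-hand side (reduced units)."""
    mxh = np.cross(m, h_eff)
    return -gamma / (1 + alpha**2) * (mxh + alpha * np.cross(m, mxh))

def run(steps=20000, dt=1e-3, alpha=0.1, h_applied=(0.0, 0.0, 0.2), temp_strength=0.05):
    rng = np.random.default_rng(0)
    m = np.array([1.0, 0.0, 0.0])       # initial magnetization direction
    traj = np.empty((steps, 3))
    for k in range(steps):
        # Thermal fluctuations enter as a random field with the usual sqrt(dt) scaling.
        h_th = temp_strength * rng.standard_normal(3) / np.sqrt(dt)
        m = m + dt * llg_rhs(m, np.asarray(h_applied) + h_th, alpha=alpha)
        m /= np.linalg.norm(m)           # keep |m| = 1
        traj[k] = m
    return traj

traj = run()
print("mean orientation over the last 5000 steps:", traj[-5000:].mean(axis=0))
```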
NASA Astrophysics Data System (ADS)
Ethier, Stephane; Lin, Zhihong
2001-10-01
Earlier this year, the National Energy Research Scientific Computing Center (NERSC) took delivery of the second most powerful computer in the world. With its 2,528 processors running at a peak performance of 1.5 GFlops each, this IBM SP machine has a theoretical peak performance of almost 3.8 TFlops. To efficiently harness such computing power in a single code is not an easy task and requires a good knowledge of the computer's architecture. Here we present the steps that we followed to improve our gyrokinetic micro-turbulence code GTC in order to take advantage of the new 16-way shared-memory nodes of the NERSC IBM SP. Performance results are shown, as well as details about the improved mixed-mode MPI-OpenMP model that we use. The enhancements to the code allowed us to tackle much bigger problem sizes, getting closer to our goal of simulating an ITER-size tokamak with both kinetic ions and electrons. (This work is supported by DOE Contract No. DE-AC02-76CH03073 (PPPL), and in part by the DOE Fusion SciDAC Project.)
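The sketch below illustrates only the distributed-memory (MPI) half of such a mixed-mode decomposition, using mpi4py: each rank owns a slice of particles and a global diagnostic is formed by reduction. In a real hybrid code like GTC, the shared-memory (OpenMP) threading would live inside the compiled kernels executed on each node; all sizes here are placeholders.

```python
from mpi4py import MPI
import numpy as np

# Toy sketch of the MPI layer of a hybrid decomposition, not the GTC code: each rank
# holds a local set of particle weights and a global sum is formed with a reduction.
comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n_local = 100000                              # particles per rank (placeholder)
rng = np.random.default_rng(rank)
weights = rng.standard_normal(n_local)

local_energy = np.array([np.sum(weights**2)])  # per-rank diagnostic
total_energy = np.zeros(1)
comm.Reduce(local_energy, total_energy, op=MPI.SUM, root=0)

if rank == 0:
    print(f"{size} ranks, global diagnostic = {total_energy[0]:.3e}")
```

Run with, for example, `mpirun -n 4 python sketch.py`; the OpenMP thread count of the node-local kernels would be controlled separately in a compiled hybrid code.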
Energy Innovation Hubs: A Home for Scientific Collaboration
Chu, Steven
2017-12-11
Secretary Chu will host a live, streaming Q&A session with the directors of the Energy Innovation Hubs on Tuesday, March 6, at 2:15 p.m. EST. The directors will be available for questions regarding their teams' work and the future of American energy. Ask your questions in the comments below, or submit them on Facebook, Twitter (@energy), or send an e-mail to newmedia@hq.doe.gov, prior to or during the live event. Dr. Hank Foley is the director of the Greater Philadelphia Innovation Cluster for Energy-Efficient Buildings, which is pioneering new data-intensive techniques for designing and operating energy-efficient buildings, including advanced computer modeling. Dr. Douglas Kothe is the director of the Consortium for Advanced Simulation of Light Water Reactors, which uses powerful supercomputers to create "virtual" reactors that will help improve the safety and performance of both existing and new nuclear reactors. Dr. Nathan Lewis is the director of the Joint Center for Artificial Photosynthesis, which focuses on how to produce fuels from sunlight, water, and carbon dioxide. The Energy Innovation Hubs are major integrated research centers, with researchers from many different institutions and technical backgrounds. Each hub is focused on a specific high-priority goal, rapidly accelerating scientific discoveries and shortening the path from laboratory innovation to technological development and commercial deployment of critical energy technologies.
76 FR 33322 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-08
... Scientific Review Special Emphasis Panel, Member Conflict: Cancer Therapeutics. Date: June 29, 2011. Time: 1... . Name of Committee: Center for Scientific Review Special Emphasis Panel, Small Business: Cancer Diagnostics and Treatments. Date: July 12-13, 2011. Time: 10 a.m. to 6 p.m. Agenda: To review and evaluate...
OMPC: an Open-Source MATLAB®-to-Python Compiler
Jurica, Peter; van Leeuwen, Cees
2008-01-01
Free access to scientific information facilitates scientific progress. Open-access scientific journals are a first step in this direction; a further step is to make auxiliary and supplementary materials that accompany scientific publications, such as methodological procedures and data-analysis tools, open and accessible to the scientific community. To this purpose it is instrumental to establish a software base, which will grow toward a comprehensive free and open-source language of technical and scientific computing. Endeavors in this direction are met with an important obstacle. MATLAB®, the predominant computation tool in many fields of research, is a closed-source commercial product. To facilitate the transition to an open computation platform, we propose Open-source MATLAB®-to-Python Compiler (OMPC), a platform that uses syntax adaptation and emulation to allow transparent import of existing MATLAB® functions into Python programs. The imported MATLAB® modules will run independently of MATLAB®, relying on Python's numerical and scientific libraries. Python offers a stable and mature open source platform that, in many respects, surpasses commonly used, expensive commercial closed source packages. The proposed software will therefore facilitate the transparent transition towards a free and general open-source lingua franca for scientific computation, while enabling access to the existing methods and algorithms of technical computing already available in MATLAB®. OMPC is available at http://ompc.juricap.com. PMID:19225577
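To show the kind of translation involved, the snippet below pairs a small MATLAB® fragment with a hand-written NumPy equivalent. It illustrates the semantic gap OMPC bridges (1-based versus 0-based indexing, backslash versus linalg.solve); it is not OMPC's actual generated output.

```python
import numpy as np

# MATLAB source (for comparison):
#   A = rand(4);
#   b = A \ ones(4, 1);
#   m = mean(A(:, 2:end), 2);
#
# A hand-written NumPy equivalent of the kind of code OMPC aims to let users keep
# running without MATLAB (not OMPC's generated output).
A = np.random.rand(4, 4)
b = np.linalg.solve(A, np.ones((4, 1)))   # MATLAB's backslash for a square system
m = A[:, 1:].mean(axis=1)                 # MATLAB is 1-based; NumPy is 0-based

print(b.ravel())
print(m)
```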
NASA Astrophysics Data System (ADS)
Mezzacappa, Anthony
2005-01-01
On 26-30 June 2005 at the Grand Hyatt on Union Square in San Francisco several hundred computational scientists from around the world came together for what can certainly be described as a celebration of computational science. Scientists from the SciDAC Program and scientists from other agencies and nations were joined by applied mathematicians and computer scientists to highlight the many successes in the past year where computation has led to scientific discovery in a variety of fields: lattice quantum chromodynamics, accelerator modeling, chemistry, biology, materials science, Earth and climate science, astrophysics, and combustion and fusion energy science. Also highlighted were the advances in numerical methods and computer science, and the multidisciplinary collaboration cutting across science, mathematics, and computer science that enabled these discoveries. The SciDAC Program was conceived and funded by the US Department of Energy Office of Science. It is the Office of Science's premier computational science program founded on what is arguably the perfect formula: the priority and focus is science and scientific discovery, with the understanding that the full arsenal of `enabling technologies' in applied mathematics and computer science must be brought to bear if we are to have any hope of attacking and ultimately solving today's computational Grand Challenge problems. The SciDAC Program has been in existence for four years, and many of the computational scientists funded by this program will tell you that the program has given them the hope of addressing their scientific problems in full realism for the very first time. Many of these scientists will also tell you that SciDAC has also fundamentally changed the way they do computational science. We begin this volume with one of DOE's great traditions, and core missions: energy research. As we will see, computation has been seminal to the critical advances that have been made in this arena. Of course, to understand our world, whether it is to understand its very nature or to understand it so as to control it for practical application, will require explorations on all of its scales. Computational science has been no less an important tool in this arena than it has been in the arena of energy research. From explorations of quantum chromodynamics, the fundamental theory that describes how quarks make up the protons and neutrons of which we are composed, to explorations of the complex biomolecules that are the building blocks of life, to explorations of some of the most violent phenomena in our universe and of the Universe itself, computation has provided not only significant insight, but often the only means by which we have been able to explore these complex, multicomponent systems and by which we have been able to achieve scientific discovery and understanding. While our ultimate target remains scientific discovery, it certainly can be said that at a fundamental level the world is mathematical. Equations ultimately govern the evolution of the systems of interest to us, be they physical, chemical, or biological systems. The development and choice of discretizations of these underlying equations is often a critical deciding factor in whether or not one is able to model such systems stably, faithfully, and practically, and in turn, the algorithms to solve the resultant discrete equations are the complementary, critical ingredient in the recipe to model the natural world. 
The use of parallel computing platforms, especially at the TeraScale, and the trend toward even larger numbers of processors, continue to present significant challenges in the development and implementation of these algorithms. Computational scientists often speak of their `workflows'. A workflow, as the name suggests, is the sum total of all complex and interlocking tasks, from simulation set up, execution, and I/O, to visualization and scientific discovery, through which the advancement in our understanding of the natural world is realized. For the computational scientist, enabling such workflows presents myriad, significant challenges, and it is computer scientists that are called upon at such times to address these challenges. Simulations are currently generating data at the staggering rate of tens of TeraBytes per simulation, over the course of days. In the next few years, these data generation rates are expected to climb exponentially to hundreds of TeraBytes per simulation, performed over the course of months. The output, management, movement, analysis, and visualization of these data will be our key to unlocking the scientific discoveries buried within the data. And there is no hope of generating such data to begin with, or of scientific discovery, without stable computing platforms and a sufficiently high and sustained performance of scientific applications codes on them. Thus, scientific discovery in the realm of computational science at the TeraScale and beyond will occur at the intersection of science, applied mathematics, and computer science. The SciDAC Program was constructed to mirror this reality, and the pages that follow are a testament to the efficacy of such an approach. We would like to acknowledge the individuals on whose talents and efforts the success of SciDAC 2005 was based. Special thanks go to Betsy Riley for her work on the SciDAC 2005 Web site and meeting agenda, for lining up our corporate sponsors, for coordinating all media communications, and for her efforts in processing the proceedings contributions, to Sherry Hempfling for coordinating the overall SciDAC 2005 meeting planning, for handling a significant share of its associated communications, and for coordinating with the ORNL Conference Center and Grand Hyatt, to Angela Harris for producing many of the documents and records on which our meeting planning was based and for her efforts in coordinating with ORNL Graphics Services, to Angie Beach of the ORNL Conference Center for her efforts in procurement and setting up and executing the contracts with the hotel, and to John Bui and John Smith for their superb wireless networking and A/V set up and support. We are grateful for the relentless efforts of all of these individuals, their remarkable talents, and for the joy of working with them during this past year. They were the cornerstones of SciDAC 2005. Thanks also go to Kymba A'Hearn and Patty Boyd for on-site registration, Brittany Hagen for administrative support, Bruce Johnston for netcast support, Tim Jones for help with the proceedings and Web site, Sherry Lamb for housing and registration, Cindy Lathum for Web site design, Carolyn Peters for on-site registration, and Dami Rich for graphic design. And we would like to express our appreciation to the Oak Ridge National Laboratory, especially Jeff Nichols, the Argonne National Laboratory, the Lawrence Berkeley National Laboratory, and to our corporate sponsors, Cray, IBM, Intel, and SGI, for their support.
We would like to extend special thanks also to our plenary speakers, technical speakers, poster presenters, and panelists for all of their efforts on behalf of SciDAC 2005 and for their remarkable achievements and contributions. We would like to express our deep appreciation to Lali Chatterjee, Graham Douglas and Margaret Smith of Institute of Physics Publishing, who worked tirelessly in order to provide us with this finished volume within two months, which is nothing short of miraculous. Finally, we wish to express our heartfelt thanks to Michael Strayer, SciDAC Director, whose vision it was to focus SciDAC 2005 on scientific discovery, around which all of the excitement we experienced revolved, and to our DOE SciDAC program managers, especially Fred Johnson, for their support, input, and help throughout.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Jeff
"Carbon in Underland" was submitted by the Center for Nanoscale Controls on Geologic CO2 (NCGC) to the "Life at the Frontiers of Energy Research" video contest at the 2011 Science for Our Nation's Energy Future: Energy Frontier Research Centers (EFRCs) Summit and Forum. Twenty-six EFRCs created short videos to highlight their mission and their work. This video was selected as one of five winners by a distinguished panel of judges for its "entertaining animation and engaging explanations of carbon sequestration". NCGC, an EFRC directed by Donald J. DePaolo at Lawrence Berkeley National Laboratory is a partnership of scientists from sevenmore » institutions: LBNL (lead) Massachusetts Institute of Technology, Lawrence Livermore National Laboratory, Oak Ridge National Laboratory, University of California, Davis, Ohio State University, and Washington University in St. Louis. The Office of Basic Energy Sciences in the U.S. Department of Energy's Office of Science established the 46 Energy Frontier Research Centers (EFRCs) in 2009. These collaboratively-organized centers conduct fundamental research focused on 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The overall purpose is to accelerate scientific progress toward meeting the nation's critical energy challenges. The mission of the Center for Nanoscale Control of Geologic CO2 is 'to use new investigative tools, combined with experiments and computer simulations, to build a fundamental understanding of molecular-to-pore-scale processes in fluid-rock systems, and to demonstrate the ability to control critical aspects of flow, transport, and mineralization in porous rock media as applied to geologic sequestration of CO2. Research topics are: bio-inspired, CO2 (store), greenhouse gas, and interfacial characterization.« less
Advances in the Remote Monitoring of Balloon Flights
NASA Astrophysics Data System (ADS)
Breeding, S.
At the National Scientific Balloon Facility (NSBF), we must staff the Long Duration Balloon (LDB) control center 24 hours a day during LDB flights. This requires three daily shifts of two operators (balloon control and TDRSS scheduling). In addition to this, we also have one engineer on call as LDB Lead to resolve technical issues and one manager on call for flight management. These on-call periods are typically 48 to 72 hours in length. In the past, the on-call staff had to travel to the LDB control center in order to monitor the status of a flight in any detail. This becomes problematic as flight durations push out beyond 20 to 30 days, as these staff members are not available for business travel during these periods. This paper describes recent advances which allow for the remote monitoring of scientific balloon flight ground-station computer displays. This allows balloon flight managers and lead engineers to check flight status and performance from any location with a network or telephone connection. This capability frees key personnel from the NSBF base during flights. It also allows other interested parties to check on the flight status at their convenience.
78 FR 107 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-02
... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel Biophysics and...-1747, [email protected] . Name of Committee: Biological Chemistry and Macromolecular Biophysics...
77 FR 77080 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-31
... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel; Biophysics and...-1747, [email protected] . Name of Committee: Biological Chemistry and Macromolecular Biophysics...
76 FR 70463 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-14
... commercial property such as patentable material, and personal information concerning individuals associated... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel; NANO Special...
75 FR 27351 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-14
... Committee: Biology of Development and Aging Integrated Review Group; Development--2 Study Section. Date...: Center for Scientific Review Special Emphasis Panel; ARRA: Developmental Brain Disorders Competitive...
77 FR 37424 - Center For Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-21
... Special Emphasis Panel; Small Business Biological Chemistry, Biophyscis and Drug Discovery. Date: July 16...: Center for Scientific Review Special Emphasis Panel; Fellowship: Chemistry, Biochemistry, Biophysics, and...
77 FR 64814 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-23
... Panel; Member Conflict: Neurological Disorders, Brain Tumors, and Eye Development and Diseases. Date... Committee: Center for Scientific Review Special Emphasis Panel; Member Conflict: Cell Biology. Date...
ERIC Educational Resources Information Center
Adams, Stephen T.
2004-01-01
Although one role of computers in science education is to help students learn specific science concepts, computers are especially intriguing as a vehicle for fostering the development of epistemological knowledge about the nature of scientific knowledge--what it means to "know" in a scientific sense (diSessa, 1985). In this vein, the…
77 FR 2548 - Board of Scientific Counselors, National Center for Health Statistics
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-18
... Scientific Counselors, National Center for Health Statistics In accordance with section 10(a)(2) of the...), National Center for Health Statistics (NCHS) announces the following meeting of the aforementioned...; review of the ambulatory and hospital care statistics program; a discussion of the NHANES genetics...
Computational Fluid Dynamics Program at NASA Ames Research Center
NASA Technical Reports Server (NTRS)
Holst, Terry L.
1989-01-01
The Computational Fluid Dynamics (CFD) Program at NASA Ames Research Center is reviewed and discussed. The technical elements of the CFD Program are listed and briefly discussed. These elements include algorithm research, research and pilot code development, scientific visualization, advanced surface representation, volume grid generation, and numerical optimization. Next, the discipline of CFD is briefly discussed and related to other areas of research at NASA Ames, including experimental fluid dynamics, computer science research, computational chemistry, and numerical aerodynamic simulation. These areas combine with CFD to form a larger area of research, which might collectively be called computational technology. The ultimate goal of computational technology research at NASA Ames is to increase the physical understanding of the world in which we live, solve problems of national importance, and increase the technical capabilities of the aerospace community. Next, the major programs at NASA Ames that either use CFD technology or perform research in CFD are listed and discussed. Briefly, this list includes turbulent/transition physics and modeling, high-speed real gas flows, interdisciplinary research, turbomachinery demonstration computations, complete aircraft aerodynamics, rotorcraft applications, powered-lift flows, high-alpha flows, multiple-body aerodynamics, and incompressible flow applications. Some of the individual problems actively being worked on in each of these areas are listed to help define the breadth or extent of CFD involvement in each of these major programs. State-of-the-art examples of various CFD applications are presented to highlight most of these areas. The main emphasis of this portion of the presentation is on examples which will not otherwise be treated at this conference by the individual presentations. Finally, a list of principal current limitations and expected future directions is given.
Final Report for DOE Award ER25756
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kesselman, Carl
2014-11-17
The SciDAC-funded Center for Enabling Distributed Petascale Science (CEDPS) was established to address technical challenges that arise due to the frequent geographic distribution of data producers (in particular, supercomputers and scientific instruments) and data consumers (people and computers) within the DOE laboratory system. Its goal is to produce technical innovations that meet DOE end-user needs for (a) rapid and dependable placement of large quantities of data within a distributed high-performance environment, and (b) the convenient construction of scalable science services that provide for the reliable and high-performance processing of computation and data analysis requests from many remote clients. The Center is also addressing (c) the important problem of troubleshooting these and other related ultra-high-performance distributed activities from the perspective of both performance and functionality.
Opportunities and choice in a new vector era
NASA Astrophysics Data System (ADS)
Nowak, A.
2014-06-01
This work discusses the significant changes in the computing landscape related to the progression of Moore's Law, and the implications for scientific computing. Particular attention is devoted to the High Energy Physics (HEP) domain, which has always made good use of threading, but levels of parallelism closer to the hardware were often left underutilized. Findings of the CERN openlab Platform Competence Center are reported in the context of expanding "performance dimensions", and especially the resurgence of vectors. These suggest that data-oriented designs are feasible in HEP and have considerable potential for performance improvements on multiple levels, but will rarely trump algorithmic enhancements. Finally, an analysis of upcoming hardware and software technologies identifies heterogeneity as a major challenge for software, which will require more emphasis on scalable, efficient design.
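A small NumPy illustration of the data-oriented point follows: a structure-of-arrays layout exposes the computation as contiguous vector operations, while an array-of-structures layout forces per-object scalar work. The example is generic and is not CERN openlab code.

```python
import numpy as np

class Track:                       # AoS: one Python object per track
    def __init__(self, px, py, pz):
        self.px, self.py, self.pz = px, py, pz

n = 200000
rng = np.random.default_rng(1)
px, py, pz = rng.standard_normal((3, n))

# Array-of-structures: per-object, scalar work in the interpreter.
tracks = [Track(*t) for t in zip(px, py, pz)]
pt_aos = [(t.px**2 + t.py**2) ** 0.5 for t in tracks]

# Structure-of-arrays: one contiguous array per field, SIMD-friendly vector math.
soa = {"px": px, "py": py, "pz": pz}
pt_soa = np.sqrt(soa["px"]**2 + soa["py"]**2)

print(np.allclose(pt_aos, pt_soa))
```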
A Look at the Impact of High-End Computing Technologies on NASA Missions
NASA Technical Reports Server (NTRS)
Biswas, Rupak; Dunbar, Jill; Hardman, John; Bailey, F. Ron; Wheeler, Lorien; Rogers, Stuart
2012-01-01
From its bold start nearly 30 years ago and continuing today, the NASA Advanced Supercomputing (NAS) facility at Ames Research Center has enabled remarkable breakthroughs in the space agency's science and engineering missions. Throughout this time, NAS experts have influenced the state of the art in high-performance computing (HPC) and related technologies such as scientific visualization, system benchmarking, batch scheduling, and grid environments. We highlight the pioneering achievements and innovations originating from and made possible by NAS resources and know-how, from early supercomputing environment design and software development, to long-term simulation and analyses critical to designing safe Space Shuttle operations and associated spinoff technologies, to the highly successful Kepler Mission's discovery of new planets now capturing the world's imagination.
Brocher, Thomas M.; Carr, Michael D.; Halsing, David L.; John, David A.; Langenheim, V.E.; Mangan, Margaret T.; Marvin-DiPasquale, Mark C.; Takekawa, John Y.; Tiedeman, Claire
2006-01-01
In the spring of 2004, the U.S. Geological Survey (USGS) Menlo Park Center Council commissioned an interdisciplinary working group to develop a forward-looking science strategy for the USGS Menlo Park Science Center in California (hereafter also referred to as "the Center"). The Center has been the flagship research center for the USGS in the western United States for more than 50 years, and the Council recognizes that science priorities must be the primary consideration guiding critical decisions made about the future evolution of the Center. In developing this strategy, the working group consulted widely within the USGS and with external clients and collaborators, so that most stakeholders had an opportunity to influence the science goals and operational objectives. The Science Goals are to: Natural Hazards: Conduct natural-hazard research and assessments critical to effective mitigation planning, short-term forecasting, and event response. Ecosystem Change: Develop a predictive understanding of ecosystem change that advances ecosystem restoration and adaptive management. Natural Resources: Advance the understanding of natural resources in a geologic, hydrologic, economic, environmental, and global context. Modeling Earth System Processes: Increase and improve capabilities for quantitative simulation, prediction, and assessment of Earth system processes. The strategy presents seven key Operational Objectives with specific actions to achieve the scientific goals. These Operational Objectives are to: Provide a hub for technology, laboratories, and library services to support science in the Western Region. Increase advanced computing capabilities and promote sharing of these resources. Enhance the intellectual diversity, vibrancy, and capacity of the work force through improved recruitment and retention. Strengthen client and collaborative relationships in the community at an institutional level. Expand monitoring capability by increasing density, sensitivity, and efficiency and reducing costs of instruments and networks. Encourage a breadth of scientific capabilities in Menlo Park to foster interdisciplinary science. Communicate USGS science to a diverse audience.
77 FR 511 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-05
... Emphasis Panel, PAR-11-228: Shared Instrumentation: Cell Biology, Physiology and Robotics. Date: February 1...: Center for Scientific Review Special Emphasis Panel, Multidisciplinary Healthcare Delivery Research AREA...
77 FR 30540 - Center for Scientific Review Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-23
... Committee: Center for Scientific Review Special Emphasis Panel; Drug Discovery for the Nervous System...: Digestive, Kidney and Urological Systems Integrated Review Group; Kidney Molecular Biology and Genitourinary...
76 FR 6486 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-04
... Review Special Emphasis Panel; Member Conflict: Memory, Pain and Auditory Neuroscience. Date: March 8-9...: Center for Scientific Review Special Emphasis Panel; Member Conflict: Learning, Alcohol and...
76 FR 56771 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-14
... Committee: Center for Scientific Review Special Emphasis Panel, Molecular Neuroscience. Date: October 6....gov . (Catalogue of Federal Domestic Assistance Program Nos. 93.306, Comparative Medicine; 93.333...
78 FR 35292 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-12
... Committee: Center for Scientific Review Special Emphasis Panel; Member Conflict: Child and Adolescent....gov . (Catalogue of Federal Domestic Assistance Program Nos. 93.306, Comparative Medicine; 93.333...
76 FR 10382 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-24
... Biomedicine and Agriculture: Infectious Diseases, Immunology and the Circulatory System. Date: March 21, 2011... Committee: Center for Scientific Review Special Emphasis Panel; Bioengineering Special Topics. Date: March...
NASA Astrophysics Data System (ADS)
Chulaki, A.; Kuznetsova, M. M.; Rastaetter, L.; MacNeice, P. J.; Shim, J. S.; Pulkkinen, A. A.; Taktakishvili, A.; Mays, M. L.; Mendoza, A. M. M.; Zheng, Y.; Mullinix, R.; Collado-Vega, Y. M.; Maddox, M. M.; Pembroke, A. D.; Wiegand, C.
2015-12-01
The Community Coordinated Modeling Center (CCMC) is a NASA-affiliated interagency partnership with the primary goal of aiding the transition of modern space science models into space weather forecasting while supporting space science research. Additionally, over the past ten years it has established itself as a global space science education resource supporting undergraduate and graduate education and research, and spreading space weather awareness worldwide. A unique combination of assets, capabilities, and close ties to the scientific and educational communities enables this small group to serve as a hub for raising generations of young space scientists and engineers. CCMC resources are publicly available online, providing unprecedented global access to the largest collection of modern space science models (developed by the international research community). CCMC has revolutionized the way simulations are utilized in classroom settings, student projects, and scientific labs, and serves hundreds of educators, students, and researchers every year. Another major CCMC asset is an expert space weather prototyping team primarily serving NASA's interplanetary space weather needs. Capitalizing on its unrivaled capabilities and experience, the team provides in-depth space weather training to students and professionals worldwide, and offers an opportunity for undergraduates to engage in real-time space weather monitoring, analysis, forecasting, and research. In-house development of state-of-the-art space weather tools and applications provides exciting opportunities for students majoring in computer science and computer engineering to intern with the software engineers at the CCMC while also learning about space weather from NASA scientists.
Center for Technology for Advanced Scientific Component Software (TASCS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Govindaraju, Madhusudhan
Advanced Scientific Computing Research Computer Science FY 2010 Report Center for Technology for Advanced Scientific Component Software: Distributed CCA State University of New York, Binghamton, NY, 13902 Summary The overall objective of Binghamton's involvement is to work on enhancements of the CCA environment, motivated by the applications and research initiatives discussed in the proposal. This year we are working on re-focusing our design and development efforts to develop proof-of-concept implementations that have the potential to significantly impact scientific components. We worked on developing parallel implementations for non-hydrostatic code and worked on a model coupling interface for biogeochemical computations coded in MATLAB. We also worked on the design and implementation of modules that will be required for the emerging MapReduce model to be effective for scientific applications. Finally, we focused on optimizing the processing of scientific datasets on multi-core processors. Research Details We worked on the following research projects that we are working on applying to CCA-based scientific applications. 1. Non-Hydrostatic Hydrodynamics: Non-hydrostatic hydrodynamics is significantly more accurate at modeling internal waves that may be important in lake ecosystems. Non-hydrostatic codes, however, are significantly more computationally expensive, often prohibitively so. We have worked with Chin Wu at the University of Wisconsin to parallelize non-hydrostatic code. We have obtained a maximum speedup of about 26 times. Although this is significant progress, we hope to improve the performance further, such that it becomes a practical alternative to hydrostatic codes. 2. Model coupling for water-based ecosystems: To answer pressing questions about water resources requires that physical models (hydrodynamics) be coupled with biological and chemical models. Most hydrodynamics codes are written in Fortran, however, while most ecologists work in MATLAB. This disconnect creates a great barrier. To address this, we are working on a model coupling interface that will allow biogeochemical computations written in MATLAB to couple with Fortran codes. This will greatly improve the productivity of ecosystem scientists. 3. Low-Overhead and Elastic MapReduce Implementation Optimized for Memory- and CPU-Intensive Applications: Since its inception, MapReduce has frequently been associated with Hadoop and large-scale datasets. Its deployment at Amazon in the cloud, and its applications at Yahoo! for large-scale distributed document indexing and database building, among other tasks, have thrust MapReduce to the forefront of the data processing application domain. The applicability of the paradigm, however, extends far beyond its use with data-intensive applications and disk-based systems, and can also be brought to bear in processing small but CPU-intensive distributed applications. MapReduce, however, carries its own burdens. Through experiments using Hadoop in the context of diverse applications, we uncovered latencies and delay conditions potentially inhibiting the expected performance of a parallel execution in CPU-intensive applications. Furthermore, as it currently stands, MapReduce is favored for data-centric applications, and as such tends to be solely applied to disk-based applications. The paradigm falls short in bringing its novelty to diskless systems dedicated to in-memory applications, and to compute-intensive programs processing much smaller data but requiring intensive computations.
In this project, we focused both on the performance of processing large-scale hierarchical data in distributed scientific applications and on the processing of smaller but demanding input sizes primarily used in diskless, memory-resident I/O systems. We designed LEMO-MR [1], a low-overhead, elastic, configurable (for in-memory applications), on-demand fault-tolerant, and optimized implementation of MapReduce for both on-disk and in-memory applications. We conducted experiments to identify not only the necessary components of this model, but also trade-offs and factors to be considered. We have initial results that show the efficacy of our implementation in terms of the potential speedup that can be achieved for representative data sets used by cloud applications. We have quantified the performance gains exhibited by our MapReduce implementation over Apache Hadoop in a compute-intensive environment. 4. Cache Performance Optimization for Processing XML- and HDF-based Application Data on Multi-core Processors: It is important to design and develop scientific middleware libraries to harness the opportunities presented by emerging multi-core processors. Implementations of scientific middleware and applications that do not adapt to the programming paradigm when executing on emerging processors can severely impact the overall performance. In this project, we focused on the utilization of the L2 cache, which is a critical shared resource on chip multiprocessors (CMP). The access pattern of the shared L2 cache, which is dependent on how the application schedules and assigns processing work to each thread, can either enhance or hurt the ability to hide memory latency on a multi-core processor. Therefore, while processing scientific datasets such as HDF5, it is essential to conduct fine-grained analysis of cache utilization to inform scheduling decisions in multi-threaded programming. In this project, using the TAU toolkit for performance feedback from dual- and quad-core machines, we conducted performance analysis and made recommendations on how processing threads can be scheduled on multi-core nodes to enhance the performance of a class of scientific applications that requires processing of HDF5 data. In particular, we quantified the gains associated with the use of the adaptations we have made to the Cache-Affinity and Balanced-Set scheduling algorithms to improve L2 cache performance, and hence the overall application execution time [2]. References: 1. Zacharia Fadika, Madhusudhan Govindaraju, ``MapReduce Implementation for Memory-Based and Processing Intensive Applications'', accepted in 2nd IEEE International Conference on Cloud Computing Technology and Science, Indianapolis, USA, Nov 30 - Dec 3, 2010. 2. Rajdeep Bhowmik, Madhusudhan Govindaraju, ``Cache Performance Optimization for Processing XML-based Application Data on Multi-core Processors'', in proceedings of The 10th IEEE/ACM International Symposium on Cluster, Cloud and Grid Computing, May 17-20, 2010, Melbourne, Victoria, Australia. Contact Information: Madhusudhan Govindaraju, Binghamton University, State University of New York (SUNY), mgovinda@cs.binghamton.edu, Phone: 607-777-4904
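For readers unfamiliar with the programming model discussed in item 3 above, the following is a minimal in-memory MapReduce word count in Python. It conveys the map/shuffle/reduce structure that memory-resident workloads like those targeted by LEMO-MR rely on, but it is not the LEMO-MR implementation.

```python
from collections import defaultdict
from multiprocessing import Pool

def map_phase(chunk):
    """Map task: emit partial word counts for one in-memory document."""
    counts = defaultdict(int)
    for word in chunk.split():
        counts[word] += 1
    return counts

def reduce_phase(partials):
    """Reduce task: merge the partial counts from all map tasks."""
    totals = defaultdict(int)
    for partial in partials:
        for word, count in partial.items():
            totals[word] += count
    return dict(totals)

if __name__ == "__main__":
    documents = ["map reduce in memory", "reduce overhead in memory systems",
                 "map tasks then reduce tasks"]
    with Pool(3) as pool:
        partials = pool.map(map_phase, documents)   # map tasks run in worker processes
    print(reduce_phase(partials))                   # a single reducer combines results
```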
75 FR 1793 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-13
...; Social Psychology, Personality and Interpersonal Processes Study Section. Date: February 4-5, 2010. Time..., MD 20892 (Virtual Meeting). Contact: Bob Weller, PhD, Scientific Review Officer, Center for...
75 FR 3912 - Center for Scientific Review; Notice of Closed Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-25
... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel; Molecular... Federal Domestic Assistance Program Nos. 93.306, Comparative Medicine; 93.333, Clinical Research, 93.306...
77 FR 26022 - Center for Scientific Review; Notice of Closed Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-02
... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel; Molecular and... of Federal Domestic Assistance Program Nos. 93.306, Comparative Medicine; 93.333, Clinical Research...
77 FR 20832 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-06
...: Center for Scientific Review Special Emphasis Panel; Molecular Genetics Program Projects. Date: May 1... . (Catalogue of Federal Domestic Assistance Program Nos. 93.306, Comparative Medicine; 93.333, Clinical...
78 FR 66371 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-05
....gov . Name of Committee: Center for Scientific Review Special Emphasis Panel; Member Conflicts: Child... . (Catalogue of Federal Domestic Assistance Program Nos. 93.306, Comparative Medicine; 93.333, Clinical...
77 FR 61009 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-05
... Committee: Center for Scientific Review Special Emphasis Panel; Program Project: Prenatal Stress and Child..., [email protected] . (Catalogue of Federal Domestic Assistance Program Nos. 93.306, Comparative Medicine...
78 FR 54665 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-05
...: Center for Scientific Review Special Emphasis Panel; Basic Biology of Neurological Disorders. Date..., Bethesda, MD 20892, 301-435- 1242, [email protected] . Name of Committee: Biological Chemistry and...
77 FR 14028 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-08
....gov . Name of Committee: Center for Scientific Review Special Emphasis Panel; Biological Chemistry and...-1323, [email protected] . Name of Committee: Biology of Development and Aging Integrated Review...
76 FR 60507 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-29
... Emphasis Panel, Small Business: Diabetes, Obesity and Reproductive Sciences. Date: October 25-26, 2011... . Name of Committee: Center for Scientific Review Special Emphasis Panel, Fellowship: Genes, Genomes, and...
Using the High-Level Based Program Interface to Facilitate the Large Scale Scientific Computing
Shang, Yizi; Shang, Ling; Gao, Chuanchang; Lu, Guiming; Ye, Yuntao; Jia, Dongdong
2014-01-01
This paper presents further research on facilitating large-scale scientific computing on grid and desktop grid platforms. The related issues include the programming method, the overhead of middleware based on a high-level program interface, and anticipatory data migration. The block-based Gauss-Jordan algorithm, a real example of large-scale scientific computing, is used to evaluate the issues presented above. The results show that the high-level program interface makes complex scientific applications on large-scale platforms easier to develop, though a small overhead is unavoidable. The anticipatory data migration mechanism can also improve the efficiency of platforms that need to process big-data-based scientific applications. PMID:24574931
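For readers unfamiliar with the benchmark, the following is a plain serial Gauss-Jordan inversion sketch in Python/NumPy. The paper's block-based, grid-distributed variant partitions the matrix into sub-blocks and distributes them across workers; this sketch only shows the underlying elimination step and uses a small hypothetical matrix.

# Serial Gauss-Jordan matrix inversion sketch (not the block-based, distributed variant).
import numpy as np

def gauss_jordan_inverse(a):
    n = a.shape[0]
    aug = np.hstack([a.astype(float), np.eye(n)])          # augmented [A | I]
    for col in range(n):
        pivot = int(np.argmax(np.abs(aug[col:, col]))) + col  # partial pivoting
        aug[[col, pivot]] = aug[[pivot, col]]
        aug[col] /= aug[col, col]                           # normalize pivot row
        for row in range(n):
            if row != col:
                aug[row] -= aug[row, col] * aug[col]        # eliminate column
    return aug[:, n:]

a = np.array([[4.0, 2.0], [1.0, 3.0]])
print(np.allclose(gauss_jordan_inverse(a) @ a, np.eye(2)))  # True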
X-Ray Computed Tomography Monitors Damage in Composites
NASA Technical Reports Server (NTRS)
Baaklini, George Y.
1997-01-01
The NASA Lewis Research Center recently codeveloped a state-of-the-art x-ray CT facility (designated SMS SMARTSCAN model 100-112 CITA by Scientific Measurement Systems, Inc., Austin, Texas). This multipurpose, modularized, digital x-ray facility includes an imaging system for digital radiography, CT, and computed laminography. The system consists of a 160-kV microfocus x-ray source, a solid-state charge-coupled device (CCD) area detector, a five-axis object-positioning subassembly, and a Sun SPARCstation-based computer system that controls data acquisition and image processing. The x-ray source provides a beam spot size down to 3 microns. The area detector system consists of a 50- by 50- by 3-mm-thick terbium-doped glass fiber-optic scintillation screen, a right-angle mirror, and a scientific-grade, digital CCD camera with a resolution of 1000 by 1018 pixels and 10-bit digitization at ambient cooling. The digital output is recorded with a high-speed, 16-bit frame grabber that allows data to be binned. The detector can be configured to provide a small field-of-view, approximately 45 by 45 mm in cross section, or a larger field-of-view, approximately 60 by 60 mm in cross section. Whenever the highest spatial resolution is desired, the small field-of-view is used, and for larger samples with some reduction in spatial resolution, the larger field-of-view is used.
Defining Computational Thinking for Mathematics and Science Classrooms
ERIC Educational Resources Information Center
Weintrop, David; Beheshti, Elham; Horn, Michael; Orton, Kai; Jona, Kemi; Trouille, Laura; Wilensky, Uri
2016-01-01
Science and mathematics are becoming computational endeavors. This fact is reflected in the recently released Next Generation Science Standards and the decision to include "computational thinking" as a core scientific practice. With this addition, and the increased presence of computation in mathematics and scientific contexts, a new…
The JINR Tier1 Site Simulation for Research and Development Purposes
NASA Astrophysics Data System (ADS)
Korenkov, V.; Nechaevskiy, A.; Ososkov, G.; Pryahina, D.; Trofimov, V.; Uzhinskiy, A.; Voytishin, N.
2016-02-01
Distributed complex computing systems for data storage and processing are in common use in the majority of modern scientific centers. The design of such systems is usually based on recommendations obtained from a preliminary simulation model that is used and executed only once. However, big experiments last for years and decades, and their computing systems keep developing, not only quantitatively but also qualitatively. Even with the substantial effort invested in the design phase to understand a system's configuration, it would be hard to develop the system further without additional research into its future evolution. Developers and operators face the problem of predicting system behaviour after planned modifications. A system for simulating grid and cloud services is being developed at LIT (JINR, Dubna). This simulation system is focused on improving the efficiency of grid/cloud structure development by using the work quality indicators of a real system. The development of such software is very important for building new grid/cloud infrastructures for big scientific experiments such as the JINR Tier1 site for WLCG. The simulation of some processes of the Tier1 site is considered as an example of our application approach.
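A toy discrete-event sketch, in plain Python with no external packages, of the kind of question such a simulation answers: how long do jobs wait when a site has a fixed number of worker slots? The arrival rate, runtime, and slot count are invented numbers, and this is not the LIT/JINR simulation software.

# Toy discrete-event sketch of a grid site: jobs queue for a fixed pool of
# worker slots; mean wait time serves as a simple "work quality indicator".
import heapq, random

def simulate(n_jobs=10000, slots=100, arrival_rate=9.0, mean_runtime=10.0, seed=1):
    random.seed(seed)
    free_at = [0.0] * slots                     # time each slot becomes free
    heapq.heapify(free_at)
    t, total_wait = 0.0, 0.0
    for _ in range(n_jobs):
        t += random.expovariate(arrival_rate)   # next job arrival
        slot_free = heapq.heappop(free_at)
        start = max(t, slot_free)               # job waits if all slots are busy
        total_wait += start - t
        heapq.heappush(free_at, start + random.expovariate(1.0 / mean_runtime))
    return total_wait / n_jobs

print(f"mean queue wait: {simulate():.2f} time units")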
NASA Technical Reports Server (NTRS)
Talbot, Bryan; Zhou, Shu-Jia; Higgins, Glenn; Zukor, Dorothy (Technical Monitor)
2002-01-01
One of the most significant challenges in large-scale climate modeling, as well as in high-performance computing in other scientific fields, is that of effectively integrating many software models from multiple contributors. A software framework facilitates the integration task, both in the development and runtime stages of the simulation. Effective software frameworks reduce the programming burden for the investigators, freeing them to focus more on the science and less on the parallel communication implementation, while maintaining high performance across numerous supercomputer and workstation architectures. This document surveys numerous software frameworks for potential use in Earth science modeling. Several frameworks are evaluated in depth, including Parallel Object-Oriented Methods and Applications (POOMA), Cactus (from the relativistic physics community), Overture, the Goddard Earth Modeling System (GEMS), the National Center for Atmospheric Research Flux Coupler, and the UCLA/UCB Distributed Data Broker (DDB). Frameworks evaluated in less detail include ROOT, Parallel Application Workspace (PAWS), and the Advanced Large-Scale Integrated Computational Environment (ALICE). A host of other frameworks and related tools are referenced in this context. The frameworks are evaluated individually and also compared with each other.
Catalytic N 2 Reduction to Silylamines and Thermodynamics of N 2 Binding at Square Planar Fe
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prokopchuk, Demyan E.; Wiedner, Eric S.; Walter, Eric D.
The geometric constraints imposed by a tetradentate P4N2 ligand play an essential role in stabilizing square planar Fe complexes with changes in metal oxidation state. A combination of high-pressure electrochemistry and variable-temperature UV-vis spectroscopy was used to obtain these thermodynamic measurements, while X-ray crystallography, 57Fe Mössbauer spectroscopy, and EPR spectroscopy were used to fully characterize these new compounds. Analysis of Fe(0), Fe(I), and Fe(II) complexes reveals that the free energy of N2 binding across three oxidation states spans more than 37 kcal/mol. The square pyramidal Fe0(N2)(P4N2) complex catalyzes the conversion of N2 to N(SiR3)3 (R = Me, Et) at room temperature, representing the highest turnover number (TON) of any Fe-based N2 silylation catalyst to date (up to 65 equiv N(SiMe3)3 per Fe center). Elevated N2 pressures (> 1 atm) have a dramatic effect on catalysis, increasing N2 solubility and the thermodynamic N2 binding affinity at Fe0(N2)(P4N2). Acknowledgment. This research was supported as part of the Center for Molecular Electrocatalysis, an Energy Frontier Research Center funded by the U.S. Department of Energy (DOE), Office of Science, Office of Basic Energy Sciences. EPR experiments were performed using EMSL, a national scientific user facility sponsored by the DOE's Office of Biological and Environmental Research and located at Pacific Northwest National Laboratory (PNNL). PNNL is operated by Battelle for the U.S. DOE. Computational resources were provided by the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory. The authors thank Prof. Yisong Alex Guo at Carnegie Mellon University for recording Mössbauer data for some complexes and Emma Wellington and Kaye Kuphal for their assistance with the collection of Mössbauer data at Colgate University, Dr. Katarzyna Grubel for X-ray assistance, and Dr. Rosalie Chu for mass spectrometry assistance. The authors also thank Dr. Aaron Appel and Dr. Alex Kendall for helpful discussions.
ERIC Educational Resources Information Center
Halbauer, Siegfried
1976-01-01
It was considered that students of intensive scientific Russian courses could learn vocabulary more efficiently if they were taught word stems and how to combine them with prefixes and suffixes to form scientific words. The computer programs developed to identify the most important stems are discussed. (Text is in German.) (FB)
Bethel, E. Wes [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Computational Research Division and Scientific Visualization Group
2018-05-07
Summer Lecture Series 2008: Scientific visualization transforms abstract data into readily comprehensible images, provides a vehicle for "seeing the unseeable," and plays a central role in both experimental and computational sciences. Wes Bethel, who heads the Scientific Visualization Group in the Computational Research Division, presents an overview of visualization and computer graphics, current research challenges, and future directions for the field.
75 FR 6674 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-10
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review... Scientific Review Special Emphasis Panel, Microvascular Interactions. Date: March 3, 2010. Time: 3 p.m. to 5... Social Consequences of HIV/AIDS Study Section. Date: March 15-16, 2010. Time: 8 a.m. to 5 p.m. Agenda: To...
Data-driven Ontology Development: A Case Study at NASA's Atmospheric Science Data Center
NASA Astrophysics Data System (ADS)
Hertz, J.; Huffer, E.; Kusterer, J.
2012-12-01
Well-founded ontologies are key to enabling transformative semantic technologies and accelerating scientific research. One example is semantically enabled search and discovery, making scientific data accessible and more understandable by accurately modeling a complex domain. The ontology creation process remains a challenge for many anxious to pursue semantic technologies. The key may be that the creation process -- whether formal, community-based, automated or semi-automated -- should encompass not only a foundational core and supplemental resources but also a focus on the purpose or mission the ontology is created to support. Are there tools or processes to de-mystify, assess or enhance the resulting ontology? We suggest that comparison and analysis of a domain-focused ontology can be made using text engineering tools for information extraction, tokenizers, named entity transducers and others. The results are analyzed to ensure the ontology reflects the core purpose of the domain's mission and that the ontology integrates and describes the supporting data in the language of the domain - how the science is analyzed and discussed among all users of the data. Commonalities and relationships among domain resources describing the Clouds and Earth's Radiant Energy (CERES) Bi-Directional Scan (BDS) datasets from NASA's Atmospheric Science Data Center are compared. The domain resources include: a formal ontology created for CERES; scientific works such as papers, conference proceedings and notes; information extracted from the datasets (i.e., header metadata); and BDS scientific documentation (Algorithm Theoretical Basis Documents, collection guides, data quality summaries and others). These resources are analyzed using the open source software General Architecture for Text Engineering, a mature framework for computational tasks involving human language.
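A rough sketch of the comparison idea in Python: extract frequent candidate terms from a block of documentation text and check them against ontology class labels. The tokenizer, stop-word list, and labels below are hypothetical, and the project itself uses the GATE framework rather than code like this.

# Illustrative term-extraction sketch; not the GATE-based pipeline used in the project.
import re
from collections import Counter

ONTOLOGY_LABELS = {"radiance", "detector", "scan", "footprint", "calibration"}  # hypothetical labels
STOPWORDS = {"the", "of", "and", "a", "to", "in", "is", "are", "for"}

def candidate_terms(text, top_n=10):
    tokens = re.findall(r"[a-z]+", text.lower())           # naive tokenizer
    counts = Counter(t for t in tokens if t not in STOPWORDS and len(t) > 3)
    return [term for term, _ in counts.most_common(top_n)]

doc = "The BDS detector records filtered radiance for each scan footprint after calibration."
terms = candidate_terms(doc)
print("covered by ontology:", [t for t in terms if t in ONTOLOGY_LABELS])
print("possible gaps:", [t for t in terms if t not in ONTOLOGY_LABELS])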
78 FR 7790 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-04
... . Name of Committee: Center for Scientific Review Special Emphasis Panel; Rare Diseases. Date: February... Genetics Integrated Review Group; Genetics of Health and Disease Study Section. Date: February 25-26, 2013...
77 FR 24967 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-26
... Special Emphasis Panel PAR12-010 Smoking and Tobacco Revision Applications: Social Sciences and Population... . Name of Committee: Center for Scientific Review Special Emphasis Panel Social Sciences and Population...
78 FR 60293 - Center for Scientific Review; Notice of Closed Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-01
... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel; Stem Cell... Federal Domestic Assistance Program Nos. 93.306, Comparative Medicine; 93.333, Clinical Research, 93.306...
76 FR 64090 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-17
... Special Emphasis Panel, Oxidative Stress, Aging, and Transmitters. Date: November 9-10, 2011. Time: 8 a.m...: Center for Scientific Review Special Emphasis Panel, Fellowships: Physiology and Pathobiology of...
75 FR 4576 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-28
...: Center for Scientific Review Special Emphasis Panel, Academic-Industry Partnership in Cancer Imaging.... Place: Ritz Carlton Hotel, 1150 22nd Street, NW., Washington, DC 20037. Contact Person: Fouad A. El...
77 FR 67385 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-09
...: Center for Scientific Review Special Emphasis Panel Molecular Mechanism of Neurodegeneration. Date... Assistance Program Nos. 93.306, Comparative Medicine; 93.333, Clinical Research, 93.306, 93.333, 93.337, 93...
76 FR 24900 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-03
... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel; Brain Function... Special Emphasis Panel; Member Conflict: Risk Prevention and Health Behavior. Date: June 6, 2011. Time: 11...
77 FR 35413 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-13
... Panel; Program Project: Mechanisms of Drug Disposition During Pregnancy. Date: July 9, 2012. Time: 12:00... of Committee: Center for Scientific Review Special Emphasis Panel; Member Conflict: Nutrition and...
76 FR 21386 - Center for Scientific Review; Notice of Closed Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-15
... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel; Brain Development... Federal Domestic Assistance Program Nos. 93.306, Comparative Medicine; 93.333, Clinical Research, 93.306...
78 FR 10186 - Center For Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-13
... Committee: Center for Scientific Review Special Emphasis Panel; RFA: EY 13-001 Basic Behavioral Research on... Panel; Fellowships: Cell Biology, Developmental Biology and Bioengineering. Date: March 7, 2013. Time: 8...
Scientific Visualization, Seeing the Unseeable
LBNL
2017-12-09
June 24, 2008 Berkeley Lab lecture: Scientific visualization transforms abstract data into readily comprehensible images, provides a vehicle for "seeing the unseeable," and plays a central role in both experimental and computational sciences. Wes Bethel, who heads the Scientific Visualization Group in the Computational Research Division, presents an overview of visualization and computer graphics, current research challenges, and future directions for the field.
75 FR 55333 - Board of Scientific Counselors, National Center for Health Statistics, (BSC, NCHS)
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-10
... Scientific Counselors, National Center for Health Statistics, (BSC, NCHS) In accordance with section 10(a)(2... Prevention (CDC), National Center for Health Statistics (NCHS) announces the following meeting of [email protected] or Virginia Cain, [email protected] at least 10 days in advance for requirements). All visitors...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-21
... DEPARTMENT OF HEALTH AND HUMAN SERVICES Centers for Disease Control and Prevention (CDC) Board of Scientific Counselors (BSC), Coordinating Center for Health Promotion (CCHP): Notice of Charter Amendment... both the CDC and the Agency for Toxic Substances and Disease Registry. Dated: July 13, 2010. Elaine L...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-05
... Devices and Radiological Health's (CDRH's or the Center's) draft process to clarify and more quickly inform stakeholders when CDRH has changed its expectations relating to, or otherwise has new scientific... scientific information changes CDRH's regulatory thinking, it has been challenging for the Center to...
Atmosphere of Freedom: Sixty Years at the NASA Ames Research Center
NASA Technical Reports Server (NTRS)
Bugos, Glenn E.; Launius, Roger (Technical Monitor)
2000-01-01
Throughout Ames history, four themes prevail: a commitment to hiring the best people; cutting-edge research tools; project management that gets things done faster, better, and cheaper; and outstanding research efforts that serve the scientific professions and the nation. More than any other NASA Center, Ames remains shaped by its origins in the NACA (National Advisory Committee for Aeronautics). Not that its missions remain the same. Sure, Ames still houses the world's greatest collection of wind tunnels and simulation facilities, its aerodynamicists remain among the best in the world, and pilots and engineers still come for advice on how to build better aircraft. But that is increasingly part of Ames' past. Ames people have embraced two other missions for its future. First, intelligent systems and information science will help NASA use new tools in supercomputing, networking, telepresence, and robotics. Second, astrobiology will explore the prospects for life on Earth and beyond. Both new missions leverage Ames' long-standing expertise in computation and in the life sciences, as well as its relations with the computing and biotechnology firms working in the Silicon Valley community that has sprung up around the Center. Rather than the NACA missions, it is the NACA culture that still permeates Ames. The Ames way of research management privileges the scientists and engineers working in the laboratories. They work in an atmosphere of freedom, laced with the expectation of integrity and responsibility. Ames researchers are free to define their research goals and how they contribute to the national good. They are expected to keep their fingers on the pulse of their disciplines, to be ambitious yet frugal in organizing their efforts, and to always test their theories in the laboratory or in the field. Ames' leadership ranks, traditionally, are cultivated within this scientific community. Rather than manage and supervise these researchers, Ames leadership merely guides them, represents them to NASA headquarters and the world outside, then steps out of the way before they get run over.
University of Rochester, Laboratory for Laser Energetics
NASA Astrophysics Data System (ADS)
1987-01-01
In FY86 the Laboratory produced a list of accomplishments in which it takes pride. LLE has met every laser-fusion program milestone to date in a program of research for direct-drive ultraviolet laser fusion originally formulated in 1981. LLE scientists authored or co-authored 135 scientific papers during 1985-1986. The collaborative experiments with NRL, LANL, and LLNL have led to a number of important ICF results. The cryogenic target system developed by KMS Fusion for LLE will be used in future high-density experiments on OMEGA to demonstrate the compression of thermonuclear fuel to 100 to 200 times that of solid (20 to 40 g/cm³) in a test of the direct-drive concept, as noted in the National Academy of Sciences' report. The excellence of the advanced technology efforts at LLE is illustrated by the establishment of the Ultrafast Science Center by the Department of Defense through the Air Force Office of Scientific Research. Research in the Center will concentrate on bridging the gap between high-speed electronics and ultrafast optics by providing education, research, and development in areas critical to future communications and high-speed computer systems. The Laboratory for Laser Energetics continues its pioneering work on the interaction of intense radiation with matter. This includes inertial-fusion and advanced optical and optical electronics research; training people in the technology and applications of high-power, short-pulse lasers; and interacting with the scientific community, business, industry, and government to promote the growth of laser technology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blankenship, Robert E.
"PARC - Scientific Exchange Program" was submitted by the Photosynthetic Antenna Research Center (PARC) to the "Life at the Frontiers of Energy Research" video contest at the 2011 Science for Our Nation's Energy Future: Energy Frontier Research Centers (EFRCs) Summit and Forum. Twenty-six EFRCs created short videos to highlight their mission and their work. PARC, an EFRC directed by Robert E. Blankenship at Washington University in St. Louis, is a partnership of scientists from ten institutions. The Office of Basic Energy Sciences in the U.S. Department of Energy's Office of Science established the 46 Energy Frontier Research Centers (EFRCs) inmore » 2009. These collaboratively-organized centers conduct fundamental research focused on 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The overall purpose is to accelerate scientific progress toward meeting the nation's critical energy challenges.« less
Blankenship, Robert E. (Director, Photosynthetic Antenna Research Center); PARC Staff
2017-12-09
'PARC - Scientific Exchange Program' was submitted by the Photosynthetic Antenna Research Center (PARC) to the 'Life at the Frontiers of Energy Research' video contest at the 2011 Science for Our Nation's Energy Future: Energy Frontier Research Centers (EFRCs) Summit and Forum. Twenty-six EFRCs created short videos to highlight their mission and their work. PARC, an EFRC directed by Robert E. Blankenship at Washington University in St. Louis, is a partnership of scientists from ten institutions. The Office of Basic Energy Sciences in the U.S. Department of Energy's Office of Science established the 46 Energy Frontier Research Centers (EFRCs) in 2009. These collaboratively-organized centers conduct fundamental research focused on 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The overall purpose is to accelerate scientific progress toward meeting the nation's critical energy challenges.
New NASA 3D Animation Shows Seven Days of Simulated Earth Weather
2014-08-11
This visualization shows early test renderings of a global computational model of Earth's atmosphere based on data from NASA's Goddard Earth Observing System Model, Version 5 (GEOS-5). This particular run, called Nature Run 2, was run on a supercomputer, spanned 2 years of simulation time at 30-minute intervals, and produced petabytes of output. The visualization spans a little more than 7 days of simulation time, which corresponds to 354 time steps. The time period was chosen because a simulated category-4 typhoon developed off the coast of China. The 7-day period is repeated several times during the course of the visualization. Credit: NASA's Scientific Visualization Studio. Read more or download here: svs.gsfc.nasa.gov/goto?4180 NASA Goddard Space Flight Center enables NASA's mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA's accomplishments by contributing compelling scientific knowledge to advance the Agency's mission.
OMPC: an Open-Source MATLAB-to-Python Compiler.
Jurica, Peter; van Leeuwen, Cees
2009-01-01
Free access to scientific information facilitates scientific progress. Open-access scientific journals are a first step in this direction; a further step is to make auxiliary and supplementary materials that accompany scientific publications, such as methodological procedures and data-analysis tools, open and accessible to the scientific community. To this purpose it is instrumental to establish a software base, which will grow toward a comprehensive free and open-source language of technical and scientific computing. Endeavors in this direction are met with an important obstacle. MATLAB®, the predominant computation tool in many fields of research, is a closed-source commercial product. To facilitate the transition to an open computation platform, we propose the Open-source MATLAB-to-Python Compiler (OMPC), a platform that uses syntax adaptation and emulation to allow transparent import of existing MATLAB functions into Python programs. The imported MATLAB modules will run independently of MATLAB, relying on Python's numerical and scientific libraries. Python offers a stable and mature open-source platform that, in many respects, surpasses commonly used, expensive commercial closed-source packages. The proposed software will therefore facilitate the transparent transition towards a free and general open-source lingua franca for scientific computation, while enabling access to the existing methods and algorithms of technical computing already available in MATLAB. OMPC is available at http://ompc.juricap.com.
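To illustrate the kind of target code such a translation aims at, the sketch below pairs a small MATLAB function (shown as a comment) with a hand-written Python/NumPy equivalent. This is not OMPC output; OMPC performs the import automatically through syntax adaptation and emulation, whereas this example is translated by hand.

# Hand-written Python/NumPy equivalent of a small MATLAB function.
#
#   % MATLAB original
#   function y = zscore_cols(x)
#       y = (x - repmat(mean(x), size(x,1), 1)) ./ repmat(std(x), size(x,1), 1);
#   end
import numpy as np

def zscore_cols(x):
    x = np.asarray(x, dtype=float)
    # Broadcasting replaces MATLAB's repmat; ddof=1 matches MATLAB's default std.
    return (x - x.mean(axis=0)) / x.std(axis=0, ddof=1)

print(zscore_cols([[1.0, 2.0], [3.0, 6.0], [5.0, 10.0]]))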
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
An account of the Caltech Concurrent Computation Program (C³P), a five-year project that focused on answering the question: Can parallel computers be used to do large-scale scientific computations? As the title indicates, the question is answered in the affirmative, by implementing numerous scientific applications on real parallel computers and doing computations that produced new scientific results. In the process of doing so, C³P helped design and build several new computers, designed and implemented basic system software, developed algorithms for frequently used mathematical computations on massively parallel machines, devised performance models and measured the performance of many computers, and created a high-performance computing facility based exclusively on parallel computers. While the initial focus of C³P was the hypercube architecture developed by C. Seitz, many of the methods developed and lessons learned have been applied successfully on other massively parallel architectures.
76 FR 65739 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-24
... Committee: Center for Scientific Review Special Emphasis Panel, Discovery, Design, and Development of... Assistance Program Nos. 93.306, Comparative Medicine; 93.333, Clinical Research, 93.306, 93.333, 93.337, 93...
77 FR 31861 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-30
... Special Emphasis Panel; Member Conflict: Teen Relationship Violence. Date: June 20, 2012. Time: 2:00 p.m... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel; Member Conflict...
76 FR 21385 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-15
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Center for Scientific Review; Notice of Closed Meetings Pursuant to section 10(d) of the Federal Advisory Committee Act, as amended (5...
76 FR 58523 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-21
...: Center for Scientific Review Special Emphasis Panel, Fellowships: Sensory, Motor, and Cognitive Neuroscience. Date: October 20-21, 2011. Time: 8 a.m. to 5 p.m. Agenda: To review and evaluate grant...
76 FR 44942 - Center for Scientific Review; Notice of Closed Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-27
... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel; Molecular... Assistance Program Nos. 93.306, Comparative Medicine; 93.333, Clinical Research, 93.306, 93.333, 93.337, 93...
76 FR 42719 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-19
... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel, Molecular... Assistance Program Nos. 93.306, Comparative Medicine; 93.333, Clinical Research, 93.306, 93.333, 93.337, 93...
76 FR 34719 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-14
...: Westin Seattle Hotel, 1900 5th Avenue, Seattle, WA 98101. Contact Person: Jose H Guerrier, PhD...: Center for Scientific Review Special Emphasis Panel, Small Business: Respiratory Sciences. Date: July 14...
76 FR 59413 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-26
... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel, Hypertension and... Panel, Member Conflict: Gastrointestinal Pathophysiology. Date: October 20, 2011. Time: 12 p.m. to 3 p.m...
78 FR 9404 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-08
... personal privacy. Name of Committee: Center for Scientific Review Special Emphasis Panel; SBIB Pediatric... Panel; PAR-10-244: The Collaborative Genetic Study of Nicotine Dependence. Date: March 5-6, 2013. Time...
Scaling to diversity: The DERECHOS distributed infrastructure for analyzing and sharing data
NASA Astrophysics Data System (ADS)
Rilee, M. L.; Kuo, K. S.; Clune, T.; Oloso, A.; Brown, P. G.
2016-12-01
Integrating Earth Science data from diverse sources such as satellite imagery and simulation output can be expensive and time-consuming, limiting scientific inquiry and the quality of our analyses. Reducing these costs will improve innovation and quality in science. The current Earth Science data infrastructure focuses on downloading data based on requests formed from the search and analysis of associated metadata. And while the data products provided by archives may use the best available data sharing technologies, scientist end-users generally do not have such resources (including staff) available to them. Furthermore, only once an end-user has received the data from multiple diverse sources and has integrated them can the actual analysis and synthesis begin. The cost of getting from idea to where synthesis can start dramatically slows progress. In this presentation we discuss a distributed computational and data storage framework that eliminates much of the aforementioned cost. The SciDB distributed array database is central as it is optimized for scientific computing involving very large arrays, performing better than less specialized frameworks like Spark. Adding spatiotemporal functions to the SciDB creates a powerful platform for analyzing and integrating massive, distributed datasets. SciDB allows Big Earth Data analysis to be performed "in place" without the need for expensive downloads and end-user resources. Spatiotemporal indexing technologies such as the hierarchical triangular mesh enable the compute and storage affinity needed to efficiently perform co-located and conditional analyses minimizing data transfers. These technologies automate the integration of diverse data sources using the framework, a critical step beyond current metadata search and analysis. Instead of downloading data into their idiosyncratic local environments, end-users can generate and share data products integrated from diverse multiple sources using a common shared environment, turning distributed active archive centers (DAACs) from warehouses into distributed active analysis centers.
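A small Python sketch of the co-location idea: assign every observation a spatiotemporal bin key so that records from different datasets can be partitioned and analyzed together. DERECHOS uses a hierarchical triangular mesh inside SciDB rather than the rectangular lat/lon/time binning shown here; the coordinates and bin sizes are invented for the example.

# Illustrative spatiotemporal binning; not the HTM/SciDB indexing used by DERECHOS.
from datetime import datetime

def bin_key(lat, lon, when, deg=1.0, minutes=60):
    lat_idx = int((lat + 90.0) // deg)                  # 1-degree latitude band
    lon_idx = int((lon + 180.0) // deg)                 # 1-degree longitude band
    t_idx = int(when.timestamp() // (minutes * 60))     # hourly time chunk
    return (lat_idx, lon_idx, t_idx)

sat_obs = (37.42, -122.08, datetime(2016, 7, 1, 12, 10))
model_pt = (37.61, -122.35, datetime(2016, 7, 1, 12, 40))
print(bin_key(*sat_obs) == bin_key(*model_pt))   # True: same bin, so co-located for analysis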
U.S. Department of Energy's Bioenergy Research Centers: An Overview of the Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2009-07-01
Alternative fuels from renewable cellulosic biomass--plant stalks, trunks, stems, and leaves--are expected to significantly reduce U.S. dependence on imported oil while enhancing national energy security and decreasing the environmental impacts of energy use. Ethanol and other advanced biofuels from cellulosic biomass are renewable alternatives that could increase domestic production of transportation fuels, revitalize rural economies, and reduce carbon dioxide and pollutant emissions. According to U.S. Secretary of Energy Steven Chu, 'Developing the next generation of biofuels is key to our effort to end our dependence on foreign oil and address the climate crisis while creating millions of new jobs that can't be outsourced'. In the United States, the Energy Independence and Security Act (EISA) of 2007 is an important driver for the sustainable development of renewable biofuels. As part of EISA, the Renewable Fuel Standard mandates that 36 billion gallons of biofuels are to be produced annually by 2022, of which 16 billion gallons are expected to come from cellulosic feedstocks. Although cellulosic ethanol production has been demonstrated on a pilot level, developing a cost-effective, commercial-scale cellulosic biofuel industry will require transformational science to significantly streamline current production processes. Woodchips, grasses, cornstalks, and other cellulosic biomass are widely abundant but more difficult to break down into sugars than corn grain--the primary source of U.S. ethanol fuel production today. Biological research is key to accelerating the deconstruction of cellulosic biomass into sugars that can be converted to biofuels. The Department of Energy (DOE) Office of Science has continued to play a major role in inspiring, supporting, and guiding the biotechnology revolution over the past 25 years. The DOE Genomic Science Program is advancing a new generation of research focused on achieving whole-systems understanding for biology. This program is bringing together scientists in diverse fields to understand the complex biology underlying solutions to DOE missions in energy production, environmental remediation, and climate change science. New interdisciplinary research communities are emerging, as are knowledgebases and scientific and computational resources critical to advancing large-scale, genome-based biology. To focus the most advanced biotechnology-based resources on the biological challenges of biofuel production, DOE established three Bioenergy Research Centers (BRCs) in September 2007. Each center is pursuing the basic research underlying a range of high-risk, high-return biological solutions for bioenergy applications. Advances resulting from the BRCs will provide the knowledge needed to develop new biobased products, methods, and tools that the emerging biofuel industry can use. The scientific rationale for these centers and for other fundamental genomic research critical to the biofuel industry was established at a DOE workshop involving members of the research community (see sidebar, Biofuel Research Plan, below). The DOE BRCs have developed automated, high-throughput analysis pipelines that will accelerate scientific discovery for biology-based biofuel research. The three centers, which were selected through a scientific peer-review process, are based in geographically diverse locations--the Southeast, the Midwest, and the West Coast--with partners across the nation.
DOE's Oak Ridge National Laboratory leads the BioEnergy Science Center (BESC) in Tennessee; the University of Wisconsin-Madison leads the Great Lakes Bioenergy Research Center (GLBRC); and DOE's Lawrence Berkeley National Laboratory leads the DOE Joint BioEnergy Institute (JBEI) in California. Each center represents a multidisciplinary partnership with expertise spanning the physical and biological sciences, including genomics, microbial and plant biology, analytical chemistry, computational biology and bioinformatics, and engineering. Institutional partners include DOE national laboratories, universities, private companies, and nonprofit organizations.
Exploring Cloud Computing for Large-scale Scientific Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, Guang; Han, Binh; Yin, Jian
This paper explores cloud computing for large-scale data-intensive scientific applications. Cloud computing is attractive because it provides hardware and software resources on demand, which relieves the burden of acquiring and maintaining a huge amount of resources that may be used only once by a scientific application. However, unlike typical commercial applications that often require only a moderate amount of ordinary resources, large-scale scientific applications often need to process enormous amounts of data in the terabyte or even petabyte range and require special high-performance hardware with low-latency connections to complete computation in a reasonable amount of time. To address these challenges, we build an infrastructure that can dynamically select high-performance computing hardware across institutions and dynamically adapt the computation to the selected resources to achieve high performance. We have also demonstrated the effectiveness of our infrastructure by building a system biology application and an uncertainty quantification application for carbon sequestration, which can efficiently utilize data and computation resources across several institutions.
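A toy Python version of the selection step: estimate transfer-plus-compute time for a job at each candidate site and pick the minimum. The site names, bandwidths, and queue waits are hypothetical; the paper's infrastructure makes this decision dynamically and with richer information than this sketch.

# Hypothetical resource-selection sketch; numbers and site names are invented.
def estimated_completion(job_tb, flop_need, site):
    transfer_s = (job_tb * 8e12) / site["net_bps"]      # time to move the input data
    compute_s = flop_need / site["flops"]               # time to run the computation
    return transfer_s + compute_s + site["queue_wait_s"]

sites = {
    "campus_cluster": {"net_bps": 10e9, "flops": 5e13, "queue_wait_s": 600},
    "partner_hpc":    {"net_bps": 1e9,  "flops": 5e14, "queue_wait_s": 3600},
}
job_tb, flop_need = 2.0, 1e18
best = min(sites, key=lambda s: estimated_completion(job_tb, flop_need, sites[s]))
print(best, f"{estimated_completion(job_tb, flop_need, sites[best]):.0f} s")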
[Organization of clinical research: in general and visceral surgery].
Schneider, M; Werner, J; Weitz, J; Büchler, M W
2010-04-01
The structural organization of research facilities within a surgical university center should aim at strengthening the department's research output and likewise provide opportunities for the scientific education of academic surgeons. We suggest a model in which several independent research groups within a surgical department engage in research projects covering various aspects of surgically relevant basic, translational or clinical research. In order to enhance the translational aspects of surgical research, a permanent link needs to be established between the department's scientific research projects and its chief interests in clinical patient care. Importantly, a focus needs to be placed on obtaining evidence-based data to judge the efficacy of novel diagnostic and treatment concepts. Integration of modern technologies from the fields of physics, computer science and molecular medicine into surgical research necessitates cooperation with external research facilities, which can be strengthened by coordinated support programs offered by research funding institutions.
Utah Virtual Lab: JAVA interactivity for teaching science and statistics on line.
Malloy, T E; Jensen, G C
2001-05-01
The Utah on-line Virtual Lab is a JAVA program run dynamically off a database. It is embedded in StatCenter (www.psych.utah.edu/learn/statsampler.html), an on-line collection of tools and text for teaching and learning statistics. Instructors author a statistical virtual reality that simulates theories and data in a specific research focus area by defining independent, predictor, and dependent variables and the relations among them. Students work in an on-line virtual environment to discover the principles of this simulated reality: They go to a library, read theoretical overviews and scientific puzzles, and then go to a lab, design a study, collect and analyze data, and write a report. Each student's design and data analysis decisions are computer-graded and recorded in a database; the written research report can be read by the instructor or by other students in peer groups simulating scientific conventions.
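A minimal Python sketch of the underlying idea: the instructor defines the variables and the relation among them, and a student's designed study simply samples from that simulated reality. The linear model, coefficients, and noise level are assumptions for illustration; the actual Virtual Lab is a JAVA program driven by a database.

# Illustrative "instructor-defined reality" sketch; not the StatCenter JAVA code.
import random

def define_reality(b0=50.0, b1=2.5, noise_sd=5.0):
    # Dependent score = b0 + b1 * hours_studied + Gaussian noise.
    def run_study(hours_list, seed=None):
        rng = random.Random(seed)
        return [(h, b0 + b1 * h + rng.gauss(0.0, noise_sd)) for h in hours_list]
    return run_study

run_study = define_reality()
for hours, score in run_study(hours_list=[0, 2, 4, 6, 8, 10], seed=42):
    print(f"hours={hours:>2}  score={score:5.1f}")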
Odessa Observatory as a Cultural and Scientific Educational Center on the Black Sea
NASA Astrophysics Data System (ADS)
Karetnikov, V. G.; Dorokhova, T. N.
2007-10-01
Odessa is a large port city on the Black Sea. Historically, the city's transport, economic, and business activities called for the establishment of a major astronomical center. In 2006 the Astronomical Observatory of Odessa National University celebrated its 135th anniversary. Some interesting astronomical buildings and instruments of the 19th and 20th centuries, and the extensive scientific investigations reflected in numerous publications, make the Observatory not only a scientific and educational establishment but also an historical and cultural center.
NASA Astrophysics Data System (ADS)
Kulchitsky, A.; Maurits, S.; Watkins, B.
2006-12-01
With the widespread availability of the Internet today, many people can monitor various scientific research activities. It is important to accommodate this interest by providing on-line access to dynamic and illustrative Web resources that can demonstrate different aspects of ongoing research. It is especially important to explain these research activities to high school and undergraduate students, thereby providing more information for decisions concerning their future studies. Such Web resources are also important for clarifying scientific research for the general public, in order to achieve better awareness of research progress in various fields. Particularly rewarding is the dissemination of information about ongoing projects within universities and research centers to their local communities. The benefits of this type of scientific outreach are mutual, since the development of Web-based automatic systems is a prerequisite for many research projects targeting real-time monitoring and/or modeling of natural conditions. Continuous operation of such systems also provides ongoing research opportunities for statistically massive validation of the models. We have developed a Web-based system to run the University of Alaska Fairbanks Polar Ionospheric Model in real time. This model makes use of networking and computational resources at the Arctic Region Supercomputing Center. The system was designed to be portable among various operating systems and computational resources. Its components can be installed across different computers, separating Web servers and computational engines. The core of the system is a Real-Time Management module (RMM) written in Python, which coordinates remote input data transfers, the ionospheric model runs, MySQL database updates, and PHP scripts for the Web-page preparation. The RMM downloads current geophysical inputs as soon as they become available at different on-line repositories. This information is processed to provide inputs for the next ionospheric model time step and then stored in a MySQL database as the first part of the time-specific record. The RMM then synchronizes the input times with the current model time, decides whether to initialize the next model time step, and monitors its execution. As soon as the model completes computations for the next time step, the RMM renders the current model output into various short-term (about 1-2 hours) forecasting products and compares prior results with available ionospheric measurements. The RMM places the prepared images into the MySQL database, which can be located on a different computer node, and then proceeds to the next time interval, continuing the time loop. The upper-level interface of this real-time system is a PHP-based Web site (http://www.arsc.edu/SpaceWeather/new). This site provides general information about the Earth's polar and adjacent mid-latitude ionosphere, allows monitoring of current developments and short-term forecasts, and facilitates access to the comparison archive stored in the database.
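A stripped-down Python sketch of a management loop in the spirit of the RMM described above: fetch the latest inputs, advance the model one step, and store the product for the web front end. It uses sqlite3 and placeholder functions so it runs standalone; the real system downloads geophysical indices remotely, runs the UAF polar ionospheric model, and writes to MySQL.

# Illustrative real-time management loop; placeholders stand in for the real model and data feeds.
import sqlite3, time

def fetch_latest_inputs(step):
    # Placeholder for downloading current geophysical indices (e.g., Kp, F10.7).
    return {"step": step, "kp": 2.0 + 0.1 * step, "f107": 120.0}

def run_model_step(inputs):
    # Placeholder for one ionospheric model time step.
    return {"step": inputs["step"], "peak_density": 1.0e11 * (1 + 0.01 * inputs["kp"])}

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE results (step INTEGER PRIMARY KEY, peak_density REAL)")
for step in range(3):                       # the real loop runs indefinitely
    inputs = fetch_latest_inputs(step)
    output = run_model_step(inputs)
    db.execute("INSERT INTO results VALUES (?, ?)", (output["step"], output["peak_density"]))
    db.commit()
    time.sleep(0.1)                         # the real system syncs to data arrival times
print(db.execute("SELECT * FROM results").fetchall())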
An Overview of the Computational Physics and Methods Group at Los Alamos National Laboratory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baker, Randal Scott
CCS Division was formed to strengthen the visibility and impact of computer science and computational physics research on strategic directions for the Laboratory. Both computer science and computational science are now central to scientific discovery and innovation. They have become indispensable tools for all other scientific missions at the Laboratory. CCS Division forms a bridge between external partners and Laboratory programs, bringing new ideas and technologies to bear on today’s important problems and attracting high-quality technical staff members to the Laboratory. The Computational Physics and Methods Group CCS-2 conducts methods research and develops scientific software aimed at the latest and emerging HPC systems.
[Earth Science Technology Office's Computational Technologies Project
NASA Technical Reports Server (NTRS)
Fischer, James (Technical Monitor); Merkey, Phillip
2005-01-01
This grant supported the effort to characterize the problem domain of the Earth Science Technology Office's Computational Technologies Project, to engage the Beowulf Cluster Computing Community as well as the High Performance Computing Research Community so that we can predict the applicability of said technologies to the scientific community represented by the CT project and formulate long term strategies to provide the computational resources necessary to attain the anticipated scientific objectives of the CT project. Specifically, the goal of the evaluation effort is to use the information gathered over the course of the Round-3 investigations to quantify the trends in scientific expectations, the algorithmic requirements and capabilities of high-performance computers to satisfy this anticipated need.
Low Latency Workflow Scheduling and an Application of Hyperspectral Brightness Temperatures
NASA Astrophysics Data System (ADS)
Nguyen, P. T.; Chapman, D. R.; Halem, M.
2012-12-01
New system analytics for Big Data computing holds the promise of major scientific breakthroughs and discoveries from the exploration and mining of the massive data sets becoming available to the science community. However, such data-intensive scientific applications face severe challenges in accessing, managing, and analyzing petabytes of data. While the Hadoop MapReduce environment has been successfully applied to data-intensive problems arising in business, there are still many scientific problem domains where limitations in the functionality of MapReduce systems prevent its wide adoption by those communities. This is mainly because MapReduce does not readily support unique science-discipline needs such as special science data formats, graphic and computational data analysis tools, maintaining high degrees of computational accuracy, and interfacing with an application's existing components across heterogeneous computing processors. We address some of these limitations by exploiting the MapReduce programming model for satellite data-intensive scientific problems and address scalability, reliability, scheduling, and data management issues when dealing with climate data records and their complex observational challenges. In addition, we present techniques to support unique Earth science discipline needs such as dealing with special science data formats (HDF and NetCDF). We have developed a Hadoop task scheduling algorithm that improves latency by 2x for a scientific workflow that includes the gridding of the EOS AIRS hyperspectral Brightness Temperatures (BT). This workflow processing algorithm has been tested on the Multicore Computing Center's private Hadoop-based Intel Nehalem cluster, as well as in a virtual mode under the open-source Eucalyptus cloud. The 55 TB AIRS hyperspectral L1b Brightness Temperature record has been gridded at a resolution of 0.5 x 1.0 degrees, and we have computed a 0.9 annual anti-correlation to the El Niño-Southern Oscillation in the Niño 4 region, as well as a 1.9 Kelvin decadal Arctic warming in the 4 µm and 12 µm spectral regions. Additionally, we present the frequency of extreme global warming events by using the maximum BT in a grid cell normalized by its local standard deviation. A low-latency Hadoop scheduling environment maintains data integrity and fault tolerance in a MapReduce data-intensive cloud environment while improving the "time to solution" metric by 35% when compared to a more traditional parallel processing system for the same dataset. Our next step will be to improve the usability of our Hadoop task scheduling system, to enable rapid prototyping of data-intensive experiments by means of processing "kernels". We will report on the performance and experience of implementing these experiments on the NEX testbed, and propose the use of a graphical directed acyclic graph (DAG) interface to help us develop on-demand scientific experiments. Our workflow system works within the Hadoop infrastructure as a replacement for the FIFO or Fair Scheduler, so Apache Pig Latin or other Apache tools may also be worth investigating on the NEX system to improve the usability of our workflow scheduling infrastructure for rapid experimentation.
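A small NumPy sketch of the two post-processing steps mentioned above: grid brightness temperatures onto 0.5-degree by 1.0-degree cells, then express each cell's maximum BT relative to its own mean and standard deviation as a crude extreme-event indicator. The arrays below are toy values; the actual record is the 55 TB AIRS L1b dataset processed under Hadoop.

# Toy gridding and normalized-maximum sketch; not the Hadoop workflow itself.
import numpy as np

lat = np.array([10.1, 10.3, 10.2, 60.7, 60.9])        # degrees north
lon = np.array([140.2, 140.8, 141.6, 30.1, 30.4])     # degrees east
bt  = np.array([285.0, 287.5, 310.0, 250.0, 251.0])   # Kelvin

lat_idx = ((lat + 90.0) // 0.5).astype(int)           # 0.5-degree latitude cells
lon_idx = ((lon + 180.0) // 1.0).astype(int)          # 1.0-degree longitude cells
cells = {}
for i, key in enumerate(zip(lat_idx, lon_idx)):
    cells.setdefault(key, []).append(bt[i])

for key, values in cells.items():
    v = np.array(values)
    if len(v) > 1:
        norm_max = (v.max() - v.mean()) / v.std(ddof=1)   # max BT in local std units
    else:
        norm_max = float("nan")                           # single-sample cell: no spread estimate
    print(key, f"mean={v.mean():.1f} K  normalized max={norm_max:.2f}")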
CICT Computing, Information, and Communications Technology Program
NASA Technical Reports Server (NTRS)
Laufenberg, Lawrence; Tu, Eugene (Technical Monitor)
2002-01-01
The CICT Program is part of the NASA Aerospace Technology Enterprise's fundamental technology thrust to develop tools, processes, and technologies that enable new aerospace system capabilities and missions. The CICT Program's four key objectives are to: provide seamless access to NASA resources, including ground-, air-, and space-based distributed information technology resources, so that NASA scientists and engineers can more easily control missions, make new scientific discoveries, and design the next-generation space vehicles; provide high data-rate delivery from these assets directly to users for missions; develop goal-oriented, human-centered systems; and research, develop, and evaluate revolutionary technology.
GIS plays key role in NYC Rescue and Relief Operation
NASA Astrophysics Data System (ADS)
Showstack, Randy
New York City, Sept. 17—The posters of missing loved ones are pasted onto New York City walls and street signs six days after 2 hijacked commercial airliners destroyed the World Trade Center in lower Manhattan on September 11. Several miles uptown from “ground zero,” heightened security hovers around the city's Office of Emergency Management rescue and relief command center, an around-the-clock operation. Police, firefighters, military, officials with the Federal Emergency Management Agency, communications technicians, and a beehive of others work in controlled chaos in this cavernous, convention center-sized hall, lined with computers and adorned with several American flags. After the original command center at 7 World Trade Center collapsed to rubble as an after-effect of the plane strikes, city officials scrambled to recreate it. Alan Leidner, director of New York's citywide geographic information systems (GIS), who is with the Department of Information Technology and Telecommunications, knew that maps would be an integral component of the rescue and relief efforts. Maps provide emergency workers and others with accurate and detailed scientific data in the form of visual aids upon which they can make informed decisions.
76 FR 14036 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-15
..., [email protected]csr.nih.gov . This notice is being published less than 15 days prior to the meeting due to the timing limitations imposed by the review and funding cycle. Name of Committee: Center for Scientific... 7852, Bethesda, MD 20892, (301) 435-1166, [email protected]csr.nih.gov . Name of Committee: Center for...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-08
... DEPARTMENT OF HEALTH AND HUMAN SERVICES Centers for Disease Control and Prevention World Trade Center Health Program Scientific/Technical Advisory Committee (WTCHP STAC or Advisory Committee), National Institute for Occupational Safety and Health (NIOSH) In accordance with section 10(a)(2) of the Federal Advisory Committee Act (Pub. L. 92-463),...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-02
... DEPARTMENT OF HEALTH AND HUMAN SERVICES Centers for Disease Control and Prevention World Trade Center Health Program Scientific/Technical Advisory Committee (WTCHP STAC or Advisory Committee), National Institute for Occupational Safety and Health (NIOSH) Notice of Cancellation: This notice was published in the Federal Register on December 29,...