Sample records for computer science center

  1. Berkeley Lab - Materials Sciences Division

    Science.gov Websites

    Center for Computational Study of Excited-State Phenomena in Energy Materials; Center for X-ray Optics; MSD Facilities; Ion and Materials Physics; Scattering and Instrumentation Science Centers…

  2. Exploring the Relationships between Self-Efficacy and Preference for Teacher Authority among Computer Science Majors

    ERIC Educational Resources Information Center

    Lin, Che-Li; Liang, Jyh-Chong; Su, Yi-Ching; Tsai, Chin-Chung

    2013-01-01

    Teacher-centered instruction has been widely adopted in college computer science classrooms and has some benefits in training computer science undergraduates. Meanwhile, student-centered contexts have been advocated to promote computer science education. How computer science learners respond to or prefer the two types of teacher authority,…

  3. NASA Center for Computational Sciences: History and Resources

    NASA Technical Reports Server (NTRS)

    2000-01-01

    The NASA Center for Computational Sciences (NCCS) has been a leading capacity computing facility, providing a production environment and support resources to address the challenges facing the Earth and space sciences research community.

  4. Mathematics and Computer Science | Argonne National Laboratory

    Science.gov Websites

    Genomics and Systems Biology; LCRC (Laboratory Computing Resource Center); MCSG (Midwest Center for Structural Genomics); NAISE (Northwestern-Argonne Institute of Science & Engineering); SBC (Structural Biology Center)

  5. Cornell University Center for Advanced Computing

    Science.gov Websites

    Data Management (RDMSG); Computational Agriculture; Lifka joins National Science Foundation CISE Advisory Committee…

  6. Computers as learning resources in the health sciences: impact and issues.

    PubMed Central

    Ellis, L B; Hannigan, G G

    1986-01-01

    Starting with two computer terminals in 1972, the Health Sciences Learning Resources Center of the University of Minnesota Bio-Medical Library expanded its instructional facilities to ten terminals and thirty-five microcomputers by 1985. Computer use accounted for 28% of total center circulation. The impact of these resources on health sciences curricula is described and issues related to use, support, and planning are raised and discussed. Judged by their acceptance and educational value, computers are successful health sciences learning resources at the University of Minnesota. PMID:3518843

  7. EOS MLS Science Data Processing System: A Description of Architecture and Capabilities

    NASA Technical Reports Server (NTRS)

    Cuddy, David T.; Echeverri, Mark D.; Wagner, Paul A.; Hanzel, Audrey T.; Fuller, Ryan A.

    2006-01-01

    This paper describes the architecture and capabilities of the Science Data Processing System (SDPS) for the EOS MLS. The SDPS consists of two major components--the Science Computing Facility and the Science Investigator-led Processing System. The Science Computing Facility provides the facilities for the EOS MLS Science Team to perform the functions of scientific algorithm development, processing software development, quality control of data products, and scientific analyses. The Science Investigator-led Processing System processes and reprocesses the science data for the entire mission and delivers the data products to the Science Computing Facility and to the Goddard Space Flight Center Earth Science Distributed Active Archive Center, which archives and distributes the standard science products.

  8. Computational Science News | Computational Science | NREL

    Science.gov Websites

    NREL Launches New Website for High-Performance Computing System Users (February 28, 2018): The National Renewable Energy Laboratory (NREL) Computational Science Center has launched a revamped website for users of the lab's high-performance computing (HPC) systems…

  9. Using Frameworks in a Government Contracting Environment: Case Study at the NASA Center for Computational Sciences

    NASA Technical Reports Server (NTRS)

    McGalliard, James

    2008-01-01

    A viewgraph describing the use of multiple frameworks by NASA, GSA, and U.S. Government agencies is presented. The contents include: 1) Federal Systems Integration and Management Center (FEDSIM) and NASA Center for Computational Sciences (NCCS) Environment; 2) Ruling Frameworks; 3) Implications; and 4) Reconciling Multiple Frameworks.

  10. High-Performance Computing Data Center Warm-Water Liquid Cooling

    Science.gov Websites

    NREL's High-Performance Computing Data Center (HPC Data Center) is liquid cooled. Liquid cooling technologies offer a more energy-efficient solution that also allows for effective…

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hules, John

    This 1998 annual report from the National Energy Research Scientific Computing Center (NERSC) presents the year in review in the following categories: Computational Science; Computer Science and Applied Mathematics; and Systems and Services. Also presented are science highlights in the following categories: Basic Energy Sciences; Biological and Environmental Research; Fusion Energy Sciences; High Energy and Nuclear Physics; and Advanced Scientific Computing Research and Other Projects.

  12. 77 FR 38630 - Open Internet Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-28

    ... Computer Science and Co-Founder of the Berkman Center for Internet and Society, Harvard University, is... of Technology Computer Science and Artificial Intelligence Laboratory, is appointed vice-chairperson... Jennifer Rexford, Professor of Computer Science, Princeton University Dennis Roberson, Vice Provost...

  13. Join the Center for Applied Scientific Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gamblin, Todd; Bremer, Timo; Van Essen, Brian

    The Center for Applied Scientific Computing serves as Livermore Lab’s window to the broader computer science, computational physics, applied mathematics, and data science research communities. In collaboration with academic, industrial, and other government laboratory partners, we conduct world-class scientific research and development on problems critical to national security. CASC applies the power of high-performance computing and the efficiency of modern computational methods to the realms of stockpile stewardship, cyber and energy security, and knowledge discovery for intelligence applications.

  14. Cloudbursting - Solving the 3-body problem

    NASA Astrophysics Data System (ADS)

    Chang, G.; Heistand, S.; Vakhnin, A.; Huang, T.; Zimdars, P.; Hua, H.; Hood, R.; Koenig, J.; Mehrotra, P.; Little, M. M.; Law, E.

    2014-12-01

    Many future science projects will be accomplished through collaboration among two or more NASA centers along with, potentially, external scientists. Science teams will be composed of more geographically dispersed individuals and groups. However, the current computing environment does not make this easy and seamless. By sharing computing resources among members of a multi-center team working on a science/engineering project, limited pre-competition funds could be applied more efficiently and technical work could be conducted more effectively, with less time spent moving data or waiting for computing resources to free up. Based on the work from a NASA CIO IT Labs task, this presentation will highlight our prototype work in identifying the feasibility of, and the technical and management obstacles to, performing "Cloudbursting" among private clouds located at three different centers. We will demonstrate the use of private cloud computing infrastructure at the Jet Propulsion Laboratory, Langley Research Center, and Ames Research Center to provide elastic computation to each other to perform parallel Earth science data imaging. We leverage elastic load balancing and auto-scaling features at each data center so that each location can independently define how many resources to allocate to a particular job that was "bursted" from another data center, and we demonstrate that compute capacity scales up and down with the job. We will also discuss future work in the area, which could include the use of cloud infrastructure from different cloud framework providers as well as other cloud service providers.
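
    The key mechanism in this abstract is the per-site auto-scaling policy: each data center independently decides how much capacity to lend a "bursted" job. The following minimal Python sketch shows one way such a policy can scale allocation up with queue depth and down as it drains; all names are hypothetical and are not taken from the NASA prototype.

        # Toy per-site auto-scaling policy for bursted jobs; names are hypothetical.
        from dataclasses import dataclass

        @dataclass
        class SitePolicy:
            max_nodes: int           # cap this site will lend to bursted jobs
            nodes_per_task: int = 1  # nodes each queued imaging task needs

            def nodes_to_allocate(self, queued_tasks: int, busy_nodes: int) -> int:
                """Scale up with demand, down as the queue drains, never past the cap."""
                wanted = queued_tasks * self.nodes_per_task
                available = max(self.max_nodes - busy_nodes, 0)
                return min(wanted, available)

        site = SitePolicy(max_nodes=16)
        print(site.nodes_to_allocate(queued_tasks=40, busy_nodes=10))  # 6: capped by free nodes
        print(site.nodes_to_allocate(queued_tasks=2, busy_nodes=0))    # 2: tracks the queue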

  15. SANs and Large Scale Data Migration at the NASA Center for Computational Sciences

    NASA Technical Reports Server (NTRS)

    Salmon, Ellen M.

    2004-01-01

    Evolution and migration are a way of life for provisioners of high-performance mass storage systems that serve high-end computers used by climate and Earth and space science researchers: the compute engines come and go, but the data remains. At the NASA Center for Computational Sciences (NCCS), disk and tape SANs are deployed to provide high-speed I/O for the compute engines and the hierarchical storage management systems. Along with gigabit Ethernet, they also enable the NCCS's latest significant migration: the transparent transfer of 300 TB of legacy HSM data into the new Sun SAM-QFS cluster.

  16. Storage and network bandwidth requirements through the year 2000 for the NASA Center for Computational Sciences

    NASA Technical Reports Server (NTRS)

    Salmon, Ellen

    1996-01-01

    The data storage and retrieval demands of space and Earth sciences researchers have made the NASA Center for Computational Sciences (NCCS) Mass Data Storage and Delivery System (MDSDS) one of the world's most active Convex UniTree systems. Science researchers formed the NCCS's Computer Environments and Research Requirements Committee (CERRC) to relate their projected supercomputing and mass storage requirements through the year 2000. Using the CERRC guidelines and observations of current usage, some detailed projections of requirements for MDSDS network bandwidth and mass storage capacity and performance are presented.

  17. Center for Technology for Advanced Scientific Component Software (TASCS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kostadin, Damevski

    A resounding success of the Scientific Discovery through Advanced Computing (SciDAC) program is that high-performance computational science is now universally recognized as a critical aspect of scientific discovery [71], complementing both theoretical and experimental research. As scientific communities prepare to exploit the unprecedented computing capabilities of emerging leadership-class machines for multi-model simulations at the extreme scale [72], it is more important than ever to address the technical and social challenges of geographically distributed teams that combine expertise in domain science, applied mathematics, and computer science to build robust and flexible codes that can incorporate changes over time. The Center for Technology for Advanced Scientific Component Software (TASCS) tackles these issues by exploiting component-based software development to facilitate collaborative high-performance scientific computing.

  18. High performance computing for advanced modeling and simulation of materials

    NASA Astrophysics Data System (ADS)

    Wang, Jue; Gao, Fei; Vazquez-Poletti, Jose Luis; Li, Jianjiang

    2017-02-01

    The First International Workshop on High Performance Computing for Advanced Modeling and Simulation of Materials (HPCMS 2015) was held in Austin, Texas, USA, on Nov. 18, 2015. HPCMS 2015 was organized by the Computer Network Information Center (Chinese Academy of Sciences), the University of Michigan, Universidad Complutense de Madrid, the University of Science and Technology Beijing, Pittsburgh Supercomputing Center, China Institute of Atomic Energy, and Ames Laboratory.

  19. After-Hours Science: Microchips and Onion Dip.

    ERIC Educational Resources Information Center

    Brugger, Steve

    1984-01-01

    Computer programs were developed for a science center nutrition exhibit. The exhibit was recognized by the National Science Teachers Association Search for Excellence in Science Education as an outstanding science program. The computer programs (Apple II) and their use in the exhibit are described. (BC)

  20. Characteristics of the Navy Laboratory Warfare Center Technical Workforce

    DTIC Science & Technology

    2013-09-29

    Mathematics and Information Science (M&IS) Actuarial Science 1510 Computer Science 1550 Gen. Math & Statistics 1501 Mathematics 1520 Operations...Admin. Network Systems & Data Communication Analysts Actuaries Mathematicians Operations Research Analysts Statisticians Social Science (SS...workforce was sub-divided into six broad occupational groups: Life Science, Physical Science, Engineering, Mathematics, Computer Science and Information

  1. Center for Computing Research Summer Research Proceedings 2015.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bradley, Andrew Michael; Parks, Michael L.

    2015-12-18

    The Center for Computing Research (CCR) at Sandia National Laboratories organizes a summer student program each year, in coordination with the Computer Science Research Institute (CSRI) and the Cyber Engineering Research Institute (CERI).

  2. Vanderbilt University Institute of Imaging Science Center for Computational Imaging XNAT: A multimodal data archive and processing environment.

    PubMed

    Harrigan, Robert L; Yvernault, Benjamin C; Boyd, Brian D; Damon, Stephen M; Gibney, Kyla David; Conrad, Benjamin N; Phillips, Nicholas S; Rogers, Baxter P; Gao, Yurui; Landman, Bennett A

    2016-01-01

    The Vanderbilt University Institute of Imaging Science (VUIIS) Center for Computational Imaging (CCI) has developed a database built on XNAT housing over a quarter of a million scans. The database provides a framework for (1) rapid prototyping, (2) large-scale batch processing of images, and (3) scalable project management. The system uses the web-based interfaces of XNAT and REDCap to allow for graphical interaction. A Python middleware layer, the Distributed Automation for XNAT (DAX) package, distributes computation across the Vanderbilt Advanced Computing Center for Research and Education high performance computing center. All software is made available as open source for use in combining portable batch scripting (PBS) grids and XNAT servers.
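
    A hypothetical sketch of the distribution pattern this record describes: a Python layer writes per-session PBS job scripts and submits them with qsub. This is not the actual DAX API; the session IDs and the processing command are invented for illustration.

        # Hypothetical Python-to-PBS submission, in the spirit of a DAX-style
        # middleware layer. The processing command is a placeholder.
        import subprocess
        import textwrap

        def submit_pbs_job(session_id: str, walltime: str = "01:00:00") -> str:
            """Write a small PBS script for one imaging session and submit it with qsub."""
            script = textwrap.dedent(f"""\
                #!/bin/bash
                #PBS -N proc_{session_id}
                #PBS -l walltime={walltime}
                process_session --id {session_id}   # hypothetical processing command
            """)
            result = subprocess.run(["qsub"], input=script, text=True,
                                    capture_output=True, check=True)
            return result.stdout.strip()  # PBS prints the new job's ID on stdout

        for session in ["sess_0001", "sess_0002"]:
            print("submitted", submit_pbs_job(session))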

  3. How the Theory of Computing Can Help in Space Exploration

    NASA Technical Reports Server (NTRS)

    Kreinovich, Vladik; Longpre, Luc

    1997-01-01

    The opening of the NASA Pan American Center for Environmental and Earth Sciences (PACES) at the University of Texas at El Paso made it possible to organize the student Center for Theoretical Research and its Applications in Computer Science (TRACS). In this abstract, we briefly describe the main NASA-related research directions of the TRACS center, and give an overview of the preliminary results of student research.

  4. NASA Center for Climate Simulation (NCCS) Advanced Technology AT5 Virtualized Infiniband Report

    NASA Technical Reports Server (NTRS)

    Thompson, John H.; Bledsoe, Benjamin C.; Wagner, Mark; Shakshober, John; Fromkin, Russ

    2013-01-01

    The NCCS is part of the Computational and Information Sciences and Technology Office (CISTO) of Goddard Space Flight Center's (GSFC) Sciences and Exploration Directorate. The NCCS's mission is to enable scientists to increase their understanding of the Earth, the solar system, and the universe by supplying state-of-the-art high performance computing (HPC) solutions. To accomplish this mission, the NCCS (https://www.nccs.nasa.gov) provides high performance compute engines, mass storage, and network solutions to meet the specialized needs of the Earth and space science user communities.

  5. Kepler Science Operations Center Architecture

    NASA Technical Reports Server (NTRS)

    Middour, Christopher; Klaus, Todd; Jenkins, Jon; Pletcher, David; Cote, Miles; Chandrasekaran, Hema; Wohler, Bill; Girouard, Forrest; Gunter, Jay P.; Uddin, Kamal; et al.

    2010-01-01

    We give an overview of the operational concepts and architecture of the Kepler Science Data Pipeline. Designed, developed, operated, and maintained by the Science Operations Center (SOC) at NASA Ames Research Center, the Kepler Science Data Pipeline is a central element of the Kepler Ground Data System. The SOC charter is to analyze stellar photometric data from the Kepler spacecraft and report results to the Kepler Science Office for further analysis. We describe how this is accomplished via the Kepler Science Data Pipeline, including the hardware infrastructure, scientific algorithms, and operational procedures. The SOC consists of an office at Ames Research Center, software development and operations departments, and a data center that hosts the computers required to perform data analysis. We discuss the high-performance, parallel computing software modules of the Kepler Science Data Pipeline that perform transit photometry, pixel-level calibration, systematic error-correction, attitude determination, stellar target management, and instrument characterization. We explain how data processing environments are divided to support operational processing and test needs, and we describe the operational timelines for data processing and the data constructs that flow into the Kepler Science Data Pipeline.
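
    A minimal sketch of the staged-pipeline structure this record describes, with pixel-level calibration feeding transit photometry and then systematic error-correction. The stage functions below are placeholders, not SOC code.

        # Toy staged pipeline; stage names come from the abstract, bodies are stubs.
        from typing import Callable, List

        Stage = Callable[[dict], dict]

        def calibrate(data: dict) -> dict:
            data["calibrated"] = True               # pixel-level calibration stub
            return data

        def transit_photometry(data: dict) -> dict:
            data["light_curve"] = [1.0, 0.99, 1.0]  # placeholder flux series
            return data

        def correct_systematics(data: dict) -> dict:
            data["corrected"] = True                # systematic error-correction stub
            return data

        def run_pipeline(data: dict, stages: List[Stage]) -> dict:
            for stage in stages:   # each module consumes the prior module's output
                data = stage(data)
            return data

        result = run_pipeline({"target": "kic_12345"},
                              [calibrate, transit_photometry, correct_systematics])
        print(result)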

  6. The Kepler Science Data Processing Pipeline Source Code Road Map

    NASA Technical Reports Server (NTRS)

    Wohler, Bill; Jenkins, Jon M.; Twicken, Joseph D.; Bryson, Stephen T.; Clarke, Bruce Donald; Middour, Christopher K.; Quintana, Elisa Victoria; Sanderfer, Jesse Thomas; Uddin, Akm Kamal; Sabale, Anima; et al.

    2016-01-01

    We give an overview of the operational concepts and architecture of the Kepler Science Processing Pipeline. Designed, developed, operated, and maintained by the Kepler Science Operations Center (SOC) at NASA Ames Research Center, the Science Processing Pipeline is a central element of the Kepler Ground Data System. The SOC consists of an office at Ames Research Center, software development and operations departments, and a data center which hosts the computers required to perform data analysis. The SOC's charter is to analyze stellar photometric data from the Kepler spacecraft and report results to the Kepler Science Office for further analysis. We describe how this is accomplished via the Kepler Science Processing Pipeline, including the software algorithms. We present the high-performance, parallel computing software modules of the pipeline that perform transit photometry, pixel-level calibration, systematic error correction, attitude determination, stellar target management, and instrument characterization.

  7. Remote Science Operation Center research

    NASA Technical Reports Server (NTRS)

    Banks, P. M.

    1986-01-01

    Progress in the following areas is discussed: the design, planning and operation of a remote science payload operations control center; design and planning of a data link via satellite; and the design and prototyping of an advanced workstation environment for multi-media (3-D computer aided design/computer aided engineering, voice, video, text) communications and operations.

  8. The Mathematics and Computer Science Learning Center (MLC).

    ERIC Educational Resources Information Center

    Abraham, Solomon T.

    The Mathematics and Computer Science Learning Center (MLC) was established in the Department of Mathematics at North Carolina Central University during the fall semester of the 1982-83 academic year. The initial operations of the MLC were supported by grants to the University from the Burroughs-Wellcome Company and the Kenan Charitable Trust Fund.…

  9. Computational Science | NREL

    Science.gov Websites

    [Photo: person viewing a 3D visualization of a wind turbine.] The NREL Computational Science Center addresses challenges in fields ranging from condensed matter physics and nonlinear dynamics to computational fluid dynamics. NREL is also home to the most energy-efficient data center in the world, featuring Peregrine, the…

  10. 1999 NCCS Highlights

    NASA Technical Reports Server (NTRS)

    Bennett, Jerome (Technical Monitor)

    2002-01-01

    The NASA Center for Computational Sciences (NCCS) is a high-performance scientific computing facility operated, maintained and managed by the Earth and Space Data Computing Division (ESDCD) of NASA Goddard Space Flight Center's (GSFC) Earth Sciences Directorate. The mission of the NCCS is to advance leading-edge science by providing the best people, computers, and data storage systems to NASA's Earth and space sciences programs and those of other U.S. Government agencies, universities, and private institutions. Among the many computationally demanding Earth science research efforts supported by the NCCS in Fiscal Year 1999 (FY99) are the NASA Seasonal-to-Interannual Prediction Project, the NASA Search and Rescue Mission, Earth gravitational model development efforts, the National Weather Service's North American Observing System program, Data Assimilation Office studies, a NASA-sponsored project at the Center for Ocean-Land-Atmosphere Studies, a NASA-sponsored microgravity project conducted by researchers at the City University of New York and the University of Pennsylvania, the completion of a satellite-derived global climate data set, simulations of a new geodynamo model, and studies of Earth's torque. This document presents highlights of these research efforts and an overview of the NCCS, its facilities, and its people.

  11. A Report on the Design and Construction of the University of Massachusetts Computer Science Center.

    ERIC Educational Resources Information Center

    Massachusetts State Office of the Inspector General, Boston.

    This report describes a review conducted by the Massachusetts Office of the Inspector General on the construction of the Computer Science and Development Center at the University of Massachusetts, Amherst. The office initiated the review after hearing concerns about the management of the project, including its delayed completion and substantial…

  12. Center for Aeronautics and Space Information Sciences

    NASA Technical Reports Server (NTRS)

    Flynn, Michael J.

    1992-01-01

    This report summarizes the research done during 1991/92 under the Center for Aeronautics and Space Information Science (CASIS) program. The topics covered are computer architecture, networking, and neural nets.

  13. Scientific Computing Strategic Plan for the Idaho National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whiting, Eric Todd

    Scientific computing is a critical foundation of modern science. Without innovations in the field of computational science, the essential missions of the Department of Energy (DOE) would go unrealized. Taking a leadership role in such innovations is Idaho National Laboratory's (INL's) challenge and charge, and is central to INL's ongoing success. Computing is an essential part of INL's future. DOE science and technology missions rely firmly on computing capabilities in various forms. Modeling and simulation, fueled by innovations in computational science and validated through experiment, are a critical foundation of science and engineering. Big data analytics from an increasing number of widely varied sources is opening new windows of insight and discovery. Computing is a critical tool in education, science, engineering, and experiments. Advanced computing capabilities in the form of people, tools, computers, and facilities will position INL competitively to deliver results and solutions on important national science and engineering challenges. A computing strategy must include much more than simply computers. The foundational enabling component of computing at many DOE national laboratories is the combination of a showcase-like data center facility coupled with a very capable supercomputer. In addition, network connectivity, disk storage systems, and visualization hardware are critical and generally tightly coupled to the computer system and co-located in the same facility. The existence of these resources in a single data center facility opens the doors to many opportunities that would not otherwise be possible.

  14. The Development of a Learning Dashboard for Lecturers: A Case Study on a Student-Centered E-Learning Environment

    ERIC Educational Resources Information Center

    Santoso, Harry B.; Batuparan, Alivia Khaira; Isal, R. Yugo K.; Goodridge, Wade H.

    2018-01-01

    Student Centered e-Learning Environment (SCELE) is a Moodle-based learning management system (LMS) that has been modified to enhance learning within a computer science department curriculum offered by the Faculty of Computer Science of a large public university in Indonesia. This Moodle provided a mechanism to record students' activities when…

  15. Roy Fraley | NREL

    Science.gov Websites

    Roy Fraley, Professional II-Engineer, Roy.Fraley@nrel.gov | 303-384-6468. Roy Fraley is the high-performance computing (HPC) data center engineer with the Computational Science Center's HPC…

  16. UC Merced Center for Computational Biology Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Colvin, Michael; Watanabe, Masakatsu

    Final report for the UC Merced Center for Computational Biology. The Center for Computational Biology (CCB) was established to support multidisciplinary scientific research and academic programs in computational biology at the new University of California campus in Merced. In 2003, the growing gap between biology research and education was documented in a report from the National Academy of Sciences, Bio2010: Transforming Undergraduate Education for Future Research Biologists. We believed that new biological sciences undergraduate and graduate programs that emphasized biological concepts and treated biology as an information science would have a dramatic impact in enabling the transformation of biology. UC Merced, the newest UC campus and the first new U.S. research university of the 21st century, was ideally suited to adopt an alternate strategy: to create new Biological Sciences majors and a graduate group that incorporated the strong computational and mathematical vision articulated in the Bio2010 report. CCB aimed to leverage this strong commitment at UC Merced to develop a new educational program based on the principle of biology as a quantitative, model-driven science. We also expected that the center would enable the dissemination of computational biology course materials to other universities and feeder institutions, and foster research projects that exemplify a mathematical and computation-based approach to the life sciences. As this report describes, the CCB has been successful in achieving these goals, and multidisciplinary computational biology is now an integral part of UC Merced undergraduate, graduate, and research programs in the life sciences. The CCB began in fall 2004 with the aid of an award from the U.S. Department of Energy (DOE), under its Genomes to Life program of support for the development of research and educational infrastructure in the modern biological sciences. This report to DOE describes the research and academic programs made possible by the CCB from its inception until August 2010, the end of the final extension. Although DOE support for the center ended in August 2010, the CCB will continue to exist and support its original objectives. The research and academic programs fostered by the CCB have led to additional extramural funding from other agencies, and we anticipate that the CCB will continue to support quantitative and computational biology programs at UC Merced for many years to come. Since its inception in fall 2004, CCB research projects have involved continuous multi-institutional collaboration with Lawrence Livermore National Laboratory (LLNL) and the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign, as well as individual collaborators at other sites. CCB-affiliated faculty cover a broad range of computational and mathematical research, including molecular modeling, cell biology, applied math, evolutionary biology, and bioinformatics. The CCB sponsored the first distinguished speaker series at UC Merced, which had an important role in spreading the word about the computational biology emphasis at this new campus. One of CCB's original goals was to help train a new generation of biologists who bridge the gap between the computational and life sciences. To achieve this goal, by summer 2006 a summer undergraduate internship program had been established under the CCB to train biological science researchers in highly mathematical and computationally intensive approaches. By the end of summer 2010, 44 undergraduate students had gone through this program. Of those participants, 11 students have been admitted to graduate schools and 10 more are interested in pursuing graduate studies in the sciences. The center is also continuing to facilitate the development and dissemination of undergraduate and graduate course materials based on the latest research in computational biology.

  17. News Focus: NSF Director Erich Bloch Discusses Foundation's Problems, Outlook.

    ERIC Educational Resources Information Center

    Chemical and Engineering News, 1987

    1987-01-01

    Relates the comments offered in an interview with Erich Bloch, the National Science Foundation (NSF) Director. Discusses issues related to NSF and its funding, engineering research centers, involvement with industry, concern for science education, computer centers, and its affiliation with the social sciences. (ML)

  18. Ethics, Identity, and Political Vision: Toward a Justice-Centered Approach to Equity in Computer Science Education

    ERIC Educational Resources Information Center

    Vakil, Sepehr

    2018-01-01

    In this essay, Sepehr Vakil argues that a more serious engagement with critical traditions in education research is necessary to achieve a justice-centered approach to equity in computer science (CS) education. With CS rapidly emerging as a distinct feature of K-12 public education in the United States, calls to expand CS education are often…

  19. 78 FR 69138 - Proposal Review Panel for Computing Communication Foundations; Notice of Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-18

    ... NATIONAL SCIENCE FOUNDATION Proposal Review Panel for Computing Communication Foundations; Notice... National Science Foundation announces the following meeting: Name: Site Visit, Proposal Panel Review for Science and Technology Centers--Integrative Partnerships (#1192). Date/Time: December 3, 2013, 6:30 p.m.-8...

  20. 77 FR 70483 - Proposal Review Panel for Computing Communication Foundations; Notice of Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-26

    ... NATIONAL SCIENCE FOUNDATION Proposal Review Panel for Computing Communication Foundations; Notice... National Science Foundation announces the following meeting: Name: Site Visit, Proposal Panel Review for Science and Technology Centers--Integrative Partnerships (#1192). Date/Time: December 3, 2012, 6:30 p.m.-8...

  1. Role of Computers in Sci-Tech Libraries.

    ERIC Educational Resources Information Center

    Bichteler, Julie; And Others

    1986-01-01

    Articles in this theme issue discuss applications of microcomputers in science/technology libraries, a UNIX-based online catalog, online versus print sources, computer-based statistics, and the applicability and implications of the Matheson-Cooper Report on health science centers for science/technology libraries. A bibliography of new reference…

  2. Proceedings: Computer Science and Data Systems Technical Symposium, volume 1

    NASA Technical Reports Server (NTRS)

    Larsen, Ronald L.; Wallgren, Kenneth

    1985-01-01

    Progress reports and technical updates of programs being performed by NASA centers are covered. Presentations in viewgraph form are included for topics in three categories: computer science, data systems and space station applications.

  3. NASA Tech Briefs, March 1995

    NASA Technical Reports Server (NTRS)

    1995-01-01

    This issue contains articles with a special focus on computer-aided design and engineering and a research report on the Ames Research Center. Other subjects in this issue are: Electronic Components and Circuits, Electronic Systems, Physical Sciences, Materials, Computer Programs, Mechanics, Machinery, Manufacturing/Fabrication, Mathematics and Information Sciences, and Life Sciences.

  4. New developments in delivering public access to data from the National Center for Computational Toxicology at the EPA

    EPA Science Inventory

    Researchers at EPA’s National Center for Computational Toxicology integrate advances in biology, chemistry, and computer science to examine the toxicity of chemicals and help prioritize chemicals for further research based on potential human health risks. The goal of this researc...

  5. Community Information Centers and the Computer.

    ERIC Educational Resources Information Center

    Carroll, John M.; Tague, Jean M.

    Two computer data bases have been developed by the Computer Science Department at the University of Western Ontario for "Information London," the local community information center. One system, called LONDON, permits Boolean searches of a file of 5,000 records describing human service agencies in the London area. The second system,…
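
    A toy sketch of the kind of Boolean keyword search the LONDON system is described as supporting; the records and query form below are invented for illustration.

        # Toy Boolean (AND/OR) keyword search over agency records; data is invented.
        records = [
            {"name": "Agency A", "keywords": {"housing", "seniors"}},
            {"name": "Agency B", "keywords": {"housing", "youth"}},
            {"name": "Agency C", "keywords": {"health", "youth"}},
        ]

        def boolean_search(must_have: set, any_of: set) -> list:
            """Return records containing every AND term and at least one OR term."""
            return [r["name"] for r in records
                    if must_have <= r["keywords"] and (not any_of or any_of & r["keywords"])]

        print(boolean_search({"housing"}, {"youth", "seniors"}))  # ['Agency A', 'Agency B']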

  6. Synergies and Distinctions between Computational Disciplines in Biomedical Research: Perspective from the Clinical and Translational Science Award Programs

    PubMed Central

    Bernstam, Elmer V.; Hersh, William R.; Johnson, Stephen B.; Chute, Christopher G.; Nguyen, Hien; Sim, Ida; Nahm, Meredith; Weiner, Mark; Miller, Perry; DiLaura, Robert P.; Overcash, Marc; Lehmann, Harold P.; Eichmann, David; Athey, Brian D.; Scheuermann, Richard H.; Anderson, Nick; Starren, Justin B.; Harris, Paul A.; Smith, Jack W.; Barbour, Ed; Silverstein, Jonathan C.; Krusch, David A.; Nagarajan, Rakesh; Becich, Michael J.

    2010-01-01

    Clinical and translational research increasingly requires computation. Projects may involve multiple computationally-oriented groups including information technology (IT) professionals, computer scientists and biomedical informaticians. However, many biomedical researchers are not aware of the distinctions among these complementary groups, leading to confusion, delays and sub-optimal results. Although written from the perspective of clinical and translational science award (CTSA) programs within academic medical centers, the paper addresses issues that extend beyond clinical and translational research. The authors describe the complementary but distinct roles of operational IT, research IT, computer science and biomedical informatics using a clinical data warehouse as a running example. In general, IT professionals focus on technology. The authors distinguish between two types of IT groups within academic medical centers: central or administrative IT (supporting the administrative computing needs of large organizations) and research IT (supporting the computing needs of researchers). Computer scientists focus on general issues of computation such as designing faster computers or more efficient algorithms, rather than specific applications. In contrast, informaticians are concerned with data, information and knowledge. Biomedical informaticians draw on a variety of tools, including but not limited to computers, to solve information problems in health care and biomedicine. The paper concludes with recommendations regarding administrative structures that can help to maximize the benefit of computation to biomedical research within academic health centers. PMID:19550198

  7. CESDIS

    NASA Technical Reports Server (NTRS)

    1994-01-01

    CESDIS, the Center of Excellence in Space Data and Information Sciences, was developed jointly by NASA, the Universities Space Research Association (USRA), and the University of Maryland in 1988 to focus on the design of advanced computing techniques and data systems to support NASA Earth and space science research programs. CESDIS is operated by USRA under contract to NASA. The Director, Associate Director, Staff Scientists, and administrative staff are located on-site at NASA's Goddard Space Flight Center in Greenbelt, Maryland. The primary CESDIS mission is to increase the connection between computer science and engineering research programs at colleges and universities and NASA groups working with computer applications in Earth and space science. The 1993-94 CESDIS year included a broad range of computer science research applied to NASA problems. This report provides an overview of these research projects and programs, as well as a summary of the various other activities of CESDIS in support of NASA and the university research community. We have had an exciting and challenging year.

  8. A parallel-processing approach to computing for the geographic sciences; applications and systems enhancements

    USGS Publications Warehouse

    Crane, Michael; Steinwand, Dan; Beckmann, Tim; Krpan, Greg; Liu, Shu-Guang; Nichols, Erin; Haga, Jim; Maddox, Brian; Bilderback, Chris; Feller, Mark; Homer, George

    2001-01-01

    The overarching goal of this project is to build a spatially distributed infrastructure for information science research by forming a team of information science researchers and providing them with similar hardware and software tools to perform collaborative research. Four geographically distributed Centers of the U.S. Geological Survey (USGS) are developing their own clusters of low-cost personal computers into parallel computing environments that provide a cost-effective way for the USGS to increase participation in the high-performance computing community. Referred to as Beowulf clusters, these hybrid systems provide the robust computing power required for conducting information science research into parallel computing systems and applications.
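
    A minimal sketch of the message-passing parallelism a Beowulf cluster enables, using mpi4py (assumed available; launched with, e.g., mpiexec -n 4 python demo.py). The per-cell computation is a stand-in for a real geographic-sciences workload.

        # Minimal MPI partition-and-reduce sketch for a Beowulf-style cluster.
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank = comm.Get_rank()      # this process's ID within the cluster job
        size = comm.Get_size()      # total number of cooperating processes

        # Each process handles its own slice of the cells; rank 0 gathers totals.
        cells = range(rank, 1_000, size)        # round-robin partition of the grid
        local_sum = sum(c * c for c in cells)   # placeholder per-cell computation

        total = comm.reduce(local_sum, op=MPI.SUM, root=0)
        if rank == 0:
            print("combined result:", total)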

  9. Communications among data and science centers

    NASA Technical Reports Server (NTRS)

    Green, James L.

    1990-01-01

    The ability to electronically access and query the contents of remote computer archives is of singular importance in the space and Earth sciences; the present evaluation of the development status of such on-line information networks foresees swift expansion of their data capabilities and complexity, in view of the volumes of data that will continue to be generated by NASA missions. The U.S. National Space Science Data Center (NSSDC) manages NASA's largest science computer network, the Space Physics Analysis Network; a comprehensive account is given of the structure of NSSDC international access through BITNET, and of connections to the NSSDC available in the Americas via the international X.25 network.

  10. Proceedings: Computer Science and Data Systems Technical Symposium, volume 2

    NASA Technical Reports Server (NTRS)

    Larsen, Ronald L.; Wallgren, Kenneth

    1985-01-01

    Progress reports and technical updates of programs being performed by NASA centers are covered. Presentations in viewgraph form, along with abstracts, are included for topics in three categories: computer science, data systems, and space station applications.

  11. Bringing Computational Thinking into the High School Science and Math Classroom

    NASA Astrophysics Data System (ADS)

    Trouille, Laura; Beheshti, E.; Horn, M.; Jona, K.; Kalogera, V.; Weintrop, D.; Wilensky, U.; Northwestern University CT-STEM Project; Northwestern University Center for Talent Development

    2013-01-01

    Computational thinking (for example, the thought processes involved in developing algorithmic solutions to problems that can then be automated for computation) has revolutionized the way we do science. The Next Generation Science Standards require that teachers support their students’ development of computational thinking and computational modeling skills. As a result, there is a very high demand among teachers for quality materials. Astronomy provides an abundance of opportunities to support student development of computational thinking skills. Our group has taken advantage of this to create a series of astronomy-based computational thinking lesson plans for use in typical physics, astronomy, and math high school classrooms. This project is funded by the NSF Computing Education for the 21st Century grant and is jointly led by Northwestern University’s Center for Interdisciplinary Exploration and Research in Astrophysics (CIERA), the Computer Science department, the Learning Sciences department, and the Office of STEM Education Partnerships (OSEP). I will also briefly present the online ‘Astro Adventures’ courses for middle and high school students I have developed through NU’s Center for Talent Development. The online courses take advantage of many of the amazing online astronomy enrichment materials available to the public, including a range of hands-on activities and the ability to take images with the Global Telescope Network. The course culminates with an independent computational research project.

  12. GSDC: A Unique Data Center in Korea for HEP research

    NASA Astrophysics Data System (ADS)

    Ahn, Sang-Un

    2017-04-01

    Global Science experimental Data hub Center (GSDC) at the Korea Institute of Science and Technology Information (KISTI) is a unique data center in South Korea, established to promote fundamental research fields by supporting them with expertise in Information and Communication Technology (ICT) and with infrastructure for High Performance Computing (HPC), High Throughput Computing (HTC), and networking. GSDC has supported various research fields in South Korea dealing with large-scale data, e.g., the RENO experiment for neutrino research, the LIGO experiment for gravitational wave detection, a genome sequencing project for bio-medical research, and HEP experiments such as CDF at FNAL, Belle at KEK, and STAR at BNL. In particular, GSDC has run a Tier-1 center for the ALICE experiment at the LHC at CERN since 2013. In this talk, we present an overview of the computing infrastructure that GSDC runs for these research fields and discuss the data center infrastructure management system deployed at GSDC.

  13. Association of Small Computer Users in Education (ASCUE) Summer Conference Proceedings (30th, North Myrtle Beach, South Carolina, June 7-12, 1997).

    ERIC Educational Resources Information Center

    Smith, Peter, Ed.

    Papers from a conference on small college computing issues are: "An On-line Microcomputer Course for Pre-service Teachers" (Mary K. Abkemeier); "The Mathematics and Computer Science Learning Center (MLC)" (Solomon T. Abraham); "Multimedia for the Non-Computer Science Faculty Member" (Stephen T. Anderson, Sr.); "Achieving Continuous Improvement:…

  14. The Benefits of Making Data from the EPA National Center for Computational Toxicology available for reuse (ACS Fall meeting 3 of 12)

    EPA Science Inventory

    Researchers at EPA’s National Center for Computational Toxicology (NCCT) integrate advances in biology, chemistry, exposure and computer science to help prioritize chemicals for further research based on potential human health risks. The goal of this research is to quickly evalua...

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shankar, Arjun

    Computer scientist Arjun Shankar is director of the Compute and Data Environment for Science (CADES), ORNL’s multidisciplinary big data computing center. CADES offers computing, networking and data analytics to facilitate workflows for both ORNL and external research projects.

  16. Report on Computing and Networking in the Space Science Laboratory by the SSL Computer Committee

    NASA Technical Reports Server (NTRS)

    Gallagher, D. L. (Editor)

    1993-01-01

    The Space Science Laboratory (SSL) at Marshall Space Flight Center is a multiprogram facility. Scientific research is conducted in four discipline areas: earth science and applications, solar-terrestrial physics, astrophysics, and microgravity science and applications. Representatives from each of these discipline areas participate in a Laboratory computer requirements committee, which developed this document. Its purpose is to establish and discuss Laboratory objectives for computing and networking in support of science, and to lay the foundation for a collective, multiprogram approach to providing these services. Special recognition is given to the importance of the national and international efforts of our research communities toward the development of interoperable, network-based computer applications.

  17. Experiences with Transitioning Science Data Production from a Symmetric Multiprocessor Platform to a Linux Cluster Environment

    NASA Astrophysics Data System (ADS)

    Walter, R. J.; Protack, S. P.; Harris, C. J.; Caruthers, C.; Kusterer, J. M.

    2008-12-01

    NASA's Atmospheric Science Data Center at the NASA Langley Research Center performs all of the science data processing for the Multi-angle Imaging SpectroRadiometer (MISR) instrument. MISR is one of the five remote sensing instruments flying aboard NASA's Terra spacecraft. From the time of the Terra launch in December 1999 until February 2008, all MISR science data processing was performed on a Silicon Graphics, Inc. (SGI) platform. However, dramatic improvements in commodity computing technology coupled with steadily declining project budgets during that period eventually made transitioning MISR processing to a commodity computing environment both feasible and necessary. The Atmospheric Science Data Center has successfully ported the MISR science data processing environment from the SGI platform to a Linux cluster environment. There were a multitude of technical challenges associated with this transition. Even though the core architecture of the production system did not change, the manner in which it interacted with underlying hardware was fundamentally different. In addition, there are more potential throughput bottlenecks in a cluster environment than there are in a symmetric multiprocessor environment like the SGI platform, and each of these had to be addressed. Once all the technical issues associated with the transition were resolved, the Atmospheric Science Data Center had a MISR science data processing system with significantly higher throughput than the SGI platform at a fraction of the cost. In addition to the commodity hardware, free and open source software such as S4PM, Sun Grid Engine, PostgreSQL, and Ganglia plays a significant role in the new system. Details of the technical challenges and resolutions, software systems, performance improvements, and cost savings associated with the transition will be discussed. The Atmospheric Science Data Center in Langley's Science Directorate leads NASA's program for the processing, archival, and distribution of Earth science data in the areas of radiation budget, clouds, aerosols, and tropospheric chemistry. The Data Center was established in 1991 to support NASA's Earth Observing System and the U.S. Global Change Research Program. It is unique among NASA data centers in the size of its archive, cutting-edge computing technology, and full range of data services. For more information regarding ASDC data holdings, documentation, tools and services, visit http://eosweb.larc.nasa.gov
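
    A small illustration of scripting job submission to Sun Grid Engine, one of the open source components named above. The qsub flags are standard SGE, but the wrapper, script name, and orbit-based job naming are hypothetical, not the ASDC's actual production code.

        # Illustrative wrapper around SGE's qsub for per-orbit processing jobs.
        import subprocess

        def submit_sge_job(orbit: int, script: str = "process_orbit.sh") -> str:
            """Submit one processing job to SGE; returns qsub's confirmation line."""
            cmd = ["qsub",
                   "-N", f"misr_orbit_{orbit}",   # job name shown in qstat
                   "-cwd",                        # run in the current directory
                   script, str(orbit)]            # the script receives the orbit number
            out = subprocess.run(cmd, capture_output=True, text=True, check=True)
            return out.stdout.strip()

        for orbit in (50001, 50002):
            print(submit_sge_job(orbit))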

  18. An Analysis of Cloud Computing with Amazon Web Services for the Atmospheric Science Data Center

    NASA Astrophysics Data System (ADS)

    Gleason, J. L.; Little, M. M.

    2013-12-01

    NASA science and engineering efforts rely heavily on compute and data handling systems. The nature of NASA science data is such that it is not restricted to NASA users; instead it is widely shared across a globally distributed user community including scientists, educators, policy decision makers, and the public. Therefore NASA science computing is a candidate use case for cloud computing, where compute resources are outsourced to an external vendor. Amazon Web Services (AWS) is a commercial cloud computing service developed to use excess computing capacity at Amazon, and it potentially provides an alternative to costly and potentially underutilized dedicated acquisitions whenever NASA scientists or engineers require additional data processing. AWS desires to provide a simplified avenue for NASA scientists and researchers to share large, complex data sets with external partners and the public. AWS has been extensively used by JPL for a wide range of computing needs and was previously tested on a NASA Agency basis during the Nebula testing program. Its ability to support the needs of the Langley Science Directorate must be evaluated by integrating it with real-world operational needs across NASA, along with the maturity that would come with that integration. The strengths and weaknesses of this architecture and its ability to support general science and engineering applications were demonstrated during the previous testing. The Langley Office of the Chief Information Officer, in partnership with the Atmospheric Science Data Center (ASDC), has established a pilot business interface to utilize AWS cloud computing resources on an organization- and project-level pay-per-use model. This poster discusses an effort to evaluate the feasibility of the pilot business interface from a project-level perspective by specifically using a processing scenario involving the Clouds and Earth's Radiant Energy System (CERES) project.
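
    A hedged sketch of the project-level pay-per-use idea using boto3, the standard AWS SDK for Python: instances are tagged with a project name such as CERES so that usage can be billed back to the project. The AMI ID, region, and instance counts are placeholders, not the pilot's actual configuration.

        # Sketch of launching project-tagged EC2 capacity for a processing campaign.
        # Requires boto3 and AWS credentials; all values below are placeholders.
        import boto3

        ec2 = boto3.client("ec2", region_name="us-east-1")

        response = ec2.run_instances(
            ImageId="ami-0123456789abcdef0",   # placeholder processing image
            InstanceType="c5.xlarge",
            MinCount=1,
            MaxCount=4,                        # scale out for a reprocessing run
            TagSpecifications=[{
                "ResourceType": "instance",
                "Tags": [{"Key": "Project", "Value": "CERES"}],  # cost-allocation tag
            }],
        )
        print([i["InstanceId"] for i in response["Instances"]])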

  19. The Student/Library Computer Science Collaborative

    ERIC Educational Resources Information Center

    Hahn, Jim

    2015-01-01

    With funding from an Institute of Museum and Library Services demonstration grant, librarians of the Undergraduate Library at the University of Illinois at Urbana-Champaign partnered with students in computer science courses to design and build student-centered mobile apps. The grant work called for demonstration of student collaboration…

  20. Introduction to USRA

    NASA Technical Reports Server (NTRS)

    Davis, M. H. (Editor); Singy, A. (Editor)

    1994-01-01

    The Universities Space Research Association (USRA) was incorporated 25 years ago in the District of Columbia as a private nonprofit corporation under the auspices of the National Academy of Sciences. Institutional membership in the association has grown from 49 colleges and universities, when it was founded, to 76 in 1993. USRA provides a mechanism through which universities can cooperate effectively with one another, with the government, and with other organizations to further space science and technology and to promote education in these areas. Its mission is carried out through the institutes, centers, divisions, and programs that are described in detail in this booklet. These include the Lunar and Planetary Institute, the Institute for Computer Applications in Science and Engineering (ICASE), the Research Institute for Advanced Computer Science (RIACS), and the Center of Excellence in Space Data and Information Sciences (CESDIS).

  1. Space and Earth Sciences, Computer Systems, and Scientific Data Analysis Support, Volume 1

    NASA Technical Reports Server (NTRS)

    Estes, Ronald H. (Editor)

    1993-01-01

    This Final Progress Report covers the specific technical activities of Hughes STX Corporation for the last contract triannual period of 1 June through 30 Sep. 1993, in support of assigned task activities at Goddard Space Flight Center (GSFC). It also provides a brief summary of work throughout the contract period of performance on each active task. Technical activity is presented in Volume 1, while financial and level-of-effort data is presented in Volume 2. Technical support was provided to all Division and Laboratories of Goddard's Space Sciences and Earth Sciences Directorates. Types of support include: scientific programming, systems programming, computer management, mission planning, scientific investigation, data analysis, data processing, data base creation and maintenance, instrumentation development, and management services. Mission and instruments supported include: ROSAT, Astro-D, BBXRT, XTE, AXAF, GRO, COBE, WIND, UIT, SMM, STIS, HEIDI, DE, URAP, CRRES, Voyagers, ISEE, San Marco, LAGEOS, TOPEX/Poseidon, Pioneer-Venus, Galileo, Cassini, Nimbus-7/TOMS, Meteor-3/TOMS, FIFE, BOREAS, TRMM, AVHRR, and Landsat. Accomplishments include: development of computing programs for mission science and data analysis, supercomputer applications support, computer network support, computational upgrades for data archival and analysis centers, end-to-end management for mission data flow, scientific modeling and results in the fields of space and Earth physics, planning and design of GSFC VO DAAC and VO IMS, fabrication, assembly, and testing of mission instrumentation, and design of mission operations center.

  2. Template Interfaces for Agile Parallel Data-Intensive Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramakrishnan, Lavanya; Gunter, Daniel; Pastorello, Gilerto Z.

    Tigres provides a programming library to compose and execute large-scale data-intensive scientific workflows from desktops to supercomputers. DOE User Facilities and large science collaborations are increasingly generating data sets large enough that it is no longer practical to download them to a desktop to operate on them. They are instead stored at centralized compute and storage resources such as high performance computing (HPC) centers. Analysis of this data requires an ability to run on these facilities, but with current technologies, scaling an analysis to an HPC center and to a large data set is difficult even for experts. Tigres is addressing the challenge of enabling collaborative analysis of DOE science data through a new concept of reusable "templates" that enable scientists to easily compose, run, and manage collaborative computational tasks. These templates define common computation patterns used in analyzing a data set.
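
    A toy illustration of the reusable-template concept described above, using only the Python standard library: a generic "parallel" template applies one analysis task across many inputs. This is a sketch of the idea, not the actual Tigres interface.

        # Generic parallel-map "template": one reusable pattern, many analyses.
        from concurrent.futures import ProcessPoolExecutor
        from typing import Callable, Iterable, List

        def parallel_template(task: Callable, inputs: Iterable) -> List:
            """A reusable 'parallel' pattern: run one task over many inputs at once."""
            with ProcessPoolExecutor() as pool:
                return list(pool.map(task, inputs))

        def analyze(chunk: int) -> int:
            return chunk * 2          # stand-in for a real analysis step

        if __name__ == "__main__":    # guard needed for process pools on some platforms
            print(parallel_template(analyze, range(8)))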

  3. Development of Interactive Computer Programs To Help Students Transfer Basic Skills to College Level Science and Behavioral Science Courses.

    ERIC Educational Resources Information Center

    Mikulecky, Larry

    Interactive computer programs, developed at Indiana University's Learning Skills Center, were designed to model effective strategies for reading biology and psychology textbooks. For each subject area, computer programs and textbook passages were used to instruct and model for students how to identify key concepts, compare and contrast concepts,…

  4. Bayesian Research at the NASA Ames Research Center,Computational Sciences Division

    NASA Technical Reports Server (NTRS)

    Morris, Robin D.

    2003-01-01

    NASA Ames Research Center is one of NASA's oldest centers, having started out as part of the National Advisory Committee for Aeronautics (NACA). The site, about 40 miles south of San Francisco, still houses many wind tunnels and other aviation-related departments. In recent years, with the growing realization that space exploration is heavily dependent on computing and data analysis, its focus has turned more towards Information Technology. The Computational Sciences Division has expanded rapidly as a result. In this article, I will give a brief overview of some of the past and present projects with a Bayesian content. Much more than is described here goes on within the Division. The web pages at http://ic.arc.nasa.gov give more information on these and the other Division projects.

  5. Center of Excellence in Space Data and Information Science, Year 9

    NASA Technical Reports Server (NTRS)

    Yesha, Yelena

    1997-01-01

    This report summarizes the range of computer science related activities undertaken by CESDIS (Center of Excellence in Space Data and Information Sciences) for NASA in the twelve months from July 1, 1996 through June 30, 1997. These activities address issues related to accessing, processing, and analyzing data from space observing systems through collaborative efforts with university, industry, and NASA space and Earth scientists.

  6. Comprehensive report of aeropropulsion, space propulsion, space power, and space science applications of the Lewis Research Center

    NASA Technical Reports Server (NTRS)

    1988-01-01

    The research activities of the Lewis Research Center for 1988 are summarized. The projects included are within basic and applied technical disciplines essential to aeropropulsion, space propulsion, space power, and space science/applications. These disciplines are materials science and technology, structural mechanics, life prediction, internal computational fluid mechanics, heat transfer, instruments and controls, and space electronics.

  7. Use of PL/1 in a Bibliographic Information Retrieval System.

    ERIC Educational Resources Information Center

    Schipma, Peter B.; And Others

    The Information Sciences section of ITT Research Institute (IITRI) has developed a Computer Search Center and is currently conducting a research project to explore computer searching of a variety of machine-readable data bases. The Center provides Selective Dissemination of Information services to academic, industrial and research organizations…

  8. 4th Annual Conference for African-American Researchers in the Mathematical Sciences (CAARMS4). Preliminary Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tapia, Richard

    1998-06-01

    In June, the Center for Research on Parallel Computation (CRPC), an NSF-funded Science and Technology Center, hosted the 4th Annual Conference for African-American Researchers in the Mathematical Sciences (CAARMS4) at Rice University. The main goal of this conference was to highlight current work by African-American researchers and graduate students in mathematics. This conference strengthened the mathematical sciences by encouraging the increased participation of African-Americans and other underrepresented groups in the field, facilitating working relationships among them, and helping to cultivate their careers. In addition to the talks, there was a graduate student poster session and tutorials on topics in mathematics and computer science. These talks, presentations, and discussions brought a broader perspective to the critical issues involving minority participation in mathematics.

  9. Data Serving Climate Simulation Science at the NASA Center for Climate Simulation

    NASA Technical Reports Server (NTRS)

    Salmon, Ellen M.

    2011-01-01

    The NASA Center for Climate Simulation (NCCS) provides high performance computational resources, a multi-petabyte archive, and data services in support of climate simulation research and other NASA-sponsored science. This talk describes the NCCS's data-centric architecture and processing, which are evolving in anticipation of researchers' growing requirements for higher resolution simulations and increased data sharing among NCCS users and the external science community.

  10. Science on a Sphere exhibit

    NASA Image and Video Library

    2009-03-31

    Students from Xavier University Preparatory School in New Orleans view the newest exhibit at StenniSphere, the visitor center at NASA's John C. Stennis Space Center - Science on a Sphere, a 68-inch global presentation of planetary data. StenniSphere is only the third NASA visitor center to offer the computer system, which uses four projectors to display data on a globe and present a dynamic, revolving, animated view of Earth and other planets.

  11. Science on a Sphere exhibit

    NASA Technical Reports Server (NTRS)

    2009-01-01

    Students from Xavier University Preparatory School in New Orleans view the newest exhibit at StenniSphere, the visitor center at NASA's John C. Stennis Space Center - Science on a Sphere, a 68-inch global presentation of planetary data. StenniSphere is only the third NASA visitor center to offer the computer system, which uses four projectors to display data on a globe and present a dynamic, revolving, animated view of Earth and other planets.

  12. Jackson State University's Center for Spatial Data Research and Applications: New facilities and new paradigms

    NASA Technical Reports Server (NTRS)

    Davis, Bruce E.; Elliot, Gregory

    1989-01-01

    Jackson State University recently established the Center for Spatial Data Research and Applications, a Geographical Information System (GIS) and remote sensing laboratory. Taking advantage of new technologies and new directions in the spatial (geographic) sciences, JSU is building a Center of Excellence in Spatial Data Management. New opportunities for research, applications, and employment are emerging. GIS requires fundamental shifts and new demands in traditional computer science and geographic training. The Center is not merely another computer lab but one setting the pace in a new applied frontier. GIS and its associated technologies are discussed. The Center's facilities are described. An ARC/INFO GIS runs on a VAX mainframe, with numerous workstations. Image processing packages include ELAS, LIPS, VICAR, and ERDAS. A host of hardware and software peripherals are used in support. Numerous projects are underway, such as the construction of a Gulf of Mexico environmental data base, development of AI in image processing, a land use dynamics study of metropolitan Jackson, and others. A new academic interdisciplinary program in Spatial Data Management is under development, combining courses in Geography and Computer Science. The broad range of JSU's GIS and remote sensing activities is addressed. The impacts on changing paradigms in the university and in the professional world conclude the discussion.

  13. Computer Science Research at Langley

    NASA Technical Reports Server (NTRS)

    Voigt, S. J. (Editor)

    1982-01-01

    A workshop was held at Langley Research Center, November 2-5, 1981, to highlight ongoing computer science research at Langley and to identify additional areas of research based upon the computer user requirements. A panel discussion was held in each of nine application areas, and these are summarized in the proceedings. Slides presented by the invited speakers are also included. A survey of scientific, business, data reduction, and microprocessor computer users helped identify areas of focus for the workshop. Several areas of computer science which are of most concern to the Langley computer users were identified during the workshop discussions. These include graphics, distributed processing, programmer support systems and tools, database management, and numerical methods.

  14. Enabling campus grids with open science grid technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weitzel, Derek; Bockelman, Brian; Swanson, David

    2011-01-01

    The Open Science Grid is a recognized key component of the US national cyber-infrastructure enabling scientific discovery through advanced high throughput computing. The principles and techniques that underlie the Open Science Grid can also be applied to Campus Grids since many of the requirements are the same, even if the implementation technologies differ. We find five requirements for a campus grid: trust relationships, job submission, resource independence, accounting, and data management. The Holland Computing Center's campus grid at the University of Nebraska-Lincoln was designed to fulfill the requirements of a campus grid. A bridging daemon was designed to bring non-Condor clusters into a grid managed by Condor. Condor features which make it possible to bridge Condor sites into a multi-campus grid have been exploited at the Holland Computing Center as well.
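
    As a rough illustration of the bridging idea, the sketch below (plain Python, with invented queue and submit stand-ins, not real HTCondor or PBS interfaces) shows the general shape of a daemon that drains idle jobs from one scheduler's queue and forwards them to another cluster.

        # Hypothetical sketch of a campus-grid "bridging daemon": watch a
        # Condor-style queue for idle jobs and forward them to a non-Condor
        # cluster. All interfaces here are illustrative placeholders.
        import time
        from collections import deque

        condor_idle = deque(["job-001", "job-002", "job-003"])  # managed grid queue
        local_cluster = []                                      # non-Condor cluster

        def bridge_once():
            while condor_idle:
                job = condor_idle.popleft()
                local_cluster.append(job)   # "submit" to the other batch system
                print(f"bridged {job} to local cluster")

        if __name__ == "__main__":
            for _ in range(3):              # a real daemon would poll indefinitely
                bridge_once()
                time.sleep(1)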

  15. 78 FR 63946 - Fisheries of the Caribbean, Gulf of Mexico, and South Atlantic; Revisions to Headboat Reporting...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-25

    ... electronically (via computer or internet) on a weekly basis or at intervals shorter than a week if notified by the NMFS' Southeast Fisheries Science Center (SEFSC) Science and Research Director (SRD), and would... regulations to explicitly require that headboats submit their fishing information electronically (via computer...

  16. Educational Impact of Digital Visualization Tools on Digital Character Production Computer Science Courses

    ERIC Educational Resources Information Center

    van Langeveld, Mark Christensen

    2009-01-01

    Digital character production courses have traditionally been taught in art departments. The digital character production course at the University of Utah is centered between art and engineering, drawing uniformly from both disciplines. Its design has evolved to include a synergy of computer science, functional art and human anatomy. It gives students an…

  17. Facilitating Collegial Exchange among Science Teachers: An Experiment in Computer-Based Conferencing. Technical Report 86-14.

    ERIC Educational Resources Information Center

    Katz, Mary Maxwell; And Others

    Teacher isolation is a significant problem in the science teaching profession. Traditional inservice solutions are often plagued by logistical difficulties or occur too infrequently to build ongoing teacher networks. Educational Technology Center (ETC) researchers reasoned that computer-based conferencing might promote collegial exchange among…

  18. Comparing levels of school performance to science teachers' reports on knowledge/skills, instructional use and student use of computers

    NASA Astrophysics Data System (ADS)

    Kerr, Rebecca

    The purpose of this descriptive quantitative and basic qualitative study was to examine fifth and eighth grade science teachers' responses, perceptions of the role of technology in the classroom, and how they felt that computer applications, tools, and the Internet influence student understanding. The purposeful sample included survey and interview responses from fifth grade and eighth grade general and physical science teachers. Even though they may not be generalizable to other teachers or classrooms due to a low response rate, findings from this study indicated that teachers with fewer years of teaching science had a higher level of computer use but less computer access, especially for students, in the classroom. Furthermore, teachers' choice of professional development moderated the relationship between the level of school performance and teachers' knowledge/skills, with the most positive relationship being with workshops that occurred outside of the school. Eighteen interviews revealed that teachers perceived the role of technology in classroom instruction mainly as teacher-centered and supplemental, rather than student-centered activities.

  19. CESDIS

    NASA Technical Reports Server (NTRS)

    1994-01-01

    CESDIS, the Center of Excellence in Space Data and Information Sciences, was developed jointly by NASA, the Universities Space Research Association (USRA), and the University of Maryland in 1988 to focus on the design of advanced computing techniques and data systems to support NASA Earth and space science research programs. CESDIS is operated by USRA under contract to NASA. The Director, Associate Director, Staff Scientists, and administrative staff are located on-site at NASA's Goddard Space Flight Center in Greenbelt, Maryland. The primary CESDIS mission is to increase the connection between computer science and engineering research programs at colleges and universities and NASA groups working with computer applications in Earth and space science. Research areas of primary interest at CESDIS include: 1) High performance computing, especially software design and performance evaluation for massively parallel machines; 2) Parallel input/output and data storage systems for high performance parallel computers; 3) Data base and intelligent data management systems for parallel computers; 4) Image processing; 5) Digital libraries; and 6) Data compression. CESDIS funds multiyear projects at U.S. universities and colleges. Proposals are accepted in response to calls for proposals and are selected on the basis of peer reviews. Funds are provided to support faculty and graduate students working at their home institutions. Project personnel visit Goddard during academic recess periods to attend workshops, present seminars, and collaborate with NASA scientists on research projects. Additionally, CESDIS takes on shorter-duration computer science research tasks requested by NASA Goddard scientists.

  20. The Center for Nanophase Materials Sciences

    NASA Astrophysics Data System (ADS)

    Lowndes, Douglas

    2005-03-01

    The Center for Nanophase Materials Sciences (CNMS) located at Oak Ridge National Laboratory (ORNL) will be the first DOE Nanoscale Science Research Center to begin operation, with construction to be completed in April 2005 and initial operations in October 2005. The CNMS' scientific program has been developed through workshops with the national community, with the goal of creating a highly collaborative research environment to accelerate discovery and drive technological advances. Research at the CNMS is organized under seven Scientific Themes selected to address challenges to understanding and to exploit particular ORNL strengths (see http://cnms.ornl.gov). These include extensive synthesis and characterization capabilities for soft, hard, nanostructured, magnetic and catalytic materials and their composites; neutron scattering at the Spallation Neutron Source and High Flux Isotope Reactor; computational nanoscience in the CNMS' Nanomaterials Theory Institute, utilizing facilities and expertise of the Center for Computational Sciences and the new Leadership Scientific Computing Facility at ORNL; a new CNMS Nanofabrication Research Laboratory; and a suite of unique and state-of-the-art instruments to be made reliably available to the national community for imaging, manipulation, and properties measurements on nanoscale materials in controlled environments. The new research facilities will be described together with the planned operation of the user research program, the latter illustrated by the current "jump start" user program that utilizes existing ORNL/CNMS facilities.

  1. New Horizons Regional Education Center 1999 FIRST Robotics Competition

    NASA Technical Reports Server (NTRS)

    Purman, Richard I.

    1999-01-01

    The New Horizons Regional Education Center (NHREC) in Hampton, VA sought and received NASA funding to support its participation in the 1999 FIRST Robotics competition. FIRST, Inc. (For Inspiration and Recognition of Science and Technology) is an organization which encourages the application of creative science, math, and computer science principles to solve real-world engineering problems. The FIRST competition is an international engineering contest featuring high school, government, and business partnerships.

  2. New Horizons Regional Education Center 2001 FIRST Robotics Competition

    NASA Technical Reports Server (NTRS)

    2001-01-01

    The New Horizons Regional Education Center (NHREC) in Hampton, VA sought and received NASA funding to support its participation in the 2001 FIRST Robotics competition. FIRST, Inc. (For Inspiration and Recognition of Science and Technology) is an organization which encourages the application of creative science, math, and computer science principles to solve real-world engineering problems. The FIRST competition is an international engineering contest featuring high school, government, and business partnerships.

  3. Berkeley Lab - Materials Sciences Division

    Science.gov Websites

    MSD Facilities and Centers include the Center for Computational Study of Excited-State Phenomena in Energy Materials and the Center for X-ray Optics (Patrick Naulleau, Director, 510-486-4529).

  4. Exposure Science and the US EPA National Center for Computational Toxicology

    EPA Science Inventory

    The emerging field of computational toxicology applies mathematical and computer models and molecular biological and chemical approaches to explore both qualitative and quantitative relationships between sources of environmental pollutant exposure and adverse health outcomes. The...

  5. A Queue Simulation Tool for a High Performance Scientific Computing Center

    NASA Technical Reports Server (NTRS)

    Spear, Carrie; McGalliard, James

    2007-01-01

    The NASA Center for Computational Sciences (NCCS) at the Goddard Space Flight Center provides high-performance, highly parallel processors, mass storage, and supporting infrastructure to a community of computational Earth and space scientists. Long-running (days) and highly parallel (hundreds of CPUs) jobs are common in the workload. NCCS management structures batch queues and allocates resources to optimize system use and prioritize workloads. NCCS technical staff use a locally developed discrete event simulation tool to model the impacts of evolving workloads, potential system upgrades, alternative queue structures and resource allocation policies.
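
    The NCCS tool itself is not public, but the core technique, discrete event simulation of a batch queue, fits in a few lines. The following is a minimal sketch with an invented strict-FIFO policy and toy workload numbers; it estimates the mean queue wait for jobs competing for a fixed CPU budget.

        # Minimal discrete-event batch-queue simulation, in the spirit of the
        # tool described above (policy and numbers invented for illustration).
        import heapq

        def simulate(jobs, total_cpus):
            """jobs: (submit_time, cpus, runtime) tuples. Returns mean queue wait."""
            jobs = sorted(jobs)          # strict FIFO by submit time (no backfill)
            completions = []             # min-heap of (finish_time, cpus_released)
            free, now, waits = total_cpus, 0.0, []
            for submit, cpus, runtime in jobs:
                now = max(now, submit)
                while free < cpus:       # advance time until enough CPUs free up
                    finish, released = heapq.heappop(completions)
                    now = max(now, finish)
                    free += released
                waits.append(now - submit)
                free -= cpus
                heapq.heappush(completions, (now + runtime, cpus))
            return sum(waits) / len(waits)

        if __name__ == "__main__":
            workload = [(0, 64, 5.0), (1, 128, 2.0), (2, 256, 8.0)]  # toy jobs
            print(f"mean wait: {simulate(workload, 256):.2f} hours")

    Swapping the FIFO loop for an alternative ordering is how a tool like this would compare queue structures and allocation policies before committing to them on the production system.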

  6. 15 CFR Supplement No. 4 to Part 748 - Authorities Administering Import Certificate/Delivery Verification (IC/DV) and End-User Statement...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 11011 Indian IC For the “organized” sector, except for computers and related equipment: Directorate... 110011 Indian IC For computers and related electronic items: Department of Electronics, Lok Nayak Bhawan... and Exports 5, Civic Center Islamabad IC Joint Science Advisor, Ministry of Science and Technology...

  7. 15 CFR Supplement No. 4 to Part 748 - Authorities Administering Import Certificate/Delivery Verification (IC/DV) and End-User Statement...

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 11011 Indian IC For the “organized” sector, except for computers and related equipment: Directorate... 110011 Indian IC For computers and related electronic items: Department of Electronics, Lok Nayak Bhawan... and Exports 5, Civic Center Islamabad IC Joint Science Advisor, Ministry of Science and Technology...

  8. 15 CFR Supplement No. 4 to Part 748 - Authorities Administering Import Certificate/Delivery Verification (IC/DV) and End-User Statement...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 11011 Indian IC For the “organized” sector, except for computers and related equipment: Directorate... 110011 Indian IC For computers and related electronic items: Department of Electronics, Lok Nayak Bhawan... and Exports 5, Civic Center Islamabad IC Joint Science Advisor, Ministry of Science and Technology...

  9. 15 CFR Supplement No. 4 to Part 748 - Authorities Administering Import Certificate/Delivery Verification (IC/DV) and End-User Statement...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 11011 Indian IC For the “organized” sector, except for computers and related equipment: Directorate... 110011 Indian IC For computers and related electronic items: Department of Electronics, Lok Nayak Bhawan... Controller of Imports and Exports 5, Civic Center Islamabad IC Joint Science Advisor, Ministry of Science and...

  10. 15 CFR Supplement No. 4 to Part 748 - Authorities Administering Import Certificate/Delivery Verification (IC/DV) and End-User Statement...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 11011 Indian IC For the “organized” sector, except for computers and related equipment: Directorate... 110011 Indian IC For computers and related electronic items: Department of Electronics, Lok Nayak Bhawan... Controller of Imports and Exports 5, Civic Center Islamabad IC Joint Science Advisor, Ministry of Science and...

  11. Women in Community College: Factors Related to Intentions to Pursue Computer Science

    ERIC Educational Resources Information Center

    Denner, Jill; Werner, Linda; O'Connor, Lisa

    2015-01-01

    Community colleges (CC) are obvious places to recruit more women into computer science. Enrollment at CCs has grown in response to a struggling economy, and students are more likely to be from underrepresented groups than students enrolled in 4-year universities (National Center for Education Statistics, 2008). However, we know little about why so…

  12. Public Dialogue on Science in Sweden.

    ERIC Educational Resources Information Center

    Dyring, Annagreta

    1988-01-01

    Explains how Sweden has proceeded to popularize science. Addresses topics dealing with policy, the energy debate, booklets with large circulation, computers and society, contacts between schools and research, building up small science centers, mass media, literary quality, children's responsibility, and some of the challenges. (RT)

  13. Pioneering University/Industry Venture Explores VLSI Frontiers.

    ERIC Educational Resources Information Center

    Davis, Dwight B.

    1983-01-01

    Discusses industry-sponsored programs in semiconductor research, focusing on Stanford University's Center for Integrated Systems (CIS). CIS, while pursuing research in semiconductor very-large-scale integration, is merging the fields of computer science, information science, and physical science. Issues related to these university/industry…

  14. Investigating Impact Metrics for Performance for the US EPA National Center for Computational Toxicology (ACS Fall meeting)

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Computational Toxicology Program integrates advances in biology, chemistry, and computer science to help prioritize chemicals for further research based on potential human health risks. This work involves computational and data drive...

  15. Mass Storage System Upgrades at the NASA Center for Computational Sciences

    NASA Technical Reports Server (NTRS)

    Tarshish, Adina; Salmon, Ellen; Macie, Medora; Saletta, Marty

    2000-01-01

    The NASA Center for Computational Sciences (NCCS) provides supercomputing and mass storage services to over 1200 Earth and space scientists. During the past two years, the mass storage system at the NCCS went through a great deal of change, both major and minor. Tape drives, silo control software, and the mass storage software itself were upgraded, and the mass storage platform was upgraded twice. Some of these upgrades were aimed at achieving year-2000 compliance, while others were simply upgrades to newer and better technologies. In this paper we will describe these upgrades.

  16. Optic Glomeruli: Biological Circuits that Compute Target Identity

    DTIC Science & Technology

    2013-11-01

    Department of Neuroscience and Center for Insect Science, University of Arizona, Tucson, AZ 85721. Contract No. FA8651-10-1-0001. Final report, November 2013.

  17. Does Cloud Computing in the Atmospheric Sciences Make Sense? A case study of hybrid cloud computing at NASA Langley Research Center

    NASA Astrophysics Data System (ADS)

    Nguyen, L.; Chee, T.; Minnis, P.; Spangenberg, D.; Ayers, J. K.; Palikonda, R.; Vakhnin, A.; Dubois, R.; Murphy, P. R.

    2014-12-01

    The processing, storage and dissemination of satellite cloud and radiation products produced at NASA Langley Research Center are key activities for the Climate Science Branch. A constellation of systems operates in sync to accomplish these goals. Because of the complexity involved with operating such intricate systems, there are both high failure rates and high costs for hardware and system maintenance. Cloud computing has the potential to ameliorate cost and complexity issues. Over time, the cloud computing model has evolved and hybrid systems comprising off-site as well as on-site resources are now common. Towards our mission of providing the highest quality research products to the widest audience, we have explored the use of the Amazon Web Services (AWS) Cloud and Storage and present a case study of our results and efforts. This project builds upon NASA Langley Cloud and Radiation Group's experience with operating large and complex computing infrastructures in a reliable and cost effective manner to explore novel ways to leverage cloud computing resources in the atmospheric science environment. Our case study presents the project requirements and then examines the fit of AWS with the LaRC computing model. We also discuss the evaluation metrics, feasibility, and outcomes and close the case study with the lessons we learned that would apply to others interested in exploring the implementation of the AWS system in their own atmospheric science computing environments.
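
    One concrete step in such a hybrid workflow is publishing a finished product to cloud object storage. The sketch below uses the real boto3 library call for S3 uploads, but the bucket, key, and file names are hypothetical, and AWS credentials are assumed to be configured in the environment.

        # Sketch: push a finished data product to S3-style object storage,
        # one small piece of a hybrid cloud workflow (names are hypothetical).
        import boto3

        def publish_product(local_path: str, bucket: str, key: str) -> str:
            s3 = boto3.client("s3")
            s3.upload_file(local_path, bucket, key)  # transfer to off-site storage
            return f"s3://{bucket}/{key}"

        if __name__ == "__main__":
            url = publish_product("cloud_fraction_20141201.nc",  # hypothetical file
                                  "larc-cloud-products",         # hypothetical bucket
                                  "2014/12/cloud_fraction.nc")
            print("published to", url)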

  18. Final Technical Progress Report; Closeout Certifications; CSSV Newsletter Volume I; CSSV Newsletter Volume II; CSSV Activity Journal; CSSV Final Financial Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Houston, Johnny L; Geter, Kerry

    This report covers the Project's third and final year of implementation, 2007-2008, as carried out by Elizabeth City State University (ECSU) in cooperation with the National Association of Mathematicians (NAM), Inc., in an effort to promote research and research training programs in computational science-scientific visualization (CSSV). A major goal of the Project was to attract energetic and productive faculty, graduate students, and upper-division undergraduate students of diverse ethnicities to a program that investigates science and computational science issues of long-term interest to the Department of Energy (DoE) and the nation. The breadth and depth of computational science-scientific visualization and the magnitude of resources available are enormous, permitting a wide variety of research activities. ECSU's Computational Science-Science Visualization Center will serve as a conduit for directing users to these enormous resources.

  19. Delivering an Informational Hub for Data at the National Center for Computational Toxicology (ACS Spring Meeting) 7 of 7

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Computational Toxicology Program integrates advances in biology, chemistry, and computer science to help prioritize chemicals for further research based on potential human health risks. This work involves computational and data drive...

  20. Computers, Networks, and Desegregation at San Jose High Academy.

    ERIC Educational Resources Information Center

    Solomon, Gwen

    1987-01-01

    Describes magnet high school which was created in California to meet desegregation requirements and emphasizes computer technology. Highlights include local computer networks that connect science and music labs, the library/media center, business computer lab, writing lab, language arts skills lab, and social studies classrooms; software; teacher…

  1. ASCR Cybersecurity for Scientific Computing Integrity - Research Pathways and Ideas Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peisert, Sean; Potok, Thomas E.; Jones, Todd

    At the request of the U.S. Department of Energy's (DOE) Office of Science (SC) Advanced Scientific Computing Research (ASCR) program office, a workshop was held June 2-3, 2015, in Gaithersburg, MD, to identify potential long-term (10 to 20+ year) cybersecurity fundamental basic research and development challenges, strategies, and roadmaps facing future high performance computing (HPC), networks, data centers, and extreme-scale scientific user facilities. This workshop was a follow-on to the workshop held January 7-9, 2015, in Rockville, MD, that examined higher level ideas about scientific computing integrity specific to the mission of the DOE Office of Science. Issues included research computation and simulation that takes place on ASCR computing facilities and networks, as well as network-connected scientific instruments, such as those run by various DOE Office of Science programs. Workshop participants included researchers and operational staff from DOE national laboratories, as well as academic researchers and industry experts. Participants were selected based on the submission of abstracts relating to the topics discussed in the previous workshop report [1] and also from other ASCR reports, including "Abstract Machine Models and Proxy Architectures for Exascale Computing" [27], the DOE "Preliminary Conceptual Design for an Exascale Computing Initiative" [28], and the January 2015 machine learning workshop [29]. The workshop was also attended by several observers from DOE and other government agencies. The workshop was divided into three topic areas: (1) Trustworthy Supercomputing, (2) Extreme-Scale Data, Knowledge, and Analytics for Understanding and Improving Cybersecurity, and (3) Trust within High-end Networking and Data Centers. Participants were divided into three corresponding teams based on the category of their abstracts. The workshop began with a series of talks from the program manager and workshop chair, followed by the leaders for each of the three topics and a representative of each of the four major DOE Office of Science Advanced Scientific Computing Research Facilities: the Argonne Leadership Computing Facility (ALCF), the Energy Sciences Network (ESnet), the National Energy Research Scientific Computing Center (NERSC), and the Oak Ridge Leadership Computing Facility (OLCF). The rest of the workshop consisted of topical breakout discussions and focused writing periods that produced much of this report.

  2. The growth of the UniTree mass storage system at the NASA Center for Computational Sciences

    NASA Technical Reports Server (NTRS)

    Tarshish, Adina; Salmon, Ellen

    1993-01-01

    In October 1992, the NASA Center for Computational Sciences made its Convex-based UniTree system generally available to users. The ensuing months saw the growth of near-online data from nil to nearly three terabytes, a doubling of the number of CPUs on the facility's Cray Y-MP (the primary data source for UniTree), and the necessity for an aggressive regimen for repacking sparse tapes and hierarchical 'vaulting' of old files to freestanding tape. Connectivity was enhanced as well with the addition of UltraNet HiPPI. This paper describes the increasing demands placed on the storage system's performance and throughput that resulted from the significant augmentation of compute-server processor power and network speed.

  3. A Comprehensive Review of Computer Science and Data Processing Education in Community Colleges and Area Vocational-Technical Centers.

    ERIC Educational Resources Information Center

    Florida State Community Coll. Coordinating Board, Tallahassee.

    In 1987-88, the Florida State Board of Community Colleges and the Division of Vocational, Adult, and Community Education jointly conducted a review of instructional programs in computer science and data processing in order to determine needs for state policy changes and funding priorities. The process involved a review of printed resources on…

  4. RIACS

    NASA Technical Reports Server (NTRS)

    Moore, Robert C.

    1998-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities that serves as a bridge between NASA and the academic community. Under a five-year cooperative agreement with NASA, research at RIACS is focused on areas that are strategically enabling to the Ames Research Center's role as NASA's Center of Excellence for Information Technology. The primary mission of RIACS, as chartered, is to carry out research and development in computer science. This work is devoted in the main to tasks that are strategically enabling with respect to NASA's bold mission in space exploration and aeronautics. There are three foci for this work: (1) Automated Reasoning, (2) Human-Centered Computing, and (3) High Performance Computing and Networking. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. Through its visiting scientist program, RIACS facilitates the participation of university-based researchers, including both faculty and students, in the research activities of NASA and RIACS. RIACS researchers work in close collaboration with NASA computer scientists on projects such as the Remote Agent Experiment on the Deep Space One mission and Super-Resolution Surface Modeling.

  5. ENVIRONMENTAL STATISTICS INITIATIVE

    EPA Science Inventory

    EPA's Center of Excellence (COE) for Environmental Computational Science is intended to integrate cutting-edge science and emerging information technology (IT) solutions for input to the decision-making process. Complementing the research goals of EPA's COE, the NERL has initiat...

  6. Student teaching and research laboratory focusing on brain-computer interface paradigms--A creative environment for computer science students.

    PubMed

    Rutkowski, Tomasz M

    2015-08-01

    This paper presents an applied concept of a brain-computer interface (BCI) student research laboratory (BCI-LAB) at the Life Science Center of TARA, University of Tsukuba, Japan. Several successful case studies of the student projects are reviewed together with the BCI Research Award 2014 winner case. The BCI-LAB design and project-based teaching philosophy is also explained. Future teaching and research directions summarize the review.

  7. Johnson Space Center Research and Technology 1997 Annual Report

    NASA Technical Reports Server (NTRS)

    1998-01-01

    This report highlights key projects and technologies at Johnson Space Center for 1997. The report focuses on the commercial potential of the projects and technologies and is arranged by CorpTech Major Products Groups. Emerging technologies in these major disciplines were summarized: solar system sciences, life sciences, technology transfer, computer sciences, space technology, and human support technology. These NASA advances have a range of potential commercial applications, from a school internet manager for networks to a liquid metal mirror for optical measurements.

  8. Computer science: Key to a space program renaissance. The 1981 NASA/ASEE summer study on the use of computer science and technology in NASA. Volume 2: Appendices

    NASA Technical Reports Server (NTRS)

    Freitas, R. A., Jr. (Editor); Carlson, P. A. (Editor)

    1983-01-01

    Adoption of an aggressive computer science research and technology program within NASA will: (1) enable new mission capabilities such as autonomous spacecraft, reliability and self-repair, and low-bandwidth intelligent Earth sensing; (2) lower manpower requirements, especially in the areas of Space Shuttle operations, by making fuller use of control center automation, technical support, and internal utilization of state-of-the-art computer techniques; (3) reduce project costs via improved software verification, software engineering, enhanced scientist/engineer productivity, and increased managerial effectiveness; and (4) significantly improve internal operations within NASA with electronic mail, managerial computer aids, an automated bureaucracy and uniform program operating plans.

  9. RIACS FY2002 Annual Report

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.; Gross, Anthony R. (Technical Monitor)

    2002-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. Operated by the Universities Space Research Association (a non-profit university consortium), RIACS is located at the NASA Ames Research Center, Moffett Field, California. It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in September 2003. Ames has been designated NASA's Center of Excellence in Information Technology. In this capacity, Ames is charged with the responsibility to build an Information Technology (IT) Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA Ames and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of IT research necessary to meet the future challenges of NASA missions: 1) Automated Reasoning for Autonomous Systems; 2) Human-Centered Computing; and 3) High Performance Computing and Networking. In addition, RIACS collaborates with NASA scientists to apply IT research to a variety of NASA application domains including aerospace technology, earth science, life sciences, and astrobiology. RIACS also engages in other activities, such as workshops, seminars, visiting scientist programs and student summer programs, designed to encourage and facilitate collaboration between the university and NASA IT research communities.

  10. Steve Hammond | NREL

    Science.gov Websites

    Steve Hammond, Center Director, Steven.Hammond@nrel.gov | 303-275-4121. Steve Hammond is director of the Computational Science Center at the National Renewable Energy Laboratory; his role includes leading NREL's efforts in energy-efficient data centers. Prior to NREL, Steve managed the…

  11. Nontrivial, Nonintelligent, Computer-Based Learning.

    ERIC Educational Resources Information Center

    Bork, Alfred

    1987-01-01

    This paper describes three interactive computer programs used with personal computers to present science learning modules for all ages. Developed by groups of teachers at the Educational Technology Center at the University of California, Irvine, these instructional materials do not use the techniques of contemporary artificial intelligence. (GDC)

  12. Science Information System in Japan. NIER Occasional Paper 02/83.

    ERIC Educational Resources Information Center

    Matsumura, Tamiko

    This paper describes the development of a proposed Japanese Science Information System (SIS), a nationwide network of research and academic libraries, large-scale computer centers, national research institutes, and other organizations, to be formed for the purpose of sharing information and resources in the natural sciences, technology, the…

  13. Exploring the role of pendant amines in transition metal complexes for the reduction of N2 to hydrazine and ammonia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhattacharya, Papri; Prokopchuk, Demyan E.; Mock, Michael T.

    2017-03-01

    This review examines the synthesis and acid reactivity of transition metal dinitrogen complexes bearing diphosphine ligands containing pendant amine groups in the second coordination sphere. This manuscript is a review of the work performed in the Center for Molecular Electrocatalysis. This work was supported as part of the Center for Molecular Electrocatalysis, an Energy Frontier Research Center funded by the U.S. Department of Energy (U.S. DOE), Office of Science, Office of Basic Energy Sciences. EPR studies on Fe were performed using EMSL, a national scientific user facility sponsored by the DOE's Office of Biological and Environmental Research and located at PNNL. Computational resources were provided by the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory. Pacific Northwest National Laboratory is operated by Battelle for the U.S. DOE.

  14. Gigascale Silicon Research Center for Design and Test

    DTIC Science & Technology

    2000-01-07

    Graduate students Kanna Shimizu and Chris Wilson participated in a meeting at Intel hosted by Mani Azimi. Researchers: Kanna Shimizu (advisor: Prof. David Dill; expected graduation 6/1/2000), Department of Computer Science, Gates Computer Science Building. Bus specifications are currently informal, resulting in ambiguities and inconsistencies.

  15. Removing the center from computing: biology's new mode of digital knowledge production.

    PubMed

    November, Joseph

    2011-06-01

    This article shows how the USA's National Institutes of Health (NIH) helped to bring about a major shift in the way computers are used to produce knowledge and in the design of computers themselves as a consequence of its early 1960s efforts to introduce information technology to biologists. Starting in 1960 the NIH sought to reform the life sciences by encouraging researchers to make use of digital electronic computers, but despite generous federal support biologists generally did not embrace the new technology. Initially the blame fell on biologists' lack of appropriate (i.e. digital) data for computers to process. However, when the NIH consulted MIT computer architect Wesley Clark about this problem, he argued that the computer's quality as a device that was centralized posed an even greater challenge to potential biologist users than did the computer's need for digital data. Clark convinced the NIH that if the agency hoped to effectively computerize biology, it would need to satisfy biologists' experimental and institutional needs by providing them the means to use a computer without going to a computing center. With NIH support, Clark developed the 1963 Laboratory Instrument Computer (LINC), a small, real-time interactive computer intended to be used inside the laboratory and controlled entirely by its biologist users. Once built, the LINC provided a viable alternative to the 1960s norm of large computers housed in computing centers. As such, the LINC not only became popular among biologists, but also served in later decades as an important precursor of today's computing norm in the sciences and far beyond, the personal computer.

  16. Arctic Boreal Vulnerability Experiment (ABoVE) Science Cloud

    NASA Astrophysics Data System (ADS)

    Duffy, D.; Schnase, J. L.; McInerney, M.; Webster, W. P.; Sinno, S.; Thompson, J. H.; Griffith, P. C.; Hoy, E.; Carroll, M.

    2014-12-01

    The effects of climate change are being revealed at alarming rates in the Arctic and Boreal regions of the planet. NASA's Terrestrial Ecology Program has launched a major field campaign to study these effects over the next 5 to 8 years. The Arctic Boreal Vulnerability Experiment (ABoVE) will challenge scientists to take measurements in the field, study remote observations, and even run models to better understand the impacts of a rapidly changing climate for areas of Alaska and western Canada. The NASA Center for Climate Simulation (NCCS) at the Goddard Space Flight Center (GSFC) has partnered with the Terrestrial Ecology Program to create a science cloud designed for this field campaign - the ABoVE Science Cloud. The cloud combines traditional high performance computing with emerging technologies to create an environment specifically designed for large-scale climate analytics. The ABoVE Science Cloud utilizes (1) virtualized high-speed InfiniBand networks, (2) a combination of high-performance file systems and object storage, and (3) virtual system environments tailored for data intensive, science applications. At the center of the architecture is a large object storage environment, much like a traditional high-performance file system, that supports data proximal processing using technologies like MapReduce on a Hadoop Distributed File System (HDFS). Surrounding the storage is a cloud of high performance compute resources with many processing cores and large memory coupled to the storage through an InfiniBand network. Virtual systems can be tailored to a specific scientist and provisioned on the compute resources with extremely high-speed network connectivity to the storage and to other virtual systems. In this talk, we will present the architectural components of the science cloud and examples of how it is being used to meet the needs of the ABoVE campaign. In our experience, the science cloud approach significantly lowers the barriers and risks to organizations that require high performance computing solutions and provides the NCCS with the agility required to meet our customers' rapidly increasing and evolving requirements.
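
    The abstract's mention of MapReduce-style, data-proximal processing can be illustrated with a toy, single-machine version of the pattern. The sketch below is pure Python with invented sample data; it is not the Hadoop/HDFS deployment the NCCS actually runs, only the map/shuffle/reduce shape of such a computation.

        # Toy map/shuffle/reduce over invented site measurements, illustrating
        # the processing pattern named in the abstract (not the real system).
        from functools import reduce

        # map step: each record (site, anomaly) -> keyed partial result
        records = [("alaska", 1.4), ("yukon", 0.9), ("alaska", 2.1), ("yukon", 1.3)]
        mapped = [(site, (anom, 1)) for site, anom in records]

        # shuffle step: group partial results by key
        groups = {}
        for site, pair in mapped:
            groups.setdefault(site, []).append(pair)

        # reduce step: combine partials into a per-site mean anomaly
        def combine(a, b):
            return (a[0] + b[0], a[1] + b[1])

        for site, pairs in groups.items():
            total, count = reduce(combine, pairs)
            print(f"{site}: mean anomaly {total / count:.2f}")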

  17. Symposium on the Interface: Computing Science and Statistics (20th). Theme: Computationally Intensive Methods in Statistics Held in Reston, Virginia on April 20-23, 1988

    DTIC Science & Technology

    1988-08-20

    William A. Link, Patuxent Wildlife Research Center. "Increasing Reliability of Multiversion Fault-Tolerant Software Design by Modularization," Junryo Miyashita, Department of Computer Science, California State University at San Bernardino. They shall be referred to as "multiversion fault-tolerant software design". One problem of developing multiple versions of a program is the high cost…

  18. Center for Building Science: Annual report, FY 1986

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cairns, E.J.; Rosenfeld, A.H.

    1987-05-01

    The Center for Building Science consists of four programs in the Applied Science Division: energy analysis, buildings energy systems, windows and lighting, and indoor environment. It was established to provide an umbrella so that groups in different programs but with similar interests could combine to perform joint research, develop new research areas, share resources, and produce joint publications. As detailed below, potential savings for U.S. society from energy-efficient buildings are enormous. But these savings can only be realized through an expanding federal R&D program that develops expertise in this new area. The Center for Building Science develops efficient new building components, computer models, data and information systems, and trains needed building scientists. 135 refs., 72 figs., 18 tabs.

  19. New frontiers in design synthesis

    NASA Technical Reports Server (NTRS)

    Goldin, D. S.; Venneri, S. L.; Noor, A. K.

    1999-01-01

    The Intelligent Synthesis Environment (ISE), which is one of the major strategic technologies under development at NASA centers and the University of Virginia, is described. One of the major objectives of ISE is to significantly enhance the rapid creation of innovative affordable products and missions. ISE uses a synergistic combination of leading-edge technologies, including high performance computing, high capacity communications and networking, human-centered computing, knowledge-based engineering, computational intelligence, virtual product development, and product information management. The environment will link scientists, design teams, manufacturers, suppliers, and consultants who participate in the mission synthesis as well as in the creation and operation of the aerospace system. It will radically advance the process by which complex science missions are synthesized, and high-tech engineering systems are designed, manufactured, and operated. The five major components critical to ISE are human-centered computing, infrastructure for distributed collaboration, rapid synthesis and simulation tools, life cycle integration and validation, and cultural change in both the engineering and science creative process. The five components and their subelements are described. Related U.S. government programs are outlined and the future impact of ISE on engineering research and education is discussed.

  20. Integrating Laptop Computers into Classroom: Attitudes, Needs, and Professional Development of Science Teachers—A Case Study

    NASA Astrophysics Data System (ADS)

    Klieger, Aviva; Ben-Hur, Yehuda; Bar-Yossef, Nurit

    2010-04-01

    The study examines the professional development of junior-high-school teachers participating in the Israeli "Katom" (Computer for Every Class, Student and Teacher) Program, begun in 2004. A three-circle support and training model was developed for teachers' professional development. The first circle applies to all teachers in the program; the second, to all teachers at individual schools; the third, to teachers of specific disciplines. The study reveals and describes the attitudes of science teachers to the integration of laptop computers and to the accompanying professional development model. Semi-structured interviews were conducted with eight science teachers from the four schools participating in the program. The interviews were analyzed using a relational framework derived from the interview data. Two factors influenced science teachers' professional development: (1) the introduction of laptops to teachers and students, and (2) the support and training system. Interview analysis shows that the disciplinary training is most relevant to teachers and they are very interested in belonging to the professional science teachers' community. They also prefer face-to-face meetings in their school. Among the difficulties they noted were the new learning environment, including control of student computers, computer integration in laboratory work, and technical problems. Laptop computers contributed significantly to teachers' professional and personal development and to a shift from teacher-centered to student-centered teaching. One-to-one laptops also changed the schools' digital culture. The findings are important for designing concepts and models for professional development when introducing technological innovation into the educational system.

  1. 2009 ALCF annual report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beckman, P.; Martin, D.; Drugan, C.

    2010-11-23

    This year the Argonne Leadership Computing Facility (ALCF) delivered nearly 900 million core hours of science. The research conducted at their leadership-class facility touched our lives in both minute and massive ways - whether it was studying the catalytic properties of gold nanoparticles, predicting protein structures, or unearthing the secrets of exploding stars. The authors remained true to their vision to act as the forefront computational center in extending science frontiers by solving pressing problems for our nation. Our success in this endeavor was due mainly to the Department of Energy's (DOE) INCITE (Innovative and Novel Computational Impact on Theory and Experiment) program. The program awards significant amounts of computing time to computationally intensive, unclassified research projects that can make high-impact scientific advances. This year, DOE allocated 400 million hours of time to 28 research projects at the ALCF. Scientists from around the world conducted the research, representing such esteemed institutions as the Princeton Plasma Physics Laboratory, National Institute of Standards and Technology, and European Center for Research and Advanced Training in Scientific Computation. Argonne also provided Director's Discretionary allocations for research challenges, addressing such issues as reducing aerodynamic noise, critical for next-generation 'green' energy systems. Intrepid - the ALCF's 557-teraflops IBM Blue Gene/P supercomputer - enabled astounding scientific solutions and discoveries. Intrepid went into full production five months ahead of schedule. As a result, the ALCF nearly doubled the days of production computing available to the DOE Office of Science, INCITE awardees, and Argonne projects. One of the fastest supercomputers in the world for open science, the energy-efficient system uses about one-third as much electricity as a machine of comparable size built with more conventional parts. In October 2009, President Barack Obama recognized the excellence of the entire Blue Gene series by awarding it the National Medal of Technology and Innovation. Other noteworthy achievements included the ALCF's collaboration with the National Energy Research Scientific Computing Center (NERSC) to examine cloud computing as a potential new computing paradigm for scientists. Named Magellan, the DOE-funded initiative will explore which science application programming models work well within the cloud, as well as evaluate the challenges that come with this new paradigm. The ALCF obtained approval for its next-generation machine, a 10-petaflops system to be delivered in 2012. This system will allow us to resolve ever more pressing problems, even more expeditiously, through breakthrough science in the years to come.

  2. Activities of the Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2001-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center, Moffett Field, California. RIACS research focuses on the three cornerstones of IT research necessary to meet the future challenges of NASA missions: 1. Automated Reasoning for Autonomous Systems: Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. 2. Human-Centered Computing: Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities. 3. High Performance Computing and Networking: Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to analysis of large scientific datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply IT research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, visiting scientist programs and student summer programs, designed to encourage and facilitate collaboration between the university and NASA IT research communities.

  3. Research Networks and Technology Migration (RESNETSII)

    DTIC Science & Technology

    2004-07-01

    Lawrence Berkeley National Laboratory (LBNL), the International Computer Science Institute (ICSI) Center for Internet Research (ICIR), and the AT&T Center for Internet Research at ICSI (ACIRI), with AT&T Labs-Research and the University of Massachusetts. DARWIN: developing protocols and measuring degradation in network loss, delay, and throughput.

  4. Iceland: Eyjafjallajökull Volcano

    Atmospheric Science Data Center

    2013-04-17

    ... erroneous impression that they are below the land surface. A quantitative computer analysis is necessary to separate out wind and height. ... MD. The MISR data were obtained from the NASA Langley Research Center Atmospheric Science Data Center in Hampton, VA. Image ...

  5. Iceland: Eyjafjallajökull Volcano

    Atmospheric Science Data Center

    2013-04-17

    ... causes motion of the plume features between camera views. A quantitative computer analysis is necessary to separate out wind and height ... MD. The MISR data were obtained from the NASA Langley Research Center Atmospheric Science Data Center in Hampton, VA. Image ...

  6. Delivering The Benefits of Chemical-Biological Integration in Computational Toxicology at the EPA (ACS Fall meeting)

    EPA Science Inventory

    Abstract: Researchers at the EPA’s National Center for Computational Toxicology integrate advances in biology, chemistry, and computer science to examine the toxicity of chemicals and help prioritize chemicals for further research based on potential human health risks. The intent...

  7. Introduction to the theory of machines and languages

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weidhaas, P. P.

    1976-04-01

    This text is intended to be an elementary ''guided tour'' through some basic concepts of modern computer science. Various models of computing machines and formal languages are studied in detail. Discussions center around questions such as, ''What is the scope of problems that can or cannot be solved by computers?''
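
    One touchstone for the question of what computers cannot solve is the halting problem. As an illustration only (not material from the report), the sketch below replays the classic diagonalization argument in Python: if a total, always-correct halts(f, x) decider existed, the contrarian program would contradict it, so no such decider can exist.

        # Assumed decider: halts(f, x) would return True iff f(x) halts.
        # The point of the sketch is that no correct, total version can exist.
        def halts(f, x):
            raise NotImplementedError("assumed decider; impossible in general")

        def contrarian(f):
            # Diagonalization: do the opposite of whatever the decider predicts.
            if halts(f, f):
                while True:   # loop forever if f(f) is predicted to halt
                    pass
            return            # halt immediately if f(f) is predicted to loop

        # contrarian(contrarian) halts exactly when halts() says it loops,
        # and loops exactly when halts() says it halts -- a contradiction.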

  8. A parallel-processing approach to computing for the geographic sciences

    USGS Publications Warehouse

    Crane, Michael; Steinwand, Dan; Beckmann, Tim; Krpan, Greg; Haga, Jim; Maddox, Brian; Feller, Mark

    2001-01-01

    The overarching goal of this project is to build a spatially distributed infrastructure for information science research by forming a team of information science researchers and providing them with similar hardware and software tools to perform collaborative research. Four geographically distributed Centers of the U.S. Geological Survey (USGS) are developing their own clusters of low-cost personal computers into parallel computing environments that provide a cost-effective way for the USGS to increase participation in the high-performance computing community. Referred to as Beowulf clusters, these hybrid systems provide the robust computing power required for conducting research into various areas, such as advanced computer architecture, algorithms to meet the processing needs for real-time image and data processing, the creation of custom datasets from seamless source data, rapid turn-around of products for emergency response, and support for computationally intense spatial and temporal modeling.
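
    As an illustration of the data-parallel pattern such clusters support (not code from the USGS project), the sketch below uses mpi4py, a standard Python binding to MPI: the root process splits an image into strips, scatters them to the nodes, and gathers the processed results. The image contents and the process_tile step are placeholders.

        # Minimal scatter/gather sketch with mpi4py; run with, e.g.,
        # mpirun -n 4 python tiles.py. Placeholder workload, not USGS code.
        from mpi4py import MPI
        import numpy as np

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        def process_tile(tile):
            # Stand-in for real processing (filtering, resampling, etc.).
            return tile * 2.0

        if rank == 0:
            image = np.arange(64.0).reshape(8, 8)
            tiles = np.array_split(image, size)  # one strip of rows per process
        else:
            tiles = None

        tile = comm.scatter(tiles, root=0)       # distribute the work
        result = process_tile(tile)              # compute locally on each node
        gathered = comm.gather(result, root=0)   # collect the pieces

        if rank == 0:
            print(np.vstack(gathered))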

  9. RIACS

    NASA Technical Reports Server (NTRS)

    Moore, Robert C.

    1998-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities that serves as a bridge between NASA and the academic community. Under a five-year cooperative agreement with NASA, research at RIACS is focused on areas that are strategically enabling to the Ames Research Center's role as NASA's Center of Excellence for Information Technology. Research is carried out by a staff of full-time scientists, augmented by visitors, students, post-doctoral candidates and visiting university faculty. RIACS is chartered to carry out research and development in computer science. This work is devoted in the main to tasks that are strategically enabling with respect to NASA's bold missions in space exploration and aeronautics. There are three foci for this work: Automated Reasoning, Human-Centered Computing, and High Performance Computing and Networking. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. Through its visiting scientist program, RIACS facilitates the participation of university-based researchers, including both faculty and students, in the research activities of NASA and RIACS. RIACS researchers work in close collaboration with NASA computer scientists on projects such as the Remote Agent Experiment on the Deep Space One mission, and Super-Resolution Surface Modeling.

  10. Computer Support for Knowledge Communication in Science Exhibitions: Novel Perspectives from Research on Collaborative Learning

    ERIC Educational Resources Information Center

    Knipfer, Kristin; Mayr, Eva; Zahn, Carmen; Schwan, Stephan; Hesse, Friedrich W.

    2009-01-01

    In this article, the potentials of advanced technologies for learning in science exhibitions are outlined. For this purpose, we conceptualize science exhibitions as "dynamic information space for knowledge building" which includes three pathways of knowledge communication. This article centers on the second pathway, that is, knowledge…

  11. Army Maneuver Center of Excellence

    DTIC Science & Technology

    2012-10-18

    agreements throughout DoD DARPA, JIEDDO, DHS, FAA, DoE, NSA, NASA, SMDC, etc. Strategic Partnerships Benefit the Army Materiel Enterprise External... Neuroscience, Network Sciences, Hierarchical Computing, Extreme Energy Science, Autonomous Systems Technology, Emerging Sciences, Meso-scale (grain...scales • Improvements in Soldier-system overall performance → operational neuroscience and advanced simulation and training technologies

  12. The U.S. "Tox21 Community" and the Future of Toxicology

    EPA Science Inventory

    In early 2008, the National Institute of Environmental Health Sciences/National Toxicology Program, the NIH Chemical Genomics Center, and the Environmental Protection Agency’s National Center for Computational Toxicology entered into a Memorandum of Understanding to collaborate o...

  13. Digital optical computers at the optoelectronic computing systems center

    NASA Technical Reports Server (NTRS)

    Jordan, Harry F.

    1991-01-01

    The Digital Optical Computing Program within the National Science Foundation Engineering Research Center for Opto-electronic Computing Systems has as its specific goal research on optical computing architectures suitable for use at the highest possible speeds. The program can be targeted toward exploiting the time domain because other programs in the Center are pursuing research on parallel optical systems, exploiting optical interconnection and optical devices and materials. Using a general purpose computing architecture as the focus, we are developing design techniques, tools and architecture for operation at the speed of light limit. Experimental work is being done with the somewhat low speed components currently available but with architectures which will scale up in speed as faster devices are developed. The design algorithms and tools developed for a general purpose, stored program computer are being applied to other systems such as optimally controlled optical communication networks.

  14. Activities of the Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Oliger, Joseph

    1994-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under contract with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. Research at RIACS is currently being done in the following areas: (1) parallel computing; (2) advanced methods for scientific computing; (3) high performance networks; and (4) learning systems. RIACS technical reports are usually preprints of manuscripts that have been submitted to research journals or conference proceedings. A list of these reports for the period January 1, 1994 through December 31, 1994 is in the Reports and Abstracts section of this report.

  15. 3D Object Recognition: Symmetry and Virtual Views

    DTIC Science & Technology

    1992-12-01

    Artificial Intelligence Laboratory, 545 Technology Square, Cambridge... ARTIFICIAL INTELLIGENCE LABORATORY and CENTER FOR BIOLOGICAL AND COMPUTATIONAL LEARNING, A.I. Memo No. 1409, C.B.C.L. Paper No. 76, December 1992. 3D Object... research done within the Center for Biological and Computational Learning in the Department of Brain and Cognitive Sciences, and at the Artificial

  16. Merging the Machines of Modern Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wolf, Laura; Collins, Jim

    Two recent projects have harnessed supercomputing resources at the US Department of Energy’s Argonne National Laboratory in a novel way to support major fusion science and particle collider experiments. Using leadership computing resources, one team ran fine-grid analysis of real-time data to make near-real-time adjustments to an ongoing experiment, while a second team is working to integrate Argonne’s supercomputers into the Large Hadron Collider/ATLAS workflow. Together these efforts represent a new paradigm of the high-performance computing center as a partner in experimental science.

  17. Research and Technology 1997

    NASA Technical Reports Server (NTRS)

    1998-01-01

    This report highlights the challenging work accomplished during fiscal year 1997 by Ames research scientists and engineers. The work is divided into accomplishments that support the goals of NASA's four Strategic Enterprises: Aeronautics and Space Transportation Technology, Space Science, Human Exploration and Development of Space (HEDS), and Earth Science. NASA Ames Research Center's research effort in the Space, Earth, and HEDS Enterprises is focused in large part to support Ames' lead role for Astrobiology, which broadly defined is the scientific study of the origin, distribution, and future of life in the universe. This NASA initiative in Astrobiology is a broad science effort embracing basic research, technology development, and flight missions. Ames' contributions to the Space Science Enterprise are focused in the areas of exobiology, planetary systems, astrophysics, and space technology. Ames supports the Earth Science Enterprise by conducting research and by developing technology with the objective of expanding our knowledge of the Earth's atmosphere and ecosystems. Finally, Ames supports the HEDS Enterprise by conducting research, managing spaceflight projects, and developing technologies. A key objective is to understand the phenomena surrounding the effects of gravity on living things. Ames has also been designated the Agency's Center of Excellence for Information Technology. The three cornerstones of Information Technology research at Ames are automated reasoning, human-centered computing, and high performance computing and networking.

  18. ExpoCast: Exposure Science for Prioritization and Toxicity Testing (T)

    EPA Science Inventory

    The US EPA National Center for Computational Toxicology (NCCT) has a mission to integrate modern computing and information technology with molecular biology to improve Agency prioritization of data requirements and risk assessment of chemicals. Recognizing the critical need for ...

  19. Mass storage system experiences and future needs at the National Center for Atmospheric Research

    NASA Technical Reports Server (NTRS)

    Olear, Bernard T.

    1991-01-01

    A summary and viewgraphs of a discussion presented at the National Space Science Data Center (NSSDC) Mass Storage Workshop are included. Some of the experiences of the Scientific Computing Division at the National Center for Atmospheric Research (NCAR) in dealing with the 'data problem' are discussed. A brief history and a development of some basic mass storage system (MSS) principles are given. An attempt is made to show how these principles apply to the integration of various components into NCAR's MSS. MSS needs for future computing environments are discussed.

  20. The Theme's The Thing!

    ERIC Educational Resources Information Center

    Zaidel, Lisa Brusman

    1991-01-01

    Presents suggestions to help elementary teachers organize learning centers and activities around the themes of Peter Rabbit (Grade 1), weather (Grade 3), and bees (Grade 5). Suggestions are given for activities in centers for listening/reading, language arts, computers, math, science, cooperative learning, research, and writing. (SM)

  1. Staff | Computational Science | NREL

    Science.gov Websites

    Develops and leads laboratory-wide efforts in high-performance computing and energy-efficient data centers. Albin, Jim, IT Professional IV-High Perf Computing, Jim.Albin@nrel.gov, 303-275-4069. Ananthan, Shreyas, Senior Scientist - High-Performance Algorithms and Modeling, Shreyas.Ananthan@nrel.gov, 303-275-4807. Bendl, Kurt, IT Professional IV-High ...

  2. High-Performance Computing Data Center | Computational Science | NREL

    Science.gov Websites

    The data center uses warm-water liquid cooling to achieve its very low PUE, then captures and reuses waste heat as the primary heating source; a dry cooler that uses refrigerant in a passive cycle to dissipate heat is reducing onsite water use. Topics: measuring efficiency through PUE; warm-water liquid cooling; re-using waste heat from computing components.
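
    For context, PUE (power usage effectiveness) is total facility energy divided by the energy delivered to the computing equipment, so an ideal data center approaches 1.0. The toy calculation below is illustrative only; the kWh figures are made up, not NREL measurements.

        # Toy PUE calculation; the kWh figures are hypothetical.
        def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
            """Power usage effectiveness: total energy / IT energy (ideal -> 1.0)."""
            return total_facility_kwh / it_equipment_kwh

        print(pue(1_060_000, 1_000_000))  # 1.06: 6% overhead for cooling, etc.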

  3. Displaying Computer Simulations Of Physical Phenomena

    NASA Technical Reports Server (NTRS)

    Watson, Val

    1991-01-01

    Paper discusses computer simulation as means of experiencing and learning to understand physical phenomena. Covers both present simulation capabilities and major advances expected in near future. Visual, aural, tactile, and kinesthetic effects used to teach such physical sciences as dynamics of fluids. Recommends classrooms in universities, government, and industry be linked to advanced computing centers so computer simulations integrated into education process.

  4. Examining the Effects of Turkish Education Reform on Students' TIMSS 2007 Science Achievements

    ERIC Educational Resources Information Center

    Atar, Hakan Yavuz; Atar, Burcu

    2012-01-01

    The purpose of this study is to examine the effects of some of the changes such as student centered learning (i.e. inquiry science instruction), outfitting classrooms with latest technology and computers that the reform movement has brought about on students' TIMSS 2007 science achievements. Two-staged stratified sampling was used in the selection…

  5. Adaptive Mesh Experiments for Hyperbolic Partial Differential Equations

    DTIC Science & Technology

    1990-02-01

    Joseph E. Flaherty, February 1990, US Army Armament Research, Development and Engineering Center, Close Combat Armaments Center, Benet Laboratories... NY 12189-4050. U.S. Army ARDEC, Close Combat Armaments Center... Flaherty, Department of Computer Science, Rensselaer Polytechnic Institute, Troy, NY 12180-3590, and U.S. Army ARDEC, Close Combat Armaments Center, Benet

  6. Government regulations and other influences on the medical use of computers.

    PubMed

    Mishelevich, D J; Grams, R R; Mize, S G; Smith, J P

    1979-01-01

    This paper presents points brought out in a panel discussion held at the 12th Hawaiian International Conference on System Sciences, January 1979. The session was attended by approximately two dozen interested parties from various segments of the academic, government, and health care communities. The broad categories covered include the specific problems of government regulations and their impact on specific clinical information systems installed at The University of Texas Health Science Center at Dallas, opportunities in a regulated environment, problems in a regulated environment, vendor-related issues in the marketing and manufacture of computer-based information systems, rational approaches to government control, and specific issues related to medical computer science.

  7. The National Cancer Institute's Physical Sciences - Oncology Network

    NASA Astrophysics Data System (ADS)

    Espey, Michael Graham

    In 2009, the NCI launched the Physical Sciences - Oncology Centers (PS-OC) initiative with 12 Centers (U54) funded through 2014. The current phase of the Program includes U54 funded Centers with the added feature of soliciting new Physical Science - Oncology Projects (PS-OP) U01 grant applications through 2017; see NCI PAR-15-021. The PS-OPs, individually and along with other PS-OPs and the Physical Sciences-Oncology Centers (PS-OCs), comprise the Physical Sciences-Oncology Network (PS-ON). The foundation of the Physical Sciences-Oncology initiative is a high-risk, high-reward program that promotes a 'physical sciences perspective' of cancer and fosters the convergence of physical science and cancer research by forming transdisciplinary teams of physical scientists (e.g., physicists, mathematicians, chemists, engineers, computer scientists) and cancer researchers (e.g., cancer biologists, oncologists, pathologists) who work closely together to advance our understanding of cancer. The collaborative PS-ON structure catalyzes transformative science through increased exchange of people, ideas, and approaches. PS-ON resources are leveraged to fund Trans-Network pilot projects to enable synergy and cross-testing of experimental and/or theoretical concepts. This session will include a brief PS-ON overview followed by a strategic discussion with the APS community to exchange perspectives on the progression of trans-disciplinary physical sciences in cancer research.

  8. FERMI: A Flexible Expert Reasoner with Multi-Domain Inferencing.

    DTIC Science & Technology

    1985-07-29

  9. BioSIGHT: Interactive Visualization Modules for Science Education

    NASA Technical Reports Server (NTRS)

    Wong, Wee Ling

    1998-01-01

    Redefining science education to harness emerging integrated media technologies with innovative pedagogical goals represents a unique challenge. The Integrated Media Systems Center (IMSC) is the only engineering research center in the area of multimedia and creative technologies sponsored by the National Science Foundation. The research program at IMSC is focused on developing advanced technologies that address human-computer interfaces, database management, and high-speed network capabilities. The BioSIGHT project at IMSC is a demonstration technology project in the area of education that seeks to address how such emerging multimedia technologies can make an impact on science education. The scope of this project will help solidify NASA's commitment to the development of innovative educational resources that promote science literacy for our students and the general population as well. These issues must be addressed as NASA marches toward the goal of enabling human space exploration, which requires an understanding of life sciences in space. The IMSC BioSIGHT lab was established with the purpose of developing a novel methodology that will map a high school biology curriculum into a series of interactive visualization modules that can be easily incorporated into a space biology curriculum. Fundamental concepts in general biology must be mastered in order to allow a better understanding and application for space biology. Interactive visualization is a powerful component that can capture the students' imagination, facilitate their assimilation of complex ideas, and help them develop integrated views of biology. These modules will augment the role of the teacher and will establish the value of student-centered interactivity, both in an individual setting as well as in a collaborative learning environment. Students will be able to interact with the content material, explore new challenges, and perform virtual laboratory simulations. The BioSIGHT effort is truly cross-disciplinary in nature and requires expertise from many areas, including Biology, Computer Science, Electrical Engineering, Education, and the Cognitive Sciences. The BioSIGHT team includes a scientific illustrator, an educational software designer, and computer programmers, as well as IMSC graduate and undergraduate students.

  10. National Space Science Data Center (NSSDC) Data Listing

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Satellite and nonsatellite data available from the National Space Science Data Center are listed. The Satellite Data listing includes the spacecraft name, launch date, and an alphabetical list of experiments. The Non-Satellite Data listing contains ground based data, models, computer routines, and composite spacecraft data. The data set name, data form code, quantity of data, and the time span covered are included in the data sets of both listings where appropriate. Geodetic tracking data sets are also included.

  11. Monaural Speech Segregation by Integrating Primitive and Schema-Based Analysis

    DTIC Science & Technology

    2008-02-03

    vol. 19, pp. 475-492. Wang D.L. and Chang P.S. (2008): An oscillatory correlation model of auditory streaming. Cognitive Neurodynamics, vol. 2, pp...Subcontracts DeLiang Wang (Principal Investigator) March 2008 Department of Computer Science & Engineering and Center for Cognitive Science The

  12. NASA Scientists Push the Limits of Computer Technology

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Dr. Donald Frazier, NASA researcher, uses a blue laser shining through a quartz window into a special mix of chemicals to generate a polymer film on the inside quartz surface. As the chemicals respond to the laser light, they adhere to the glass surface, forming optical films. Dr. Frazier and Dr. Mark S. Paley developed the process in the Space Sciences Laboratory at NASA's Marshall Space Flight Center in Huntsville, AL. Working aboard the Space Shuttle, a science team led by Dr. Frazier formed thin films potentially useful in optical computers with fewer impurities than those formed on Earth. Patterns of these films can be traced onto the quartz surface. In the optical computers of the future, these films could replace electronic circuits and wires, making the systems more efficient and cost-effective, as well as lighter and more compact. Photo credit: NASA/Marshall Space Flight Center.

  13. NASA Scientists Push the Limits of Computer Technology

    NASA Technical Reports Server (NTRS)

    1998-01-01

    NASA researcher Dr. Donald Frazier uses a blue laser shining through a quartz window into a special mix of chemicals to generate a polymer film on the inside quartz surface. As the chemicals respond to the laser light, they adhere to the glass surface, forming optical films. Dr. Frazier and Dr. Mark S. Paley developed the process in the Space Sciences Laboratory at NASA's Marshall Space Flight Center in Huntsville, AL. Working aboard the Space Shuttle, a science team led by Dr. Frazier formed thin films potentially useful in optical computers with fewer impurities than those formed on Earth. Patterns of these films can be traced onto the quartz surface. In the optical computers of the future, these films could replace electronic circuits and wires, making the systems more efficient and cost-effective, as well as lighter and more compact. Photo credit: NASA/Marshall Space Flight Center.

  14. NASA Scientists Push the Limits of Computer Technology

    NASA Technical Reports Server (NTRS)

    1999-01-01

    NASA researcher Dr. Donald Frazier uses a blue laser shining through a quartz window into a special mix of chemicals to generate a polymer film on the inside quartz surface. As the chemicals respond to the laser light, they adhere to the glass surface, forming optical films. Dr. Frazier and Dr. Mark S. Paley developed the process in the Space Sciences Laboratory at NASA's Marshall Space Flight Center in Huntsville, AL. Working aboard the Space Shuttle, a science team led by Dr. Frazier formed thin films potentially useful in optical computers with fewer impurities than those formed on Earth. Patterns of these films can be traced onto the quartz surface. In the optical computers of the future, these films could replace electronic circuits and wires, making the systems more efficient and cost-effective, as well as lighter and more compact. Photo credit: NASA/Marshall Space Flight Center.

  15. Research Institute for Advanced Computer Science: Annual Report October 1998 through September 1999

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.; Gross, Anthony R. (Technical Monitor)

    1999-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center (ARC). It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. ARC has been designated NASA's Center of Excellence in Information Technology. In this capacity, ARC is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA ARC and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities; (3) High Performance Computing and Networking Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply information technology research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, and visiting scientist programs, designed to encourage and facilitate collaboration between the university and NASA information technology research communities.

  16. Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2000-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center. It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. Ames has been designated NASA's Center of Excellence in Information Technology. In this capacity, Ames is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA Ames and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth; (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities; (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply information technology research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, and visiting scientist programs, designed to encourage and facilitate collaboration between the university and NASA information technology research communities.

  17. Post-Mortem and Effective Measure of Science Programs: A Study of Bangladesh Open University

    ERIC Educational Resources Information Center

    Numan, Sharker Md.; Islam, Md. Anwarul; Shah, A. K. M. Azad

    2013-01-01

    Distance education can be more learner-centered if distance educators are aware of the problems, needs, attitudes and characteristics of their learners. The aim of this study was to compare the learners' profiles in terms of their attitudes and demography between the learners of computer science and health science. A cross-sectional study design…

  18. Use of Digital Game Based Learning and Gamification in Secondary School Science: The Effect on Student Engagement, Learning and Gender Difference

    ERIC Educational Resources Information Center

    Khan, Amna; Ahmad, Farzana Hayat; Malik, Muhammad Muddassir

    2017-01-01

    This study aimed to identify the impact of a game based learning (GBL) application using computer technologies on student engagement in secondary school science classrooms. The literature reveals that conventional Science teaching techniques (teacher-centered lecture and teaching), which foster rote learning among students, are one of the major…

  19. Human Centered Computing for Mars Exploration

    NASA Technical Reports Server (NTRS)

    Trimble, Jay

    2005-01-01

    The science objectives are to determine the aqueous, climatic, and geologic history of a site on Mars where conditions may have been favorable to the preservation of evidence of prebiotic or biotic processes. Human Centered Computing is a development process that starts with users and their needs, rather than with technology. The goal is a system design that serves the user, where the technology fits the task and the complexity is that of the task not of the tool.

  20. Education through the prism of computation

    NASA Astrophysics Data System (ADS)

    Kaurov, Vitaliy

    2014-03-01

    With the rapid development of technology, computation claims its irrevocable place among the research components of modern science. Thus, to foster a successful future scientist, engineer or educator, we need to add computation to the foundations of scientific education. We will discuss what type of paradigm shifts this brings to these foundations using the example of the Wolfram Science Summer School, one of the most advanced computational outreach programs run by the Wolfram Foundation, welcoming participants of almost all ages and backgrounds. Centered on complexity science and physics, it also covers numerous adjacent and interdisciplinary fields such as finance, biology, medicine and even music. We will talk about educational and research experiences in this program during the 12 years of its existence. We will review statistics and outputs the program has produced. Among these are interactive electronic publications at the Wolfram Demonstrations Project and contributions to the computational knowledge engine Wolfram|Alpha.

  1. 1994 Science Information Management and Data Compression Workshop

    NASA Technical Reports Server (NTRS)

    Tilton, James C. (Editor)

    1994-01-01

    This document is the proceedings from the 'Science Information Management and Data Compression Workshop,' which was held on September 26-27, 1994, at the NASA Goddard Space Flight Center, Greenbelt, Maryland. The Workshop explored promising computational approaches for handling the collection, ingestion, archival and retrieval of large quantities of data in future Earth and space science missions. It consisted of eleven presentations covering a range of information management and data compression approaches that are being or have been integrated into actual or prototypical Earth or space science data information systems, or that hold promise for such an application. The workshop was organized by James C. Tilton and Robert F. Cromp of the NASA Goddard Space Flight Center.

  2. The 1995 Science Information Management and Data Compression Workshop

    NASA Technical Reports Server (NTRS)

    Tilton, James C. (Editor)

    1995-01-01

    This document is the proceedings from the 'Science Information Management and Data Compression Workshop,' which was held on October 26-27, 1995, at the NASA Goddard Space Flight Center, Greenbelt, Maryland. The Workshop explored promising computational approaches for handling the collection, ingestion, archival, and retrieval of large quantities of data in future Earth and space science missions. It consisted of fourteen presentations covering a range of information management and data compression approaches that are being or have been integrated into actual or prototypical Earth or space science data information systems, or that hold promise for such an application. The Workshop was organized by James C. Tilton and Robert F. Cromp of the NASA Goddard Space Flight Center.

  3. Effect of Graphene with Nanopores on Metal Clusters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Hu; Chen, Xianlang; Wang, Lei

    Porous graphene, which is a novel type of defective graphene, shows excellent potential as a support material for metal clusters. In this work, the stability and electronic structures of metal clusters (Pd, Ir, Rh) supported on pristine graphene and graphene with different sizes of nanopore were investigated by first-principles density functional theory (DFT) calculations. Thereafter, CO adsorption and oxidation on the Pd-graphene system were chosen to evaluate its catalytic performance. Graphene with a nanopore can strongly stabilize the metal clusters and cause a substantial downshift of the d-band center of the metal clusters, thus decreasing CO adsorption. All binding energies, d-band centers, and adsorption energies show a linear change with the size of the nanopore: a bigger nanopore corresponds to stronger metal-cluster binding to the graphene, a lower downshift of the d-band center, and weaker CO adsorption. By using a suitably sized nanopore, supported Pd clusters on the graphene will have similar CO and O2 adsorption abilities, thus leading to superior CO tolerance. The DFT-calculated reaction energy barriers show that graphene with a nanopore is a superior catalyst for the CO oxidation reaction. These properties can play an important role in instructing graphene-supported metal catalyst preparation to prevent the diffusion or agglomeration of metal clusters and enhance catalytic performance. This work was supported by the National Basic Research Program of China (973 Program) (2013CB733501) and the National Natural Science Foundation of China (NSFC-21176221, 21136001, 21101137, 21306169, and 91334013). D. Mei acknowledges the support from the US Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences & Biosciences. Pacific Northwest National Laboratory (PNNL) is a multiprogram national laboratory operated for DOE by Battelle. Computing time was granted by the grand challenge of computational catalysis of the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) and by the National Energy Research Scientific Computing Center (NERSC).

  4. Center of Excellence in Space Data and Information Sciences Annual Report, Year 9

    DTIC Science & Technology

    1997-06-01

    faculty at INSEAD in Paris, France, THESEUS in Sophia Antipolis, and the London Business School. Gave an invited presentation and chaired a panel at...the conference on digital cash held at THESEUS. Visited the Department of Computer Science at Johns Hopkins University with Nabil Adam (Rutgers

  5. 78 FR 78779 - Fisheries of the Caribbean, Gulf of Mexico, and South Atlantic; Revisions to Headboat Reporting...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-27

    ... revisions require fishing records to be submitted electronically (via computer or internet) on a weekly basis or at intervals shorter than a week if notified by the NMFS' Southeast Fisheries Science Center (SEFSC) Science and Research Director (SRD), and prohibits ...

  6. Applied Information Systems Research Program Workshop

    NASA Technical Reports Server (NTRS)

    Bredekamp, Joe

    1991-01-01

    Viewgraphs from the Applied Information Systems Research Program Workshop are presented. Topics covered include: the Earth Observing System Data and Information System; the Planetary Data System; the Astrophysics Data System project review; OAET Computer Science and Data Systems Programs; the Center of Excellence in Space Data and Information Sciences; and CASIS background.

  7. Computational structures technology and UVA Center for CST

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    1992-01-01

    Rapid advances in computer hardware have had a profound effect on various engineering and mechanics disciplines, including the materials, structures, and dynamics disciplines. A new technology, computational structures technology (CST), has recently emerged as an insightful blend between material modeling, structural and dynamic analysis and synthesis on the one hand, and other disciplines such as computer science, numerical analysis, and approximation theory, on the other hand. CST is an outgrowth of finite element methods developed over the last three decades. The focus of this presentation is on some aspects of CST which can impact future airframes and propulsion systems, as well as on the newly established University of Virginia (UVA) Center for CST. The background and goals for CST are described along with the motivations for developing CST, and a brief discussion is made on computational material modeling. We look at the future in terms of technical needs, computing environment, and research directions. The newly established UVA Center for CST is described. One of the research projects of the Center is described, and a brief summary of the presentation is given.

  8. Jennifer Southerland | NREL

    Science.gov Websites

    Jennifer Southerland, Professional II-Project Assistant, Jennifer.Southerland@nrel.gov | 303-275-4065. Jennifer Southerland is a project assistant with the Computational Science Center, where ...

  9. Senior Computational Scientist | Center for Cancer Research

    Cancer.gov

    The Basic Science Program (BSP) pursues independent, multidisciplinary research in basic and applied molecular biology, immunology, retrovirology, cancer biology, and human genetics. Research efforts and support are an integral part of the Center for Cancer Research (CCR) at the Frederick National Laboratory for Cancer Research (FNLCR). The Cancer & Inflammation Program (CIP),

  10. Berkeley Lab - Materials Sciences Division

    Science.gov Websites

    Computational Study of Excited-State Phenomena in Energy Materials; Center for X-ray Optics; MSD Facilities; Ion Beam Analysis. Behavior of Lithium Metal across a Rigid Block Copolymer Electrolyte Membrane. Journal of the ...

  11. Animated computer graphics models of space and earth sciences data generated via the massively parallel processor

    NASA Technical Reports Server (NTRS)

    Treinish, Lloyd A.; Gough, Michael L.; Wildenhain, W. David

    1987-01-01

    The capability was developed to rapidly produce visual representations of large, complex, multi-dimensional space and Earth sciences data sets via the implementation of computer graphics modeling techniques on the Massively Parallel Processor (MPP), employing techniques recently developed for typically non-scientific applications. Such capabilities can provide a new and valuable tool for the understanding of complex scientific data, and a new application of parallel computing via the MPP. A prototype system with such capabilities was developed and integrated into the National Space Science Data Center's (NSSDC) Pilot Climate Data System (PCDS) data-independent environment for computer graphics data display to provide easy access to users. While developing these capabilities, several problems had to be solved independently of the actual use of the MPP, all of which are outlined.

  12. GES DISC Data Recipes in Jupyter Notebooks

    NASA Astrophysics Data System (ADS)

    Li, A.; Banavige, B.; Garimella, K.; Rice, J.; Shen, S.; Liu, Z.

    2017-12-01

    The Earth Science Data and Information System (ESDIS) Project manages twelve Distributed Active Archive Centers (DAACs), which are geographically dispersed across the United States. The DAACs are responsible for ingesting, processing, archiving, and distributing Earth science data produced from various sources (satellites, aircraft, field measurements, etc.). In response to projections of an exponential increase in data production, there has been a recent effort to prototype various DAAC activities in the cloud computing environment. This, in turn, led to the creation of an initiative, called the Cloud Analysis Toolkit to Enable Earth Science (CATEES), to develop a Python software package to transition Earth science data processing to the cloud. This project, in particular, supports CATEES and has two primary goals. One, transition data recipes created by the Goddard Earth Sciences Data and Information Services Center (GES DISC) DAAC into an interactive and educational environment using Jupyter Notebooks. Two, acclimate Earth scientists to cloud computing. To accomplish these goals, we create Jupyter Notebooks to compartmentalize the different steps of data analysis and help users obtain and parse data from the command line. We also develop a Docker container, comprised of Jupyter Notebooks, Python library dependencies, and command line tools, and configure it into an easy-to-deploy package. The end result is an end-to-end product that simulates the use case of end users working in the cloud computing environment.
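
    As a hedged sketch of what one such notebook cell might look like (the URL and variable name below are placeholders, not an actual GES DISC recipe), a recipe typically downloads a granule over HTTPS and opens it with a NetCDF reader:

        # Illustrative data-recipe cell: fetch a NetCDF granule, read a variable.
        # GRANULE_URL and "Precipitation" are hypothetical placeholders.
        import requests
        from netCDF4 import Dataset

        GRANULE_URL = "https://example.gov/data/sample_granule.nc4"

        resp = requests.get(GRANULE_URL, timeout=60)
        resp.raise_for_status()
        with open("granule.nc4", "wb") as f:
            f.write(resp.content)

        with Dataset("granule.nc4") as nc:           # parse the NetCDF file
            print(list(nc.variables))                # inspect available variables
            data = nc.variables["Precipitation"][:]  # hypothetical variable name
            print(data.shape)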

  13. Building the interspace: Digital library infrastructure for a University Engineering Community

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schatz, B.

    A large-scale digital library is being constructed and evaluated at the University of Illinois, with the goal of bringing professional search and display to Internet information services. A testbed planned to grow to 10K documents and 100K users is being constructed in the Grainger Engineering Library Information Center, as a joint effort of the University Library and the National Center for Supercomputing Applications (NCSA), with evaluation and research by the Graduate School of Library and Information Science and the Department of Computer Science. The electronic collection will be articles from engineering and science journals and magazines, obtained directly from publishers in SGML format and displayed containing all text, figures, tables, and equations. The publisher partners include IEEE Computer Society, AIAA (Aerospace Engineering), American Physical Society, and Wiley & Sons. The software will be based upon NCSA Mosaic as a network engine connected to commercial SGML displayers and full-text searchers. The users will include faculty/students across the midwestern universities in the Big Ten, with evaluations via interviews, surveys, and transaction logs. Concurrently, research into scaling the testbed is being conducted. This includes efforts in computer science, information science, library science, and information systems. These efforts will evaluate different semantic retrieval technologies, including automatic thesaurus and subject classification graphs. New architectures will be designed and implemented for a next generation digital library infrastructure, the Interspace, which supports interaction with information spread across information spaces within the Net.

  14. Examining the Computer Self-Efficacy Perceptions of Gifted Students

    ERIC Educational Resources Information Center

    Kaplan, Abdullah; Öztürk, Mesut; Doruk, Muhammet; Yilmaz, Alper

    2013-01-01

    This study was conducted in order to determine the computer self-efficacy perceptions of gifted students. The research group of this study is composed of gifted students (N = 36) who were studying at the Science and Arts Center in Gümüshane province in the spring semester of the 2012-2013 academic year. The "Computer Self-Efficacy Perception…

  15. Provision of Information to the Research Staff.

    ERIC Educational Resources Information Center

    Williams, Martha E.

    The Information Sciences section at Illinois Institute of Technology Research Institute (IITRI) is now operating a Computer Search Center (CSC) for handling numerous machine-readable data bases. The computer programs are generalized in the sense that they will handle any incoming data base. This is accomplished by means of a preprocessor system…

  16. 2000 FIRST Robotics Competition

    NASA Technical Reports Server (NTRS)

    Purman, Richard

    2000-01-01

    The New Horizons Regional Education Center (NHREC) in Hampton, VA sought and received NASA funding to support its participation in the 2000 FIRST Robotics competition. FIRST, Inc. (For Inspiration and Recognition of Science and Technology) is an organization which encourages the application of creative science, math, and computer science principles to solve real-world engineering problems. The FIRST competition is an international engineering contest featuring high school, government, and business partnerships.

  17. The 1984 NASA/ASEE summer faculty fellowship program

    NASA Technical Reports Server (NTRS)

    Mcinnis, B. C.; Duke, M. B.; Crow, B.

    1984-01-01

    An overview is given of the program management and activities. Participants and research advisors are listed. Abstracts describe and present results of research assignments performed by 31 fellows either at the Johnson Space Center, at the White Sands Test Facility, or at the California Space Institute in La Jolla. Disciplines studied include engineering; biology/life sciences; Earth sciences; chemistry; mathematics/statistics/computer sciences; and physics/astronomy.

  18. Singularity: Scientific containers for mobility of compute.

    PubMed

    Kurtzer, Gregory M; Sochat, Vanessa; Bauer, Michael W

    2017-01-01

    Here we present Singularity, software developed to bring containers and reproducibility to scientific computing. Using Singularity containers, developers can work in reproducible environments of their choosing and design, and these complete environments can easily be copied and executed on other platforms. Singularity is an open source initiative that harnesses the expertise of system and software engineers and researchers alike, and integrates seamlessly into common workflows for both of these groups. As its primary use case, Singularity brings mobility of computing to both users and HPC centers, providing a secure means to capture and distribute software and compute environments. This ability to create and deploy reproducible environments across these centers, a previously unmet need, makes Singularity a game changing development for computational science.
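
    The paper describes the workflow rather than prescribing commands, but a typical use (sketched here by driving the singularity CLI from Python; the image name and command are illustrative placeholders) is to pull a container image once and then execute a pipeline step inside it, so the same environment moves between a laptop and an HPC center.

        # Sketch of driving Singularity from Python; assumes the `singularity`
        # CLI is installed. The image and command are placeholders.
        import subprocess

        def run(cmd):
            print("+", " ".join(cmd))
            subprocess.run(cmd, check=True)

        # Build a local .sif image from a public Docker image (done once).
        run(["singularity", "pull", "analysis.sif", "docker://python:3.11-slim"])

        # Execute a reproducible pipeline step inside the container.
        run(["singularity", "exec", "analysis.sif", "python", "--version"])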

  19. Singularity: Scientific containers for mobility of compute

    PubMed Central

    Kurtzer, Gregory M.; Bauer, Michael W.

    2017-01-01

    Here we present Singularity, software developed to bring containers and reproducibility to scientific computing. Using Singularity containers, developers can work in reproducible environments of their choosing and design, and these complete environments can easily be copied and executed on other platforms. Singularity is an open source initiative that harnesses the expertise of system and software engineers and researchers alike, and integrates seamlessly into common workflows for both of these groups. As its primary use case, Singularity brings mobility of computing to both users and HPC centers, providing a secure means to capture and distribute software and compute environments. This ability to create and deploy reproducible environments across these centers, a previously unmet need, makes Singularity a game changing development for computational science. PMID:28494014

  20. Archiving and access systems for remote sensing: Chapter 6

    USGS Publications Warehouse

    Faundeen, John L.; Percivall, George; Baros, Shirley; Baumann, Peter; Becker, Peter H.; Behnke, J.; Benedict, Karl; Colaiacomo, Lucio; Di, Liping; Doescher, Chris; Dominguez, J.; Edberg, Roger; Ferguson, Mark; Foreman, Stephen; Giaretta, David; Hutchison, Vivian; Ip, Alex; James, N.L.; Khalsa, Siri Jodha S.; Lazorchak, B.; Lewis, Adam; Li, Fuqin; Lymburner, Leo; Lynnes, C.S.; Martens, Matt; Melrose, Rachel; Morris, Steve; Mueller, Norman; Navale, Vivek; Navulur, Kumar; Newman, D.J.; Oliver, Simon; Purss, Matthew; Ramapriyan, H.K.; Rew, Russ; Rosen, Michael; Savickas, John; Sixsmith, Joshua; Sohre, Tom; Thau, David; Uhlir, Paul; Wang, Lan-Wei; Young, Jeff

    2016-01-01

    Focuses on major developments inaugurated by the Committee on Earth Observation Satellites, the Group on Earth Observations System of Systems, and the International Council for Science World Data System at the global level; initiatives at national levels to create data centers (e.g. the National Aeronautics and Space Administration (NASA) Distributed Active Archive Centers and other international space agency counterparts), and non-government systems (e.g. Center for International Earth Science Information Network). Other major elements focus on emerging tool sets, requirements for metadata, data storage and refresh methods, the rise of cloud computing, and questions about what and how much data should be saved. The sub-sections of the chapter address topics relevant to the science, engineering and standards used for state-of-the-art operational and experimental systems.

  1. Insight Center | Computational Science | NREL

    Science.gov Websites

    effectively convey information and illustrate research findings to stakeholders and visitors. ... turbine array simulations. Observational data span from the nanostructures of biomass pretreatments to the ...

  2. Activities at the Lunar and Planetary Institute

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The activities of the Lunar and Planetary Institute for the period July to December 1984 are discussed. Functions of its departments and projects are summarized. These include: planetary image center; library information center; computer center; production services; scientific staff; visitors program; scientific projects; conferences; workshops; seminars; publications and communications; panels, teams, committees and working groups; NASA-AMES vertical gun range (AVGR); and lunar and planetary science council.

  3. The space physics analysis network

    NASA Astrophysics Data System (ADS)

    Green, James L.

    1988-04-01

    The Space Physics Analysis Network, or SPAN, is emerging as a viable method for solving an immediate communication problem for space and Earth scientists and has been operational for nearly 7 years. SPAN, with its extension into Europe, utilizes computer-to-computer communications allowing mail, binary and text file transfer, and remote logon capability to over 1000 space science computer systems. The network has been used to successfully transfer real-time data to remote researchers for rapid data analysis but its primary function is for non-real-time applications. One of the major advantages of using SPAN is its spacecraft mission independence. Space science researchers using SPAN are located in universities, industries and government institutions all across the United States and Europe. These researchers are in such fields as magnetospheric physics, astrophysics, ionospheric physics, atmospheric physics, climatology, meteorology, oceanography, planetary physics and solar physics. SPAN users have access to space and Earth science data bases, mission planning and information systems, and computational facilities for the purposes of facilitating correlative space data exchange, data analysis and space research. For example, the National Space Science Data Center (NSSDC), which manages the network, is providing facilities on SPAN such as the Network Information Center (SPAN NIC). SPAN has interconnections with several national and international networks such as HEPNET and TEXNET, forming a transparent DECnet network. The combined total number of computers now reachable over these combined networks is about 2000. In addition, SPAN supports full function capabilities over the international public packet switched networks (e.g. TELENET) and has mail gateways to ARPANET, BITNET and JANET.

  4. A Hybrid Cloud Computing Service for Earth Sciences

    NASA Astrophysics Data System (ADS)

    Yang, C. P.

    2016-12-01

    Cloud Computing is becoming a norm for providing computing capabilities for advancing Earth sciences, including big Earth data management, processing, analytics, model simulations, and many other aspects. A hybrid spatiotemporal cloud computing service was built at the George Mason NSF spatiotemporal innovation center to meet these demands. This paper will report on the service, covering several aspects: 1) the hardware includes 500 computing servers and close to 2 PB of storage, as well as connections to XSEDE Jetstream and the Caltech experimental cloud computing environment for sharing resources; 2) the cloud service is geographically distributed across the east coast, the west coast, and the central region; 3) the cloud includes private clouds managed using OpenStack and Eucalyptus; DC2 is used to bridge these with the public AWS cloud for interoperability and for sharing computing resources when high demand surges; 4) the cloud service is used to support the NSF EarthCube program through the ECITE project, and ESIP through the ESIP cloud computing cluster, the semantics testbed cluster, and other clusters; 5) the cloud service is also available to the Earth science communities for conducting geoscience research. A brief introduction on how to use the cloud service will be included.
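
    The bursting behavior described in points 3 and 4 (prefer private capacity; spill over to the public cloud when demand spikes) can be paraphrased in a few lines of placement logic. The stub below is purely illustrative, with hypothetical pool names and sizes, and is not the center's actual DC2 implementation.

        # Toy placement policy for a hybrid private/public cloud; the names
        # and capacities are hypothetical, not the center's configuration.
        from dataclasses import dataclass

        @dataclass
        class Pool:
            name: str
            capacity_cores: int
            used_cores: int = 0

            def free(self) -> int:
                return self.capacity_cores - self.used_cores

        def place(job_cores: int, private: Pool, public: Pool) -> str:
            """Prefer the private cloud; burst to the public cloud when full."""
            pool = private if private.free() >= job_cores else public
            pool.used_cores += job_cores
            return pool.name

        private = Pool("openstack-private", capacity_cores=500)
        public = Pool("aws-public", capacity_cores=100_000)
        for cores in (200, 200, 200):  # the third job bursts to the public cloud
            print(place(cores, private, public))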

  5. The creation and early implementation of a high speed fiber optic network for a university health sciences center.

    PubMed Central

    Schueler, J. D.; Mitchell, J. A.; Forbes, S. M.; Neely, R. C.; Goodman, R. J.; Branson, D. K.

    1991-01-01

    In late 1989 the University of Missouri Health Sciences Center began the process of creating an extensive fiber optic network throughout its facilities, with the intent to provide networked computer access to anyone in the Center desiring such access, regardless of geographic location or organizational affiliation. A committee representing all disciplines within the Center produced and, in conjunction with independent consultants, approved a comprehensive design for the network. Installation of network backbone components commenced in the second half of 1990 and was completed in early 1991. As the network entered its initial phases of operation, the first realities of this important new resource began to manifest themselves as enhanced functional capacity in the Health Sciences Center. This paper describes the development of the network, with emphasis on its design criteria, installation, early operation, and management. Also included are discussions on its organizational impact and its evolving significance as a medical community resource. PMID:1807660

  6. Using Scenarios to Design Complex Technology-Enhanced Learning Environments

    ERIC Educational Resources Information Center

    de Jong, Ton; Weinberger, Armin; Girault, Isabelle; Kluge, Anders; Lazonder, Ard W.; Pedaste, Margus; Ludvigsen, Sten; Ney, Muriel; Wasson, Barbara; Wichmann, Astrid; Geraedts, Caspar; Giemza, Adam; Hovardas, Tasos; Julien, Rachel; van Joolingen, Wouter R.; Lejeune, Anne; Manoli, Constantinos C.; Matteman, Yuri; Sarapuu, Tago; Verkade, Alex; Vold, Vibeke; Zacharia, Zacharias C.

    2012-01-01

    Science Created by You (SCY) learning environments are computer-based environments in which students learn about science topics in the context of addressing a socio-scientific problem. Along their way to a solution for this problem students produce many types of intermediate products or learning objects. SCY learning environments center the entire…

  7. Advanced Methodologies for NASA Science Missions

    NASA Astrophysics Data System (ADS)

    Hurlburt, N. E.; Feigelson, E.; Mentzel, C.

    2017-12-01

    Most of NASA's commitment to computational space science involves the organization and processing of Big Data from space-based satellites and the calculation of advanced physical models based on these datasets. But considerable thought is also needed about which computations to perform. The science questions addressed by space data are so diverse and complex that traditional analysis procedures are often inadequate. The knowledge and skills of the statistician, applied mathematician, and algorithmic computer scientist must be incorporated into programs that currently emphasize engineering and physical science. NASA's culture and administrative mechanisms take full cognizance that major advances in space science are driven by improvements in instrumentation. But it is less well recognized that new instruments and science questions give rise to new challenges in the treatment of satellite data after they are telemetered to the ground. These issues might be divided into two stages: data reduction through software pipelines developed within NASA mission centers, and science analysis performed by hundreds of space scientists dispersed throughout NASA, U.S. universities, and abroad. Both stages benefit from the latest statistical and computational methods; in some cases, the science result is completely inaccessible using traditional procedures. This paper will review the current state of practice at NASA and present example applications using modern methodologies.

  8. USGS science in Menlo Park -- a science strategy for the U.S. Geological Survey Menlo Park Science Center, 2005-2015

    USGS Publications Warehouse

    Brocher, Thomas M.; Carr, Michael D.; Halsing, David L.; John, David A.; Langenheim, V.E.; Mangan, Margaret T.; Marvin-DiPasquale, Mark C.; Takekawa, John Y.; Tiedeman, Claire

    2006-01-01

    In the spring of 2004, the U.S. Geological Survey (USGS) Menlo Park Center Council commissioned an interdisciplinary working group to develop a forward-looking science strategy for the USGS Menlo Park Science Center in California (hereafter also referred to as "the Center"). The Center has been the flagship research center for the USGS in the western United States for more than 50 years, and the Council recognizes that science priorities must be the primary consideration guiding critical decisions made about the future evolution of the Center. In developing this strategy, the working group consulted widely within the USGS and with external clients and collaborators, so that most stakeholders had an opportunity to influence the science goals and operational objectives. The Science Goals are to: Natural Hazards: conduct natural-hazard research and assessments critical to effective mitigation planning, short-term forecasting, and event response; Ecosystem Change: develop a predictive understanding of ecosystem change that advances ecosystem restoration and adaptive management; Natural Resources: advance the understanding of natural resources in a geologic, hydrologic, economic, environmental, and global context; and Modeling Earth System Processes: increase and improve capabilities for quantitative simulation, prediction, and assessment of Earth system processes. The strategy presents seven key Operational Objectives with specific actions to achieve the scientific goals. These Operational Objectives are to: provide a hub for technology, laboratories, and library services to support science in the Western Region; increase advanced computing capabilities and promote sharing of these resources; enhance the intellectual diversity, vibrancy, and capacity of the work force through improved recruitment and retention; strengthen client and collaborative relationships in the community at an institutional level; expand monitoring capability by increasing density, sensitivity, and efficiency and reducing costs of instruments and networks; encourage a breadth of scientific capabilities in Menlo Park to foster interdisciplinary science; and communicate USGS science to a diverse audience.

  9. Science at the Goddard Space Flight Center

    NASA Technical Reports Server (NTRS)

    White, Nicholas E.

    2012-01-01

    The Sciences and Exploration Directorate of the NASA Goddard Space Flight Center (GSFC) is the largest Earth and space science research organization in the world. Its scientists advance understanding of the Earth and its life-sustaining environment, the Sun, the solar system, and the wider universe beyond. Researchers in the Sciences and Exploration Directorate work with engineers, computer programmers, technologists, and other team members to develop the cutting-edge technology needed for space-based research. Instruments are also deployed on aircraft, balloons, and Earth's surface. I will give an overview of the current research activities and programs at GSFC, including the James Webb Space Telescope (JWST), future Earth observing programs, and experiments that are exploring our solar system and studying the interaction of the Sun with the Earth's magnetosphere.

  10. Python in the NERSC Exascale Science Applications Program for Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ronaghi, Zahra; Thomas, Rollin; Deslippe, Jack

    We describe a new effort at the National Energy Research Scientific Computing Center (NERSC) in performance analysis and optimization of scientific Python applications targeting the Intel Xeon Phi (Knights Landing, KNL) many-core architecture. The Python-centered work outlined here is part of a larger effort called the NERSC Exascale Science Applications Program (NESAP) for Data. NESAP for Data focuses on applications that process and analyze high-volume, high-velocity data sets from experimental/observational science (EOS) facilities supported by the US Department of Energy Office of Science. We present three case study applications from NESAP for Data that use Python. These codes vary in terms of "Python purity" from applications developed in pure Python to ones that use Python mainly as a convenience layer for scientists without expertise in lower-level programming languages like C, C++ or Fortran. The science case, requirements, constraints, algorithms, and initial performance optimizations for each code are discussed. Our goal with this paper is to contribute to the larger conversation around the role of Python in high-performance computing today and tomorrow, highlighting areas for future work and emerging best practices.
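
    As an illustration of the kind of optimization such profiling typically motivates (this example is ours, not from the paper), moving a hot loop out of the Python interpreter into a single vectorized NumPy call lets the work run in compiled, SIMD-friendly kernels, which matters especially on many-core parts like KNL:

    ```python
    # Illustration (ours, not from the paper) of a first-pass optimization:
    # replace a per-element interpreter loop with one vectorized NumPy call.
    import time
    import numpy as np

    x = np.random.rand(1_000_000)

    def norm_python(a):
        total = 0.0
        for v in a:                    # one interpreter dispatch per element
            total += v * v
        return total ** 0.5

    def norm_numpy(a):
        return np.sqrt(np.dot(a, a))   # single call into an optimized kernel

    for fn in (norm_python, norm_numpy):
        t0 = time.perf_counter()
        fn(x)
        print(f"{fn.__name__}: {time.perf_counter() - t0:.4f} s")
    ```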

  11. Teaching Technology with Technology. An Off-the-Shelf Robotics Course Builds Technical Center Enrollment.

    ERIC Educational Resources Information Center

    Hannemann, Jim; Rice, Thomas R.

    1991-01-01

    At the Oakland Technical Center, which provides vocational programs for nine Michigan high schools, a one-semester course in Foundations of Technology Systems uses a computer-simulated manufacturing environment to teach applied math, science, language arts, communication skills, problem solving, and teamwork in the context of technology education.…

  12. The new space and earth science information systems at NASA's archive

    NASA Technical Reports Server (NTRS)

    Green, James L.

    1990-01-01

    The on-line interactive systems of the National Space Science Data Center (NSSDC) are examined. The worldwide computer network connections that allow access to NSSDC users are outlined. The services offered by the NSSDC new technology on-line systems are presented, including the IUE request system, ozone TOMS data, and data sets on astrophysics, atmospheric science, land sciences, and space plasma physics. Plans for future increases in the NSSDC data holdings are considered.

  13. The new space and Earth science information systems at NASA's archive

    NASA Technical Reports Server (NTRS)

    Green, James L.

    1990-01-01

    The on-line interactive systems of the National Space Science Data Center (NSSDC) are examined. The worldwide computer network connections that allow access to NSSDC users are outlined. The services offered by the NSSDC new technology on-line systems are presented, including the IUE request system, Total Ozone Mapping Spectrometer (TOMS) data, and data sets on astrophysics, atmospheric science, land sciences, and space plasma physics. Plans for future increases in the NSSDC data holdings are considered.

  14. Realistic Covariance Prediction for the Earth Science Constellation

    NASA Technical Reports Server (NTRS)

    Duncan, Matthew; Long, Anne

    2006-01-01

    Routine satellite operations for the Earth Science Constellation (ESC) include collision risk assessment between members of the constellation and other orbiting space objects. One component of the risk assessment process is computing the collision probability between two space objects. The collision probability is computed using Monte Carlo techniques as well as by numerically integrating relative state probability density functions. Each algorithm takes as inputs state vector and state vector uncertainty information for both objects. The state vector uncertainty information is expressed in terms of a covariance matrix. The collision probability computation is only as good as the inputs. Therefore, to obtain a collision calculation that is a useful decision-making metric, realistic covariance matrices must be used as inputs to the calculation. This paper describes the process used by the NASA/Goddard Space Flight Center's Earth Science Mission Operations Project to generate realistic covariance predictions for three of the Earth Science Constellation satellites: Aqua, Aura and Terra.
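
    A minimal sketch of the Monte Carlo component of that computation, assuming Gaussian relative-position uncertainty; the miss vector, combined covariance, and hard-body radius below are illustrative placeholders, not ESC values:

    ```python
    # Sketch of a Monte Carlo collision-probability estimate as described
    # above: sample relative positions from the combined covariance and count
    # the fraction inside the combined hard-body sphere. Values are made up.
    import numpy as np

    rng = np.random.default_rng(0)

    def collision_probability(miss, cov, hard_body_radius, n=1_000_000):
        """Fraction of sampled relative positions that fall inside the
        combined hard-body sphere at the time of closest approach."""
        samples = rng.multivariate_normal(miss, cov, size=n)
        distances = np.linalg.norm(samples, axis=1)
        return float(np.mean(distances < hard_body_radius))

    miss = np.array([120.0, 40.0, 25.0])      # relative position (m)
    cov = np.diag([80.0, 60.0, 30.0]) ** 2    # combined position covariance (m^2)
    print(collision_probability(miss, cov, hard_body_radius=20.0))
    ```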

  15. Issues and recommendations associated with distributed computation and data management systems for the space sciences

    NASA Technical Reports Server (NTRS)

    1986-01-01

    The primary purpose of the report is to explore management approaches and technology developments for computation and data management systems designed to meet future needs in the space sciences. The report builds on work presented in previous reports on the solar-terrestrial and planetary sciences, broadening the outlook to all of the space sciences and considering policy aspects related to coordination between data centers, missions, and ongoing research activities, because the rapid growth of data and the wide geographic distribution of relevant facilities are expected to present especially troublesome problems for data archiving, distribution, and analysis.

  16. Computational Toxicology at the US EPA | Science Inventory ...

    EPA Pesticide Factsheets

    Computational toxicology is the application of mathematical and computer models to help assess chemical hazards and risks to human health and the environment. Supported by advances in informatics, high-throughput screening (HTS) technologies, and systems biology, EPA is developing robust and flexible computational tools that can be applied to the thousands of chemicals in commerce, and contaminant mixtures found in America's air, water, and hazardous-waste sites. The ORD Computational Toxicology Research Program (CTRP) is composed of three main elements. The largest component is the National Center for Computational Toxicology (NCCT), which was established in 2005 to coordinate research on chemical screening and prioritization, informatics, and systems modeling. The second element consists of related activities in the National Health and Environmental Effects Research Laboratory (NHEERL) and the National Exposure Research Laboratory (NERL). The third and final component consists of academic centers working on various aspects of computational toxicology and funded by the EPA Science to Achieve Results (STAR) program. Key intramural projects of the CTRP include digitizing legacy toxicity testing information into the toxicity reference database (ToxRefDB), predicting toxicity (ToxCast™) and exposure (ExpoCast™), and creating virtual liver (v-Liver™) and virtual embryo (v-Embryo™) systems models. The models and underlying data are being made publicly available.

  17. A Distributed User Information System

    DTIC Science & Technology

    1990-03-01

    Department of Computer Science, University of Maryland, College Park, MD 20742, March 1990. Abstract: Current user information database technology... References include ACM Transactions on Computer Systems, May 1988, and [Sol89] K. Sollins, "A plan for internet directory services," Technical report, DDN Network Information Center. Authors: Steven D. Miller, Scott Carson, and Leo Mark, Institute for Advanced Computer Studies and Department of Computer Science.

  18. Expanding the Scope of High-Performance Computing Facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Uram, Thomas D.; Papka, Michael E.

    The high-performance computing centers of the future will expand their roles as service providers, and as the machines scale up, so should the sizes of the communities they serve. National facilities must cultivate their users as much as they focus on operating machines reliably. The authors present five interrelated topic areas that are essential to expanding the value provided to those performing computational science.

  19. Evolution of the Virtualized HPC Infrastructure of Novosibirsk Scientific Center

    NASA Astrophysics Data System (ADS)

    Adakin, A.; Anisenkov, A.; Belov, S.; Chubarov, D.; Kalyuzhny, V.; Kaplin, V.; Korol, A.; Kuchin, N.; Lomakin, S.; Nikultsev, V.; Skovpen, K.; Sukharev, A.; Zaytsev, A.

    2012-12-01

    Novosibirsk Scientific Center (NSC), also known worldwide as Akademgorodok, is one of the largest Russian scientific centers, hosting Novosibirsk State University (NSU) and more than 35 research organizations of the Siberian Branch of the Russian Academy of Sciences, including the Budker Institute of Nuclear Physics (BINP), the Institute of Computational Technologies, and the Institute of Computational Mathematics and Mathematical Geophysics (ICM&MG). Since each institute has specific requirements for the architecture of the computing farms involved in its research field, several computing facilities are currently hosted by NSC institutes, each optimized for a particular set of tasks; the largest of these are the NSU Supercomputer Center, the Siberian Supercomputer Center (ICM&MG), and the Grid Computing Facility of BINP. A dedicated optical network with an initial bandwidth of 10 Gb/s connecting these three facilities was built to make it possible to share computing resources among the research communities, thus increasing the efficiency of operating the existing computing facilities and offering a common platform for building the computing infrastructure for future scientific projects. Unification of the computing infrastructure is achieved by extensive use of virtualization technology based on the XEN and KVM platforms. This contribution gives a thorough review of the present status and future development prospects of the NSC virtualized computing infrastructure and the experience gained while using it to run production data analysis jobs related to HEP experiments carried out at BINP, especially the KEDR detector experiment at the VEPP-4M electron-positron collider.

  20. Research and technology at the Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Cryogenic engineering, hypergolic engineering, hazardous warning, structures and mechanics, computer sciences, communications, meteorology, technology applications, safety engineering, materials analysis, biomedicine, and engineering management and training aids research are reviewed.

  1. Data Processing Center of Radioastron Project: 3 years of operation.

    NASA Astrophysics Data System (ADS)

    Shatskaya, Marina

    The ASC Data Processing Center (DPC) of the Radioastron Project is a fail-safe, centralized complex of interconnected software and hardware components along with organizational procedures. The tasks facing the scientific data processing center are the organization of service information exchange, the collection of scientific data, the storage of all scientific data, and science-oriented data processing. The DPC takes part in informational exchange with two tracking stations in Pushchino (Russia) and Green Bank (USA), about 30 ground telescopes, the ballistic center, tracking headquarters, and the session scheduling center. Enormous flows of information go to the Astro Space Center. To handle these enormous data volumes, we developed specialized network infrastructure, Internet channels, and storage. The computer complex has been designed at the Astro Space Center (ASC) of the Lebedev Physical Institute and includes: 800 TB of on-line storage; a 2000 TB hard drive archive; a backup system on magnetic tapes (2000 TB); 24 TB of redundant storage at Pushchino Radio Astronomy Observatory; Web and FTP servers; and DPC management and data transmission networks. The structure and functions of the ASC Data Processing Center are fully adequate to the data processing requirements of the Radioastron Mission and have been successfully confirmed during the Fringe Search, the Early Science Program, and the first year of the Key Science Program.

  2. A Cellular Automata Approach to Computer Vision and Image Processing.

    DTIC Science & Technology

    1980-09-01

    Reference fragments: ...the ACM, vol. 15, no. 9, pp. 827-837; [Duda and Hart] R. O. Duda and P. E. Hart, Pattern Classification and Scene Analysis, Wiley, New York, 1973; ...Center TR-738, 1979; [Farley] Arthur M. Farley and Andrzej Proskurowski, "Gossiping in Grid Graphs", University of Oregon Computer Science Department CS-TR.

  3. Collaborative Research Goes to School: Guided Inquiry with Computers in Classrooms. Technical Report.

    ERIC Educational Resources Information Center

    Wiske, Martha Stone; And Others

    Twin aims--to advance theory and to improve practice in science, mathematics, and computing education--guided the Educational Technology Center's (ETC) research from its inception in 1983. These aims led ETC to establish collaborative research groups in which people whose primary interest was classroom teaching and learning, and researchers…

  4. White paper: A plan for cooperation between NASA and DARPA to establish a center for advanced architectures

    NASA Technical Reports Server (NTRS)

    Denning, P. J.; Adams, G. B., III; Brown, R. L.; Kanerva, P.; Leiner, B. M.; Raugh, M. R.

    1986-01-01

    Large, complex computer systems require many years of development. It is recognized that large scale systems are unlikely to be delivered in useful condition unless users are intimately involved throughout the design process. A mechanism is described that will involve users in the design of advanced computing systems and will accelerate the insertion of new systems into scientific research. This mechanism is embodied in a facility called the Center for Advanced Architectures (CAA). CAA would be a division of RIACS (Research Institute for Advanced Computer Science) and would receive its technical direction from a Scientific Advisory Board established by RIACS. The CAA described here is a possible implementation of a center envisaged in a proposed cooperation between NASA and DARPA.

  5. Lobachevsky Year at Kazan University: Center of Science, Education, Intellectual-Cognitive Tourism "Kazan - GeoNa - 2020+" and "Kazan-Moon-2020+" projects

    NASA Astrophysics Data System (ADS)

    Gusev, A.; Trudkova, N.

    2017-09-01

    Center "GeoNa" will enable scientists and teachers of the Russian universities to join to advanced achievements of a science, information technologies; to establish scientific communications with foreign colleagues in sphere of the high technology, educational projects and Intellectual-Cognitive Tourism. The Project "Kazan - Moon - 2020+" is directed on the decision of fundamental problems of celestial mechanics, selenodesy and geophysics of the Moon(s) connected to carrying out of complex theoretical researches and computer modelling.

  6. Photometric analysis in the Kepler Science Operations Center pipeline

    NASA Astrophysics Data System (ADS)

    Twicken, Joseph D.; Clarke, Bruce D.; Bryson, Stephen T.; Tenenbaum, Peter; Wu, Hayley; Jenkins, Jon M.; Girouard, Forrest; Klaus, Todd C.

    2010-07-01

    We describe the Photometric Analysis (PA) software component and its context in the Kepler Science Operations Center (SOC) Science Processing Pipeline. The primary tasks of this module are to compute the photometric flux and photocenters (centroids) for over 160,000 long cadence (~thirty minute) and 512 short cadence (~one minute) stellar targets from the calibrated pixels in their respective apertures. We discuss science algorithms for long and short cadence PA: cosmic ray cleaning; background estimation and removal; aperture photometry; and flux-weighted centroiding. We discuss the end-to-end propagation of uncertainties for the science algorithms. Finally, we present examples of photometric apertures, raw flux light curves, and centroid time series from Kepler flight data. PA light curves, centroid time series, and barycentric timestamp corrections are exported to the Multi-mission Archive at Space Telescope [Science Institute] (MAST) and are made available to the general public in accordance with the NASA/Kepler data release policy.
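
    As a toy illustration (not the SOC code itself), the two core PA operations named above reduce to a few lines of array arithmetic: simple aperture photometry sums the background-subtracted pixels in the target aperture, and flux-weighted centroiding takes the flux-weighted mean of the pixel coordinates. The 3x3 aperture and background level below are made-up values:

    ```python
    # Toy sketch of simple aperture photometry and flux-weighted centroiding
    # for a single cadence; pixel values and background level are invented.
    import numpy as np

    def aperture_photometry(pixels, background):
        """Sum the background-subtracted flux over the target aperture."""
        return np.sum(pixels - background)

    def flux_weighted_centroid(pixels):
        """Centroid = flux-weighted mean pixel coordinate (row, column)."""
        rows, cols = np.indices(pixels.shape)
        total = np.sum(pixels)
        return np.sum(rows * pixels) / total, np.sum(cols * pixels) / total

    pixels = np.array([[10.0, 20.0, 10.0],     # 3x3 aperture, one cadence
                       [20.0, 60.0, 20.0],
                       [10.0, 20.0, 10.0]])
    print(aperture_photometry(pixels, background=5.0))
    print(flux_weighted_centroid(pixels))
    ```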

  7. International Space Station (ISS)

    NASA Image and Video Library

    2001-02-01

    The Payload Operations Center (POC) is the science command post for the International Space Station (ISS). Located at NASA's Marshall Space Flight Center in Huntsville, Alabama, it is the focal point for American and international science activities aboard the ISS. The POC's unique capabilities allow science experts and researchers around the world to perform cutting-edge science in the unique microgravity environment of space. The POC is staffed around the clock by shifts of payload flight controllers. At any given time, 8 to 10 flight controllers are on consoles operating, planning for, and controlling various systems and payloads. This photograph shows a Payload Rack Officer (PRO) at a work station. The PRO is linked by a computer to all payload racks aboard the ISS. The PRO monitors and configures the resources and environment for science experiments including EXPRESS Racks, multiple-payload racks designed for commercial payloads.

  8. New project to support scientific collaboration electronically

    NASA Astrophysics Data System (ADS)

    Clauer, C. R.; Rasmussen, C. E.; Niciejewski, R. J.; Killeen, T. L.; Kelly, J. D.; Zambre, Y.; Rosenberg, T. J.; Stauning, P.; Friis-Christensen, E.; Mende, S. B.; Weymouth, T. E.; Prakash, A.; McDaniel, S. E.; Olson, G. M.; Finholt, T. A.; Atkins, D. E.

    A new multidisciplinary effort is linking research in the upper atmospheric and space, computer, and behavioral sciences to develop a prototype electronic environment for conducting team science worldwide. A real-world electronic collaboration testbed has been established to support scientific work centered around the experimental operations being conducted with instruments from the Sondrestrom Upper Atmospheric Research Facility in Kangerlussuaq, Greenland. Such group computing environments will become an important component of the National Information Infrastructure initiative, which is envisioned as the high-performance communications infrastructure to support national scientific research.

  9. A Vote for Election Science as an Academic Discipline

    ERIC Educational Resources Information Center

    Foster, Andrea L.

    2006-01-01

    This article presents the suggestion of Merle S. King, chairman of the department of computer science and information systems at Kennesaw State University and also a director of Kennesaw State's Center for Elections Systems, which has helped establish a uniform statewide voting system in Georgia. On the last day of the conference sponsored by the…

  10. 78 FR 59641 - Fisheries of the Caribbean, Gulf of Mexico, and South Atlantic; Revisions to Headboat Reporting...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-27

    ... revisions would require fishing records to be submitted electronically (via computer or Internet) on a weekly basis or at intervals shorter than a week if notified by the NMFS' Southeast Fisheries Science Center (SEFSC) Science and Research Director (SRD), and would prohibit headboats from continuing to fish...

  11. Caleb Phillips | NREL

    Science.gov Websites

    Dr. Caleb Phillips is a data scientist with the Computational Science Center at NREL. His publications include work in Statistical Analysis and Data Mining: The ASA Data Science Journal (2017), and he has applied GIS-based methods and lidar techniques to the problem of large-area coverage mapping for wireless networks.

  12. Record Number of Summer Students Work at Ames in 2014

    NASA Image and Video Library

    2014-09-16

    NASA's Ames Research Center concluded the 2014 summer student program session that featured a record number of participants from around the globe. More than 1,100 students with high school- to graduate-level education took part in a wide variety of science activities. Some of the activities included robotics, aeronautics, biology, computer science, engineering and astrophysics.

  13. A Science Information Infrastructure for Access to Earth and Space Science Data through the Nation's Science Museums

    NASA Technical Reports Server (NTRS)

    Murray, S.

    1999-01-01

    In this project, we worked with the University of California at Berkeley/Center for Extreme Ultraviolet Astrophysics and five science museums (the National Air and Space Museum, the Science Museum of Virginia, the Lawrence Hall of Science, the Exploratorium, and the New York Hall of Science) to formulate plans for computer-based laboratories located at these museums. These Science Learning Laboratories would be networked and provided with real Earth and space science observations, as well as appropriate lesson plans, that would allow the general public to directly access and manipulate actual remote sensing data, much as a scientist would.

  14. GaAs Computer Technology

    DTIC Science & Technology

    1992-01-07

    AD-A259 259. Foreign Aerospace Science and Technology Center, FASTC-ID(RS)T-0310-92, human translation, 7 January 1993. GaAs Computer Technology (1), by Wang Qiao-yu (Li-Shan Microelectronics Institute); English pages: 6. Distribution unlimited; reproduced from the best quality copy available. Abstract: The paper...

  15. Multicore: Fallout from a Computing Evolution

    ScienceCinema

    Yelick, Kathy [Director, NERSC]

    2017-12-09

    July 22, 2008 Berkeley Lab lecture: Parallel computing used to be reserved for big science and engineering projects, but in two years that's all changed. Even laptops and hand-helds use parallel processors. Unfortunately, the software hasn't kept pace. Kathy Yelick, Director of the National Energy Research Scientific Computing Center at Berkeley Lab, describes the resulting chaos and the computing community's efforts to develop exciting applications that take advantage of tens or hundreds of processors on a single chip.

  16. The Future is Hera! Analyzing Astronomical Data Over the Internet

    NASA Technical Reports Server (NTRS)

    Valencic, L. A.; Chai, P.; Pence, W.; Shafer, R.; Snowden, S.

    2008-01-01

    Hera is the data processing facility provided by the High Energy Astrophysics Science Archive Research Center (HEASARC) at the NASA Goddard Space Flight Center for analyzing astronomical data. Hera provides all the pre-installed software packages, local disk space, and computing resources needed to do general processing of FITS-format data files residing on the user's local computer, and to do research using the publicly available data from the High Energy Astrophysics Division. Qualified students, educators, and researchers may freely use the Hera services over the internet for research and educational purposes.

  17. Evolving Storage and Cyber Infrastructure at the NASA Center for Climate Simulation

    NASA Technical Reports Server (NTRS)

    Salmon, Ellen; Duffy, Daniel; Spear, Carrie; Sinno, Scott; Vaughan, Garrison; Bowen, Michael

    2018-01-01

    This talk will describe recent developments at the NASA Center for Climate Simulation, which is funded by NASA's Science Mission Directorate, and supports the specialized data storage and computational needs of weather, ocean, and climate researchers, as well as astrophysicists, heliophysicists, and planetary scientists. To meet requirements for higher-resolution, higher-fidelity simulations, the NCCS augments its High Performance Computing (HPC) and storage/retrieval environment. As the petabytes of model and observational data grow, the NCCS is broadening its data services offerings and deploying and expanding virtualization resources for high-performance analytics.

  18. Usability Evaluation of the Student Centered e-Learning Environment

    ERIC Educational Resources Information Center

    Junus, Inas Sofiyah; Santoso, Harry Budi; Isal, R. Yugo K.; Utomo, Andika Yudha

    2015-01-01

    Student Centered e-Learning Environment (SCeLE) has substantial roles to support learning activities at Faculty of Computer Science, Universitas Indonesia (Fasilkom UI). Although it has been utilized for about 10 years, the usability aspect of SCeLE as an e-Learning system has not been evaluated. Therefore, the usability aspects of SCeLE Fasilkom…

  19. An Object-Oriented Software Reuse Tool

    DTIC Science & Technology

    1989-04-01

    Report documentation fragments: ...Square, Cambridge, MA 02139; controlling office: Advanced Research Projects Agency, 1400 Wilson Blvd., report date April 1989; Office of Naval Research, Information Systems, Arlington, VA 22217; UNCLASSIFIED. Distribution: Defense Technical Information Center; Computer Sciences Division, ONR Code 1133; Navy Center for Applied Research in Artificial Intelligence.

  20. Integrating Intelligent Systems Domain Knowledge Into the Earth Science Curricula

    NASA Astrophysics Data System (ADS)

    Güereque, M.; Pennington, D. D.; Pierce, S. A.

    2017-12-01

    High-volume heterogeneous datasets are becoming ubiquitous, having migrated to center stage over the last ten years and transcended the boundaries of computationally intensive disciplines to become a fundamental part of every science discipline. Despite the fact that large datasets are now pervasive across industries and academic disciplines, the associated skill set is generally absent from earth science programs. This has left the bulk of the student population without access to curricula that systematically teach appropriate intelligent-systems skills, creating a void for skill sets that should be universal given their need and marketability. While some guidance regarding appropriate computational thinking and pedagogy is appearing, few examples exist where these have been specifically designed and tested within the earth science domain. Furthermore, best practices from learning science have not yet been widely tested for developing intelligent-systems thinking skills. This research developed and tested evidence-based computational skill modules that target this deficit, with the intention of informing the earth science community as it continues to incorporate intelligent systems techniques and reasoning into its research and classrooms.

  1. Educational NASA Computational and Scientific Studies (enCOMPASS)

    NASA Technical Reports Server (NTRS)

    Memarsadeghi, Nargess

    2013-01-01

    Educational NASA Computational and Scientific Studies (enCOMPASS) is an educational project of NASA Goddard Space Flight Center aimed at bridging the gap between the computational objectives and needs of NASA's scientific research, missions, and projects, and academia's latest advances in applied mathematics and computer science. enCOMPASS achieves this goal via bidirectional collaboration and communication between NASA and academia. Using developed NASA Computational Case Studies in university computer science/engineering and applied mathematics classes is a way of addressing NASA's goals of contributing to the Science, Technology, Engineering, and Math (STEM) National Objective. The enCOMPASS Web site at http://encompass.gsfc.nasa.gov provides additional information. There are currently nine enCOMPASS case studies developed in areas of earth sciences, planetary sciences, and astrophysics. Some of these case studies have been published in AIP and IEEE's Computing in Science and Engineering magazines. A few university professors have used enCOMPASS case studies in their computational classes and contributed their findings to NASA scientists. In these case studies, after introducing the science area, the specific problem, and related NASA missions, students are first asked to solve a known problem using NASA data and past approaches, often published in a scientific research paper. Then, after learning about the NASA application and related computational tools and approaches for solving the proposed problem, students are given a harder problem as a challenge for them to research and develop solutions for. This project provides a model for NASA scientists and engineers on one side, and university students, faculty, and researchers in computer science and applied mathematics on the other side, to learn from each other's areas of work, computational needs and solutions, and the latest advances in research and development. This innovation takes NASA science and engineering applications to computer science and applied mathematics university classes, and makes NASA objectives part of the university curricula. There is great potential for growth and return on investment of this program to the point where every major university in the U.S. would use at least one of these case studies in one of their computational courses, and where every NASA scientist and engineer facing a computational challenge (without having resources or expertise to solve it) would use enCOMPASS to formulate the problem as a case study, provide it to a university, and get back their solutions and ideas.

  2. PoPLAR: Portal for Petascale Lifescience Applications and Research

    PubMed Central

    2013-01-01

    Background We are focusing specifically on fast data analysis and retrieval in bioinformatics that will have a direct impact on the quality of human health and the environment. The exponential growth of data generated in biology research, from small atoms to big ecosystems, necessitates an increasingly large computational component to perform analyses. Novel DNA sequencing technologies and complementary high-throughput approaches--such as proteomics, genomics, metabolomics, and meta-genomics--drive data-intensive bioinformatics. While individual research centers or universities could once provide for these applications, this is no longer the case. Today, only specialized national centers can deliver the level of computing resources required to meet the challenges posed by rapid data growth and the resulting computational demand. Consequently, we are developing massively parallel applications to analyze the growing flood of biological data and contribute to the rapid discovery of novel knowledge. Methods The efforts of previous National Science Foundation (NSF) projects provided for the generation of parallel modules for widely used bioinformatics applications on the Kraken supercomputer. We have profiled and optimized the code of some of the scientific community's most widely used desktop and small-cluster-based applications, including BLAST from the National Center for Biotechnology Information (NCBI), HMMER, and MUSCLE; scaled them to tens of thousands of cores on high-performance computing (HPC) architectures; made them robust and portable to next-generation architectures; and incorporated these parallel applications in science gateways with a web-based portal. Results This paper will discuss the various developmental stages, challenges, and solutions involved in taking bioinformatics applications from the desktop to petascale with a front-end portal for very-large-scale data analysis in the life sciences. Conclusions This research will help to bridge the gap between the rate of data generation and the speed at which scientists can study this data. The ability to rapidly analyze data at such a large scale is having a significant, direct impact on science achieved by collaborators who are currently using these tools on supercomputers. PMID:23902523
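
    The scaling pattern the Methods section describes (taking a serial desktop tool to tens of thousands of cores) is commonly realized by sharding the input across MPI ranks; below is a hedged sketch of that pattern, assuming mpi4py, an NCBI BLAST+ blastp binary on the PATH, and hypothetical pre-split input files, not the project's actual Kraken code:

    ```python
    # Hedged sketch: shard query sequences across MPI ranks and let each rank
    # run the unmodified serial tool on its shard. Assumes mpi4py, blastp from
    # NCBI BLAST+, and hypothetical pre-split files queries.partN.fasta.
    from mpi4py import MPI
    import subprocess

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    # Rank 0 lists one FASTA shard per rank; scatter one name to each rank.
    shards = [f"queries.part{r}.fasta" for r in range(size)] if rank == 0 else None
    shard = comm.scatter(shards, root=0)

    # Each rank searches its own shard against the database independently.
    subprocess.run(["blastp", "-query", shard, "-db", "nr",
                    "-out", f"hits.{rank}.tsv", "-outfmt", "6"], check=True)
    ```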

  3. Travis Kemper | NREL

    Science.gov Websites

    Dr. Travis Kemper is a postdoctoral researcher in the Computational Science Center. At the University of Florida he developed reactive force fields. During his postdoctoral work at...

  4. 76 FR 36095 - Notice of Submission for OMB Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-21

    ..., mathematics, and science literacy. It was first implemented by the National Center for Education Statistics..., mathematics will be the major subject domain. The field test will also include computer-based assessments in...

  5. FIRST 2002, 2003, 2004 Robotics Competition(s)

    NASA Technical Reports Server (NTRS)

    Purman, Richard

    2004-01-01

    The New Horizons Regional Education Center (NHREC) in Hampton, VA sought and received NASA funding to support its participation in the 2002, 2003, and 2004 FIRST Robotics Competitions. FIRST, Inc. (For Inspiration and Recognition of Science and Technology) is an organization which encourages the application of creative science, math, and computer science principles to solve real-world engineering problems. The FIRST competition is an international engineering contest featuring high school, government, and business partnerships.

  6. Microgravity

    NASA Image and Video Library

    1998-02-27

    NASA researcher Dr. Donald Frazier uses a blue laser shining through a quartz window into a special mix of chemicals to generate a polymer film on the inside quartz surface. As the chemicals respond to the laser light, they adhere to the glass surface, forming optical films. Dr. Frazier and Dr. Mark S. Paley developed the process in the Space Sciences Laboratory at NASA's Marshall Space Flight Center in Huntsville, AL. Working aboard the Space Shuttle, a science team led by Dr. Frazier formed thin films potentially useful in optical computers with fewer impurities than those formed on Earth. Patterns of these films can be traced onto the quartz surface. In the optical computers of the future, these films could replace electronic circuits and wires, making the systems more efficient and cost-effective, as well as lighter and more compact. Photo credit: NASA/Marshall Space Flight Center

  7. Microgravity

    NASA Image and Video Library

    1999-05-26

    NASA researcher Dr. Donald Frazier uses a blue laser shining through a quartz window into a special mix of chemicals to generate a polymer film on the inside quartz surface. As the chemicals respond to the laser light, they adhere to the glass surface, forming optical films. Dr. Frazier and Dr. Mark S. Paley developed the process in the Space Sciences Laboratory at NASA's Marshall Space Flight Center in Huntsville, AL. Working aboard the Space Shuttle, a science team led by Dr. Frazier formed thin films potentially useful in optical computers with fewer impurities than those formed on Earth. Patterns of these films can be traced onto the quartz surface. In the optical computers of the future, these films could replace electronic circuits and wires, making the systems more efficient and cost-effective, as well as lighter and more compact. Photo credit: NASA/Marshall Space Flight Center

  8. NASA Langley Research Center outreach in astronautical education

    NASA Technical Reports Server (NTRS)

    Duberg, J. E.

    1976-01-01

    The Langley Research Center has traditionally maintained an active relationship with the academic community, especially at the graduate level, to promote the Center's research program and to make graduate education available to its staff. Two new institutes at the Center - the Joint Institute for Acoustics and Flight Sciences, and the Institute for Computer Applications - are discussed. Both provide for research activity at the Center by university faculties. The American Society of Engineering Education Summer Faculty Fellowship Program and the NASA-NRC Postdoctoral Resident Research Associateship Program are also discussed.

  9. NASA Shared Services Center breaks ground

    NASA Image and Video Library

    2006-02-24

    NASA officials and elected leaders were on hand for the groundbreaking ceremony of the NASA Shared Services Center Feb. 24, 2006, on the grounds of Stennis Space Center. The NSSC provides agency centralized administrative processing, human resources, procurement and financial services. From left, Louisiana Economic Development Secretary Mike Olivier, Stennis Space Center Director Rick Gilbrech, Computer Sciences Corp. President Michael Laphen, NASA Deputy Administrator Shana Dale, Rep. Gene Taylor, Sen. Trent Lott, Mississippi Gov. Haley Barbour, NASA Administrator Mike Griffin and Shared Services Center Executive Director Arbuthnot use golden shovels to break ground at the site.

  10. NASA Shared Services Center breaks ground

    NASA Technical Reports Server (NTRS)

    2006-01-01

    NASA officials and elected leaders were on hand for the groundbreaking ceremony of the NASA Shared Services Center Feb. 24, 2006, on the grounds of Stennis Space Center. The NSSC provides agency centralized administrative processing, human resources, procurement and financial services. From left, Louisiana Economic Development Secretary Mike Olivier, Stennis Space Center Director Rick Gilbrech, Computer Sciences Corp. President Michael Laphen, NASA Deputy Administrator Shana Dale, Rep. Gene Taylor, Sen. Trent Lott, Mississippi Gov. Haley Barbour, NASA Administrator Mike Griffin and Shared Services Center Executive Director Arbuthnot use golden shovels to break ground at the site.

  11. Illustrative Computer Programming for Libraries; Selected Examples for Information Specialists. Contributions in Librarianship and Information Science, No. 12.

    ERIC Educational Resources Information Center

    Davis, Charles H.

    Intended for teaching applications programing for libraries and information centers, this volume is a graded workbook or text supplement containing typical practice problems, suggested solutions, and brief analyses which emphasize programing efficiency. The computer language used is Programing Language/One (PL/1) because it adapts readily to…

  12. JPL Earth Science Center Visualization Multitouch Table

    NASA Astrophysics Data System (ADS)

    Kim, R.; Dodge, K.; Malhotra, S.; Chang, G.

    2014-12-01

    The JPL Earth Science Center Visualization Table is a specialized software and hardware system that supports multitouch, multiuser, and remote display control to create a seamlessly integrated experience for visualizing JPL missions and their remote sensing data. The software is fully GIS-capable through time-aware OGC WMTS, using the Lunar Mapping and Modeling Portal as the GIS backend to continuously ingest and retrieve real-time remote sensing data and satellite location data. The 55-inch and 82-inch unlimited-finger-count multitouch displays allow multiple users to explore JPL Earth missions and visualize remote sensing data through a very intuitive and interactive touch graphical user interface. To improve the integrated experience, the Earth Science Center Visualization Table team developed network streaming, which allows the table software to stream data visualizations to a nearby remote display through the computer network. The purpose of this visualization/presentation tool is not only to support Earth science operations; it is specifically designed for education and public outreach and will contribute significantly to STEM. Our presentation will include an overview of our software and hardware and a showcase of our system.
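
    A minimal sketch of the kind of time-aware OGC WMTS request such a GIS backend serves, using the standard key-value-pair GetTile form; the endpoint, layer, and tile-matrix-set names here are hypothetical placeholders, not the actual LMMP service:

    ```python
    # Minimal sketch of a time-aware OGC WMTS GetTile request. The endpoint,
    # layer, and tile-matrix-set names are hypothetical placeholders.
    import requests

    ENDPOINT = "https://example.nasa.gov/wmts"     # placeholder URL

    params = {
        "SERVICE": "WMTS", "REQUEST": "GetTile", "VERSION": "1.0.0",
        "LAYER": "sea_surface_temperature", "STYLE": "default",
        "FORMAT": "image/png", "TILEMATRIXSET": "EPSG4326_250m",
        "TILEMATRIX": "3", "TILEROW": "2", "TILECOL": "5",
        "TIME": "2014-08-01",                      # the time-aware dimension
    }
    response = requests.get(ENDPOINT, params=params, timeout=30)
    response.raise_for_status()
    with open("tile.png", "wb") as f:
        f.write(response.content)
    ```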

  13. ISCR Annual Report: Fiscal Year 2004

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McGraw, J R

    2005-03-03

    Large-scale scientific computation and all of the disciplines that support and help to validate it have been placed at the focus of Lawrence Livermore National Laboratory (LLNL) by the Advanced Simulation and Computing (ASC) program of the National Nuclear Security Administration (NNSA) and the Scientific Discovery through Advanced Computing (SciDAC) initiative of the Office of Science of the Department of Energy (DOE). The maturation of computational simulation as a tool of scientific and engineering research is underscored in the November 2004 statement of the Secretary of Energy that "high performance computing is the backbone of the nation's science and technology enterprise". LLNL operates several of the world's most powerful computers--including today's single most powerful--and has undertaken some of the largest and most compute-intensive simulations ever performed. Ultrascale simulation has been identified as one of the highest priorities in DOE's facilities planning for the next two decades. However, computers at architectural extremes are notoriously difficult to use efficiently. Furthermore, each successful terascale simulation only points out the need for much better ways of interacting with the resulting avalanche of data. Advances in scientific computing research have, therefore, never been more vital to LLNL's core missions than at present. Computational science is evolving so rapidly along every one of its research fronts that to remain on the leading edge, LLNL must engage researchers at many academic centers of excellence. In Fiscal Year 2004, the Institute for Scientific Computing Research (ISCR) served as one of LLNL's main bridges to the academic community with a program of collaborative subcontracts, visiting faculty, student internships, workshops, and an active seminar series. The ISCR identifies researchers from the academic community for computer science and computational science collaborations with LLNL and hosts them for short- and long-term visits with the aim of encouraging long-term academic research agendas that address LLNL's research priorities. Through such collaborations, ideas and software flow in both directions, and LLNL cultivates its future workforce. The Institute strives to be LLNL's "eyes and ears" in the computer and information sciences, keeping the Laboratory aware of and connected to important external advances. It also attempts to be the "feet and hands" that carry those advances into the Laboratory and incorporate them into practice. ISCR research participants are integrated into LLNL's Computing and Applied Research (CAR) Department, especially into its Center for Applied Scientific Computing (CASC). In turn, these organizations address computational challenges arising throughout the rest of the Laboratory. Administratively, the ISCR flourishes under LLNL's University Relations Program (URP). Together with the other five institutes of the URP, it navigates a course that allows LLNL to benefit from academic exchanges while preserving national security. While it is difficult to operate an academic-like research enterprise within the context of a national security laboratory, the results declare the challenges well met and worth the continued effort.

  14. Remarks on neurocybernetics and its links to computing science. To the memory of Prof. Luigi M. Ricciardi.

    PubMed

    Moreno-Díaz, Roberto; Moreno-Díaz, Arminda

    2013-06-01

    This paper explores the origins and content of neurocybernetics and its links to artificial intelligence, computer science, and knowledge engineering. Starting with three remarkable pieces of work, we center attention on a number of events that initiated and developed basic topics that are still a matter of research and inquiry today, from goal-directed activity theories to circular causality and to reverberations and learning. Within this context, we pay tribute to the memory of Prof. Ricciardi, documenting the importance of his contributions to the mathematics of the brain, neural nets and neurophysiological models, and computational simulations and techniques. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  15. From cosmos to connectomes: the evolution of data-intensive science.

    PubMed

    Burns, Randal; Vogelstein, Joshua T; Szalay, Alexander S

    2014-09-17

    The analysis of data requires computation: originally by hand and more recently by computers. Different models of computing are designed and optimized for different kinds of data. In data-intensive science, the scale and complexity of data exceeds the comfort zone of local data stores on scientific workstations. Thus, cloud computing emerges as the preeminent model, utilizing data centers and high-performance clusters, enabling remote users to access and query subsets of the data efficiently. We examine how data-intensive computational systems originally built for cosmology, the Sloan Digital Sky Survey (SDSS), are now being used in connectomics, at the Open Connectome Project. We list lessons learned and outline the top challenges we expect to face. Success in computational connectomics would drastically reduce the time between idea and discovery, as SDSS did in cosmology. Copyright © 2014 Elsevier Inc. All rights reserved.

  16. Multicore: Fallout From a Computing Evolution (LBNL Summer Lecture Series)

    ScienceCinema

    Yelick, Kathy [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). National Energy Research Scientific Computing Center (NERSC)]

    2018-05-07

    Summer Lecture Series 2008: Parallel computing used to be reserved for big science and engineering projects, but in two years that's all changed. Even laptops and hand-helds use parallel processors. Unfortunately, the software hasn't kept pace. Kathy Yelick, Director of the National Energy Research Scientific Computing Center at Berkeley Lab, describes the resulting chaos and the computing community's efforts to develop exciting applications that take advantage of tens or hundreds of processors on a single chip.

  17. BioSIGHT: Interactive Visualization Modules for Science Education

    NASA Technical Reports Server (NTRS)

    Wong, Wee Ling

    1998-01-01

    Redefining science education to harness emerging integrated media technologies with innovative pedagogical goals represents a unique challenge. The Integrated Media Systems Center (IMSC) is the only engineering research center in the area of multimedia and creative technologies sponsored by the National Science Foundation. The research program at IMSC is focused on developing advanced technologies that address human-computer interfaces, database management, and high-speed network capabilities. The BioSIGHT project at IMSC is a demonstration technology project in the area of education that seeks to address how such emerging multimedia technologies can make an impact on science education. The scope of this project will help solidify NASA's commitment to the development of innovative educational resources that promote science literacy for our students and the general population as well. These issues must be addressed as NASA marches towards the goal of enabling human space exploration, which requires an understanding of life sciences in space. The IMSC BioSIGHT lab was established with the purpose of developing a novel methodology that will map a high school biology curriculum into a series of interactive visualization modules that can be easily incorporated into a space biology curriculum. Fundamental concepts in general biology must be mastered in order to allow a better understanding and application of space biology. Interactive visualization is a powerful component that can capture students' imagination, facilitate their assimilation of complex ideas, and help them develop integrated views of biology. These modules will augment the role of the teacher and will establish the value of student-centered interactivity, both in an individual setting as well as in a collaborative learning environment. Students will be able to interact with the content material, explore new challenges, and perform virtual laboratory simulations. The BioSIGHT effort is truly cross-disciplinary in nature and requires expertise from many areas including Biology, Computer Science, Electrical Engineering, Education, and the Cognitive Sciences. The BioSIGHT team includes a scientific illustrator, an educational software designer, computer programmers, and IMSC graduate and undergraduate students. Our collaborators include TERC, a research and education organization from Cambridge, MA, with extensive K-12 math and science curricula development experience; SRI International of Menlo Park, CA; and teachers and students from local area high schools (Newbury Park High School, USC's Family of Five schools, Chadwick School, and Pasadena Polytechnic High School).

  18. Photometric Analysis in the Kepler Science Operations Center Pipeline

    NASA Technical Reports Server (NTRS)

    Twicken, Joseph D.; Clarke, Bruce D.; Bryson, Stephen T.; Tenenbaum, Peter; Wu, Hayley; Jenkins, Jon M.; Girouard, Forrest; Klaus, Todd C.

    2010-01-01

    We describe the Photometric Analysis (PA) software component and its context in the Kepler Science Operations Center (SOC) pipeline. The primary tasks of this module are to compute the photometric flux and photocenters (centroids) for over 160,000 long cadence (thirty minute) and 512 short cadence (one minute) stellar targets from the calibrated pixels in their respective apertures. We discuss the science algorithms for long and short cadence PA: cosmic ray cleaning; background estimation and removal; aperture photometry; and flux-weighted centroiding. We discuss the end-to-end propagation of uncertainties for the science algorithms. Finally, we present examples of photometric apertures, raw flux light curves, and centroid time series from Kepler flight data. PA light curves, centroid time series, and barycentric timestamp corrections are exported to the Multi-mission Archive at Space Telescope [Science Institute] (MAST) and are made available to the general public in accordance with the NASA/Kepler data release policy.

  19. The Kepler Science Operations Center Pipeline Framework Extensions

    NASA Technical Reports Server (NTRS)

    Klaus, Todd C.; Cote, Miles T.; McCauliff, Sean; Girouard, Forrest R.; Wohler, Bill; Allen, Christopher; Chandrasekaran, Hema; Bryson, Stephen T.; Middour, Christopher; Caldwell, Douglas A.; hide

    2010-01-01

    The Kepler Science Operations Center (SOC) is responsible for several aspects of the Kepler Mission, including managing targets, generating on-board data compression tables, monitoring photometer health and status, processing the science data, and exporting the pipeline products to the mission archive. We describe how the generic pipeline framework software developed for Kepler is extended to achieve these goals, including pipeline configurations for processing science data and other support roles, and custom unit of work generators that control how the Kepler data are partitioned and distributed across the computing cluster. We describe the interface between the Java software that manages the retrieval and storage of the data for a given unit of work and the MATLAB algorithms that process these data. The data for each unit of work are packaged into a single file that contains everything needed by the science algorithms, allowing these files to be used to debug and evolve the algorithms offline.
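
    A conceptual sketch of the custom unit-of-work idea described above, written in Python for brevity even though the framework itself is Java with MATLAB science algorithms; the packet size and integer target-ID scheme are hypothetical:

    ```python
    # Conceptual sketch of a unit-of-work generator: partition the target list
    # into self-contained packets that can be distributed across the cluster.
    # Packet size and integer target IDs are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class UnitOfWork:
        task_id: int
        target_ids: list[int]   # everything one pipeline task needs to proceed

    def generate_units(target_ids, targets_per_unit=2000):
        """Yield fixed-size partitions of the target list as work packets."""
        for i in range(0, len(target_ids), targets_per_unit):
            yield UnitOfWork(i // targets_per_unit,
                             target_ids[i:i + targets_per_unit])

    units = list(generate_units(list(range(160_000))))
    print(len(units), "units of", len(units[0].target_ids), "targets each")
    ```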

  20. Center for Computational Structures Technology

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Perry, Ferman W.

    1995-01-01

    The Center for Computational Structures Technology (CST) is intended to serve as a focal point for the diverse CST research activities. The CST activities include the use of numerical simulation and artificial intelligence methods in modeling, analysis, sensitivity studies, and optimization of flight-vehicle structures. The Center is located at NASA Langley and is an integral part of the School of Engineering and Applied Science of the University of Virginia. The key elements of the Center are: (1) conducting innovative research on advanced topics of CST; (2) acting as pathfinder by demonstrating to the research community what can be done (high-potential, high-risk research); (3) collaborating strongly with NASA scientists and researchers from universities and other government laboratories; and (4) rapidly disseminating CST to industry, through integration of industrial personnel into the ongoing research efforts.

  1. 75 FR 10491 - Center for Scientific Review; Notice of Closed Meetings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-08

    ...: Computational Biology, Image Processing, and Data Mining. Date: March 18, 2010. Time: 8 a.m. to 6 p.m. Agenda... Science. Date: March 24, 2010. Time: 12 p.m. to 3:30 p.m. Agenda: To review and evaluate grant...; Fellowship: Biophysical and Biochemical Sciences. Date: March 25-26, 2010. Time: 8 a.m. to 5 p.m. Agenda: To...

  2. Center for Advanced Computational Technology

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    2000-01-01

    The Center for Advanced Computational Technology (ACT) was established to serve as a focal point for diverse research activities pertaining to application of advanced computational technology to future aerospace systems. These activities include the use of numerical simulations, artificial intelligence methods, multimedia and synthetic environments, and computational intelligence in the modeling, analysis, sensitivity studies, optimization, design, and operation of future aerospace systems. The Center is located at NASA Langley and is an integral part of the School of Engineering and Applied Science of the University of Virginia. The Center has four specific objectives: 1) conduct innovative research on applications of advanced computational technology to aerospace systems; 2) act as pathfinder by demonstrating to the research community what can be done (high-potential, high-risk research); 3) help in identifying future directions of research in support of the aeronautical and space missions of the twenty-first century; and 4) help in the rapid transfer of research results to industry and in broadening awareness among researchers and engineers of the state of the art in applications of advanced computational technology to the analysis, design, prototyping, and operations of aerospace and other high-performance engineering systems. In addition to research, Center activities include helping in the planning and coordination of the activities of a multi-center team of NASA and JPL researchers who are developing an intelligent synthesis environment for future aerospace systems; organizing workshops and national symposia; and writing state-of-the-art monographs and NASA special publications on timely topics.

  3. Unplugged Cybersecurity: An Approach for Bringing Computer Science into the Classroom

    ERIC Educational Resources Information Center

    Fees, Rachel E.; da Rosa, Jennifer A.; Durkin, Sarah S.; Murray, Mark M.; Moran, Angela L.

    2018-01-01

    The United States Naval Academy (USNA) STEM Center for Education and Outreach addresses an urgent Navy and national need for more young people to pursue careers in STEM fields through world-wide outreach to 17,000 students and 900 teachers per year. To achieve this mission, the STEM Center has developed a hands-on and inquiry-based methodology to…

  4. MIT Laboratory for Computer Science Progress Report, July 1984-June 1985

    DTIC Science & Technology

    1985-06-01

    larger (up to several thousand machines) multiprocessor systems. This facility, funded by the newly formed Strategic Computing Program of the Defense...Szolovits, Group Leader R. Patil Collaborating Investigators M. Criscitiello, M.D., Tufts-New England Medical Center Hospital J. Dzierzanowski, Ph.D., Dept...COMPUTATION STRUCTURES Academic Staff J. B. Dennis, Group Leader Research Staff W. B. Ackerman G. A. Boughton W. Y-P. Lim Graduate Students T-A. Chu S

  5. Real science at the petascale.

    PubMed

    Saksena, Radhika S; Boghosian, Bruce; Fazendeiro, Luis; Kenway, Owain A; Manos, Steven; Mazzeo, Marco D; Sadiq, S Kashif; Suter, James L; Wright, David; Coveney, Peter V

    2009-06-28

    We describe computational science research that uses petascale resources to achieve scientific results at unprecedented scales and resolution. The applications span a wide range of domains, from investigation of fundamental problems in turbulence through computational materials science research to biomedical applications at the forefront of HIV/AIDS research and cerebrovascular haemodynamics. This work was mainly performed on the US TeraGrid 'petascale' resource, Ranger, at the Texas Advanced Computing Center, in the first half of 2008, when it was the largest computing system in the world available for open scientific research. We have sought to use this petascale supercomputer optimally across application domains and scales, exploiting the excellent parallel scaling performance found on up to at least 32,768 cores for certain of our codes in the so-called 'capability computing' category, as well as high-throughput intermediate-scale jobs for ensemble simulations in the 32-512 core range. Furthermore, this activity provides evidence that conventional parallel programming with MPI should be successful at the petascale in the short to medium term. We also report on the parallel performance of some of our codes on up to 65,536 cores on the IBM Blue Gene/P system at the Argonne Leadership Computing Facility, which has recently been named the fastest supercomputer in the world for open science.

  6. Computing with Beowulf

    NASA Technical Reports Server (NTRS)

    Cohen, Jarrett

    1999-01-01

    Parallel computers built out of mass-market parts are cost-effectively performing data processing and simulation tasks. The Supercomputing (now known as "SC") series of conferences celebrated its 10th anniversary last November. While vendors have come and gone, the dominant paradigm for tackling big problems still is a shared-resource, commercial supercomputer. Growing numbers of users needing a cheaper or dedicated-access alternative are building their own supercomputers out of mass-market parts. Such machines are generally called Beowulf-class systems after the 11th century epic. This modern-day Beowulf story began in 1994 at NASA's Goddard Space Flight Center, a laboratory for the Earth and space sciences, where computing managers threw down a gauntlet to develop a $50,000 gigaFLOPS workstation for processing satellite data sets. Soon, Thomas Sterling and Don Becker were working on the Beowulf concept at the Universities Space Research Association (USRA)-run Center of Excellence in Space Data and Information Sciences (CESDIS). Beowulf clusters mix three primary ingredients: commodity personal computers or workstations, low-cost Ethernet networks, and the open-source Linux operating system. One of the larger Beowulfs is Goddard's Highly-parallel Integrated Virtual Environment, or HIVE for short.

  7. Utilization of Educationally Oriented Microcomputer Based Laboratories

    ERIC Educational Resources Information Center

    Fitzpatrick, Michael J.; Howard, James A.

    1977-01-01

    Describes one approach to supplying engineering and computer science educators with an economical portable digital systems laboratory centered around microprocessors. Expansion of the microcomputer based laboratory concept to include Learning Resource Aided Instruction (LRAI) systems is explored. (Author)

  8. | NREL

    Science.gov Websites

    Staff profile from NREL's Computational Science Center: uses electronic structure calculations; research interests include electronic structure and dynamics, quantum/classical molecular dynamics simulation, and coupling of molecular electronic structure to dynamics.

  9. Better Broader Impacts through National Science Foundation Centers

    NASA Astrophysics Data System (ADS)

    Campbell, K. M.

    2010-12-01

    National Science Foundation Science and Technology Centers (STCs) play a leading role in developing and evaluating "Better Broader Impacts": best practices for recruiting a broad spectrum of American students into STEM fields and for educating these future professionals, as well as their families, teachers, and the general public. With staff devoted full time to Broader Impacts activities over the ten-year life of a Center, STCs are able to address both a broad range of audiences and a broad range of topics. Along with other NSF-funded centers, such as Centers for Ocean Sciences Education Excellence, Engineering Research Centers, and Materials Research Science and Engineering Centers, STCs develop models and materials that individual researchers can adopt, as well as, in some cases, direct opportunities for individual researchers to offer their disciplinary research expertise to existing center Broader Impacts programs. The National Center for Earth-surface Dynamics (NCED) is an STC headquartered at the University of Minnesota. NCED's disciplinary research spans the physical, biological, and engineering issues associated with developing an integrative, quantitative, and predictive understanding of rivers and river basins. Since its funding in 2002, NCED has had the opportunity to partner with individuals and institutions ranging from formal to informal education and from science museums to Tribal and women's colleges. We have developed simple tabletop physical models, complete museum exhibitions, 3D paper maps, and interactive computer-based visualizations, all of which have helped us communicate with this wide variety of learners. Many of these materials, or plans to construct them, are available online; in many cases they have also been formally evaluated. We have also listened to the formal and informal educators with whom we partner, from whom we have learned a great deal about how to design Broader Impacts activities and programs. Using NCED as a case study, this session showcases NCED's materials, approaches, and lessons learned. We will also introduce the work of our sister STCs, whose disciplines span the STEM fields.

  10. Center of Excellence for Geospatial Information Science research plan 2013-18

    USGS Publications Warehouse

    Usery, E. Lynn

    2013-01-01

    The U.S. Geological Survey Center of Excellence for Geospatial Information Science (CEGIS) was created in 2006 and since that time has provided research primarily in support of The National Map. The presentations and publications of the CEGIS researchers document the research accomplishments that include advances in electronic topographic map design, generalization, data integration, map projections, sea level rise modeling, geospatial semantics, ontology, user-centered design, volunteer geographic information, and parallel and grid computing for geospatial data from The National Map. A research plan spanning 2013–18 has been developed extending the accomplishments of the CEGIS researchers and documenting new research areas that are anticipated to support The National Map of the future. In addition to extending the 2006–12 research areas, the CEGIS research plan for 2013–18 includes new research areas in data models, geospatial semantics, high-performance computing, volunteered geographic information, crowdsourcing, social media, data integration, and multiscale representations to support the Three-Dimensional Elevation Program (3DEP) and The National Map of the future of the U.S. Geological Survey.

  11. Supporting Weather Data

    NASA Technical Reports Server (NTRS)

    2004-01-01

    Since its founding in 1992, Global Science & Technology, Inc. (GST), of Greenbelt, Maryland, has been developing technologies and providing services in support of NASA scientific research. GST specialties include scientific analysis, science data and information systems, data visualization, communications, networking and Web technologies, computer science, and software system engineering. As a longtime contractor to Goddard Space Flight Center's Earth Science Directorate, GST scientific, engineering, and information technology staff have extensive qualifications with the synthesis of satellite, in situ, and Earth science data for weather- and climate-related projects. GST's experience in this arena is end-to-end, from building satellite ground receiving systems and science data systems, to product generation and research and analysis.

  12. Index to NASA Tech Briefs, 1974

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The following information was given for 1974: (1) abstracts of reports dealing with new technology derived from the research and development activities of NASA or the U.S. Atomic Energy Commission, arranged by subjects: electronics/electrical, electronics/electrical systems, physical sciences, materials/chemistry, life sciences, mechanics, machines, equipment and tools, fabrication technology, and computer programs, (2) indexes for the above documents: subject, personal author, originating center.

  13. Determination of the vapor-liquid transition of square-well particles using a novel generalized-canonical-ensemble-based method

    NASA Astrophysics Data System (ADS)

    Zhao, Liang; Xu, Shun; Tu, Yu-Song; Zhou, Xin

    2017-06-01

    Project supported by the National Natural Science Foundation for Outstanding Young Scholars, China (Grant No. 11422542), the National Natural Science Foundation of China (Grant Nos. 11605151 and 11675138), and the Shanghai Supercomputer Center of China and Special Program for Applied Research on Super Computation of the NSFC-Guangdong Joint Fund (the second phase).

  14. Modelling Effects on Grid Cells of Sensory Input During Self-motion

    DTIC Science & Technology

    2016-04-20

    input during self-motion Florian Raudies, James R. Hinman and Michael E. Hasselmo, Center for Systems Neuroscience, Centre for Memory and Brain...Department of Psychological and Brain Sciences and Graduate Program for Neuroscience, Boston University, 2 Cummington Mall, Boston, MA 02215, USA Visual...Psychological and Brain Sciences and the Centre for Computational Neuroscience and Neural Technology before taking his current position as a Research

  15. Training the Future - Interns Harvesting & Testing Plant Experim

    NASA Image and Video Library

    2017-07-19

    In the Space Life Sciences Laboratory at NASA's Kennedy Space Center in Florida, student interns such as Ayla Grandpre are joining agency scientists, contributing in the area of plant growth research for food production in space. Grandpre is majoring in computer science and chemistry at Rocky Mountain College in Billings, Montana. The agency attracts its future workforce through the NASA Internships, Fellowships and Scholarships, or NIFS, program.

  16. Teacher Challenges, Perceptions, and Use of Science Models in Middle School Classrooms about Climate, Weather, and Energy Concepts

    ERIC Educational Resources Information Center

    Yarker, Morgan Brown

    2013-01-01

    Research suggests that scientific models and modeling should be topics covered in K-12 classrooms as part of a comprehensive science curriculum. It is especially important when talking about topics in weather and climate, where computer and forecast models are the center of attention. There are several approaches to model based inquiry, but it can…

  17. US Department of Energy High School Student Supercomputing Honors Program: A follow-up assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1987-01-01

    The US DOE High School Student Supercomputing Honors Program was designed to recognize high school students with superior skills in mathematics and computer science and to provide them with formal training and experience with advanced computer equipment. This document reports on the participants who attended the first such program, which was held at the National Magnetic Fusion Energy Computer Center at the Lawrence Livermore National Laboratory (LLNL) during August 1985.

  18. Management and Analysis of Biological and Clinical Data: How Computer Science May Support Biomedical and Clinical Research

    NASA Astrophysics Data System (ADS)

    Veltri, Pierangelo

    The use of computer-based solutions for data management in biology and clinical science has contributed to improving quality of life and to obtaining research results in shorter time. Indeed, new algorithms and high-performance computation have been used in proteomics and genomics studies for treating chronic diseases (e.g., drug design), as well as for supporting clinicians both in diagnosis (e.g., image-based diagnosis) and in patient care (e.g., computer-based analysis of information gathered from patients). In this paper we survey examples of computer-based techniques applied in both biological and clinical contexts. The reported applications draw on experience with real cases at the University Medical School of Catanzaro and on the national project Staywell SH 2.0, which involves many research centers and companies aiming to study and improve citizen wellness.

  19. Systems engineering technology for networks

    NASA Technical Reports Server (NTRS)

    1994-01-01

    The report summarizes research pursued within the Systems Engineering Design Laboratory at Virginia Polytechnic Institute and State University between May 16, 1993 and January 31, 1994. The project was proposed in cooperation with the Computational Science and Engineering Research Center at Howard University. Its purpose was to investigate emerging systems engineering tools and their applicability in analyzing the NASA Network Control Center (NCC) on the basis of metrics and measures.

  20. Network-based approaches to climate knowledge discovery

    NASA Astrophysics Data System (ADS)

    Budich, Reinhard; Nyberg, Per; Weigel, Tobias

    2011-11-01

    Climate Knowledge Discovery Workshop; Hamburg, Germany, 30 March to 1 April 2011 Do complex networks combined with semantic Web technologies offer the next generation of solutions in climate science? To address this question, a first Climate Knowledge Discovery (CKD) Workshop, hosted by the German Climate Computing Center (Deutsches Klimarechenzentrum (DKRZ)), brought together climate and computer scientists from major American and European laboratories, data centers, and universities, as well as representatives from industry, the broader academic community, and the semantic Web communities. The participants, representing six countries, were concerned with large-scale Earth system modeling and computational data analysis. The motivation for the meeting was the growing problem that climate scientists generate data faster than it can be interpreted and the need to prepare for further exponential data increases. Current analysis approaches are focused primarily on traditional methods, which are best suited for large-scale phenomena and coarse-resolution data sets. The workshop focused on the open discussion of ideas and technologies to provide the next generation of solutions to cope with the increasing data volumes in climate science.

  1. A Community Publication and Dissemination System for Hydrology Education Materials

    NASA Astrophysics Data System (ADS)

    Ruddell, B. L.

    2015-12-01

    Hosted by CUAHSI and the Science Education Resource Center (SERC), federated by the National Science Digital Library (NSDL), and allied with the Water Data Center (WDC), Hydrologic Information System (HIS), and HydroShare projects, a simple cyberinfrastructure has been launched for the publication and dissemination of data and model driven university hydrology education materials. This lightweight system's metadata describes learning content as a data-driven module with defined data inputs and outputs. This structure allows a user to mix and match modules to create sequences of content that teach both hydrology and computer learning outcomes. Importantly, this modular infrastructure allows an instructor to substitute a module based on updated computer methods for one based on outdated computer methods, hopefully solving the problem of rapid obsolescence that has hampered previous community efforts. The prototype system is now available from CUAHSI and SERC, with some example content. The system is designed to catalog, link to, make visible, and make accessible the existing and future contributions of the community; this system does not create content. Submissions from hydrology educators are eagerly solicited, especially for existing content.
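
    As a minimal sketch of the metadata idea described above, assuming hypothetical module names and field names (the actual CUAHSI/SERC schema is not reproduced here), each module declares its data inputs and outputs so that modules can be chained whenever one module's outputs satisfy the next module's inputs:

```python
# Hypothetical module records; names and fields are assumptions for illustration.
modules = {
    "streamflow-download": {"inputs": [],
                            "outputs": ["discharge-timeseries"]},
    "flood-frequency": {"inputs": ["discharge-timeseries"],
                        "outputs": ["flood-quantiles"]},
}

def can_follow(first, second):
    """True if `second`'s required inputs are all produced by `first`."""
    produced = set(modules[first]["outputs"])
    needed = set(modules[second]["inputs"])
    return needed <= produced

print(can_follow("streamflow-download", "flood-frequency"))  # True
```

    This is also what makes substitution cheap: a module built on newer computing methods can replace an outdated one as long as it declares the same inputs and outputs.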

  2. Developing the Next Generation of Science Data System Engineers

    NASA Technical Reports Server (NTRS)

    Moses, John F.; Behnke, Jeanne; Durachka, Christopher D.

    2016-01-01

    At Goddard, engineers and scientists with a range of experience in science data systems are needed to employ new technologies and develop advances in capabilities for supporting new Earth and Space science research. Engineers with extensive experience in science data, software engineering, and computer-information architectures are needed to lead and perform these activities. The increasing types and complexity of instrument data and emerging computer technologies, coupled with the current shortage of computer engineers with backgrounds in science, have led to the need to develop a career path for science data systems engineers and architects. The current career path, for undergraduate students studying various disciplines such as computer engineering or physical science, generally begins with serving on a development team in any of the disciplines, where they can work in depth on existing Goddard data systems or serve with a specific NASA science team. There they begin to understand the data, infuse technologies, and come to know the architectures of science data systems. From here the typical career involves peer mentoring, on-the-job training, or graduate-level studies in analytics, computational science, and applied science and mathematics. At the most senior level, engineers become subject matter experts and system architects, leading discipline-specific data centers and large software development projects. They are recognized as subject matter experts in a science domain, have project management expertise, lead standards efforts, and lead international projects. A long career development remains necessary not only because of the breadth of knowledge required across physical sciences and engineering disciplines, but also because of the diversity of instrument data being developed today both by NASA and international partner agencies, and because multidiscipline science and practitioner communities expect to have access to all types of observational data. This paper describes an approach to defining career-path guidance for college-bound high school and undergraduate engineering students and for junior and senior engineers from various disciplines.

  3. Developing the Next Generation of Science Data System Engineers

    NASA Astrophysics Data System (ADS)

    Moses, J. F.; Durachka, C. D.; Behnke, J.

    2015-12-01

    At Goddard, engineers and scientists with a range of experience in science data systems are needed to employ new technologies and develop advances in capabilities for supporting new Earth and Space science research. Engineers with extensive experience in science data, software engineering, and computer-information architectures are needed to lead and perform these activities. The increasing types and complexity of instrument data and emerging computer technologies, coupled with the current shortage of computer engineers with backgrounds in science, have led to the need to develop a career path for science data systems engineers and architects. The current career path, for undergraduate students studying various disciplines such as computer engineering or physical science, generally begins with serving on a development team in any of the disciplines, where they can work in depth on existing Goddard data systems or serve with a specific NASA science team. There they begin to understand the data, infuse technologies, and come to know the architectures of science data systems. From here the typical career involves peer mentoring, on-the-job training, or graduate-level studies in analytics, computational science, and applied science and mathematics. At the most senior level, engineers become subject matter experts and system architects, leading discipline-specific data centers and large software development projects. They are recognized as subject matter experts in a science domain, have project management expertise, lead standards efforts, and lead international projects. A long career development remains necessary not only because of the breadth of knowledge required across physical sciences and engineering disciplines, but also because of the diversity of instrument data being developed today both by NASA and international partner agencies, and because multi-discipline science and practitioner communities expect to have access to all types of observational data. This paper describes an approach to defining career-path guidance for college-bound high school and undergraduate engineering students and for junior and senior engineers from various disciplines.

  4. Mild Cognitive Impairment: What Do We Do Now?

    MedlinePlus

    ... in studies that focus on individual health, computer use and technology, family relationships and caregiving, community services, housing, and ...

  5. Improving the Human Hazard Characterization of Chemicals: A Tox21 Update

    EPA Science Inventory

    Background: In 2008, the National Institute of Environmental Health Sciences/National Toxicology Program, the U.S. Environmental Protection Agency’s National Center for Computational Toxicology, and the National Human Genome Research Institute/National Institutes of Health ...

  6. Sandia National Laboratories: Research: Materials Science

    Science.gov Websites

    Our research uses Sandia's experimental, theoretical, and computational capabilities to ...

  7. The 1987 RIACS annual report

    NASA Technical Reports Server (NTRS)

    1987-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established at the NASA Ames Research Center in June of 1983. RIACS is privately operated by the Universities Space Research Association (USRA), a consortium of 64 universities with graduate programs in the aerospace sciences, under several Cooperative Agreements with NASA. RIACS's goal is to provide preeminent leadership in basic and applied computer science research as partners in support of NASA's goals and missions. In pursuit of this goal, RIACS contributes to several of the grand challenges in science and engineering facing NASA: flying an airplane inside a computer; determining the chemical properties of materials under hostile conditions in the atmospheres of earth and the planets; sending intelligent machines on unmanned space missions; creating a one-world network that makes all scientific resources, including those in space, accessible to all the world's scientists; providing intelligent computational support to all stages of the process of scientific investigation from problem formulation to results dissemination; and developing accurate global models for climatic behavior throughout the world. In working with these challenges, we seek novel architectures, and novel ways to use them, that exploit the potential of parallel and distributed computation and make possible new functions that are beyond the current reach of computing machines. The investigation includes pattern computers as well as the more familiar numeric and symbolic computers, and it includes networked systems of resources distributed around the world. We believe that successful computer science research is interdisciplinary: it is driven by (and drives) important problems in other disciplines. We believe that research should be guided by a clear long-term vision with planned milestones. And we believe that our environment must foster and exploit innovation. Our activities and accomplishments for the calendar year 1987 and our plans for 1988 are reported.

  8. Wave refraction diagrams for the Baltimore Canyon region of the mid-Atlantic continental shelf computed by using three bottom topography approximation techniques

    NASA Technical Reports Server (NTRS)

    Poole, L. R.

    1976-01-01

    The Langley Research Center and Virginia Institute of Marine Science wave refraction computer model was applied to the Baltimore Canyon region of the mid-Atlantic continental shelf. Wave refraction diagrams for a wide range of normally expected wave periods and directions were computed by using three bottom topography approximation techniques: quadratic least squares, cubic least squares, and constrained bicubic interpolation. Mathematical or physical interpretation of certain features appearing in the computed diagrams is discussed.
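
    For orientation, the first of the three techniques named above, a quadratic least-squares fit of depth as a function of position, can be sketched as follows; the scattered soundings are invented, and the code is not taken from the Langley/VIMS model.

```python
import numpy as np

def fit_quadratic_surface(x, y, z):
    """Least-squares fit of z = a + b*x + c*y + d*x^2 + e*x*y + f*y^2."""
    A = np.column_stack([np.ones_like(x), x, y, x**2, x * y, y**2])
    coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
    return coeffs

# Toy depth soundings on a 10x10 km patch (invented values)
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 10.0, 50)
y = rng.uniform(0.0, 10.0, 50)
z = 30.0 + 0.5 * x - 0.2 * y + 0.01 * x * y + rng.normal(0.0, 0.1, 50)
print(fit_quadratic_surface(x, y, z))  # roughly (30, 0.5, -0.2, 0, 0.01, 0)
```

    The cubic least-squares variant extends the design matrix with third-order terms, while constrained bicubic interpolation works from gridded depths directly.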

  9. Image Understanding Research and Its Application to Cartography and Computer-Based Analysis of Aerial Imagery

    DTIC Science & Technology

    1983-09-01

    Report AI-TR-346. Artificial Intelligence Laboratory, Massachusetts Institute of Technology, Cambridge, Massachusetts. June 19... [A. Guzman-Arenas...Testbed Coordinator, 415/859-4395, Artificial Intelligence Center, Computer Science and Technology Division. Prepared for: Defense Advanced Research...to support processing of aerial photographs for such military applications as cartography, intelligence, weapon guidance, and targeting. A key

  10. NASA's Participation in the National Computational Grid

    NASA Technical Reports Server (NTRS)

    Feiereisen, William J.; Zornetzer, Steve F. (Technical Monitor)

    1998-01-01

    Over the last several years it has become evident that the character of NASA's supercomputing needs has changed. One of the major missions of the agency is to support the design and manufacture of aero- and space-vehicles with technologies that will significantly reduce their cost. It is becoming clear that improvements in the process of aerospace design and manufacturing will require a high performance information infrastructure that allows geographically dispersed teams to draw upon resources that are broader than traditional supercomputing. A computational grid draws together our information resources into one system. We can foresee the time when a Grid will allow engineers and scientists to use the tools of supercomputers, databases, and on-line experimental devices in a virtual environment to collaborate with distant colleagues. The concept of a computational grid has been spoken of for many years, but several events in recent times are conspiring to allow us to actually build one. In late 1997 the National Science Foundation initiated the Partnerships for Advanced Computational Infrastructure (PACI), which is built around the idea of distributed high performance computing. The Alliance, led by the National Computational Science Alliance (NCSA), and the National Partnership for Advanced Computational Infrastructure (NPACI), led by the San Diego Supercomputer Center, have been instrumental in drawing together the "Grid Community" to identify the technology bottlenecks and propose a research agenda to address them. During the same period NASA has begun to reformulate parts of two major high performance computing research programs to concentrate on distributed high performance computing and has banded together with the PACI centers to address the research agenda in common.

  11. OPENING REMARKS: SciDAC: Scientific Discovery through Advanced Computing

    NASA Astrophysics Data System (ADS)

    Strayer, Michael

    2005-01-01

    Good morning. Welcome to SciDAC 2005 and San Francisco. SciDAC is all about computational science and scientific discovery. In a large sense, computational science characterizes SciDAC and its intent is change. It transforms both our approach and our understanding of science. It opens new doors and crosses traditional boundaries while seeking discovery. In terms of twentieth century methodologies, computational science may be said to be transformational. There are a number of examples to this point. First are the sciences that encompass climate modeling. The application of computational science has in essence created the field of climate modeling. This community is now international in scope and has provided precision results that are challenging our understanding of our environment. A second example is that of lattice quantum chromodynamics. Lattice QCD, while adding precision and insight to our fundamental understanding of strong interaction dynamics, has transformed our approach to particle and nuclear science. The individual investigator approach has evolved to teams of scientists from different disciplines working side-by-side towards a common goal. SciDAC is also undergoing a transformation. This meeting is a prime example. Last year it was a small programmatic meeting tracking progress in SciDAC. This year, we have a major computational science meeting with a variety of disciplines and enabling technologies represented. SciDAC 2005 should position itself as a new cornerstone for computational science and its impact on science. As we look to the immediate future, FY2006 will bring a new cycle to SciDAC. Most of the program elements of SciDAC will be re-competed in FY2006. The re-competition will involve new instruments for computational science, new approaches for collaboration, as well as new disciplines. There will be new opportunities for virtual experiments in carbon sequestration, fusion, and nuclear power and nuclear waste, as well as collaborations with industry and virtual prototyping. New instruments of collaboration will include institutes and centers, while summer schools, workshops, and outreach will invite new talent and expertise. Computational science adds new dimensions to science and its practice. Disciplines of fusion, accelerator science, and combustion are poised to blur the boundaries between pure and applied science. As we open the door into FY2006 we shall see a landscape of new scientific challenges: in biology, chemistry, materials, and astrophysics, to name a few. The enabling technologies of SciDAC have been transformational as drivers of change. Planning for major new software systems assumes a baseline employing Common Component Architectures, and this has become a household word for new software projects. While grid algorithms and mesh refinement software have transformed applications software, data management and visualization have transformed our understanding of science from data. The Gordon Bell prize now seems to be dominated by computational science and solvers developed by the TOPS ISIC. The priorities of the Office of Science in the Department of Energy are clear. The 20-year facilities plan is driven by new science. High performance computing is placed amongst the two highest priorities. Moore's law says that by the end of the next cycle of SciDAC we shall have petaflop computers. The challenges of petascale computing are enormous. These and the associated computational science are the highest priorities for computing within the Office of Science. Our effort in Leadership Class computing is just a first step towards this goal. Clearly, computational science at this scale will face enormous challenges and possibilities. Performance evaluation and prediction will be critical to unraveling the needed software technologies. We must not lose sight of our overarching goal, that of scientific discovery. Science does not stand still, and the landscape of science discovery and computing holds immense promise. In this environment, I believe it is necessary to institute a system of science-based performance metrics to help quantify our progress towards science goals and scientific computing. As a final comment I would like to reaffirm that the shifting landscapes of science will force changes to our computational sciences, and leave you with the quote from Richard Hamming: 'The purpose of computing is insight, not numbers'.

  12. Parallel computing of a climate model on the dawn 1000 by domain decomposition method

    NASA Astrophysics Data System (ADS)

    Bi, Xunqiang

    1997-12-01

    In this paper the parallel computing of a grid-point nine-level atmospheric general circulation model on the Dawn 1000 is introduced. The model was developed by the Institute of Atmospheric Physics (IAP), Chinese Academy of Sciences (CAS). The Dawn 1000 is a MIMD massively parallel computer made by the National Research Center for Intelligent Computer (NCIC), CAS. A two-dimensional domain decomposition method is adopted to perform the parallel computing. The potential ways to increase the speed-up ratio and to exploit more resources of future massively parallel supercomputing are also discussed.
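
    To make the decomposition concrete, here is a minimal sketch assuming a rectangular latitude-longitude grid split across a px-by-py logical process mesh; the index arithmetic is generic and is not taken from the IAP model itself.

```python
def subdomain(rank, nlat, nlon, px, py):
    """Return the (row, col) index ranges of the grid block owned by `rank`."""
    i, j = divmod(rank, py)  # process coordinates in the px-by-py mesh
    # Spread remainders so block sizes differ by at most one row/column.
    rows = [nlat // px + (1 if r < nlat % px else 0) for r in range(px)]
    cols = [nlon // py + (1 if c < nlon % py else 0) for c in range(py)]
    r0, c0 = sum(rows[:i]), sum(cols[:j])
    return (r0, r0 + rows[i]), (c0, c0 + cols[j])

# Example: 64 processes in an 8x8 mesh over a 128x256 grid.
print(subdomain(0, 128, 256, 8, 8))   # ((0, 16), (0, 32))
print(subdomain(63, 128, 256, 8, 8))  # ((112, 128), (224, 256))
```

    In a typical implementation, each process then advances the model only on its own block, exchanging halo rows and columns with its mesh neighbors at every time step.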

  13. Embedded Data Processor and Portable Computer Technology testbeds

    NASA Technical Reports Server (NTRS)

    Alena, Richard; Liu, Yuan-Kwei; Goforth, Andre; Fernquist, Alan R.

    1993-01-01

    Attention is given to current activities in the Embedded Data Processor and Portable Computer Technology testbed configurations that are part of the Advanced Data Systems Architectures Testbed at the Information Sciences Division at NASA Ames Research Center. The Embedded Data Processor Testbed evaluates advanced microprocessors for potential use in mission and payload applications within the Space Station Freedom Program. The Portable Computer Technology (PCT) Testbed integrates and demonstrates advanced portable computing devices and data system architectures. The PCT Testbed uses both commercial and custom-developed devices to demonstrate the feasibility of functional expansion and networking for portable computers in flight missions.

  14. Uses of the Drupal CMS Collaborative Framework in the Woods Hole Scientific Community (Invited)

    NASA Astrophysics Data System (ADS)

    Maffei, A. R.; Chandler, C. L.; Work, T. T.; Shorthouse, D.; Furfey, J.; Miller, H.

    2010-12-01

    Organizations that comprise the Woods Hole scientific community (Woods Hole Oceanographic Institution, Marine Biological Laboratory, USGS Woods Hole Coastal and Marine Science Center, Woods Hole Research Center, NOAA NMFS Northeast Fisheries Science Center, SEA Education Association) have a long history of collaborative activity regarding computing, computer network, and information technologies that support common, interdisciplinary science needs. Over the past several years there has been growing interest in the use of the Drupal Content Management System (CMS) playing a variety of roles in support of research projects resident at several of these organizations. Many of these projects are part of science programs that are national and international in scope. Here we survey the current uses of Drupal within the Woods Hole scientific community and examine the reasons it has been adopted. The promise of emerging semantic features in the Drupal framework is examined, and projections are made of how pre-existing Drupal-based websites might benefit. Closer examination of Drupal's software design exposes it as more than simply a content management system. The flexibility of its architecture; the power of its taxonomy module; the care taken in nurturing the open-source developer community that surrounds it (including organized and often well-attended code sprints); the ability to bind emerging software technologies as Drupal modules; the careful selection process used in adopting core functionality; multi-site hosting and cross-site deployment of updates; and a recent trend toward development of use-case-inspired Drupal distributions all cast Drupal as a general-purpose application deployment framework. Recent work in the semantic arena casts Drupal as an emerging RDF framework as well. Examples of roles played by Drupal-based websites within the Woods Hole scientific community that will be discussed include: science data metadata database, organization main website, biological taxonomy development, bibliographic database, physical media data archive inventory manager, disaster-response website development framework, science project task management, science conference planning, and spreadsheet-to-database converter.

  15. ODISEES Availability and Feedback Request

    Atmospheric Science Data Center

    2014-09-06

    ... As a follow-up Action from the Atmospheric Science Data Center (ASDC) User Working Group (UWG) held on 24-25 June, we are ... for a common language to describe scientific terms so that a computer can scour the internet, automatically discover relevant information ...

  16. Sandia National Laboratories: Careers: Materials Science

    Science.gov Websites

    ... Sandia's experimental, theoretical, and computational capabilities to establish the state of the art in ...

  17. Proceedings of the Thirteenth Annual Software Engineering Workshop

    NASA Technical Reports Server (NTRS)

    1988-01-01

    Topics covered in the workshop included studies and experiments conducted in the Software Engineering Laboratory (SEL), a cooperative effort of NASA Goddard Space Flight Center, the University of Maryland, and Computer Sciences Corporation; software models; software products; and software tools.

  18. Medical Informatics in Academic Health Science Centers.

    ERIC Educational Resources Information Center

    Frisse, Mark E.

    1992-01-01

    An analysis of the state of medical informatics, the application of computer and information technology to biomedicine, looks at trends and concerns, including integration of traditionally distinct enterprises (clinical information systems, financial information, scholarly support activities, infrastructures); informatics career choice and…

  19. 77 FR 57571 - Center For Scientific Review; Notice of Closed Meetings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-18

    ...: Genes, Genomes, and Genetics Integrated Review Group; Genomics, Computational Biology and Technology... Reproductive Sciences Integrated Review Group; Cellular, Molecular and Integrative Reproduction Study Section...: Immunology Integrated Review Group; Cellular and Molecular Immunology--B Study Section. [[Page 57572

  20. Sandia National Laboratories: Advanced Simulation and Computing

    Science.gov Websites


  1. Centralized automated cataloging of health science materials in the MLC/SUNY/OCLC shared cataloging service.

    PubMed Central

    Raper, J E

    1977-01-01

    Since February 1976, The Medical Library Center of New York, with the assistance of the SUNY/OCLC Network, has offered, on a subscription basis, a centralized automated cataloging service to health science libraries in the greater metropolitan New York area. By using workforms and prints of OCLC records (amended by the subscribing participants), technical services personnel at the center have fed cataloging data, via a CRT terminal, into the OCLC system, which provides (1) catalog cards, received in computer filing order; (2) book card, spine, and pocket labels; (3) accessions lists; and (4) data for eventual production of book catalogs and union catalogs. The experience of the center in the development, implementation, operation, and budgeting of its shared cataloging service is discussed. PMID:843650

  2. Centralized automated cataloging of health science materials in the MLC/SUNY/OCLC shared cataloging service.

    PubMed

    Raper, J E

    1977-04-01

    Since February 1976, The Medical Library Center of New York, with the assistance of the SUNY/OCLC Network, has offered, on a subscription basis, a centralized automated cataloging service to health science libraries in the greater metropolitan New York area. By using workforms and prints of OCLC records (amended by the subscribing participants), technical services personnel at the center have fed cataloging data, via a CRT terminal, into the OCLC system, which provides (1) catalog cards, received in computer filing order; (2) book card, spine, and pocket labels; (3) accessions lists; and (4) data for eventual production of book catalogs and union catalogs. The experience of the center in the development, implementation, operation, and budgeting of its shared cataloging service is discussed.

  3. Acoustic Source Bearing Estimation (ASBE) computer program development

    NASA Technical Reports Server (NTRS)

    Wiese, Michael R.

    1987-01-01

    This report describes a new bearing estimation algorithm (Acoustic Source Analysis Technique - ASAT) and an acoustic analysis computer program (Acoustic Source Bearing Estimation - ASBE) developed by Computer Sciences Corporation for NASA Langley Research Center. The ASBE program is used by the Acoustics Division/Applied Acoustics Branch and the Instrument Research Division/Electro-Mechanical Instrumentation Branch to analyze acoustic data and estimate the azimuths from which the source signals radiated. Included are the input and output from a benchmark test case.
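
    The abstract does not spell out the ASAT algorithm itself, so as a generic point of reference only, here is a minimal two-sensor bearing estimate via cross-correlation: the inter-sensor time delay is found at the correlation peak and converted to an azimuth. The sensor spacing, sample rate, and test signal are invented for this sketch.

```python
import numpy as np

def bearing_from_pair(sig_a, sig_b, fs, spacing, c=343.0):
    """Azimuth in radians (0 = broadside) from the delay between two sensors."""
    corr = np.correlate(sig_a, sig_b, mode="full")
    lag = np.argmax(corr) - (len(sig_b) - 1)  # delay of a relative to b, samples
    delay = lag / fs                          # seconds
    sin_theta = np.clip(c * delay / spacing, -1.0, 1.0)
    return np.arcsin(sin_theta)

fs = 48000.0
t = np.arange(0, 0.05, 1.0 / fs)
sig_b = np.sin(2 * np.pi * 440 * t) * np.exp(-t / 0.01)  # synthetic ping
sig_a = np.roll(sig_b, 12)                               # 12-sample arrival delay
print(np.degrees(bearing_from_pair(sig_a, sig_b, fs, spacing=0.5)))  # ~9.9 deg
```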

  4. Solving the "Hidden Line" Problem

    NASA Technical Reports Server (NTRS)

    1984-01-01

    David Hedgley Jr., a mathematician at Dryden Flight Research Center, has developed an accurate computer program that determines whether a line in a graphic model of a three-dimensional object should or should not be visible. The Hidden Line Computer Code automatically removes superfluous lines and permits the computer to display an object from specific viewpoints, just as the human eye would see it. Users include the Rowland Institute for Science in Cambridge, MA, several departments of Lockheed Georgia Co., and Nebraska Public Power District (NPPD).
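
    As a simplified illustration of the visibility question such a program answers, the sketch below implements only a back-face test (edges of faces turned away from the viewer cannot be visible); this is one ingredient of hidden-line removal, not Hedgley's actual algorithm, and the geometry is invented.

```python
import numpy as np

def face_is_visible(v0, v1, v2, view_dir):
    """Front-facing if the outward normal (counterclockwise winding) opposes the view ray."""
    normal = np.cross(v1 - v0, v2 - v0)
    return float(np.dot(normal, view_dir)) < 0.0

view_dir = np.array([0.0, 0.0, -1.0])  # camera looking down the -z axis
front = [np.array(p, dtype=float) for p in ([0, 0, 1], [1, 0, 1], [0, 1, 1])]
back = [np.array(p, dtype=float) for p in ([0, 0, 0], [0, 1, 0], [1, 0, 0])]
print(face_is_visible(*front, view_dir))  # True: normal points at the viewer
print(face_is_visible(*back, view_dir))   # False: its edges can be hidden
```

    A full hidden-line program must additionally test edges of front-facing surfaces against occlusion by other surfaces, which is where most of the geometric work lies.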

  5. Education and Outreach Programs Offered by the Center for High Pressure Research and the Consortium for Materials Properties Research in Earth Sciences

    NASA Astrophysics Data System (ADS)

    Richard, G. A.

    2003-12-01

    Major research facilities and organizations provide an effective venue for developing partnerships with educational organizations in order to offer a wide variety of educational programs, because they constitute a base where the culture of scientific investigation can flourish. The Consortium for Materials Properties Research in Earth Sciences (COMPRES) conducts education and outreach programs through the Earth Science Educational Resource Center (ESERC), in partnership with other groups that offer research and education programs. ESERC initiated its development of education programs in 1994 under the administration of the Center for High Pressure Research (CHiPR), which was funded as a National Science Foundation Science and Technology Center from 1991 to 2002. Programs developed during ESERC's association with CHiPR and COMPRES have targeted a wide range of audiences, including pre-K, K-12 students and teachers, undergraduates, and graduate students. Since 1995, ESERC has offered inquiry-based programs to Project WISE (Women in Science and Engineering) students at a high school and undergraduate level. Activities have included projects that investigated earthquakes, high pressure mineral physics, and local geology. Through a practicum known as Project Java, undergraduate computer science students have developed interactive instructional tools for several of these activities. For K-12 teachers, a course on Long Island geology is offered each fall, which includes an examination of the role that processes in the Earth's interior have played in the geologic history of the region. ESERC has worked with Stony Brook's Department of Geosciences faculty to offer courses on natural hazards, computer modeling, and field geology to undergraduate students, and on computer programming for graduate students. Each summer, a four-week residential college-level environmental geology course is offered to rising tenth graders from the Brentwood, New York schools in partnership with Stony Brook's Department of Technology and Society. During the academic year, a college-level Earth science course is offered to tenth graders from Sayville, New York. In both programs, students conduct research projects as one of their primary responsibilities. In collaboration with the Museum of Long Island Natural Sciences on the Stony Brook campus, two programs have been developed that enable visiting K-12 school classes to investigate earthquakes and phenomena that operate in the Earth's deep interior. From 1997 to 1999, the weekly activity-based Science Enrichment for the Early Years (SEEY) program, focusing on common Earth materials and fundamental Earth processes, was conducted at a local pre-K school. Since 2002, ESERC has worked with the Digital Library for Earth System Education (DLESE) to organize the Skills Workshops for their Annual Meeting and with EarthScope for the development of their Education and Outreach Program Plan. Future education programs and tools developed through COMPRES partnerships will place an increased emphasis on deep Earth materials and phenomena.

  6. Dan Goldin Presentation: Pathway to the Future

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the "Path to the Future" presentation held at NASA's Langley Center on March 31, 1999, NASA's Administrator Daniel S. Goldin outlined the future direction and strategies of NASA in relation to the general space exploration enterprise. NASA's Vision, Future System Characteristics, Evolutions of Engineering, and Revolutionary Changes are the four main topics of the presentation. In part one, the Administrator talks in detail about NASA's vision in relation to the NASA Strategic Activities that are Space Science, Earth Science, Human Exploration, and Aeronautics & Space Transportation. Topics discussed in this section include: space science for the 21st century, flying in mars atmosphere (mars plane), exploring new worlds, interplanetary internets, earth observation and measurements, distributed information-system-in-the-sky, science enabling understanding and application, space station, microgravity, science and exploration strategies, human mars mission, advance space transportation program, general aviation revitalization, and reusable launch vehicles. In part two, he briefly talks about the future system characteristics. He discusses major system characteristics like resiliencey, self-sufficiency, high distribution, ultra-efficiency, and autonomy and the necessity to overcome any distance, time, and extreme environment barriers. Part three of Mr. Goldin's talk deals with engineering evolution, mainly evolution in the Computer Aided Design (CAD)/Computer Aided Engineering (CAE) systems. These systems include computer aided drafting, computerized solid models, virtual product development (VPD) systems, networked VPD systems, and knowledge enriched networked VPD systems. In part four, the last part, the Administrator talks about the need for revolutionary changes in communication and networking areas of a system. According to the administrator, the four major areas that need cultural changes in the creativity process are human-centered computing, an infrastructure for distributed collaboration, rapid synthesis and simulation tools, and life-cycle integration and validation. Mr. Goldin concludes his presentation with the following maxim "Collaborate, Integrate, Innovate or Stagnate and Evaporate." He also answers some questions after the presentation.

  7. The Three-Pronged Approach to Community Education: An Ongoing Hydrologic Science Outreach Campaign Directed from a University Research Center

    NASA Astrophysics Data System (ADS)

    Gallagher, L.; Morse, M.; Maxwell, R. M.

    2017-12-01

    The Integrated GroundWater Modeling Center (IGWMC) at Colorado School of Mines has, over the past three years, developed a community outreach program focusing on hydrologic science education, targeting K-12 teachers and students, and providing experiential learning for undergraduate and graduate students. During this time, the programs led by the IGWMC reached approximately 7500 students, teachers, and community members along the Colorado Front Range. An educational campaign of this magnitude for a small (2 full-time employees, 4 PIs) research center required restructuring and modularizing of the outreach strategy. We refined our approach to include three main "modules" of delivery. First: grassroots education delivery in the form of K-12 classroom visits, science fairs, and teacher workshops. Second: content development in the form of lesson plans for K-12 classrooms and STEM camps, hands-on physical and computer model activities, and long-term citizen science partnerships. Lastly: providing education/outreach experiences for undergraduate and graduate student volunteers, training them via a 3-credit honors course, and instilling the importance of effective science communication skills. Here we present specific case studies and examples of the successes and failures of our three-pronged system, future developments, and suggestions for entities newly embarking on an earth science education outreach campaign.

  8. Crosscut report: Exascale Requirements Reviews, March 9–10, 2017 – Tysons Corner, Virginia. An Office of Science review sponsored by: Advanced Scientific Computing Research, Basic Energy Sciences, Biological and Environmental Research, Fusion Energy Sciences, High Energy Physics, Nuclear Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerber, Richard; Hack, James; Riley, Katherine

    The mission of the U.S. Department of Energy Office of Science (DOE SC) is the delivery of scientific discoveries and major scientific tools to transform our understanding of nature and to advance the energy, economic, and national security missions of the United States. To achieve these goals in today's world requires investments in not only the traditional scientific endeavors of theory and experiment, but also in computational science and the facilities that support large-scale simulation and data analysis. The Advanced Scientific Computing Research (ASCR) program addresses these challenges in the Office of Science. ASCR's mission is to discover, develop, and deploy computational and networking capabilities to analyze, model, simulate, and predict complex phenomena important to DOE. ASCR supports research in computational science, three high-performance computing (HPC) facilities — the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory and Leadership Computing Facilities at Argonne (ALCF) and Oak Ridge (OLCF) National Laboratories — and the Energy Sciences Network (ESnet) at Berkeley Lab. ASCR is guided by science needs as it develops research programs, computers, and networks at the leading edge of technologies. As we approach the era of exascale computing, technology changes are creating challenges for science programs in SC for those who need to use high performance computing and data systems effectively. Numerous significant modifications to today's tools and techniques will be needed to realize the full potential of emerging computing systems and other novel computing architectures. To assess these needs and challenges, ASCR held a series of Exascale Requirements Reviews in 2015–2017, one with each of the six SC program offices, and a subsequent Crosscut Review that sought to integrate the findings from each. Participants at the reviews were drawn from the communities of leading domain scientists, experts in computer science and applied mathematics, ASCR facility staff, and DOE program managers in ASCR and the respective program offices. The purpose of these reviews was to identify mission-critical scientific problems within the DOE Office of Science (including experimental facilities) and determine the requirements for the exascale ecosystem that would be needed to address those challenges. The exascale ecosystem includes exascale computing systems, high-end data capabilities, efficient software at scale, libraries, tools, and other capabilities. This effort will contribute to the development of a strategic roadmap for ASCR compute and data facility investments and will help the ASCR Facility Division establish partnerships with Office of Science stakeholders. It will also inform the Office of Science research needs and agenda. The results of the six reviews have been published in reports available on the web at http://exascaleage.org/. This report presents a summary of the individual reports and of common and crosscutting findings, and it identifies opportunities for productive collaborations among the DOE SC program offices.

  9. Applying a Qualitative Modeling Shell to Process Diagnosis: The Caster System.

    DTIC Science & Technology

    1986-03-01

    Report documentation fragment: "Applying a Qualitative Modeling Shell to Process Diagnosis: The Caster System," by Timothy F. Thompson (Westinghouse R&D Center) and William J. Clancey (Department of Computer Science, Stanford University, Stanford, CA 94303).

  10. Performance assessment of KORAT-3D on the ANL IBM-SP computer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alexeyev, A.V.; Zvenigorodskaya, O.A.; Shagaliev, R.M.

    1999-09-01

    The TENAR code is currently being developed at the Russian Federal Nuclear Center (VNIIEF) as a coupled dynamics code for the simulation of transients in VVER and RBMK systems and other nuclear systems. The neutronic module in this code system is KORAT-3D. This module is also one of the most computationally intensive components of the code system. A parallel version of KORAT-3D has been implemented to achieve the goal of obtaining transient solutions in reasonable computational time, particularly for RBMK calculations that involve the application of >100,000 nodes. An evaluation of the KORAT-3D code performance was recently undertaken on the Argonne National Laboratory (ANL) IBM ScalablePower (SP) parallel computer located in the Mathematics and Computer Science Division of ANL. At the time of the study, the ANL IBM-SP computer had 80 processors. This study was conducted under the auspices of a technical staff exchange program sponsored by the International Nuclear Safety Center (INSC).

  11. ERB master archival tape specification no. T 134081 ERB MAT, revision 1

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The Earth Radiation Budget (ERB) MAT tapes are generated by the ERB MATGEN software using the IBM 3081 computer system operated by the Science and Applications Computer Center at Goddard Space Flight Center. All MATs are 9-track, and MAT data are in ascending time order. The gross tape format for NIMBUS year-1 and year-2 MATs differs from the format of MATs starting with year-3: the MATs from the first two years contain one day's worth of data, while all later MATs contain multiple days' worth of data stacked onto the tapes.

  12. Inverse Design: Playing "Jeopardy" in Materials Science (A "Life at the Frontiers of Energy Research" contest entry from the 2011 Energy Frontier Research Centers (EFRCs) Summit and Forum)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zunger, Alex

    "Inverse Design: Playing 'Jeopardy' in Materials Science" was submitted by the Center for Inverse Design (CID) to the "Life at the Frontiers of Energy Research" video contest at the 2011 Science for Our Nation's Energy Future: Energy Frontier Research Centers (EFRCs) Summit and Forum. Twenty-six EFRCs created short videos to highlight their mission and their work. CID, an EFRC directed by Bill Tumas at the National Renewable Energy Laboratory is a partnership of scientists from six institutions: NREL (lead), Northwestern University, University of Colorado, Colorado School of Mines, Stanford University, and Oregon State University. The Office of Basic Energy Sciencesmore » in the U.S. Department of Energy's Office of Science established the 46 Energy Frontier Research Centers (EFRCs) in 2009. These collaboratively-organized centers conduct fundamental research focused on 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The overall purpose is to accelerate scientific progress toward meeting the nation's critical energy challenges. The mission of the Center for Inverse Design is 'to replace trial-and-error methods used in the development of materials for solar energy conversion with an inverse design approach powered by theory and computation.' Research topics are: solar photovoltaic, photonic, metamaterial, defects, spin dynamics, matter by design, novel materials synthesis, and defect tolerant materials.« less

  13. PREPARING FOR EXASCALE: ORNL Leadership Computing Application Requirements and Strategy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joubert, Wayne; Kothe, Douglas B; Nam, Hai Ah

    2009-12-01

    In 2009 the Oak Ridge Leadership Computing Facility (OLCF), a U.S. Department of Energy (DOE) facility at the Oak Ridge National Laboratory (ORNL) National Center for Computational Sciences (NCCS), elicited petascale computational science requirements from leading computational scientists in the international science community. This effort targeted science teams whose projects received large computer allocation awards on OLCF systems. A clear finding of this process was that in order to reach their science goals over the next several years, multiple projects will require computational resources in excess of an order of magnitude more powerful than those currently available. Additionally, for the longer term, next-generation science will require computing platforms of exascale capability in order to reach DOE science objectives over the next decade. It is generally recognized that achieving exascale in the proposed time frame will require disruptive changes in computer hardware and software. Processor hardware will become necessarily heterogeneous and will include accelerator technologies. Software must undergo the concomitant changes needed to extract the available performance from this heterogeneous hardware. This disruption promises to be substantial, not unlike the change to the message passing paradigm in the computational science community over 20 years ago. Since technological disruptions take time to assimilate, we must aggressively embark on this course of change now, to ensure that science applications and their underlying programming models are mature and ready when exascale computing arrives. This includes initiation of application readiness efforts to adapt existing codes to heterogeneous architectures, support of relevant software tools, and procurement of next-generation hardware testbeds for porting and testing codes. The 2009 OLCF requirements process identified numerous actions necessary to meet this challenge: (1) Hardware capabilities must be advanced on multiple fronts, including peak flops, node memory capacity, interconnect latency, interconnect bandwidth, and memory bandwidth. (2) Effective parallel programming interfaces must be developed to exploit the power of emerging hardware. (3) Science application teams must now begin to adapt and reformulate application codes to the new hardware and software, typified by hierarchical and disparate layers of compute, memory and concurrency. (4) Algorithm research must be realigned to exploit this hierarchy. (5) When possible, mathematical libraries must be used to encapsulate the required operations in an efficient and useful way. (6) Software tools must be developed to make the new hardware more usable. (7) Science application software must be improved to cope with the increasing complexity of computing systems. (8) Data management efforts must be readied for the larger quantities of data generated by larger, more accurate science models. Requirements elicitation, analysis, validation, and management comprise a difficult and inexact process, particularly in periods of technological change. Nonetheless, the OLCF requirements modeling process is becoming increasingly quantitative and actionable, as the process becomes more developed and mature, and the process this year has identified clear and concrete steps to be taken.
This report discloses (1) the fundamental science case driving the need for the next generation of computer hardware, (2) application usage trends that illustrate the science need, (3) application performance characteristics that drive the need for increased hardware capabilities, (4) resource and process requirements that make the development and deployment of science applications on next-generation hardware successful, and (5) summary recommendations for the required next steps within the computer and computational science communities.

  14. Participatory Design of Human-Centered Cyberinfrastructure (Invited)

    NASA Astrophysics Data System (ADS)

    Pennington, D. D.; Gates, A. Q.

    2010-12-01

    Cyberinfrastructure, by definition, is about people sharing resources to achieve outcomes that cannot be reached independently. CI depends not just on creating discoverable resources, or tools that allow those resources to be processed, integrated, and visualized -- but on human activation of flows of information across those resources. CI must be centered on human activities. Yet for those CI projects that are directed towards observational science, there are few models for organizing collaborative research in ways that align individual research interests into a collective vision of CI-enabled science. Given that the emerging technologies are themselves expected to change the way science is conducted, it is not simply a matter of conducting requirements analysis on how scientists currently work, or building consensus among the scientists on what is needed. Developing effective CI depends on generating a new, creative vision of problem solving within a community based on computational concepts that are, in some cases, still very abstract and theoretical. The computer science theory may (or may not) be well formalized, but the potential for impact on any particular domain is typically ill-defined. In this presentation we will describe approaches being developed and tested at the CyberShARE Center of Excellence at the University of Texas at El Paso for ill-structured problem solving within cross-disciplinary teams of scientists and computer scientists working on data-intensive environmental and geoscience research. These approaches deal with the challenges associated with sharing and integrating knowledge across disciplines; the challenges of developing effective teamwork skills in a culture that favors independent effort; and the challenges of evolving shared, focused research goals from ill-structured, vague starting points - all issues that must be confronted by every interdisciplinary CI project. We will introduce visual and semantic-based tools that can enable the collaborative research design process and illustrate their application in designing and developing useful end-to-end data solutions for scientists. Lastly, we will outline areas of future investigation within CyberShARE that we believe have the potential for high impact.

  15. University of Maryland MRSEC - Research: Seed 1

    Science.gov Websites

    University of Maryland Materials Research Science and Engineering Center, Seed 1: creating specific functional patterns. Participants include Wolfgang Losert (Physics, IPST, IREAP), Ben Shapiro (Bio-Engineering, Aerospace Engineering), and Edo Waks (Electrical & Computer Engineering, IREAP, JQI).

  16. NSSDC Data Listing

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Data available from the National Space Science Data Center (NSSDC) are listed. The spacecraft, principal investigator, experiment, and time span of the data are given. A listing is also included of ground-based data, models, computer routines, and composite spacecraft data that are available from NSSDC.

  17. Capabilities: Science Pillars

    Science.gov Websites


  18. Science Briefs

    Science.gov Websites


  19. Office of Science

    Science.gov Websites


  20. Bradbury Science Museum

    Science.gov Websites


  1. Sandia National Laboratories: National Security Missions: Nuclear Weapons

    Science.gov Websites

    Fundamental science, computer models, and unique experimental facilities come together in Sandia's nuclear weapons program.

  2. Solar System Number-Crunching.

    ERIC Educational Resources Information Center

    Albrecht, Bob; Firedrake, George

    1997-01-01

    Defines terrestrial and Jovian planets and provides directions to obtain planetary data from the National Space Science Data Center Web sites. Provides "number-crunching" activities for the terrestrial planets using Texas Instruments TI-83 graphing calculators: computing volumetric mean radius and volume, density, ellipticity, speed,…
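    The activities reduce to a few standard formulas: the volumetric mean radius of an oblate planet is (a²c)^(1/3) for equatorial radius a and polar radius c, the sphere volume is (4/3)πr³, bulk density is mass/volume, and ellipticity (flattening) is (a − c)/a. A minimal Python sketch of the same computations follows; the Earth values below are illustrative assumptions of this note, not data from the record.

        import math

        # Illustrative values for Earth (assumed for this sketch).
        a_km = 6378.1        # equatorial radius
        c_km = 6356.8        # polar radius
        mass_kg = 5.972e24

        # Volumetric mean radius: radius of the sphere with the ellipsoid's volume.
        r_mean_km = (a_km ** 2 * c_km) ** (1.0 / 3.0)

        # Sphere volume, then bulk density = mass / volume (1 km^3 = 1e9 m^3).
        volume_km3 = (4.0 / 3.0) * math.pi * r_mean_km ** 3
        density_kg_m3 = mass_kg / (volume_km3 * 1e9)

        # Ellipticity (flattening): (a - c) / a.
        ellipticity = (a_km - c_km) / a_km

        print(f"mean radius {r_mean_km:.1f} km, density {density_kg_m3:.0f} kg/m^3, "
              f"ellipticity {ellipticity:.5f}")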

  3. 77 FR 59938 - Center for Scientific Review Notice of Closed Meetings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-01

    ... Panel; Program Project: Drug Addiction. Date: October 30-31, 2012. Time: 8:00 a.m. to 8:30 p.m. Agenda... Biomedical Computational Science and Technology Initiative. Date: October 30, 2012. Time: 3:00 p.m. to 4:00 p...

  4. Galen Maclaurin | NREL

    Science.gov Websites

    Scientific programming and high-performance computing. Research interests: wind and solar resource assessment. Background: Department of Geography and Environmental Sciences, Denver, CO; Research Assistant, National Center for Atmospheric Research (NCAR), Boulder, CO; Graduate Instructor and Research Assistant, University of Colorado.

  5. Ammonia Oxidation by Abstraction of Three Hydrogen Atoms from a Mo–NH3 Complex

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhattacharya, Papri; Heiden, Zachariah M.; Wiedner, Eric S.

    We report ammonia oxidation by homolytic cleavage of all three H atoms from a Mo-15NH3 complex using the 2,4,6-tri-tert-butylphenoxyl radical to afford a Mo-alkylimido (Mo=15NR) complex (R = 2,4,6-tri-t-butylcyclohexa-2,5-dien-1-one). Reductive cleavage of Mo=15NR generates a terminal Mo≡N nitride, and a [Mo-15NH]+ complex is formed by protonation. Computational analysis describes the energetic profile for the stepwise removal of three H atoms from the Mo-15NH3 complex and the formation of Mo=15NR. Acknowledgment. This work was supported as part of the Center for Molecular Electrocatalysis, an Energy Frontier Research Center funded by the U.S. Department of Energy (U.S. DOE), Office of Science, Office of Basic Energy Sciences. EPR and mass spectrometry experiments were performed using EMSL, a national scientific user facility sponsored by the DOE’s Office of Biological and Environmental Research and located at PNNL. The authors thank Dr. Eric D. Walter and Dr. Rosalie Chu for assistance in performing EPR and mass spectrometry analysis, respectively. Computational resources were provided by the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory. Pacific Northwest National Laboratory is operated by Battelle for the U.S. DOE.

  6. Contention Bounds for Combinations of Computation Graphs and Network Topologies

    DTIC Science & Technology

    2014-08-08

    member of STARnet, a Semiconductor Research Corporation program sponsored by MARCO and DARPA, and ASPIRE Lab industrial sponsors and affiliates Intel...Google, Nokia, NVIDIA, Oracle, MathWorks and Samsung. Also funded by U.S. DOE Office of Science, Office of Advanced Scientific Computing Research...DARPA Award Number HR0011-12-2-0016, the Center for Future Architecture Research, a member of STARnet, a Semiconductor Research Corporation

  7. Communications and Computers in the 21st Century. Hearing before the Technology Policy Task Force of the Committee on Science, Space, and Technology. House of Representatives, One Hundredth Congress, First Session.

    ERIC Educational Resources Information Center

    Congress of the U.S., Washington, DC. House Committee on Science, Space and Technology.

    Based upon the premise that manufacturing, communications, and computers are the key to productivity, this hearing before the Technology Policy Task Force was held to examine how the federal government interacts with universities, engineering research centers, professional associations, and private businesses in these areas. This document contains…

  8. A Computer Learning Center for Environmental Sciences

    NASA Technical Reports Server (NTRS)

    Mustard, John F.

    2000-01-01

    In the fall of 1998, MacMillan Hall opened at Brown University to students. In MacMillan Hall was the new Computer Learning Center, since named the EarthLab, which was outfitted with high-end workstations and peripherals primarily focused on the use of remotely sensed and other spatial data in the environmental sciences. The NASA grant we received as part of the "Centers of Excellence in Applications of Remote Sensing to Regional and Global Integrated Environmental Assessments" was the primary source of funds to outfit this learning and research center. Since opening, we have expanded the range of learning and research opportunities and integrated a cross-campus network of disciplines whose members have come together to learn and use spatial data of all kinds. The EarthLab also forms a core of undergraduate, graduate, and faculty research on environmental problems that draws upon the unique perspective of remotely sensed data. Over the last two years, the EarthLab has been a center for research on the environmental impact of water resource use in arid regions, the impact of the green revolution on forest cover in India, the design of forest preserves in Vietnam, and detailed assessments of the utility of thermal and hyperspectral data for water quality analysis. It has also been used extensively for local environmental activities, in particular studies on the impact of lead on the health of urban children in Rhode Island. Finally, the EarthLab has also served as a key educational and analysis center for activities related to the Brown University Affiliated Research Center that is devoted to transferring university research to the private sector.

  9. Large-scale Labeled Datasets to Fuel Earth Science Deep Learning Applications

    NASA Astrophysics Data System (ADS)

    Maskey, M.; Ramachandran, R.; Miller, J.

    2017-12-01

    Deep learning has revolutionized computer vision and natural language processing with various algorithms scaled using high-performance computing. However, generic large-scale labeled datasets such as the ImageNet are the fuel that drives the impressive accuracy of deep learning results. Large-scale labeled datasets already exist in domains such as medical science, but creating them in the Earth science domain is a challenge. While there are ways to apply deep learning using limited labeled datasets, there is a need in the Earth sciences for creating large-scale labeled datasets for benchmarking and scaling deep learning applications. At the NASA Marshall Space Flight Center, we are using deep learning for a variety of Earth science applications where we have encountered the need for large-scale labeled datasets. We will discuss our approaches for creating such datasets and why these datasets are just as valuable as deep learning algorithms. We will also describe successful usage of these large-scale labeled datasets with our deep learning based applications.

  10. ACToR-Aggregated Computational Resource | Science ...

    EPA Pesticide Factsheets

    ACToR (Aggregated Computational Toxicology Resource) is a database and set of software applications that bring into one central location many types and sources of data on environmental chemicals. Currently, the ACToR chemical database contains information on chemical structure, in vitro bioassays and in vivo toxicology assays derived from more than 150 sources including the U.S. Environmental Protection Agency (EPA), Centers for Disease Control (CDC), U.S. Food & Drug Administration (FDA), National Institutes of Health (NIH), state agencies, corresponding government agencies in Canada, Europe and Japan, universities, the World Health Organization (WHO) and non-governmental organizations (NGOs). At the EPA National Center for Computational Toxicology, ACToR helps manage large data sets being used in a high throughput environmental chemical screening and prioritization program called ToxCast(TM).
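    As a rough illustration of the aggregation pattern (the record does not give the actual ACToR schema, so every table and column name below is hypothetical), a central chemical table lets assay records from many sources be queried through one join:

        import sqlite3

        # Hypothetical, simplified schema: many per-source assay records
        # hang off one central chemical table (not the real ACToR schema).
        conn = sqlite3.connect(":memory:")
        conn.executescript("""
            CREATE TABLE chemical (
                id    INTEGER PRIMARY KEY,
                casrn TEXT UNIQUE,    -- CAS registry number
                name  TEXT
            );
            CREATE TABLE source (
                id   INTEGER PRIMARY KEY,
                name TEXT             -- e.g. 'EPA', 'CDC', 'FDA'
            );
            CREATE TABLE assay_result (
                chemical_id INTEGER REFERENCES chemical(id),
                source_id   INTEGER REFERENCES source(id),
                assay_type  TEXT,     -- 'in vitro' or 'in vivo'
                value       REAL
            );
        """)

        # One query then yields a cross-source view of a single chemical.
        rows = conn.execute("""
            SELECT c.name, s.name, a.assay_type, a.value
            FROM assay_result a
            JOIN chemical c ON c.id = a.chemical_id
            JOIN source   s ON s.id = a.source_id
            WHERE c.casrn = ?
        """, ("80-05-7",)).fetchall()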

  11. Integration Of PanDA Workload Management System With Supercomputers for ATLAS and Data Intensive Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    De, K; Jha, S; Klimentov, A

    2016-01-01

    The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe, and were recently credited for the discovery of a Higgs boson. ATLAS, one of the largest collaborations ever assembled in the sciences, is at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, the ATLAS experiment is relying on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses the PanDA (Production and Data Analysis) Workload Management System for managing the workflow for all data processing on over 150 data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. While PanDA currently uses more than 250,000 cores with a peak performance of 0.3 petaFLOPS, LHC data taking runs require more resources than Grid computing can possibly provide. To alleviate these challenges, LHC experiments are engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We will describe a project aimed at integration of PanDA WMS with supercomputers in the United States, Europe, and Russia (in particular with the Titan supercomputer at the Oak Ridge Leadership Computing Facility (OLCF), the MIRA supercomputer at the Argonne Leadership Computing Facility (ALCF), the supercomputer at the National Research Center Kurchatov Institute, IT4 in Ostrava, and others). The current approach utilizes a modified PanDA pilot framework for job submission to the supercomputers' batch queues and local data management, with light-weight MPI wrappers to run single-threaded workloads in parallel on the LCFs' multi-core worker nodes. This implementation was tested with a variety of Monte-Carlo workloads on several supercomputing platforms for the ALICE and ATLAS experiments and has been in full production for the ATLAS experiment since September 2015. We will present our current accomplishments with running PanDA WMS at supercomputers and demonstrate our ability to use PanDA as a portal independent of the computing facilities infrastructure for High Energy and Nuclear Physics as well as other data-intensive science applications, such as bioinformatics and astro-particle physics.
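    The light-weight MPI wrapper idea can be sketched in a few lines: one MPI job goes through the batch queue, and each rank pulls its share of independent single-threaded payloads. This is a simplified illustration assuming mpi4py; the worklist file and payload commands are hypothetical stand-ins for what the PanDA pilot would actually supply.

        from mpi4py import MPI
        import subprocess

        comm = MPI.COMM_WORLD
        rank = comm.Get_rank()
        size = comm.Get_size()

        # Each line is one independent single-threaded payload command
        # (hypothetical file; in PanDA the pilot would provide the jobs).
        with open("worklist.txt") as f:
            jobs = [line.strip() for line in f if line.strip()]

        # Round-robin the payloads over ranks: rank r runs jobs r, r+size, ...
        failures = 0
        for job in jobs[rank::size]:
            if subprocess.run(job, shell=True).returncode != 0:
                failures += 1

        # Collect a simple failure count on rank 0 for reporting.
        total = comm.reduce(failures, op=MPI.SUM, root=0)
        if rank == 0:
            print(f"{len(jobs)} payloads, {total} failures")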

  12. USRA/RIACS

    NASA Technical Reports Server (NTRS)

    Oliger, Joseph

    1992-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on 6 June 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under a cooperative agreement with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. A flexible scientific staff is provided through a university faculty visitor program, a post doctoral program, and a student visitor program. Not only does this provide appropriate expertise but it also introduces scientists outside of NASA to NASA problems. A small group of core RIACS staff provides continuity and interacts with an ARC technical monitor and scientific advisory group to determine the RIACS mission. RIACS activities are reviewed and monitored by a USRA advisory council and ARC technical monitor. Research at RIACS is currently being done in the following areas: Parallel Computing; Advanced Methods for Scientific Computing; Learning Systems; High Performance Networks and Technology; Graphics, Visualization, and Virtual Environments.

  13. Computer ethics education: Impact from societal norms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, G.B.

    1994-12-31

    Discussions have occurred on the best way to implement the horizontal and vertical integration of education on the social, ethical, and professional issues relating to computer science. These discussions have included not only debates on the subject matter and how to approach it (i.e., integrated among all computer science courses taught, as a separate required course, or a combination of both), but also debates over who is best qualified to address the subject. What has seldom been addressed, however, is how societal impressions of what is ethical have affected both those who develop software and those who use it. In light of the experience of such institutions as the U.S. Air Force Academy, which recently instituted a program called the Center for Character Development (due to a perceived erosion of the core values of its recruits), should academia and industry expect more from computer scientists than from the population as a whole? It is the integration of ethics courses into the computer science curriculum, in light of a general erosion of ethical values in society as a whole, that is addressed in this paper.

  14. Plasma Science and Innovation Center (PSI-Center) at Washington, Wisconsin, and Utah State, ARRA Supplement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sovinec, Carl

    The objective of the Plasma Science and Innovation Center (PSI-Center) is to develop and deploy computational models that simulate conditions in smaller, concept-exploration plasma experiments. The PSIC group at the University of Wisconsin-Madison, led by Prof. Carl Sovinec, uses and enhances the Non-Ideal Magnetohydrodynamics with Rotation, Open Discussion (NIMROD) code to simulate macroscopic plasma dynamics in a number of magnetic confinement configurations. These numerical simulations provide information on how magnetic fields and plasma flows evolve over all three spatial dimensions, which supplements the limited access of diagnostics in plasma experiments. The information gained from simulation helps explain how plasma evolves. It is also used to engineer more effective plasma confinement systems, reducing the need for building many experiments to cover the physical parameter space. The ultimate benefit is a more cost-effective approach to the development of fusion energy for peaceful power production. The supplemental funds provided by the American Recovery and Reinvestment Act of 2009 were used to purchase computer components that were assembled into a 48-core system with 256 GB of shared memory. The system was engineered and constructed by the group's system administrator at the time, Anthony Hammond. It was successfully used by then-graduate student Dr. John O'Bryan for computing magnetic relaxation dynamics that occur during experimental tests of non-inductive startup in the Pegasus Toroidal Experiment (pegasus.ep.wisc.edu). Dr. O'Bryan's simulations provided the first detailed explanation of how the driven helical filament of electrical current evolves into a toroidal tokamak-like plasma configuration.

  15. NASA Lewis' IITA K-12 Program

    NASA Technical Reports Server (NTRS)

    1996-01-01

    The NASA Lewis Research Center's Information Infrastructure Technology and Applications for Kindergarten to 12th Grade (IITA K-12) Program is designed to introduce into school systems computing and communications technology that benefits math and science studies. By incorporating this technology into K-12 curriculums, we hope to increase the proficiency and interest in math and science subjects by K-12 students so that they continue to study technical subjects after their high school careers are over.

  16. The SGI/CRAY T3E: Experiences and Insights

    NASA Technical Reports Server (NTRS)

    Bernard, Lisa Hamet

    1999-01-01

    The focus of the HPCC Earth and Space Sciences (ESS) Project is capability computing - pushing highly scalable computing testbeds to their performance limits. The drivers of this focus are the Grand Challenge problems in Earth and space science: those that could not be addressed in a capacity computing environment where large jobs must continually compete for resources. These Grand Challenge codes require a high degree of communication, large memory, and very large I/O (throughout the duration of the processing, not just in loading initial conditions and saving final results). This set of parameters led to the selection of an SGI/Cray T3E as the current ESS Computing Testbed. The T3E at the Goddard Space Flight Center is a unique computational resource within NASA. As such, it must be managed to effectively support the diverse research efforts across the NASA research community yet still enable the ESS Grand Challenge Investigator teams to achieve their performance milestones, for which the system was intended. To date, all Grand Challenge Investigator teams have achieved the 10 GFLOPS milestone, eight of nine have achieved the 50 GFLOPS milestone, and three have achieved the 100 GFLOPS milestone. In addition, many technical papers have been published highlighting results achieved on the NASA T3E, including some at this Workshop. The successes enabled by the NASA T3E computing environment are best illustrated by the 512 PE upgrade funded by the NASA Earth Science Enterprise earlier this year. Never before has an HPCC computing testbed been so well received by the general NASA science community that it was deemed critical to the success of a core NASA science effort. NASA looks forward to many more success stories before the conclusion of the NASA-SGI/Cray cooperative agreement in June 1999.

  17. Bridging the PSI Knowledge Gap: A Multi-Scale Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wirth, Brian D.

    2015-01-08

    Plasma-surface interactions (PSI) pose an immense scientific hurdle in magnetic confinement fusion, and our present understanding of PSI in confinement environments is highly inadequate; indeed, a recent Fusion Energy Sciences Advisory Committee report found that 4 of the top 5 fusion knowledge gaps were related to PSI. The time is appropriate to develop a concentrated and synergistic science effort that would expand, exploit and integrate the wealth of laboratory ion-beam and plasma research, as well as exciting new computational tools, towards the goal of bridging the PSI knowledge gap. This effort would broadly advance plasma and material sciences, while providing critical knowledge towards progress in fusion PSI. This project involves the development of a Science Center focused on a new approach to PSI science: an approach that both exploits access to state-of-the-art PSI experiments and modeling, as well as confinement devices. The organizing principle is to develop synergistic experimental and modeling tools that treat the truly coupled multi-scale aspect of the PSI issues in confinement devices. This is motivated by the simple observation that while typical lab experiments and models allow independent manipulation of controlling variables, the confinement PSI environment is essentially self-determined with few outside controls. This means that processes that may be treated independently in laboratory experiments, because they involve vastly different physical and time scales, will now affect one another in the confinement environment. Also, lab experiments cannot simultaneously match all exposure conditions found in confinement devices, typically forcing a linear extrapolation of lab results. At the same time, programmatic limitations prevent confinement experiments alone from answering many key PSI questions. The resolution to this problem is to usefully exploit access to PSI science in lab devices, while retooling our thinking from a linear and de-coupled extrapolation to a multi-scale, coupled approach. The PSI Plasma Center consisted of three equal co-centers: one located at the MIT Plasma Science and Fusion Center, one at the UC San Diego Center for Energy Research, and one at the UC Berkeley Department of Nuclear Engineering, which moved to the University of Tennessee, Knoxville (UTK) with Professor Brian Wirth in July 2010. The Center had three co-directors: Prof. Dennis Whyte led the MIT co-center, the UCSD co-center was led by Dr. Russell Doerner, and Prof. Brian Wirth led the UCB/UTK center. The directors have extensive experience in PSI and material research, and have been internationally recognized in the magnetic fusion, materials and plasma research fields. The co-centers feature keystone PSI experimental and modeling facilities dedicated to PSI science: the DIONISOS/CLASS facility at MIT, the PISCES facility at UCSD, and the state-of-the-art numerical modeling capabilities at UCB/UTK. A collaborative partner in the center is Sandia National Laboratory at Livermore (SNL/CA), which has extensive capabilities with low energy ion beams and surface diagnostics, as well as supporting plasma facilities, including the Tritium Plasma Experiment, all of which significantly augment the Center. Interpretive, continuum material models are available through SNL/CA, UCSD and MIT. The participating institutions of MIT, UCSD, UCB/UTK, SNL/CA and LLNL brought a formidable array of experimental tools and personnel abilities into the PSI Plasma Center.
Our work focused on modeling activities associated with the effects of He and H plasma bombardment on tungsten surfaces, performing computational materials modeling of the surface evolution during plasma bombardment using molecular dynamics. The principal outcomes of the research efforts within the combined experimental and modeling PSI center are a knowledge base of the mechanisms of surface degradation and of the influence of the surface on plasma conditions.

  18. Greater Philadelphia Bioinformatics Alliance (GPBA) 3rd Annual Retreat 2005

    DTIC Science & Technology

    2005-11-01

    Using Probabilistic Network Reliability. Genome Res. 14:1170-1175 [27] Batagelj, V. and Mrvar, A. (1998). Pajek: Program for large network analysis...and Neural Computation, Division of Informatics, Centre for Cognitive Science, University of Edinburgh, Scotland, April 1996. www.anc.ed.ac.uk/-mjo...research center in the College of Engineering, and one of the foremost academic research centers in its field. From 1996 to 1998 he was the Founding

  19. A PICKSC Science Gateway for enabling the common plasma physicist to run kinetic software

    NASA Astrophysics Data System (ADS)

    Hu, Q.; Winjum, B. J.; Zonca, A.; Youn, C.; Tsung, F. S.; Mori, W. B.

    2017-10-01

    Computer simulations offer tremendous opportunities for studying plasmas, ranging from simulations for students that illuminate fundamental educational concepts to research-level simulations that advance scientific knowledge. Nevertheless, there is a significant hurdle to using simulation tools. Users must navigate codes and software libraries, determine how to wrangle output into meaningful plots, and oftentimes confront a significant cyberinfrastructure with powerful computational resources. Science gateways offer a Web-based environment to run simulations without needing to learn or manage the underlying software and computing cyberinfrastructure. We discuss our progress on creating a Science Gateway for the Particle-in-Cell and Kinetic Simulation Software Center that enables users to easily run and analyze kinetic simulations with our software. We envision that this technology could benefit a wide range of plasma physicists, both in the use of our simulation tools as well as in its adaptation for running other plasma simulation software. Supported by NSF under Grant ACI-1339893 and by the UCLA Institute for Digital Research and Education.
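    At its core, the gateway pattern is a thin Web layer that accepts a job request, stages the input, and relays the job to the batch system, so the user never logs into the cluster. A toy sketch assuming Flask and a Slurm-style sbatch command; the endpoint, staging path, and run_pic.sh script are invented for illustration, not taken from the PICKSC gateway.

        import subprocess
        import uuid

        from flask import Flask, jsonify, request

        app = Flask(__name__)

        @app.route("/submit", methods=["POST"])
        def submit():
            # Accept a simulation input deck from the browser.
            params = request.get_json()
            job_id = str(uuid.uuid4())
            deck = f"/tmp/{job_id}.input"   # hypothetical staging location
            with open(deck, "w") as f:
                f.write(params["input_deck"])
            # Hand the job to the scheduler; the gateway only brokers it.
            subprocess.run(["sbatch", "--job-name", job_id, "run_pic.sh", deck])
            return jsonify({"job_id": job_id})

        if __name__ == "__main__":
            app.run()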

  20. Interdisciplinary Relationships in Technical Education: The CORD Perspective.

    ERIC Educational Resources Information Center

    Hull, Daniel M.

    1990-01-01

    The director of the Center for Occupational Research and Development (CORD) suggests areas in a technical curriculum that could be improved using an interdisciplinary approach: (1) systems; (2) the electromechanical core; (3) the mathematics/science base; (4) computers; and (5) interpersonal/communication skills. (Author)

  1. Los Alamos Science Facilities

    Science.gov Websites


  2. Frontiers in Science Lectures

    Science.gov Websites


  3. Tony Magri | NREL

    Science.gov Websites

    Windows System Engineer with the Computational Science Center. He implements, supports, and integrates Windows-based technology solutions at the ESIF and manages a portion of the VMware infrastructure. Throughout his career, Tony has built a strong skillset in enterprise Windows Engineering and Active

  4. ISCR FY2005 Annual Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keyes, D E; McGraw, J R

    2006-02-02

    Large-scale scientific computation and all of the disciplines that support and help validate it have been placed at the focus of Lawrence Livermore National Laboratory (LLNL) by the Advanced Simulation and Computing (ASC) program of the National Nuclear Security Administration (NNSA) and the Scientific Discovery through Advanced Computing (SciDAC) initiative of the Office of Science of the Department of Energy (DOE). The maturation of simulation as a fundamental tool of scientific and engineering research is underscored in the President's Information Technology Advisory Committee (PITAC) June 2005 finding that "computational science has become critical to scientific leadership, economic competitiveness, and national security". LLNL operates several of the world's most powerful computers--including today's single most powerful--and has undertaken some of the largest and most compute-intensive simulations ever performed, most notably the molecular dynamics simulation that sustained more than 100 Teraflop/s and won the 2005 Gordon Bell Prize. Ultrascale simulation has been identified as one of the highest priorities in DOE's facilities planning for the next two decades. However, computers at architectural extremes are notoriously difficult to use in an efficient manner. Furthermore, each successful terascale simulation only points out the need for much better ways of interacting with the resulting avalanche of data. Advances in scientific computing research have, therefore, never been more vital to the core missions of LLNL than at present. Computational science is evolving so rapidly along every one of its research fronts that to remain on the leading edge, LLNL must engage researchers at many academic centers of excellence. In FY 2005, the Institute for Scientific Computing Research (ISCR) served as one of LLNL's main bridges to the academic community with a program of collaborative subcontracts, visiting faculty, student internships, workshops, and an active seminar series. The ISCR identifies researchers from the academic community for computer science and computational science collaborations with LLNL and hosts them for both brief and extended visits with the aim of encouraging long-term academic research agendas that address LLNL research priorities. Through these collaborations, ideas and software flow in both directions, and LLNL cultivates its future workforce. The Institute strives to be LLNL's "eyes and ears" in the computer and information sciences, keeping the Laboratory aware of and connected to important external advances. It also attempts to be the "hands and feet" that carry those advances into the Laboratory and incorporate them into practice. ISCR research participants are integrated into LLNL's Computing Applications and Research (CAR) Department, especially into its Center for Applied Scientific Computing (CASC). In turn, these organizations address computational challenges arising throughout the rest of the Laboratory. Administratively, the ISCR flourishes under LLNL's University Relations Program (URP). Together with the other four institutes of the URP, the ISCR navigates a course that allows LLNL to benefit from academic exchanges while preserving national security. While it is difficult to operate an academic-like research enterprise within the context of a national security laboratory, the results declare the challenges well met and worth the continued effort.
The pages of this annual report summarize the activities of the faculty members, postdoctoral researchers, students, and guests from industry and other laboratories who participated in LLNL's computational mission under the auspices of the ISCR during FY 2005.

  5. Materials Science Research Rack-1 (MSRR-1)

    NASA Technical Reports Server (NTRS)

    2001-01-01

    This computer-generated image depicts the Materials Science Research Rack-1 (MSRR-1) being developed by NASA's Marshall Space Flight Center and the European Space Agency (ESA) for placement in the Destiny laboratory module aboard the International Space Station. The rack is part of the planned Materials Science Research Facility (MSRF) and is expected to include two furnace module inserts, a Quench Module Insert (being developed by NASA's Marshall Space Flight Center) to study directional solidification in rapidly cooled alloys and a Diffusion Module Insert (being developed by the European Space Agency) to study crystal growth, and a transparent furnace (being developed by NASA's Space Product Development program). Multi-user equipment in the rack is being developed under the auspices of NASA's Office of Biological and Physical Research (OBPR) and ESA. Key elements are labeled in other images (0101754, 0101829, 0101830, and TBD).

  6. Materials Science Research Rack-1 (MSRR-1)

    NASA Technical Reports Server (NTRS)

    2001-01-01

    This computer-generated image depicts the Materials Science Research Rack-1 (MSRR-1) being developed by NASA's Marshall Space Flight Center and the European Space Agency (ESA) for placement in the Destiny laboratory module aboard the International Space Station. The rack is part of the planned Materials Science Research Facility (MSRF) and is expected to include two furnace module inserts, a Quench Module Insert (being developed by NASA's Marshall Space Flight Center) to study directional solidification in rapidly cooled alloys and a Diffusion Module Insert (being developed by the European Space Agency) to study crystal growth, and a transparent furnace (being developed by NASA's Space Product Development program). Multi-user equipment in the rack is being developed under the auspices of NASA's Office of Biological and Physical Research (OBPR) and ESA. A larger image is available without labels (No. 0101755).

  7. Materials Science Research Rack-1 (MSRR-1)

    NASA Technical Reports Server (NTRS)

    2001-01-01

    This computer-generated image depicts the Materials Science Research Rack-1 (MSRR-1) being developed by NASA's Marshall Space Flight Center and the European Space Agency (ESA) for placement in the Destiny laboratory module aboard the International Space Station. The rack is part of the planned Materials Science Research Facility (MSRF) and is expected to include two furnace module inserts, a Quench Module Insert (being developed by NASA's Marshall Space Flight Center) to study directional solidification in rapidly cooled alloys and a Diffusion Module Insert (being developed by the European Space Agency) to study crystal growth, and a transparent furnace (being developed by NASA's Space Product Development program). Multi-user equipment in the rack is being developed under the auspices of NASA's Office of Biological and Physical Research (OBPR) and ESA. Key elements are labeled in other images (0101754, 0101830, and TBD).

  8. Materials Science Research Rack-1 (MSRR-1)

    NASA Technical Reports Server (NTRS)

    2001-01-01

    This computer-generated image depicts the Materials Science Research Rack-1 (MSRR-1) being developed by NASA's Marshall Space Flight Center and the European Space Agency (ESA) for placement in the Destiny laboratory module aboard the International Space Station. The rack is part of the planned Materials Science Research Facility (MSRF) and is expected to include two furnace module inserts, a Quench Module Insert (being developed by NASA's Marshall Space Flight Center) to study directional solidification in rapidly cooled alloys and a Diffusion Module Insert (being developed by the European Space Agency) to study crystal growth, and a transparent furnace (being developed by NASA's Space Product Development program). Multi-user equipment in the rack is being developed under the auspices of NASA's Office of Biological and Physical Research (OBPR) and ESA. Key elements are labeled in other images (0101754, 0101829, 0101830).

  9. Facilitating NASA Earth Science Data Processing Using Nebula Cloud Computing

    NASA Technical Reports Server (NTRS)

    Pham, Long; Chen, Aijun; Kempler, Steven; Lynnes, Christopher; Theobald, Michael; Asghar, Esfandiari; Campino, Jane; Vollmer, Bruce

    2011-01-01

    Cloud Computing has been implemented in several commercial arenas. The NASA Nebula Cloud Computing platform is an Infrastructure as a Service (IaaS) built in 2008 at NASA Ames Research Center and in 2010 at GSFC. Nebula is an open source Cloud platform intended to: a) make NASA realize significant cost savings through efficient resource utilization, reduced energy consumption, and reduced labor costs; b) provide an easier way for NASA scientists and researchers to efficiently explore and share large and complex data sets; and c) allow customers to provision, manage, and decommission computing capabilities on an as-needed basis.

  10. Design and Implement of Astronomical Cloud Computing Environment In China-VO

    NASA Astrophysics Data System (ADS)

    Li, Changhua; Cui, Chenzhou; Mi, Linying; He, Boliang; Fan, Dongwei; Li, Shanshan; Yang, Sisi; Xu, Yunfei; Han, Jun; Chen, Junyi; Zhang, Hailong; Yu, Ce; Xiao, Jian; Wang, Chuanjun; Cao, Zihuang; Fan, Yufeng; Liu, Liang; Chen, Xiao; Song, Wenming; Du, Kangyu

    2017-06-01

    The astronomy cloud computing environment is a cyberinfrastructure for astronomy research initiated by the Chinese Virtual Observatory (China-VO) under funding support from the NDRC (National Development and Reform Commission) and CAS (Chinese Academy of Sciences). Based on virtualization technology, the astronomy cloud computing environment was designed and implemented by the China-VO team. It consists of five distributed nodes across the mainland of China. Astronomers can obtain computing and storage resources in this cloud computing environment. Through this environment, astronomers can easily search and analyze astronomical data collected by different telescopes and data centers, and avoid large-scale dataset transportation.

  11. Modeling Subsurface Reactive Flows Using Leadership-Class Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mills, Richard T; Hammond, Glenn; Lichtner, Peter

    2009-01-01

    We describe our experiences running PFLOTRAN - a code for simulation of coupled hydro-thermal-chemical processes in variably saturated, non-isothermal, porous media - on leadership-class supercomputers, including initial experiences running on the petaflop incarnation of Jaguar, the Cray XT5 at the National Center for Computational Sciences at Oak Ridge National Laboratory. PFLOTRAN utilizes fully implicit time-stepping and is built on top of the Portable, Extensible Toolkit for Scientific Computation (PETSc). We discuss some of the hurdles to 'at scale' performance with PFLOTRAN and the progress we have made in overcoming them on leadership-class computer architectures.
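    Fully implicit time-stepping means a nonlinear system is solved at every step; in PFLOTRAN that solve is delegated to PETSc's Newton-based solvers. The structure, reduced to a small dense toy problem in Python, looks like the following sketch (illustrative only, not PFLOTRAN code):

        import numpy as np

        def backward_euler(f, jac, u0, dt, nsteps, tol=1e-10, maxit=20):
            """Solve g(u) = u - u_old - dt*f(u) = 0 by Newton at each step."""
            u = np.array(u0, dtype=float)
            for _ in range(nsteps):
                u_old = u.copy()
                for _ in range(maxit):
                    g = u - u_old - dt * f(u)
                    if np.linalg.norm(g) < tol:
                        break
                    J = np.eye(len(u)) - dt * jac(u)   # Jacobian of g
                    u -= np.linalg.solve(J, g)
            return u

        # Toy stiff problem: du/dt = -u^3.
        f = lambda u: -u**3
        jac = lambda u: np.diag(-3.0 * u**2)
        print(backward_euler(f, jac, [1.0], dt=0.1, nsteps=100))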

  12. Conformational Dynamics and Proton Relay Positioning in Nickel Catalysts for Hydrogen Production and Oxidation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Franz, James A.; O'Hagan, Molly J.; Ho, Ming-Hsun

    2013-12-09

    The [Ni(PR2NR'2)2]2+ catalysts (where PR2NR'2 is 1,5-R'-3,7-R-1,5-diaza-3,7-diphosphacyclooctane) are some of the fastest reported for hydrogen production and oxidation; however, chair/boat isomerization and the presence of a fifth solvent ligand have the potential to slow catalysis by incorrectly positioning the pendant amines or blocking the addition of hydrogen. Here, we report the structural dynamics of a series of [Ni(PR2NR'2)2]n+ complexes, characterized by NMR spectroscopy and theoretical modeling. A fast exchange process was observed for the [Ni(CH3CN)(PR2NR'2)2]2+ complexes which depends on the ligand. This exchange process was identified to occur through a three-step mechanism including dissociation of the acetonitrile, boat/chair isomerization of each of the four rings identified by the phosphine ligands (including nitrogen inversion), and reassociation of acetonitrile on the opposite side of the complex. The rate of the chair/boat inversion can be influenced by varying the substituent on the nitrogen atom, but the rate of the overall exchange process is at least an order of magnitude faster than the catalytic rate in acetonitrile, demonstrating that the structural dynamics of the [Ni(PR2NR'2)2]2+ complexes do not hinder catalysis. This material is based upon work supported as part of the Center for Molecular Electrocatalysis, an Energy Frontier Research Center funded by the US Department of Energy, Office of Science, Office of Basic Energy Sciences under FWP56073. Research by J.A.F., M.O., M-H.H., M.L.H., D.L.D., A.M.A., S.R., and R.M.B. was carried out in the Center for Molecular Electrocatalysis, an Energy Frontier Research Center funded by the U.S. Department of Energy, Office of Science. W.J.S. and S.L. were funded by the DOE Office of Science Early Career Research Program through the Office of Basic Energy Sciences. T.L. was supported by the US Department of Energy, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences & Biosciences. Pacific Northwest National Laboratory (PNNL) is a multiprogram national laboratory operated for DOE by Battelle. Computational resources were provided at the W. R. Wiley Environmental Molecular Sciences Laboratory (EMSL), a national scientific user facility sponsored by the Department of Energy's Office of Biological and Environmental Research located at Pacific Northwest National Laboratory; the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory; and the Jaguar supercomputer at Oak Ridge National Laboratory (INCITE 2008-2011 award supported by the Office of Science of the U.S. DOE under Contract No. DE-AC0500OR22725).

  13. Preparation for microgravity - The role of the Microgravity Material Science Laboratory

    NASA Technical Reports Server (NTRS)

    Johnston, J. Christopher; Rosenthal, Bruce N.; Meyer, Maryjo B.; Glasgow, Thomas K.

    1988-01-01

    Experiments at the NASA Lewis Research Center's Microgravity Material Science Laboratory using physical and mathematical models to delineate the effects of gravity on processes of scientific and commercial interest are discussed. Where possible, transparent model systems are used to visually track convection, settling, crystal growth, phase separation, agglomeration, vapor transport, diffusive flow, and polymer reactions. Materials studied include metals, alloys, salts, glasses, ceramics, and polymers. Specific technologies discussed include the General Purpose furnace used in the study of metals and crystal growth, the isothermal dendrite growth apparatus, the electromagnetic levitator/instrumented drop tube, the high temperature directional solidification furnace, the ceramics and polymer laboratories and the center's computing facilities.

  14. Science at the interstices: an evolution in the academy.

    PubMed

    Balser, Jeffrey R; Baruchin, Andrea

    2008-09-01

    Biomedical science is at an evolutionary turning point. Many of the rate-limiting steps to realizing the next generation of personalized, highly targeted diagnostics and therapeutics rest at the interstices between biomedical science and the classic, university-based disciplines, such as physics, mathematics, computational science, engineering, social sciences, business, and law. Institutes, centers, or other entities created to foster interdisciplinary science are rapidly forming to tackle these formidable challenges, but they are plagued with substantive barriers, born of traditions, processes, and culture, which impede scientific progress and endanger success. Without a more seamless interdisciplinary framework, academic health centers will struggle to move transformative advances in technology into the foundation of biomedical science, and the equally challenging advancement of models that effectively integrate new molecular diagnostics and therapies into the business and social fabric of our population will be similarly hampered. At the same time, excess attention on rankings tied to competition for National Institutes of Health and other federal funds adversely encourages academic medical centers (AMCs) and universities to hoard, rather than share, resources effectively and efficiently. To fully realize their discovery potential, AMCs must consider a substantive realignment relative to one another, as well as with their associated universities, as the academy looks toward innovative approaches to provide a more supportive foundation for the emergent biomedical research enterprise. The authors discuss potential models that could serve to lower barriers to interdisciplinary science, promoting a new synergy between AMCs and their parent universities.

  15. Production Management System for AMS Computing Centres

    NASA Astrophysics Data System (ADS)

    Choutko, V.; Demakov, O.; Egorov, A.; Eline, A.; Shan, B. S.; Shi, R.

    2017-10-01

    The Alpha Magnetic Spectrometer [1] (AMS) has collected over 95 billion cosmic ray events since it was installed on the International Space Station (ISS) on May 19, 2011. To cope with the enormous flux of events, AMS uses 12 computing centers in Europe, Asia, and North America, which have different hardware and software configurations. The centers participate in data reconstruction and Monte-Carlo (MC) simulation [2] (data and MC production) as well as in physics analysis. A data production management system has been developed to facilitate data and MC production tasks in the AMS computing centers, including job acquiring, submitting, monitoring, transferring, and accounting. It was designed to be modularized, lightweight, and easy to deploy. The system is based on a Deterministic Finite Automaton [3] model, and implemented in the scripting languages Python and Perl with the built-in sqlite3 database on Linux operating systems. Different batch management systems, file system storage, and transfer protocols are supported. The details of the integration with the Open Science Grid are presented as well.
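    The Deterministic Finite Automaton model treats every production task as being in exactly one state, with only whitelisted transitions allowed, so illegal state changes fail loudly. A minimal sketch with Python and the built-in sqlite3 module; the state names are hypothetical, since the record does not enumerate AMS's actual states.

        import sqlite3

        # Hypothetical job states and the legal transitions between them.
        TRANSITIONS = {
            ("acquired", "submitted"),
            ("submitted", "running"),
            ("running", "transferring"),
            ("transferring", "accounted"),
            ("running", "failed"),
            ("failed", "submitted"),   # resubmission
        }

        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE job (id TEXT PRIMARY KEY, state TEXT)")

        def advance(job_id, new_state):
            """Move a job to new_state only if the automaton allows it."""
            (state,) = conn.execute(
                "SELECT state FROM job WHERE id = ?", (job_id,)).fetchone()
            if (state, new_state) not in TRANSITIONS:
                raise ValueError(f"illegal transition {state} -> {new_state}")
            conn.execute(
                "UPDATE job SET state = ? WHERE id = ?", (new_state, job_id))
            conn.commit()

        conn.execute("INSERT INTO job VALUES ('run0001', 'acquired')")
        advance("run0001", "submitted")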

  16. Games and Simulations for Climate, Weather and Earth Science Education

    NASA Astrophysics Data System (ADS)

    Russell, R. M.; Clark, S.

    2015-12-01

    We will demonstrate several interactive, computer-based simulations, games, and other interactive multimedia. These resources were developed for weather, climate, atmospheric science, and related Earth system science education. The materials were created by the UCAR Center for Science Education. These materials have been disseminated via our web site (SciEd.ucar.edu), webinars, online courses, teacher workshops, and large touchscreen displays in weather and Sun-Earth connections exhibits in NCAR's Mesa Lab facility in Boulder, Colorado. Our group has also assembled a web-based list of similar resources, especially simulations and games, from other sources that touch upon weather, climate, and atmospheric science topics. We'll briefly demonstrate this directory.

  17. Computer Based Instruction in the U.S. Army’s Entry Level Enlisted Training.

    DTIC Science & Technology

    1985-03-13

    rosters with essential personal data, and graduation rosters with class standings and printed diplomas. The computer also managed the progress of the...discussion is presented in Chapter Three. Methods of Employment Course administration. In 1980 the US Army Research Center for Behavioral and Social Studies...contained in Appendix C. Data Presentation All responses from the questionnaires were coded for use by the Statistical Package for the Social Sciences

  18. Laboratory for Computer Science Progress Report 21, July 1983-June 1984.

    DTIC Science & Technology

    1984-06-01

    Systems 269 4. Distributed Consensus 270 5. Election of a Leader in a Distributed Ring of Processors 273 6. Distributed Network Algorithms 274 7. Diagnosis...multiprocessor systems. This facility, funded by the newly formed Strategic Computing Program of the Defense Advanced Research Projects Agency, will enable...Academic Staff P. Szolovits, Group Leader R. Patil Collaborating Investigators M. Criscitiello, M.D., Tufts-New England Medical Center Hospital R

  19. The DEVELOP Program as a Unique Applied Science Internship

    NASA Astrophysics Data System (ADS)

    Skiles, J. W.; Schmidt, C. L.; Ruiz, M. L.; Cawthorn, J.

    2004-12-01

    The NASA mission includes "Inspiring the next generation of explorers" and "Understanding and protecting our home planet". DEVELOP students conduct research projects in Earth Systems Science, gaining valuable training and work experience that support this mission. This presentation will describe the DEVELOP Program, a NASA human capital development initiative that is student-run and student-led, with NASA scientists serving as mentors. DEVELOP began in 1998 at NASA's Langley Research Center in Virginia and expanded to NASA's Stennis Space Center in Mississippi and Marshall Space Flight Center in Alabama in 2002. NASA's Ames Research Center in California began DEVELOP activity in 2003. DEVELOP is a year-round activity. High school through graduate school students participate in DEVELOP, with backgrounds encompassing a wide variety of academic majors such as engineering, biology, physics, mathematics, computer science, remote sensing, geographic information systems, business, and geography. DEVELOP projects are initiated when county, state, or tribal governments submit a proposal requesting that students work on local projects. When a project is selected, science mentors guide students in the application of NASA applied science and technology to enhance decision support tools for customers. Partnerships are established with customers, professional organizations, and state and federal agencies in order to leverage the resources needed to complete research projects. Student teams are assigned a project and are responsible for creating an inclusive project plan covering the design and approach of the study, the timeline, and the deliverables for the customer. Project results can consist of student papers, both team and individually written; face-to-face meetings and seminars with customers; presentations at national meetings in the form of posters and oral papers; displays at the Western and Southern Governors' Associations; and visualizations produced by the students. Projects have included Homeland Security in Virginia, Energy Management in New Mexico, Water Management in Mississippi, Air Quality Management in Alabama, Invasive Species mapping in Nevada, Public Health risk assessment in California, Disaster Management in Oklahoma, Agricultural Efficiency in South Dakota, Coastal Management in Louisiana, and Carbon Management in Oregon. DEVELOP students gain experience in applied science, computer technology, and project management. Several DEVELOP projects will be demonstrated and discussed during this presentation. DEVELOP is sponsored by the Applications Division of NASA's Science Mission Directorate.

  20. Next Generation Workload Management System For Big Data on Heterogeneous Distributed Computing

    NASA Astrophysics Data System (ADS)

    Klimentov, A.; Buncic, P.; De, K.; Jha, S.; Maeno, T.; Mount, R.; Nilsson, P.; Oleynik, D.; Panitkin, S.; Petrosyan, A.; Porter, R. J.; Read, K. F.; Vaniachine, A.; Wells, J. C.; Wenaus, T.

    2015-05-01

    The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe, and were recently credited with the discovery of a Higgs boson. ATLAS and ALICE are the largest collaborations ever assembled in the sciences and are at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, both experiments rely on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses the PanDA (Production and Data Analysis) Workload Management System (WMS) for managing the workflow for all data processing on hundreds of data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. The scale is demonstrated by the following numbers: PanDA manages O(10^2) sites, O(10^5) cores, O(10^8) jobs per year, and O(10^3) users, and the ATLAS data volume is O(10^17) bytes. In 2013 we started an ambitious program to expand PanDA to all available computing resources, including opportunistic use of commercial and academic clouds and Leadership Computing Facilities (LCF). The project, titled "Next Generation Workload Management and Analysis System for Big Data" (BigPanDA), is funded by DOE ASCR and HEP. Extending PanDA to clouds and LCF presents new challenges in managing heterogeneity and supporting workflow. The BigPanDA project is underway to set up and tailor PanDA at the Oak Ridge Leadership Computing Facility (OLCF) and at the National Research Center "Kurchatov Institute" together with ALICE distributed computing and ORNL computing professionals. Our approach to integration of HPC platforms at the OLCF and elsewhere is to reuse, as much as possible, existing components of the PanDA system. We will present our current accomplishments with running the PanDA WMS at OLCF and other supercomputers and demonstrate our ability to use PanDA as a portal independent of the computing facilities infrastructure for High Energy and Nuclear Physics as well as other data-intensive science applications.
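
    As a toy illustration of the brokering problem such a WMS solves (this is not PanDA code), the sketch below assigns jobs to heterogeneous sites by free capacity so that users see one logical facility; the site names and core counts are invented:

        import heapq

        def broker(jobs, sites):
            """jobs: list of (job_id, cores_needed); sites: dict of site -> free cores."""
            # Max-heap keyed on free cores, so each job lands on the least-loaded site.
            heap = [(-free, site) for site, free in sites.items()]
            heapq.heapify(heap)
            assignment = {}
            for job_id, need in jobs:
                neg_free, site = heapq.heappop(heap)
                free = -neg_free
                if need > free:
                    # Even the emptiest site cannot take the job; leave it unassigned.
                    heapq.heappush(heap, (neg_free, site))
                    assignment[job_id] = None
                    continue
                assignment[job_id] = site
                heapq.heappush(heap, (-(free - need), site))
            return assignment

        print(broker([("job1", 8), ("job2", 64), ("job3", 512)],
                     {"grid_site": 96, "cloud_site": 64, "hpc_site": 1024}))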

  1. Community Coordinated Modeling Center: A Powerful Resource in Space Science and Space Weather Education

    NASA Astrophysics Data System (ADS)

    Chulaki, A.; Kuznetsova, M. M.; Rastaetter, L.; MacNeice, P. J.; Shim, J. S.; Pulkkinen, A. A.; Taktakishvili, A.; Mays, M. L.; Mendoza, A. M. M.; Zheng, Y.; Mullinix, R.; Collado-Vega, Y. M.; Maddox, M. M.; Pembroke, A. D.; Wiegand, C.

    2015-12-01

    Community Coordinated Modeling Center (CCMC) is a NASA affiliated interagency partnership with the primary goal of aiding the transition of modern space science models into space weather forecasting while supporting space science research. Additionally, over the past ten years it has established itself as a global space science education resource supporting undergraduate and graduate education and research, and spreading space weather awareness worldwide. A unique combination of assets, capabilities and close ties to the scientific and educational communities enable this small group to serve as a hub for raising generations of young space scientists and engineers. CCMC resources are publicly available online, providing unprecedented global access to the largest collection of modern space science models (developed by the international research community). CCMC has revolutionized the way simulations are utilized in classrooms settings, student projects, and scientific labs and serves hundreds of educators, students and researchers every year. Another major CCMC asset is an expert space weather prototyping team primarily serving NASA's interplanetary space weather needs. Capitalizing on its unrivaled capabilities and experiences, the team provides in-depth space weather training to students and professionals worldwide, and offers an amazing opportunity for undergraduates to engage in real-time space weather monitoring, analysis, forecasting and research. In-house development of state-of-the-art space weather tools and applications provides exciting opportunities to students majoring in computer science and computer engineering fields to intern with the software engineers at the CCMC while also learning about the space weather from the NASA scientists.

  2. Center for Interface Science and Catalysis | Theory

    Science.gov Websites

    & Stanford School of Engineering ... to overcome challenges associated with the atomic-scale design of catalysts for chemical ... using computational methods we are developing a quantitative description of chemical processes at the solid-gas and ...

  3. Mark F. Davis | NREL

    Science.gov Websites

    303-384-6140 | ORCID: http://orcid.org/0000-0003-4541-9852. Research Interests: Dr. Mark Davis ... Over the years, he has served as the Platform Program Manager for Thermochemical ... and has directed research ... Science Center, including high-throughput recalcitrance assays, omics research, computational modeling ...

  4. Science and Innovation at Los Alamos

    Science.gov Websites

    Los Alamos National Laboratory ... Innovation in New Mexico ... Los Alamos Collaboration for Explosives Detection (LACED), SensorNexus, Exascale Computing Project (ECP), User Facilities, Center for Integrated Nanotechnologies (CINT), Los Alamos Neutron ...

  5. Improving Family Forest Knowledge Transfer through Social Network Analysis

    ERIC Educational Resources Information Center

    Gorczyca, Erika L.; Lyons, Patrick W.; Leahy, Jessica E.; Johnson, Teresa R.; Straub, Crista L.

    2012-01-01

    To better engage Maine's family forest landowners, our study used social network analysis: a computational social science method for identifying stakeholders, evaluating models of engagement, and targeting areas for enhanced partnerships. Interviews with researchers associated with a research center were conducted to identify how social network…
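
    A hedged sketch of the method named above: representing stakeholders as a graph and ranking them by centrality, here with the networkx library. The nodes and ties are invented for illustration, not data from the study:

        import networkx as nx

        # Hypothetical ties among landowners, a research center, and outreach groups.
        G = nx.Graph()
        G.add_edges_from([
            ("landowner_A", "extension_office"),
            ("landowner_B", "extension_office"),
            ("landowner_B", "logging_contractor"),
            ("extension_office", "research_center"),
            ("logging_contractor", "research_center"),
        ])

        # Degree centrality highlights well-connected brokers of forestry knowledge.
        for node, score in sorted(nx.degree_centrality(G).items(), key=lambda kv: -kv[1]):
            print(f"{node:20s} {score:.2f}")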

  6. Press Releases | Argonne National Laboratory

    Science.gov Websites

    ... Electrochemical Energy Science; Center for Transportation Research; Chain Reaction Innovations; Computation ... renewable energy such as wind and solar power (April 25, 2018). John Carlisle, director of Chain Reaction ... across nation to grow startups ... Argonne announces second cohort of Chain Reaction Innovations (April 18 ...)

  7. Science Education: An Experiment in Facilitating the Learning of Neurophysiology.

    ERIC Educational Resources Information Center

    Levitan, Herbert

    1981-01-01

    Summarizes the experiences of a zoology professor attempting to construct a student-centered course in neurophysiology. Various aspects of the organization and conduct of the course are described, including the beginning experience, topics of interest, lectures, laboratory, computer simulation, examinations, and student lectures. Evaluation of the…

  8. 77 FR 43286 - Agency Forms Undergoing Paperwork Reduction Act Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-24

    ... information are broken into nine separate questions (data fields) for computer entry. General information... Questions. Kimberly S. Lane, Deputy Director, Office of Science Integrity, Office of the Associate Director... DEPARTMENT OF HEALTH AND HUMAN SERVICES Centers for Disease Control and Prevention [30-Day-12-0040...

  9. Deployment and Operational Experiences with CernVM-FS at the GridKa Tier-1 Center

    NASA Astrophysics Data System (ADS)

    Alef, Manfred; Jäger, Axel; Petzold and, Andreas; Verstege, Bernhard

    2012-12-01

    In 2012 the GridKa Tier-1 computing center hosts 130 kHS06 of computing resources, 14 PB of disk, and 17 PB of tape space. These resources are shared between the four LHC VOs and a number of national and international VOs from high energy physics and other sciences. CernVM-FS has been deployed at GridKa to supplement the existing NFS-based system for accessing VO software on the worker nodes. It provides a solution tailored to the requirements of the LHC VOs. We will focus on the first operational experiences and the monitoring of CernVM-FS on the worker nodes and the Squid caches.

  10. A visiting scientist program in atmospheric sciences for the Goddard Space Flight Center

    NASA Technical Reports Server (NTRS)

    Davis, M. H.

    1989-01-01

    A visiting scientist program was conducted in the atmospheric sciences and related areas at the Goddard Laboratory for Atmospheres. Research was performed in mathematical analysis as applied to computer modeling of the atmospheres; development of atmospheric modeling programs; analysis of remotely sensed atmospheric, surface, and oceanic data and its incorporation into atmospheric models; development of advanced remote sensing instrumentation; and related research areas. The specific research efforts are detailed by tasks.

  11. Mission leverage education: NSU/NASA innovative undergraduate model

    NASA Technical Reports Server (NTRS)

    Chaudhury, S. Raj; Shaw, Paula R. D.

    2005-01-01

    The BEST Lab (Center for Excellence in Science Education), the Center for Materials Research (CMR), and the Chemistry, Mathematics, Physics, and Computer Science (CS) Departments at Norfolk State University (NSU) joined forces to implement MiLEN(2)IUM, an innovative approach to integrate current and emerging research into the undergraduate curricula and train students in NASA-related fields. An Earth Observing System (EOS) mission was simulated in which students are educated and trained in many aspects of remote sensing: detector physics and spectroscopy; signal processing; data conditioning, analysis, and visualization; and atmospheric science. This model and its continued impact are expected to significantly enhance the quality of the Mathematics, Science, Engineering and Technology (MSET or SMET) educational experience and to inspire students from historically underrepresented groups to pursue careers in NASA-related fields. MiLEN(2)IUM will be applicable to other higher education institutions that are willing to make the commitment to this endeavor in terms of faculty interest and space.

  12. National Observation Services at OSUG and construction of a Data Center and of a mutualized information system

    NASA Astrophysics Data System (ADS)

    Meunier, N.

    2016-12-01

    OSUG (Observatoire des Sciences de l'Univers de Grenoble) is strongly involved in more than 20 national observation services (hereafter SNO) covering the different INSU (Institut National des Sciences de l'Univers) sections, and is the PI for ten of them. This strong involvement led us to implement a data center (OSUG-DC) in order to provide the SNO and many other projects with an infrastructure and common tools (software development, data monitoring, ...): the objective is to allow them to make their data available to the community under the best conditions. The OSUG-DC was recognized as a Regional Expertise Center for the astronomy-astrophysics component in 2003 (3 SNO are concerned). This construction is also part of a larger reflection on pooling certain information-system services at OSUG and at University Grenoble Alpes, some of which, such as a regional high-performance computing center, have already been in place for some time. This paper presents the management organisation of these projects, their strong points, and open issues.

  13. The 3d International Workshop on Computational Electronics

    NASA Astrophysics Data System (ADS)

    Goodnick, Stephen M.

    1994-09-01

    The Third International Workshop on Computational Electronics (IWCE) was held at the Benson Hotel in downtown Portland, Oregon, on May 18, 19, and 20, 1994. The workshop was devoted to a broad range of topics in computational electronics related to the simulation of electronic transport in semiconductors and semiconductor devices, particularly those which use large computational resources. The workshop was supported by the National Science Foundation (NSF), the Office of Naval Research, and the Army Research Office, as well as local support from the Oregon Joint Graduate Schools of Engineering and the Oregon Center for Advanced Technology Education. There were over 100 participants in the Portland workshop, of whom more than one quarter represented research groups outside of the United States, from Austria, Canada, France, Germany, Italy, Japan, Switzerland, and the United Kingdom. A total of 81 papers were presented at the workshop: 9 invited talks, 26 oral presentations, and 46 poster presentations. The emphasis of the contributions reflected the interdisciplinary nature of computational electronics, with researchers from the Chemistry, Computer Science, Mathematics, Engineering, and Physics communities participating in the workshop.

  14. Electronic Structure Theory | Materials Science | NREL

    Science.gov Websites

    ... design and discover materials for energy applications. This includes detailed studies of the physical ... computing. Key Research Areas: Materials by Design. NREL leads the U.S. Department of Energy's Center for Next Generation of Materials by Design, which incorporates metastability and synthesizability.

  15. NASA CORE (Central Operation of Resources for Educators) Educational Materials Catalog

    NASA Technical Reports Server (NTRS)

    1998-01-01

    This educational materials catalog presents NASA CORE (Central Operation of Resources for Educators). The topics include: 1) Videocassettes (Aeronautics, Earth Resources, Weather, Space Exploration/Satellites, Life Sciences, Careers); 2) Slide Programs; 3) Computer Materials; 4) NASA Memorabilia/Miscellaneous; 5) NASA Educator Resource Centers; 6) and NASA Resources.

  16. Data collection and evaluation for experimental computer science research

    NASA Technical Reports Server (NTRS)

    Zelkowitz, Marvin V.

    1983-01-01

    The Software Engineering Laboratory has been monitoring software development at NASA Goddard Space Flight Center since 1976. The data collection activities of the Laboratory and some of the difficulties of obtaining reliable data are described. In addition, the application of this data collection process to a current prototyping experiment is reviewed.

  17. Grand challenge problems in environmental modeling and remediation: groundwater contaminant transport

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Todd Arbogast; Steve Bryant; Clint N. Dawson

    1998-08-31

    This report describes briefly the work of the Center for Subsurface Modeling (CSM) of the University of Texas at Austin (and Rice University prior to September 1995) on the Partnership in Computational Sciences Consortium (PICS) project entitled Grand Challenge Problems in Environmental Modeling and Remediation: Groundwater Contaminant Transport.

  18. Teaching Human-Centered Security Using Nontraditional Techniques

    ERIC Educational Resources Information Center

    Renaud, Karen; Cutts, Quintin

    2013-01-01

    Computing science students amass years of programming experience and a wealth of factual knowledge in their undergraduate courses. Based on our combined years of experience, however, one of our students' abiding shortcomings is that they think there is only "one correct answer" to issues in most courses: an "idealistic"…

  19. Markush enumeration to manage, mesh and manipulate substances of unknown or variable composition (ACS Fall meeting 5 of 12)

    EPA Science Inventory

    The National Center for Computational Toxicology (NCCT) at the US Environmental Protection Agency has measured, assembled and delivered an enormous quantity and diversity of data for the environmental sciences. This includes high-throughput in vitro screening data, legacy in vivo...

  20. Using Flip Camcorders for Active Classroom Metacognitive Reflection

    ERIC Educational Resources Information Center

    Hargis, Jace; Marotta, Sebastian M.

    2011-01-01

    A Center for Teaching and Learning provided Flip camcorders to a group of 10 new faculty members, who were asked to use this teaching tool in their classroom instruction. The classes included mathematics, political science, computer engineering, psychology, business, music and dance. The qualitative results indicate that all faculty members and…

  1. Kidspiration[R] for Inquiry-Centered Activities

    ERIC Educational Resources Information Center

    Shaw, Edward L., Jr.; Baggett, Paige V.; Salyer, Barbara

    2004-01-01

    Computer technology can be integrated into science inquiry activities to increase student motivation and enhance and expand scientific thinking. Fifth-grade students used the visual thinking tools in the Kidspiration[R] software program to generate and represent a web of hypotheses around the question, "What affects the distance a marble rolls?"…

  2. A Graphical Database Interface for Casual, Naive Users.

    ERIC Educational Resources Information Center

    Burgess, Clifford; Swigger, Kathleen

    1986-01-01

    Describes the design of a database interface for infrequent users of computers which consists of a graphical display of a model of a database and a natural language query language. This interface was designed for and tested with physicians at the University of Texas Health Science Center in Dallas. (LRW)

  3. 77 FR 29350 - Agency Forms Undergoing Paperwork Reduction Act Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-17

    ... using audio computer assisted self-interview (ACASI). The ACASI interview includes questions about... DEPARTMENT OF HEALTH AND HUMAN SERVICES Centers for Disease Control and Prevention [30Day-12-12EL... Exit Interview 10 1 30/60 Kimberly S. Lane, Deputy Director, Office of Science Integrity, Office of the...

  4. Public Access to Environmental Chemistry Data via the CompTox Chemistry Dashboard (ACS Fall Meeting 6 of 12)

    EPA Science Inventory

    The National Center for Computational Toxicology (NCCT) has assembled and delivered an enormous quantity and diversity of data for the environmental sciences through the CompTox Chemistry Dashboard. These data include high-throughput in vitro screening data, in vivo and functiona...

  5. First-principles data-driven discovery of transition metal oxides for artificial photosynthesis

    NASA Astrophysics Data System (ADS)

    Yan, Qimin

    We develop a first-principles data-driven approach for rapid identification of transition metal oxide (TMO) light absorbers and photocatalysts for artificial photosynthesis using the Materials Project. Initially focusing on Cr-, V-, and Mn-based ternary TMOs in the database, we design a broadly applicable multiple-layer screening workflow automating density functional theory (DFT) and hybrid functional calculations of bulk and surface electronic and magnetic structures. We further assess the electrochemical stability of TMOs in aqueous environments from computed Pourbaix diagrams. Several promising earth-abundant low band-gap TMO compounds with desirable band edge energies and electrochemical stability are identified by our computational efforts and then synergistically evaluated using high-throughput synthesis and photoelectrochemical screening techniques by our experimental collaborators at Caltech. Our joint theory-experiment effort has successfully identified new earth-abundant copper and manganese vanadate complex oxides that meet highly demanding requirements for photoanodes, substantially expanding the known space of such materials. By integrating theory and experiment, we validate our approach and develop important new insights into structure-property relationships for TMOs for oxygen evolution photocatalysts, paving the way for use of first-principles data-driven techniques in future applications. This work is supported by the Materials Project Predictive Modeling Center and the Joint Center for Artificial Photosynthesis through the U.S. Department of Energy, Office of Basic Energy Sciences, Materials Sciences and Engineering Division, under Contract No. DE-AC02-05CH11231. Computational resources were also provided by the Department of Energy through the National Energy Research Scientific Computing Center (NERSC).
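
    A minimal sketch of one screening step in the spirit of this workflow, using pymatgen's legacy MPRester client for the Materials Project (an API key is required; the element set, band-gap window, and stability cutoff below are illustrative assumptions, not the study's actual criteria):

        from pymatgen.ext.matproj import MPRester

        def screen_ternary_vanadates(api_key, gap_window=(1.2, 2.2)):
            """Return ternary V-O compounds whose computed band gap falls in gap_window."""
            with MPRester(api_key) as mpr:
                entries = mpr.query(
                    criteria={
                        "elements": {"$all": ["V", "O"]},
                        "nelements": 3,
                        "band_gap": {"$gte": gap_window[0], "$lte": gap_window[1]},
                    },
                    properties=["material_id", "pretty_formula", "band_gap", "e_above_hull"],
                )
            # Keep thermodynamically plausible candidates (near the convex hull).
            return [e for e in entries if e["e_above_hull"] < 0.05]

        # candidates = screen_ternary_vanadates("YOUR_API_KEY")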

  6. A policy for science.

    PubMed

    Lauer, Michael S

    2012-06-12

    Policy and science often interact. Typically, we think of policymakers looking to scientists for advice on issues informed by science. We may appreciate less the opposite direction: where people outside science inform policies that affect the conduct of science. In clinical medicine, we are forced to make decisions about practices for which there is insufficient or inadequate evidence to know whether they improve clinical outcomes, yet the health care system may not be structured to rapidly generate the needed evidence. For example, when the Centers for Medicare and Medicaid Services noted insufficient evidence to support routine use of computed tomography angiography and called for a national commitment to the completion of randomized trials, their call ran into substantial opposition. I use the computed tomography angiography story to illustrate how we might consider a "policy for science" in which stakeholders would band together to identify evidence gaps and use their influence to promote the efficient design, implementation, and completion of high-quality randomized trials. Such a policy for science could create a culture that incentivizes and invigorates the rapid generation of evidence, ultimately engaging all clinicians, all patients, and indeed all stakeholders in the scientific enterprise.

  7. NASA Tech Briefs, April 1995. Volume 19, No. 4

    NASA Technical Reports Server (NTRS)

    1995-01-01

    This issue of NASA Tech Briefs has a special focus section on video and imaging, a feature on the NASA invention of the year, and a resource report on the Dryden Flight Research Center. The issue also contains articles on electronic components and circuits, electronic systems, physical sciences, materials, computer programs, mechanics, machinery, manufacturing/fabrication, mathematics and information sciences, and life sciences. In addition, this issue contains a supplement entitled "Laser Tech Briefs," which features an article on the National Ignition Facility and other articles on the use of lasers.

  8. The role of broken symmetry in solvation of a spherical cavity in classical and quantum water models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Remsing, Richard C.; Baer, Marcel D.; Schenter, Gregory K.

    2014-08-21

    Insertion of a hard sphere cavity in liquid water breaks translational symmetry and generates an electrostatic potential difference between the region near the cavity and the bulk. Here, we clarify the physical interpretation of this potential and its calculation. We also show that the electrostatic potential in the center of small, medium, and large cavities depends very sensitively on the form of the assumed molecular interactions for different classical simple point-charge models and quantum mechanical DFT-based interaction potentials, as reflected in their description of donor and acceptor hydrogen bonds near the cavity. These differences can significantly affect the magnitude of the scalar electrostatic potential. We argue that the result of these studies will have direct consequences toward our understanding of the thermodynamics of ion solvation through the cavity charging process. JDW and RCR are supported by the National Science Foundation (Grants CHE0848574 and CHE1300993). CJM and GKS are supported by the U.S. Department of Energy's (DOE) Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences and Biosciences. Pacific Northwest National Laboratory (PNNL) is operated for the Department of Energy by Battelle. MDB is grateful for the support of the Linus Pauling Distinguished Postdoctoral Fellowship Program at PNNL. We acknowledge illuminating discussions and sharing of ideas and preprints with Dr. Shawn M. Kathmann and Prof. Tom Beck. The DFT simulations used resources of the National Energy Research Scientific Computing Center, which is supported by the Office of Science of the U.S. Department of Energy under Contract No. DE-AC02-05CH11231. Additional computing resources were generously allocated by PNNL's Institutional Computing program.

  9. Workforce Retention Study in Support of the U.S. Army Aberdeen Test Center Human Capital Management Strategy

    DTIC Science & Technology

    2016-09-01

    Sciences Group, 6%; 1550s Computer Scientists Group, 5%; other 1500s (ORSA, Mathematics, & Statistics) Group, 3%; 1600s Equipment & Facilities Group, 4...Employee removal based on misconduct, delinquency, suitability, unsatisfactory performance, or failure to qualify for conversion to a career appointment...average of 10.4% in many areas, but over double the average for the 1550s (Computer Scientists) and other 1500s (ORSA, Mathematics, and Statistics). Also

  10. View From Camera Not Used During Curiosity's First Six Months on Mars

    NASA Image and Video Library

    2017-12-08

    This view of Curiosity's left-front and left-center wheels and of marks made by wheels on the ground in the "Yellowknife Bay" area comes from one of six cameras used on Mars for the first time more than six months after the rover landed. The left Navigation Camera (Navcam) linked to Curiosity's B-side computer took this image during the 223rd Martian day, or sol, of Curiosity's work on Mars (March 22, 2013). The wheels are 20 inches (50 centimeters) in diameter. Curiosity carries a pair of main computers, redundant to each other, in order to have a backup available if one fails. Each of the computers, A-side and B-side, also has other redundant subsystems linked to just that computer. Curiosity operated on its A-side from before the August 2012 landing until Feb. 28, when engineers commanded a switch to the B-side in response to a memory glitch on the A-side. One set of activities after switching to the B-side computer has been to check the six engineering cameras that are hard-linked to that computer. The rover's science instruments, including five science cameras, can each be operated by either the A-side or B-side computer, whichever is active. However, each of Curiosity's 12 engineering cameras is linked to just one of the computers. The engineering cameras are the Navigation Camera (Navcam), the Front Hazard-Avoidance Camera (Front Hazcam) and Rear Hazard-Avoidance Camera (Rear Hazcam). Each of those three named cameras has four cameras as part of it: two stereo pairs of cameras, with one pair linked to each computer. Only the pairs linked to the active computer can be used, and the A-side computer was active from before landing, in August, until Feb. 28. All six of the B-side engineering cameras have been used during March 2013 and checked out OK. Image Credit: NASA/JPL-Caltech

  11. KSC-03pd0810

    NASA Image and Video Library

    2003-03-21

    KENNEDY SPACE CENTER, FLA. - Sponsor representatives of the 2003 Southeastern Regional FIRST Robotic Competition take a moment to compare notes between events. From left are Wayne Weinberg, director of development for the University of Central Florida College of Engineering and Computer Science; Erik Halleus, chair of the FIRST Regional Advisory Committee and a vice president at Siemens Enterprise Networks; and Roy D. Bridges, Jr., director of the NASA/Kennedy Space Center. The competition is being held at the University of Central Florida (UCF) in Orlando, March 20-23. Forty student teams from around the country are participating in the event that pits team-built gladiator robots against each other in an athletic-style competition. The teams are sponsored by NASA/Kennedy Space Center, The Boeing Company/Brevard Community College, and Lockheed Martin Space Operations/Mission Systems for the nonprofit organization For Inspiration and Recognition of Science and Technology, known as FIRST. The vision of FIRST is to inspire in the youth of our nation an appreciation of science and technology and an understanding that mastering these disciplines can enrich the lives of all mankind.

  12. United States Air Force Summer Research Program -- 1993. Volume 6. Arnold Engineering Development Center, Frank J. Seiler Research Laboratory, Wilford Hall Medical Center

    DTIC Science & Technology

    1993-12-01

    where negative charge state. The local symmetries of the Ge(I) and Ge(II) centers are C1 and C2, respectively. (See also Fig. 1.) q = -1 Ge(I) Ge(II) s p...Raymond Field: Dept. of Computer Science...PhD Laboratory:...3200 Willow Creek Road, Embry-Riddle Aeronautical Univ Vol-Page No: 0-0...Field: Electrical Engineering Assistant Professor, PhD Laboratory: PL/WS 2390 S. York Street, University of Denver Vol-Page No: 3-35 Denver, CO 80209-0177

  13. Visions of the Future - the Changing Role of Actors in Data-Intensive Science

    NASA Astrophysics Data System (ADS)

    Schäfer, L.; Klump, J. F.

    2013-12-01

    Around the world, scientific disciplines are increasingly facing the challenge of a burgeoning volume of research data. This data avalanche consists of a stream of information generated by sensors and scientific instruments, digital recordings, and social-science surveys, or drawn from the World Wide Web. All areas of the scientific enterprise are affected by this rapid growth in data, from the logging of digs in archaeology, through telescope observations of distant galaxies in astrophysics, to data from polls and surveys in the social sciences. The challenge for science is not only to process the data through analysis, reduction, and visualization, but also to set up infrastructures for provisioning and storing the data. The rise of new technologies and developments also poses new challenges for the actors in the area of research data infrastructures. Libraries, as one of the actors, enable access to digital media and support the publication of research data and its long-term archiving. Digital media and research data, however, introduce new aspects into the libraries' range of activities. How are we to imagine the library of the future? The library as an interface to the computing centers? Will library and computing center fuse into a new service unit? What role will scientific publishers play in the future? Currently, the traditional forms of publication, articles for conferences and journals, still carry greater weight. But will this still be the case in the future? New forms of publication are already making their presence felt. The tasks of the computing centers may also change. Yesterday their remit was the provisioning of fast hardware, whereas now everything revolves around data and services. Finally, how about the researchers themselves? Not so long ago, geoscience was not necessarily seen as linked to computer science. Nowadays, modern geoscience relies heavily on IT and its techniques. How far, then, will the profile of the modern geoscientist change? This gives rise to the question of what tools are required to locate and pursue the correct course in a networked world. One tool from the area of innovation management is the scenario technique. This poster will outline visions of the future as possible developments of the scientific world in 2020 (or later). The scenarios presented will show possible developments, both positive and negative. It is then up to the actors themselves to define their own position in this context, to rethink it, and to consider steps that can achieve a positive development for the future.

  14. Development of a High Resolution Weather Forecast Model for Mesoamerica Using the NASA Nebula Cloud Computing Environment

    NASA Technical Reports Server (NTRS)

    Molthan, Andrew L.; Case, Jonathan L.; Venner, Jason; Moreno-Madrinan, Max. J.; Delgado, Francisco

    2012-01-01

    Over the past two years, scientists in the Earth Science Office at NASA's Marshall Space Flight Center (MSFC) have explored opportunities to apply cloud computing concepts to support near real-time weather forecast modeling via the Weather Research and Forecasting (WRF) model. Collaborators at NASA's Short-term Prediction Research and Transition (SPoRT) Center and the SERVIR project at Marshall Space Flight Center have established a framework that provides high resolution, daily weather forecasts over Mesoamerica through use of the NASA Nebula Cloud Computing Platform at Ames Research Center. Supported by experts at Ames, staff at SPoRT and SERVIR have established daily forecasts complete with web graphics and a user interface that allows SERVIR partners access to high resolution depictions of weather in the next 48 hours, useful for monitoring and mitigating meteorological hazards such as thunderstorms, heavy precipitation, and tropical weather that can lead to other disasters such as flooding and landslides. This presentation will describe the framework for establishing and providing WRF forecasts, example applications of output provided via the SERVIR web portal, and early results of forecast model verification against available surface- and satellite-based observations.

  15. Development of a High Resolution Weather Forecast Model for Mesoamerica Using the NASA Nebula Cloud Computing Environment

    NASA Astrophysics Data System (ADS)

    Molthan, A.; Case, J.; Venner, J.; Moreno-Madriñán, M. J.; Delgado, F.

    2012-12-01

    Over the past two years, scientists in the Earth Science Office at NASA's Marshall Space Flight Center (MSFC) have explored opportunities to apply cloud computing concepts to support near real-time weather forecast modeling via the Weather Research and Forecasting (WRF) model. Collaborators at NASA's Short-term Prediction Research and Transition (SPoRT) Center and the SERVIR project at Marshall Space Flight Center have established a framework that provides high resolution, daily weather forecasts over Mesoamerica through use of the NASA Nebula Cloud Computing Platform at Ames Research Center. Supported by experts at Ames, staff at SPoRT and SERVIR have established daily forecasts complete with web graphics and a user interface that allows SERVIR partners access to high resolution depictions of weather in the next 48 hours, useful for monitoring and mitigating meteorological hazards such as thunderstorms, heavy precipitation, and tropical weather that can lead to other disasters such as flooding and landslides. This presentation will describe the framework for establishing and providing WRF forecasts, example applications of output provided via the SERVIR web portal, and early results of forecast model verification against available surface- and satellite-based observations.

  16. BCTC for Windows: Abstract of Issue 9903W

    NASA Astrophysics Data System (ADS)

    Whisnant, David M.; McCormick, James A.

    1999-05-01

    BCTC for Windows was originally published by JCE Software in 1992 (1) in Series B for PC-compatible (MS-DOS) computers. JCE Software is now re-releasing BCTC for Windows as issue 9903W to make it more accessible to Windows users, especially those running Windows 95 and Windows 98, while we continue to phase out Series B (DOS) issues. Aside from a new Windows-compatible installation program, BCTC is unchanged. BCTC is an environmental simulation modeled after the dioxin controversy (2). In the simulation, students are involved in the investigation of a suspected carcinogen called BCTC, which has been found in a river below a chemical plant and above the water supply of a nearby city. The students have the options of taking water samples, analyzing the water (for BCTC, oxygen, metals, and pesticides), determining LD50s in an animal lab, visiting a library, making economic analyses, and conferring with colleagues, all using the computer. In the Classroom: BCTC gives students experience with science in the context of a larger social and political problem. It can serve as the basis for a scientific report, class discussion, or a role-playing exercise (3). Because it requires no previous laboratory experience, this simulation can be used by students in middle and high school science classes, or in college courses for non-science majors. It also has been used in introductory chemistry courses for science majors. One of the intentions of BCTC is to involve students in an exercise (2) that closely approximates what scientists do. The realistic pictures, many of them captured with a video camera, create an atmosphere that furthers this goal. BCTC also reflects the comments of teachers who have used the program (4) and accounts of dioxin research (5). [Figure: screen from BCTC showing the location of the effluent entry into the river, the city, and the city water supply.]

    Acknowledgments: Support for this project was provided by NSF Grant USE-9151873 and by a BellSouth Foundation Grant. Literature Cited: 1. Whisnant, D. M.; McCormick, J. A. BCTC for Windows; J. Chem. Educ. Software 1992, 5B2. 2. Whisnant, D. M. J. Chem. Educ. 1984, 61, 627-629. 3. Whisnant, D. M. J. Chem. Educ. 1992, 69, 42. 4. Camille and Henry Dreyfus Institute on the Chemistry of Water, 1990; Institute for Chemical Education Summer Workshops, University of Wisconsin-Madison, 1991. 5. Roberts, L. Science 1991, 251, 624-626; ibid. 254, 377. Keywords: Computer Room; Simulation; High School; General; General Science; Environmental Chemistry; Chemistry and Society; Water Chemistry.

  17. Science alliance: A vital ORNL-UT partnership

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Richmond, C.R.; Riedinger, L.; Garritano, T.

    1991-01-01

    Partnerships between Department of Energy national laboratories and universities have long been keys to advancing scientific research and education in the United States. Perhaps the most enduring and closely knit of these relationships is the one between Oak Ridge National Laboratory and the University of Tennessee at Knoxville. Since its birth in the 1940s, ORNL has had a very special relationship with UT, and today the two institutions have closer ties than virtually any other university and national laboratory. Seven years ago, ORNL and UT began a new era of cooperation by creating the Science Alliance, a Center of Excellence at UT sponsored by the Tennessee Higher Education Commission. As the oldest and largest of these centers, the Science Alliance is the primary vehicle through which Tennessee promotes research and educational collaboration between UT and ORNL. By letting the two institutions pool their intellectual and financial resources, the alliance creates a more fertile scientific environment than either could achieve on its own. Part of the UT College of Liberal Arts, the Science Alliance is composed of four divisions (Biological Sciences, Chemical Sciences, Physical Sciences, and Mathematics and Computer Science) that team 100 of the university's top faculty with their outstanding colleagues from ORNL.

  18. NSI customer service representatives and user support office: NASA Science Internet

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The NASA Science Internet (NSI) was established in 1987 to provide NASA's Offices of Space Science and Applications (OSSA) missions with transparent wide-area data connectivity to NASA's researchers, computational resources, and databases. The NSI Office at NASA/Ames Research Center has the lead responsibility for implementing a total, open networking program to serve the OSSA community. NSI is a full-service communications provider whose services include science network planning, network engineering, applications development, network operations, and network information center/user support services. NSI's mission is to provide reliable high-speed communications to the NASA science community. To this end, the NSI Office manages and operates the NASA Science Internet, a multiprotocol network currently supporting both DECnet and TCP/IP protocols. NSI utilizes state-of-the-art network technology to meet its customers' requirements. The NASA Science Internet interconnects with other national networks including the National Science Foundation's NSFNET, the Department of Energy's ESnet, and the Department of Defense's MILNET. NSI also has international connections to Japan, Australia, New Zealand, Chile, and several European countries. NSI cooperates with other government agencies as well as academic and commercial organizations to implement networking technologies which foster interoperability, improve reliability and performance, increase security and control, and expedite migration to the OSI protocols.

  19. The scientific research potential of virtual worlds.

    PubMed

    Bainbridge, William Sims

    2007-07-27

    Online virtual worlds, electronic environments where people can work and interact in a somewhat realistic manner, have great potential as sites for research in the social, behavioral, and economic sciences, as well as in human-centered computer science. This article uses Second Life and World of Warcraft as two very different examples of current virtual worlds that foreshadow future developments, introducing a number of research methodologies that scientists are now exploring, including formal experimentation, observational ethnography, and quantitative analysis of economic markets or social networks.

  20. Training the Future - Interns Harvesting & Testing Plant Experim

    NASA Image and Video Library

    2017-07-19

    In the Space Life Sciences Laboratory at NASA's Kennedy Space Center in Florida, student interns such as Ayla Grandpre, left, and Payton Barnwell are joining agency scientists, contributing in the area of plant growth research for food production in space. Grandpre is pursuing a degree in computer science and chemistry at Rocky Mountain College in Billings, Montana. Barnwell is a mechanical engineering and nanotechnology major at Florida Polytechnic University. The agency attracts its future workforce through the NASA Internship, Fellowships and Scholarships, or NIFS, Program.

  1. Games and Simulations for Climate, Weather and Earth Science Education

    NASA Astrophysics Data System (ADS)

    Russell, R. M.

    2014-12-01

    We will demonstrate several interactive, computer-based simulations, games, and other interactive multimedia. These resources were developed for weather, climate, atmospheric science, and related Earth system science education. The materials were created by the UCAR Center for Science Education. These materials have been disseminated via our web site (SciEd.ucar.edu), webinars, online courses, teacher workshops, and large touchscreen displays in weather and Sun-Earth connections exhibits in NCAR's Mesa Lab facility in Boulder, Colorado. Our group has also assembled a web-based list of similar resources, especially simulations and games, from other sources that touch upon weather, climate, and atmospheric science topics. We'll briefly demonstrate this directory. More info available at: scied.ucar.edu/events/agu-2014-games-simulations-sessions

  2. Project LASER

    NASA Technical Reports Server (NTRS)

    1990-01-01

    NASA formally launched Project LASER (Learning About Science, Engineering and Research) in March 1990, a program designed to help teachers improve science and mathematics education and to provide 'hands-on' experiences. It featured the first LASER Mobile Teacher Resource Center (MTRC), designed to reach educators all over the nation. NASA hopes to operate several MTRCs with funds provided by private industry. The mobile unit is a 22-ton tractor-trailer stocked with NASA educational publications and outfitted with six work stations. Each work station, which can accommodate two teachers at a time, has a computer providing access to NASA Spacelink. Each also has video recorders and photocopy/photographic equipment for the teachers' use. MTRC is only one of the five major elements within LASER. The others are: a Space Technology Course, to promote integration of space science studies with traditional courses; the Volunteer Databank, in which NASA employees are encouraged to volunteer as tutors, instructors, etc.; Mobile Discovery Laboratories that will carry simple laboratory equipment and computers to provide hands-on activities for students and demonstrations of classroom activities for teachers; and the Public Library Science Program, which will present library-based science and math programs.

  3. e-Science on Earthquake Disaster Mitigation by EUAsiaGrid

    NASA Astrophysics Data System (ADS)

    Yen, Eric; Lin, Simon; Chen, Hsin-Yen; Chao, Li; Huang, Bor-Shoh; Liang, Wen-Tzong

    2010-05-01

    Although earthquakes cannot yet be predicted, with the aid of accurate seismic wave propagation analysis we can simulate the potential hazards at all distances from possible fault sources by understanding the source rupture process during large earthquakes. With the integration of a strong ground-motion sensor network, an earthquake data center, and seismic wave propagation analysis over the gLite e-Science infrastructure, we can gain much better knowledge of the impact and vulnerability associated with potential earthquake hazards. This application also demonstrates an e-Science approach to investigating unknown Earth structure. Regional integration of earthquake sensor networks can aid fast event reporting and accurate event data collection. Federation of earthquake data centers entails consolidation and sharing of seismology and geology knowledge. Capability building in seismic wave propagation analysis implies the predictability of potential hazard impacts. With the gLite infrastructure and the EUAsiaGrid collaboration framework, earth scientists from Taiwan, Vietnam, the Philippines, and Thailand are working together to alleviate potential seismic threats by making use of Grid technologies and to support seismology research through e-Science. A cross-continental e-infrastructure, based on EGEE and EUAsiaGrid, has been established for seismic wave forward simulation and risk estimation. Both the computing challenge of seismic wave analysis among five European and Asian partners and the data challenge of data center federation have been exercised and verified. A Seismogram-on-Demand service has also been developed for the automatic generation of a seismogram at any sensor point for a specific epicenter. To ease access to all the services based on users' workflows and retain maximal flexibility, a Seismology Science Gateway integrating data, computation, workflow, services, and user communities will be implemented based on typical use cases. In the future, extension of earthquake wave propagation analysis to tsunami mitigation would be feasible once user community support is in place.
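
    For a sense of what a seismogram-on-demand request can look like, here is a hedged sketch using the open ObsPy FDSN client rather than the project's own gateway (whose interface is not documented here); the network, station, and origin time below are placeholders:

        from obspy import UTCDateTime
        from obspy.clients.fdsn import Client

        client = Client("IRIS")  # any FDSN-compliant data center would work
        origin = UTCDateTime("2010-03-04T00:18:52")  # placeholder event origin time

        # Fetch a 10-minute vertical-component window at one sensor point and plot it.
        st = client.get_waveforms(network="TW", station="NACB", location="*",
                                  channel="BHZ", starttime=origin,
                                  endtime=origin + 600)
        st.plot()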

  4. EC97-44347-15

    NASA Image and Video Library

    1997-12-11

    This console and its complement of computers, monitors and communications equipment make up the Research Engineering Test Station, the nerve center for an aerodynamics experiment conducted by NASA's Dryden Flight Research Center, Edwards, California. The equipment was installed on a modified Lockheed L-1011 Tristar jetliner operated by Orbital Sciences Corp., of Dulles, Va., for Dryden's Adaptive Performance Optimization project. The experiment sought to improve the efficiency of long-range jetliners by using small movements of the ailerons to improve the aerodynamics of the wing at cruise conditions.

  5. Final Technical Report - Center for Technology for Advanced Scientific Component Software (TASCS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sussman, Alan

    2014-10-21

    This is a final technical report for the University of Maryland work in the SciDAC Center for Technology for Advanced Scientific Component Software (TASCS). The Maryland work focused on software tools for coupling parallel software components built using the Common Component Architecture (CCA) APIs. Those tools are based on the Maryland InterComm software framework that has been used in multiple computational science applications to build large-scale simulations of complex physical systems that employ multiple separately developed codes.

  6. CERT TST November 2016 Visit Summary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Little, Robert Currier; Bailey, Teresa S.; Kahler, III, Albert Comstock

    2017-04-27

    The dozen-plus presentations covered the span of the Center's activities, including experimental progress, simulations of the experiments (both for calibration and validation), UQ analysis, nuclear data impacts, status of simulation codes, methods development, computational science progress, and plans for upcoming priorities. All three institutions comprising the Center (Texas A&M, University of Colorado Boulder, and Simon Fraser University) were represented. Center-supported students not only gave two of the oral presentations, but also highlighted their research in a number of excellent posters.

  7. RIACS/USRA

    NASA Technical Reports Server (NTRS)

    Oliger, Joseph

    1993-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on 6 June 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under contract with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. A flexible scientific staff is provided through a university faculty visitor program, a post doctoral program, and a student visitor program. Not only does this provide appropriate expertise but it also introduces scientists outside of NASA to NASA problems. A small group of core RIACS staff provides continuity and interacts with an ARC technical monitor and scientific advisory group to determine the RIACS mission. RIACS activities are reviewed and monitored by a USRA advisory council and ARC technical monitor. Research at RIACS is currently being done in the following areas: Parallel Computing, Advanced Methods for Scientific Computing, High Performance Networks and Technology, and Learning Systems. Parallel compiler techniques, adaptive numerical methods for flows in complicated geometries, and optimization were identified as important problems to investigate for ARC's involvement in the Computational Grand Challenges of the next decade.

  8. The Science DMZ: A Network Design Pattern for Data-Intensive Science

    DOE PAGES

    Dart, Eli; Rotman, Lauren; Tierney, Brian; ...

    2014-01-01

    The ever-increasing scale of scientific data has become a significant challenge for researchers who rely on networks to interact with remote computing systems and transfer results to collaborators worldwide. Despite the availability of high-capacity connections, scientists struggle with inadequate cyberinfrastructure that cripples data transfer performance and impedes scientific progress. The Science DMZ paradigm comprises a proven set of network design patterns that collectively address these problems for scientists. We explain the Science DMZ model, including network architecture, system configuration, cybersecurity, and performance tools, which together create an optimized network environment for science. We describe use cases from universities, supercomputing centers, and research laboratories, highlighting the effectiveness of the Science DMZ model in diverse operational settings. In all, the Science DMZ model is a solid platform that supports any science workflow and flexibly accommodates emerging network technologies. As a result, the Science DMZ vastly improves collaboration, accelerating scientific discovery.
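
    As a hedged illustration of the performance-measurement side of this model, the stdlib-only Python sketch below times a memory-to-memory TCP transfer between two hosts, in the spirit of the active throughput tests (e.g., iperf or perfSONAR) that a Science DMZ deploys; the hostname, port, and transfer volume are placeholders, and this is not the paper's tooling.

    ```python
    import socket
    import time

    CHUNK = 1 << 20           # 1 MiB payload per send
    TOTAL = 256 * CHUNK       # 256 MiB test volume

    def serve(port=5201):
        """Sink side: accept one connection and discard received bytes."""
        with socket.create_server(("", port)) as srv:
            conn, _ = srv.accept()
            with conn:
                while conn.recv(CHUNK):
                    pass

    def probe(host, port=5201):
        """Source side: time a memory-to-memory transfer and report it."""
        buf = b"\0" * CHUNK
        sent, start = 0, time.monotonic()
        with socket.create_connection((host, port)) as s:
            while sent < TOTAL:
                s.sendall(buf)
                sent += CHUNK
        gbps = sent * 8 / (time.monotonic() - start) / 1e9
        print(f"{sent / 2**20:.0f} MiB transferred: {gbps:.2f} Gbit/s")

    # Usage: run serve() on the receiving host, then on the sender:
    #   probe("dtn.example.org")    # hypothetical data transfer node
    ```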

  9. Global satellite composites - 20 years of evolution

    NASA Astrophysics Data System (ADS)

    Kohrs, Richard A.; Lazzara, Matthew A.; Robaidek, Jerrold O.; Santek, David A.; Knuth, Shelley L.

    2014-01-01

    For two decades, the University of Wisconsin Space Science and Engineering Center (SSEC) and the Antarctic Meteorological Research Center (AMRC) have been creating global, regional and hemispheric satellite composites. These composites have proven useful in research, operational forecasting, commercial applications and educational outreach. Using the Man computer Interactive Data Access System (McIDAS) software developed at SSEC, infrared window composites were created by combining Geostationary Operational Environmental Satellite (GOES) and polar-orbiting data from the SSEC Data Center with polar data acquired at McMurdo and Palmer stations, Antarctica. Increased computer processing speed has allowed for more advanced algorithms to address the decision-making process for co-located pixels. The algorithms have evolved from a simplistic maximum brightness temperature to those that account for distance from the sub-satellite point, parallax displacement, pixel time and resolution. The composites are the state-of-the-art means for merging/mosaicking satellite imagery.
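
    The abstract does not give SSEC's actual scoring formula, so the sketch below is only a hedged illustration of the stated evolution: instead of keeping the maximum brightness temperature among co-located pixels, each candidate layer is scored with penalties for distance from the sub-satellite point and pixel age, and the best layer wins per pixel. Array names and weights are illustrative assumptions.

    ```python
    import numpy as np

    def composite(bt, sat_dist, age_min, w_dist=0.5, w_age=0.2):
        """Select one satellite layer per pixel.

        bt, sat_dist, age_min: (n_layers, ny, nx) arrays of brightness
        temperature, distance from the sub-satellite point (degrees),
        and pixel age (minutes). Returns the composited (ny, nx) image.
        """
        score = bt - w_dist * sat_dist - w_age * age_min
        best = np.argmax(score, axis=0)               # winning layer per pixel
        return np.take_along_axis(bt, best[None], axis=0)[0]

    rng = np.random.default_rng(0)
    layers = rng.normal(260.0, 15.0, size=(3, 4, 4))  # 3 overlapping satellites
    dist = rng.uniform(0.0, 60.0, size=(3, 4, 4))     # deg from sub-satellite pt
    age = rng.uniform(0.0, 30.0, size=(3, 4, 4))      # minutes since scan
    print(composite(layers, dist, age))
    ```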

  10. Computational oncology.

    PubMed

    Lefor, Alan T

    2011-08-01

    Oncology research has traditionally been conducted using techniques from the biological sciences. The new field of computational oncology has forged a new relationship between the physical sciences and oncology to further advance research. By applying physics and mathematics to oncologic problems, new insights will emerge into the pathogenesis and treatment of malignancies. One major area of investigation in computational oncology centers on the acquisition and analysis of data, using improved computing hardware and software. Large databases of cellular pathways are being analyzed to understand the interrelationships among complex biological processes. Computer-aided detection is being applied to the analysis of routine imaging data, including mammography and chest imaging, to improve the accuracy and detection rate for population screening. The second major area of investigation uses computers to construct sophisticated mathematical models of individual cancer cells as well as larger systems, using partial differential equations. These models are further refined with clinically available information to more accurately reflect living systems. One of the major obstacles in the partnership between physical scientists and the oncology community is communication. Standard ways to convey information must be developed. Future progress in computational oncology will depend on close collaboration between clinicians and investigators to further the understanding of cancer using these new approaches.
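
    As a minimal, hedged example of the PDE modeling described above (not a model from the article), the sketch below integrates a 1-D Fisher-KPP reaction-diffusion equation for normalized tumor-cell density, du/dt = D d2u/dx2 + r u (1 - u), with an explicit finite-difference scheme; parameters are illustrative, not clinically calibrated.

    ```python
    import numpy as np

    D, r = 0.01, 1.0                      # diffusion and proliferation rates
    nx, dx, dt, steps = 200, 0.1, 0.01, 2000

    u = np.zeros(nx)
    u[nx // 2 - 5 : nx // 2 + 5] = 1.0    # small initial lesion

    for _ in range(steps):
        # discrete Laplacian (periodic boundaries, adequate for a sketch)
        lap = (np.roll(u, 1) - 2 * u + np.roll(u, -1)) / dx**2
        u += dt * (D * lap + r * u * (1 - u))

    frac = (u > 0.5).mean()
    print(f"invaded fraction after {steps * dt:.0f} time units: {frac:.2f}")
    ```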

  11. Research Reports: 1988 NASA/ASEE Summer Faculty Fellowship Program

    NASA Technical Reports Server (NTRS)

    Freeman, L. Michael (Editor); Chappell, Charles R. (Editor); Cothran, Ernestine K. (Editor); Karr, Gerald R. (Editor)

    1988-01-01

    The basic objectives are to further the professional knowledge of qualified engineering and science faculty members; to stimulate an exchange of ideas between participants and NASA; to enrich and refresh the research and teaching activities of the participants' institutions; and to contribute to the research objectives of the NASA centers. Topics addressed include: cryogenics; thunderstorm simulation; computer techniques; computer-assisted instruction; systems analysis; weather forecasting; rocket engine design; crystal growth; control systems design; turbine pumps for the Space Shuttle Main Engine; electron mobility; heat transfer predictions; rotor dynamics; mathematical models; computational fluid dynamics; and structural analysis.

  12. NASA Intelligent Systems Project: Results, Accomplishments and Impact on Science Missions.

    NASA Astrophysics Data System (ADS)

    Coughlan, J. C.

    2005-12-01

    The Intelligent Systems Project was responsible for much of NASA's programmatic investment in artificial intelligence and advanced information technologies. IS has completed three major project milestones which demonstrated increased capabilities in autonomy, human-centered computing, and intelligent data understanding. Autonomy involves the ability of a robot to place an instrument on a remote surface with a single command cycle; human-centered computing supported a collaborative, mission-centric data and planning system for the Mars Exploration Rovers; and data understanding has produced key components of a terrestrial satellite observation system with automated modeling and data analysis capabilities. This paper summarizes the technology demonstrations and the metrics that quantify these new technologies, which are now available for future NASA missions.

  13. NASA Intelligent Systems Project: Results, Accomplishments and Impact on Science Missions

    NASA Technical Reports Server (NTRS)

    Coughlan, Joseph C.

    2005-01-01

    The Intelligent Systems Project was responsible for much of NASA's programmatic investment in artificial intelligence and advanced information technologies. IS has completed three major project milestones which demonstrated increased capabilities in autonomy, human-centered computing, and intelligent data understanding. Autonomy involves the ability of a robot to place an instrument on a remote surface with a single command cycle. Human-centered computing supported a collaborative, mission-centric data and planning system for the Mars Exploration Rovers, and data understanding has produced key components of a terrestrial satellite observation system with automated modeling and data analysis capabilities. This paper summarizes the technology demonstrations and the metrics that quantify these new technologies, which are now available for future NASA missions.

  14. Rapid Speech Transmission Index predictions and auralizations of unusual instructional spaces at MIT's new Stata Center

    NASA Astrophysics Data System (ADS)

    Conant, David A.

    2005-04-01

    The Stata Center for Computer, Information and Intelligence Sciences, recently opened at the Massachusetts Institute of Technology, includes a variety of oddly shaped seminar rooms in addition to lecture spaces of somewhat more conventional form. The architect's design approach precluded reliance on conventional, well-understood room-acoustical behavior, yet MIT and the design team were keenly interested in ensuring that these spaces functioned exceptionally well acoustically. CATT-Acoustic room modeling was employed to assess RASTI through multiple design iterations for all these spaces. Presented here are computational and descriptive results achieved for these rooms, which are highly regarded by faculty. They all sound peculiarly good, given their unusual form. In addition, binaural auralizations for selected spaces are provided.

  15. Quantum Machine Learning

    NASA Technical Reports Server (NTRS)

    Biswas, Rupak

    2018-01-01

    Quantum computing promises an unprecedented ability to solve intractable problems by harnessing quantum mechanical effects such as tunneling, superposition, and entanglement. The Quantum Artificial Intelligence Laboratory (QuAIL) at NASA Ames Research Center is the space agency's primary facility for conducting research and development in quantum information sciences. QuAIL conducts fundamental research in quantum physics but also explores how best to exploit and apply this disruptive technology to enable NASA missions in aeronautics, Earth and space sciences, and space exploration. At the same time, machine learning has become a major focus in computer science and captured the imagination of the public as a panacea to myriad big data problems. In this talk, we will discuss how classical machine learning can take advantage of quantum computing to significantly improve its effectiveness. Although we illustrate this concept on a quantum annealer, other quantum platforms could be used as well. If explored fully and implemented efficiently, quantum machine learning could greatly accelerate a wide range of tasks leading to new technologies and discoveries that will significantly change the way we solve real-world problems.
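
    Quantum annealers of the kind mentioned in the talk minimize quadratic unconstrained binary optimization (QUBO) objectives, and many machine-learning subproblems can be cast in that form. The sketch below is a hedged toy, not QuAIL code: it encodes a made-up feature-selection trade-off as a QUBO matrix (diagonal terms reward useful features, off-diagonal terms penalize redundant pairs) and minimizes it by brute force, which is the job an annealer would do at scale.

    ```python
    import itertools

    import numpy as np

    # QUBO: minimize x^T Q x over binary vectors x. Values are illustrative.
    Q = np.array([[-3.0,  2.0,  0.0],
                  [ 2.0, -2.0,  1.5],
                  [ 0.0,  1.5, -1.0]])

    best_x, best_e = None, float("inf")
    for bits in itertools.product([0, 1], repeat=len(Q)):
        x = np.array(bits)
        e = x @ Q @ x                    # energy of this assignment
        if e < best_e:
            best_x, best_e = x, e

    print(f"selected features {best_x}, energy {best_e:.1f}")
    ```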

  16. Earth System Grid II, Turning Climate Datasets into Community Resources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Middleton, Don

    2006-08-01

    The Earth System Grid (ESG) II project, funded by the Department of Energy’s Scientific Discovery through Advanced Computing program, has transformed climate data into community resources. ESG II has accomplished this goal by creating a virtual collaborative environment that links climate centers and users around the world to models and data via a computing Grid, which is based on the Department of Energy’s supercomputing resources and the Internet. Our project’s success stems from partnerships between climate researchers and computer scientists to advance basic and applied research in the terrestrial, atmospheric, and oceanic sciences. By interfacing with other climate science projects, we have learned that commonly used methods to manage and remotely distribute data among related groups lack infrastructure and under-utilize existing technologies. Knowledge and expertise gained from ESG II have helped the climate community plan strategies to manage a rapidly growing data environment more effectively. Moreover, approaches and technologies developed under the ESG project have impacted data-simulation integration in other disciplines, such as astrophysics, molecular biology, and materials science.

  17. Large-Scale NASA Science Applications on the Columbia Supercluster

    NASA Technical Reports Server (NTRS)

    Brooks, Walter

    2005-01-01

    Columbia, NASA's newest 61-teraflops supercomputer that became operational late last year, is a highly integrated Altix cluster of 10,240 processors, and was named to honor the crew of the Space Shuttle Columbia, lost in early 2003. Constructed in just four months, Columbia increased NASA's computing capability ten-fold and revitalized the Agency's high-end computing efforts. Significant cutting-edge science and engineering simulations in the areas of space and Earth sciences, as well as aeronautics and space operations, are already occurring on this largest operational Linux supercomputer, demonstrating its capacity and capability to accelerate NASA's space exploration vision. The presentation will describe how an integrated environment consisting not only of next-generation systems, but also modeling and simulation, high-speed networking, parallel performance optimization, and advanced data analysis and visualization, is being used to reduce design cycle time, accelerate scientific discovery, conduct parametric analysis of multiple scenarios, and enhance safety during the life cycle of NASA missions. The talk will conclude by discussing how NAS partnered with various NASA centers, other government agencies, the computer industry, and academia to create a national resource in large-scale modeling and simulation.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    East, D. R.; Sexton, J.

    This was a collaborative effort between Lawrence Livermore National Security, LLC as manager and operator of Lawrence Livermore National Laboratory (LLNL) and IBM TJ Watson Research Center to research, assess feasibility and develop an implementation plan for a High Performance Computing Innovation Center (HPCIC) in the Livermore Valley Open Campus (LVOC). The ultimate goal of this work was to help advance the State of California and U.S. commercial competitiveness in the arena of High Performance Computing (HPC) by accelerating the adoption of computational science solutions, consistent with recent DOE strategy directives. The desired result of this CRADA was a well-researched, carefully analyzed market evaluation that would identify those firms in core sectors of the US economy seeking to adopt or expand their use of HPC to become more competitive globally, and to define how those firms could be helped by the HPCIC with IBM as an integral partner.

  19. Space physics analysis network node directory (The Yellow Pages): Fourth edition

    NASA Technical Reports Server (NTRS)

    Peters, David J.; Sisson, Patricia L.; Green, James L.; Thomas, Valerie L.

    1989-01-01

    The Space Physics Analysis Network (SPAN) is a component of the global DECnet Internet, which has over 17,000 host computers. The growth of SPAN from its implementation in 1981 to its present size of well over 2,500 registered SPAN host computers has created a need for users to acquire timely information about the network through a central source. The SPAN Network Information Center (SPAN-NIC), an online facility managed by the National Space Science Data Center (NSSDC), was developed to meet this need for SPAN-wide information. The remote node descriptive information in this document is not currently contained in the SPAN-NIC database, but will be incorporated in the near future. Access to this information is also available to non-DECnet users over a variety of networks such as Telenet, the NASA Packet Switched System (NPSS), and the TCP/IP Internet. This publication serves as the Yellow Pages for SPAN node information. The document also provides key information concerning other computer networks connected to SPAN, nodes associated with each SPAN routing center, science discipline nodes, contacts for primary SPAN nodes, and SPAN reference information. A section on DECnet internetworking discusses SPAN connections with other wide-area DECnet networks (many with thousands of nodes each). Another section lists node names and their disciplines, countries, and institutions in the SPAN Network Information Center Online Data Base System. All remote sites connected to US-SPAN and European-SPAN (E-SPAN) are indexed. Also provided is information on the SPAN tail circuits, i.e., those remote nodes connected directly to a SPAN routing center, which is the local point of contact for resolving SPAN-related problems. Reference material is included for those who wish to know more about SPAN. Because of the rapid growth of SPAN, the SPAN Yellow Pages is reissued periodically.

  20. Sandia QIS Capabilities.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Muller, Richard P.

    2017-07-01

    Sandia National Laboratories has developed a broad set of capabilities in quantum information science (QIS), including elements of quantum computing, quantum communications, and quantum sensing. The Sandia QIS program is built atop unique DOE investments at the laboratories, including the MESA microelectronics fabrication facility, the Center for Integrated Nanotechnologies (CINT) facilities (joint with LANL), the Ion Beam Laboratory, and ASC High Performance Computing (HPC) facilities. Sandia has invested $75M of LDRD funding over 12 years to develop unique, differentiating capabilities that leverage these DOE infrastructure investments.

  1. Virtual test: A student-centered software to measure student's critical thinking on human disease

    NASA Astrophysics Data System (ADS)

    Rusyati, Lilit; Firman, Harry

    2016-02-01

    The study "Virtual Test: A Student-Centered Software to Measure Student's Critical Thinking on Human Disease" is descriptive research. The background is importance of computer-based test that use element and sub element of critical thinking. Aim of this study is development of multiple choices to measure critical thinking that made by student-centered software. Instruments to collect data are (1) construct validity sheet by expert judge (lecturer and medical doctor) and professional judge (science teacher); and (2) test legibility sheet by science teacher and junior high school student. Participants consisted of science teacher, lecturer, and medical doctor as validator; and the students as respondent. Result of this study are describe about characteristic of virtual test that use to measure student's critical thinking on human disease, analyze result of legibility test by students and science teachers, analyze result of expert judgment by science teachers and medical doctor, and analyze result of trial test of virtual test at junior high school. Generally, result analysis shown characteristic of multiple choices to measure critical thinking was made by eight elements and 26 sub elements that developed by Inch et al.; complete by relevant information; and have validity and reliability more than "enough". Furthermore, specific characteristic of multiple choices to measure critical thinking are information in form science comic, table, figure, article, and video; correct structure of language; add source of citation; and question can guide student to critical thinking logically.

  2. Curriculum and Resources: Computer Provision in a CTC.

    ERIC Educational Resources Information Center

    Denholm, Lawrence

    The program for City Technical Colleges (CTCs) draws on ideas and resources from government, private industry, and education to focus on the educational needs of inner city and urban children. Mathematics, science, and technology are at the center of the CTCs' mission, in a context which includes economic awareness and a commitment to enterprise…

  3. Using Generic and Context-Specific Scaffolding to Support Authentic Science Inquiry

    ERIC Educational Resources Information Center

    Belland, Brian R.; Gu, Jiangyue; Armbrust, Sara; Cook, Brant

    2013-01-01

    In this conceptual paper, we propose a heuristic to balance context-specific and generic scaffolding, as well as computer-based and teacher scaffolding, during instruction centered on authentic, scientific problems. This paper is novel in that many researchers ask a dichotomous question of whether generic or context-specific scaffolding is best,…

  4. Reducing Nutrients and Nutrient Impacts Priority Issue Team - St. Louis Bay Project: Implementing Nutrients PIT Action Step 1.1

    NASA Technical Reports Server (NTRS)

    Mason, Ted

    2011-01-01

    The NASA Applied Science & Technology Project Office at Stennis Space Center (SSC) used satellites, in-situ measurements, and computational modeling to study relationships between water quality in St. Louis Bay, Mississippi, and the watershed characteristics of the Jourdan and Wolf rivers from 2000 to 2010.

  5. Providing Computer-Based Information Services to an Academic Community. Final Technical Report.

    ERIC Educational Resources Information Center

    Bayer, Bernard

    The Mechanized Information Center (MIC) at the Ohio State University conducts retrospective and current awareness searches for faculty, students, and staff using data bases for agriculture, chemistry, education, psychology, and social sciences, as well as a multidisciplinary data base. The final report includes (1) a description of the background…

  6. NASIC at MIT. Final Report, 1 March 1974 through 28 February 1975.

    ERIC Educational Resources Information Center

    Benenfeld, Alan R.; And Others

    Computer-based reference search services were provided to users on a fee-for-service basis at the Massachusetts Institute of Technology as the first, and experimental, node in the development of the Northeast Academic Science Information Center (NASIC). Development of a training program for information specialists and training materials is…

  7. 77 FR 57569 - Science Advisory Board to the National Center for Toxicological Research; Notice of Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-18

    ... overview of research activities from the NCTR Division of Bioinformatics and Computational Biology and the Division of Systems Biology. The SAB will also receive an update from the subcommittee on Immunotoxicology... advisory committee meetings and will make every effort to accommodate persons with physical disabilities or...

  8. 76 FR 1442 - Center for Scientific Review; Notice of Closed Meetings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-10

    ... Group; Macromolecular Structure and Function D Study Section. Date: February 8-9, 2011. Time: 8 a.m. to...; Biomedical Computing and Health Informatics Study Section. Date: February 8, 2011. Time: 8 a.m. to 5 p.m... Skin Sciences Integrated Review Group; Skeletal Muscle and Exercise Physiology Study Section. Date...

  9. Opening Comments: SciDAC 2008

    NASA Astrophysics Data System (ADS)

    Strayer, Michael

    2008-07-01

    Welcome to Seattle and the 2008 SciDAC Conference. This conference, the fourth in the series, is a continuation of the PI meetings we first began under SciDAC-1. I would like to start by thanking the organizing committee, and Rick Stevens in particular, for organizing this year's meeting. This morning I would like to look briefly at SciDAC, to give you a brief history of SciDAC and also look ahead to see where we plan to go over the next few years. I think the best description of SciDAC, at least the simulation part, comes from a quote from Dr Ray Orbach, DOE's Under Secretary for Science and Director of the Office of Science. In an interview that appeared in the SciDAC Review magazine, Dr Orbach said, `SciDAC is unique in the world. There isn't any other program like it anywhere else, and it has the remarkable ability to do science by bringing together physical scientists, mathematicians, applied mathematicians, and computer scientists who recognize that computation is not something you do at the end, but rather it needs to be built into the solution of the very problem that one is addressing'. Of course, that is extended not just to physical scientists, but also to biological scientists. This is a theme of computational science, this partnership among disciplines, which goes all the way back to the early 1980s and Ken Wilson. It's a unique thread within the Department of Energy. SciDAC-1, launched around the turn of the millennium, created a new generation of scientific simulation codes. It advocated building out mathematical and computing system software in support of science and a new collaboratory software environment for data. The original concept for SciDAC-1 had topical centers for the execution of the various science codes, but several corrections and adjustments were needed. The ASCR scientific computing infrastructure was also upgraded, providing the hardware facilities for the program. The computing facility that we had at that time was the 3 teraflop/s center at NERSC, which had to be shared with the programmatic side supporting research across DOE. At the time, ESnet had just slightly over half a gigabit per second of bandwidth, and the science being addressed was accelerator science, climate, chemistry, fusion, astrophysics, materials science, and QCD. We built out the national collaboratories from the ASCR office, and in addition we built Integrated Software Infrastructure Centers (ISICs). Of these, three were in applied mathematics, four in computer science (including a performance evaluation research center), and four were collaboratories or Grid projects having to do with data management. For science, there were remarkable breakthroughs in simulation, such as full 3D laboratory-scale flame simulation. There were also significant improvements in application codes - by factors of almost 3 to more than 100 - as people began to realize they had to integrate mathematics tools and computer science tools into their codes to take advantage of the parallelism of the day. The SciDAC data-mining tool, Sapphire, received a 2006 R&D 100 award. And the community as a whole worked well together and began building a publication record that was substantial. In 2006, we recompeted the program with similar goals - SciDAC-1 was very successful, and we wanted to continue that success and extend what was happening under SciDAC to the broader science community. We opened up the partnership to all of the Offices of Science and the NSF and the NNSA. 
The goal was to create comprehensive scientific computing software and the infrastructure for the software to enable scientific discovery in the physical, biological, and environmental sciences and take the simulations to an extreme scale, in this case petascale. We would also build out a new generation of data management tools. What we observed during SciDAC-1 was that the data and the data communities - both experimental data from large experimental facilities and observational data, along with simulation data - were expanding at a rate significantly faster than Moore's law. In the past few weeks, the FastBit indexing technology software tool for data analyses and data mining developed under SciDAC's Scientific Data Management project was recognized with an R&D 100 Award, selected by an independent judging panel and editors of R&D Magazine as one of the 100 most technologically significant products introduced into the marketplace over the past year. For SciDAC-2 we had nearly 250 proposals requesting a total of slightly over $1 billion in funding. Of course, we had nowhere near $1 billion. The facilities and the science we ended up with were not significantly different from what we had in SciDAC-1. But we had put in place substantially increased facilities for science. When SciDAC-1 was originally executed with the facilities at NERSC, there was significant impact on the resources at NERSC, because not only did we have an expanding portfolio of programmatic science, but we had the SciDAC projects that also needed to run at NERSC. Suddenly, NERSC was incredibly oversubscribed. With SciDAC-2, we had in place leadership-class computing facilities at Argonne with slightly more than half a petaflop and at Oak Ridge with slightly more than a quarter petaflop, with an upgrade to a petaflop planned for the end of this year. And we increased the production computing capacity at NERSC to 104 teraflop/s just so that we would not impact the programmatic research and so that we would have a startup facility for SciDAC. At the end of the summer, NERSC will be at 360 teraflop/s. Both the Oak Ridge system and the principal resource at NERSC are Cray systems; Argonne has a different architecture, an IBM Blue Gene/P. At the same time, ESnet has been built out, and we are on a path where we will have dual rings around the country, from 10 to 40 gigabits per second - a factor of 20 to 80 over what was available during SciDAC-1. The science areas include accelerator science and simulation, astrophysics, climate modeling and simulation, computational biology, fusion science, high-energy physics, petabyte high-energy/nuclear physics, materials science and chemistry, nuclear physics, QCD, radiation transport, turbulence, and groundwater reactive transport modeling and simulation. They were supported by new enabling technology centers and university-based institutes to develop an educational thread for the SciDAC program. There were four mathematics projects and four computer science projects; and under data management, we see a significant difference in that we are bringing up new visualization projects to support and sustain data-intensive science. When we look at the budgets, we see growth in the budget from just under $60 million for SciDAC-1 to just over $80 million for SciDAC-2. Part of the growth is due to bringing in NSF and NNSA as new partners, and some of the growth is due to some program offices increasing their investment in SciDAC, while other program offices are constant or have decreased their investment. 
This is not a reflection of their priorities per se but, rather, a reflection of the budget process and the difficult times in Washington during the past two years. New activities are under way in SciDAC - the annual PI meeting has turned into what I would describe as the premier interdisciplinary computational science meeting, one of the best in the world. Doing interdisciplinary meetings is difficult because people tend to develop a focus for their particular subject area. But this is the fourth in the series; and since the first meeting in San Francisco, these conferences have been remarkably successful. For SciDAC-2 we also created an outreach magazine, SciDAC Review, which highlights scientific discovery as well as high-performance computing. It's been very successful in telling the non-practitioners what SciDAC and computational science are all about. The other new instrument in SciDAC-2 is an outreach center. As we go from computing at the terascale to computing at the petascale, we face the problem of narrowing our research community. The number of people who are `literate' enough to compute at the terascale is more than the number of those who can compute at the petascale. To address this problem, we established the SciDAC Outreach Center to bring people into the fold and educate them as to how we do SciDAC, how the teams are composed, and what it really means to compute at scale. The resources I have mentioned don't come for free. As part of the HECRTF law of 2005, Congress mandated that the Secretary would ensure that leadership-class facilities would be open to everyone across all agencies. So we took Congress at its word, and INCITE is our instrument for making allocations at the leadership-class facilities at Argonne and Oak Ridge, as well as smaller allocations at NERSC. Therefore, the selected proposals are very large projects that are computationally intensive, that compute at scale, and that have a high science impact. An important feature is that INCITE is completely open to anyone - there is no requirement of DOE Office of Science funding, and proposals are rigorously reviewed for both the science and the computational readiness. In 2008, more than 100 proposals were received, requesting about 600 million processor-hours. We allocated just over a quarter of a billion processor-hours. Astrophysics, materials science, lattice gauge theory, and high energy and nuclear physics were the major areas. These were the teams that were computationally ready for the big machines and that had significant science they could identify. In 2009, there will be a significant increase in the amount of time to be allocated, over half a billion processor-hours. The deadline is August 11 for new proposals and September 12 for renewals. We anticipate a significant increase in the number of requests this year. We expect you - as successful SciDAC centers, institutes, or partnerships - to compete for and win INCITE program allocation awards. If you have a successful SciDAC proposal, we believe it will make you successful in the INCITE review. We have the expectation that you will be among those most prepared and most ready to use the machines and to compute at scale. Over the past 18 months, we have assembled a team to look across our computational science portfolio and to judge what are the 10 most significant science accomplishments. The ASCR office, as it goes forward with OMB, the new administration, and Congress, will be judged by the science we have accomplished. 
All of our proposals - such as for increasing SciDAC, increasing applied mathematics, and so on - are tied to what we have accomplished in science. And so these 10 big accomplishments are key to establishing credibility for new budget requests. Tony Mezzacappa, who chaired the committee, will also give a presentation on the ranking of these top 10, how they got there, and what the science is all about. Here is the list - numbers 2, 5, 6, 7, 9, and 10 are all SciDAC projects.

    1. Modeling the Molecular Basis of Parkinson's Disease (Tsigelny)
    2. Discovery of the Standing Accretion Shock Instability and Pulsar Birth Mechanism in a Core-Collapse Supernova Evolution and Explosion (Blondin)
    3. Prediction and Design of Macromolecular Structures and Functions (Baker)
    4. Understanding How a Lifted Flame Is Stabilized in a Hot Coflow (Yoo)
    5. New Insights from LCF-enabled Advanced Kinetic Simulations of Global Turbulence in Fusion Systems (Tang)
    6. High Transition Temperature Superconductivity: A High-Temperature Superconductive State and a Pairing Mechanism in the 2-D Hubbard Model (Scalapino)
    7. PETSc: Providing the Solvers for DOE High-Performance Simulations (Smith)
    8. Via Lactea II, a Billion-Particle Simulation of the Dark Matter Halo of the Milky Way (Madau)
    9. Probing the Properties of Water through Advanced Computing (Galli)
    10. First Provably Scalable Maxwell Solver Enables Scalable Electromagnetic Simulations (Kolev)

So, what's the future going to look like for us? The office is putting together an initiative with the community, which we call the E3 Initiative. We're looking at a 10-year horizon for what's going to happen. Through the series of town hall meetings, which many of you participated in, we have produced a document on `Transforming Energy, the Environment and Science through simulations at the eXtreme Scale'; it can be found at http://www.science.doe.gov/ascr/ProgramDocuments/TownHall.pdf . We sometimes call it the Exascale initiative. Exascale computing is the gold-ring level of computing that seems just out of reach; but if we work hard and stretch, we just might be able to reach it. We envision that there will be a SciDAC-X, working at the extreme scale, with SciDAC teams that will perform and carry out science in the areas that will have a great societal impact, such as alternative fuels and transportation, combustion, climate, fusion science, high-energy physics, advanced fuel cycles, carbon management, and groundwater. We envision institutes for applied mathematics and computer science that probably will segue into algorithms because, at the extreme scale, we see the applied math, the algorithm per se, and its implementation in computer science as being inseparable. We envision an INCITE-X with multi-petaflop platforms, perhaps even exaflop computing resources. ESnet will be best in class - our 10-year plan calls for having 400 terabits per second of capacity available in dual rings around the country, an enormously fast data communications network for moving large amounts of data. In looking at where we've been and where we are going, we can see that the gigaflops and teraflops era was a regime where we were following Moore's law through advances in clock speed. In the current regime, we're introducing massive parallelism, which I think is exemplified by Intel's announcement of their teraflop chip, where they envision more than a thousand cores on a chip. 
But in order to reach exascale, extrapolations from current architectures point to machines that would require 100 megawatts of power. It's clearly going to require novel architectures, things we have perhaps not yet envisioned. It is of course an era of challenge. There will be an unpredictable evolution of hardware if we are to reach the exascale; and there will clearly be multilevel heterogeneous parallelism, including multilevel memory hierarchies. We have no idea right now as to the programming models needed to execute at such an extreme scale. We have been incredibly successful at the petascale - we know that already. Managing data and just getting communications to scale is an enormous challenge. And it's not just the extreme scaling. It's the rapid increase in complexity that represents the challenge. Let me end with a metaphor. In previous meetings we have talked about the road to petascale. Indeed, we have seen in hindsight that it was a road well traveled. But perhaps the road to exascale is not a road at all. Perhaps the metaphor will be akin to scaling the south face of K2. That's clearly not something all of us will be able to do, and probably computing at the exascale is not something all of us will do. But if we achieve that goal, perhaps the words of Emily Dickinson will best summarize where we will be. Perhaps in her words, looking backward and down, you will say:

    I climb the `Hill of Science,'
    I view the landscape o'er;
    Such transcendental prospect
    I ne'er beheld before!

  10. The quantum computer game: citizen science

    NASA Astrophysics Data System (ADS)

    Damgaard, Sidse; Mølmer, Klaus; Sherson, Jacob

    2013-05-01

    Progress in the field of quantum computation is hampered by daunting technical challenges. Here we present an alternative approach to solving these by enlisting the aid of computer players around the world. We have previously examined a quantum computation architecture involving ultracold atoms in optical lattices and strongly focused tweezers of light. In The Quantum Computer Game (see http://www.scienceathome.org/), we have encapsulated the time-dependent Schrödinger equation for the problem in a graphical user interface allowing for easy user input. Players can then search the parameter space with real-time graphical feedback in a game context with a global high-score list that rewards short gate times and robustness to experimental errors. The game, which is still in a demo version, has so far been tried by several hundred players. Extensions of the approach to other models such as Gross-Pitaevskii and Bose-Hubbard are currently under development. The game has also been incorporated into science education at high-school and university level as an alternative method for teaching quantum mechanics. Initial quantitative evaluation results are very positive. AU Ideas Center for Community Driven Research (CODER).
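
    The abstract does not reveal the game's internals, but the kind of solver such a game wraps can be sketched: a split-step Fourier integrator for the 1-D time-dependent Schrödinger equation with a Gaussian tweezer potential whose moving center stands in for the player's input. Units (hbar = m = 1), the potential, and all parameters below are illustrative assumptions, not the ScienceAtHome implementation.

    ```python
    import numpy as np

    nx, L, dt, steps = 512, 40.0, 0.005, 4000
    x = np.linspace(-L / 2, L / 2, nx, endpoint=False)
    k = 2 * np.pi * np.fft.fftfreq(nx, d=L / nx)

    def tweezer(center, depth=10.0, width=1.0):
        """Gaussian optical-tweezer potential centered where the player drags."""
        return -depth * np.exp(-((x - center) / width) ** 2)

    psi = np.exp(-x**2)                    # start near the trap ground state
    psi /= np.linalg.norm(psi)
    kinetic = np.exp(-0.5j * dt * k**2)    # full kinetic step in Fourier space

    for n in range(steps):
        c = 8.0 * n / steps                # "player input": drag trap 0 -> 8
        half_v = np.exp(-0.5j * dt * tweezer(c))
        psi = half_v * np.fft.ifft(kinetic * np.fft.fft(half_v * psi))

    # a game would score transport speed and how little the atom is excited
    print("final mean position:", float(np.sum(np.abs(psi) ** 2 * x)))
    ```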

  11. The Magellan Final Report on Cloud Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coghlan, Susan; Yelick, Katherine

    The goal of Magellan, a project funded through the U.S. Department of Energy (DOE) Office of Advanced Scientific Computing Research (ASCR), was to investigate the potential role of cloud computing in addressing the computing needs of the DOE Office of Science (SC), particularly related to serving the needs of mid-range computing and future data-intensive computing workloads. A set of research questions was formed to probe various aspects of cloud computing, from performance and usability to cost. To address these questions, a distributed testbed infrastructure was deployed at the Argonne Leadership Computing Facility (ALCF) and the National Energy Research Scientific Computing Center (NERSC). The testbed was designed to be flexible and capable enough to explore a variety of computing models and hardware design points in order to understand the impact for various scientific applications. During the project, the testbed also served as a valuable resource to application scientists. Applications from a diverse set of projects such as MG-RAST (a metagenomics analysis server), the Joint Genome Institute, the STAR experiment at the Relativistic Heavy Ion Collider, and the Laser Interferometer Gravitational Wave Observatory (LIGO) were used by the Magellan project for benchmarking within the cloud, but the project teams were also able to accomplish important production science utilizing the Magellan cloud resources.

  12. Intricacies of modern supercomputing illustrated with recent advances in simulations of strongly correlated electron systems

    NASA Astrophysics Data System (ADS)

    Schulthess, Thomas C.

    2013-03-01

    The continued thousand-fold improvement in sustained application performance per decade on modern supercomputers keeps opening new opportunities for scientific simulations. But supercomputers have become very complex machines, built with thousands or tens of thousands of nodes, each consisting of multiple CPU cores or, most recently, a combination of CPU and GPU processors. Efficient simulations on such high-end computing systems require tailored algorithms that optimally map numerical methods to particular architectures. These intricacies will be illustrated with simulations of strongly correlated electron systems, where the development of quantum cluster methods and Monte Carlo techniques, as well as their optimal implementation by means of algorithms with improved data locality and high arithmetic density, has gone hand in hand with evolving computer architectures. The present work would not have been possible without continued access to computing resources at the National Center for Computational Sciences at Oak Ridge National Laboratory, which is funded by the Facilities Division of the Office of Advanced Scientific Computing Research, and the Swiss National Supercomputing Center (CSCS), which is funded by ETH Zurich.

  13. Unique life sciences research facilities at NASA Ames Research Center

    NASA Technical Reports Server (NTRS)

    Mulenburg, G. M.; Vasques, M.; Caldwell, W. F.; Tucker, J.

    1994-01-01

    The Life Science Division at NASA's Ames Research Center has a suite of specialized facilities that enable scientists to study the effects of gravity on living systems. This paper describes some of these facilities and their use in research. Seven centrifuges, each with its own unique abilities, allow testing of a variety of parameters on test subjects ranging from single cells through hardware to humans. The Vestibular Research Facility allows the study of both centrifugation and linear acceleration in animals and humans. The Biocomputation Center uses computers for 3D reconstruction of physiological systems and interactive research tools for virtual reality modeling. Psychophysiological, cardiovascular, exercise physiology, and biomechanical studies are conducted in the 12-bed Human Research Facility, and samples are analyzed in the certified Central Clinical Laboratory and other laboratories at Ames. Human bedrest, water immersion, and lower body negative pressure equipment are also available to study physiological changes associated with weightlessness. These and other weightlessness models are used in specialized laboratories for the study of basic physiological mechanisms, metabolism, and cell biology. Visual-motor performance, perception, and adaptation are studied using ground-based models as well as short-term weightlessness experiments (parabolic flights). The unique combination of life science research facilities, laboratories, and equipment at Ames Research Center is described in detail in relation to their research contributions.

  14. Interfacing with in-Situ Data Networks during the Arctic Boreal Vulnerability Experiment (ABoVE)

    NASA Astrophysics Data System (ADS)

    McInerney, M.; Griffith, P. C.; Duffy, D.; Hoy, E.; Schnase, J. L.; Sinno, S.; Thompson, J. H.

    2014-12-01

    The Arctic Boreal Vulnerability Experiment (ABoVE) is designed to improve understanding of the causes and impacts of ecological changes in Arctic/boreal regions, and will integrate field-based studies, modeling, and data from airborne and satellite remote sensing. ABoVE will result in a fuller understanding of ecosystem vulnerability and resilience to environmental change in the Arctic and boreal regions of western North America, and provide scientific information required to develop options for societal responses to the impacts of these changes. The studies sponsored by NASA during ABoVE will be coordinated with research and in-situ monitoring activities being sponsored by a number of national and international partners. The NASA Center for Climate Simulation at the Goddard Space Flight Center has partnered with the NASA Carbon Cycle & Ecosystems Office to create a science cloud designed for this field campaign - the ABoVE Science Cloud (ASC). The ASC combines high performance computing with emerging technologies to create an environment specifically designed for large-scale modeling, analysis of remote sensing data, copious disk storage with integrated data management, and integration of core variables from in-situ networks identified by the ABoVE Science Definition Team. In this talk, we will present the scientific requirements driving the development of the ABoVE Science Cloud, discuss the necessary interfaces, both computational and human, with in-situ monitoring networks, and show examples of how the ASC is being used to meet the needs of the ABoVE campaign.

  15. INDIGO-DataCloud solutions for Earth Sciences

    NASA Astrophysics Data System (ADS)

    Aguilar Gómez, Fernando; de Lucas, Jesús Marco; Fiore, Sandro; Monna, Stephen; Chen, Yin

    2017-04-01

    INDIGO-DataCloud (https://www.indigo-datacloud.eu/) is a European Commission funded project aiming to develop a data and computing platform targeting scientific communities, deployable on multiple hardware platforms and provisioned over hybrid (private or public) e-infrastructures. The development of INDIGO solutions covers the different layers in cloud computing (IaaS, PaaS, SaaS) and provides tools to exploit resources like HPC or GPGPUs. INDIGO is oriented toward supporting European scientific research communities, which are well represented in the project. Twelve different case studies have been analyzed in detail from different fields: Biological & Medical sciences, Social sciences & Humanities, Environmental and Earth sciences, and Physics & Astrophysics. INDIGO-DataCloud provides solutions to emerging challenges in Earth science:

    - Enabling easy deployment of community services at different cloud sites. Earth science research infrastructures often involve observation stations distributed across countries, along with distributed data centers that support the corresponding data acquisition and curation. There is a need to easily deploy new data center services as the research infrastructure continues to grow. As an example, LifeWatch (ESFRI, Ecosystems and Biodiversity) uses INDIGO solutions to manage the deployment of services that perform complex hydrodynamics and water quality modelling over a cloud computing environment, predicting algae blooms, using Docker technology: TOSCA requirement descriptions, a Docker repository, the Orchestrator for deployment, AAI (AuthN, AuthZ), and OneData (a distributed storage system).

    - Supporting big data analysis. Many Earth science research communities produce large amounts of data and are challenged by the difficulties of processing and analysing it. A climate-model intercomparison data analysis case study for the European Network for Earth System Modelling (ENES) community has been set up, based on the Ophidia big data analysis framework and the Kepler workflow management system. Such services normally involve a large and distributed set of data and computing resources; in this regard, the case study exploits the INDIGO PaaS for flexible and dynamic allocation of resources at the infrastructural level.

    - Providing distributed data storage solutions. In order to allow scientific communities to perform heavy computation on huge datasets, INDIGO provides global data access solutions allowing researchers to access data in a distributed environment regardless of its location, and also to publish and share their research results with public or closed communities. INDIGO solutions that support access to distributed data storage (OneData) are being tested on EMSO infrastructure (Ocean Sciences and Geohazards) data. Another aspect of interest for the EMSO community is efficient data processing exploiting INDIGO services like the PaaS Orchestrator. Further, for HPC exploitation, a new solution named udocker has been implemented, enabling users to execute Docker containers on supercomputers without requiring administrative privileges.

    This presentation will overview INDIGO solutions that are interesting and useful for Earth science communities and will show how they can be applied to other case studies.
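
    As a hedged sketch of the udocker workflow mentioned above, the snippet below shells out to the udocker command line to pull an image, create a container, and run it without root privileges, e.g. inside an HPC batch job; the image and command are placeholders, and the exact options may vary across udocker versions.

    ```python
    import subprocess

    def run(cmd):
        """Echo and execute one CLI step, failing loudly on errors."""
        print("$", " ".join(cmd))
        subprocess.run(cmd, check=True)

    # Pull an image, instantiate it, and run a payload, all in user space.
    run(["udocker", "pull", "ubuntu:22.04"])
    run(["udocker", "create", "--name=analysis", "ubuntu:22.04"])
    run(["udocker", "run", "analysis", "/bin/echo", "hello from user space"])
    ```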

  16. New Center Links Earth, Space, and Information Sciences

    NASA Astrophysics Data System (ADS)

    Aswathanarayana, U.

    2004-05-01

    Broad-based geoscience instruction melding the Earth, space, and information technology sciences has been identified as an effective way to take advantage of the new jobs created by technological innovations in natural resources management. Based on this paradigm, the University of Hyderabad in India is developing a Centre of Earth and Space Sciences that will be linked to the university's supercomputing facility. The proposed center will provide the basic science underpinnings for the Earth, space, and information technology sciences; develop new methodologies for the utilization of natural resources such as water, soils, sediments, minerals, and biota; mitigate the adverse consequences of natural hazards; and design innovative ways of incorporating scientific information into the legislative and administrative processes. For these reasons, the ethos and the innovatively designed management structure of the center would be of particular relevance to developing countries. India holds 17% of the world's human population and 30% of its farm animals, but only about 2% of the planet's water resources. Water will hence constitute the core concern of the center, because ecologically sustainable, socially equitable, and economically viable management of the country's water resources holds the key to its quality of life (drinking water, sanitation, and health), food security, and industrial development. The center will be focused on interdisciplinary basic and applied research that is relevant to the practical needs of India as a developing country. These include, for example, climate prediction, since India is heavily dependent on the monsoon system, and satellite remote sensing of soil moisture, since agriculture is still a principal source of livelihood in India. The center will perform research and development in areas such as data assimilation and validation, and identification of new sensors to be mounted on Indian meteorological satellites to make measurements in the spectral bands and polarizations needed to address water resources management issues.

  17. The SCEC Community Modeling Environment (SCEC/CME) - An Overview of its Architecture and Current Capabilities

    NASA Astrophysics Data System (ADS)

    Maechling, P. J.; Jordan, T. H.; Minster, B.; Moore, R.; Kesselman, C.; SCEC ITR Collaboration

    2004-12-01

    The Southern California Earthquake Center (SCEC), in collaboration with the San Diego Supercomputer Center, the USC Information Sciences Institute, the Incorporated Research Institutions for Seismology, and the U.S. Geological Survey, is developing the Southern California Earthquake Center Community Modeling Environment (CME) under a five-year grant from the National Science Foundation's Information Technology Research (ITR) Program, jointly funded by the Geosciences and Computer and Information Science & Engineering Directorates. The CME system is an integrated geophysical simulation modeling framework that automates the process of selecting, configuring, and executing models of earthquake systems. During the Project's first three years, we have performed fundamental geophysical and information technology research and have also developed substantial system capabilities, software tools, and data collections that can help scientists perform systems-level earthquake science. The CME system provides collaborative tools to facilitate distributed research and development. These collaborative tools are primarily communication tools, providing researchers with access to information in ways that are convenient and useful. The CME system provides collaborators with access to significant computing and storage resources. The computing resources of the Project include in-house servers, Project allocations on the USC High Performance Computing Linux cluster, as well as allocations on NPACI supercomputers and the TeraGrid. The CME system provides access to SCEC community geophysical models such as the Community Velocity Model, Community Fault Model, Community Crustal Motion Model, and the Community Block Model. The organizations that develop these models often provide access to them, so it is not necessary to use the CME system to access these models. However, in some cases, the CME system supplements the SCEC community models with utility codes that make it easier to use or access these models. In some cases, the CME system also provides alternatives to the SCEC community models. The CME system hosts a collection of community geophysical software codes. These codes include seismic hazard analysis (SHA) programs developed by the SCEC/USGS OpenSHA group. Also, the CME system hosts anelastic wave propagation codes, including Kim Olsen's finite difference code and Carnegie Mellon's Hercules finite element tool chain. The CME system can execute a workflow, that is, a series of geophysical computations using the output of one processing step as the input to a subsequent step. Our workflow capability utilizes grid-based computing software that can submit calculations to a pool of computing resources, as well as data management tools that help us maintain an association between data files and metadata descriptions of those files. The CME system maintains, and provides access to, a collection of valuable geophysical data sets. The current CME Digital Library holdings include a collection of 60 ground motion simulation results calculated by a SCEC/PEER working group and a collection of Green's functions calculated for 33 TriNet broadband receiver sites in the Los Angeles area.
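
    The CME's actual workflow tools are grid-based and are not shown in this abstract; the toy sketch below only illustrates the stated idea of chaining computations, where each step consumes the files of the previous step and records metadata kept in association with its output. Step names, the file layout, and the metadata schema are invented for illustration.

    ```python
    import json
    import pathlib
    import time

    WORKDIR = pathlib.Path("run")

    def run_step(name, func, inputs):
        """Run one workflow step and record metadata next to its output."""
        WORKDIR.mkdir(exist_ok=True)
        out = WORKDIR / f"{name}.out"
        out.write_text(func(*(p.read_text() for p in inputs)))
        meta = {"step": name,
                "inputs": [str(p) for p in inputs],
                "created": time.strftime("%Y-%m-%dT%H:%M:%S")}
        (WORKDIR / f"{name}.meta.json").write_text(json.dumps(meta, indent=2))
        return out

    WORKDIR.mkdir(exist_ok=True)
    rupture = WORKDIR / "rupture.out"
    rupture.write_text("M7.1 scenario")                    # stand-in input

    waves = run_step("wave_propagation",
                     lambda r: f"seismograms({r})", [rupture])
    hazard = run_step("hazard_curve",
                      lambda w: f"exceedance({w})", [waves])
    print(hazard.read_text())
    ```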

  18. Networking Technologies Enable Advances in Earth Science

    NASA Technical Reports Server (NTRS)

    Johnson, Marjory; Freeman, Kenneth; Gilstrap, Raymond; Beck, Richard

    2004-01-01

    This paper describes an experiment to prototype a new way of conducting science by applying networking and distributed computing technologies to an Earth Science application. A combination of satellite, wireless, and terrestrial networking provided geologists at a remote field site with interactive access to supercomputer facilities at two NASA centers, thus enabling them to validate and calibrate remotely sensed geological data in near-real time. This represents a fundamental shift in the way that Earth scientists analyze remotely sensed data. In this paper we describe the experiment and the network infrastructure that enabled it, analyze the data flow during the experiment, and discuss the scientific impact of the results.

  19. Cumulative index to NASA Tech Briefs, 1986-1990, volumes 10-14

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Tech Briefs are short announcements of new technology derived from the R&D activities of the National Aeronautics and Space Administration. These briefs emphasize information considered likely to be transferable across industrial, regional, or disciplinary lines and are issued to encourage commercial application. This cumulative index of Tech Briefs contains abstracts and four indexes (subject, personal author, originating center, and Tech Brief number) and covers the period 1986 to 1990. The abstract section is organized by the following subject categories: electronic components and circuits, electronic systems, physical sciences, materials, computer programs, life sciences, mechanics, machinery, fabrication technology, and mathematics and information sciences.

  20. Algorithmic trends in computational fluid dynamics; The Institute for Computer Applications in Science and Engineering (ICASE)/LaRC Workshop, NASA Langley Research Center, Hampton, VA, US, Sep. 15-17, 1991

    NASA Technical Reports Server (NTRS)

    Hussaini, M. Y. (Editor); Kumar, A. (Editor); Salas, M. D. (Editor)

    1993-01-01

    The purpose here is to assess the state of the art in the areas of numerical analysis that are particularly relevant to computational fluid dynamics (CFD), to identify promising new developments in various areas of numerical analysis that will impact CFD, and to establish a long-term perspective focusing on opportunities and needs. Overviews are given of discretization schemes, computational fluid dynamics, algorithmic trends in CFD for aerospace flow field calculations, simulation of compressible viscous flow, and massively parallel computation. Also discussed are acceleration methods, spectral and high-order methods, multi-resolution and subcell resolution schemes, and inherently multidimensional schemes.

  1. Integrating Grid Services into the Cray XT4 Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NERSC; Cholia, Shreyas; Lin, Hwa-Chun Wendy

    2009-05-01

    The 38,640-core Cray XT4 "Franklin" system at the National Energy Research Scientific Computing Center (NERSC) is a massively parallel resource available to Department of Energy researchers that also provides on-demand grid computing to the Open Science Grid. The integration of grid services on Franklin presented various challenges, including fundamental differences between the interactive and compute nodes, a stripped-down compute-node operating system without dynamic library support, a shared-root environment, and idiosyncratic application launching. In our work, we describe how we resolved these challenges on a running, general-purpose production system to provide on-demand compute, storage, accounting, and monitoring services through generic grid interfaces that mask the underlying system-specific details for the end user.

  2. Next Generation Workload Management System For Big Data on Heterogeneous Distributed Computing

    DOE PAGES

    Klimentov, A.; Buncic, P.; De, K.; ...

    2015-05-22

    The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe, and were recently credited for the discovery of a Higgs boson. ATLAS and ALICE are the largest collaborations ever assembled in the sciences and are at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, both experiments rely on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses the PanDA (Production and Data Analysis) Workload Management System (WMS) for managing the workflow for all data processing on hundreds of data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. The scale is demonstrated by the following numbers: PanDA manages O(10^2) sites, O(10^5) cores, O(10^8) jobs per year, and O(10^3) users, and the ATLAS data volume is O(10^17) bytes. In 2013 we started an ambitious program to expand PanDA to all available computing resources, including opportunistic use of commercial and academic clouds and Leadership Computing Facilities (LCF). The project titled 'Next Generation Workload Management and Analysis System for Big Data' (BigPanDA) is funded by DOE ASCR and HEP. Extending PanDA to clouds and LCF presents new challenges in managing heterogeneity and supporting workflow. The BigPanDA project is underway to set up and tailor PanDA at the Oak Ridge Leadership Computing Facility (OLCF) and at the National Research Center "Kurchatov Institute" together with ALICE distributed computing and ORNL computing professionals. Our approach to integration of HPC platforms at the OLCF and elsewhere is to reuse, as much as possible, existing components of the PanDA system. Finally, we will present our current accomplishments with running the PanDA WMS at OLCF and other supercomputers and demonstrate our ability to use PanDA as a portal independent of the computing facilities infrastructure for High Energy and Nuclear Physics as well as other data-intensive science applications.

  3. Next Generation Workload Management System For Big Data on Heterogeneous Distributed Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klimentov, A.; Buncic, P.; De, K.

    The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe, and were recently credited for the discovery of a Higgs boson. ATLAS and ALICE are the largest collaborations ever assembled in the sciences and are at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, both experiments rely on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses the PanDA (Production and Data Analysis) Workload Management System (WMS) for managing the workflow for all data processing on hundreds of data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. The scale is demonstrated by the following numbers: PanDA manages O(10^2) sites, O(10^5) cores, O(10^8) jobs per year, and O(10^3) users, and the ATLAS data volume is O(10^17) bytes. In 2013 we started an ambitious program to expand PanDA to all available computing resources, including opportunistic use of commercial and academic clouds and Leadership Computing Facilities (LCF). The project titled 'Next Generation Workload Management and Analysis System for Big Data' (BigPanDA) is funded by DOE ASCR and HEP. Extending PanDA to clouds and LCF presents new challenges in managing heterogeneity and supporting workflow. The BigPanDA project is underway to set up and tailor PanDA at the Oak Ridge Leadership Computing Facility (OLCF) and at the National Research Center "Kurchatov Institute" together with ALICE distributed computing and ORNL computing professionals. Our approach to integration of HPC platforms at the OLCF and elsewhere is to reuse, as much as possible, existing components of the PanDA system. Finally, we will present our current accomplishments with running the PanDA WMS at OLCF and other supercomputers and demonstrate our ability to use PanDA as a portal independent of the computing facilities infrastructure for High Energy and Nuclear Physics as well as other data-intensive science applications.

  4. NASA and USGS invest in invasive species modeling to evaluate habitat for Africanized Honey Bees

    USGS Publications Warehouse

    2009-01-01

    Invasive non-native species, such as plants, animals, and pathogens, have long been of interest to the U.S. Geological Survey (USGS) and NASA. Invasive species cause harm to our economy (around $120 B/year), the environment (e.g., replacing native biodiversity, forest pathogens negatively affecting carbon storage), and human health (e.g., plague, West Nile virus). Five years ago, the USGS and NASA formed a partnership to improve ecological forecasting capabilities for the early detection and containment of the highest-priority invasive species. Scientists from NASA Goddard Space Flight Center (GSFC) and the Fort Collins Science Center developed a long-term strategy to integrate remote sensing capabilities, high-performance computing capabilities, and new spatial modeling techniques to advance the science of ecological invasions [Schnase et al., 2002].

  5. Microgravity

    NASA Image and Video Library

    2001-06-05

    This computer-generated image depicts the Materials Science Research Rack-1 (MSRR-1) being developed by NASA's Marshall Space Flight Center and the European Space Agency (ESA) for placement in the Destiny laboratory module aboard the International Space Station. The rack is part of the planned Materials Science Research Facility (MSRF) and is expected to include two furnace module inserts, a Quench Module Insert (being developed by NASA's Marshall Space Flight Center) to study directional solidification in rapidly cooled alloys and a Diffusion Module Insert (being developed by the European Space Agency) to study crystal growth, as well as a transparent furnace (being developed by NASA's Space Product Development program). Multi-user equipment in the rack is being developed under the auspices of NASA's Office of Biological and Physical Research (OBPR) and ESA. Key elements are labeled in other images (0101754, 0101830, and TBD).

  6. Microgravity

    NASA Image and Video Library

    2001-06-05

    This computer-generated image depicts the Materials Science Research Rack-1 (MSRR-1) being developed by NASA's Marshall Space Flight Center and the European Space Agency (ESA) for placement in the Destiny laboratory module aboard the International Space Station. The rack is part of the planned Materials Science Research Facility (MSRF) and is expected to include two furnace module inserts, a Quench Module Insert (being developed by NASA's Marshall Space Flight Center) to study directional solidification in rapidly cooled alloys and a Diffusion Module Insert (being developed by the European Space Agency) to study crystal growth, as well as a transparent furnace (being developed by NASA's Space Product Development program). Multi-user equipment in the rack is being developed under the auspices of NASA's Office of Biological and Physical Research (OBPR) and ESA. Key elements are labeled in other images (0101754, 0101829, 0101830).

  7. Microgravity

    NASA Image and Video Library

    2001-06-05

    This computer-generated image depicts the Materials Science Research Rack-1 (MSRR-1) being developed by NASA's Marshall Space Flight Center and the European Space Agency (ESA) for placement in the Destiny laboratory module aboard the International Space Station. The rack is part of the planned Materials Science Research Facility (MSRF) and is expected to include two furnace module inserts, a Quench Module Insert (being developed by NASA's Marshall Space Flight Center) to study directional solidification in rapidly cooled alloys and a Diffusion Module Insert (being developed by the European Space Agency) to study crystal growth, as well as a transparent furnace (being developed by NASA's Space Product Development program). Multi-user equipment in the rack is being developed under the auspices of NASA's Office of Biological and Physical Research (OBPR) and ESA. A larger image is available without labels (No. 0101755).

  8. 1996 NASA-ASEE-Stanford Summer Faculty Fellowship Program. Part 1

    NASA Technical Reports Server (NTRS)

    1996-01-01

    As is customary, the final technical report for the NASA-ASEE Summer Faculty Fellowship Program at the Ames Research Center, Dryden Flight Research Center and Stanford University essentially consists of a compilation of the summary technical reports of all the fellows. More extended versions done either as NASA publications, archival papers, or other laboratory reports are not included here. The reader will note that the areas receiving emphasis were the life sciences, astronomy, remote sensing, aeronautics, fluid dynamics/aerophysics, and computer science. Of course, the areas of emphasis vary somewhat from year to year depending on the interests of the most qualified applicants. Once again, the work is of especially high quality. The reports of the first and second year fellows are grouped separately and are arranged alphabetically within each group.

  9. Solar-Terrestrial and Astronomical Research Network (STAR-Network) - A Meaningful Practice of New Cyberinfrastructure on Space Science

    NASA Astrophysics Data System (ADS)

    Hu, X.; Zou, Z.

    2017-12-01

    For the next decades, a comprehensive big data application environment is the dominant direction of cyberinfrastructure development in space science. To make the concept of such a BIG cyberinfrastructure (e.g. Digital Space) a reality, several capabilities should be focused on and integrated, including the science data system, the digital space engine, big data applications (tools and models), and the IT infrastructure. In the past few years, the CAS Chinese Space Science Data Center (CSSDC) has made a helpful attempt in this direction. A cloud-enabled virtual research platform on space science, called the Solar-Terrestrial and Astronomical Research Network (STAR-Network), has been developed to serve the full lifecycle of space science missions and research activities. It integrates a wide range of disciplinary and interdisciplinary resources to provide science-problem-oriented data retrieval and query services, collaborative mission demonstration services, mission operation support services, space weather computing and analysis services, and other self-help services. This platform is supported by persistent infrastructure, including cloud storage, cloud computing, and supercomputing. Different varieties of resources are interconnected: the science data can be displayed in the browser by visualization tools, the data analysis tools and physical models can be driven by the applicable science data, and the computing results can be saved in the cloud, for example. So far, STAR-Network has served a series of space science missions in China, involving the Strategic Pioneer Program on Space Science (this program has launched space science satellites such as DAMPE, HXMT, and QUESS, with more satellites to be launched around 2020) and the Meridian Space Weather Monitor Project. Scientists have obtained new findings by using the science data from these missions with STAR-Network's contribution. We are confident that STAR-Network is an exciting practice of new cyberinfrastructure architecture for space science.

  10. Computational Science: A Research Methodology for the 21st Century

    NASA Astrophysics Data System (ADS)

    Orbach, Raymond L.

    2004-03-01

    Computational simulation, a means of scientific discovery that employs computer systems to simulate a physical system according to laws derived from theory and experiment, has attained peer status with theory and experiment. Important advances in basic science are accomplished by a new "sociology" for ultrascale scientific computing capability (USSCC), a fusion of sustained advances in scientific models, mathematical algorithms, computer architecture, and scientific software engineering. Expansion of current capabilities by factors of 100-1000 opens up new vistas for scientific discovery: long-term climatic variability and change, macroscopic material design from correlated behavior at the nanoscale, design and optimization of magnetic confinement fusion reactors, strong interactions on a computational lattice through quantum chromodynamics, and stellar explosions and element production. The "virtual prototype," made possible by this expansion, can markedly reduce time-to-market for industrial applications such as jet engines and safer, more fuel-efficient, cleaner cars. In order to develop USSCC, the National Energy Research Scientific Computing Center (NERSC) announced the competition "Innovative and Novel Computational Impact on Theory and Experiment" (INCITE), with no requirement for current DOE sponsorship. Fifty-nine proposals for grand challenge scientific problems were submitted for a small number of awards. The successful grants, and their preliminary progress, will be described.

  11. The Laboratory for Terrestrial Physics

    NASA Technical Reports Server (NTRS)

    2003-01-01

    The Laboratory for Terrestrial Physics is dedicated to the advancement of knowledge in Earth and planetary science by conducting innovative research using space technology. The Laboratory's mission and activities support the work and new initiatives at NASA's Goddard Space Flight Center (GSFC). The Laboratory's success contributes to the Earth Science Directorate as a national resource for studies of Earth from space. The Laboratory is part of the Earth Science Directorate based at GSFC in Greenbelt, MD. The Directorate itself comprises the Global Change Data Center (GCDC), the Space Data and Computing Division (SDCD), and four science laboratories, including the Laboratory for Terrestrial Physics, the Laboratory for Atmospheres, and the Laboratory for Hydrospheric Processes, all in Greenbelt, MD. The fourth research organization, the Goddard Institute for Space Studies (GISS), is in New York, NY. Relevant to NASA's Strategic Plan, the Laboratory ensures that all work undertaken and completed is within the vision of GSFC. The philosophy of the Laboratory is to balance the completion of near-term goals with building on the Laboratory's achievements as a foundation for the scientific challenges of the future.

  12. Computational Toxicology as Implemented by the US EPA ...

    EPA Pesticide Factsheets

    Computational toxicology is the application of mathematical and computer models to help assess chemical hazards and risks to human health and the environment. Supported by advances in informatics, high-throughput screening (HTS) technologies, and systems biology, the U.S. Environmental Protection Agency (EPA) is developing robust and flexible computational tools that can be applied to the thousands of chemicals in commerce and contaminant mixtures found in air, water, and hazardous-waste sites. The Office of Research and Development (ORD) Computational Toxicology Research Program (CTRP) is composed of three main elements. The largest component is the National Center for Computational Toxicology (NCCT), which was established in 2005 to coordinate research on chemical screening and prioritization, informatics, and systems modeling. The second element consists of related activities in the National Health and Environmental Effects Research Laboratory (NHEERL) and the National Exposure Research Laboratory (NERL). The third and final component consists of academic centers working on various aspects of computational toxicology and funded by the U.S. EPA Science to Achieve Results (STAR) program. Together these elements form the key components in the implementation of both the initial strategy, A Framework for a Computational Toxicology Research Program (U.S. EPA, 2003), and the newly released U.S. Environmental Protection Agency's Strategic Plan for Evaluating the Toxicity of Chemicals.

  13. Community Coordinated Modeling Center Support of Science Needs for Integrated Data Environment

    NASA Technical Reports Server (NTRS)

    Kuznetsova, M. M.; Hesse, M.; Rastatter, L.; Maddox, M.

    2007-01-01

    Space science models are an essential component of an integrated data environment. They are indispensable tools for making effective use of a wide variety of distributed scientific sources and for placing multi-point local measurements into global context. The Community Coordinated Modeling Center (CCMC) hosts a set of state-of-the-art space science models ranging from the solar atmosphere to the Earth's upper atmosphere. The majority of models residing at the CCMC are comprehensive, computationally intensive, physics-based models. To allow the models to be driven by data relevant to particular events, the CCMC developed an online data file generation tool that automatically downloads data from data providers and transforms them into the required format. The CCMC provides a tailored web-based visualization interface for the model output, as well as the capability to download simulation output in a portable standard format with comprehensive metadata, together with a user-friendly library of model-output analysis routines that can be called from any C-supporting language. The CCMC is also developing data interpolation tools that make it possible to present model output in the same format as observations. The CCMC invites community comments and suggestions to better address science needs for the integrated data environment.

  14. Final Report for DOE Award ER25756

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kesselman, Carl

    2014-11-17

    The SciDAC-funded Center for Enabling Distributed Petascale Science (CEDPS) was established to address technical challenges that arise due to the frequent geographic distribution of data producers (in particular, supercomputers and scientific instruments) and data consumers (people and computers) within the DOE laboratory system. Its goal is to produce technical innovations that meet DOE end-user needs for (a) rapid and dependable placement of large quantities of data within a distributed high-performance environment, and (b) the convenient construction of scalable science services that provide for the reliable and high-performance processing of computation and data analysis requests from many remote clients. The Center is also addressing (c) the important problem of troubleshooting these and other related ultra-high-performance distributed activities from the perspective of both performance and functionality.

  15. Assessment of Adaptive PBL's Impact on HOT Development of Computer Science Students

    ERIC Educational Resources Information Center

    Raiyn, Jamal; Tilchin, Oleg

    2015-01-01

    Meaningful learning based on PBL is a new learning strategy. Compared with the traditional strategy, meaningful learning puts the student at the center of the learning process, and the student's role is correspondingly expanded. The Problem-Based Learning (PBL) model is considered the most productive way to…

  16. Technical Assessment: Integrated Photonics

    DTIC Science & Technology

    2015-10-01

    in global internet protocol traffic as a function of time by local access technology. Photonics continues to play a critical role in enabling this...communication networks. This has enabled services like the internet, high performance computing, and power-efficient large-scale data centers. The...signal processing, quantum information science, and optics for free space applications. However, major obstacles challenge the implementation of

  17. Shaping Software Engineering Curricula Using Open Source Communities: A Case Study

    ERIC Educational Resources Information Center

    Bowring, James; Burke, Quinn

    2016-01-01

    This paper documents four years of a novel approach to teaching a two-course sequence in software engineering as part of the ABET-accredited computer science curriculum at the College of Charleston. This approach is team-based and centers on learning software engineering in the context of open source software projects. In the first course, teams…

  18. Enhancing Project-Based Learning in Software Engineering Lab Teaching through an E-Portfolio Approach

    ERIC Educational Resources Information Center

    Macias, J. A.

    2012-01-01

    Project-based learning is one of the main successful student-centered pedagogies broadly used in computing science courses. However, this approach can be insufficient when dealing with practical subjects that implicitly require many deliverables and a great deal of feedback and organizational resources. In this paper, a worked e-portfolio is…

  19. Campus Community Partnerships with People Who Are Deaf or Hard-of-Hearing

    ERIC Educational Resources Information Center

    Matteson, Jamie; Kha, Christine K.; Hu, Diane J.; Cheng, Chih-Chieh; Saul, Lawrence; Sadler, Georgia Robins

    2008-01-01

    In 1997, the Moores University of California, San Diego (UCSD) Cancer Center and advocacy groups for people who are deaf and hard of hearing launched a highly successful cancer control collaborative. In 2006, faculty from the Computer Science Department at UCSD invited the collaborative to help develop a new track in their doctoral…

  20. Global Collective Resources: A Study of Monographic Bibliographic Records in WorldCat.

    ERIC Educational Resources Information Center

    Perrault, Anna H.

    In 2001, WorldCat, the primary international bibliographic utility, contained 45 million records with over 750 million library location listings. These records span over 4,000 years of recorded knowledge in 377 languages. Under the auspices of an OCLC/ALISE (Online Computer Library Center/Association of Library and Information Science Educators)…

  1. Tracking the NGS revolution: managing life science research on shared high-performance computing clusters.

    PubMed

    Dahlö, Martin; Scofield, Douglas G; Schaal, Wesley; Spjuth, Ola

    2018-05-01

    Next-generation sequencing (NGS) has transformed the life sciences, and many research groups are newly dependent upon computer clusters to store and analyze large datasets. This creates challenges for e-infrastructures accustomed to hosting computationally mature research in other sciences. Using data gathered from our own clusters at UPPMAX computing center at Uppsala University, Sweden, where core hour usage of ∼800 NGS and ∼200 non-NGS projects is now similar, we compare and contrast the growth, administrative burden, and cluster usage of NGS projects with projects from other sciences. The number of NGS projects has grown rapidly since 2010, with growth driven by entry of new research groups. Storage used by NGS projects has grown more rapidly since 2013 and is now limited by disk capacity. NGS users submit nearly twice as many support tickets per user, and 11 more tools are installed each month for NGS projects than for non-NGS projects. We developed usage and efficiency metrics and show that computing jobs for NGS projects use more RAM than non-NGS projects, are more variable in core usage, and rarely span multiple nodes. NGS jobs use booked resources less efficiently for a variety of reasons. Active monitoring can improve this somewhat. Hosting NGS projects imposes a large administrative burden at UPPMAX due to large numbers of inexperienced users and diverse and rapidly evolving research areas. We provide a set of recommendations for e-infrastructures that host NGS research projects. We provide anonymized versions of our storage, job, and efficiency databases.
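
    The paper's usage and efficiency metrics are described only qualitatively in this abstract. As a hedged illustration of the general idea, the sketch below computes one plausible efficiency measure, the fraction of booked core-hours a job actually used; the definition and the numbers are assumptions for illustration, not the authors' published metrics.

        # Illustrative job-efficiency metric in the spirit of the UPPMAX study:
        # CPU time actually used divided by the core-hours booked. This is an
        # assumed definition, not the metric from the paper.
        def core_hour_efficiency(cpu_seconds_used, cores_booked, wall_seconds):
            booked_core_seconds = cores_booked * wall_seconds
            return cpu_seconds_used / booked_core_seconds if booked_core_seconds else 0.0

        # A single-threaded NGS job that booked a 16-core node for one hour
        # uses only 1/16 of the booked capacity:
        print(core_hour_efficiency(cpu_seconds_used=3600,
                                   cores_booked=16,
                                   wall_seconds=3600))   # -> 0.0625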

  2. Dehydration of 1-octadecanol over H-BEA: A combined experimental and computational study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Song, Wenji; Liu, Yuanshuai; Barath, Eszter

    Liquid phase dehydration of 1-octadecanol, which is intermediately formed during the hydrodeoxygenation of microalgae oil, has been explored in a combined experimental and computational study. The alkyl chain of the C18 alcohol interacts with acid sites during diffusion inside the zeolite pores, resulting in an inefficient utilization of the Brønsted acid sites for samples with high acid site concentrations. The parallel intra- and intermolecular dehydration pathways, having different activation energies, pass through alternative reaction intermediates. Formation of surface-bound alkoxide species is the rate-limiting step during intramolecular dehydration, whereas intermolecular dehydration proceeds via a bulky dimer intermediate. Octadecene is the primary dehydration product over H-BEA at 533 K. Despite the main contribution of Brønsted acid sites towards both dehydration pathways, Lewis acid sites are also active in the formation of dioctadecyl ether. The intramolecular dehydration to octadecene and cleavage of the intermediately formed ether, however, require strong BAS. L. Wang, D. Mei and J. A. Lercher acknowledge the partial support from the US Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences & Biosciences. Pacific Northwest National Laboratory (PNNL) is a multiprogram national laboratory operated for DOE by Battelle. Computing time was granted by the grand challenge of computational catalysis of the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) and by the National Energy Research Scientific Computing Center (NERSC). EMSL is a national scientific user facility located at Pacific Northwest National Laboratory (PNNL) and sponsored by DOE's Office of Biological and Environmental Research.

  3. Tracking the NGS revolution: managing life science research on shared high-performance computing clusters

    PubMed Central

    2018-01-01

    Background: Next-generation sequencing (NGS) has transformed the life sciences, and many research groups are newly dependent upon computer clusters to store and analyze large datasets. This creates challenges for e-infrastructures accustomed to hosting computationally mature research in other sciences. Using data gathered from our own clusters at UPPMAX computing center at Uppsala University, Sweden, where core hour usage of ∼800 NGS and ∼200 non-NGS projects is now similar, we compare and contrast the growth, administrative burden, and cluster usage of NGS projects with projects from other sciences. Results: The number of NGS projects has grown rapidly since 2010, with growth driven by entry of new research groups. Storage used by NGS projects has grown more rapidly since 2013 and is now limited by disk capacity. NGS users submit nearly twice as many support tickets per user, and 11 more tools are installed each month for NGS projects than for non-NGS projects. We developed usage and efficiency metrics and show that computing jobs for NGS projects use more RAM than non-NGS projects, are more variable in core usage, and rarely span multiple nodes. NGS jobs use booked resources less efficiently for a variety of reasons. Active monitoring can improve this somewhat. Conclusions: Hosting NGS projects imposes a large administrative burden at UPPMAX due to large numbers of inexperienced users and diverse and rapidly evolving research areas. We provide a set of recommendations for e-infrastructures that host NGS research projects. We provide anonymized versions of our storage, job, and efficiency databases. PMID:29659792

  4. Geocoded data structures and their applications to Earth science investigations

    NASA Technical Reports Server (NTRS)

    Goldberg, M.

    1984-01-01

    A geocoded data structure is a means for digitally representing a geographically referenced map or image. The characteristics of representative cellular, linked, and hybrid geocoded data structures are reviewed. The data processing requirements of Earth science projects at the Goddard Space Flight Center and the basic tools of geographic data processing are described. Specific ways that new geocoded data structures can be used to adapt these tools to scientists' needs are presented. These include: expanding analysis and modeling capabilities; simplifying the merging of data sets from diverse sources; and saving computer storage space.
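
    As a concrete illustration of the cellular category the paper reviews, the sketch below stores one attribute per cell of a regular latitude/longitude grid and maps coordinates to cell indices. The class, the resolution, and the field names are hypothetical assumptions for illustration, not structures taken from the report.

        # Minimal cellular (raster) geocoded data structure: a regular lat/lon
        # grid whose cells each hold one attribute value. Illustrative only.
        class CellularGrid:
            def __init__(self, lat0, lon0, cell_deg, nrows, ncols, fill=None):
                self.lat0, self.lon0, self.cell_deg = lat0, lon0, cell_deg
                self.cells = [[fill] * ncols for _ in range(nrows)]

            def index(self, lat, lon):
                """Map a geographic coordinate to (row, col) cell indices."""
                row = int((lat - self.lat0) / self.cell_deg)
                col = int((lon - self.lon0) / self.cell_deg)
                return row, col

            def set(self, lat, lon, value):
                r, c = self.index(lat, lon)
                self.cells[r][c] = value

            def get(self, lat, lon):
                r, c = self.index(lat, lon)
                return self.cells[r][c]

        # A 1-degree grid covering 10 x 10 degrees starting at 30 N, 90 W:
        grid = CellularGrid(lat0=30.0, lon0=-90.0, cell_deg=1.0, nrows=10, ncols=10)
        grid.set(34.5, -86.6, "urban")   # e.g., a land-cover class
        print(grid.get(34.5, -86.6))     # -> "urban"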

  5. Venus - Global View Centered at 180 degrees

    NASA Image and Video Library

    1996-11-26

    This global view of the surface of Venus is centered at 180 degrees east longitude. Magellan synthetic aperture radar mosaics from the first cycle of Magellan mapping, and a 5 degree latitude-longitude grid, are mapped onto a computer-simulated globe to create this image. Data gaps are filled with Pioneer-Venus Orbiter data, or a constant mid-range value. The image was produced by the Solar System Visualization project and the Magellan Science team at the JPL Multimission Image Processing Laboratory. http://photojournal.jpl.nasa.gov/catalog/PIA00478

  6. S'COOL Provides Research Opportunities and Current Data for Today's Technological Classroom

    NASA Technical Reports Server (NTRS)

    Green, Carolyn J.; Chambers, Lin H.; Racel, Anne M.

    1999-01-01

    NASA's Students' Cloud Observations On-Line (S'COOL) project, a hands-on educational project, was an innovative idea conceived by the scientists in the Radiation Sciences Branch at NASA Langley Research Center, Hampton, Virginia, in 1996. It came about after a local teacher expressed the idea that she wanted her students to be involved in real-life science. S'COOL supports NASA's Clouds and the Earth's Radiant Energy System (CERES) instrument, which was launched on the Tropical Rainfall Measuring Mission (TRMM) in November 1997 as part of NASA's Earth Science Enterprise. With the S'COOL project, students observe clouds and related weather conditions, compute data, and note vital information while obtaining ground truth observations for the CERES instrument. The observations can then be used to help validate the CERES measurements, particularly detection of clear sky from space. In addition to meeting math, science, and geography standards, students are engaged in using the computer to obtain, report, and analyze current data, thus bringing modern technology into the realm of the classroom, a paradigm that demands our attention.

  7. Publisher Correction: Western US volcanism due to intruding oceanic mantle driven by ancient Farallon slabs

    NASA Astrophysics Data System (ADS)

    Zhou, Quan; Liu, Lijun; Hu, Jiashun

    2018-05-01

    In the version of this Article originally published, data points representing mafic eruptions were missing from Fig. 4b; the corrected version is shown below. Furthermore, the authors omitted to include the following acknowledgements to the provider of the computational resources: "This research is part of the Blue Waters sustained-petascale computing project, which is supported by the National Science Foundation (awards OCI-0725070 and ACI-1238993) and the state of Illinois. Blue Waters is a joint effort of the University of Illinois at Urbana-Champaign and its National Center for Supercomputing Applications. This work is also part of the 'PRAC Title 4-D Geodynamic Modeling With Data Assimilation: Origin Of Intra-Plate Volcanism In The Pacific Northwest' PRAC allocation support by the National Science Foundation (award number ACI 1516586). This work also used the Extreme Science and Engineering Discovery Environment (XSEDE), which is supported by National Science Foundation grant number ACI-1548562." Figure 4 and the Acknowledgements section have been updated in the online version of the Article.

  8. Integration of Panda Workload Management System with supercomputers

    NASA Astrophysics Data System (ADS)

    De, K.; Jha, S.; Klimentov, A.; Maeno, T.; Mashinistov, R.; Nilsson, P.; Novikov, A.; Oleynik, D.; Panitkin, S.; Poyda, A.; Read, K. F.; Ryabinkin, E.; Teslyuk, A.; Velikhov, V.; Wells, J. C.; Wenaus, T.

    2016-09-01

    The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe, and were recently credited for the discovery of a Higgs boson. ATLAS, one of the largest collaborations ever assembled in the sciences, is at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, the ATLAS experiment is relying on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses the PanDA (Production and Data Analysis) Workload Management System for managing the workflow for all data processing on over 140 data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. While PanDA currently uses more than 250,000 cores with a peak performance of 0.3+ petaFLOPS, the next LHC data-taking runs will require more resources than Grid computing can possibly provide. To alleviate these challenges, LHC experiments are engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We will describe a project aimed at integration of the PanDA WMS with supercomputers in the United States, Europe, and Russia (in particular with the Titan supercomputer at the Oak Ridge Leadership Computing Facility (OLCF), the supercomputer at the National Research Center "Kurchatov Institute", IT4 in Ostrava, and others). The current approach utilizes a modified PanDA pilot framework for job submission to the supercomputers' batch queues and local data management, with light-weight MPI wrappers to run single-threaded workloads in parallel on Titan's multi-core worker nodes. This implementation was tested with a variety of Monte-Carlo workloads on several supercomputing platforms. We will present our current accomplishments in running the PanDA WMS at supercomputers and demonstrate our ability to use PanDA as a portal independent of the computing facility's infrastructure for High Energy and Nuclear Physics, as well as other data-intensive science applications, such as bioinformatics and astro-particle physics.
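
    The "light-weight MPI wrappers" mentioned above can be pictured with the following minimal mpi4py sketch, in which each MPI rank launches one independent single-threaded payload so a single batch job fans serial tasks across a node's cores. The payload command and input naming are hypothetical; this shows the general pattern, not the actual PanDA pilot code.

        # Each MPI rank runs one serial payload; a single batch submission thus
        # occupies many cores with independent single-threaded workloads.
        # "./simulate_events" and the input naming are hypothetical.
        import subprocess
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank = comm.Get_rank()

        # Each rank picks its own input, e.g. one Monte-Carlo subsample per core.
        input_file = f"events_{rank:04d}.in"
        result = subprocess.run(["./simulate_events", input_file],
                                capture_output=True, text=True)

        # Gather exit codes on rank 0 so the wrapper can report overall success.
        codes = comm.gather(result.returncode, root=0)
        if rank == 0:
            print("failed payloads:", sum(c != 0 for c in codes))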

  9. Binary Black Hole Mergers, Gravitational Waves, and LISA

    NASA Astrophysics Data System (ADS)

    Centrella, Joan; Baker, J.; Boggs, W.; Kelly, B.; McWilliams, S.; van Meter, J.

    2007-12-01

    The final merger of comparable-mass binary black holes is expected to be the strongest source of gravitational waves for LISA. Since these mergers take place in regions of extreme gravity, we need to solve Einstein's equations of general relativity on a computer in order to calculate these waveforms. For more than 30 years, scientists have tried to compute black hole mergers using the methods of numerical relativity. The resulting computer codes have been plagued by instabilities, causing them to crash well before the black holes in the binary could complete even a single orbit. Within the past few years, however, this situation has changed dramatically, with a series of remarkable breakthroughs. We will present the results of new simulations of black hole mergers with unequal masses and spins, focusing on the gravitational waves emitted and the accompanying astrophysical "kicks." The magnitude of these kicks has bearing on the production and growth of supermassive black holes during the epoch of structure formation, and on the retention of black holes in stellar clusters. This work was supported by NASA grant 06-BEFS06-19, and the simulations were carried out using Project Columbia at the NASA Advanced Supercomputing Division (Ames Research Center) and at the NASA Center for Computational Sciences (Goddard Space Flight Center).

  10. PACES Participation in Educational Outreach Programs at the University of Texas at El Paso

    NASA Technical Reports Server (NTRS)

    Dodge, Rebecca L.

    1997-01-01

    The University of Texas at El Paso (UTEP) is involved in several initiatives to improve science education within the El Paso area public schools. These include outreach efforts into the K-12 classrooms; training programs for in-service teachers; and the introduction of a strong science core curriculum within the College of Education. The Pan American Center for Earth and Environmental Studies (PACES), a NASA-funded University Research Center, will leverage the goals of these existing initiatives to provide curriculum support materials at all levels. We will use currently available Mission to Planet Earth (MTPE) materials as well as new materials developed specifically for this region, in an effort to introduce the Earth System Science perspective into these programs. In addition, we are developing curriculum support materials and classes within the Geology and Computer Departments, to provide education in the area of remote sensing and GIS applications at the undergraduate and graduate levels.

  11. How Data Becomes Physics: Inside the RACF

    ScienceCinema

    Ernst, Michael; Rind, Ofer; Rajagopalan, Srini; Lauret, Jerome; Pinkenburg, Chris

    2018-06-22

    The RHIC & ATLAS Computing Facility (RACF) at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory sits at the center of a global computing network. It connects more than 2,500 researchers around the world with the data generated by millions of particle collisions taking place each second at Brookhaven Lab's Relativistic Heavy Ion Collider (RHIC, a DOE Office of Science User Facility for nuclear physics research), and the ATLAS experiment at the Large Hadron Collider in Europe. Watch this video to learn how the people and computing resources of the RACF serve these scientists to turn petabytes of raw data into physics discoveries.

  12. A Bimetallic Nickel–Gallium Complex Catalyzes CO2 Hydrogenation via the Intermediacy of an Anionic d10 Nickel Hydride

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cammarota, Ryan C.; Vollmer, Matthew V.; Xie, Jing

    Large-scale CO2 hydrogenation could offer a renewable stream of industrially important C1 chemicals while reducing CO2 emissions. Critical to this opportunity is the requirement for inexpensive catalysts based on earth-abundant metals instead of precious metals. We report a nickel-gallium complex featuring a Ni(0)→Ga(III) bond that shows remarkable catalytic activity for hydrogenating CO2 to formate at ambient temperature (3150 turnovers, turnover frequency = 9700 h^-1), compared with prior homogeneous Ni-centred catalysts. The Lewis acidic Ga(III) ion plays a pivotal role by stabilizing reactive catalytic intermediates, including a rare anionic d10 Ni hydride. The structure of this reactive intermediate shows a terminal Ni-H, for which the hydride donor strength rivals those of precious-metal hydrides. Collectively, our experimental and computational results demonstrate that modulating a transition metal center via a direct interaction with a Lewis acidic support can be a powerful strategy for promoting new reactivity paradigms in base-metal catalysis. The work was supported as part of the Inorganometallic Catalysis Design Center, an Energy Frontier Research Center funded by the U.S. Department of Energy (DOE), Office of Science, Basic Energy Sciences under Award DE-SC0012702. R.C.C. and M.V.V. were supported by the DOE Office of Science Graduate Student Research and National Science Foundation Graduate Research Fellowship programs, respectively. J.C.L., S.A.B., and A.M.A. were supported by the U.S. Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences & Biosciences. Pacific Northwest National Laboratory is operated by Battelle for the U.S. Department of Energy.

  13. Theory, Modeling, Software and Hardware Development for Analytical and Computational Materials Science

    NASA Technical Reports Server (NTRS)

    Young, Gerald W.; Clemons, Curtis B.

    2004-01-01

    The focus of this Cooperative Agreement between the Computational Materials Laboratory (CML) of the Processing Science and Technology Branch of the NASA Glenn Research Center (GRC) and the Department of Theoretical and Applied Mathematics at The University of Akron was in the areas of system development of the CML workstation environment, modeling of microgravity and earth-based material processing systems, and joint activities in laboratory projects. These efforts complement each other, as the majority of the modeling work involves numerical computations to support laboratory investigations. Coordination and interaction between the modelers, system analysts, and laboratory personnel are essential toward providing the most effective simulations and communication of the simulation results. To these ends, The University of Akron personnel involved in the agreement worked at the Applied Mathematics Research Laboratory (AMRL) in the Department of Theoretical and Applied Mathematics while maintaining a close relationship with the personnel of the Computational Materials Laboratory at GRC. Network communication between both sites has been established. A summary of the projects we undertook during the time period 9/1/03 - 6/30/04 is included.

  14. Spacecube: A Family of Reconfigurable Hybrid On-Board Science Data Processors

    NASA Technical Reports Server (NTRS)

    Flatley, Thomas P.

    2015-01-01

    SpaceCube is a family of Field Programmable Gate Array (FPGA) based on-board science data processing systems developed at the NASA Goddard Space Flight Center (GSFC). The goal of the SpaceCube program is to provide 10x to 100x improvements in on-board computing power while lowering relative power consumption and cost. SpaceCube is based on the Xilinx Virtex family of FPGAs, which include processor, FPGA logic, and digital signal processing (DSP) resources. These processing elements are leveraged to produce a hybrid science data processing platform that accelerates the execution of algorithms by distributing computational functions to the most suitable elements. This approach enables the implementation of complex on-board functions that were previously limited to ground-based systems, such as on-board product generation, data reduction, calibration, classification, event/feature detection, data mining, and real-time autonomous operations. The system is fully reconfigurable in flight, including data parameters, software, and FPGA logic, through either ground commanding or autonomously in response to detected events/features in the instrument data stream.

  15. Parallel Computational Fluid Dynamics: Current Status and Future Requirements

    NASA Technical Reports Server (NTRS)

    Simon, Horst D.; VanDalsem, William R.; Dagum, Leonardo; Kutler, Paul (Technical Monitor)

    1994-01-01

    One of the key objectives of the Applied Research Branch in the Numerical Aerodynamic Simulation (NAS) Systems Division at NASA Ames Research Center is the accelerated introduction of highly parallel machines into a full operational environment. In this report we discuss the performance results obtained from the implementation of some computational fluid dynamics (CFD) applications on the Connection Machine CM-2 and the Intel iPSC/860. We summarize some of the experience gained so far with the parallel testbed machines at the NAS Applied Research Branch. Then we discuss the long-term computational requirements for accomplishing some of the grand challenge problems in computational aerosciences. We argue that only massively parallel machines will be able to meet these grand challenge requirements, and we outline the computer science and algorithm research challenges ahead.

  16. Technology pedagogy: Six teacher candidates' developing pedagogical models for the use of computers in science instruction

    NASA Astrophysics Data System (ADS)

    Myhre, Oddmund Reidar

    1997-12-01

    This study investigated how teacher candidates' developing pedagogical beliefs and knowledge of technology influenced their perception of such tools in the teaching of subject matter as they completed the initial coursework of their professional program. The purpose of the study was to conceptualize more clearly the relationship between prospective teachers' thinking about computer technology and the content of their professional education. A case study methodology was used to investigate changes in six pre-service secondary science teachers' thinking about technology as a pedagogical tool. Two of the teachers had extensive experience with technology upon entering the teacher preparation coursework, whereas the other four were novice computer users. Data included three semi-structured interviews and non-participant observations during the technology coursework. Additional data were collected in the form of interviews with university faculty and cooperating teachers. Analysis of these data indicated that prospective candidates entered teacher education viewing technology as a tool that supports a teacher-centered classroom. As the candidates explored more student-centered approaches to teaching, they found less room for technology in their images of their future practice. The data also indicated that the technology coursework was isolated from the rest of the teacher education program, and many of the misconceptions about technology that the candidates brought to their professional preparation were left unchallenged.

  17. Large Scale Computing and Storage Requirements for High Energy Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerber, Richard A.; Wasserman, Harvey

    2010-11-24

    The National Energy Research Scientific Computing Center (NERSC) is the leading scientific computing facility for the Department of Energy's Office of Science, providing high-performance computing (HPC) resources to more than 3,000 researchers working on about 400 projects. NERSC provides large-scale computing resources and, crucially, the support and expertise needed for scientists to make effective use of them. In November 2009, NERSC, DOE's Office of Advanced Scientific Computing Research (ASCR), and DOE's Office of High Energy Physics (HEP) held a workshop to characterize the HPC resources needed at NERSC to support HEP research through the next three to five years. The effort is part of NERSC's legacy of anticipating users' needs and deploying resources to meet those demands. The workshop revealed several key points, in addition to achieving its goal of collecting and characterizing computing requirements. The chief findings: (1) Science teams need access to a significant increase in computational resources to meet their research goals; (2) Research teams need to be able to read, write, transfer, store online, archive, analyze, and share huge volumes of data; (3) Science teams need guidance and support to implement their codes on future architectures; and (4) Projects need predictable, rapid turnaround of their computational jobs to meet mission-critical time constraints. This report expands upon these key points and includes others. It also presents a number of case studies as representative of the research conducted within HEP. Workshop participants were asked to codify their requirements in this case study format, summarizing their science goals, methods of solution, current and three-to-five year computing requirements, and software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, multi-core environment that is expected to dominate HPC architectures over the next few years. The report includes a section that describes efforts already underway or planned at NERSC that address requirements collected at the workshop. NERSC has many initiatives in progress that address key workshop findings and are aligned with NERSC's strategic plans.

  18. Interactive visualization of Earth and Space Science computations

    NASA Technical Reports Server (NTRS)

    Hibbard, William L.; Paul, Brian E.; Santek, David A.; Dyer, Charles R.; Battaiola, Andre L.; Voidrot-Martinez, Marie-Francoise

    1994-01-01

    Computers have become essential tools for scientists simulating and observing nature. Simulations are formulated as mathematical models but are implemented as computer algorithms to simulate complex events. Observations are also analyzed and understood in terms of mathematical models, but the number of these observations usually dictates that we automate analyses with computer algorithms. In spite of their essential role, computers are also barriers to scientific understanding. Unlike hand calculations, automated computations are invisible and, because of the enormous numbers of individual operations in automated computations, the relation between an algorithm's input and output is often not intuitive. This problem is illustrated by the behavior of meteorologists responsible for forecasting weather. Even in this age of computers, many meteorologists manually plot weather observations on maps, then draw isolines of temperature, pressure, and other fields by hand (special pads of maps are printed for just this purpose). Similarly, radiologists use computers to collect medical data but are notoriously reluctant to apply image-processing algorithms to that data. To these scientists with life-and-death responsibilities, computer algorithms are black boxes that increase rather than reduce risk. The barrier between scientists and their computations can be bridged by techniques that make the internal workings of algorithms visible and that allow scientists to experiment with their computations. Here we describe two interactive systems developed at the University of Wisconsin-Madison Space Science and Engineering Center (SSEC) that provide these capabilities to Earth and space scientists.
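
    The meteorologists' hand-drawn isolines are exactly the kind of computation such systems make visible. As a small, hedged illustration (standard numpy/matplotlib on synthetic data, not the SSEC systems described in the paper), the sketch below contours a two-dimensional temperature field and labels the isolines automatically.

        # Contour a synthetic 2-D temperature field and label the isolines.
        # Uses standard numpy/matplotlib; the field is made up for illustration.
        import numpy as np
        import matplotlib.pyplot as plt

        x, y = np.meshgrid(np.linspace(-5, 5, 100), np.linspace(-5, 5, 100))
        temperature = 20 + 5 * np.exp(-(x**2 + y**2) / 8)  # synthetic warm anomaly

        cs = plt.contour(x, y, temperature, levels=10)
        plt.clabel(cs, inline=True, fontsize=8)            # label each isoline
        plt.title("Isolines of a synthetic temperature field (deg C)")
        plt.savefig("isolines.png")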

  19. CDAC Student Report: Summary of LLNL Internship

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herriman, Jane E.

    Multiple objectives motivated me to apply for an internship at LLNL: I wanted to experience the work environment at a national lab, to learn about research and job opportunities at LLNL in particular, and to gain greater experience with code development, particularly within the realm of high performance computing (HPC). This summer I was selected to participate in LLNL's Computational Chemistry and Material Science Summer Institute (CCMS). CCMS is a 10-week program hosted by the Quantum Simulations group leader, Dr. Eric Schwegler. CCMS connects graduate students to mentors at LLNL involved in similar research and provides weekly seminars on a broad array of topics from within chemistry and materials science. Dr. Xavier Andrade and Dr. Erik Draeger served as my co-mentors over the summer, and Dr. Andrade continues to mentor me now that CCMS has concluded. Dr. Andrade is a member of the Quantum Simulations group within the Physical and Life Sciences at LLNL, and Dr. Draeger leads the HPC group within the Center for Applied Scientific Computing (CASC). The two have worked together to develop Qb@ll, an open-source first-principles molecular dynamics code that was the platform for my summer research project.

  20. Testimony to the House Science Space and Technology Committee.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Church, Michael Kenton; Tannenbaum, Benn

    Chairman Smith, Ranking Member Johnson, and distinguished members of the Committee on Science, Space, and Technology, I thank you for the opportunity to testify today on the role of science, engineering, and research at Sandia National Laboratories, one of the nation's premier national labs and the nation's largest Federally Funded Research and Development Center (FFRDC) laboratory. I am Dr. Susan Seestrom, Sandia's Associate Laboratories Director for Advanced Science & Technology (AST) and Chief Research Officer (CRO). As CRO I am responsible for research strategy, Laboratory Directed Research & Development (LDRD), partnerships strategy, and technology transfer. As director and line manager for AST I manage capabilities and mission delivery across a variety of the physical and mathematical sciences and engineering disciplines, such as pulsed power, radiation effects, major environmental testing, high performance computing, and modeling and simulation.

  1. ISTP Science Data Systems and Products

    NASA Astrophysics Data System (ADS)

    Mish, William H.; Green, James L.; Reph, Mary G.; Peredo, Mauricio

    1995-02-01

    The International Solar-Terrestrial Physics (ISTP) program will provide simultaneous coordinated scientific measurements from most of the major areas of geospace, including specific locations on the Earth's surface. This paper describes the comprehensive ISTP ground science data handling system, which has been developed to promote optimal mission planning and efficient data processing, analysis, and distribution. The essential components of this ground system are the ISTP Central Data Handling Facility (CDHF), the Information Processing Division's Data Distribution Facility (DDF), the ISTP/Global Geospace Science (GGS) Science Planning and Operations Facility (SPOF), and the NASA Data Archive and Distribution Service (NDADS). The ISTP CDHF is the one place in the program where measurements from this wide variety of geospace and ground-based instrumentation and theoretical studies are brought together. Subsequently, these data will be distributed, along with ancillary data, in a unified fashion to the ISTP Principal Investigator (PI) and Co-Investigator (CoI) teams for analysis on their local systems. The CDHF ingests the telemetry streams, orbit, attitude, and command history from the GEOTAIL, WIND, POLAR, SOHO, and IMP-8 spacecraft; computes summary data sets, called Key Parameters (KPs), for each scientific instrument; ingests pre-computed KPs from other spacecraft and ground-based investigations; provides a computational platform for parameterized modeling; and provides a number of "data services" for the ISTP community of investigators. The DDF organizes the KPs, decommutated telemetry, and associated ancillary data into products for distribution to the ISTP community on CD-ROMs. The SPOF is the component of the GGS program responsible for the development and coordination of ISTP science planning operations. The SPOF operates under the direction of the ISTP Project Scientist and is responsible for the development and coordination of the science plan for ISTP spacecraft. Instrument command requests for the WIND and POLAR investigations are submitted by the PIs to the SPOF, where they are checked for science conflicts, forwarded to the GSFC Command Management System/Payload Operations Control Center (CMS/POCC) for engineering conflict validation, and finally incorporated into the conflict-free science operations plan. Conflict resolution is accomplished through iteration between the PIs, SPOF, and CMS, and in consultation with the Project Scientist when necessary. The long-term archival of ISTP KP and level-zero data will be undertaken by NASA's National Space Science Data Center using the NASA Data Archive and Distribution Service (NDADS). This on-line archive facility will provide rapid access to archived KPs and event data and includes security features to restrict access to the data during the time they are proprietary.
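
    The Key Parameters mentioned above are low-resolution summaries derived from high-rate telemetry. The sketch below shows the general reduction idea, averaging a high-rate stream into one-minute summary values; actual ISTP KP algorithms are instrument-specific, so the window length, data rate, and function names here are assumptions for illustration only.

        # Reduce a high-rate instrument stream to one-minute summary values,
        # in the spirit of ISTP Key Parameters. Illustrative only; real KP
        # algorithms are defined per instrument.
        from statistics import mean

        def compute_key_parameters(samples, window_s=60):
            """samples: iterable of (time_seconds, value) pairs in time order."""
            kps, bucket, bucket_start = [], [], None
            for t, v in samples:
                if bucket_start is None:
                    bucket_start = t
                if t - bucket_start >= window_s:
                    kps.append((bucket_start, mean(bucket)))
                    bucket, bucket_start = [], t
                bucket.append(v)
            if bucket:
                kps.append((bucket_start, mean(bucket)))
            return kps

        # A 3 Hz magnetometer-like stream reduced to 1-minute averages:
        stream = [(i / 3.0, 5.0 + (i % 7) * 0.1) for i in range(600)]
        print(compute_key_parameters(stream)[:2])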

  2. How are the energy waves blocked on the way from hot to cold?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bai, Xianming; He, Lingfeng; Khafizov, Marat

    Representing the Center for Materials Science of Nuclear Fuel (CMSNF), this document is one of the entries in the Ten Hundred and One Word Challenge. As part of the challenge, the 46 Energy Frontier Research Centers were invited to represent their science in images, cartoons, photos, words and original paintings, but any descriptions or words could only use the 1000 most commonly used words in the English language, with the addition of one word important to each of the EFRCs and the mission of DOE energy. The mission of CMSNF is to develop an experimentally validated multi-scale computational capability for the predictive understanding of the impact of microstructure on thermal transport in nuclear fuel under irradiation, with ultimate application to UO2 as a model system.

  3. National Space Science Data Center data archive and distribution service (NDADS) automated retrieval mail system user's guide

    NASA Technical Reports Server (NTRS)

    Perry, Charleen M.; Vansteenberg, Michael E.

    1992-01-01

    The National Space Science Data Center (NSSDC) has developed an automated data retrieval request service utilizing its Data Archive and Distribution Service (NDADS) computer system. NDADS currently has selected project data written to optical disk platters, with the disks residing in a robotic 'jukebox' near-line environment. This allows for rapid and automated access to the data with no staff intervention required. Automated help information and user services are also available. The request system permits an average-size data request to be completed within minutes of the request being sent to NSSDC. A mail message, in the format described in this document, retrieves the data and can send it to a remote site. Also listed in this document are the data currently available.
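
    The retrieval service described above is driven entirely by a formatted mail message. The sketch below shows how such a request might be submitted programmatically; the recipient address and the request syntax are placeholders for illustration, not the documented NDADS format:

      # Illustrative only: submit a data-request message to an automated,
      # mail-driven archive service. Address and body syntax are placeholders.
      import smtplib
      from email.message import EmailMessage

      msg = EmailMessage()
      msg["From"] = "researcher@example.edu"            # hypothetical requester
      msg["To"] = "archive-request@example.nasa.gov"    # placeholder address
      msg["Subject"] = "REQUEST"
      msg.set_content("PROJECT=IMP-8\nDATASET=KP\nDATE=1992-01-01\n")

      with smtplib.SMTP("localhost") as smtp:           # assumes a local mail relay
          smtp.send_message(msg)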

  4. FOREWORD: Third Nordic Symposium on Computer Simulation in Physics, Chemistry, Biology and Mathematics

    NASA Astrophysics Data System (ADS)

    Kaski, K.; Salomaa, M.

    1990-01-01

    These are Proceedings of the Third Nordic Symposium on Computer Simulation in Physics, Chemistry, Biology, and Mathematics, held August 25-26, 1989, at Lahti (Finland). The Symposium belongs to an annual series of meetings, the first one of which was arranged in 1987 at Lund (Sweden) and the second one in 1988 at Kolle-Kolle near Copenhagen (Denmark). Although these Symposia have thus far been essentially Nordic events, their international character has increased significantly; the trend is vividly reflected in the contributions to the present Topical Issue. The interdisciplinary nature of Computational Science is central to the activity; this fundamental aspect is also responsible, in an essential way, for its rapidly increasing impact. Crucially important to a wide spectrum of superficially disparate fields is the common need for extensive - and often quite demanding - computational modelling. For such theoretical models, no closed-form (analytical) solutions are available, or they would be extremely difficult to find; hence one must rather resort to the art of performing computational investigations. Among the unifying features in computational research are the methods of simulation employed - methods which frequently are closely related to each other even in fields of science that are otherwise quite unrelated. Computer simulation in the natural sciences is presently apprehended as a discipline in its own right, occupying a broad region somewhere between the experimental and theoretical methods, but also partially overlapping with and complementing them. Whatever its proper definition may be, the computational approach serves as a novel and extremely versatile tool with which one can equally well perform "pure" experimental modelling and conduct "computational theory". Computational studies that were earlier possible only on supercomputers have opened unexpected as well as exciting novel frontiers equally in mathematics (e.g., fractals), physics (fluid-dynamical and quantum-mechanical calculations; extensive numerical simulations of various condensed-matter systems; the development of stellar constellations, even the early Universe), chemistry (quantum-chemical calculations of the structures of new chemical compounds; chemical reactions and reaction dynamics), and biology (various models, for example, in population dynamics). We succeeded in our effort to assemble several internationally recognized researchers of Computational Science to deliver invited talks on a couple of exceptionally beautiful late-summer days in the modern premises of the Adult Education Center at Lahti. Among the plenary speakers, Per Bak described his highly original work on self-organized criticality. David Ceperley discussed pioneering numerical simulations of superfluid helium in which, for the first time, Feynman's path-integral formulation of quantum mechanics has been implemented on a computer. Jim Gunton presented his comprehensive studies of the Cahn-Hilliard equation for the dynamics of ordering in a condensed-matter system far from equilibrium, while Alex Hansen explained studies of nonlinear breakdown in disordered materials. Representing the important field of computational chemistry, Bo Jönsson dealt with attractive forces between polyelectrolytes. Kurt Kremer gave an interesting account of computer-simulation studies of complex polymer systems, while Ole Mouritsen reviewed studies of interfacial fluctuations in lipid membranes.
Pekka Pyykkö introduced his pioneering work which has led to predictions of completely novel chemical species. Annette Zippelius gave an expert introduction to the highly active field of neural networks. It is evident from each of these intriguing plenary contributions that, indeed, the computational approach is a frontier field of science, possibly providing the most versatile research method available today. We also arranged a competition for the best Posters presented at the Symposium; the Prizes were some of the newest books on the beauty of fractals. The First Prize was won by Hanna Viertio, the Second Prize by Miguel Zendejas, and the Third Prize was shared by Leo Kärkkäinen and Kari Rummukainen. As for the future of Computational Science, we identify two principal avenues: (a) big science - large centers with ultrafast supercomputers, and (b) small science - active groups utilizing personal minisupercomputers or superworkstations. At present, it appears that the latter already compete extremely favourably in their performance with the massive supercomputers - at least in their throughput and, especially, in tasks where a broad range of diverse software support is not absolutely necessary. In view of this important emergence of "personal supercomputing", we envisage that the role and the development of large computer centers will have to be reviewed critically and modified accordingly. Furthermore, a promise for some radically new approaches to Computational Science could be provided by massively parallel computers; among them, perhaps solutions based on ideas of neural computing could be utilized, especially for restricted applications. Therefore, in order not to overlook any important advances within such a forefront field, one should rather choose the strategy of actively following each and every one of these routes. Given the large variety of simultaneous developments, we want to emphasize the importance of Nordic collaboration in sharing expertise and experience in this rapidly progressing research - it ought to be cultivated and could be expanded. Therefore, we think that it is vitally important to continue with and to further promote the kind of Nordic Symposia that have been held at Lund, Kolle-Kolle, and Lahti. We want to thank most cordially the plenary and invited speakers, contributors, students, and in particular the Conference Secretary, Ms Ulla Ahlfors, and Dr Milja Mäkelä, who was responsible for the local arrangements. The work that they did served to make this Symposium a scientific success and a useful and pleasant experience for all of the well over 100 participants. We also thank the City of Lahti for kindly arranging a refreshing reception at the Town Hall. We wish to express our gratitude to Nordiska Kulturfonden, NORDITA, the Research Institute for Theoretical Physics at the University of Helsinki, the Finnish Ministry of Education and the Academy of Finland for their financial support. March 1990

  5. Transport properties of two-dimensional metal-phthalocyanine junctions: An ab initio study

    NASA Astrophysics Data System (ADS)

    Liu, Shuang-Long; Wang, Yun-Peng; Li, Xiang-Guo; Cheng, Hai-Ping

    We study two-dimensional (2D) electronic/spintronic junctions made of metal-organic frameworks via first-principles simulation. The system consists of two Mn-phthalocyanine leads and a Ni-phthalocyanine center. A 2D Mn-phthalocyanine sheet is a ferromagnetic half metal, and a 2D Ni-phthalocyanine sheet is a nonmagnetic semiconductor. Our results show that this system has a large tunneling magnetoresistance. The transmission coefficient at the Fermi energy decays exponentially with the length of the central region, which is not surprising. However, the transmission of the junction can be tuned by up to two orders of magnitude using a gate voltage. The origin of the change lies in the mode matching between the lead and the center electronic states. Moreover, the threshold gate voltage varies with the length of the center region, which provides a way of engineering the transport properties. Finally, we combine the non-equilibrium Green's function method and the Boltzmann transport equation to compute the conductance of the junction. This work was supported by the US Department of Energy (DOE), Office of Basic Energy Sciences (BES), under Contract No. DE-FG02-02ER45995. Computations were done using the utilities of NERSC and University of Florida Research Computing.
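
    Two standard relations are consistent with the behavior reported above: the Fermi-level transmission decays exponentially with the center length L, and the spin-resolved Landauer formula turns transmission into conductance. The prefactor T_0 and decay constant beta below are generic tunneling notation, not values from the abstract:

      \[
        T_\sigma(E_F; L) \approx T_{0,\sigma}\, e^{-\beta L},
        \qquad
        G = \frac{e^2}{h} \sum_{\sigma} T_\sigma(E_F)
      \]

    On this reading, gate tuning changes the mode matching (and hence the prefactor) rather than the exponential length dependence, which is consistent with the two-orders-of-magnitude modulation the abstract describes.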

  6. Particle-in-cell code library for numerical simulation of the ECR source plasma

    NASA Astrophysics Data System (ADS)

    Shirkov, G.; Alexandrov, V.; Preisendorf, V.; Shevtsov, V.; Filippov, A.; Komissarov, R.; Mironov, V.; Shirkova, E.; Strekalovsky, O.; Tokareva, N.; Tuzikov, A.; Vatulin, V.; Vasina, E.; Fomin, V.; Anisimov, A.; Veselov, R.; Golubev, A.; Grushin, S.; Povyshev, V.; Sadovoi, A.; Donskoi, E.; Nakagawa, T.; Yano, Y.

    2003-05-01

    The project "Numerical simulation and optimization of ion accumulation and production in multicharged ion sources" is funded by the International Science and Technology Center (ISTC). A summary of recent project development and the first version of a computer code library for simulation of electron-cyclotron resonance (ECR) source plasmas based on the particle-in-cell method are presented.
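
    For readers unfamiliar with the method, one particle-in-cell cycle alternates charge deposition onto a grid, a field solve, and a particle push. The following minimal 1D electrostatic sketch (periodic grid, nearest-grid-point deposition, simple explicit push, normalized units) illustrates the cycle only; the ISTC library itself is far more elaborate:

      import numpy as np

      def pic_step(x, v, q_m, grid_n, length, dt):
          """Advance particle positions x and velocities v by one PIC cycle."""
          dx = length / grid_n
          cells = (x / dx).astype(int) % grid_n
          # Deposit unit-charge particles (nearest grid point), then subtract
          # the mean as a neutralizing background charge.
          rho = np.bincount(cells, minlength=grid_n) / dx
          rho = rho - rho.mean()
          # Solve the periodic Poisson equation d2(phi)/dx2 = -rho spectrally.
          k = 2.0 * np.pi * np.fft.fftfreq(grid_n, d=dx)
          phi_k = np.where(k != 0, np.fft.fft(rho) / np.maximum(k**2, 1e-30), 0.0)
          E = np.real(np.fft.ifft(-1j * k * phi_k))   # E = -d(phi)/dx
          v = v + q_m * E[cells] * dt                 # accelerate at particle cells
          x = (x + v * dt) % length                   # move with periodic wrap
          return x, v

      rng = np.random.default_rng(0)
      x, v = rng.uniform(0.0, 1.0, 1000), np.zeros(1000)
      for _ in range(100):
          x, v = pic_step(x, v, q_m=-1.0, grid_n=32, length=1.0, dt=1e-3)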

  7. NAVO MSRC Navigator. Spring 2008

    DTIC Science & Technology

    2008-01-01

    EINSTEIN and DAVINCI Come to the MSRC The Porthole 19 Visitors to the Naval Oceanographic Office Major Shared Resource Center Navigator Tools and...traditionally considered one of the leading track guidance tools for forecasters. As an example, we consider the case of Hurricane Figure 2. The...MSRC NAVIGATOR EINSTEIN and DAVINCI Come to the MSRC Christine Cuicchi, Computational Science and Applications Lead, NAVO MSRC The Technology

  8. Overview of Human-Centric Space Situational Awareness Science and Technology

    DTIC Science & Technology

    2012-09-01

    AGI), the developers of Satellite Tool Kit ( STK ), has provided demonstrations of innovative SSA visualization concepts that take advantage of the...needs inherent with SSA. RH has conducted CTAs and developed work-centered human-computer interfaces, visualizations , and collaboration technologies...all end users. RH’s Battlespace Visualization Branch researches methods to exploit the visual channel primarily to improve decision making and

  9. KSC-2013-3570

    NASA Image and Video Library

    2013-09-12

    CAPE CANAVERAL, Fla. – Tracey Kickbusch, chief of computational sciences at NASA's Kennedy Space Center in Florida, discusses modeling and simulations with attendees at the Technology Transfer Forum of the Economic Development Commission of Florida's Space Coast. A goal of the session was to showcase ways commercial businesses can work with NASA to develop technology and apply existing technology to commercial uses. Photo credit: NASA/Glenn Benson

  10. Artificial Intelligence: An Analysis of the Technology for Training. Training and Development Research Center Project Number Fourteen.

    ERIC Educational Resources Information Center

    Sayre, Scott Alan

    The ultimate goal of the science of artificial intelligence (AI) is to establish programs that will use algorithmic computer techniques to imitate the heuristic thought processes of humans. Most AI programs, especially expert systems, organize their knowledge into three specific areas: data storage, a rule set, and a control structure. Limitations…

  11. 77 FR 27470 - Center for Scientific Review Notice of Closed Meetings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-10

    ..., Prevention and Intervention for Addictions Study Section. Date: June 7-8, 2012. Time: 8:00 a.m. to 5:00 p.m...: Bioengineering Sciences & Technologies Integrated Review Group; Nanotechnology Study Section. Date: June 7-8..., Computational Biology and Technology Study Section. Date: June 7-8, 2012. Time: 8:30 a.m. to 6:00 p.m. Agenda...

  12. Computer network access to scientific information systems for minority universities

    NASA Astrophysics Data System (ADS)

    Thomas, Valerie L.; Wakim, Nagi T.

    1993-08-01

    The evolution of computer networking technology has led to the establishment of a massive networking infrastructure which interconnects various types of computing resources at many government, academic, and corporate institutions. A large segment of this infrastructure has been developed to facilitate information exchange and resource sharing within the scientific community. The National Aeronautics and Space Administration (NASA) supports both the development and the application of computer networks which provide its community with access to many valuable multi-disciplinary scientific information systems and on-line databases. Recognizing the need to extend the benefits of this advanced networking technology to the under-represented community, the National Space Science Data Center (NSSDC) in the Space Data and Computing Division at the Goddard Space Flight Center has developed the Minority University-Space Interdisciplinary Network (MU-SPIN) Program: a major networking and education initiative for Historically Black Colleges and Universities (HBCUs) and Minority Universities (MUs). In this paper, we will briefly explain the various components of the MU-SPIN Program while highlighting how, by providing access to scientific information systems and on-line data, it promotes a higher level of collaboration among faculty and students and NASA scientists.

  13. Curricular Design for Intelligent Systems in Geosciences Using Urban Groundwater Studies.

    NASA Astrophysics Data System (ADS)

    Cabral-Cano, E.; Pierce, S. A.; Fuentes-Pineda, G.; Arora, R.

    2016-12-01

    Geosciences research frequently focuses on process-centered phenomena, studying combinations of physical, geological, chemical, biological, ecological, and anthropogenic factors. These interconnected Earth systems can be best understood through the use of digital tools that should be documented as workflows. To develop intelligent systems, it is important that geoscientists and computing and information sciences experts collaborate to: (1) develop a basic understanding of the geosciences and computing and information sciences disciplines so that the problem and solution approach are clear to all stakeholders, and (2) implement the desired intelligent system with a short turnaround time. However, these interactions and techniques are seldom covered in traditional Earth Sciences curricula. We have developed an exchange course on Intelligent Systems for Geosciences to support workforce development and build capacity to facilitate skill development at the undergraduate level. The first version of this course was offered jointly by the University of Texas at Austin and the Universidad Nacional Autónoma de México as an intensive, study-abroad summer course. Content included: a basic Linux introduction, shell scripting and high performance computing, data management, expert systems, field data collection exercises, and basics of machine learning. Additionally, student teams were tasked to develop term projects centered on applications of intelligent systems to urban and karst groundwater systems. Projects included expert system and reusable workflow development for subsidence hazard analysis in Celaya, Mexico; a classification model to analyze land use change over a 30-year period in Austin, Texas; big data processing and decision support for central Texas groundwater case studies; and 3D mapping with point cloud processing at three Texas field sites. We will share experiences and pedagogical insights to improve future versions of this course.

  14. Cloud Computing Applications in Support of Earth Science Activities at Marshall Space Flight Center

    NASA Technical Reports Server (NTRS)

    Molthan, Andrew L.; Limaye, Ashutosh S.; Srikishen, Jayanthi

    2011-01-01

    Currently, the NASA Nebula Cloud Computing Platform is available to Agency personnel in a pre-release status as the system undergoes a formal operational readiness review. Over the past year, two projects within the Earth Science Office at NASA Marshall Space Flight Center have been investigating the performance and value of Nebula's "Infrastructure as a Service", or "IaaS" concept and applying cloud computing concepts to advance their respective mission goals. The Short-term Prediction Research and Transition (SPoRT) Center focuses on the transition of unique NASA satellite observations and weather forecasting capabilities for use within the operational forecasting community through partnerships with NOAA's National Weather Service (NWS). SPoRT has evaluated the performance of the Weather Research and Forecasting (WRF) model on virtual machines deployed within Nebula and used Nebula instances to simulate local forecasts in support of regional forecast studies of interest to select NWS forecast offices. In addition to weather forecasting applications, rapidly deployable Nebula virtual machines have supported the processing of high resolution NASA satellite imagery to support disaster assessment following the historic severe weather and tornado outbreak of April 27, 2011. Other modeling and satellite analysis activities are underway in support of NASA's SERVIR program, which integrates satellite observations, ground-based data and forecast models to monitor environmental change and improve disaster response in Central America, the Caribbean, Africa, and the Himalayas. Leveraging SPoRT's experience, SERVIR is working to establish a real-time weather forecasting model for Central America. Other modeling efforts include hydrologic forecasts for Kenya, driven by NASA satellite observations and reanalysis data sets provided by the broader meteorological community. Forecast modeling efforts are supplemented by short-term forecasts of convective initiation, determined by geostationary satellite observations processed on virtual machines powered by Nebula.

  15. The Future Medical Science and Colorectal Surgeons

    PubMed Central

    2017-01-01

    Future medical technology breakthroughs will build on the incredible progress made in computers, biotechnology, and nanotechnology and on the information learned from the human genome. With such technology and information, computer-aided diagnoses, organ replacement, gene therapy, personalized drugs, and even age reversal will become possible. True 3-dimensional system technology will enable surgeons to envision key clinical features and will help them in planning complex surgery. Surgeons will enter surgical instructions in a virtual space from a remote medical center, order a medical robot to perform the operation, and review the operation in real time on a monitor. Surgeons will remain better than artificial intelligence or automated robots as long as surgeons (or we) love patients and ask questions for a better future. The purpose of this paper is to look at future medical science and the changing role of colorectal surgeons. PMID:29354602

  16. The Future Medical Science and Colorectal Surgeons.

    PubMed

    Kim, Young Jin

    2017-12-01

    Future medical technology breakthroughs will build on the incredible progress made in computers, biotechnology, and nanotechnology and on the information learned from the human genome. With such technology and information, computer-aided diagnoses, organ replacement, gene therapy, personalized drugs, and even age reversal will become possible. True 3-dimensional system technology will enable surgeons to envision key clinical features and will help them in planning complex surgery. Surgeons will enter surgical instructions in a virtual space from a remote medical center, order a medical robot to perform the operation, and review the operation in real time on a monitor. Surgeons will remain better than artificial intelligence or automated robots as long as surgeons (or we) love patients and ask questions for a better future. The purpose of this paper is to look at future medical science and the changing role of colorectal surgeons.

  17. KSC-2012-5019

    NASA Image and Video Library

    2012-09-06

    CAPE CANAVERAL, Fla. – During NASA's Innovation Expo at the Kennedy Space Center in Florida, Priscilla Elfrey, of NASA's Computational Sciences Branch, proposes that NASA partner with two organizations to help improve minority employment. Kennedy Kick-Start Chair Mike Conroy looks on from the left. As Kennedy continues developing programs and infrastructure to become a 21st century spaceport, many employees are devising ways to do their jobs better and more efficiently. On Sept. 6, 2012, 16 Kennedy employees pitched their innovative ideas for improving the center at the Kennedy Kick-Start event. The competition was part of a center-wide effort designed to increase exposure for innovative ideas and encourage their implementation. For more information, visit http://www.nasa.gov/centers/kennedy/news/kick-start_competition.html Photo credit: NASA/Gianni Woods

  18. Early development of Science Opportunity Analysis tools for the Jupiter Icy Moons Explorer (JUICE) mission

    NASA Astrophysics Data System (ADS)

    Cardesin Moinelo, Alejandro; Vallat, Claire; Altobelli, Nicolas; Frew, David; Llorente, Rosario; Costa, Marc; Almeida, Miguel; Witasse, Olivier

    2016-10-01

    JUICE is the first large mission in the framework of ESA's Cosmic Vision 2015-2025 program. JUICE will survey the Jovian system with a special focus on three of the Galilean moons: Europa, Ganymede and Callisto. The mission has recently been adopted, and considerable effort is being made by the Science Operations Center (SOC) at the European Space Astronomy Centre (ESAC) in Madrid to develop tools that provide the necessary support to the Science Working Team (SWT) for science opportunity analysis and early assessment of science operation scenarios. This contribution will outline some of the tools being developed within ESA and in collaboration with the Navigation and Ancillary Information Facility (NAIF) at JPL. The Mission Analysis and Payload Planning Support (MAPPS) tool is developed by ESA and has been used by most of ESA's planetary missions to generate and validate science observation timelines for the simulation of payload and spacecraft operations. MAPPS can compute and display all the necessary geometrical information, such as distances, illumination angles, and the projected field-of-view of an imaging instrument on the surface of a given body, and a preliminary setup is already in place for the early assessment of JUICE science operations. NAIF provides valuable SPICE support to the JUICE mission, and several tools are being developed to compute and visualize science opportunities. In particular, the WebGeoCalc and Cosmographia systems are provided by NAIF to compute time windows and create animations of the observation geometry available via traditional SPICE data files, such as planet orbits, spacecraft trajectory, spacecraft orientation, instrument field-of-view "cones" and instrument footprints. Other software tools are being developed by ESA and other collaborating partners to support science opportunity analysis for all missions, like the SOLab (Science Operations Laboratory) or new interfaces for observation definitions and opportunity window databases.

  19. When technology became language: the origins of the linguistic conception of computer programming, 1950-1960.

    PubMed

    Nofre, David; Priestley, Mark; Alberts, Gerard

    2014-01-01

    Language is one of the central metaphors around which the discipline of computer science has been built. The language metaphor entered modern computing as part of a cybernetic discourse, but during the second half of the 1950s acquired a more abstract meaning, closely related to the formal languages of logic and linguistics. The article argues that this transformation was related to the appearance of the commercial computer in the mid-1950s. Managers of computing installations and specialists on computer programming in academic computer centers, confronted with an increasing variety of machines, called for the creation of "common" or "universal languages" to enable the migration of computer code from machine to machine. Finally, the article shows how the idea of a universal language was a decisive step in the emergence of programming languages, in the recognition of computer programming as a proper field of knowledge, and eventually in the way we think of the computer.

  20. Computer Simulated Development of Improved Command to Line-of-Sight Missile Guidance Techniques

    DTIC Science & Technology

    1979-03-01

    Naval Postgraduate School, Monterey, CA 93940 ... United States Navy B.S., United States Naval Academy, 1967 Submitted in partial fulfillment of the requirements for the degree of MASTER OF SCIENCE IN SYSTEMS... BIBLIOGRAPHY 1. U.S. Army Foreign Science and Technology Center

  1. Real-Time On-Board Airborne Demonstration of High-Speed On-Board Data Processing for Science Instruments (HOPS)

    NASA Technical Reports Server (NTRS)

    Beyon, Jeffrey Y.; Ng, Tak-Kwong; Davis, Mitchell J.; Adams, James K.; Bowen, Stephen C.; Fay, James J.; Hutchinson, Mark A.

    2015-01-01

    The project called High-Speed On-Board Data Processing for Science Instruments (HOPS) has been funded by the NASA Earth Science Technology Office (ESTO) Advanced Information Systems Technology (AIST) program since April 2012. The HOPS team recently completed two flight campaigns during the summer of 2014 on two different aircraft with two different science instruments. The first flight campaign was in July 2014, based at NASA Langley Research Center (LaRC) in Hampton, VA, on NASA's HU-25 aircraft. The science instrument that flew with HOPS was the Active Sensing of CO2 Emissions over Nights, Days, and Seasons (ASCENDS) CarbonHawk Experiment Simulator (ACES), funded by NASA's Instrument Incubator Program (IIP). The second campaign was in August 2014, based at NASA Armstrong Flight Research Center (AFRC) in Palmdale, CA, on NASA's DC-8 aircraft. HOPS flew with the Multifunctional Fiber Laser Lidar (MFLL) instrument developed by Exelis Inc. The goal of the campaigns was to perform an end-to-end demonstration of the capabilities of the HOPS prototype system (HOPS COTS) while running the most computationally intensive part of the ASCENDS algorithm real-time on-board. The comparison of the two flight campaigns and the results of the functionality tests of the HOPS COTS are presented in this paper.

  2. Optical information processing at NASA Ames Research Center

    NASA Technical Reports Server (NTRS)

    Reid, Max B.; Bualat, Maria G.; Cho, Young C.; Downie, John D.; Gary, Charles K.; Ma, Paul W.; Ozcan, Meric; Pryor, Anna H.; Spirkovska, Lilly

    1993-01-01

    The combination of analog optical processors with digital electronic systems offers the potential of tera-OPS computational performance, while often requiring less power and weight relative to all-digital systems. NASA is working to develop and demonstrate optical processing techniques for on-board, real time science and mission applications. Current research areas and applications under investigation include optical matrix processing for space structure vibration control and the analysis of Space Shuttle Main Engine plume spectra, optical correlation-based autonomous vision for robotic vehicles, analog computation for robotic path planning, free-space optical interconnections for information transfer within digital electronic computers, and multiplexed arrays of fiber optic interferometric sensors for acoustic and vibration measurements.

  3. Data reduction and analysis of HELIOS plasma wave data

    NASA Technical Reports Server (NTRS)

    Anderson, Roger R.

    1988-01-01

    Reduction of data acquired from the HELIOS Solar Wind Plasma Wave Experiments on HELIOS 1 and 2 was continued. Production of 24-hour survey plots of the HELIOS 1 plasma wave data was continued, and microfilm copies were submitted to the National Space Science Data Center. Much of the effort involved the shock memory data from both HELIOS 1 and 2. These data had to be deconvoluted and time ordered before they could be displayed and plotted in an organized form. The UNIVAC 418-III computer was replaced by a DEC VAX 11/780 computer. In order to continue the reduction and analysis of the data set, all data reduction and analysis computer programs had to be rewritten.

  4. NLSI Focus Group on Recovery of Missing ALSEP Data: Status Update for 2012 NLSI Science Forum

    NASA Technical Reports Server (NTRS)

    Lewis, Lyach R.; Nakamura, Y.; Nagihara, S.; Williams, D. R.; Chi, P.; Taylor, P. T.; Schmidt, G. K.; Hill, H. K.

    2012-01-01

    On the six Apollo lunar landing missions, the astronauts deployed the Apollo Lunar Surface Experiments Package (ALSEP) science stations, which measured active and passive seismic events, magnetic fields, charged particles, solar wind, heat flow, the diffuse atmosphere, meteorites and their ejecta, lunar dust, etc. Today's investigators are able to extract new information and make new discoveries from the old ALSEP data utilizing recent advances in computer capabilities and new analysis techniques. However, current-day investigators are encountering problems in trying to use the ALSEP data. The data were in formats often not well described in the published reports and contained rerecording anomalies which required tape experts to resolve. To solve these problems, the PDS Lunar Data Node was established at the NASA Goddard Space Flight Center (GSFC) National Space Science Data Center (NSSDC) in 2008 and is currently in the process of making the existing archived ALSEP data available to current-day investigators in easily useable forms. However, current estimates by NSSDC archivists are that only about 60 percent of the PI-processed ALSEP data and less than 30 percent of the raw experiment ALSEP data of interest to current lunar science investigators are currently in the NSSDC archives.

  5. eHealth research from the user's perspective.

    PubMed

    Hesse, Bradford W; Shneiderman, Ben

    2007-05-01

    The application of information technology (IT) to issues of healthcare delivery has had a long and tortuous history in the United States. Within the field of eHealth, vanguard applications of advanced computing techniques, such as applications in artificial intelligence or expert systems, have languished in spite of a track record of scholarly publication and decisional accuracy. The problem is one of purpose, of asking the right questions for the science to solve. Historically, many computer science pioneers have been tempted to ask "what can the computer do?" New advances in eHealth are prompting developers to ask "what can people do?" How can eHealth take part in national goals for healthcare reform to empower relationships between healthcare professionals and patients, healthcare teams and families, and hospitals and communities to improve health equitably throughout the population? To do this, eHealth researchers must combine best evidence from the user sciences (human factors engineering, human-computer interaction, psychology, and usability) with best evidence in medicine to create transformational improvements in the quality of care that medicine offers. These improvements should follow recommendations from the Institute of Medicine to create a healthcare system that is (1) safe, (2) effective (evidence based), (3) patient centered, and (4) timely. Relying on the eHealth researcher's intuitive grasp of systems issues, improvements should be made with considerations of users and beneficiaries at the individual (patient-physician), group (family-staff), community, and broad environmental levels.

  6. Integration Of PanDA Workload Management System With Supercomputers for ATLAS and Data Intensive Science

    NASA Astrophysics Data System (ADS)

    Klimentov, A.; De, K.; Jha, S.; Maeno, T.; Nilsson, P.; Oleynik, D.; Panitkin, S.; Wells, J.; Wenaus, T.

    2016-10-01

    The LHC, operating at CERN, is leading Big Data-driven scientific exploration. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe. ATLAS, one of the largest collaborations ever assembled in the sciences, is at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, the ATLAS experiment is relying on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses the PanDA (Production and Data Analysis) Workload Management System for managing the workflow for all data processing on over 150 data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. While PanDA currently uses more than 250,000 cores with a peak performance of 0.3 petaFLOPS, LHC data taking runs require more resources than the grid can possibly provide. To alleviate these challenges, LHC experiments are engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We will describe a project aimed at integration of the PanDA WMS with supercomputers in the United States, in particular with the Titan supercomputer at the Oak Ridge Leadership Computing Facility. The current approach utilizes a modified PanDA pilot framework for job submission to the supercomputers' batch queues and local data management, with light-weight MPI wrappers to run single-threaded workloads in parallel on the LCFs' multi-core worker nodes. This implementation was tested with a variety of Monte-Carlo workloads on several supercomputing platforms for the ALICE and ATLAS experiments, and it has been in full production for ATLAS since September 2015. We will present our current accomplishments with running PanDA at supercomputers and demonstrate our ability to use PanDA as a portal independent of the computing facilities infrastructure for High Energy and Nuclear Physics as well as other data-intensive science applications, such as bioinformatics and astro-particle physics.
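
    The light-weight MPI wrapper idea mentioned above is straightforward to illustrate: one MPI rank per core, each running an independent single-threaded payload, so a leadership-class batch allocation is filled with grid-style jobs. A schematic sketch with mpi4py follows; the payload command and file naming are placeholder assumptions, not the PanDA pilot's actual interface:

      import subprocess
      from mpi4py import MPI

      comm = MPI.COMM_WORLD
      rank = comm.Get_rank()

      # Placeholder payload: in production this would be the experiment's
      # single-threaded executable with per-rank configuration.
      cmd = ["./payload.sh", f"--seed={rank}", f"--output=events_{rank:04d}.dat"]
      ret = subprocess.call(cmd)

      # Gather exit codes on rank 0 so the wrapper can report overall success.
      codes = comm.gather(ret, root=0)
      if rank == 0:
          failed = [i for i, c in enumerate(codes) if c != 0]
          print(f"{len(codes) - len(failed)} payloads succeeded; failed ranks: {failed}")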

  7. Dynamic Voltage Frequency Scaling Simulator for Real Workflows Energy-Aware Management in Green Cloud Computing

    PubMed Central

    Cotes-Ruiz, Iván Tomás; Prado, Rocío P.; García-Galán, Sebastián; Muñoz-Expósito, José Enrique; Ruiz-Reyes, Nicolás

    2017-01-01

    Nowadays, the growing computational capabilities of Cloud systems rely on the reduction of the consumed power of their data centers to make them sustainable and economically profitable. The efficient management of computing resources is at the heart of any energy-aware data center and of special relevance is the adaptation of its performance to workload. Intensive computing applications in diverse areas of science generate complex workload called workflows, whose successful management in terms of energy saving is still at its beginning. WorkflowSim is currently one of the most advanced simulators for research on workflows processing, offering advanced features such as task clustering and failure policies. In this work, an expected power-aware extension of WorkflowSim is presented. This new tool integrates a power model based on a computing-plus-communication design to allow the optimization of new management strategies in energy saving considering computing, reconfiguration and networks costs as well as quality of service, and it incorporates the preeminent strategy for on host energy saving: Dynamic Voltage Frequency Scaling (DVFS). The simulator is designed to be consistent in different real scenarios and to include a wide repertory of DVFS governors. Results showing the validity of the simulator in terms of resources utilization, frequency and voltage scaling, power, energy and time saving are presented. Also, results achieved by the intra-host DVFS strategy with different governors are compared to those of the data center using a recent and successful DVFS-based inter-host scheduling strategy as overlapped mechanism to the DVFS intra-host technique. PMID:28085932
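
    The energy accounting behind DVFS can be illustrated with the standard CMOS dynamic-power relation P_dyn ≈ C * V^2 * f. The constants below, and the assumption that voltage scales linearly with frequency, are illustrative only; they are not the calibrated computing-plus-communication model inside the WorkflowSim extension:

      def task_energy(cycles, freq_hz, volt, c_eff=1e-9, p_static=5.0):
          """Energy (J) to run `cycles` CPU cycles at one DVFS operating point."""
          runtime = cycles / freq_hz             # seconds
          p_dyn = c_eff * volt**2 * freq_hz      # dynamic power, watts
          return (p_dyn + p_static) * runtime

      # Halving frequency and voltage cuts dynamic power ~8x and dynamic
      # energy ~4x, but doubles runtime and hence static energy: the
      # trade-off a DVFS governor must balance.
      print(task_energy(2e9, 2.0e9, 1.2))   # high operating point: ~7.9 J
      print(task_energy(2e9, 1.0e9, 0.6))   # low operating point: ~10.7 J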

  8. Dynamic Voltage Frequency Scaling Simulator for Real Workflows Energy-Aware Management in Green Cloud Computing.

    PubMed

    Cotes-Ruiz, Iván Tomás; Prado, Rocío P; García-Galán, Sebastián; Muñoz-Expósito, José Enrique; Ruiz-Reyes, Nicolás

    2017-01-01

    Nowadays, the growing computational capabilities of Cloud systems rely on the reduction of the consumed power of their data centers to make them sustainable and economically profitable. The efficient management of computing resources is at the heart of any energy-aware data center and of special relevance is the adaptation of its performance to workload. Intensive computing applications in diverse areas of science generate complex workload called workflows, whose successful management in terms of energy saving is still at its beginning. WorkflowSim is currently one of the most advanced simulators for research on workflows processing, offering advanced features such as task clustering and failure policies. In this work, an expected power-aware extension of WorkflowSim is presented. This new tool integrates a power model based on a computing-plus-communication design to allow the optimization of new management strategies in energy saving considering computing, reconfiguration and networks costs as well as quality of service, and it incorporates the preeminent strategy for on host energy saving: Dynamic Voltage Frequency Scaling (DVFS). The simulator is designed to be consistent in different real scenarios and to include a wide repertory of DVFS governors. Results showing the validity of the simulator in terms of resources utilization, frequency and voltage scaling, power, energy and time saving are presented. Also, results achieved by the intra-host DVFS strategy with different governors are compared to those of the data center using a recent and successful DVFS-based inter-host scheduling strategy as overlapped mechanism to the DVFS intra-host technique.

  9. [Introduction].

    PubMed

    Alberts, Gerard; van den Bogaard, Adrienne

    2008-01-01

    Along with the international trends in history of computing, Dutch contributions over the past twenty years moved away from a focus on machinery to the broader scope of use of computers, appropriation of computing technologies in various traditions, labour relations and professionalisation issues, and, lately, software. It is only natural that an emerging field like computer science sets out to write its genealogy and canonise the important steps in its intellectual endeavour. It is fair to say that a historiography diverging from such "home" interest started in 1987 with the work of Eda Kranakis--then active in The Netherlands--commissioned by the national bureau for technology assessment, and with Gerard Alberts turning a commemorative volume of the Mathematical Center into a history of the same institute. History of computing in The Netherlands made a major leap in the spring of 1994, when Dirk de Wit, Jan van den Ende and Ellen van Oost defended their dissertations on the roads towards adoption of computing technology in banking, in science and engineering, and on the gender aspect in computing. Here, history of computing had already moved from machines to the use of computers. The three authors joined Gerard Alberts and Onno de Wit in preparing a volume on the rise of IT in The Netherlands, the sequel of which is now in preparation in a team led by Adrienne van den Bogaard. Dutch research reflected the international attention for professionalisation issues (Ensmenger, Haigh) very early on in the dissertation by Ruud van Dael, Something to do with computers (2001), revealing how occupations dealing with computers typically escape the pattern of closure by professionalisation expected by the, thus outdated, sociology of professions. History of computing not only takes use and users into consideration but finally, as one may say, confronts the technological side of putting the machine to use, software, head on. The groundbreaking works of the 2000 Paderborn meeting and of Martin Campbell-Kelly resonate in work done in The Netherlands and recently in a major research project sponsored by the European Science Foundation: Software for Europe. The four contributions to this issue offer a true cross-section of ongoing history of computing in The Netherlands. Gerard Alberts and Huub de Beer return to the earliest computers at the Mathematical Center. As they do so under the perspective of using the machines, the result is, let us say, remarkable. Adrienne van den Bogaard compares the styles of software as practiced by Van der Poel and Dijkstra: so much had these two pioneers in common, so different were the consequences they drew. Frank Veraart treats us to an excerpt from his recent dissertation on the domestication of microcomputer technology: the appropriation of computing technology is shown through the role of intermediate actors. Onno de Wit, finally, gives an account of the development, prior to the internet, of a national data communication network among large-scale users and its remarkable persistence under competition with new network technologies.

  10. Know Your Discipline: Teaching the Philosophy of Computer Science

    ERIC Educational Resources Information Center

    Tedre, Matti

    2007-01-01

    The diversity and interdisciplinarity of computer science and the multiplicity of its uses in other sciences make it hard to define computer science and to prescribe how computer science should be carried out. The diversity of computer science also causes friction between computer scientists from different branches. Computer science curricula, as…

  11. User manual for semi-circular compact range reflector code: Version 2

    NASA Technical Reports Server (NTRS)

    Gupta, Inder J.; Burnside, Walter D.

    1987-01-01

    A computer code has been developed at the Ohio State University ElectroScience Laboratory to analyze a semi-circular paraboloidal reflector with or without a rolled edge at the top and a skirt at the bottom. The code can be used to compute the total near field of the reflector or its individual components at a given distance from the center of the paraboloid. The code computes the fields along a radial, horizontal, vertical or axial cut at that distance. Thus, it is very effective in computing the size of the sweet spot for a semi-circular compact range reflector. This report describes the operation of the code. Various input and output statements are explained. Some results obtained using the computer code are presented to illustrate the code's capabilities and to serve as sample input/output sets.
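
    As a rough geometric illustration of the field cuts described above, the sketch below generates observation points at a distance z0 from the paraboloid center for horizontal, vertical, axial, and one assumed convention of radial cuts. The axis conventions and extents are assumptions for illustration, not the OSU code's actual input scheme:

      import numpy as np

      def cut_points(kind, z0, half_span=2.0, n=101):
          """Observation points (x, y, z) along a named cut at distance z0."""
          s = np.linspace(-half_span, half_span, n)
          zeros, plane = np.zeros(n), np.full(n, z0)
          if kind == "horizontal":   # sweep x across the target plane
              return np.column_stack([s, zeros, plane])
          if kind == "vertical":     # sweep y across the target plane
              return np.column_stack([zeros, s, plane])
          if kind == "axial":        # sweep along the boresight z axis
              return np.column_stack([zeros, zeros, z0 + s])
          if kind == "radial":       # assumed: a diagonal line in the plane
              return np.column_stack([s / np.sqrt(2), s / np.sqrt(2), plane])
          raise ValueError(kind)

      pts = cut_points("horizontal", z0=10.0)   # then evaluate fields at pts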

  12. Microgravity Science Glovebox

    NASA Technical Reports Server (NTRS)

    2001-01-01

    Computer-generated drawing shows the relative scale and working space for the Microgravity Science Glovebox (MSG) being developed by NASA and the European Space Agency for science experiments aboard the International Space Station (ISS). The person at the glovebox represents a 95th percentile American male. The MSG will be deployed first to the Destiny laboratory module and later will be moved to ESA's Columbus Attached Payload Module. Each module will be filled with International Standard Payload Racks (green) attached to standoff fittings (yellow) that hold the racks in position. Destiny is six racks in length. The MSG is being developed by the European Space Agency and NASA to provide a large working volume for hands-on experiments aboard the International Space Station. Scientists will use the MSG to carry out multidisciplinary studies in combustion science, fluid physics and materials science. The MSG is managed by NASA's Marshall Space Flight Center. (Credit: NASA/Marshall)

  13. Spacelab

    NASA Image and Video Library

    1994-07-08

    This is a Space Shuttle Columbia (STS-65) onboard photo of the second International Microgravity Laboratory (IML-2) in the cargo bay with Earth in the background. Mission objectives of IML-2 were to conduct science and technology investigations that required the low-gravity environment of space, with emphasis on experiments that studied the effects of microgravity on materials processes and living organisms. Materials science and life sciences are two of the most exciting areas of microgravity research because discoveries in these fields could greatly enhance the quality of life on Earth. If the structure of certain proteins can be determined by examining high-quality protein crystals grown in microgravity, advances can be made to improve the treatment of many human diseases. Electronic materials research in space may help us refine processes and make better products, such as computers, lasers, and other high-tech devices. The 14-nation European Space Agency (ESA), the Canadian Space Agency (SCA), the French National Center for Space Studies (CNES), the German Space Agency and the German Aerospace Research Establishment (DARA/DLR), and the National Space Development Agency of Japan (NASDA) participated in developing hardware and experiments for the IML missions. The missions were managed by NASA's Marshall Space Flight Center. The Orbiter Columbia was launched from the Kennedy Space Center on July 8, 1994 for the IML-2 mission.

  14. Carbon in Underland (A "Life at the Frontiers of Energy Research" contest entry from the 2011 Energy Frontier Research Centers (EFRCs) Summit and Forum

    ScienceCinema

    DePaolo, Donald J. (Director, Center for Nanoscale Control of Geologic CO2); NCGC Staff

    2017-12-09

    'Carbon in Underland' was submitted by the Center for Nanoscale Control of Geologic CO2 (NCGC) to the 'Life at the Frontiers of Energy Research' video contest at the 2011 Science for Our Nation's Energy Future: Energy Frontier Research Centers (EFRCs) Summit and Forum. Twenty-six EFRCs created short videos to highlight their mission and their work. This video was selected as one of five winners by a distinguished panel of judges for its 'entertaining animation and engaging explanations of carbon sequestration'. NCGC, an EFRC directed by Donald J. DePaolo at Lawrence Berkeley National Laboratory, is a partnership of scientists from seven institutions: LBNL (lead), Massachusetts Institute of Technology, Lawrence Livermore National Laboratory, Oak Ridge National Laboratory, University of California, Davis, Ohio State University, and Washington University in St. Louis. The Office of Basic Energy Sciences in the U.S. Department of Energy's Office of Science established the 46 Energy Frontier Research Centers (EFRCs) in 2009. These collaboratively-organized centers conduct fundamental research focused on 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The overall purpose is to accelerate scientific progress toward meeting the nation's critical energy challenges. The mission of the Center for Nanoscale Control of Geologic CO2 is 'to use new investigative tools, combined with experiments and computer simulations, to build a fundamental understanding of molecular-to-pore-scale processes in fluid-rock systems, and to demonstrate the ability to control critical aspects of flow, transport, and mineralization in porous rock media as applied to geologic sequestration of CO2'. Research topics are: bio-inspired, CO2 (store), greenhouse gas, and interfacial characterization.

  15. NLSI Focus Group on Missing ALSEP Data Recovery: Progress and Plans

    NASA Technical Reports Server (NTRS)

    Lewis, L. R.; Nakamura, Y.; Nagihara, S.; Williams, D. R.; Chi, P.; Taylor, P. T.; Schmidt, G. K.; Grayzeck, E. J.

    2011-01-01

    On the six Apollo lunar landing missions, the astronauts deployed the Apollo Lunar Surface Experiments Package (ALSEP) science stations, which measured active and passive seismic events, magnetic fields, charged particles, solar wind, heat flow, the diffuse atmosphere, meteorites and their ejecta, lunar dust, etc. Today's scientists are able to extract new information and make new discoveries from the old ALSEP data utilizing recent advances in computer capabilities and new analysis techniques. However, current-day investigators are encountering problems trying to use the ALSEP data. In 2007, archivists from the NASA Goddard Space Flight Center (GSFC) National Space Science Data Center (NSSDC) estimated that only about 50 percent of the processed ALSEP lunar surface data of interest to current lunar science investigators were in the NSSDC archives. The current-day lunar science investigators found most of the ALSEP data then in the NSSDC archives extremely difficult to use. The data were in forms often not well described in the published reports, and rerecording anomalies existed in the data which could only be resolved by tape experts. To resolve this problem, the PDS Lunar Data Node was established in 2008 at NSSDC and is in the process of successfully making the existing archived ALSEP data available to current-day investigators in easily useable forms. In July of 2010, the NASA Lunar Science Institute (NLSI) at Ames Research Center established the Recovery of Missing ALSEP Data Focus Group in recognition of the importance of the current activities to find the raw and processed ALSEP data missing from the NSSDC archives.

  16. High Performance Computing Facility Operational Assessment, FY 2011 Oak Ridge Leadership Computing Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, Ann E; Bland, Arthur S Buddy; Hack, James J

    Oak Ridge National Laboratory's Leadership Computing Facility (OLCF) continues to deliver the most powerful resources in the U.S. for open science. At 2.33 petaflops peak performance, the Cray XT Jaguar delivered more than 1.5 billion core hours in calendar year (CY) 2010 to researchers around the world for computational simulations relevant to national and energy security; advancing the frontiers of knowledge in physical sciences and areas of biological, medical, environmental, and computer sciences; and providing world-class research facilities for the nation's science enterprise. Scientific achievements by OLCF users range from collaboration with university experimentalists to produce a working supercapacitor that uses atom-thick sheets of carbon materials, to finely determining the resolution requirements for simulations of coal gasifiers and their components, thus laying the foundation for development of commercial-scale gasifiers. OLCF users are pushing the boundaries with software applications sustaining more than one petaflop of performance in the quest to illuminate the fundamental nature of electronic devices. Other teams of researchers are working to resolve predictive capabilities of climate models, to refine and validate genome sequencing, and to explore the most fundamental materials in nature - quarks and gluons - and their unique properties. Details of these scientific endeavors - not possible without access to leadership-class computing resources - are detailed in Section 4 of this report and in the INCITE in Review. Effective operations of the OLCF play a key role in the scientific missions and accomplishments of its users. This Operational Assessment Report (OAR) will delineate the policies, procedures, and innovations implemented by the OLCF to continue delivering a petaflop-scale resource for cutting-edge research. The 2010 operational assessment of the OLCF yielded recommendations that have been addressed (Reference Section 1) and, where appropriate, changes in Center metrics were introduced. This report covers CY 2010 and CY 2011 Year to Date (YTD), which, unless otherwise specified, denotes January 1, 2011 through June 30, 2011. User Support remains an important element of OLCF operations, with the philosophy of doing 'whatever it takes' to enable successful research. The impact of this center-wide activity is reflected by the user survey results, which show users are 'very satisfied.' The OLCF continues to aggressively pursue outreach and training activities to promote awareness - and effective use - of U.S. leadership-class resources (Reference Section 2). The OLCF continues to meet and in many cases exceed DOE metrics for capability usage (35% target in CY 2010, delivered 39%; 40% target in CY 2011, delivered 54% from January 1, 2011 through June 30, 2011). The Schedule Availability (SA) and Overall Availability (OA) for Jaguar were exceeded in CY 2010. Given the solution to the VRM problem, the SA and OA for Jaguar in CY 2011 are expected to exceed the target metrics of 95% and 90%, respectively (Reference Section 3). Numerous and wide-ranging research accomplishments, scientific support, and technological innovations are more fully described in Sections 4 and 6 and reflect OLCF leadership in enabling high-impact science solutions and vision in creating an exascale-ready center. Financial Management (Section 5) and Risk Management (Section 7) are carried out using best practices approved by DOE. The OLCF has a valid cyber security plan and Authority to Operate (Section 8). The proposed metrics for 2012 are reflected in Section 9.

  17. Leveraging Campus Network Capabilities at the Desktop: Helping Users Get Real Work Done or How Windows Sockets & MacTCP Changed My Life.

    ERIC Educational Resources Information Center

    Ezekiel, Aaron B.

    At the University of New Mexico, stakeholders from the Computer and Information Resources and Technology (CIRT) department, Financial Systems, the Health Sciences Center, and the General Libraries, were involved in deciding on the goals of a project to replace Telnet with a suite of network middleware and productivity software on campus computer…

  18. EarthExplorer

    USGS Publications Warehouse

    Houska, Treva

    2012-01-01

    The EarthExplorer trifold provides basic information for on-line access to remotely-sensed data from the U.S. Geological Survey Earth Resources Observation and Science (EROS) Center archive. The EarthExplorer (http://earthexplorer.usgs.gov/) client/server interface allows users to search and download aerial photography, satellite data, elevation data, land-cover products, and digitized maps. Minimum computer system requirements and customer service contact information also are included in the brochure.

  19. Recent Naval Postgraduate School Publications.

    DTIC Science & Technology

    1980-04-01

    Numerical models of ocean circulation and climate interaction, Revs. of Geophys. and Space Phys., vol. 17, no. 7, p. 1494-1507 (1979) Haney, R... NAVAL POSTGRADUATE SCHOOL Monterey, California DEPARTMENT OF COMPUTER SCIENCE CONFERENCE PRESENTATIONS Bradley, G H Energy modelling with network optimization... Systems Analysis, Sept., 97 Bradley, G H; Brown, G G Network optimization and defense modeling Center for Nav. Analyses, Arlington, Va., Aug., 1976

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garzoglio, Gabriele

    The Fermilab Grid and Cloud Computing Department and the KISTI Global Science experimental Data hub Center propose a joint project. The goals are to enable scientific workflows of stakeholders to run on multiple cloud resources by use of (a) Virtual Infrastructure Automation and Provisioning, (b) Interoperability and Federation of Cloud Resources, and (c) High-Throughput Fabric Virtualization. This is a matching fund project in which Fermilab and KISTI will contribute equal resources.
