Sample records for advanced computing laboratory

  1. Computer laboratory in medical education for medical students.

    PubMed

    Hercigonja-Szekeres, Mira; Marinović, Darko; Kern, Josipa

    2009-01-01

    Five generations of second-year students at the Zagreb University School of Medicine were interviewed through an anonymous questionnaire on their use of personal computers, the Internet, computer laboratories and computer-assisted education in general. Results show an advance in students' usage of information and communication technology during the period from 1998/99 to 2002/03. However, their positive opinion of the computer laboratory depends on its installed capacity: the better the computer laboratory technology, the better the students' acceptance and use of it.

  2. Using an Advanced Computational Laboratory Experiment to Extend and Deepen Physical Chemistry Students' Understanding of Atomic Structure

    ERIC Educational Resources Information Center

    Hoffman, Gary G.

    2015-01-01

    A computational laboratory experiment is described, which involves the advanced study of an atomic system. The students use concepts and techniques typically covered in a physical chemistry course but extend those concepts and techniques to more complex situations. The students get a chance to explore the study of atomic states and perform…

  3. Advanced CNC and CAM Series. Educational Resources for the Machine Tool Industry. Course Syllabi, Instructor's Handbook [and] Student Laboratory Manual.

    ERIC Educational Resources Information Center

    Texas State Technical Coll. System, Waco.

    This package consists of course syllabi, an instructor's handbook, and a student laboratory manual for a 1-year vocational training program to prepare students for entry-level positions as advanced computer numerical control (CNC) and computer-assisted manufacturing (CAM) technicians. The program was developed through a modification of the DACUM…

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The Computing and Communications (C) Division is responsible for the Laboratory's Integrated Computing Network (ICN) as well as Laboratory-wide communications. Our computing network, used by 8,000 people distributed throughout the nation, constitutes one of the most powerful scientific computing facilities in the world. In addition to the stable production environment of the ICN, we have taken a leadership role in high-performance computing and have established the Advanced Computing Laboratory (ACL), the site of research on experimental, massively parallel computers; high-speed communication networks; distributed computing; and a broad variety of advanced applications. The computational resources available in the ACL are of the type needed to solve problems critical to national needs, the so-called "Grand Challenge" problems. The purpose of this publication is to inform our clients of our strategic and operating plans in these important areas. We review major accomplishments since late 1990 and describe our strategic planning goals and specific projects that will guide our operations over the next few years. Our mission statement, planning considerations, and management policies and practices are also included.

  5. Final Report National Laboratory Professional Development Workshop for Underrepresented Participants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taylor, Valerie

    The 2013 CMD-IT National Laboratories Professional Development Workshop for Underrepresented Participants (CMD-IT NLPDev 2013) was held at the Oak Ridge National Laboratory campus in Oak Ridge, TN, from June 13-14, 2013. Sponsored by the Department of Energy (DOE) Advanced Scientific Computing Research Program, the primary goal of these workshops is to provide information about career opportunities in computational science at the various national laboratories and to mentor the underrepresented participants through community building and expert presentations focused on career success. This second annual workshop offered sessions to facilitate career advancement and, in particular, the strategies and resources needed to be successful at the national laboratories.

  6. Extreme-Scale Computing Project Aims to Advance Precision Oncology | Frederick National Laboratory for Cancer Research

    Cancer.gov

    Two government agencies and five national laboratories are collaborating to develop extremely high-performance computing capabilities that will analyze mountains of research and clinical data to improve scientific understanding of cancer, predict drug response, and improve treatments for patients.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The Computing and Communications (C) Division is responsible for the Laboratory's Integrated Computing Network (ICN) as well as Laboratory-wide communications. Our computing network, used by 8,000 people distributed throughout the nation, constitutes one of the most powerful scientific computing facilities in the world. In addition to the stable production environment of the ICN, we have taken a leadership role in high-performance computing and have established the Advanced Computing Laboratory (ACL), the site of research on experimental, massively parallel computers; high-speed communication networks; distributed computing; and a broad variety of advanced applications. The computational resources available in the ACL are of the type needed to solve problems critical to national needs, the so-called "Grand Challenge" problems. The purpose of this publication is to inform our clients of our strategic and operating plans in these important areas. We review major accomplishments since late 1990 and describe our strategic planning goals and specific projects that will guide our operations over the next few years. Our mission statement, planning considerations, and management policies and practices are also included.

  8. Teaching Advanced Concepts in Computer Networks: VNUML-UM Virtualization Tool

    ERIC Educational Resources Information Center

    Ruiz-Martinez, A.; Pereniguez-Garcia, F.; Marin-Lopez, R.; Ruiz-Martinez, P. M.; Skarmeta-Gomez, A. F.

    2013-01-01

    In the teaching of computer networks, the main problem that arises is the high price and limited number of network devices the students can work with in the laboratories. Nowadays, with virtualization we can overcome this limitation. In this paper, we present a methodology that allows students to learn advanced computer network concepts through…
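
    A hedged illustration of the same idea: the record describes the VNUML-UM tool, but the sketch below instead uses Mininet, a different widely used open-source network emulator (my assumption, not the record's tool), purely as an example of teaching a topology through virtualization:

      # Illustrative only: emulate a two-host topology on a single machine.
      # Mininet stands in here for the record's VNUML-UM tool.
      from mininet.net import Mininet
      from mininet.topo import Topo

      class TwoHostTopo(Topo):
          """Two hosts attached to one switch."""
          def build(self):
              h1 = self.addHost("h1")
              h2 = self.addHost("h2")
              s1 = self.addSwitch("s1")
              self.addLink(h1, s1)
              self.addLink(h2, s1)

      net = Mininet(topo=TwoHostTopo())
      net.start()
      net.pingAll()  # students verify end-to-end connectivity
      net.stop()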

  9. COMPUTATIONAL TOXICOLOGY ADVANCES: EMERGING CAPABILITIES FOR DATA EXPLORATION AND SAR MODEL DEVELOPMENT

    EPA Science Inventory

    Computational Toxicology Advances: Emerging capabilities for data exploration and SAR model development
    Ann M. Richard and ClarLynda R. Williams, National Health & Environmental Effects Research Laboratory, US EPA, Research Triangle Park, NC, USA; email: richard.ann@epa.gov

  10. Elementary and Advanced Computer Projects for the Physics Classroom and Laboratory

    DTIC Science & Technology

    1992-12-01

    are SPF/PC, MS Word, n3, Symphony, Mathematics, and FORTRAN. The authors' programs assist data analysis in particular laboratory experiments and make use of the Monte Carlo and other numerical techniques in computer simulation and...the language of science and engineering in industry and government laboratories (although C is becoming a powerful competitor). RM/FORTRAN (cost $400

  11. Advanced Algebra and Calculus. High School Mathematics Curricula. Instructor's Guide.

    ERIC Educational Resources Information Center

    Natour, Denise M.

    This manual is an instructor's guide for the utilization of the "CCA High School Mathematics Curricula: Advanced Algebra and Calculus" courseware developed by the Computer-based Education Research Laboratory (CERL). The curriculum comprises 34 algebra lessons within 12 units and 15 calculus lessons that are computer-based and require…

  12. ISCR Annual Report: Fiscal Year 2004

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McGraw, J R

    2005-03-03

    Large-scale scientific computation and all of the disciplines that support and help to validate it have been placed at the focus of Lawrence Livermore National Laboratory (LLNL) by the Advanced Simulation and Computing (ASC) program of the National Nuclear Security Administration (NNSA) and the Scientific Discovery through Advanced Computing (SciDAC) initiative of the Office of Science of the Department of Energy (DOE). The maturation of computational simulation as a tool of scientific and engineering research is underscored in the November 2004 statement of the Secretary of Energy that "high performance computing is the backbone of the nation's science and technology enterprise". LLNL operates several of the world's most powerful computers--including today's single most powerful--and has undertaken some of the largest and most compute-intensive simulations ever performed. Ultrascale simulation has been identified as one of the highest priorities in DOE's facilities planning for the next two decades. However, computers at architectural extremes are notoriously difficult to use efficiently. Furthermore, each successful terascale simulation only points out the need for much better ways of interacting with the resulting avalanche of data. Advances in scientific computing research have, therefore, never been more vital to LLNL's core missions than at present. Computational science is evolving so rapidly along every one of its research fronts that to remain on the leading edge, LLNL must engage researchers at many academic centers of excellence. In Fiscal Year 2004, the Institute for Scientific Computing Research (ISCR) served as one of LLNL's main bridges to the academic community with a program of collaborative subcontracts, visiting faculty, student internships, workshops, and an active seminar series. The ISCR identifies researchers from the academic community for computer science and computational science collaborations with LLNL and hosts them for short- and long-term visits with the aim of encouraging long-term academic research agendas that address LLNL's research priorities. Through such collaborations, ideas and software flow in both directions, and LLNL cultivates its future workforce. The Institute strives to be LLNL's "eyes and ears" in the computer and information sciences, keeping the Laboratory aware of and connected to important external advances. It also attempts to be the "feet and hands" that carry those advances into the Laboratory and incorporates them into practice. ISCR research participants are integrated into LLNL's Computing and Applied Research (CAR) Department, especially into its Center for Applied Scientific Computing (CASC). In turn, these organizations address computational challenges arising throughout the rest of the Laboratory. Administratively, the ISCR flourishes under LLNL's University Relations Program (URP). Together with the other five institutes of the URP, it navigates a course that allows LLNL to benefit from academic exchanges while preserving national security. While it is difficult to operate an academic-like research enterprise within the context of a national security laboratory, the results declare the challenges well met and worth the continued effort.

  13. Development and Evaluation of Computer-Based Laboratory Practical Learning Tool

    ERIC Educational Resources Information Center

    Gandole, Y. B.

    2006-01-01

    Effective evaluation of educational software is a key issue for successful introduction of advanced tools in the curriculum. This paper details the development and evaluation of a tool for computer-assisted learning in science laboratory courses. The process was based on the generic instructional system design model. Various categories of educational…

  14. High performance computing for advanced modeling and simulation of materials

    NASA Astrophysics Data System (ADS)

    Wang, Jue; Gao, Fei; Vazquez-Poletti, Jose Luis; Li, Jianjiang

    2017-02-01

    The First International Workshop on High Performance Computing for Advanced Modeling and Simulation of Materials (HPCMS2015) was held in Austin, Texas, USA, Nov. 18, 2015. HPCMS 2015 was organized by Computer Network Information Center (Chinese Academy of Sciences), University of Michigan, Universidad Complutense de Madrid, University of Science and Technology Beijing, Pittsburgh Supercomputing Center, China Institute of Atomic Energy, and Ames Laboratory.

  15. ISCR FY2005 Annual Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keyes, D E; McGraw, J R

    2006-02-02

    Large-scale scientific computation and all of the disciplines that support and help validate it have been placed at the focus of Lawrence Livermore National Laboratory (LLNL) by the Advanced Simulation and Computing (ASC) program of the National Nuclear Security Administration (NNSA) and the Scientific Discovery through Advanced Computing (SciDAC) initiative of the Office of Science of the Department of Energy (DOE). The maturation of simulation as a fundamental tool of scientific and engineering research is underscored in the President's Information Technology Advisory Committee (PITAC) June 2005 finding that "computational science has become critical to scientific leadership, economic competitiveness, and national security". LLNL operates several of the world's most powerful computers--including today's single most powerful--and has undertaken some of the largest and most compute-intensive simulations ever performed, most notably the molecular dynamics simulation that sustained more than 100 Teraflop/s and won the 2005 Gordon Bell Prize. Ultrascale simulation has been identified as one of the highest priorities in DOE's facilities planning for the next two decades. However, computers at architectural extremes are notoriously difficult to use in an efficient manner. Furthermore, each successful terascale simulation only points out the need for much better ways of interacting with the resulting avalanche of data. Advances in scientific computing research have, therefore, never been more vital to the core missions of LLNL than at present. Computational science is evolving so rapidly along every one of its research fronts that to remain on the leading edge, LLNL must engage researchers at many academic centers of excellence. In FY 2005, the Institute for Scientific Computing Research (ISCR) served as one of LLNL's main bridges to the academic community with a program of collaborative subcontracts, visiting faculty, student internships, workshops, and an active seminar series. The ISCR identifies researchers from the academic community for computer science and computational science collaborations with LLNL and hosts them for both brief and extended visits with the aim of encouraging long-term academic research agendas that address LLNL research priorities. Through these collaborations, ideas and software flow in both directions, and LLNL cultivates its future workforce. The Institute strives to be LLNL's "eyes and ears" in the computer and information sciences, keeping the Laboratory aware of and connected to important external advances. It also attempts to be the "hands and feet" that carry those advances into the Laboratory and incorporate them into practice. ISCR research participants are integrated into LLNL's Computing Applications and Research (CAR) Department, especially into its Center for Applied Scientific Computing (CASC). In turn, these organizations address computational challenges arising throughout the rest of the Laboratory. Administratively, the ISCR flourishes under LLNL's University Relations Program (URP). Together with the other four institutes of the URP, the ISCR navigates a course that allows LLNL to benefit from academic exchanges while preserving national security. While it is difficult to operate an academic-like research enterprise within the context of a national security laboratory, the results declare the challenges well met and worth the continued effort.
    The pages of this annual report summarize the activities of the faculty members, postdoctoral researchers, students, and guests from industry and other laboratories who participated in LLNL's computational mission under the auspices of the ISCR during FY 2005.

  16. A Case against Computer Symbolic Manipulation in School Mathematics Today.

    ERIC Educational Resources Information Center

    Waits, Bert K.; Demana, Franklin

    1992-01-01

    Presented are two reasons discouraging the use of computer symbolic manipulation systems in school mathematics at present: the cost of computer laboratories or expensive pocket computers, and the impracticality of exact solution representations. Although development of this technology for mathematics education continues to advance, graphing calculators are recommended to…

  17. Software development to support sensor control of robot arc welding

    NASA Technical Reports Server (NTRS)

    Silas, F. R., Jr.

    1986-01-01

    The development of software for a Digital Equipment Corporation MINC-23 Laboratory Computer to provide functions of a workcell host computer for Space Shuttle Main Engine (SSME) robotic welding is documented. Routines were written to transfer robot programs between the MINC and an Advanced Robotic Cyro 750 welding robot. Other routines provide advanced program editing features while additional software allows communication with a remote computer aided design system. Access to special robot functions was provided to allow advanced control of weld seam tracking and process control for future development programs.

  18. Physical and Chemical Properties of the Copper-Alanine System: An Advanced Laboratory Project

    ERIC Educational Resources Information Center

    Farrell, John J.

    1977-01-01

    An integrated physical-analytical-inorganic chemistry laboratory procedure for use with undergraduate biology majors is described. The procedure requires five to six laboratory periods and includes acid-base standardizations, potentiometric determinations, computer usage, spectrophotometric determinations of crystal-field splitting…

  19. CFL3D: Its History and Some Recent Applications

    NASA Technical Reports Server (NTRS)

    Rumsey, C. L.; Biedron, R. T.; Thomas, J. L.

    1997-01-01

    The history of the Computational Fluids Laboratory 3-D (CFL3D) Navier-Stokes computer code is discussed and a comprehensive reference list is given. Three recent advanced applications are presented: (1) a wing with a partial-span flap, (2) an F/A-18 with a forebody control strake, and (3) noise predictions for an advanced ducted propeller turbomachinery flow.

  20. Computation Directorate Annual Report 2003

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crawford, D L; McGraw, J R; Ashby, S F

    Big computers are icons: symbols of the culture, and of the larger computing infrastructure that exists at Lawrence Livermore. Through the collective effort of Laboratory personnel, they enable scientific discovery and engineering development on an unprecedented scale. For more than three decades, the Computation Directorate has supplied the big computers that enable the science necessary for Laboratory missions and programs. Livermore supercomputing is uniquely mission driven. The high-fidelity weapon simulation capabilities essential to the Stockpile Stewardship Program compel major advances in weapons codes and science, compute power, and computational infrastructure. Computation's activities align with this vital mission of the Department of Energy. Increasingly, non-weapons Laboratory programs also rely on computer simulation. World-class achievements have been accomplished by LLNL specialists working in multi-disciplinary research and development teams. In these teams, Computation personnel employ a wide array of skills, from desktop support expertise, to complex applications development, to advanced research. Computation's skilled professionals make the Directorate the success that it has become. These individuals know the importance of the work they do and the many ways it contributes to Laboratory missions. They make appropriate and timely decisions that move the entire organization forward. They make Computation a leader in helping LLNL achieve its programmatic milestones. I dedicate this inaugural Annual Report to the people of Computation in recognition of their continuing contributions. I am proud that we perform our work securely and safely. Despite increased cyber attacks on our computing infrastructure from the Internet, advanced cyber security practices ensure that our computing environment remains secure. Through Integrated Safety Management (ISM) and diligent oversight, we address safety issues promptly and aggressively. The safety of our employees, whether at work or at home, is a paramount concern. Even as the Directorate meets today's supercomputing requirements, we are preparing for the future. We are investigating open-source cluster technology, the basis of our highly successful Multiprogrammatic Capability Resource (MCR). Several breakthrough discoveries have resulted from MCR calculations coupled with theory and experiment, prompting Laboratory scientists to demand ever-greater capacity and capability. This demand is being met by a new 23-TF system, Thunder, with architecture modeled on MCR. In preparation for the "after-next" computer, we are researching technology even farther out on the horizon--cell-based computers. Assuming that the funding and the technology hold, we will acquire the cell-based machine BlueGene/L within the next 12 months.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Manteuffel, T.A.

    The objective of this project is the development of numerical solution techniques for deterministic models of the transport of neutral and charged particles and the demonstration of their effectiveness in both a production environment and on advanced architecture computers. The primary focus is on various versions of the linear Boltzmann equation. These equations are fundamental in many important applications. This project is an attempt to integrate the development of numerical algorithms with the process of developing production software. A major thrust of this project will be the implementation of these algorithms on advanced architecture machines that reside at the Advanced Computing Laboratory (ACL) at Los Alamos National Laboratory (LANL).
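
    For reference (a standard textbook form, not quoted from the record), the steady-state, mono-energetic linear Boltzmann transport equation that such deterministic solvers discretize can be written as

      \Omega \cdot \nabla \psi(\mathbf{r}, \Omega) + \sigma_t(\mathbf{r}) \, \psi(\mathbf{r}, \Omega)
        = \int_{4\pi} \sigma_s(\mathbf{r}, \Omega' \cdot \Omega) \, \psi(\mathbf{r}, \Omega') \, d\Omega' + q(\mathbf{r}, \Omega)

    where \psi is the angular flux, \sigma_t and \sigma_s are the total and scattering cross sections, and q is an external source; deterministic codes typically replace the angular integral with a quadrature over discrete ordinates.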

  2. Some Hail 'Computational Science' as Biggest Advance Since Newton, Galileo.

    ERIC Educational Resources Information Center

    Turner, Judith Axler

    1987-01-01

    Computational science is defined as science done on a computer. A computer can serve as a laboratory for researchers who cannot experiment with their subjects, and as a calculator for those who otherwise might need centuries to solve some problems mathematically. The National Science Foundation's support of supercomputers is discussed. (MLW)

  3. Computer Aided Design: Instructional Manual. The North Dakota High Technology Mobile Laboratory Project.

    ERIC Educational Resources Information Center

    Cheng, Wan-Lee

    This instructional manual contains 12 learning activity packets for use in a workshop in computer-aided design and drafting (CADD). The lessons cover the following topics: introduction to computer graphics and computer-aided design/drafting; coordinate systems; advanced space graphics hardware configuration and basic features of the IBM PC…

  4. Extreme-Scale Computing Project Aims to Advance Precision Oncology | FNLCR Staging

    Cancer.gov

    Two government agencies and five national laboratories are collaborating to develop extremely high-performance computing capabilities that will analyze mountains of research and clinical data to improve scientific understanding of cancer, predict drug response, and improve treatments for patients.

  5. Computing, Environment and Life Sciences | Argonne National Laboratory

    Science.gov Websites

  6. Abstract - Cooperative Research and Development Agreement between Ames National Laboratory and National Energy Technology Laboratory AGMT-0609

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bryden, Mark; Tucker, David A.

    The goal of this project is to develop a merged environment for simulation and analysis (MESA) at the National Energy Technology Laboratory’s (NETL) Hybrid Performance (Hyper) project laboratory. The MESA sensor lab developed as a component of this research will provide a development platform for investigating: 1) advanced control strategies, 2) testing and development of sensor hardware, 3) various modeling in-the-loop algorithms and 4) other advanced computational algorithms for improved plant performance using sensors, real-time models, and complex systems tools.

  7. Delivering Insight The History of the Accelerated Strategic Computing Initiative

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Larzelere II, A R

    2007-01-03

    The history of the Accelerated Strategic Computing Initiative (ASCI) tells of the development of computational simulation into a third fundamental piece of the scientific method, on a par with theory and experiment. ASCI did not invent the idea, nor was it alone in bringing it to fruition. But ASCI provided the wherewithal - hardware, software, environment, funding, and, most of all, the urgency - that made it happen. On October 1, 2005, the Initiative completed its tenth year of funding. The advances made by ASCI over its first decade are truly incredible. Lawrence Livermore, Los Alamos, and Sandia National Laboratories, along with leadership provided by the Department of Energy's Defense Programs Headquarters, fundamentally changed computational simulation and how it is used to enable scientific insight. To do this, astounding advances were made in simulation applications, computing platforms, and user environments. ASCI dramatically changed existing - and forged new - relationships, both among the Laboratories and with outside partners. By its tenth anniversary, despite daunting challenges, ASCI had accomplished all of the major goals set at its beginning. The history of ASCI is about the vision, leadership, endurance, and partnerships that made these advances possible.

  8. Sandia National Laboratories: Advanced Simulation and Computing

    Science.gov Websites

  9. 1996 Laboratory directed research and development annual report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meyers, C.E.; Harvey, C.L.; Lopez-Andreas, L.M.

    This report summarizes progress from the Laboratory Directed Research and Development (LDRD) program during fiscal year 1996. In addition to a programmatic and financial overview, the report includes progress reports from 259 individual R&D projects in seventeen categories. The general areas of research include: engineered processes and materials; computational and information sciences; microelectronics and photonics; engineering sciences; pulsed power; advanced manufacturing technologies; biomedical engineering; energy and environmental science and technology; advanced information technologies; counterproliferation; advanced transportation; national security technology; electronics technologies; idea exploration and exploitation; production; and science at the interfaces - engineering with atoms.

  10. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan. Part 1: ASC software quality engineering practices, Version 2.0.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sturtevant, Judith E.; Heaphy, Robert; Hodges, Ann Louise

    2006-09-01

    The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR 1.3.2 and 1.3.6 and to a Department of Energy document, ASCI Software Quality Engineering: Goals, Principles, and Guidelines. This document also identifies ASC management and software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals.

  11. Extreme-Scale Computing Project Aims to Advance Precision Oncology | Poster

    Cancer.gov

    Two government agencies and five national laboratories are collaborating to develop extremely high-performance computing capabilities that will analyze mountains of research and clinical data to improve scientific understanding of cancer, predict drug response, and improve treatments for patients.

  12. Science & Technology Review: September 2016

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vogt, Ramona L.; Meissner, Caryn N.; Chinn, Ken B.

    2016-09-30

    This is the September issue of the Lawrence Livermore National Laboratory's Science & Technology Review, which communicates, to a broad audience, the Laboratory’s scientific and technological accomplishments in fulfilling its primary missions. This month, there are features on "Laboratory Investments Drive Computational Advances" and "Laying the Groundwork for Extreme-Scale Computing." Research highlights include "Nuclear Data Moves into the 21st Century", "Peering into the Future of Lick Observatory", and "Facility Drives Hydrogen Vehicle Innovations."

  13. Virtual Transgenics: Using a Molecular Biology Simulation to Impact Student Academic Achievement and Attitudes

    NASA Astrophysics Data System (ADS)

    Shegog, Ross; Lazarus, Melanie M.; Murray, Nancy G.; Diamond, Pamela M.; Sessions, Nathalie; Zsigmond, Eva

    2012-10-01

    The transgenic mouse model is useful for studying the causes and potential cures for human genetic diseases. Exposing high school biology students to laboratory experience in developing transgenic animal models is logistically prohibitive. Computer-based simulation, however, offers this potential in addition to advantages of fidelity and reach. This study describes and evaluates a computer-based simulation that trains advanced placement high school science students in the laboratory protocols used to produce a transgenic mouse model. A simulation module on preparing a gene construct in the molecular biology lab was evaluated using a randomized clinical control design with advanced placement high school biology students in Mercedes, Texas (n = 44). Pre-post tests assessed procedural and declarative knowledge, time on task, attitudes toward computers for learning and towards science careers. Students who used the simulation increased their procedural and declarative knowledge regarding molecular biology compared to those in the control condition (both p < 0.005). Significant increases continued to occur with additional use of the simulation (p < 0.001). Students in the treatment group became more positive toward using computers for learning (p < 0.001). The simulation did not significantly affect attitudes toward science in general. Computer simulation of complex transgenic protocols has the potential to provide a "virtual" laboratory experience as an adjunct to conventional educational approaches.

  14. Conversing with Computers

    NASA Technical Reports Server (NTRS)

    2004-01-01

    I/NET, Inc., is making the dream of natural human-computer conversation a practical reality. Through a combination of advanced artificial intelligence research and practical software design, I/NET has taken the complexity out of developing advanced, natural language interfaces. Conversational capabilities like pronoun resolution, anaphora and ellipsis processing, and dialog management that were once available only in the laboratory can now be brought to any application with any speech recognition system using I/NET's conversational engine middleware.

  15. Cielo Computational Environment Usage Model With Mappings to ACE Requirements for the General Availability User Environment Capabilities Release Version 1.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vigil, Benny Manuel; Ballance, Robert; Haskell, Karen

    Cielo is a massively parallel supercomputer funded by the DOE/NNSA Advanced Simulation and Computing (ASC) program, and operated by the Alliance for Computing at Extreme Scale (ACES), a partnership between Los Alamos National Laboratory (LANL) and Sandia National Laboratories (SNL). The primary Cielo compute platform is physically located at Los Alamos National Laboratory. This Cielo Computational Environment Usage Model documents the capabilities and the environment to be provided for the Q1 FY12 Level 2 Cielo Capability Computing (CCC) Platform Production Readiness Milestone. This document describes specific capabilities, tools, and procedures to support both local and remote users. The model is focused on the needs of the ASC user working in the secure computing environments at Lawrence Livermore National Laboratory (LLNL), Los Alamos National Laboratory, or Sandia National Laboratories, but also addresses the needs of users working in the unclassified environment. The Cielo Computational Environment Usage Model maps the provided capabilities to the tri-Lab ASC Computing Environment (ACE) Version 8.0 requirements. The ACE requirements reflect the high performance computing requirements for the Production Readiness Milestone user environment capabilities of the ASC community. A description of ACE requirements met, and those requirements that are not met, is included in each section of this document. The Cielo Computing Environment, along with the ACE mappings, has been issued and reviewed throughout the tri-Lab community.

  16. Cosmochemistry: Understanding the Solar System through analysis of extraterrestrial materials.

    PubMed

    MacPherson, Glenn J; Thiemens, Mark H

    2011-11-29

    Cosmochemistry is the chemical analysis of extraterrestrial materials. This term generally is taken to mean laboratory analysis, which is the cosmochemistry gold standard because of the ability for repeated analysis under highly controlled conditions using the most advanced instrumentation unhindered by limitations in power, space, or environment. Over the past 40 y, advances in technology have enabled telescopic and spacecraft instruments to provide important data that significantly complement the laboratory data. In this special edition, recent advances in the state of the art of cosmochemistry are presented, which range from instrumental analysis of meteorites to theoretical-computational and astronomical observations.

  17. (U) Status of Trinity and Crossroads Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Archer, Billy Joe; Lujan, James Westley; Hemmert, K. S.

    2017-01-10

    (U) This paper provides a general overview of current and future plans for the Advanced Simulation and Computing (ASC) Advanced Technology (AT) systems fielded by the New Mexico Alliance for Computing at Extreme Scale (ACES), a collaboration between Los Alamos National Laboratory and Sandia National Laboratories. Additionally, this paper touches on research into technology beyond traditional CMOS. The status of Trinity, ASC's first AT system, and of Crossroads, anticipated to succeed Trinity as the third AT system in 2020, will be presented, along with initial performance studies of the Intel Knights Landing Xeon Phi processors introduced on Trinity. The challenges and opportunities for our production simulation codes on AT systems will also be discussed. Trinity and Crossroads are a joint procurement by ACES and Lawrence Berkeley Laboratory as part of the Alliance for application Performance at EXtreme scale (APEX), http://apex.lanl.gov.

  18. Advanced information processing system

    NASA Technical Reports Server (NTRS)

    Lala, J. H.

    1984-01-01

    Design and performance details of the advanced information processing system (AIPS) for fault and damage tolerant data processing on aircraft and spacecraft are presented. AIPS comprises several computers distributed throughout the vehicle and linked by a damage tolerant data bus. Most I/O functions are available to all the computers, which run in a TDMA mode. Each computer performs separate specific tasks in normal operation and assumes other tasks in degraded modes. Redundant software assures that all fault monitoring, logging and reporting are automated, together with control functions. Redundant duplex links and damage-spread limitation provide the fault tolerance. Details of an advanced design of a laboratory-scale proof-of-concept system are described, including functional operations.

  19. Lab4CE: A Remote Laboratory for Computer Education

    ERIC Educational Resources Information Center

    Broisin, Julien; Venant, Rémi; Vidal, Philippe

    2017-01-01

    Remote practical activities have been demonstrated to be efficient when learners come to acquire inquiry skills. In computer science education, virtualization technologies are gaining popularity as this technological advance enables instructors to implement realistic practical learning activities, and learners to engage in authentic and…

  20. Two Crystallographic Laboratory and Computational Exercises for Undergraduates.

    ERIC Educational Resources Information Center

    Lessinger, Leslie

    1988-01-01

    Describes two introductory exercises designed to teach the fundamental ideas and methods of crystallography, and to convey some important features of inorganic and organic crystal structures to students in an advanced laboratory course. Exercises include "The Crystal Structure of NiO" and "The Crystal Structure of Beta-Fumaric Acid." (CW)
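
    As a worked illustration of the kind of computation such an exercise involves (the lattice parameter and wavelength below are approximate assumed values, not taken from the record), Bragg's law applied to rock-salt NiO in Python:

      import math

      # d(hkl) = a / sqrt(h^2 + k^2 + l^2) for a cubic crystal;
      # Bragg's law: lambda = 2 * d * sin(theta).
      a = 4.177            # approximate NiO lattice parameter, Angstroms (assumed)
      wavelength = 1.5406  # Cu K-alpha radiation, Angstroms

      for h, k, l in [(1, 1, 1), (2, 0, 0), (2, 2, 0)]:
          d = a / math.sqrt(h**2 + k**2 + l**2)
          theta = math.degrees(math.asin(wavelength / (2 * d)))
          print(f"({h}{k}{l}): d = {d:.3f} A, 2-theta = {2 * theta:.2f} deg")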

  1. Using Free Computational Resources to Illustrate the Drug Design Process in an Undergraduate Medicinal Chemistry Course

    ERIC Educational Resources Information Center

    Rodrigues, Ricardo P.; Andrade, Saulo F.; Mantoani, Susimaire P.; Eifler-Lima, Vera L.; Silva, Vinicius B.; Kawano, Daniel F.

    2015-01-01

    Advances in, and dissemination of, computer technologies in the field of drug research now enable the use of molecular modeling tools to teach important concepts of drug design to chemistry and pharmacy students. A series of computer laboratories is described to introduce undergraduate students to commonly adopted "in silico" drug design…

  2. Computation Directorate and Science & Technology Review: Computational Science and Research Featured in 2002

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alchorn, A L

    Thank you for your interest in the activities of the Lawrence Livermore National Laboratory Computation Directorate. This collection of articles from the Laboratory's Science & Technology Review highlights the most significant computational projects, achievements, and contributions during 2002. In 2002, LLNL marked the 50th anniversary of its founding. Scientific advancement in support of our national security mission has always been the core of the Laboratory. So that researchers could better understand and predict complex physical phenomena, the Laboratory has pushed the limits of the largest, fastest, most powerful computers in the world. In the late 1950's, Edward Teller--one of the LLNL founders--proposed that the Laboratory commission a Livermore Advanced Research Computer (LARC) built to Livermore's specifications. He tells the story of being in Washington, DC, when John Von Neumann asked to talk about the LARC. He thought Teller wanted too much memory in the machine. (The specifications called for 20-30,000 words.) Teller was too smart to argue with him. Later Teller invited Von Neumann to the Laboratory and showed him one of the design codes being prepared for the LARC. He asked Von Neumann for suggestions on fitting the code into 10,000 words of memory, and flattered him about "Labbies" not being smart enough to figure it out. Von Neumann dropped his objections, and the LARC arrived with 30,000 words of memory. Memory, and how close memory is to the processor, is still of interest to us today. Livermore's first supercomputer was the Remington-Rand Univac-1. It had 5600 vacuum tubes and was 2 meters wide by 4 meters long. This machine was commonly referred to as a 1 KFlop machine [E+3]. Skip ahead 50 years. The ASCI White machine at the Laboratory today, produced by IBM, is rated at a peak performance of 12.3 TFlops or E+13. We've improved computer processing power by 10 orders of magnitude in 50 years, and I do not believe there's any reason to think we won't improve another 10 orders of magnitude in the next 50 years. For years I have heard talk of hitting the physical limits of Moore's Law, but new technologies will take us into the next phase of computer processing power such as 3-D chips, molecular computing, quantum computing, and more. Big computers are icons or symbols of the culture and larger infrastructure that exists at LLNL to guide scientific discovery and engineering development. We have dealt with balance issues for 50 years and will continue to do so in our quest for a digital proxy of the properties of matter at extremely high temperatures and pressures. I believe that the next big computational win will be the merger of high-performance computing with information management. We already create terabytes--soon to be petabytes--of data. Efficiently storing, finding, visualizing and extracting data and turning that into knowledge which aids decision-making and scientific discovery is an exciting challenge. In the meantime, please enjoy this retrospective on computational physics, computer science, advanced software technologies, and applied mathematics performed by programs and researchers at LLNL during 2002. It offers a glimpse into the stimulating world of computational science in support of the national missions and homeland defense.

  3. High-order hydrodynamic algorithms for exascale computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morgan, Nathaniel Ray

    Hydrodynamic algorithms are at the core of many laboratory missions ranging from simulating ICF implosions to climate modeling. The hydrodynamic algorithms commonly employed at the laboratory and in industry (1) typically lack requisite accuracy for complex multi-material vortical flows and (2) are not well suited for exascale computing due to poor data locality and poor FLOP/memory ratios. Exascale computing requires advances in both computer science and numerical algorithms. We propose to research the second requirement and create a new high-order hydrodynamic algorithm that has superior accuracy, excellent data locality, and excellent FLOP/memory ratios. This proposal will impact a broad range of research areas including numerical theory, discrete mathematics, vorticity evolution, gas dynamics, interface instability evolution, turbulent flows, fluid dynamics and shock driven flows. If successful, the proposed research has the potential to radically transform simulation capabilities and help position the laboratory for computing at the exascale.
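
    As an aside (illustrative, not from the record), the "FLOP/memory ratio" the abstract mentions is usually expressed as arithmetic intensity: floating-point operations per byte of memory traffic. A minimal Python sketch with assumed counts for a three-point stencil update:

      # Arithmetic intensity = FLOPs per byte moved; the counts below are
      # illustrative assumptions for u_new[i] = a*u[i-1] + b*u[i] + c*u[i+1]
      # in double precision, not measurements of any laboratory code.

      def arithmetic_intensity(flops_per_point, bytes_per_point):
          """FLOPs per byte of memory traffic for one grid-point update."""
          return flops_per_point / bytes_per_point

      flops = 5            # 3 multiplies + 2 adds
      bytes_moved = 4 * 8  # read 3 values, write 1, at 8 bytes each
      ai = arithmetic_intensity(flops, bytes_moved)
      print(f"~{ai:.3f} FLOP/byte")  # ~0.156: strongly memory bound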

  4. Purple Computational Environment With Mappings to ACE Requirements for the General Availability User Environment Capabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barney, B; Shuler, J

    2006-08-21

    Purple is an Advanced Simulation and Computing (ASC) funded massively parallel supercomputer located at Lawrence Livermore National Laboratory (LLNL). The Purple Computational Environment documents the capabilities and the environment provided for the FY06 LLNL Level 1 General Availability Milestone. This document describes specific capabilities, tools, and procedures to support both local and remote users. The model is focused on the needs of the ASC user working in the secure computing environments at Los Alamos National Laboratory, Lawrence Livermore National Laboratory, and Sandia National Laboratories, but also documents needs of the LLNL and Alliance users working in the unclassified environment. Additionally, the Purple Computational Environment maps the provided capabilities to the Trilab ASC Computing Environment (ACE) Version 8.0 requirements. The ACE requirements reflect the high performance computing requirements for the General Availability user environment capabilities of the ASC community. Appendix A lists these requirements and includes a description of ACE requirements met and those requirements that are not met for each section of this document. The Purple Computing Environment, along with the ACE mappings, has been issued and reviewed throughout the Tri-lab community.

  5. Review of the Program. Report No. R-60.

    ERIC Educational Resources Information Center

    Pennsylvania State Univ., University Park. Computer-Assisted Instruction Lab.

    The nine-year history of the Computer Assisted Instruction Laboratory, College of Education, Pennsylvania State University, is traced. Some 30 projects in curriculum development in teacher education, public school classes, and adult vocational education are described, along with several advances in computer-assisted instruction (CAI).…

  6. 10 CFR 2.1003 - Availability of material.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... months in advance of submitting its license application for a geologic repository, the NRC shall make... of privilege in § 2.1006, graphic-oriented documentary material that includes raw data, computer runs, computer programs and codes, field notes, laboratory notes, maps, diagrams and photographs, which have been...

  7. 10 CFR 2.1003 - Availability of material.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... months in advance of submitting its license application for a geologic repository, the NRC shall make... of privilege in § 2.1006, graphic-oriented documentary material that includes raw data, computer runs, computer programs and codes, field notes, laboratory notes, maps, diagrams and photographs, which have been...

  8. Survey of Advanced Technologies in Japan, Vol. 3: Database Reports

    DTIC Science & Technology

    1990-05-01

    [Database report extract; table fields garbled in extraction. Organizations listed include: …Industrial Co. Ltd.; NEC Corporation, CSC Information Technology Research Laboratories; the University of Electro-Communications; Tokyo Institute…; Communication Research Laboratory, Ministry of Posts and Telecommunications; and Matsushita Electronics Corp., Electronics Research Laboratory. One excerpt describes a standard adopted in 1977 by the National Bureau of Standards, an outgrowth of research performed by IBM, based on information theory, using computer…]

  9. The Nuclear Energy Advanced Modeling and Simulation Enabling Computational Technologies FY09 Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Diachin, L F; Garaizar, F X; Henson, V E

    2009-10-12

    In this document we report on the status of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Enabling Computational Technologies (ECT) effort. In particular, we provide the context for ECT in the broader NEAMS program and describe the three pillars of the ECT effort, namely, (1) tools and libraries, (2) software quality assurance, and (3) computational facility (computers, storage, etc.) needs. We report on our FY09 deliverables to determine the needs of the integrated performance and safety codes (IPSCs) in these three areas and lay out the general plan for software quality assurance to meet the requirements of DOE and the DOE Advanced Fuel Cycle Initiative (AFCI). We conclude with a brief description of our interactions with the Idaho National Laboratory computer center to determine what is needed to expand their role as a NEAMS user facility.

  10. Cosmochemistry: Understanding the Solar System through analysis of extraterrestrial materials

    PubMed Central

    MacPherson, Glenn J.; Thiemens, Mark H.

    2011-01-01

    Cosmochemistry is the chemical analysis of extraterrestrial materials. This term generally is taken to mean laboratory analysis, which is the cosmochemistry gold standard because of the ability for repeated analysis under highly controlled conditions using the most advanced instrumentation unhindered by limitations in power, space, or environment. Over the past 40 y, advances in technology have enabled telescopic and spacecraft instruments to provide important data that significantly complement the laboratory data. In this special edition, recent advances in the state of the art of cosmochemistry are presented, which range from instrumental analysis of meteorites to theoretical–computational and astronomical observations. PMID:22128323

  11. "TIS": An Intelligent Gateway Computer for Information and Modeling Networks. Overview.

    ERIC Educational Resources Information Center

    Hampel, Viktor E.; And Others

    TIS (Technology Information System) is being used at the Lawrence Livermore National Laboratory (LLNL) to develop software for Intelligent Gateway Computers (IGC) suitable for the prototyping of advanced, integrated information networks. Dedicated to information management, TIS leads the user to available information resources, on TIS or…

  12. Integration of Computational Chemistry into the Undergraduate Organic Chemistry Laboratory Curriculum

    ERIC Educational Resources Information Center

    Esselman, Brian J.; Hill, Nicholas J.

    2016-01-01

    Advances in software and hardware have promoted the use of computational chemistry in all branches of chemical research to probe important chemical concepts and to support experimentation. Consequently, it has become imperative that students in the modern undergraduate curriculum become adept at performing simple calculations using computational…

  13. Statistical and Microscopic Approach to Gas Phase Chemical Kinetics.

    ERIC Educational Resources Information Center

    Perez, J. M.; Quereda, R.

    1983-01-01

    Describes an advanced undergraduate laboratory exercise examining the dependence of the rate constants and the instantaneous concentrations on the nature and energy content in a gas-phase complex reaction. The computer program (with instructions and computation flow charts) used with the exercise is available from the author. (Author/JN)
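
    For context (a generic textbook illustration, not the program distributed with the exercise), the temperature dependence of a rate constant is commonly modeled by the Arrhenius equation, k = A exp(-Ea/RT); a minimal Python version with hypothetical parameters:

      import math

      R = 8.314  # gas constant, J/(mol*K)

      def arrhenius(A, Ea, T):
          """Rate constant k = A * exp(-Ea / (R*T)); Ea in J/mol, T in K."""
          return A * math.exp(-Ea / (R * T))

      A = 1.0e13   # pre-exponential factor, 1/s (hypothetical)
      Ea = 1.0e5   # activation energy, 100 kJ/mol (hypothetical)
      for T in (300.0, 400.0, 500.0):
          print(f"T = {T:5.1f} K  ->  k = {arrhenius(A, Ea, T):.3e} 1/s")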

  14. Process for selecting NEAMS applications for access to Idaho National Laboratory high performance computing resources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michael Pernice

    2010-09-01

    INL has agreed to provide participants in the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program with access to its high performance computing (HPC) resources under sponsorship of the Enabling Computational Technologies (ECT) program element. This report documents the process used to select applications and the software stack in place at INL.

  15. Joey Luther | NREL

    Science.gov Websites

    [Staff-page extract: research on synthesis and chemical transformation of uniquely shaped colloidal nanocrystals for Advanced Solar Photophysics; member of NREL's perovskite team.]

  16. Camera-enabled techniques for organic synthesis

    PubMed Central

    Ingham, Richard J; O’Brien, Matthew; Browne, Duncan L

    2013-01-01

    A great deal of time is spent within synthetic chemistry laboratories on non-value-adding activities such as sample preparation and work-up operations, and labour intensive activities such as extended periods of continued data collection. Using digital cameras connected to computer vision algorithms, camera-enabled apparatus can perform some of these processes in an automated fashion, allowing skilled chemists to spend their time more productively. In this review we describe recent advances in this field of chemical synthesis and discuss how they will lead to advanced synthesis laboratories of the future. PMID:23766820
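
    A hedged sketch of the technique (not code from the review): with OpenCV, each camera frame can be thresholded by color so that, for example, an indicator change in a monitored vessel triggers an action. The device index, HSV range, and trigger fraction below are all hypothetical:

      import cv2
      import numpy as np

      cap = cv2.VideoCapture(0)  # camera watching the apparatus (assumed device 0)

      LOWER = np.array([100, 80, 80])    # hypothetical HSV lower bound
      UPPER = np.array([130, 255, 255])  # hypothetical HSV upper bound

      while True:
          ok, frame = cap.read()
          if not ok:
              break
          hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
          mask = cv2.inRange(hsv, LOWER, UPPER)          # pixels in target range
          fraction = cv2.countNonZero(mask) / mask.size  # share of matching pixels
          if fraction > 0.05:                            # hypothetical trigger level
              print("colour change detected")            # e.g. stop data collection
              break

      cap.release()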

  17. Leveraging e-Science infrastructure for electrochemical research.

    PubMed

    Peachey, Tom; Mashkina, Elena; Lee, Chong-Yong; Enticott, Colin; Abramson, David; Bond, Alan M; Elton, Darrell; Gavaghan, David J; Stevenson, Gareth P; Kennedy, Gareth F

    2011-08-28

    As in many scientific disciplines, modern chemistry involves a mix of experimentation and computer-supported theory. Historically, these skills have been provided by different groups, and range from traditional 'wet' laboratory science to advanced numerical simulation. Increasingly, progress is made by global collaborations, in which new theory may be developed in one part of the world and applied and tested in the laboratory elsewhere. e-Science, or cyber-infrastructure, underpins such collaborations by providing a unified platform for accessing scientific instruments, computers and data archives, and collaboration tools. In this paper we discuss the application of advanced e-Science software tools to electrochemistry research performed in three different laboratories--two at Monash University in Australia and one at the University of Oxford in the UK. We show that software tools that were originally developed for a range of application domains can be applied to electrochemical problems, in particular Fourier voltammetry. Moreover, we show that, by replacing ad-hoc manual processes with e-Science tools, we obtain more accurate solutions automatically.
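
    To give a flavor of the Fourier analysis involved (an illustrative sketch, not the authors' toolchain), large-amplitude AC voltammetry resolves the current response into harmonics of the excitation frequency, which NumPy's FFT exposes directly; the trace below is synthetic:

      import numpy as np

      fs = 1000.0                        # sampling rate, Hz (assumed)
      t = np.arange(0.0, 1.0, 1.0 / fs)  # 1 s record
      f0 = 9.0                           # AC excitation frequency, Hz (assumed)

      # Synthetic current: fundamental plus a weak second harmonic (as produced
      # by nonlinear electrode kinetics) plus noise; illustrative only.
      rng = np.random.default_rng(0)
      current = (np.sin(2 * np.pi * f0 * t)
                 + 0.05 * np.sin(2 * np.pi * 2 * f0 * t)
                 + 0.01 * rng.standard_normal(t.size))

      spectrum = np.abs(np.fft.rfft(current))
      freqs = np.fft.rfftfreq(t.size, 1.0 / fs)

      for n in (1, 2):  # inspect the first two harmonics
          i = np.argmin(np.abs(freqs - n * f0))
          print(f"harmonic {n}: |I| at {freqs[i]:.0f} Hz = {spectrum[i]:.2f}")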

  18. Incorporating computational resources in a cancer research program

    PubMed Central

    Woods, Nicholas T.; Jhuraney, Ankita; Monteiro, Alvaro N.A.

    2015-01-01

    Recent technological advances have transformed cancer genetics research. These advances have served as the basis for the generation of a number of richly annotated datasets relevant to the cancer geneticist. In addition, many of these technologies are now within reach of smaller laboratories to answer specific biological questions. Thus, one of the most pressing issues facing an experimental cancer biology research program in genetics is incorporating data from multiple sources to annotate, visualize, and analyze the system under study. Fortunately, there are several computational resources to aid in this process. However, a significant effort is required to adapt a molecular biology-based research program to take advantage of these datasets. Here, we discuss the lessons learned in our laboratory and share several recommendations to make this transition effectively. This article is not meant to be a comprehensive evaluation of all the available resources, but rather highlight those that we have incorporated into our laboratory and how to choose the most appropriate ones for your research program. PMID:25324189

  19. Science & Technology Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1996-07-01

    This review is published ten times a year to communicate, to a broad audience, Lawrence Livermore National Laboratory's scientific and technological accomplishments, particularly in the Laboratory's core mission areas - global security, energy and the environment, and bioscience and biotechnology. This review for the month of July 1996 discusses: Frontiers of research in advanced computations; The multibeam Fabry-Perot velocimeter: efficient measurement of high velocities; High-tech tools for the American textile industry; and Rock mechanics: can the tuff take the stress?

  20. Lawrence Livermore National Laboratories Perspective on Code Development and High Performance Computing Resources in Support of the National HED/ICF Effort

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clouse, C. J.; Edwards, M. J.; McCoy, M. G.

    2015-07-07

    Through its Advanced Scientific Computing (ASC) and Inertial Confinement Fusion (ICF) code development efforts, Lawrence Livermore National Laboratory (LLNL) provides a world-leading numerical simulation capability for the National HED/ICF program in support of the Stockpile Stewardship Program (SSP). In addition, the ASC effort provides high performance computing platform capabilities upon which these codes are run. LLNL remains committed to, and will work with, the national HED/ICF program community to help ensure numerical simulation needs are met and to make those capabilities available, consistent with programmatic priorities and available resources.

  1. A Future State for NASA Laboratories - Working in the 21st Century

    NASA Technical Reports Server (NTRS)

    Kegelman, Jerome T.; Harris, Charles E.; Antcliff, Richard R.; Bushnell, Dennis M.; Dwoyer, Douglas L.

    2009-01-01

    The name "21 st Century Laboratory" is an emerging concept of how NASA (and the world) will conduct research in the very near future. Our approach is to carefully plan for significant technological changes in products, organization, and society. The NASA mission can be the beneficiary of these changes, provided the Agency prepares for the role of 21st Century laboratories in research and technology development and its deployment in this new age. It has been clear for some time now that the technology revolutions, technology "mega-trends" that we are in the midst of now, all have a common element centered around advanced computational modeling of small scale physics. Whether it is nano technology, bio technology or advanced computational technology, all of these megatrends are converging on science at the very small scale where it is profoundly important to consider the quantum effects at play with physics at that scale. Whether it is the bio-technology creation of "nanites" designed to mimic our immune system or the creation of nanoscale infotechnology devices, allowing an order of magnitude increase in computational capability, all involve quantum physics that serves as the heart of these revolutionary changes.

  2. Some research advances in computer graphics that will enhance applications to engineering design

    NASA Technical Reports Server (NTRS)

    Allan, J. J., III

    1975-01-01

    Research in man/machine interactions and in graphics hardware and software that will enhance applications to engineering design is described. Research aspects of executive systems, command languages, and networking used in the computer applications laboratory are mentioned. Finally, a few areas where little or no research is being done are identified.

  3. Using Articulate Virtual Laboratories in Teaching Energy Conversion at the U.S. Naval Academy.

    ERIC Educational Resources Information Center

    Wu, C.

    1998-01-01

    The Mechanical Engineering Department at the U.S. Naval Academy is currently evaluating a new teaching method that uses computer software. Using the thermodynamics-based software CyclePad, intelligent computer-aided instruction is incorporated into an advanced energy conversion course for Mechanical Engineering students. The CyclePad software…

  4. NASA Tech Briefs, May 1995. Volume 19, No. 5

    NASA Technical Reports Server (NTRS)

    1995-01-01

    This issue features a resource report on the Jet Propulsion Laboratory and a special focus on advanced composites and plastics. It also contains articles on electronic components and circuits, electronic systems, physical sciences, computer programs, mechanics, machinery, manufacturing and fabrication, mathematics and information sciences, and life sciences. This issue also contains a supplement on federal laboratory test and measurement.

  5. JPL basic research review. [research and advanced development

    NASA Technical Reports Server (NTRS)

    1977-01-01

    Current status, projected goals, and results of 49 research and advanced development programs at the Jet Propulsion Laboratory are reported in abstract form. Areas of investigation include: aerodynamics and fluid mechanics, applied mathematics and computer sciences, environment protection, materials science, propulsion, electric and solar power, guidance and navigation, communication and information sciences, general physics, and chemistry.

  6. DOE Advanced Scientific Computing Advisory Committee (ASCAC): Workforce Subcommittee Letter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chapman, Barbara; Calandra, Henri; Crivelli, Silvia

    2014-07-23

    Simulation and computing are essential to much of the research conducted at the DOE national laboratories. Experts in the ASCR-relevant Computing Sciences, which encompass a range of disciplines including Computer Science, Applied Mathematics, Statistics, and domain Computational Sciences, are an essential element of the workforce in nearly all of the DOE national laboratories. This report seeks to identify the gaps and challenges facing DOE with respect to this workforce. This letter is ASCAC's response to the charge of February 19, 2014 to identify disciplines in which significantly greater emphasis in workforce training at the graduate or postdoctoral levels is necessary to address workforce gaps in current and future Office of Science mission needs.

  7. Advanced processing for high-bandwidth sensor systems

    NASA Astrophysics Data System (ADS)

    Szymanski, John J.; Blain, Phil C.; Bloch, Jeffrey J.; Brislawn, Christopher M.; Brumby, Steven P.; Cafferty, Maureen M.; Dunham, Mark E.; Frigo, Janette R.; Gokhale, Maya; Harvey, Neal R.; Kenyon, Garrett; Kim, Won-Ha; Layne, J.; Lavenier, Dominique D.; McCabe, Kevin P.; Mitchell, Melanie; Moore, Kurt R.; Perkins, Simon J.; Porter, Reid B.; Robinson, S.; Salazar, Alfonso; Theiler, James P.; Young, Aaron C.

    2000-11-01

    Compute performance and algorithm design are key problems of image processing and of scientific computing in general. For example, imaging spectrometers are capable of producing data in hundreds of spectral bands with millions of pixels. These data sets show great promise for remote sensing applications but require new and computationally intensive processing. The goal of the Deployable Adaptive Processing Systems (DAPS) project at Los Alamos National Laboratory is to develop advanced processing hardware and algorithms for high-bandwidth sensor applications. The project has produced electronics for processing multi- and hyper-spectral sensor data, as well as LIDAR data, employing processing elements built with a variety of technologies. The project team is currently working on reconfigurable computing technology and advanced feature extraction techniques, with an emphasis on their application to image and RF signal processing. This paper presents the reconfigurable computing technology and advanced feature extraction algorithm work and their application to multi- and hyper-spectral image processing. Related projects on genetic algorithms as applied to image processing will be introduced, as will the collaboration between the DAPS project and the DARPA Adaptive Computing Systems program. Further details are presented in other talks during this conference and in other conferences taking place during this symposium.
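
    As a toy illustration of the per-pixel feature extraction such a project targets (not DAPS code; the cube shape and band indices are invented), the sketch below computes a normalized band-difference index over a hyperspectral cube, the kind of regular, data-parallel kernel that is a natural candidate for reconfigurable-hardware offload.

        # Toy per-pixel feature extraction over a hyperspectral cube (invented
        # shape and band indices); synthetic data stands in for sensor output.
        import numpy as np

        rng = np.random.default_rng(1)
        cube = rng.random((128, 128, 224)).astype(np.float32)  # rows x cols x bands

        def normalized_difference(cube, band_a, band_b, eps=1e-6):
            """Compute (A - B) / (A + B) per pixel for two spectral bands."""
            a, b = cube[..., band_a], cube[..., band_b]
            return (a - b) / (a + b + eps)

        feature = normalized_difference(cube, band_a=50, band_b=30)
        print(feature.shape, float(feature.mean()))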

  8. Advancing crime scene computer forensics techniques

    NASA Astrophysics Data System (ADS)

    Hosmer, Chet; Feldman, John; Giordano, Joe

    1999-02-01

    Computers and network technology have become inexpensive and powerful tools that can be applied to a wide range of criminal activity. Computers have changed the world's view of evidence because computers are used more and more as tools in committing 'traditional crimes' such as embezzlement, theft, extortion, and murder. This paper reviews the current state of the art in the data recovery and evidence construction tools used in both the field and the laboratory for prosecution purposes.
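
    One routine building block of evidence construction, shown here as a hedged sketch rather than any specific tool from the survey, is fingerprinting recovered data with a cryptographic hash so later analysis can be shown to operate on a bit-identical copy of what was acquired.

        # Sketch of evidence fingerprinting with a cryptographic hash; the
        # file path below is hypothetical, and this is one generic step, not
        # a specific tool from the paper's survey.
        import hashlib

        def hash_file(path, algo="sha256", chunk_size=1 << 20):
            """Stream a file through a hash so copies can be verified bit-for-bit."""
            h = hashlib.new(algo)
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(chunk_size), b""):
                    h.update(chunk)
            return h.hexdigest()

        # Usage (hypothetical acquired image):
        # print(hash_file("recovered_disk_image.dd"))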

  9. Cinema Fire Modelling by FDS

    NASA Astrophysics Data System (ADS)

    Glasa, J.; Valasek, L.; Weisenpacher, P.; Halada, L.

    2013-02-01

    Recent advances in computational fluid dynamics (CFD) and the rapid increase in the computational power of current computers have led to the development of CFD models capable of describing fire in complex geometries and incorporating a wide variety of physical phenomena related to fire. In this paper, we demonstrate the use of the Fire Dynamics Simulator (FDS) for cinema fire modelling. FDS is an advanced CFD system intended for simulation of fire and smoke spread and prediction of thermal flows, toxic substance concentrations, and other relevant parameters of fire. The course of a fire in a cinema hall is described, focusing on related safety risks. Fire properties of flammable materials used in the simulation were determined by laboratory measurements and validated by fire tests and computer simulations.
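
    For flavor, here is a deliberately tiny sketch of the explicit time-stepping such CFD fire models rest on, reduced to 1D advection-diffusion of a smoke tracer; every grid and flow parameter is invented, and FDS itself solves far richer 3D large-eddy-simulation equations with combustion and radiation.

        # Toy 1D advection-diffusion of a smoke tracer (invented parameters);
        # this only shows the stepping flavor, not what FDS actually computes.
        import numpy as np

        nx, dx, dt = 200, 0.1, 0.01    # cells, cell size (m), time step (s)
        u, d = 0.5, 0.05               # flow speed (m/s), diffusivity (m^2/s)
        c = np.zeros(nx)
        c[:5] = 1.0                    # smoke released near one end

        for _ in range(1000):          # simulate 10 s
            grad = (c - np.roll(c, 1)) / dx                        # upwind, u > 0
            lap = (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx**2
            c += dt * (-u * grad + d * lap)

        print(f"peak smoke concentration near x = {np.argmax(c) * dx:.1f} m")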

  10. Computing Visible-Surface Representations,

    DTIC Science & Technology

    1985-03-01

    Terzopoulos. Artificial Intelligence Laboratory, Massachusetts Institute of Technology; support for the Laboratory's artificial intelligence research is provided in part by the Advanced Research Projects Agency. The report concerns dynamically maintaining visible-surface representations, whether the intention is to model human vision or to design competent artificial vision systems. [The remainder of this record is report-form residue.]

  11. Advanced Simulation and Computing: A Summary Report to the Director's Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCoy, M G; Peck, T

    2003-06-01

    It has now been three years since the Advanced Simulation and Computing Program (ASCI), as managed by the Defense and Nuclear Technologies (DNT) Directorate, was last reviewed by this Director's Review Committee (DRC). Since that time, there has been considerable progress for all components of the ASCI Program, and these developments will be highlighted in this document and in the presentations planned for June 9 and 10, 2003. There have also been some name changes. Today, the Program is called "Advanced Simulation and Computing." Although it retains the familiar acronym ASCI, the initiative nature of the effort has given way to sustained services as an integral part of the Stockpile Stewardship Program (SSP). All computing efforts at LLNL and the other two Defense Program (DP) laboratories are funded and managed under ASCI. This includes the so-called legacy codes, which remain essential tools in stockpile stewardship. The contract between the Department of Energy (DOE) and the University of California (UC) specifies an independent appraisal of Directorate technical work and programmatic management; that appraisal is the work of this DNT Review Committee. Beginning this year, the Laboratory is implementing a new review system. This process was negotiated between UC, the National Nuclear Security Administration (NNSA), and the Laboratory Directors. Central to this approach are eight performance objectives that focus on key programmatic and administrative goals. Associated with each of these objectives are a number of performance measures to more clearly characterize the attainment of the objectives. Each performance measure has a lead directorate and one or more contributing directorates. Each measure has an evaluation plan and has identified expected documentation to be included in the "Assessment File."

  12. MIT CSAIL and Lincoln Laboratory Task Force Report

    DTIC Science & Technology

    2016-08-01

    Projects have been very diverse, spanning several areas of CSAIL concentration, including robotics, big data analytics, wireless communications, and computing architectures, as well as machine learning systems and algorithms, such as recommender systems, and "Big Data" analytics. Advanced computing architectures broadly refer to… [The remainder of this record is fragmentary.]

  13. Institute for scientific computing research;fiscal year 1999 annual report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keyes, D

    2000-03-28

    Large-scale scientific computation, and all of the disciplines that support it and help to validate it, have been placed at the focus of Lawrence Livermore National Laboratory by the Accelerated Strategic Computing Initiative (ASCI). The Laboratory operates the computer with the highest peak performance in the world and has undertaken some of the largest and most compute-intensive simulations ever performed. Computers at the architectural extremes, however, are notoriously difficult to use efficiently. Even such successes as the Laboratory's two Bell Prizes awarded in November 1999 only emphasize the need for much better ways of interacting with the results of large-scale simulations. Advances in scientific computing research have, therefore, never been more vital to the core missions of the Laboratory than at present. Computational science is evolving so rapidly along every one of its research fronts that to remain on the leading edge, the Laboratory must engage researchers at many academic centers of excellence. In FY 1999, the Institute for Scientific Computing Research (ISCR) has expanded the Laboratory's bridge to the academic community in the form of collaborative subcontracts, visiting faculty, student internships, a workshop, and a very active seminar series. ISCR research participants are integrated almost seamlessly with the Laboratory's Center for Applied Scientific Computing (CASC), which, in turn, addresses computational challenges arising throughout the Laboratory. Administratively, the ISCR flourishes under the Laboratory's University Relations Program (URP). Together with the other four Institutes of the URP, it must navigate a course that allows the Laboratory to benefit from academic exchanges while preserving national security. Although FY 1999 brought more than its share of challenges to the operation of an academic-like research enterprise within the context of a national security laboratory, the results declare the challenges well met and well worth the continued effort. A change of administration for the ISCR occurred during FY 1999. Acting Director John Fitzgerald retired from LLNL in August after 35 years of service, including the last two at the helm of the ISCR. David Keyes, who has been a regular visitor in conjunction with ASCI scalable algorithms research since October 1997, overlapped with John for three months and serves half-time as the new Acting Director.

  14. Advanced computational tools for 3-D seismic analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barhen, J.; Glover, C.W.; Protopopescu, V.A.

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis and to test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY 93-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the computational innovations that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  15. Development of the HERMIES III mobile robot research testbed at Oak Ridge National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Manges, W.W.; Hamel, W.R.; Weisbin, C.R.

    1988-01-01

    The latest robot in the Hostile Environment Robotic Machine Intelligence Experiment Series (HERMIES) is now under development at the Center for Engineering Systems Advanced Research (CESAR) at Oak Ridge National Laboratory. The HERMIES III robot incorporates a larger-than-human-size 7-degree-of-freedom manipulator mounted on a 2-degree-of-freedom mobile platform, including a variety of sensors and computers. The deployment of this robot represents a significant increase in research capabilities for the CESAR laboratory. The initial on-board computer capacity of the robot exceeds that of 20 Vax 11/780s. The navigation and vision algorithms under development make extensive use of the on-board NCUBE hypercube computer, while the sensors are interfaced through five VME computers running the OS-9 real-time, multitasking operating system. This paper describes the motivation, key issues, and detailed design trade-offs of implementing the first phase (basic functionality) of the HERMIES III robot. 10 refs., 7 figs.

  16. Acoustic impulse response method as a source of undergraduate research projects and advanced laboratory experiments.

    PubMed

    Robertson, W M; Parker, J M

    2012-03-01

    A straightforward and inexpensive implementation of acoustic impulse response measurement is described, utilizing the signal processing technique of coherent averaging. The technique is capable of high signal-to-noise measurements with personal computer data acquisition equipment, an amplifier/speaker, and a high-quality microphone. When coupled with simple waveguide test systems fabricated from commercial PVC plumbing pipe, impulse response measurement has proven to be ideal for undergraduate research projects, often of publishable quality, or for advanced laboratory experiments. The technique provides important learning objectives for science or engineering students in areas such as interfacing and computer control of experiments; analog-to-digital conversion and sampling; time and frequency analysis using Fourier transforms; signal processing; and insight into a variety of current research areas such as acoustic bandgap materials, acoustic metamaterials, and fast and slow wave manipulation. © 2012 Acoustical Society of America.
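
    The heart of the method is coherent averaging: repeat the same excitation many times and average the time-aligned records, so uncorrelated noise falls roughly as 1/sqrt(N) while the repeatable impulse response survives. A minimal sketch with synthetic data standing in for microphone captures:

        # Coherent averaging demo with synthetic "captures" (assumed values).
        import numpy as np

        fs, n = 48_000, 4096
        rng = np.random.default_rng(2)

        # Toy "true" impulse response: a decaying pipe resonance near 700 Hz.
        t = np.arange(n) / fs
        true_ir = np.exp(-200 * t) * np.sin(2 * np.pi * 700 * t)

        # 256 noisy records of the same repeatable excitation, time-aligned.
        records = [true_ir + 0.5 * rng.normal(size=n) for _ in range(256)]
        averaged = np.mean(records, axis=0)   # noise power drops by ~1/256

        response = np.fft.rfft(averaged)      # frequency response estimate
        freqs = np.fft.rfftfreq(n, 1 / fs)
        print(f"dominant resonance ~ {freqs[np.argmax(np.abs(response))]:.0f} Hz")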

  17. Computing Properties of Hadrons, Nuclei and Nuclear Matter from Quantum Chromodynamics (LQCD)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Negele, John W.

    Building on the success of two preceding generations of Scientific Discovery through Advanced Computing (SciDAC) projects, this grant supported the MIT component (P.I. John Negele) of a multi-institutional SciDAC-3 project that also included Brookhaven National Laboratory, the lead laboratory with P. I. Frithjof Karsch serving as Project Director, Thomas Jefferson National Accelerator Facility with P. I. David Richards serving as Co-director, University of Washington with P. I. Martin Savage, University of North Carolina with P. I. Rob Fowler, and College of William and Mary with P. I. Andreas Stathopoulos. Nationally, this multi-institutional project coordinated the software development effort that the nuclear physics lattice QCD community needs to ensure that lattice calculations can make optimal use of forthcoming leadership-class and dedicated hardware, including that at the national laboratories, and to exploit future computational resources in the Exascale era.

  18. MIT Laboratory for Computer Science Progress Report 27

    DTIC Science & Technology

    1990-06-01

    …because of the natural, yet unexploited, concurrence that characterizes contemporary and prospective applications from business to sensory computing. [The remainder of this record is table-of-contents and staff-listing residue from the Advanced Network Architecture and Clinical Decision Making groups.]

  19. PNNL's Data Intensive Computing research battles Homeland Security threats

    ScienceCinema

    David Thurman; Joe Kielman; Katherine Wolf; David Atkinson

    2018-05-11

    The Pacific Northwest National Laboratory's (PNNL's) approach to data intensive computing (DIC) is focused on three key research areas: hybrid hardware architectures, software architectures, and analytic algorithms. Advancements in these areas will help to address, and solve, DIC issues associated with capturing, managing, analyzing and understanding, in near real time, data at volumes and rates that push the frontiers of current technologies.

  20. PNNL pushing scientific discovery through data intensive computing breakthroughs

    ScienceCinema

    Deborah Gracio; David Koppenaal; Ruby Leung

    2018-05-18

    The Pacific Northwest National Laboratory's approach to data intensive computing (DIC) is focused on three key research areas: hybrid hardware architectures, software architectures, and analytic algorithms. Advancements in these areas will help to address, and solve, DIC issues associated with capturing, managing, analyzing and understanding, in near real time, data at volumes and rates that push the frontiers of current technologies.

  1. Modeling, Analysis, and Optimization Issues for Large Space Structures.

    DTIC Science & Technology

    1983-02-01

    There are numerous opportunities provided by new advances in computer hardware, firmware, software, CAD/CAM systems, and computational algorithms… [The remainder of this record is contributor-address residue.]

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krasheninnikov, Sergei I.; Angus, Justin; Lee, Wonjae

    The goal of the Edge Simulation Laboratory (ESL) multi-institutional project is to advance scientific understanding of the edge plasma region of magnetic fusion devices via a coordinated effort utilizing modern computing resources, advanced algorithms, and ongoing theoretical development. The UCSD team was involved in the development of the COGENT code for kinetic studies across a magnetic separatrix. This work included a kinetic treatment of electrons and multiple ion species (impurities) and accurate collision operators.

  3. Baseline Skills Assessment of the US Army Research Laboratory

    DTIC Science & Technology

    2015-01-01

    Skill-level definitions: Level 1, basic understanding, minimal experience; Level 2, more specific understanding, some level of application; Level 3, expertise… Sample skill areas and counts include polymers (1), energy absorbers (2), computational material modeling (1), powder metallurgy (1), tribology (1), non-destructive inspection (1), and advanced… [This record is a fragment of a flattened skills table.]

  4. Recommendations for Establishing the Texas Roadway Research Implementation Center

    DOT National Transportation Integrated Search

    1998-07-01

    The overall objective of the Roadway Research Initiative study was to describe an advanced testing capability, one that would speed implementation of the results from traditional computer and laboratory-based research efforts by providing a reusable t...

  5. Jonathan F. Reichert and Barbara Wolff-Reichert Award for Excellence in Advanced Laboratory Instruction: Advanced Instructional Labs: Why Bother?

    NASA Astrophysics Data System (ADS)

    Bistrow, Van

    What aren't we teaching about physics in the traditional lecture course? Plenty! By offering the Advanced Laboratory Course, we hope to shed light on the following questions: How do we develop a systematic process of doing experiments? How do we record procedures and results? How should we interpret theoretical concepts in the real world? What experimental and computational techniques are available for producing and analyzing data? With what degree of confidence can we trust our measurements and interpretations? How well does a theory represent physical reality? How do we collaborate with experimental partners? How do we best communicate our findings to others? These questions are of fundamental importance to experimental physics, yet are not generally addressed by reading textbooks, attending lectures, or doing homework problems. Thus, to provide a more complete understanding of physics, we offer laboratory exercises as a supplement to the other modes of learning. The speaker will describe some examples of experiments, and outline the history, structure, and student impressions of the Advanced Lab course at the University of Chicago Department of Physics.

  6. National Laboratory for Advanced Scientific Visualization at UNAM - Mexico

    NASA Astrophysics Data System (ADS)

    Manea, Marina; Constantin Manea, Vlad; Varela, Alfredo

    2016-04-01

    In 2015, the National Autonomous University of Mexico (UNAM) joined the family of universities and research centers where advanced visualization and computing play a key role in promoting and advancing missions in research, education, community outreach, and business-oriented consulting. This initiative provides access to a great variety of advanced hardware and software resources and offers a range of consulting services spanning areas related to scientific visualization, among them neuroanatomy, embryonic development, genome-related studies, geosciences, geography, and physics- and mathematics-related disciplines. The National Laboratory for Advanced Scientific Visualization delivers services through three main infrastructure environments: the fully immersive 3D display system Cave, the high-resolution parallel visualization system Powerwall, and the high-resolution spherical display Earth Simulator. The entire visualization infrastructure is interconnected to a high-performance computing cluster (HPCC) called ADA, in honor of Ada Lovelace, considered to be the first computer programmer. The Cave is an extra-large room, 3.6 m wide, with images projected on the front, left, and right walls as well as the floor. Specialized crystal-eyes LCD-shutter glasses provide strong stereo depth perception, and a variety of tracking devices allow software to track the position of a user's hand, head, and wand. The Powerwall is designed to bring large amounts of complex data together through parallel computing for team interaction and collaboration. This system is composed of 24 (6x4) high-resolution, ultra-thin-bezel (2 mm) monitors connected to a high-performance GPU cluster. The Earth Simulator is a large (60") high-resolution spherical display used for visualizing global-scale data such as geophysical, meteorological, climate, and ecology data. The HPCC ADA is a system of more than 1000 computing cores, offering parallel computing resources to applications that require large amounts of memory as well as large, fast parallel storage systems. The system's temperature is controlled by an energy- and space-efficient cooling solution based on large rear-door liquid-cooled heat exchangers. This state-of-the-art infrastructure will boost research activities in the region, offer a powerful scientific tool for teaching at the undergraduate and graduate levels, and enhance association and cooperation with business-oriented organizations.

  7. US Department of Energy High School Student Supercomputing Honors Program: A follow-up assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1987-01-01

    The US DOE High School Student Supercomputing Honors Program was designed to recognize high school students with superior skills in mathematics and computer science and to provide them with formal training and experience with advanced computer equipment. This document reports on the participants who attended the first such program, which was held at the National Magnetic Fusion Energy Computer Center at the Lawrence Livermore National Laboratory (LLNL) during August 1985.

  8. Laboratory directed research and development annual report 2004.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    This report summarizes progress from the Laboratory Directed Research and Development (LDRD) program during fiscal year 2004. In addition to a programmatic and financial overview, the report includes progress reports from 352 individual R and D projects in 15 categories. The 15 categories are: (1) Advanced Concepts; (2) Advanced Manufacturing; (3) Biotechnology; (4) Chemical and Earth Sciences; (5) Computational and Information Sciences; (6) Differentiating Technologies; (7) Electronics and Photonics; (8) Emerging Threats; (9) Energy and Critical Infrastructures; (10) Engineering Sciences; (11) Grand Challenges; (12) Materials Science and Technology; (13) Nonproliferation and Materials Control; (14) Pulsed Power and High Energy Density Sciences; and (15) Corporate Objectives.

  9. Grand Challenges of Advanced Computing for Energy Innovation Report from the Workshop Held July 31-August 2, 2012

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Larzelere, Alex R.; Ashby, Steven F.; Christensen, Dana C.

    2013-03-06

    On July 31-August 2 of 2012, the U.S. Department of Energy (DOE) held a workshop entitled Grand Challenges of Advanced Computing for Energy Innovation. This workshop built on three earlier workshops that clearly identified the potential for the Department and its national laboratories to enable energy innovation. The specific goal of the workshop was to identify the key challenges that the nation must overcome to apply the full benefit of taxpayer-funded advanced computing technologies to U.S. energy innovation in the ways that the country produces, moves, stores, and uses energy. Perhaps more importantly, the workshop also developed a set of recommendations to help the Department overcome those challenges. These recommendations provide an action plan for what the Department can do in the coming years to improve the nation's energy future.

  10. Brookhaven highlights. Report on research, October 1, 1992--September 30, 1993

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rowe, M.S.; Belford, M.; Cohen, A.

    This report highlights the research activities of Brookhaven National Laboratory during the period October 1, 1992 through September 30, 1993. Contributions come from programs and departments across the Laboratory, including technology transfer, RHIC, the Alternating Gradient Synchrotron, physics, biology, the National Synchrotron Light Source, applied science, medical science, advanced technology, chemistry, reactor physics, safety and environmental protection, instrumentation, and computing and communications.

  11. How DARHT Works - the World's Most Powerful X-ray Machine

    ScienceCinema

    None

    2018-06-01

    The Dual Axis Radiographic Hydrodynamic Test (DARHT) facility at Los Alamos National Laboratory is an essential scientific tool that supports Stockpile Stewardship at the Laboratory. The world's most powerful x-ray machine, it is used to take high-speed images of mock nuclear devices; the data are used to confirm and modify advanced computer codes in assuring the safety, security, and effectiveness of the U.S. nuclear deterrent.

  12. Numeric Computation of the Radar Cross Section of In-flight Projectiles

    DTIC Science & Technology

    2016-11-01

    Subject terms: computational electromagnetics, radar signature, ballistic trajectory, radar cross section (RCS). The projectiles studied fall under the generic category of rockets, artillery, and mortar (RAM). The electromagnetic (EM) modeling team at the US Army Research Laboratory (ARL) is… [The remainder of this record is report-form and reference-list residue, citing ARL-TR-5145 and Balanis, Advanced Engineering Electromagnetics, Wiley, 1989.]

  13. Image Understanding Research and Its Application to Cartography and Computer-Based Analysis of Aerial Imagery

    DTIC Science & Technology

    1983-09-01

    Prepared by the SRI Artificial Intelligence Center, Computer Science and Technology Division, for the Defense Advanced Research Projects Agency. The work supports processing of aerial photographs for such military applications as cartography, intelligence, weapon guidance, and targeting. [The remainder of this record is garbled reference-list residue, including MIT Artificial Intelligence Laboratory Report AI-TR-346.]

  14. Wireless Communications in Reverberant Environments

    DTIC Science & Technology

    2015-01-01

    The work draws on the Secure Wireless Agent Testbed (SWAT), the Protocol Engineering Advanced Networking (PROTEAN) Research Group, and the Data Fusion Laboratory (DFL)… [The remainder of this record is bibliography residue on industrial wireless sensor networks and shipboard path-loss estimation.]

  15. Advances and Challenges In Uncertainty Quantification with Application to Climate Prediction, ICF design and Science Stockpile Stewardship

    NASA Astrophysics Data System (ADS)

    Klein, R.; Woodward, C. S.; Johannesson, G.; Domyancic, D.; Covey, C. C.; Lucas, D. D.

    2012-12-01

    Uncertainty Quantification (UQ) is a critical field within 21st century simulation science that resides at the very center of the web of emerging predictive capabilities. The science of UQ holds the promise of giving much greater meaning to the results of complex large-scale simulations, allowing for quantifying and bounding uncertainties. This powerful capability will yield new insights into scientific predictions (e.g. climate) of great impact in both national and international arenas, allow informed decisions on the design of critical experiments (e.g. ICF capsule design, MFE, NE) in many scientific fields, and assign confidence bounds to scientifically predictable outcomes (e.g. nuclear weapons design). In this talk I will discuss a major new strategic initiative (SI) we have developed to advance the science of Uncertainty Quantification at Lawrence Livermore National Laboratory (LLNL), focusing in particular on (a) the research and development of new algorithms and methodologies of UQ as applied to multi-physics, multi-scale codes, (b) incorporation of these advancements into a global UQ Pipeline (i.e. a computational superstructure) that will simplify user access to sophisticated tools for UQ studies as well as act as a self-guided, self-adapting UQ engine for UQ studies on extreme computing platforms, and (c) the use of laboratory applications as a test bed for new algorithms and methodologies. The initial SI focus has been on applications for the quantification of uncertainty associated with climate prediction, but the validated UQ methodologies we have developed are now being fed back into Science Based Stockpile Stewardship (SSS) and ICF UQ efforts. To make advances on several of these UQ grand challenges, I will focus in this talk on the following three research areas in our Strategic Initiative: error estimation in multi-physics and multi-scale codes; tackling the "curse of high dimensionality"; and development of an advanced UQ computational pipeline to enable complete UQ workflow and analysis for ensemble runs at the extreme scale (e.g. exascale) with self-guiding adaptation in the UQ Pipeline engine. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344 and was funded by the Uncertainty Quantification Strategic Initiative Laboratory Directed Research and Development Project at LLNL under project tracking code 10-SI-013 (UCRL LLNL-ABS-569112).
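
    The ensemble pattern underneath such a UQ pipeline can be sketched in a few lines; this is a toy stand-in (the parameter names, prior ranges, and model below are invented), whereas the real pipeline orchestrates thousands of multi-physics runs on HPC platforms.

        # Toy UQ ensemble (invented model and priors): sample uncertain inputs,
        # evaluate the model per sample, and summarize output uncertainty.
        import numpy as np

        rng = np.random.default_rng(3)

        def model(sensitivity, forcing):
            """Stand-in for an expensive multi-physics simulation (hypothetical)."""
            return 0.8 * sensitivity - 0.3 * forcing

        n_samples = 10_000
        sensitivity = rng.uniform(1.5, 4.5, n_samples)   # assumed prior range
        forcing = rng.normal(1.0, 0.4, n_samples)        # assumed prior

        outputs = model(sensitivity, forcing)
        lo_q, hi_q = np.percentile(outputs, [5, 95])
        print(f"mean={outputs.mean():.2f}  std={outputs.std():.2f}  "
              f"90% interval=({lo_q:.2f}, {hi_q:.2f})")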

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chrzanowski, P; Walter, K

    For the Laboratory and staff, 2006 was a year of outstanding achievements. As our many accomplishments in this annual report illustrate, the Laboratory's focus on important problems that affect our nation's security and our researchers' breakthroughs in science and technology have led to major successes. As a national laboratory that is part of the Department of Energy's National Nuclear Security Administration (DOE/NNSA), Livermore is a key contributor to the Stockpile Stewardship Program for maintaining the safety, security, and reliability of the nation's nuclear weapons stockpile. The program has been highly successful, and our annual report features some of the Laboratory's significant stockpile stewardship accomplishments in 2006. A notable example is a long-term study with Los Alamos National Laboratory, which found that weapon pit performance will not sharply degrade from the aging effects on plutonium. The conclusion was based on a wide range of nonnuclear experiments, detailed simulations, theoretical advances, and thorough analyses of the results of past nuclear tests. The study was a superb scientific effort. The continuing success of stockpile stewardship enabled NNSA in 2006 to lay out Complex 2030, a vision for a transformed nuclear weapons complex that is more responsive, cost efficient, and highly secure. One of the ways our Laboratory will help lead this transformation is through the design and development of reliable replacement warheads (RRWs). Compared to current designs, these warheads would have enhanced performance margins and security features and would be less costly to manufacture and maintain in a smaller, modernized production complex. In early 2007, NNSA selected Lawrence Livermore and Sandia National Laboratories-California to develop "RRW-1" for the U.S. Navy. Design efforts for the RRW, the plutonium aging work, and many other stockpile stewardship accomplishments rely on computer simulations performed on NNSA's Advanced Simulation and Computing (ASC) Program supercomputers at Livermore. ASC Purple and BlueGene/L, the world's fastest computer, together provide nearly a half petaflop (500 trillion operations per second) of computer power for use by the three NNSA national laboratories. Livermore-led teams were awarded the Gordon Bell Prize for Peak Performance in both 2005 and 2006. The winning simulations, run on BlueGene/L, investigated the properties of materials at the length and time scales of atomic interactions. The computing power that makes possible such detailed simulations provides unprecedented opportunities for scientific discovery. Laboratory scientists are meeting the extraordinary challenge of creating experimental capabilities to match the resolution of supercomputer simulations. Working with a wide range of collaborators, we are developing experimental tools that gather better data at the nanometer and subnanosecond scales. Applications range from imaging biomolecules to studying matter at extreme conditions of pressure and temperature. The premier high-energy-density experimental physics facility in the world will be the National Ignition Facility (NIF) when construction is completed in 2009. We are leading the national effort to perform the first fusion ignition experiments using NIF's 192-beam laser and prepare to explore some of the remaining important issues in weapons physics. With scientific colleagues from throughout the nation, we are also designing revolutionary experiments on NIF to advance the fields of astrophysics, planetary physics, and materials science. Mission-directed, multidisciplinary science and technology at Livermore is also focused on reducing the threat posed by the proliferation of weapons of mass destruction as well as their acquisition and use by terrorists. The Laboratory helps this important national effort by providing its unique expertise, integration analyses, and operational support to the Department of Homeland Security. For this vital facet of the Laboratory's national security mission, we are developing advanced technologies, such as a pocket-size explosives detector and an airborne persistent surveillance system, both of which earned R&D 100 Awards. Altogether, Livermore won seven R&D 100 Awards in 2006, the most for any organization. Emerging threats to national and global security go beyond defense and homeland security. Livermore pursues major scientific and technical advances to meet the need for a clean environment; clean, abundant energy; better water management; and improved human health. Our annual report highlights the link between human activities and the warming of tropical oceans, as well as techniques for imaging biological molecules and detecting bone cancer in its earliest stages. In addition, we showcase many scientific discoveries: distant planets, the composition of comets, a new superheavy element.

  17. Internet: road to heaven or hell for the clinical laboratory?

    PubMed

    Chou, D

    1996-05-01

    The Internet started as a research project by the Department of Defense Advanced Research Projects Agency for networking computers. Ironically, the networking project now predominantly supports human rather than computer communications. The Internet's growth, estimated at 20% per month, has been fueled by the commercial and public perception that it will become an important medium for merchandising, marketing, and advertising. For the clinical laboratory, the Internet provides high-speed communications through e-mail and allows the retrieval of important information held in repositories. All this capability comes at a price, including the need to manage a complex technology and the risk of intrusions on patient privacy.

  18. Technology transfer of military space microprocessor developments

    NASA Astrophysics Data System (ADS)

    Gorden, C.; King, D.; Byington, L.; Lanza, D.

    1999-01-01

    Over the past 13 years the Air Force Research Laboratory (AFRL) has led the development of microprocessors and computers for USAF space and strategic missile applications. As a result of these Air Force development programs, advanced computer technology is available for use by civil and commercial space customers as well. The Generic VHSIC Spaceborne Computer (GVSC) program began in 1985 at AFRL to fulfill a deficiency in the availability of space-qualified data and control processors. GVSC developed a radiation hardened multi-chip version of the 16-bit, Mil-Std 1750A microprocessor. The follow-on to GVSC, the Advanced Spaceborne Computer Module (ASCM) program, was initiated by AFRL to establish two industrial sources for complete, radiation-hardened 16-bit and 32-bit computers and microelectronic components. Development of the Control Processor Module (CPM), the first of two ASCM contract phases, concluded in 1994 with the availability of two sources for space-qualified, 16-bit Mil-Std-1750A computers, cards, multi-chip modules, and integrated circuits. The second phase of the program, the Advanced Technology Insertion Module (ATIM), was completed in December 1997. ATIM developed two single board computers based on 32-bit reduced instruction set computer (RISC) processors. GVSC, CPM, and ATIM technologies are flying or baselined into the majority of today's DoD, NASA, and commercial satellite systems.

  19. A Multi-Time Scale Morphable Software Milieu for Polymorphous Computing Architectures (PCA) - Composable, Scalable Systems

    DTIC Science & Technology

    2004-10-01

    Monitoring agency: Defense Advanced Research Projects Agency; AFRL/IFTC, 3701 North Fairfax Drive… [The remainder of this record is report-form residue and a citation: "Scalable Parallel Libraries for Large-Scale Concurrent Applications," Technical Report UCRL-JC-109251, Lawrence Livermore National Laboratory.]

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taylor, Valerie

    Given the significant impact of computing on society, it is important that all cultures, especially underrepresented cultures, are fully engaged in the field of computing to ensure that everyone benefits from the advances in computing. This proposal is focused on the field of high performance computing. The lack of cultural diversity in computing, in particular high performance computing, is especially evident with respect to the following ethnic groups: African Americans, Hispanics, and Native Americans, as well as People with Disabilities. The goal of this proposal is to organize and coordinate a National Laboratory Career Development Workshop focused on underrepresented cultures (ethnic cultures and disability cultures) in high performance computing. It is expected that the proposed workshop will increase the engagement of underrepresented cultures in HPC through increased exposure to the excellent work at the national laboratories. The National Laboratory Workshops are focused on the recruitment of senior graduate students and the retention of junior lab staff through the various panels and discussions at the workshop. Further, the workshop includes a community-building component that extends beyond the workshop itself. The workshop was held at the Lawrence Livermore National Laboratory campus in Livermore, CA, from June 14-15, 2012. The grant provided funding for 25 participants from underrepresented groups. The workshop also included another 25 local participants from the summer programs at Lawrence Livermore National Laboratory. Below are some key results from the assessment of the workshop: 86% of the participants indicated strongly agree or agree to the statement "I am more likely to consider/continue a career at a national laboratory as a result of participating in this workshop." 77% indicated strongly agree or agree to the statement "I plan to pursue a summer internship at a national laboratory." 100% of the participants indicated strongly agree or agree to the statement "The CMD-IT NLPDEV workshop was a valuable experience."

  1. Oak Ridge National Laboratory Core Competencies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roberto, J.B.; Anderson, T.D.; Berven, B.A.

    1994-12-01

    A core competency is a distinguishing integration of capabilities that enables an organization to deliver mission results. Core competencies represent the collective learning of an organization and provide the capacity to perform present and future missions. Core competencies are distinguishing characteristics that offer comparative advantage and are difficult to reproduce. They exhibit customer focus, mission relevance, and vertical integration from research through applications. They are demonstrable by metrics such as level of investment, uniqueness of facilities and expertise, and national impact. The Oak Ridge National Laboratory (ORNL) has identified four core competencies that satisfy the above criteria. Each core competency represents an annual investment of at least $100M and is characterized by an integration of Laboratory technical foundations in physical, chemical, and materials sciences; biological, environmental, and social sciences; engineering sciences; and computational sciences and informatics. The ability to integrate broad technical foundations to develop and sustain core competencies in support of national R&D goals is a distinguishing strength of the national laboratories. The ORNL core competencies are: Energy Production and End-Use Technologies; Biological and Environmental Sciences and Technology; Advanced Materials Synthesis, Processing, and Characterization; and Neutron-Based Science and Technology. The distinguishing characteristics of each ORNL core competency are described. In addition, written material is provided for two emerging competencies: Manufacturing Technologies and Computational Science and Advanced Computing. Distinguishing institutional competencies in the Development and Operation of National Research Facilities, R&D Integration and Partnerships, Technology Transfer, and Science Education are also described. Finally, financial data for the ORNL core competencies are summarized in the appendices.

  2. MIT Laboratory for Computer Science Progress Report 26

    DTIC Science & Technology

    1989-06-01

    …contemporary and prospective applications from business to sensory computing. In Systems, Languages, and Networks, our objective is to provide the… [The remainder of this record is staff-listing residue from the Advanced Network Architecture and Clinical Decision Making groups.]

  3. Computational Studies of Pyrolysis and Upgrading of Bio-oils: Virtual Special Issue

    DOE PAGES

    Xiong, Qingang; Robichaud, David J.

    2017-03-23

    As research activities continue, our understanding of biomass pyrolysis has been significantly elevated, and we arranged this Virtual Special Issue (VSI) in ACS Sustainable Chemistry & Engineering to report recent progress in computational and experimental studies of biomass pyrolysis. Beyond highlighting the advancements of the five national laboratories, prestigious researchers in the field of biomass pyrolysis were invited to report their most recent activities.

  4. Construction of Blaze at the University of Illinois at Chicago: A Shared, High-Performance, Visual Computer for Next-Generation Cyberinfrastructure-Accelerated Scientific, Engineering, Medical and Public Policy Research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Maxine D.; Leigh, Jason

    2014-02-17

    The Blaze high-performance visual computing system serves the high-performance computing research and education needs of the University of Illinois at Chicago (UIC). Blaze consists of a state-of-the-art, networked computer cluster and an ultra-high-resolution visualization system called CAVE2(TM) that is currently not available anywhere else in Illinois. This system is connected via a high-speed 100-Gigabit network to the State of Illinois' I-WIRE optical network, as well as to national and international high-speed networks, such as Internet2 and the Global Lambda Integrated Facility. This enables Blaze to serve as an on-ramp to national cyberinfrastructure, such as the National Science Foundation's Blue Waters petascale computer at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign and the Department of Energy's Argonne Leadership Computing Facility (ALCF) at Argonne National Laboratory. DOE award # DE-SC005067, leveraged with NSF award #CNS-0959053 for "Development of the Next-Generation CAVE Virtual Environment (NG-CAVE)," enabled us to create a first-of-its-kind high-performance visual computing system. The UIC Electronic Visualization Laboratory (EVL) worked with two U.S. companies to advance their commercial products and maintain U.S. leadership in the global information technology economy. New applications enabled by the CAVE2/Blaze visual computing system are advancing scientific research and education in the U.S. and globally and helping to train the next-generation workforce.

  5. Systems Engineering Building Advances Power Grid Research

    ScienceCinema

    Virden, Jud; Huang, Henry; Skare, Paul; Dagle, Jeff; Imhoff, Carl; Stoustrup, Jakob; Melton, Ron; Stiles, Dennis; Pratt, Rob

    2018-01-16

    Researchers and industry are now better equipped to tackle the nation's most pressing energy challenges through PNNL's new Systems Engineering Building, including challenges in grid modernization, buildings efficiency, and renewable energy integration. This lab links real-time grid data, software platforms, specialized laboratories, and advanced computing resources for the design and demonstration of new tools to modernize the grid and increase buildings energy efficiency.

  6. Virtualization Technologies in Information Systems Education

    ERIC Educational Resources Information Center

    Lunsford, Dale L.

    2009-01-01

    Information systems educators must balance the need to protect the stability, availability, and security of computer laboratories with the learning objectives of various courses. In advanced courses where students need to install, configure, and otherwise manipulate application and operating system settings, this is especially problematic as these…

  7. Clinical laboratory technician to clinical laboratory scientist articulation and distance learning.

    PubMed

    Crowley, J R; Laurich, G A; Mobley, R C; Arnette, A H; Shaikh, A H; Martin, S M

    1999-01-01

    Laboratory workers and educators alike are challenged to support access to education that is current and provides opportunities for career advancement in the workplace. The clinical laboratory science (CLS) program at the Medical College of Georgia in Augusta developed a clinical laboratory technician (CLT) to CLS articulation option, expanded it through distance learning, and integrated computer-based learning technology into the educational process over a four-year period to address technician needs for access to education. Both positive and negative outcomes were realized through these efforts. Twenty-seven students entered the pilot articulation program, graduated, and took a CLS certification examination. Measured in terms of CLS certification, promotions, pay raises, and career advancement, the program described was a success. However, major problems were encountered related to the use of unfamiliar communication technology; administration of the program at distance sites; communication between educational institutions, students, and employers; and competition with CLT programs for internship sites. These problems must be addressed in future efforts to provide a successful distance learning program. Effective methods for meeting educational needs and career ladder expectations of CLTs and their employers are important to the overall quality and appeal of the profession. Educational technology that includes computer-aided instruction, multimedia, and telecommunications can provide powerful tools for education in general and CLT articulation in particular. Careful preparation and vigilant attention to reliable delivery methods as well as students' progress and outcomes are critical for an efficient, economically feasible, and educationally sound program.

  8. TOPICAL REVIEW: Advances and challenges in computational plasma science

    NASA Astrophysics Data System (ADS)

    Tang, W. M.; Chan, V. S.

    2005-02-01

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behaviour. Recent advances in simulations of magnetically confined plasmas are reviewed in this paper, with illustrative examples, chosen from associated research areas such as microturbulence, magnetohydrodynamics and other topics. Progress has been stimulated, in particular, by the exponential growth of computer speed along with significant improvements in computer technology. The advances in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics have produced increasingly good agreement between experimental observations and computational modelling. This was enabled by two key factors: (a) innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales and (b) access to powerful new computational resources. Excellent progress has been made in developing codes for which computer run-time and problem-size scale well with the number of processors on massively parallel processors (MPPs). Examples include the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPPs to produce three-dimensional, general geometry, nonlinear particle simulations that have accelerated advances in understanding the nature of turbulence self-regulation by zonal flows. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In looking towards the future, the current results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. This should produce the scientific excitement which will help to (a) stimulate enhanced cross-cutting collaborations with other fields and (b) attract the bright young talent needed for the future health of the field of plasma science.

  9. Advances and challenges in computational plasma science

    NASA Astrophysics Data System (ADS)

    Tang, W. M.

    2005-02-01

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behaviour. Recent advances in simulations of magnetically confined plasmas are reviewed in this paper, with illustrative examples, chosen from associated research areas such as microturbulence, magnetohydrodynamics and other topics. Progress has been stimulated, in particular, by the exponential growth of computer speed along with significant improvements in computer technology. The advances in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics have produced increasingly good agreement between experimental observations and computational modelling. This was enabled by two key factors: (a) innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales and (b) access to powerful new computational resources. Excellent progress has been made in developing codes for which computer run-time and problem-size scale well with the number of processors on massively parallel processors (MPPs). Examples include the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPPs to produce three-dimensional, general geometry, nonlinear particle simulations that have accelerated advances in understanding the nature of turbulence self-regulation by zonal flows. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In looking towards the future, the current results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. This should produce the scientific excitement which will help to (a) stimulate enhanced cross-cutting collaborations with other fields and (b) attract the bright young talent needed for the future health of the field of plasma science.

  10. Department of Defense In-House RDT and E Activities: Management Analysis Report for Fiscal Year 1993

    DTIC Science & Technology

    1994-11-01

    A worldwide unique lab because it houses a high-speed modeling and simulation system, a prototype...E Division, San Diego, CA: High Performance Computing Laboratory providing a wide range of advanced computer systems for the scientific investigation...Machines CM-200 and a 256-node Thinking Machines CM-5. The CM-5 is in a very large memory, high-performance (32 Gbytes, >40 GFlop) configuration,

  11. Laboratory for Computer Science Progress Report 21, July 1983-June 1984.

    DTIC Science & Technology

    1984-06-01

    Systems 269 4. Distributed Consensus 270 5. Election of a Leader in a Distributed Ring of Processors 273 6. Distributed Network Algorithms 274 7. Diagnosis...multiprocessor systems. This facility, funded by the newly formed Strategic Computing Program of the Defense Advanced Research Projects Agency, will enable...Academic Staff P. Szolovits, Group Leader R. Patil Collaborating Investigators M. Criscitiello, M.D., Tufts-New England Medical Center Hospital R

  12. Hot Corrosion Test Facility at the NASA Lewis Special Projects Laboratory

    NASA Technical Reports Server (NTRS)

    Robinson, Raymond C.; Cuy, Michael D.

    1994-01-01

    The Hot Corrosion Test Facility (HCTF) at the NASA Lewis Special Projects Laboratory (SPL) is a high-velocity, pressurized burner rig currently used to evaluate the environmental durability of advanced ceramic materials such as SiC and Si3N4. The HCTF uses laboratory service air which is preheated, mixed with jet fuel, and ignited to simulate the conditions of a gas turbine engine. Air, fuel, and water systems are computer-controlled to maintain test conditions which include maximum air flows of 250 kg/hr (550 lbm/hr), pressures of 100-600 kPa (1-6 atm), and gas temperatures exceeding 1500 C (2732 F). The HCTF provides a relatively inexpensive, yet sophisticated means for researchers to study the high-temperature oxidation of advanced materials, and the injection of a salt solution provides the added capability of conducting hot corrosion studies.

  13. Dakota Uncertainty Quantification Methods Applied to the CFD code Nek5000

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Delchini, Marc-Olivier; Popov, Emilian L.; Pointer, William David

    This report presents the state of advancement of a Nuclear Energy Advanced Modeling and Simulation (NEAMS) project to characterize the uncertainty of the computational fluid dynamics (CFD) code Nek5000, using the Dakota package, for flows encountered in the nuclear engineering industry. Nek5000 is a high-order spectral element CFD code developed at Argonne National Laboratory for high-resolution spectral-filtered large eddy simulations (LESs) and unsteady Reynolds-averaged Navier-Stokes (URANS) simulations.
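
    As a rough sketch of what a Dakota-driven uncertainty study automates around a black-box solver, the Python below propagates two uncertain inputs through a stand-in model using Latin-hypercube-style sampling and reports summary statistics of the response. The simulation function, input names, and ranges are invented for illustration; in the actual workflow each sample would launch a Nek5000 run.

        # Sampling-based uncertainty propagation around a black-box "solver".
        import numpy as np

        rng = np.random.default_rng(7)
        nsamples = 200

        def lhs(n):
            """One-dimensional Latin hypercube draw on [0, 1)."""
            return (rng.permutation(n) + rng.random(n)) / n

        def simulation(inlet_velocity, viscosity):
            """Stand-in for one forward CFD run returning a scalar response."""
            return inlet_velocity**2 / viscosity

        vel = 1.0 + 0.2 * lhs(nsamples)                # uniform on [1.0, 1.2], illustrative
        visc = 10.0 ** (-4.0 + lhs(nsamples))          # log-uniform on [1e-4, 1e-3]
        q = np.array([simulation(u, nu) for u, nu in zip(vel, visc)])

        print(f"mean = {q.mean():.4g}, std = {q.std(ddof=1):.4g}")
        print(f"95% interval = ({np.quantile(q, 0.025):.4g}, {np.quantile(q, 0.975):.4g})")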

  14. Advanced Technology System Scheduling Governance Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ang, Jim; Carnes, Brian; Hoang, Thuc

    In the fall of 2005, the Advanced Simulation and Computing (ASC) Program appointed a team to formulate a governance model for allocating resources and scheduling the stockpile stewardship workload on ASC capability systems. This update to the original document takes into account the new technical challenges and roles for advanced technology (AT) systems and the new ASC Program workload categories that must be supported. The goal of this updated model is to effectively allocate and schedule AT computing resources among all three National Nuclear Security Administration (NNSA) laboratories for weapons deliverables that merit priority on this class of resource. The process outlined below describes how proposed work can be evaluated and approved for resource allocations while preserving high effective utilization of the systems. This approach will provide the broadest possible benefit to the Stockpile Stewardship Program (SSP).

  15. Advanced computations in plasma physics

    NASA Astrophysics Data System (ADS)

    Tang, W. M.

    2002-05-01

    Scientific simulation in tandem with theory and experiment is an essential tool for understanding complex plasma behavior. In this paper we review recent progress and future directions for advanced simulations in magnetically confined plasmas with illustrative examples chosen from magnetic confinement research areas such as microturbulence, magnetohydrodynamics, magnetic reconnection, and others. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales together with access to powerful new computational resources. In particular, the fusion energy science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPP's). A good example is the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPP's to produce three-dimensional, general geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of turbulence self-regulation by zonal flows. It should be emphasized that these calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. The associated scientific excitement should serve to stimulate improved cross-cutting collaborations with other fields and also to help attract bright young talent to plasma science.

  16. Computer Plotting Data Points in the Engine Research Building

    NASA Image and Video Library

    1956-09-21

    A female computer plotting compressor data in the Engine Research Building at the NACA’s Lewis Flight Propulsion Laboratory. The Computing Section was introduced during World War II to relieve short-handed research engineers of some of the tedious data-taking work. The computers made the initial computations and plotted the data graphically. The researcher then analyzed the data and either summarized the findings in a report or made modifications or ran the test again. With the introduction of mechanical computer systems in the 1950s the female computers learned how to encode the punch cards. As the data processing capabilities increased, fewer female computers were needed. Many left on their own to start families, while others earned mathematical degrees and moved into advanced positions.

  17. CFD Based Computations of Flexible Helicopter Blades for Stability Analysis

    NASA Technical Reports Server (NTRS)

    Guruswamy, Guru P.

    2011-01-01

    As a collaborative effort among government aerospace research laboratories, an advanced version of a widely used computational fluid dynamics code, OVERFLOW, was recently released. This latest version includes additions to model flexible rotating multiple blades. In this paper, the OVERFLOW code is applied to improve the accuracy of airload computations from the linear lifting-line theory that uses displacements from a beam model. Data transfers required at every revolution are managed through a Unix-based script that runs jobs on large super-cluster computers. Results are demonstrated for the 4-bladed UH-60A helicopter. Deviations of computed data from flight data are evaluated. Fourier analysis post-processing suitable for aeroelastic stability computations is performed.
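
    The Fourier post-processing step can be illustrated in a few lines of Python: extract the per-revolution harmonic content of a periodic blade-load time history over one rotor revolution. The signal below is synthetic; in the paper the inputs would be the computed airloads, and the harmonic amplitudes and phases would feed the aeroelastic stability analysis.

        # Per-revolution harmonic analysis of a periodic blade-load signal.
        import numpy as np

        nsteps_per_rev = 360
        psi = np.linspace(0.0, 2 * np.pi, nsteps_per_rev, endpoint=False)  # azimuth
        # Synthetic normal-force history: steady part + 1/rev + 4/rev content.
        load = 1000 + 200 * np.cos(psi - 0.3) + 50 * np.cos(4 * psi + 1.1)

        c = np.fft.rfft(load) / nsteps_per_rev
        for n in range(6):
            amp = (1 if n == 0 else 2) * np.abs(c[n])   # one-sided amplitude
            print(f"{n}/rev: amplitude {amp:8.2f}, phase {np.angle(c[n]):+.2f} rad")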

  18. Status of the Short-Pulse X-ray Project at the Advanced Photon Source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nassiri, A; Berenc, T G; Borland, M

    2012-07-01

    The Advanced Photon Source Upgrade (APS-U) Project at Argonne will include generation of short-pulse x-rays based on Zholents' deflecting cavity scheme. We have chosen superconducting (SC) cavities in order to have a continuous train of crabbed bunches and flexibility of operating modes. In collaboration with Jefferson Laboratory, we are prototyping and testing a number of single-cell deflecting cavities and associated auxiliary systems with promising initial results. In collaboration with Lawrence Berkeley National Laboratory, we are working to develop the state-of-the-art timing, synchronization, and differential rf phase stability systems that are required for SPX. A collaboration with the Advanced Computations Department at the Stanford Linear Accelerator Center is looking into simulations of complex, multi-cavity geometries with lower- and higher-order-mode waveguide dampers using ACE3P. This contribution provides the current R&D status of the SPX project.

  19. Research in mobile robotics at ORNL/CESAR (Oak Ridge National Laboratory/Center for Engineering Systems Advanced Research)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mann, R.C.; Weisbin, C.R.; Pin, F.G.

    1989-01-01

    This paper reviews ongoing and planned research with mobile autonomous robots at the Oak Ridge National Laboratory (ORNL), Center for Engineering Systems Advanced Research (CESAR). Specifically, we report on results obtained with the robot HERMIES-IIB in navigation, intelligent sensing, learning, and on-board parallel computing in support of these functions. We briefly summarize an experiment with HERMIES-IIB that demonstrates the capability of smooth transitions between robot autonomy and tele-operation. This experiment results from collaboration among teams at the Universities of Florida, Michigan, Tennessee, and Texas, and ORNL in a program targeted at robotics for advanced nuclear power stations. We conclude by summarizing ongoing R&D with our new mobile robot HERMIES-III, which is equipped with a seven-degree-of-freedom research manipulator arm. 12 refs., 4 figs.

  20. Serving the Nation for Fifty Years: 1952 - 2002 Lawrence Livermore National Laboratory [LLNL], Fifty Years of Accomplishments

    DOE R&D Accomplishments Database

    2002-01-01

    For 50 years, Lawrence Livermore National Laboratory has been making history and making a difference. The outstanding efforts by a dedicated work force have led to many remarkable accomplishments. Creative individuals and interdisciplinary teams at the Laboratory have sought breakthrough advances to strengthen national security and to help meet other enduring national needs. The Laboratory's rich history includes many interwoven stories -- from the first nuclear test failure to accomplishments meeting today's challenges. Many stories are tied to Livermore's national security mission, which has evolved to include ensuring the safety, security, and reliability of the nation's nuclear weapons without conducting nuclear tests and preventing the proliferation and use of weapons of mass destruction. Throughout its history and in its wide range of research activities, Livermore has achieved breakthroughs in applied and basic science, remarkable feats of engineering, and extraordinary advances in experimental and computational capabilities. From the many stories to tell, one has been selected for each year of the Laboratory's history. Together, these stories give a sense of the Laboratory -- its lasting focus on important missions, dedication to scientific and technical excellence, and drive to make the world more secure and a better place to live.

  1. Formal versus Grass-Roots Training: Women, Work, and Computers.

    ERIC Educational Resources Information Center

    Brunet, Jean; Proulx, Serge

    1989-01-01

    Examines programs in Montreal, Canada, that offer microcomputing training--traditional private courses as well as an experimental, neighborhood-oriented "popular laboratory." Finds that both are used by men to advance their careers but that women use them to catch up and survive economically in a transformed workplace. (SR)

  2. Distributed Drug Discovery: Advancing Chemical Education through Contextualized Combinatorial Solid-Phase Organic Laboratories

    ERIC Educational Resources Information Center

    Scott, William L.; Denton, Ryan E.; Marrs, Kathleen A.; Durrant, Jacob D.; Samaritoni, J. Geno; Abraham, Milata M.; Brown, Stephen P.; Carnahan, Jon M.; Fischer, Lindsey G.; Glos, Courtney E.; Sempsrott, Peter J.; O'Donnell, Martin J.

    2015-01-01

    The Distributed Drug Discovery (D3) program trains students in three drug discovery disciplines (synthesis, computational analysis, and biological screening) while addressing the important challenge of discovering drug leads for neglected diseases. This article focuses on implementation of the synthesis component in the second-semester…

  3. Project Bank: Word Processing on Campus.

    ERIC Educational Resources Information Center

    Hlavin, Robert F.

    Project Bank was initiated at Triton College (Illinois) to increase student awareness of the merits of word processing as it affects their class work and related assignments; to make faculty aware of advances in word processing programs; and to increase the utilization of the college's computer laboratory. All fall 1985 incoming freshmen were…

  4. An Innovative Approach to Bridge a Skill Gap and Grow a Workforce Pipeline: The Computer System, Cluster, and Networking Summer Institute

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Connor, Carolyn Marie; Jacobson, Andree Lars; Bonnie, Amanda Marie

    Sustainable and effective computing infrastructure depends critically on the skills and expertise of domain scientists and of committed and well-trained advanced computing professionals. But, in its ongoing High Performance Computing (HPC) work, Los Alamos National Laboratory noted a persistent shortage of well-prepared applicants, particularly for entry-level cluster administration, file systems administration, and high speed networking positions. Further, based upon recruiting efforts and interactions with universities graduating students in related majors of interest (e.g., computer science (CS)), there has been a long-standing skillset gap, as focused training in HPC topics is typically lacking or absent in undergraduate and even many graduate programs. Given that the effective operation and use of HPC systems requires specialized and often advanced training, that there is a recognized HPC skillset gap, and that there is intense global competition for computing and computational science talent, there is a long-standing and critical need for innovative approaches to help bridge the gap and create a well-prepared, next generation HPC workforce. Our paper places this need in the context of the HPC work and workforce requirements at Los Alamos National Laboratory (LANL) and presents one such innovative program conceived to address the need, bridge the gap, and grow an HPC workforce pipeline at LANL. The Computer System, Cluster, and Networking Summer Institute (CSCNSI) completed its 10th year in 2016. The story of the CSCNSI and its evolution is detailed below with a description of the design of its Boot Camp, and a summary of its success and some key factors that have enabled that success.

  5. An Innovative Approach to Bridge a Skill Gap and Grow a Workforce Pipeline: The Computer System, Cluster, and Networking Summer Institute

    DOE PAGES

    Connor, Carolyn Marie; Jacobson, Andree Lars; Bonnie, Amanda Marie; ...

    2016-11-01

    Sustainable and effective computing infrastructure depends critically on the skills and expertise of domain scientists and of committed and well-trained advanced computing professionals. But, in its ongoing High Performance Computing (HPC) work, Los Alamos National Laboratory noted a persistent shortage of well-prepared applicants, particularly for entry-level cluster administration, file systems administration, and high speed networking positions. Further, based upon recruiting efforts and interactions with universities graduating students in related majors of interest (e.g., computer science (CS)), there has been a long-standing skillset gap, as focused training in HPC topics is typically lacking or absent in undergraduate and even many graduate programs. Given that the effective operation and use of HPC systems requires specialized and often advanced training, that there is a recognized HPC skillset gap, and that there is intense global competition for computing and computational science talent, there is a long-standing and critical need for innovative approaches to help bridge the gap and create a well-prepared, next generation HPC workforce. Our paper places this need in the context of the HPC work and workforce requirements at Los Alamos National Laboratory (LANL) and presents one such innovative program conceived to address the need, bridge the gap, and grow an HPC workforce pipeline at LANL. The Computer System, Cluster, and Networking Summer Institute (CSCNSI) completed its 10th year in 2016. The story of the CSCNSI and its evolution is detailed below with a description of the design of its Boot Camp, and a summary of its success and some key factors that have enabled that success.

  6. Argonne National Laboratory Annual Report of Laboratory Directed Research and Development program activities FY 2011.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Office of the Director

    As a national laboratory, Argonne concentrates on scientific and technological challenges that can only be addressed through a sustained, interdisciplinary focus at a national scale. Argonne's eight major initiatives, as enumerated in its strategic plan, are Hard X-ray Sciences, Leadership Computing, Materials and Molecular Design and Discovery, Energy Storage, Alternative Energy and Efficiency, Nuclear Energy, Biological and Environmental Systems, and National Security. The purposes of Argonne's Laboratory Directed Research and Development (LDRD) Program are to encourage the development of novel technical concepts, enhance the Laboratory's research and development (R and D) capabilities, and pursue its strategic goals. Projects are selected from proposals for creative and innovative R and D studies that require advance exploration before they are considered to be sufficiently developed to obtain support through normal programmatic channels. Among the aims of the projects supported by the LDRD Program are the following: establishment of engineering proof of principle, assessment of design feasibility for prospective facilities, development of instrumentation or computational methods or systems, and discoveries in fundamental science and exploratory development.

  7. Laboratory evolution of protein conformational dynamics.

    PubMed

    Campbell, Eleanor C; Correy, Galen J; Mabbitt, Peter D; Buckle, Ashley M; Tokuriki, Nobuhiko; Jackson, Colin J

    2017-11-08

    This review focuses on recent work that has begun to establish specific functional roles for protein conformational dynamics, specifically how the conformational landscapes that proteins can sample can evolve under laboratory-based evolutionary selection. We discuss recent technical advances in computational and biophysical chemistry, which have provided us with new ways to dissect evolutionary processes. Finally, we offer some perspectives on the emerging view of conformational dynamics and evolution, and the challenges that we face in rationally engineering conformational dynamics. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Advanced LabVIEW Labs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, Eric D.

    1999-06-17

    In the world of computer-based data acquisition and control, the graphical interface program LabVIEW from National Instruments is so ubiquitous that in many ways it has almost become the laboratory standard. To date, there have been approximately fifteen books concerning LabVIEW, but Professor Essick's treatise takes a completely different tack than all of the previous discussions. In the more standard treatments of the ways and wherefores of LabVIEW, such as LabVIEW Graphical Programming: Practical Applications in Instrumentation and Control by Gary W. Johnson (McGraw Hill, NY 1997), the emphasis has been on instructing the reader how to program LabVIEW to create a Virtual Instrument (VI) on the computer for interfacing to a particular instrument. LabVIEW is written in G, a graphical programming language developed by National Instruments. In the past the emphasis has been on training the experimenter to learn G. Without going into details here, G incorporates the usual loops, arithmetic expressions, etc., found in many programming languages, but in an icon (graphical) environment. The net result is that LabVIEW contains all of the standard methods needed for interfacing to instruments, data acquisition, data analysis, and graphics, and also a methodology to incorporate programs written in other languages into LabVIEW. Historically, according to Professor Essick, he developed a series of experiments for an upper-division laboratory course for computer-based instrumentation. His observation was that while many students had the necessary background in computer programming languages, there were students who had virtually no concept of writing a computer program, let alone a computer-based interfacing program. Thus began a concept for not only teaching computer-based instrumentation techniques, but also providing a method for the beginner to experience writing a computer program. Professor Essick saw LabVIEW as the perfect environment in which to teach computer-based research skills. With this goal in mind, he has succeeded admirably. Advanced LabVIEW Labs presents a series of chapters devoted to not only introducing the reader to LabVIEW, but also to the concepts necessary for writing a successful computer program. Each chapter is an assignment for the student, and the sequence is suitable for a ten-week course. The first topic introduces the while loop and waveform chart VIs. After learning how to launch LabVIEW, the student then learns how to use LabVIEW functions such as sine and cosine. The beauty of this and subsequent chapters is that the student is introduced immediately to computer-based instruction by learning how to display the results in graph form on the screen. At each point along the way, the student is introduced not only to another LabVIEW operation, but also to such subjects as spreadsheets for data storage, numerical integration, Fourier transformations, curve-fitting algorithms, etc. The last few chapters conclude with the purpose of the learning module: computer-based instrumentation. Computer-based laboratory projects such as analog-to-digital conversion and digitizing oscilloscopes are treated. Advanced LabVIEW Labs finishes with a treatment of GPIB interfacing, and finally the student is asked to create an operating VI for temperature control. This is an excellent text, not only as a treatise on LabVIEW but also as an introduction to computer programming logic.
All programmers who are struggling not only to learn how to interface computers to instruments, but also to understand top-down programming and other programming-language techniques, should add Advanced LabVIEW Labs to their computer library.

  9. Advanced LabVIEW Labs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, Eric D.

    1999-06-17

    In the world of computer-based data acquisition and control, the graphical interface program LabVIEW from National Instruments is so ubiquitous that in many ways it has almost become the laboratory standard. To date, there have been approximately fifteen books concerning LabVIEW, but Professor Essick's treatise takes a completely different tack than all of the previous discussions. In the more standard treatments of the ways and wherefores of LabVIEW, such as LabVIEW Graphical Programming: Practical Applications in Instrumentation and Control by Gary W. Johnson (McGraw Hill, NY 1997), the emphasis has been on instructing the reader how to program LabVIEW to create a Virtual Instrument (VI) on the computer for interfacing to a particular instrument. LabVIEW is written in "G", a graphical programming language developed by National Instruments. In the past the emphasis has been on training the experimenter to learn "G". Without going into details here, "G" incorporates the usual loops, arithmetic expressions, etc., found in many programming languages, but in an icon (graphical) environment. The net result is that LabVIEW contains all of the standard methods needed for interfacing to instruments, data acquisition, data analysis, and graphics, and also a methodology to incorporate programs written in other languages into LabVIEW. Historically, according to Professor Essick, he developed a series of experiments for an upper-division laboratory course for computer-based instrumentation. His observation was that while many students had the necessary background in computer programming languages, there were students who had virtually no concept of writing a computer program, let alone a computer-based interfacing program. Thus began a concept for not only teaching computer-based instrumentation techniques, but also providing a method for the beginner to experience writing a computer program. Professor Essick saw LabVIEW as the "perfect environment in which to teach computer-based research skills." With this goal in mind, he has succeeded admirably. Advanced LabVIEW Labs presents a series of chapters devoted to not only introducing the reader to LabVIEW, but also to the concepts necessary for writing a successful computer program. Each chapter is an assignment for the student, and the sequence is suitable for a ten-week course. The first topic introduces the while loop and waveform chart VIs. After learning how to launch LabVIEW, the student then learns how to use LabVIEW functions such as sine and cosine. The beauty of this and subsequent chapters is that the student is introduced immediately to computer-based instruction by learning how to display the results in graph form on the screen. At each point along the way, the student is introduced not only to another LabVIEW operation, but also to such subjects as spreadsheets for data storage, numerical integration, Fourier transformations, curve-fitting algorithms, etc. The last few chapters conclude with the purpose of the learning module: computer-based instrumentation. Computer-based laboratory projects such as analog-to-digital conversion and digitizing oscilloscopes are treated. Advanced LabVIEW Labs finishes with a treatment of GPIB interfacing, and finally the student is asked to create an operating VI for temperature control. This is an excellent text, not only as a treatise on LabVIEW but also as an introduction to computer programming logic.
All programmers who are struggling not only to learn how to interface computers to instruments, but also to understand top-down programming and other programming-language techniques, should add Advanced LabVIEW Labs to their computer library.

  10. Two-dimensional heat flow apparatus

    NASA Astrophysics Data System (ADS)

    McDougall, Patrick; Ayars, Eric

    2014-06-01

    We have created an apparatus to quantitatively measure two-dimensional heat flow in a metal plate using a grid of temperature sensors read by a microcontroller. Real-time temperature data are collected from the microcontroller by a computer for comparison with a computational model of the heat equation. The microcontroller-based sensor array allows previously unavailable levels of precision at very low cost, and the combination of measurement and modeling makes for an excellent apparatus for the advanced undergraduate laboratory course.
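
    A minimal sketch of the kind of computational model the sensor data are compared against is an explicit finite-difference integration of the two-dimensional heat equation dT/dt = alpha*(d2T/dx2 + d2T/dy2) on a plate. The grid size, diffusivity, and boundary temperatures below are illustrative values, not the apparatus's actual parameters.

        # Explicit (forward-Euler) finite-difference model of 2D heat flow.
        import numpy as np

        nx = ny = 32                              # sensor-grid-sized mesh, illustrative
        alpha = 1.0e-4                            # thermal diffusivity (m^2/s)
        dx = 0.01                                 # grid spacing (m)
        dt = 0.2 * dx**2 / alpha                  # below the stability limit dx^2/(4*alpha)

        T = np.full((ny, nx), 20.0)               # plate initially at 20 C
        T[0, :] = 100.0                           # one edge held at 100 C; others at 20 C

        for step in range(2000):
            lap = (T[:-2, 1:-1] + T[2:, 1:-1] + T[1:-1, :-2] + T[1:-1, 2:]
                   - 4.0 * T[1:-1, 1:-1]) / dx**2
            T[1:-1, 1:-1] += alpha * dt * lap     # update interior points only

        print(f"center temperature after {2000 * dt:.0f} s: {T[ny // 2, nx // 2]:.2f} C")

    In a comparison like the one described, the simulated field T would be sampled at the sensor locations and plotted against the measured temperatures in real time.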

  11. Proactive human-computer collaboration for information discovery

    NASA Astrophysics Data System (ADS)

    DiBona, Phil; Shilliday, Andrew; Barry, Kevin

    2016-05-01

    Lockheed Martin Advanced Technology Laboratories (LM ATL) is researching methods, representations, and processes for human/autonomy collaboration to scale analysis and hypothesis substantiation for intelligence analysts. This research establishes a machine-readable hypothesis representation that is commonsensical to the human analyst. The representation unifies context between the human and computer, enabling autonomy, in the form of analytic software, to support the analyst through proactively acquiring, assessing, and organizing high-value information that is needed to inform and substantiate hypotheses.

  12. Cancer's Big Data Problem

    DOE PAGES

    Breaux, Justin H. S.

    2017-03-15

    The US Department of Energy (DOE) has partnered with the National Cancer Institute (NCI) to use DOE supercomputers to aid in the fight against cancer by building sophisticated models based on data available at the population, patient, and molecular levels. Here, through a three-year pilot project called the Joint Design of Advanced Computing Solutions for Cancer (JDACSC), four participating national laboratories--Argonne, Lawrence Livermore, Los Alamos, and Oak Ridge--will focus on three problems singled out by the NCI as the biggest bottlenecks to advancing cancer research.

  13. Cancer's Big Data Problem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Breaux, Justin H. S.

    The US Department of Energy (DOE) has partnered with the National Cancer Institute (NCI) to use DOE supercomputers to aid in the fight against cancer by building sophisticated models based on data available at the population, patient, and molecular levels. Here, through a three-year pilot project called the Joint Design of Advanced Computing Solutions for Cancer (JDACSC), four participating national laboratories--Argonne, Lawrence Livermore, Los Alamos, and Oak Ridge--will focus on three problems singled out by the NCI as the biggest bottlenecks to advancing cancer research.

  14. Scientific Discovery through Advanced Computing in Plasma Science

    NASA Astrophysics Data System (ADS)

    Tang, William

    2005-03-01

    Advanced computing is generally recognized to be an increasingly vital tool for accelerating progress in scientific research during the 21st Century. For example, the Department of Energy's "Scientific Discovery through Advanced Computing" (SciDAC) Program was motivated in large measure by the fact that formidable scientific challenges in its research portfolio could best be addressed by utilizing the combination of the rapid advances in super-computing technology together with the emergence of effective new algorithms and computational methodologies. The imperative is to translate such progress into corresponding increases in the performance of the scientific codes used to model complex physical systems such as those encountered in high temperature plasma research. If properly validated against experimental measurements and analytic benchmarks, these codes can provide reliable predictive capability for the behavior of a broad range of complex natural and engineered systems. This talk reviews recent progress and future directions for advanced simulations with some illustrative examples taken from the plasma science applications area. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by the combination of access to powerful new computational resources together with innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning a huge range in time and space scales. In particular, the plasma science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPP's). A good example is the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPP's to produce three-dimensional, general geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of plasma turbulence in magnetically-confined high temperature plasmas. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. The associated scientific excitement should serve to stimulate improved cross-cutting collaborations with other fields and also to help attract bright young talent to the computational science area.

  15. The Computer as a Tool for Learning

    PubMed Central

    Starkweather, John A.

    1986-01-01

    Experimenters from the beginning recognized the advantages computers might offer in medical education. Several medical schools have gained experience in such programs in automated instruction. Television images and graphic display combined with computer control and user interaction are effective for teaching problem solving. The National Board of Medical Examiners has developed patient-case simulation for examining clinical skills, and the National Library of Medicine has experimented with combining media. Advances from the field of artificial intelligence and the availability of increasingly powerful microcomputers at lower cost will aid further development. Computers will likely affect existing educational methods, adding new capabilities to laboratory exercises, to self-assessment and to continuing education. PMID:3544511

  16. Advanced Computation in Plasma Physics

    NASA Astrophysics Data System (ADS)

    Tang, William

    2001-10-01

    Scientific simulation in tandem with theory and experiment is an essential tool for understanding complex plasma behavior. This talk will review recent progress and future directions for advanced simulations in magnetically-confined plasmas with illustrative examples chosen from areas such as microturbulence, magnetohydrodynamics, magnetic reconnection, and others. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales together with access to powerful new computational resources. In particular, the fusion energy science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPP's). A good example is the effective usage of the full power of multi-teraflop MPP's to produce 3-dimensional, general geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of turbulence self-regulation by zonal flows. It should be emphasized that these calculations, which typically utilized billions of particles for tens of thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. The associated scientific excitement should serve to stimulate improved cross-cutting collaborations with other fields and also to help attract bright young talent to plasma science.

  17. JAX Colony Management System (JCMS): an extensible colony and phenotype data management system.

    PubMed

    Donnelly, Chuck J; McFarland, Mike; Ames, Abigail; Sundberg, Beth; Springer, Dave; Blauth, Peter; Bult, Carol J

    2010-04-01

    The Jackson Laboratory Colony Management System (JCMS) is a software application for managing data and information related to research mouse colonies, associated biospecimens, and experimental protocols. JCMS runs directly on computers that run one of the PC Windows operating systems, but can be accessed via web browser interfaces from any computer running a Windows, Macintosh, or Linux operating system. JCMS can be configured for a single user or multiple users in small- to medium-size work groups. The target audience for JCMS includes laboratory technicians, animal colony managers, and principal investigators. The application provides operational support for colony management and experimental workflows, sample and data tracking through transaction-based data entry forms, and date-driven work reports. Flexible query forms allow researchers to retrieve database records based on user-defined criteria. Recent advances in handheld computers with integrated barcode readers, middleware technologies, web browsers, and wireless networks add to the utility of JCMS by allowing real-time access to the database from any networked computer.

  18. GeoBrain Computational Cyber-laboratory for Earth Science Studies

    NASA Astrophysics Data System (ADS)

    Deng, M.; di, L.

    2009-12-01

    Computational approaches (e.g., computer-based data visualization, analysis and modeling) are critical for conducting increasingly data-intensive Earth science (ES) studies to understand functions and changes of the Earth system. However, Earth scientists, educators, and students currently face two major barriers that prevent them from effectively using computational approaches in their learning, research and application activities. The two barriers are: 1) difficulties in finding, obtaining, and using multi-source ES data; and 2) lack of analytic functions and computing resources (e.g., analysis software, computing models, and high performance computing systems) to analyze the data. Taking advantage of recent advances in cyberinfrastructure, Web service, and geospatial interoperability technologies, GeoBrain, a project funded by NASA, has developed a prototype computational cyber-laboratory to effectively remove the two barriers. The cyber-laboratory makes ES data and computational resources at large organizations in distributed locations available to and easily usable by the Earth science community through 1) enabling seamless discovery, access and retrieval of distributed data, 2) federating and enhancing data discovery with a catalogue federation service and a semantically-augmented catalogue service, 3) customizing data access and retrieval at user request with interoperable, personalized, and on-demand data access and services, 4) automating or semi-automating multi-source geospatial data integration, 5) developing a large number of analytic functions as value-added, interoperable, and dynamically chainable geospatial Web services and deploying them in high-performance computing facilities, 6) enabling online geospatial process modeling and execution, and 7) building a user-friendly extensible web portal for users to access the cyber-laboratory resources. Users can interactively discover the needed data and perform on-demand data analysis and modeling through the web portal. The GeoBrain cyber-laboratory provides solutions to meet common needs of ES research and education, such as distributed data access and analysis services, easy access to and use of ES data, and enhanced geoprocessing and geospatial modeling capability. It greatly facilitates ES research, education, and applications. The development of the cyber-laboratory provides insights, lessons learned, and technology readiness to build more capable computing infrastructure for ES studies, which can meet the wide-ranging needs of current and future generations of scientists, researchers, educators, and students for their formal or informal educational training, research projects, career development, and lifelong learning.

  19. Systems Engineering and Integration (SE and I)

    NASA Technical Reports Server (NTRS)

    Chevers, ED; Haley, Sam

    1990-01-01

    The issue of technology advancement and future space transportation vehicles is addressed. The challenge is to develop systems which can be evolved and improved in small incremental steps, where each increment reduces present cost, improves reliability, or does neither but sets the stage for a second incremental upgrade that does. Future requirements are interface standards for commercial off-the-shelf products to aid in the development of integrated facilities; an enhanced automated code generation system tightly coupled to specification and design documentation; modeling tools that support data flow analysis; and shared project databases consisting of technical characteristics, cost information, measurement parameters, and reusable software programs. Topics addressed include: advanced avionics development strategy; risk analysis and management; tool quality management; low cost avionics; cost estimation and benefits; computer aided software engineering; computer systems and software safety; system testability; and advanced avionics laboratories and rapid prototyping. This presentation is represented by viewgraphs only.

  20. Drug product distribution systems and departmental operations.

    PubMed

    Hynniman, C E

    1991-10-01

    Technologies affecting institutional pharmacy practice and the operation of pharmacy departments are reviewed, future developments are outlined, and implications of these developments for pharmacy education are proposed. Computer technology, especially as applied to areas such as artificial intelligence, online information databases, electronic bulletin boards, hospital information systems, and point-of-care systems, will have a strong impact on pharmacy practice and management in the 1990s. Other areas in which growth is likely to be active include bar-code technology, robotics, and automated drug dispensing. The applications of these technologies are described, with particular attention placed on the effects of increased automation on the drug-dispensing function. Technological advances may effect marked reductions in dispensing and medication errors; questions concerning the cost-effectiveness of these new systems remain to be answered. These advances also create new opportunities for clinical involvement by pharmacists; however, a fundamental understanding of computer systems is essential. Current practitioners can benefit from attending seminars, participating in users' groups, and keeping current with the computer literature. Many students now acquire the needed skills in computer laboratories and in the classroom. Technological advances will offer the opportunity for pharmacists to expand their clinical role.

  1. Molecular Docking of Enzyme Inhibitors: A Computational Tool for Structure-Based Drug Design

    ERIC Educational Resources Information Center

    Rudnitskaya, Aleksandra; Torok, Bela; Torok, Marianna

    2010-01-01

    Molecular docking is a frequently used method in structure-based rational drug design. It is used for evaluating the complex formation of small ligands with large biomolecules, predicting the strength of the bonding forces and finding the best geometrical arrangements. The major goal of this advanced undergraduate biochemistry laboratory exercise…

  2. Modern Technology in Foreign Language Education: Applications and Projects. The ACTFL Foreign Language Education Series.

    ERIC Educational Resources Information Center

    Smith, Wm. Flint, Ed.

    This book, the second of two volumes devoted to instructional media in second language instruction, focuses on specific applications of advanced technology in the classroom. The first part, "Applications," contains seven chapters. They are: "The Language Laboratory in the Computer Age" (S. E. K. Otto); "Television…

  3. Combining Art and Science in "Arts and Sciences" Education

    ERIC Educational Resources Information Center

    Needle, Andrew; Corbo, Christopher; Wong, Denise; Greenfeder, Gary; Raths, Linda; Fulop, Zoltan

    2007-01-01

    Two of this article's authors--an art professor and a biology professor--shared a project for advanced biology, art, nursing, and computer science majors involving scientific research that used digital imaging of the brain of the zebrafish, a newly favored laboratory animal. These contemporary and innovative teaching and learning practices were a…

  4. Center for space microelectronics technology

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The 1992 Technical Report of the Jet Propulsion Laboratory Center for Space Microelectronics Technology summarizes the technical accomplishments, publications, presentations, and patents of the center during the past year. The report lists 187 publications, 253 presentations, and 111 new technology reports and patents in the areas of solid-state devices, photonics, advanced computing, and custom microcircuits.

  5. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan : ASC software quality engineering practices Version 3.0.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Turgeon, Jennifer L.; Minana, Molly A.; Hackney, Patricia

    2009-01-01

    The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in the US Department of Energy/National Nuclear Security Administration (DOE/NNSA) Quality Criteria, Revision 10 (QC-1) as 'conformance to customer requirements and expectations'. This quality plan defines the SNL ASC Program software quality engineering (SQE) practices and provides a mapping of these practices to the SNL Corporate Process Requirement (CPR) 001.3.6, 'Corporate Software Engineering Excellence'. This plan also identifies ASC management's and the software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals. This SNL ASC Software Quality Plan establishes the signatories' commitments to improving software products by applying cost-effective SQE practices. This plan enumerates the SQE practices that comprise the development of SNL ASC's software products and explains the project teams' opportunities for tailoring and implementing the practices.

  6. Apollo LM guidance computer software for the final lunar descent.

    NASA Technical Reports Server (NTRS)

    Eyles, D.

    1973-01-01

    In all manned lunar landings to date, the lunar module Commander has taken partial manual control of the spacecraft during the final stage of the descent, below roughly 500 ft altitude. This report describes programs developed at the Charles Stark Draper Laboratory, MIT, for use in the LM's guidance computer during the final descent. At this time computational demands on the on-board computer are at a maximum, and particularly close interaction with the crew is necessary. The emphasis is on the design of the computer software rather than on justification of the particular guidance algorithms employed. After the computer and the mission have been introduced, the current configuration of the final landing programs and an advanced version developed experimentally by the author are described.

  7. First-in-Man Computed Tomography-Guided Percutaneous Revascularization of Coronary Chronic Total Occlusion Using a Wearable Computer: Proof of Concept.

    PubMed

    Opolski, Maksymilian P; Debski, Artur; Borucki, Bartosz A; Szpak, Marcin; Staruch, Adam D; Kepka, Cezary; Witkowski, Adam

    2016-06-01

    We report a case of successful computed tomography-guided percutaneous revascularization of a chronically occluded right coronary artery using a wearable, hands-free computer with a head-mounted display worn by interventional cardiologists in the catheterization laboratory. The projection of 3-dimensional computed tomographic reconstructions onto the screen of virtual reality glass allowed the operators to clearly visualize the distal coronary vessel, and verify the direction of the guide wire advancement relative to the course of the occluded vessel segment. This case provides proof of concept that wearable computers can improve operator comfort and procedure efficiency in interventional cardiology. Copyright © 2016 Canadian Cardiovascular Society. Published by Elsevier Inc. All rights reserved.

  8. HPC Annual Report 2017

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dennig, Yasmin

    Sandia National Laboratories has a long history of significant contributions to the high performance computing community and industry. Our innovative computer architectures allowed the United States to become the first to break the teraFLOP barrier—propelling us to the international spotlight. Our advanced simulation and modeling capabilities have been integral in high consequence US operations such as Operation Burnt Frost. Strong partnerships with industry leaders, such as Cray, Inc. and Goodyear, have enabled them to leverage our high performance computing (HPC) capabilities to gain a tremendous competitive edge in the marketplace. As part of our continuing commitment to providing modern computing infrastructure and systems in support of Sandia missions, we made a major investment in expanding Building 725 to serve as the new home of HPC systems at Sandia. Work is expected to be completed in 2018 and will result in a modern facility of approximately 15,000 square feet of computer center space. The facility will be ready to house the newest National Nuclear Security Administration/Advanced Simulation and Computing (NNSA/ASC) Prototype platform being acquired by Sandia, with delivery in late 2019 or early 2020. This new system will enable continuing advances by Sandia science and engineering staff in the areas of operating system R&D, operation cost effectiveness (power and innovative cooling technologies), user environment, and application code performance.

  9. Sandia National Laboratories Institutional Plan FY1994--1999

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1993-10-01

    This report presents a five-year plan for the laboratory. This plan takes advantage of the technical strengths of the lab and its staff to address issues of concern to the nation on a scope much broader than Sandia's original mission, while maintaining the general integrity of the laboratory. The plan proposes initiatives in a number of technologies which overlap the needs of its customers and the strengths of its staff. They include: advanced manufacturing technology; electronics; information and computational technology; transportation energy technology and infrastructure; environmental technology; energy research and technology development; biomedical systems engineering; and post-cold-war defense imperatives.

  10. Numerical Methods for Forward and Inverse Problems in Discontinuous Media

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chartier, Timothy P.

    The research emphasis under this grant's funding is in the area of algebraic multigrid methods. The research has two main branches: 1) exploring interdisciplinary applications in which algebraic multigrid can make an impact and 2) extending the scope of algebraic multigrid methods with algorithmic improvements that are based in strong analysis. The work in interdisciplinary applications falls primarily in the field of biomedical imaging. Work under this grant demonstrated the effectiveness and robustness of multigrid for solving linear systems that result from highly heterogeneous finite element method models of the human head. The results in this work also give promise to medical advances possible with software that may be developed. Research to extend the scope of algebraic multigrid has been focused in several areas. In collaboration with researchers at the University of Colorado, Lawrence Livermore National Laboratory, and Los Alamos National Laboratory, the PI developed an adaptive multigrid with subcycling via complementary grids. This method has very cheap computing costs per iterate and is showing promise as a preconditioner for conjugate gradient. Recent work with Los Alamos National Laboratory concentrates on developing algorithms that take advantage of the recent advances in adaptive multigrid research. The results of the various efforts in this research could ultimately have direct use and impact to researchers for a wide variety of applications, including astrophysics, neuroscience, contaminant transport in porous media, bi-domain heart modeling, modeling of tumor growth, and flow in heterogeneous porous media. This work has already led to basic advances in computational mathematics and numerical linear algebra and will continue to do so into the future.
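
    The grant's subject is algebraic multigrid; as a compact illustration of the underlying multigrid idea, the Python below runs a geometric two-grid correction scheme for the 1D Poisson problem -u'' = f. Algebraic multigrid constructs its coarse levels from the matrix entries rather than from a mesh, but the smooth/restrict/coarse-solve/interpolate/smooth cycle shown here is the same. All parameters are illustrative.

        # Geometric two-grid cycle for -u'' = f with zero Dirichlet boundaries.
        import numpy as np

        n = 63                                    # interior fine-grid points
        h = 1.0 / (n + 1)
        f = np.ones(n)                            # right-hand side

        def apply_A(u, h):
            """Matrix-free 1D Poisson operator."""
            Au = 2.0 * u.copy()
            Au[1:] -= u[:-1]
            Au[:-1] -= u[1:]
            return Au / h**2

        def jacobi(u, f, h, sweeps, w=2.0 / 3.0):
            """Weighted-Jacobi smoothing: damps the high-frequency error."""
            for _ in range(sweeps):
                u = u + w * (h**2 / 2.0) * (f - apply_A(u, h))
            return u

        nc = (n - 1) // 2                         # coarse grid: every other point
        Ac = (np.diag(2.0 * np.ones(nc)) - np.diag(np.ones(nc - 1), 1)
              - np.diag(np.ones(nc - 1), -1)) / (2.0 * h)**2

        u = np.zeros(n)
        for cycle in range(10):
            u = jacobi(u, f, h, sweeps=2)                         # pre-smooth
            r = f - apply_A(u, h)
            rc = 0.25 * (r[0:-2:2] + 2.0 * r[1:-1:2] + r[2::2])   # full-weighting restriction
            ec = np.linalg.solve(Ac, rc)                          # exact coarse solve
            e = np.zeros(n)                                       # linear interpolation back
            e[1::2] = ec
            ecb = np.r_[0.0, ec, 0.0]
            e[0::2] = 0.5 * (ecb[:-1] + ecb[1:])
            u = jacobi(u + e, f, h, sweeps=2)                     # correct and post-smooth
            print(cycle, np.linalg.norm(f - apply_A(u, h)))

    Each cycle reduces the residual norm by a roughly constant factor independent of n, which is the property that makes multigrid attractive for the large heterogeneous systems mentioned above.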

  11. A Quantitative Geochemical Target for Modeling the Formation of the Earth and Moon

    NASA Technical Reports Server (NTRS)

    Boyce, Jeremy W.; Barnes, Jessica J.; McCubbin, Francis M.

    2017-01-01

    The past decade has been one of geochemical, isotopic, and computational advances that are bringing the laboratory measurement and computational modeling neighborhoods of the Earth-Moon community into ever closer proximity. We are now, however, in a position to become even better neighbors: modelers can generate testable hypotheses for geochemists, and geochemists can provide quantitative targets for modelers. Here we present a robust example of the latter based on Cl isotope measurements of mare basalts.

  12. Application of software technology to a future spacecraft computer design

    NASA Technical Reports Server (NTRS)

    Labaugh, R. J.

    1980-01-01

    A study was conducted to determine how major improvements in spacecraft computer systems can be obtained from recent advances in hardware and software technology. Investigations into integrated circuit technology indicated that the CMOS/SOS chip set being developed for the Air Force Avionics Laboratory at Wright Patterson had the best potential for improving the performance of spaceborne computer systems. An integral part of the chip set is the bit slice arithmetic and logic unit. The flexibility allowed by microprogramming, combined with the software investigations, led to the specification of a baseline architecture and instruction set.

  13. A historical perspective of the YF-12A thermal loads and structures program

    NASA Technical Reports Server (NTRS)

    Jenkins, Jerald M.; Quinn, Robert D.

    1996-01-01

    Around 1970, the YF-12A loads and structures efforts focused on numerous technological issues that needed defining with regard to aircraft that incorporate hot structures in the design. Laboratory structural heating test technology with infrared systems was largely created during this program. The program demonstrated the ability to duplicate the complex flight temperatures of an advanced supersonic airplane in a ground-based laboratory. The ability to heat and load an advanced operational aircraft in a laboratory at high temperatures and return it to flight status without adverse effects was demonstrated. The technology associated with measuring loads with strain gages on a hot structure was demonstrated with a thermal calibration concept. The results demonstrated that the thermal stresses were significant although the airplane was designed to reduce thermal stresses. Considerable modeling detail was required to predict the heat transfer and the corresponding structural characteristics. The overall YF-12A research effort was particularly productive, and a great deal of flight, laboratory test, and computational data were produced and cross-correlated.

  14. Underwater Threat Source Localization: Processing Sensor Network TDOAs with a Terascale Optical Core Device

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barhen, Jacob; Imam, Neena

    2007-01-01

    Revolutionary computing technologies are defined in terms of technological breakthroughs, which leapfrog over near-term projected advances in conventional hardware and software to produce paradigm shifts in computational science. For underwater threat source localization using information provided by a dynamical sensor network, one of the most promising computational advances builds upon the emergence of digital optical-core devices. In this article, we present initial results of sensor network calculations that focus on the concept of signal wavefront time-difference-of-arrival (TDOA). The corresponding algorithms are implemented on the EnLight processing platform recently introduced by Lenslet Laboratories. This tera-scale digital optical core processor is optimized for array operations, which it performs in a fixed-point-arithmetic architecture. Our results (i) illustrate the ability to reach the required accuracy in the TDOA computation, and (ii) demonstrate that a considerable speed-up can be achieved when using the EnLight 64a prototype processor as compared to a dual Intel Xeon™ processor.
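
    As a rough illustration of the TDOA concept (not of the EnLight fixed-point implementation), the sketch below estimates a time difference from the peak of a cross-correlation and then grid-searches for the 2D source position whose predicted differences best match the measurements; the sound speed, search bounds, and sensor layout are all assumed values:

      import numpy as np

      def estimate_tdoa(sig_a, sig_b, fs):
          # Sample-level TDOA from the peak of the cross-correlation
          corr = np.correlate(sig_a, sig_b, mode="full")
          lag = np.argmax(corr) - (len(sig_b) - 1)
          return lag / fs

      def localize(sensors, tdoas, c=1500.0, half_width=500.0, steps=201):
          # Brute-force search for the source position whose predicted TDOAs
          # (each sensor relative to sensor 0) best match the measured ones.
          best, best_err = None, np.inf
          for x in np.linspace(-half_width, half_width, steps):
              for y in np.linspace(-half_width, half_width, steps):
                  d = np.hypot(sensors[:, 0] - x, sensors[:, 1] - y)
                  predicted = (d[1:] - d[0]) / c
                  err = np.sum((predicted - tdoas) ** 2)
                  if err < best_err:
                      best, best_err = (x, y), err
          return best

    The correlations and distance evaluations are exactly the kind of dense array operations the article maps onto the optical-core processor.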

  15. Mars Science Laboratory CHIMRA: A Device for Processing Powdered Martian Samples

    NASA Technical Reports Server (NTRS)

    Sunshine, Daniel

    2010-01-01

    The CHIMRA is an extraterrestrial sample acquisition and processing device for the Mars Science Laboratory that emphasizes robustness and adaptability through design configuration. This work reviews the guidelines utilized to invent the initial CHIMRA and the strategy employed in advancing the design; these principles will be discussed in relation to both the final CHIMRA design and similar future devices. The computational synthesis necessary to mature a boxed-in impact-generating mechanism will be presented alongside a detailed mechanism description. Results from the development testing required to advance the design for a highly-loaded, long-life and high-speed bearing application will be presented. Lessons learned during the assembly and testing of this subsystem as well as results and lessons from the sample-handling development test program will be reviewed.

  16. Computation of Thermodynamic Equilibria Pertinent to Nuclear Materials in Multi-Physics Codes

    NASA Astrophysics Data System (ADS)

    Piro, Markus Hans Alexander

    Nuclear energy plays a vital role in supporting electrical needs and fulfilling commitments to reduce greenhouse gas emissions. Research is a continuing necessity to improve the predictive capabilities of fuel behaviour in order to reduce costs and to meet increasingly stringent safety requirements by the regulator. Moreover, a renewed interest in nuclear energy has given rise to a "nuclear renaissance" and the necessity to design the next generation of reactors. In support of this goal, significant research efforts have been dedicated to the advancement of numerical modelling and computational tools in simulating various physical and chemical phenomena associated with nuclear fuel behaviour. This undertaking, in effect, collects the experience and observations of a past generation of nuclear engineers and scientists in a meaningful way for future design purposes. There is an increasing desire to integrate thermodynamic computations directly into multi-physics nuclear fuel performance and safety codes. A new equilibrium thermodynamic solver is being developed with this as a primary objective. This solver is intended to provide thermodynamic material properties and boundary conditions for continuum transport calculations. There are several concerns with the use of existing commercial thermodynamic codes: computational performance; limited capabilities in handling large multi-component systems of interest to the nuclear industry; convenient incorporation into other codes with quality assurance considerations; and licensing entanglements associated with code distribution. The software developed in this research is aimed at addressing all of these concerns. The approach taken in this work exploits fundamental principles of equilibrium thermodynamics to simplify the numerical optimization equations. In brief, the chemical potentials of all species and phases in the system are constrained by estimates of the chemical potentials of the system components at each iterative step, and the objective is to minimize the residuals of the mass balance equations. Several numerical advantages are achieved through this simplification. In particular, computational expense is reduced and the rate of convergence is enhanced. Furthermore, the software has demonstrated the ability to solve systems involving as many as 118 component elements. An early version of the code has already been integrated into the Advanced Multi-Physics (AMP) code under development by the Oak Ridge National Laboratory, Los Alamos National Laboratory, Idaho National Laboratory and Argonne National Laboratory. Keywords: Engineering, Nuclear -- 0552, Engineering, Material Science -- 0794, Chemistry, Mathematics -- 0405, Computer Science -- 0984
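
    A toy version of the equilibrium computation described above fits in a few lines: minimize the dimensionless Gibbs energy of an ideal-gas mixture subject to elemental mass balances. The sketch below uses a generic constrained optimizer rather than the element-potential iteration developed in the thesis, and the standard-state chemical potentials are illustrative numbers only:

      import numpy as np
      from scipy.optimize import minimize

      # Toy system: an H2, O2, H2O ideal-gas mixture at fixed T and P.
      mu0 = np.array([0.0, 0.0, -30.0])   # mu_i^0 / RT (illustrative values)
      E = np.array([[2, 0, 2],            # moles of H per mole of each species
                    [0, 2, 1]])           # moles of O per mole of each species
      b = np.array([2.0, 1.0])            # total elemental abundances (H, O)

      def gibbs(n):
          n = np.maximum(n, 1e-12)        # keep the logarithms finite
          # G/RT = sum_i n_i * (mu_i^0/RT + ln(n_i / n_total))
          return np.dot(n, mu0 + np.log(n / n.sum()))

      result = minimize(gibbs, x0=np.array([0.5, 0.3, 0.5]),
                        bounds=[(1e-12, None)] * 3,
                        constraints={"type": "eq", "fun": lambda n: E @ n - b})
      print(result.x)  # equilibrium mole numbers; mostly H2O for these values

    The mass-balance residual E n - b that the constraint drives to zero is the same quantity whose residuals the solver described above minimizes, with the species chemical potentials constrained by the element potentials at each iteration.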

  17. Why advanced computing? The key to space-based operations

    NASA Astrophysics Data System (ADS)

    Phister, Paul W., Jr.; Plonisch, Igor; Mineo, Jack

    2000-11-01

    The 'what is the requirement?' aspect of advanced computing and how it relates to and supports Air Force space-based operations is a key issue. In support of the Air Force Space Command's five major mission areas (space control, force enhancement, force applications, space support and mission support), two-fifths of the requirements have associated stringent computing/size implications. The Air Force Research Laboratory's 'migration to space' concept will eventually shift Science and Technology (S&T) dollars from predominantly airborne systems to airborne-and-space related S&T areas. One challenging 'space' area is in the development of sophisticated on-board computing processes for the next generation smaller, cheaper satellite systems. These new space systems (called microsats or nanosats) could be as small as a softball, yet perform functions that are currently being done by large, vulnerable ground-based assets. The Joint Battlespace Infosphere (JBI) concept will be used to manage the overall process of space applications coupled with advancements in computing. The JBI can be defined as a globally interoperable information 'space' which aggregates, integrates, fuses, and intelligently disseminates all relevant battlespace knowledge to support effective decision-making at all echelons of a Joint Task Force (JTF). This paper explores a single theme -- on-board processing is the best avenue to take advantage of advancements in high-performance computing, high-density memories, communications, and re-programmable architecture technologies. The goal is to break away from 'no changes after launch' design to a more flexible design environment that can take advantage of changing space requirements and needs while the space vehicle is 'on orbit.'

  18. Intricacies of modern supercomputing illustrated with recent advances in simulations of strongly correlated electron systems

    NASA Astrophysics Data System (ADS)

    Schulthess, Thomas C.

    2013-03-01

    The continued thousand-fold improvement in sustained application performance per decade on modern supercomputers keeps opening new opportunities for scientific simulations. But supercomputers have become very complex machines, built with thousands or tens of thousands of complex nodes consisting of multiple CPU cores or, most recently, a combination of CPU and GPU processors. Efficient simulations on such high-end computing systems require tailored algorithms that optimally map numerical methods to particular architectures. These intricacies will be illustrated with simulations of strongly correlated electron systems, where the development of quantum cluster methods, Monte Carlo techniques, as well as their optimal implementation by means of algorithms with improved data locality and high arithmetic density have gone hand in hand with evolving computer architectures. The present work would not have been possible without continued access to computing resources at the National Center for Computational Science of Oak Ridge National Laboratory, which is funded by the Facilities Division of the Office of Advanced Scientific Computing Research, and the Swiss National Supercomputing Center (CSCS) that is funded by ETH Zurich.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Raymond, David W.; Blankenship, Douglas A.; Buerger, Stephen

    The dynamic stability of deep drillstrings is challenged by an inability to impart controllability with ever-changing conditions introduced by geology, depth, structural dynamic properties and operating conditions. A multi-organizational LDRD project team at Sandia National Laboratories successfully demonstrated advanced technologies for mitigating drillstring vibrations to improve the reliability of drilling systems used for construction of deep, high-value wells. Using computational modeling and dynamic substructuring techniques, the benefit of controllable actuators at discrete locations in the drillstring is determined. Prototype downhole tools were developed and evaluated in laboratory test fixtures simulating the structural dynamic response of a deep drillstring. A laboratory-based drilling applicability demonstration was conducted to demonstrate the benefit available from deployment of an autonomous, downhole tool with self-actuation capabilities in response to the dynamic response of the host drillstring. A concept is presented for a prototype drilling tool based upon the technical advances. The technology described herein is the subject of U.S. Patent Application No. 62219481, entitled "DRILLING SYSTEM VIBRATION SUPPRESSION SYSTEMS AND METHODS", filed September 16, 2015.

  20. The pKa Cooperative: A Collaborative Effort to Advance Structure-Based Calculations of pKa values and Electrostatic Effects in Proteins

    PubMed Central

    Nielsen, Jens E.; Gunner, M. R.; Bertrand García-Moreno, E.

    2012-01-01

    The pKa Cooperative http://www.pkacoop.org was organized to advance development of accurate and useful computational methods for structure-based calculation of pKa values and electrostatic energy in proteins. The Cooperative brings together laboratories with expertise and interest in theoretical, computational and experimental studies of protein electrostatics. To improve structure-based energy calculations it is necessary to better understand the physical character and molecular determinants of electrostatic effects. The Cooperative thus intends to foster experimental research into fundamental aspects of proteins that depend on electrostatic interactions. It will maintain a repository for experimental data useful for critical assessment of methods for structure-based electrostatics calculations. To help guide the development of computational methods, the Cooperative will organize blind prediction exercises. As a first step, computational laboratories were invited to reproduce an unpublished set of experimental pKa values of acidic and basic residues introduced in the interior of staphylococcal nuclease by site-directed mutagenesis. The pKa values of these groups are unique and challenging to simulate owing to the large magnitude of their shifts relative to normal pKa values in water. Many computational methods were tested in this 1st Blind Prediction Challenge and critical assessment exercise. A workshop was organized at the Telluride Science Research Center to assess objectively the performance of many computational methods tested on this one extensive dataset. This volume of PROTEINS: Structure, Function, and Bioinformatics introduces the pKa Cooperative, presents reports submitted by participants in the blind prediction challenge, and highlights some of the problems in structure-based calculations identified during this exercise. PMID:22002877
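
    For orientation on the energetic scale involved, a pKa shift maps onto a protonation free-energy change through the standard relation (the sign convention depends on whether the group is an acid or a base):

      \Delta\Delta G = RT \ln(10) \, \Delta\mathrm{p}K_a \approx 1.36\ \mathrm{kcal\,mol^{-1}} \times \Delta\mathrm{p}K_a \qquad (T = 298\ \mathrm{K})

    so the large shifts measured for the buried nuclease residues correspond to several kcal/mol of electrostatic free energy, which is what makes them such stringent tests of the calculations.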

  1. Simulation of the visual effects of power plant plumes

    Treesearch

    Evelyn F. Treiman; David B. Champion; Mona J. Wecksung; Glenn H. Moore; Andrew Ford; Michael D. Williams

    1979-01-01

    The Los Alamos Scientific Laboratory has developed a computer-assisted technique that can predict the visibility effects of potential energy sources in advance of their construction. This technique has been employed in an economic and environmental analysis comparing a single 3000 MW coal-fired power plant with six 500 MW coal-fired power plants located at hypothetical...

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiong, Qingang; Robichaud, David J.

    As research activities continue, our understanding of biomass pyrolysis has advanced significantly, and we sought to arrange this Virtual Special Issue (VSI) in ACS Sustainable Chemistry & Engineering to report recent progress on computational and experimental studies of biomass pyrolysis. Beyond highlighting the five national laboratories' advancements, prestigious researchers in the field of biomass pyrolysis have been invited to report their most recent activities.

  3. Computational Toxicology as Implemented by the US EPA ...

    EPA Pesticide Factsheets

    Computational toxicology is the application of mathematical and computer models to help assess chemical hazards and risks to human health and the environment. Supported by advances in informatics, high-throughput screening (HTS) technologies, and systems biology, the U.S. Environmental Protection Agency (EPA) is developing robust and flexible computational tools that can be applied to the thousands of chemicals in commerce and the contaminant mixtures found in air, water, and hazardous-waste sites. The Office of Research and Development (ORD) Computational Toxicology Research Program (CTRP) is composed of three main elements. The largest component is the National Center for Computational Toxicology (NCCT), which was established in 2005 to coordinate research on chemical screening and prioritization, informatics, and systems modeling. The second element consists of related activities in the National Health and Environmental Effects Research Laboratory (NHEERL) and the National Exposure Research Laboratory (NERL). The third and final component consists of academic centers working on various aspects of computational toxicology and funded by the U.S. EPA Science to Achieve Results (STAR) program. Together these elements form the key components in the implementation of both the initial strategy, A Framework for a Computational Toxicology Research Program (U.S. EPA, 2003), and the newly released The U.S. Environmental Protection Agency's Strategic Plan for Evaluating the Toxicity of Chemicals.

  4. Red Storm usage model :Version 1.12.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jefferson, Karen L.; Sturtevant, Judith E.

    Red Storm is an Advanced Simulation and Computing (ASC) funded massively parallel supercomputer located at Sandia National Laboratories (SNL). The Red Storm Usage Model (RSUM) documents the capabilities and the environment provided for the FY05 Tri-Lab Level II Limited Availability Red Storm User Environment Milestone and the FY05 SNL Level II Limited Availability Red Storm Platform Milestone. This document describes specific capabilities, tools, and procedures to support both local and remote users. The model is focused on the needs of the ASC user working in the secure computing environments at Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and SNL. Additionally, the Red Storm Usage Model maps the provided capabilities to the Tri-Lab ASC Computing Environment (ACE) requirements. The ACE requirements reflect the high performance computing requirements for the ASC community and have been updated in FY05 to reflect the community's needs. For each section of the RSUM, Appendix I maps the ACE requirements to the Limited Availability User Environment capabilities and includes a description of ACE requirements met and those requirements that are not met in that particular section. The Red Storm Usage Model, along with the ACE mappings, has been issued and vetted throughout the Tri-Lab community.

  5. Advanced Plant Habitat

    NASA Image and Video Library

    2016-11-17

    A test unit, or prototype, of NASA's Advanced Plant Habitat (APH) was delivered to the Space Station Processing Facility at the agency's Kennedy Space Center in Florida. Inside a laboratory, Engineering Services Contract engineers set up test parameters on computers. From left, are Glenn Washington, ESC quality engineer; Claton Grosse, ESC mechanical engineer; and Jeff Richards, ESC project scientist. The APH is the largest plant chamber built for the agency. It will have 180 sensors and four times the light output of Veggie. The APH will be delivered to the International Space Station in March 2017.

  6. Advanced application flight experiments precision attitude determination system. Volume 2: System tests

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The performance capability of each of two precision attitude determination systems (PADS), one using a strapdown star tracker, and the other using a single-axis gimbal star tracker was measured in the laboratory under simulated orbit conditions. The primary focus of the evaluation was on the contribution to the total system accuracy by the star trackers, and the effectiveness of the software algorithms in functioning with actual sensor signals. A brief description of PADS, the laboratory test configuration and the test facility, is given along with a discussion of the data handling and display, laboratory computer programs, PADS performance evaluation programs, and the strapdown and gimbal system tests. Results are presented and discussed.

  7. The Laboratory for Terrestrial Physics

    NASA Technical Reports Server (NTRS)

    2003-01-01

    The Laboratory for Terrestrial Physics is dedicated to the advancement of knowledge in Earth and planetary science by conducting innovative research using space technology. The Laboratory's mission and activities support the work and new initiatives at NASA's Goddard Space Flight Center (GSFC). The Laboratory's success contributes to the Earth Science Directorate as a national resource for studies of Earth from space. The Laboratory is part of the Earth Science Directorate based at the GSFC in Greenbelt, MD. The Directorate itself comprises the Global Change Data Center (GCDC), the Space Data and Computing Division (SDCD), and four science laboratories, including the Laboratory for Terrestrial Physics, the Laboratory for Atmospheres, and the Laboratory for Hydrospheric Processes, all in Greenbelt, MD. The fourth research organization, the Goddard Institute for Space Studies (GISS), is in New York, NY. Relevant to NASA's Strategic Plan, the Laboratory ensures that all work undertaken and completed is within the vision of GSFC. The philosophy of the Laboratory is to balance the completion of near-term goals while building on the Laboratory's achievements as a foundation for the scientific challenges of the future.

  8. Clinically expedient reporting of rapid diagnostic test information.

    PubMed

    Doern, G V

    1986-03-01

    With the development of rapid diagnostic tests in the clinical microbiology laboratory has come an awareness of the importance of rapid results reporting. Clearly, the potential clinical impact of rapid diagnostic tests is dependent on expeditious reporting. Traditional manual reporting systems are encumbered by the necessity of transcribing test information onto hard-copy reports and then distributing such reports into the hands of the user. Laboratory computers, when linked directly to CRTs located in nursing stations, ambulatory clinics, or physicians' offices, both inside and outside the hospital, permit essentially instantaneous transfer of test results from the laboratory to the clinician. Computer-assisted results reporting, while representing a significant advance over manual reporting systems, is not, however, without problems. Concerns include validation of test information, authorization of users with access to test information, mechanical integrity, and cost. These issues notwithstanding, computerized results reporting will undoubtedly play a central role in optimizing the clinical impact of rapid diagnostic tests.

  9. Oak Ridge National Laboratory Support of Non-light Water Reactor Technologies: Capabilities Assessment for NRC Near-term Implementation Action Plans for Non-light Water Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Belles, Randy; Jain, Prashant K.; Powers, Jeffrey J.

    The Oak Ridge National Laboratory (ORNL) has a rich history of support for light water reactor (LWR) and non-LWR technologies. The ORNL history involves operation of 13 reactors at ORNL including the graphite reactor dating back to World War II, two aqueous homogeneous reactors, two molten salt reactors (MSRs), a fast-burst health physics reactor, and seven LWRs. Operation of the High Flux Isotope Reactor (HFIR) has been ongoing since 1965. Expertise exists amongst the ORNL staff to provide non-LWR training; support evaluation of non-LWR licensing and safety issues; perform modeling and simulation using advanced computational tools; run laboratory experiments using equipment such as the liquid salt component test facility; and perform in-depth fuel performance and thermal-hydraulic technology reviews using a vast suite of computer codes and tools. Summaries of this expertise are included in this paper.

  10. LLIMAS: Revolutionizing integrating modeling and analysis at MIT Lincoln Laboratory

    NASA Astrophysics Data System (ADS)

    Doyle, Keith B.; Stoeckel, Gerhard P.; Rey, Justin J.; Bury, Mark E.

    2017-08-01

    MIT Lincoln Laboratory's Integrated Modeling and Analysis Software (LLIMAS) enables the development of novel engineering solutions for advanced prototype systems through unique insights into engineering performance and interdisciplinary behavior to meet challenging size, weight, power, environmental, and performance requirements. LLIMAS is a multidisciplinary design optimization tool that wraps numerical optimization algorithms around an integrated framework of structural, thermal, optical, stray light, and computational fluid dynamics analysis capabilities. LLIMAS software is highly extensible and has developed organically across a variety of technologies including laser communications, directed energy, photometric detectors, chemical sensing, laser radar, and imaging systems. The custom software architecture leverages the capabilities of existing industry standard commercial software and supports the incorporation of internally developed tools. Recent advances in LLIMAS's Structural-Thermal-Optical Performance (STOP), aeromechanical, and aero-optical capabilities as applied to Lincoln prototypes are presented.

  11. Trinity to Trinity 1945-2015

    ScienceCinema

    Moniz, Ernest; Carr, Alan; Bethe, Hans; Morrison, Phillip; Ramsay, Norman; Teller, Edward; Brixner, Berlyn; Archer, Bill; Agnew, Harold; Morrison, John

    2018-01-16

    The Trinity Test of July 16, 1945 was the first full-scale, real-world test of a nuclear weapon; with the new Trinity supercomputer, Los Alamos National Laboratory's goal is to do this virtually, in 3D. Trinity was the culmination of a fantastic effort of groundbreaking science and engineering by hundreds of men and women at Los Alamos and other Manhattan Project sites. It took them less than two years to change the world. The Laboratory is marking the 70th anniversary of the Trinity Test because it not only ushered in the Nuclear Age but also, with it, the origin of today's advanced supercomputing. We live in the Age of Supercomputers due in large part to nuclear weapons science here at Los Alamos. National security science, and nuclear weapons science in particular, at Los Alamos National Laboratory have provided a key motivation for the evolution of large-scale scientific computing. Beginning with the Manhattan Project there has been a constant stream of increasingly significant, complex problems in nuclear weapons science whose timely solutions demand larger and faster computers. The relationship between national security science at Los Alamos and the evolution of computing is one of interdependence.

  12. Computer aided design and manufacturing: analysis and development of research issues

    NASA Astrophysics Data System (ADS)

    Taylor, K.; Jadeja, J. C.

    2005-11-01

    The paper focuses on current issues in the areas of computer-aided manufacturing and design. The importance of integrating CAD and CAM is analyzed, and the issues associated with the integration, along with recent advancements in the field, are documented. The development of methods for enhancing productivity is explored. A research experiment was conducted in the laboratories of West Virginia University with the objective of portraying the effects of various machining parameters on production. Graphical results and their interpretations are provided to illustrate the main findings of the experimentation.

  13. Human Factors Model

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Jack is an advanced human factors software package that provides a three-dimensional model for predicting how a human will interact with a given system or environment. It can be used for a broad range of computer-aided design applications. Jack was developed by the Computer Graphics Research Laboratory of the University of Pennsylvania with assistance from NASA's Johnson Space Center, Ames Research Center and the Army. It is the University's first commercial product. Jack is still used for academic purposes at the University of Pennsylvania. Commercial rights were given to Transom Technologies, Inc.

  14. Computational Toxicology at the US EPA | Science Inventory ...

    EPA Pesticide Factsheets

    Computational toxicology is the application of mathematical and computer models to help assess chemical hazards and risks to human health and the environment. Supported by advances in informatics, high-throughput screening (HTS) technologies, and systems biology, EPA is developing robust and flexible computational tools that can be applied to the thousands of chemicals in commerce and the contaminant mixtures found in America’s air, water, and hazardous-waste sites. The ORD Computational Toxicology Research Program (CTRP) is composed of three main elements. The largest component is the National Center for Computational Toxicology (NCCT), which was established in 2005 to coordinate research on chemical screening and prioritization, informatics, and systems modeling. The second element consists of related activities in the National Health and Environmental Effects Research Laboratory (NHEERL) and the National Exposure Research Laboratory (NERL). The third and final component consists of academic centers working on various aspects of computational toxicology and funded by the EPA Science to Achieve Results (STAR) program. Key intramural projects of the CTRP include digitizing legacy toxicity testing information into the toxicity reference database (ToxRefDB), predicting toxicity (ToxCast™) and exposure (ExpoCast™), and creating virtual liver (v-Liver™) and virtual embryo (v-Embryo™) systems models. The models and underlying data are being made publicly available.

  15. Scientific Computing Strategic Plan for the Idaho National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whiting, Eric Todd

    Scientific computing is a critical foundation of modern science. Without innovations in the field of computational science, the essential missions of the Department of Energy (DOE) would go unrealized. Taking a leadership role in such innovations is Idaho National Laboratory’s (INL’s) challenge and charge, and is central to INL’s ongoing success. Computing is an essential part of INL’s future. DOE science and technology missions rely firmly on computing capabilities in various forms. Modeling and simulation, fueled by innovations in computational science and validated through experiment, are a critical foundation of science and engineering. Big data analytics from an increasing number of widely varied sources is opening new windows of insight and discovery. Computing is a critical tool in education, science, engineering, and experiments. Advanced computing capabilities in the form of people, tools, computers, and facilities will position INL competitively to deliver results and solutions on important national science and engineering challenges. A computing strategy must include much more than simply computers. The foundational enabling component of computing at many DOE national laboratories is the combination of a showcase-like data center facility coupled with a very capable supercomputer. In addition, network connectivity, disk storage systems, and visualization hardware are critical and generally tightly coupled to the computer system and co-located in the same facility. The existence of these resources in a single data center facility opens the doors to many opportunities that would not otherwise be possible.

  16. Laboratory Directed Research and Development Annual Report for 2011

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hughes, Pamela J.

    2012-04-09

    This report documents progress made on all LDRD-funded projects during fiscal year 2011. The following topics are discussed: (1) Advanced sensors and instrumentation; (2) Biological Sciences; (3) Chemistry; (4) Earth and space sciences; (5) Energy supply and use; (6) Engineering and manufacturing processes; (7) Materials science and technology; (8) Mathematics and computing sciences; (9) Nuclear science and engineering; and (10) Physics.

  17. Laboratory Directed Research and Development FY2010 Annual Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jackson, K J

    2011-03-22

    A premier applied-science laboratory, Lawrence Livermore National Laboratory (LLNL) has at its core a primary national security mission - to ensure the safety, security, and reliability of the nation's nuclear weapons stockpile without nuclear testing, and to prevent and counter the spread and use of weapons of mass destruction: nuclear, chemical, and biological. The Laboratory uses the scientific and engineering expertise and facilities developed for its primary mission to pursue advanced technologies to meet other important national security needs - homeland defense, military operations, and missile defense, for example - that evolve in response to emerging threats. For broader national needs, LLNL executes programs in energy security, climate change and long-term energy needs, environmental assessment and management, bioscience and technology to improve human health, and breakthroughs in fundamental science and technology. With this multidisciplinary expertise, the Laboratory serves as a science and technology resource to the U.S. government and as a partner with industry and academia. This annual report discusses the following topics: (1) Advanced Sensors and Instrumentation; (2) Biological Sciences; (3) Chemistry; (4) Earth and Space Sciences; (5) Energy Supply and Use; (6) Engineering and Manufacturing Processes; (7) Materials Science and Technology; (8) Mathematics and Computing Science; (9) Nuclear Science and Engineering; and (10) Physics.

  18. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan. Part 1 : ASC software quality engineering practices version 1.0.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Minana, Molly A.; Sturtevant, Judith E.; Heaphy, Robert

    2005-01-01

    The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in DOE/AL Quality Criteria (QC-1) as conformance to customer requirements and expectations. This quality plan defines the ASC program software quality practices and provides mappings of these practices to the SNL Corporate Process Requirements (CPR 1.3.2 and CPR 1.3.6) and the Department of Energy (DOE) document, ASCI Software Quality Engineering: Goals, Principles, and Guidelines (GP&G). This quality plan identifies ASC management and software project teams' responsibilities for cost-effective software engineering quality practices. The SNL ASC Software Quality Plan establishes the signatories' commitment to improving software products by applying cost-effective software engineering quality practices. This document explains the project teams' opportunities for tailoring and implementing the practices; enumerates the practices that compose the development of SNL ASC's software products; and includes a sample assessment checklist that was developed based upon the practices in this document.

  19. Empirical analysis of RNA robustness and evolution using high-throughput sequencing of ribozyme reactions.

    PubMed

    Hayden, Eric J

    2016-08-15

    RNA molecules provide a realistic but tractable model of a genotype to phenotype relationship. This relationship has been extensively investigated computationally using secondary structure prediction algorithms. Enzymatic RNA molecules, or ribozymes, offer access to genotypic and phenotypic information in the laboratory. Advancements in high-throughput sequencing technologies have enabled the analysis of sequences in the lab that now rivals what can be accomplished computationally. This has motivated a resurgence of in vitro selection experiments and opened new doors for the analysis of the distribution of RNA functions in genotype space. A body of computational experiments has investigated the persistence of specific RNA structures despite changes in the primary sequence, and how this mutational robustness can promote adaptations. This article summarizes recent approaches that were designed to investigate the role of mutational robustness during the evolution of RNA molecules in the laboratory, and presents theoretical motivations, experimental methods and approaches to data analysis. Copyright © 2016 Elsevier Inc. All rights reserved.
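
    One common computational definition of the mutational robustness discussed here is the fraction of single-point mutants that preserve the wild-type secondary structure. A minimal sketch, with the structure predictor left as a pluggable function (the name fold is a placeholder for, e.g., a wrapper around an external folding tool):

      BASES = "ACGU"

      def mutational_robustness(seq, fold):
          # Fraction of one-mutant neighbors whose predicted secondary
          # structure equals the wild type's.
          wild_type = fold(seq)
          same = total = 0
          for i, base in enumerate(seq):
              for b in BASES:
                  if b == base:
                      continue
                  mutant = seq[:i] + b + seq[i + 1:]
                  same += fold(mutant) == wild_type
                  total += 1
          return same / total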

  20. A network-based distributed, media-rich computing and information environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Phillips, R.L.

    1995-12-31

    Sunrise is a Los Alamos National Laboratory (LANL) project started in October 1993. It is intended to be a prototype National Information Infrastructure development project. A main focus of Sunrise is to tie together enabling technologies (networking, object-oriented distributed computing, graphical interfaces, security, multi-media technologies, and data-mining technologies) with several specific applications. A diverse set of application areas was chosen to ensure that the solutions developed in the project are as generic as possible. Some of the application areas are materials modeling, medical records and image analysis, transportation simulations, and K-12 education. This paper provides a description of Sunrise and a view of the architecture and objectives of this evolving project. The primary objectives of Sunrise are three-fold: (1) To develop common information-enabling tools for advanced scientific research and its applications to industry; (2) To enhance the capabilities of important research programs at the Laboratory; (3) To define a new way of collaboration between computer science and industrially-relevant research.

  1. FY10 Engineering Innovations, Research and Technology Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lane, M A; Aceves, S M; Paulson, C N

    This report summarizes key research, development, and technology advancements in Lawrence Livermore National Laboratory's Engineering Directorate for FY2010. These efforts exemplify Engineering's nearly 60-year history of developing and applying the technology innovations needed for the Laboratory's national security missions, and embody Engineering's mission to "Enable program success today and ensure the Laboratory's vitality tomorrow." Leading off the report is a section featuring compelling engineering innovations. These innovations range from advanced hydrogen storage that enables clean vehicles, to new nuclear material detection technologies, to a landmine detection system using ultra-wideband ground-penetrating radar. Many have been recognized with R&D Magazine's prestigious R&D 100 Award; all are examples of the forward-looking application of innovative engineering to pressing national problems and challenging customer requirements. Engineering's capability development strategy includes both fundamental research and technology development. Engineering research creates the competencies of the future where discovery-class groundwork is required. Our technology development (or reduction to practice) efforts enable many of the research breakthroughs across the Laboratory to translate from the world of basic research to the national security missions of the Laboratory. This portfolio approach produces new and advanced technological capabilities, and is a unique component of the value proposition of the Lawrence Livermore Laboratory. The balance of the report highlights this work in research and technology, organized into thematic technical areas: Computational Engineering; Micro/Nano-Devices and Structures; Measurement Technologies; Engineering Systems for Knowledge Discovery; and Energy Manipulation. Our investments in these areas serve not only known programmatic requirements of today and tomorrow, but also anticipate the breakthrough engineering innovations that will be needed in the future.

  2. Onward to Petaflops Computing

    NASA Technical Reports Server (NTRS)

    Bailey, David H.; Chancellor, Marisa K. (Technical Monitor)

    1997-01-01

    With programs such as the US High Performance Computing and Communications Program (HPCCP), the attention of scientists and engineers worldwide has been focused on the potential of very high performance scientific computing, namely systems that are hundreds or thousands of times more powerful than those typically available in desktop systems at any given point in time. Extending the frontiers of computing in this manner has resulted in remarkable advances, both in computing technology itself and also in the various scientific and engineering disciplines that utilize these systems. Within a month or two, a sustained rate of 1 Tflop/s (also written 1 teraflops, or 10^12 floating-point operations per second) is likely to be achieved by the 'ASCI Red' system at Sandia National Laboratories in New Mexico. With this objective in sight, it is reasonable to ask what lies ahead for high-end computing.

  3. Making Advanced Scientific Algorithms and Big Scientific Data Management More Accessible

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Venkatakrishnan, S. V.; Mohan, K. Aditya; Beattie, Keith

    2016-02-14

    Synchrotrons such as the Advanced Light Source (ALS) at Lawrence Berkeley National Laboratory are known as user facilities. They are sources of extremely bright X-ray beams, and scientists come from all over the world to perform experiments that require these beams. As the complexity of experiments has increased, and the size and rate of data sets have exploded, managing, analyzing and presenting the data collected at synchrotrons has been an increasing challenge. The ALS has partnered with high performance computing, fast networking, and applied mathematics groups to create a "super-facility", giving users simultaneous access to the experimental, computational, and algorithmic resources to overcome this challenge. This combination forms an efficient closed loop, where data, despite its high rate and volume, is transferred and processed, in many cases immediately and automatically, on appropriate compute resources, and results are extracted, visualized, and presented to users or to the experimental control system, both to provide immediate insight and to guide decisions about subsequent experiments during beam-time. In this paper, we will present work done on advanced tomographic reconstruction algorithms to support users of the 3D micron-scale imaging instrument (Beamline 8.3.2, hard X-ray micro-tomography).
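
    For reference, the classical baseline that advanced iterative reconstruction methods are measured against, filtered back-projection for parallel-beam tomography, takes only a few lines with scikit-image (a sketch on a synthetic phantom, not the Beamline 8.3.2 pipeline itself):

      import numpy as np
      from skimage.data import shepp_logan_phantom
      from skimage.transform import radon, iradon

      # Simulate a parallel-beam sinogram from a phantom, then reconstruct
      # it with filtered back-projection.
      image = shepp_logan_phantom()
      theta = np.linspace(0.0, 180.0, max(image.shape), endpoint=False)
      sinogram = radon(image, theta=theta)
      reconstruction = iradon(sinogram, theta=theta, filter_name="ramp")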

  4. Material Protection, Accounting, and Control Technologies (MPACT) Advanced Integration Roadmap

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, Mike; Cipiti, Ben; Demuth, Scott Francis

    2017-01-30

    The development of sustainable advanced nuclear fuel cycles is a long-term goal of the Office of Nuclear Energy’s (DOE-NE) Fuel Cycle Technologies program. The Material Protection, Accounting, and Control Technologies (MPACT) campaign is supporting research and development (R&D) of advanced instrumentation, analysis tools, and integration methodologies to meet this goal (Miller, 2015). This advanced R&D is intended to facilitate safeguards and security by design of fuel cycle facilities. The lab-scale demonstration of a virtual facility, a distributed test bed that connects the individual tools being developed at National Laboratories and university research establishments, is a key program milestone for 2020. These tools will consist of instrumentation and devices as well as computer software for modeling, simulation and integration.

  6. Corridor One: An Integrated Distance Visualization Environment for SSI+ASCI Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Christopher R. Johnson, Charles D. Hansen

    2001-10-29

    The goal of Corridor One: An Integrated Distance Visualization Environment for ASCI and SSI Applications was to combine the forces of six leading-edge laboratories working in the areas of visualization, distributed computing, and high performance networking (Argonne National Laboratory, Lawrence Berkeley National Laboratory, Los Alamos National Laboratory, University of Illinois, University of Utah and Princeton University) to develop and deploy the most advanced integrated distance visualization environment for large-scale scientific visualization and demonstrate it on applications relevant to the DOE SSI and ASCI programs. The Corridor One team brought world-class expertise in parallel rendering, deep image-based rendering, immersive environment technology, large-format multi-projector wall-based displays, volume and surface visualization algorithms, collaboration tools and streaming media technology, network protocols for image transmission, high-performance networking, quality-of-service technology and distributed computing middleware. Our strategy was to build on the very successful teams that produced the I-WAY, "Computational Grids" and CAVE technology and to add these to the teams that have developed the fastest parallel visualization systems and the most widely used networking infrastructure for multicast and distributed media. Unfortunately, just as we were getting going on the Corridor One project, DOE cut the program after the first year. As such, our final report consists of our progress during year one of the grant.

  7. Three-dimensional printing physiology laboratory technology.

    PubMed

    Sulkin, Matthew S; Widder, Emily; Shao, Connie; Holzem, Katherine M; Gloschat, Christopher; Gutbrod, Sarah R; Efimov, Igor R

    2013-12-01

    Since its inception in 19th-century Germany, the physiology laboratory has been a complex and expensive research enterprise involving experts in various fields of science and engineering. Physiology research has been critically dependent on cutting-edge technological support of mechanical, electrical, optical, and more recently computer engineers. Evolution of modern experimental equipment is constrained by lack of direct communication between the physiological community and industry producing this equipment. Fortunately, recent advances in open source technologies, including three-dimensional printing, open source hardware and software, present an exciting opportunity to bring the design and development of research instrumentation to the end user, i.e., life scientists. Here we provide an overview on how to develop customized, cost-effective experimental equipment for physiology laboratories.

  8. ASC Tri-lab Co-design Level 2 Milestone Report 2015

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hornung, Rich; Jones, Holger; Keasler, Jeff

    2015-09-23

    In 2015, the three Department of Energy (DOE) national laboratories that make up the Advanced Simulation and Computing (ASC) Program (Sandia, Lawrence Livermore, and Los Alamos) collaboratively explored performance portability programming environments in the context of several ASC co-design proxy applications as part of a tri-lab L2 milestone executed by the co-design teams at each laboratory. The programming environments that were studied included Kokkos (developed at Sandia), RAJA (LLNL), and Legion (Stanford University). The proxy apps studied included miniAero, LULESH, CoMD, Kripke, and SNAP. These programming models and proxy apps are described herein. Each lab focused on a particular combination of abstractions and proxy apps, with the goal of assessing performance portability using those. Performance portability was determined by: a) the ability to run a single application source code on multiple advanced architectures, and b) comparing runtime performance between ...

  9. Testimony to the House Science Space and Technology Committee.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Church, Michael Kenton; Tannenbaum, Benn

    Chairman Smith, Ranking Member Johnson, and distinguished members of the Committee on Science, Space, and Technology, I thank you for the opportunity to testify today on the role of science, engineering, and research at Sandia National Laboratories, one of the nation’s premier national labs and the nation’s largest Federally Funded Research and Development Center (FFRDC) laboratory. I am Dr. Susan Seestrom, Sandia’s Associate Laboratories Director for Advanced Science & Technology (AST) and Chief Research Officer (CRO). As CRO I am responsible for research strategy, Laboratory Directed Research & Development (LDRD), partnerships strategy, and technology transfer. As director and line manager for AST I manage capabilities and mission delivery across a variety of the physical and mathematical sciences and engineering disciplines, such as pulsed power, radiation effects, major environmental testing, high performance computing, and modeling and simulation.

  10. Advanced Free Flight Planner and Dispatcher's Workstation: Preliminary Design Specification

    NASA Technical Reports Server (NTRS)

    Wilson, J.; Wright, C.; Couluris, G. J.

    1997-01-01

    The National Aeronautics and Space Administration (NASA) has implemented the Advanced Air Transportation Technology (AATT) program to investigate future improvements to the national and international air traffic management systems. This research, as part of the AATT program, developed preliminary design requirements for an advanced Airline Operations Control (AOC) dispatcher's workstation, with emphasis on flight planning. This design will support the implementation of an experimental workstation in NASA laboratories that would emulate AOC dispatch operations. The work developed an airline flight plan database and specified requirements for: a computer tool for generation and evaluation of free flight, user preferred trajectories (UPT); the kernel of an advanced flight planning system to be incorporated into the UPT-generation tool; and an AOC workstation to house the UPT-generation tool and to provide a real-time testing environment. A prototype for the advanced flight plan optimization kernel was developed and demonstrated. The flight planner uses dynamic programming to search a four-dimensional wind and temperature grid to identify the optimal route, altitude and speed for successive segments of a flight. An iterative process is employed in which a series of trajectories are successively refined until the UPT is identified. The flight planner is designed to function in the current operational environment as well as in free flight. The free flight environment would enable greater flexibility in UPT selection based on alleviation of current procedural constraints. The prototype also takes advantage of advanced computer processing capabilities to implement more powerful optimization routines than would be possible with older computer systems.
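
    The dynamic-programming search is easy to sketch in simplified form. The toy below optimizes only the altitude profile over a fixed route, with hypothetical wind data and an assumed climb/descent penalty; the actual planner searches a four-dimensional wind and temperature grid over route, altitude, and speed:

      import numpy as np

      rng = np.random.default_rng(0)
      nseg, nalt = 10, 8
      headwind = rng.uniform(-30, 30, size=(nseg, nalt))  # m/s per segment/altitude
      tas = 230.0        # true airspeed, m/s (held constant in this toy)
      seg_len = 100e3    # segment length, m
      climb_cost = 50.0  # penalty per altitude-level change

      # cost[j]: best cost of flying the segments so far, ending at altitude j
      cost = np.zeros(nalt)
      choice = np.zeros((nseg, nalt), dtype=int)
      for k in range(nseg):
          seg_time = seg_len / (tas - headwind[k])   # slower into a headwind
          new_cost = np.full(nalt, np.inf)
          for j in range(nalt):                      # altitude on this segment
              for i in range(nalt):                  # altitude on previous segment
                  c = cost[i] + seg_time[j] + climb_cost * abs(j - i)
                  if c < new_cost[j]:
                      new_cost[j], choice[k, j] = c, i
          cost = new_cost

      # Backtrack the optimal altitude profile
      j = int(np.argmin(cost))
      profile = [j]
      for k in range(nseg - 1, 0, -1):
          j = choice[k, j]
          profile.append(j)
      profile.reverse()

    Successive refinement in the planner amounts to re-running such a search on progressively finer grids around the current best trajectory.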

  11. Use of ``virtual'' field trips in teaching introductory geology

    NASA Astrophysics Data System (ADS)

    Hurst, Stephen D.

    1998-08-01

    We designed a series of case studies for Introductory Geology Laboratory courses using computer visualization techniques integrated with traditional laboratory materials. These consist of a comprehensive case study which requires three two-hour long laboratory periods to complete, and several shorter case studies requiring one or two, two-hour laboratory periods. Currently we have prototypes of the Yellowstone National Park, Hawaii volcanoes and the Mid-Atlantic Ridge case studies. The Yellowstone prototype can be used to learn about a wide variety of rocks and minerals, about geothermal activity and hydrology, about volcanic hazards and the hot-spot theory of plate tectonics. The Hawaiian exercise goes into more depth about volcanoes, volcanic rocks and their relationship to plate movements. The Mid-Atlantic Ridge project focuses on formation of new ocean crust and mineral-rich hydrothermal deposits at spreading centers. With new improvements in visualization technology that are making their way to personal computers, we are now closer to the ideal of a "virtual" field trip. We are currently making scenes of field areas in Hawaii and Yellowstone which allow the student to pan around the area and zoom in on interesting objects. Specific rocks in the scene will be able to be "picked up" and studied in three dimensions. This technology improves the ability of the computer to present a realistic simulation of the field area and allows the student to have more control over the presentation. This advanced interactive technology is intuitive to control, relatively cheap and easy to add to existing computer programs and documents.

  12. Investigating the Relationship between the Half-Life Decay of the Height and the Coefficient of Restitution of Bouncing Balls Using a Microcomputer-Based Laboratory

    ERIC Educational Resources Information Center

    Amrani, D.

    2010-01-01

    This pedagogical activity is aimed at students using a computer-learning environment with advanced tools for data analysis. It investigates the relationship between the coefficient of restitution and the way the heights of different bouncing balls decrease in a number of bounces with time. The time between successive ball bounces, or…
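
    The underlying relationship follows from energy conservation at each impact: a ball rebounding with coefficient of restitution e from drop height h_n reaches

      h_{n+1} = e^2 h_n \quad\Longrightarrow\quad h_n = h_0 \, e^{2n},
      \qquad n_{1/2} = \frac{\ln 2}{2 \ln(1/e)}

    so the bounce heights decay geometrically, and the "half-life" number of bounces n_{1/2} is fixed entirely by e, which is what the microcomputer-based measurements can verify.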

  13. Impact of Multi-Media Tutorials in a Computer Science Laboratory Course--An Empirical Study

    ERIC Educational Resources Information Center

    Dalal, Medha

    2014-01-01

    Higher education institutes of North America, Europe and far-east Asia have been leveraging the advances in ICT for quite some time. However, research based knowledge on the use of ICT in the higher education institutes of central and south-east Asia is still not readily available. The study presented in this paper explores a variant of teaching…

  14. Argonne News Brief: Cancer’s Big Data Problem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Data is pouring into the hands of cancer researchers, thanks to improvements in imaging, models and understanding of genetics. But they’ll need a lot of help—and some powerful supercomputers—to translate this data into better, more personalized treatment for cancer patients. A new initiative called the Joint Design of Advanced Computing Solutions for Cancer, which taps four different national laboratories, is poised to help.

  15. Advanced Concepts Theory Annual Report 1983.

    DTIC Science & Technology

    1984-05-18

    variety of theoretical models, tools, and computational strategies to understand, guide, and predict the behavior of high-brightness laboratory x-ray... theoretical models must treat hard and soft x-ray emission from different electron configurations with K, L, and M shells, and they must include... theoretical effort has been devoted to elucidating the effects of opacity on the numerical results, providing a basis for comprehending the trends which appear in the...

  16. Annoyance caused by advanced turboprop aircraft flyover noise: Comparison of different propeller configurations

    NASA Technical Reports Server (NTRS)

    Mccurdy, David A.

    1991-01-01

    A laboratory experiment was conducted to compare the annoyance of flyover noise from advanced turboprop aircraft having different propeller configurations with the annoyance of conventional turboprop and turbofan aircraft flyover noise. A computer synthesis system was used to generate 40 realistic, time varying simulations of advanced turboprop takeoff noise. Of the 40 noises, single-rotating propeller configurations (8) and counter-rotating propeller configurations with an equal (12) and unequal (20) number of blades on each rotor were represented. Analyses found that advanced turboprops with single-rotating propellers were, on average, slightly less annoying than the other aircraft. Fundamental frequency and tone-to-broadband noise ratio affected annoyance response to advanced turboprops, but the effects varied with propeller configuration and noise metric. The addition of duration corrections and corrections for tones above 500 Hz to the noise measurement procedures improved annoyance prediction ability.

  17. Annoyance caused by advanced turboprop aircraft flyover noise: Comparison of different propeller configurations

    NASA Astrophysics Data System (ADS)

    McCurdy, David A.

    1991-10-01

    A laboratory experiment was conducted to compare the annoyance of flyover noise from advanced turboprop aircraft having different propeller configurations with the annoyance of conventional turboprop and turbofan aircraft flyover noise. A computer synthesis system was used to generate 40 realistic, time varying simulations of advanced turboprop takeoff noise. Of the 40 noises, single-rotating propeller configurations (8) and counter-rotating propeller configurations with an equal (12) and unequal (20) number of blades on each rotor were represented. Analyses found that advanced turboprops with single-rotating propellers were, on average, slightly less annoying than the other aircraft. Fundamental frequency and tone-to-broadband noise ratio affected annoyance response to advanced turboprops, but the effects varied with propeller configuration and noise metric. The addition of duration corrections and corrections for tones above 500 Hz to the noise measurement procedures improved annoyance prediction ability.

  18. CE-ACCE: The Cloud Enabled Advanced sCience Compute Environment

    NASA Astrophysics Data System (ADS)

    Cinquini, L.; Freeborn, D. J.; Hardman, S. H.; Wong, C.

    2017-12-01

    Traditionally, Earth Science data from NASA remote sensing instruments has been processed by building custom data processing pipelines (often based on a common workflow engine or framework) which are typically deployed and run on an internal cluster of computing resources. This approach has some intrinsic limitations: it requires each mission to develop and deploy a custom software package on top of the adopted framework; it makes use of dedicated hardware, network, and storage resources, which must be specifically purchased, maintained, and re-purposed at mission completion; and computing services cannot be scaled on demand beyond the capability of the available servers. More recently, the rise of Cloud computing, coupled with other advances in containerization technology (most prominently, Docker) and micro-services architecture, has enabled a new paradigm, whereby space mission data can be processed through standard system architectures, which can be seamlessly deployed and scaled on demand on either on-premise clusters or commercial Cloud providers. In this talk, we will present one such architecture named CE-ACCE ("Cloud Enabled Advanced sCience Compute Environment"), which we have been developing at the NASA Jet Propulsion Laboratory over the past year. CE-ACCE is based on the Apache OODT ("Object Oriented Data Technology") suite of services for full data lifecycle management, which are turned into a composable array of Docker images, and complemented by a plug-in model for mission-specific customization. We have applied this infrastructure to both flying and upcoming NASA missions, such as ECOSTRESS and SMAP, and demonstrated deployment on the Amazon Cloud, either using simple EC2 instances or advanced AWS services such as AWS Lambda and ECS (EC2 Container Service).
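
    The composable-container pattern described here is easy to sketch in miniature. The image names, stages, and volume below are hypothetical placeholders, not the actual CE-ACCE or Apache OODT images; the point is only the shape of a pipeline assembled from interchangeable Docker stages.

    ```python
    # Minimal sketch of a pipeline composed of containerized stages sharing
    # a named data volume. Image names and arguments are hypothetical.
    import subprocess

    PIPELINE = [
        ("example/ingest:latest",    ["--input", "/data/raw"]),
        ("example/calibrate:latest", ["--level", "L1"]),
        ("example/archive:latest",   ["--catalog", "/data/catalog"]),
    ]

    def run_stage(image, args, data_volume="mission-data"):
        """Run one pipeline stage as a container mounting the shared volume."""
        cmd = ["docker", "run", "--rm",
               "-v", f"{data_volume}:/data", image, *args]
        subprocess.run(cmd, check=True)

    for image, args in PIPELINE:
        run_stage(image, args)
    ```

    Because each stage is just an image plus arguments, the same pipeline definition can run on an on-premise cluster or be handed to a Cloud container service, which is the portability the abstract emphasizes.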

  19. Neutron Characterization for Additive Manufacturing

    NASA Technical Reports Server (NTRS)

    Watkins, Thomas; Bilheux, Hassina; An, Ke; Payzant, Andrew; DeHoff, Ryan; Duty, Chad; Peter, William; Blue, Craig; Brice, Craig A.

    2013-01-01

    Oak Ridge National Laboratory (ORNL) is leveraging decades of experience in neutron characterization of advanced materials together with resources such as the Spallation Neutron Source (SNS) and the High Flux Isotope Reactor (HFIR) shown in Fig. 1 to solve challenging problems in additive manufacturing (AM). Additive manufacturing, or three-dimensional (3-D) printing, is a rapidly maturing technology wherein components are built by selectively adding feedstock material at locations specified by a computer model. The majority of these technologies use thermally driven phase change mechanisms to convert the feedstock into functioning material. As the molten material cools and solidifies, the component is subjected to large thermal gradients, generating significant internal stresses throughout the part (Fig. 2). As layers are added, inherent residual stresses cause warping and distortions that lead to geometrical differences between the final part and the original computer-generated design. This effect also limits the geometries that can be fabricated using AM, such as thin-walled, high-aspect-ratio, and overhanging structures. Distortion may be minimized by intelligent toolpath planning or strategic placement of support structures, but these approaches are not well understood and often "Edisonian" in nature. Residual stresses can also impact component performance during operation. For example, in a thermally cycled environment such as a high-pressure turbine engine, residual stresses can cause components to distort unpredictably. Different thermal treatments on as-fabricated AM components have been used to minimize residual stress, but components still retain a nonhomogeneous stress state and/or demonstrate a relaxation-derived geometric distortion. Industry, federal laboratory, and university collaboration is needed to address these challenges and enable the U.S. to compete in the global market. Work is currently being conducted on AM technologies at the ORNL Manufacturing Demonstration Facility (MDF), sponsored by the DOE's Advanced Manufacturing Office. The MDF is focusing on R&D of both metal and polymer AM pertaining to in-situ process monitoring and closed-loop controls; implementation of advanced materials in AM technologies; and demonstration, characterization, and optimization of next-generation technologies. ORNL is working directly with industry partners to leverage world-leading facilities in fields such as high performance computing, advanced materials characterization, and neutron sciences to solve fundamental challenges in advanced manufacturing. Specifically, the MDF is leveraging two of the world's most advanced neutron facilities, the HFIR and SNS, to characterize additively manufactured components.
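
    The neutron measurements described here infer residual stress from shifts in crystal lattice spacing. A rough sketch of that standard reduction follows, using generic elastic constants and invented d-spacings rather than ORNL data:

    ```python
    # Minimal sketch of the standard neutron strain-scanning reduction:
    # lattice strain from d-spacing shifts, then triaxial Hooke's law.
    # All numbers are illustrative, not measured values.
    E, nu = 200e9, 0.30            # elastic constants for a generic steel, Pa
    d0 = 1.1702e-10                # stress-free lattice spacing, m (illustrative)

    # Lattice spacings measured along three orthogonal directions.
    d = {"x": 1.1710e-10, "y": 1.1705e-10, "z": 1.1698e-10}
    eps = {k: (v - d0) / d0 for k, v in d.items()}   # strain per direction

    trace = sum(eps.values())
    def stress(direction):
        """sigma_ii = E/(1+nu) * (eps_ii + nu/(1-2nu) * (eps_x+eps_y+eps_z))"""
        return E / (1 + nu) * (eps[direction] + nu / (1 - 2 * nu) * trace)

    for k in ("x", "y", "z"):
        print(f"sigma_{k} = {stress(k)/1e6:7.1f} MPa")
    ```

    Mapping such stresses through an as-built part is what reveals the warping and distortion mechanisms discussed above.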

  20. Introducing Enabling Computational Tools to the Climate Sciences: Multi-Resolution Climate Modeling with Adaptive Cubed-Sphere Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jablonowski, Christiane

    The research investigates and advances strategies for bridging the scale discrepancies between local, regional, and global phenomena in climate models without the prohibitive computational costs of global cloud-resolving simulations. In particular, the research explores new frontiers in computational geoscience by introducing high-order Adaptive Mesh Refinement (AMR) techniques into climate research. AMR and statically-adapted variable-resolution approaches represent an emerging trend for atmospheric models and are likely to become the new norm in future-generation weather and climate models. The research advances the understanding of multi-scale interactions in the climate system and showcases a pathway for modeling these interactions effectively with advanced computational tools, like the Chombo AMR library developed at the Lawrence Berkeley National Laboratory. The research is interdisciplinary and combines applied mathematics, scientific computing, and the atmospheric sciences. In this research project, a hierarchy of high-order atmospheric models on cubed-sphere computational grids has been developed that serves as an algorithmic prototype for the finite-volume solution-adaptive Chombo-AMR approach. The foci of the investigations have been the characteristics of both static mesh adaptations and dynamically-adaptive grids that can capture flow fields of interest like tropical cyclones. Six research themes have been chosen. These are (1) the introduction of adaptive mesh refinement techniques into the climate sciences, (2) advanced algorithms for nonhydrostatic atmospheric dynamical cores, (3) an assessment of the interplay between resolved-scale dynamical motions and subgrid-scale physical parameterizations, (4) evaluation techniques for atmospheric model hierarchies, (5) the comparison of AMR refinement strategies, and (6) tropical cyclone studies with a focus on multi-scale interactions and variable-resolution modeling. The results of this research project demonstrate significant advances in all six research areas. The major conclusions are that statically-adaptive variable-resolution modeling is currently becoming mature in the climate sciences, and that AMR holds outstanding promise for future-generation weather and climate models on high-performance computing architectures.
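
    The essence of dynamic AMR as described here is a refinement criterion that flags cells containing features of interest, such as a tropical cyclone's vorticity core, followed by local grid refinement. A toy sketch of that pattern (the threshold, field, and replication-based "refinement" are illustrative, not the Chombo implementation):

    ```python
    # Toy dynamic-AMR pattern: flag high-vorticity cells, then refine them.
    # Threshold and field are illustrative placeholders.
    import numpy as np

    def flag_cells_for_refinement(vorticity, threshold=1e-4):
        """Boolean mask of coarse cells whose |vorticity| (s^-1) exceeds
        the refinement threshold."""
        return np.abs(vorticity) > threshold

    def refine_patch(field, factor=2):
        """Toy refinement: replicate each coarse cell into a factor x factor
        block of fine cells (a real AMR library interpolates instead)."""
        return np.kron(field, np.ones((factor, factor)))

    rng = np.random.default_rng(0)
    zeta = rng.normal(0.0, 5e-5, size=(16, 16))     # toy vorticity field
    mask = flag_cells_for_refinement(zeta)
    print(f"{mask.sum()} of {mask.size} cells flagged; "
          f"fine patch shape: {refine_patch(zeta).shape}")
    ```

    In a production model this flag-and-refine cycle repeats every few time steps, so resolution follows the storm rather than being fixed in advance.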

  1. Advanced Thermal Batteries.

    DTIC Science & Technology

    1981-06-01

    ADVANCED THERMAL BATTERIES NATIONAL UNION ELECTRIC CORPORATION ADVANCE SCIENCE DIVISION 1201 E. BELL STREET BLOOMINGTON, ILLINOIS 61701 JUNE 1981...December 1978 in: "Advanced Thermal Batteries" AFAPL-TR-78-114 Air Force Aero Propulsion Laboratory Air Force Wright Aeronautical Laboratories Air Force...March 1980 in: "Advanced Thermal Batteries" AFAPL-TR-80-2017 Air Force Aero Propulsion Laboratory Air Force Wright Aeronautical Laboratories Air Force

  2. Real-Time Hardware-in-the-Loop Simulation of Ares I Launch Vehicle

    NASA Technical Reports Server (NTRS)

    Tobbe, Patrick; Matras, Alex; Walker, David; Wilson, Heath; Fulton, Chris; Alday, Nathan; Betts, Kevin; Hughes, Ryan; Turbe, Michael

    2009-01-01

    The Ares Real-Time Environment for Modeling, Integration, and Simulation (ARTEMIS) has been developed for use by the Ares I launch vehicle System Integration Laboratory at the Marshall Space Flight Center. The primary purpose of the Ares System Integration Laboratory is to test the vehicle avionics hardware and software in a hardware-in-the-loop environment to certify that the integrated system is prepared for flight. ARTEMIS has been designed to be the real-time simulation backbone to stimulate all required Ares components for verification testing. ARTEMIS provides high-fidelity dynamics, actuator, and sensor models to simulate an accurate flight trajectory in order to ensure realistic test conditions. ARTEMIS has been designed to take advantage of the advances in underlying computational power now available to support hardware-in-the-loop testing to achieve real-time simulation with unprecedented model fidelity. A modular real-time design relying on a fully distributed computing architecture has been implemented.

  3. Laboratory directed research and development fy1999 annual report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Al-Ayat, R A

    2000-04-11

    The Lawrence Livermore National Laboratory (LLNL) was founded in 1952 and has been managed since its inception by the University of California (UC) for the U.S. Department of Energy (DOE). Because of this long association with UC, the Laboratory has been able to recruit a world-class workforce, establish an atmosphere of intellectual freedom and innovation, and achieve recognition in relevant fields of knowledge as a scientific and technological leader. This environment and reputation are essential for sustained scientific and technical excellence. As a DOE national laboratory with about 7,000 employees, LLNL has an essential and compelling primary mission to ensure that the nation's nuclear weapons remain safe, secure, and reliable and to prevent the spread and use of nuclear weapons worldwide. The Laboratory receives funding from the DOE Assistant Secretary for Defense Programs, whose focus is stewardship of our nuclear weapons stockpile. Funding is also provided by the Deputy Administrator for Defense Nuclear Nonproliferation, many Department of Defense sponsors, other federal agencies, and the private sector. As a multidisciplinary laboratory, LLNL has applied its considerable skills in high-performance computing, advanced engineering, and the management of large research and development projects to become the science and technology leader in those areas of its mission responsibility. The Laboratory Directed Research and Development (LDRD) Program was authorized by the U.S. Congress in 1984. The Program allows the Director of each DOE laboratory to fund advanced, creative, and innovative research and development (R&D) activities that will ensure scientific and technical vitality in the continually evolving mission areas at DOE and the Laboratory. In addition, the LDRD Program provides LLNL with the flexibility to nurture and enrich essential scientific and technical competencies, which attract the most qualified scientists and engineers. The LDRD Program also enables many collaborations with the scientific community in academia, national and international laboratories, and industry. The projects in the FY1999 LDRD portfolio were carefully selected to continue vigorous support of the strategic vision and the long-term goals of DOE and the Laboratory. Projects chosen for LDRD funding undergo stringent selection processes, which look for high-potential scientific return, emphasize strategic relevance, and feature technical peer reviews by external and internal experts. The FY1999 projects described in this annual report focus on supporting the Laboratory's national security needs: stewardship of the U.S. nuclear weapons stockpile, responsibility for the counter- and nonproliferation of weapons of mass destruction, development of high-performance computing, and support of DOE environmental research and waste management programs. In the past, LDRD investments have significantly enhanced LLNL scientific capabilities and greatly contributed to the Laboratory's ability to meet its national security programmatic requirements. Examples of past investments include technical precursors to the Accelerated Strategic Computing Initiative (ASCI), special-materials processing and characterization, and biodefense. Our analysis of the FY1999 portfolio shows that it strongly supports the Laboratory's national security mission. About 95% of the LDRD dollars directly supported LLNL's national security activities in FY1999, far exceeding the 63% share of LLNL's overall budget supported by National Security Programs that year.

  4. Laboratories | NREL

    Science.gov Websites

    Accelerated Exposure Testing Laboratory; Advanced Optical Materials Laboratory; Advanced Thermal Laboratory; Structural Testing Laboratory; Surface Analysis Laboratory; Systems Performance Laboratory; Thermal Storage Materials Laboratory; Thermal Storage Process and Components Laboratory; Thin-Film Deposition

  5. Magneto Caloric Effect in Ni-Mn-Ga alloys: First Principles and Experimental studies

    NASA Astrophysics Data System (ADS)

    Odbadrakh, Khorgolkhuu; Nicholson, Don; Brown, Gregory; Rusanu, Aurelian; Rios, Orlando; Hodges, Jason; Safa-Sefat, Athena; Ludtka, Gerard; Eisenbach, Markus; Evans, Boyd

    2012-02-01

    Understanding the Magneto-Caloric Effect (MCE) in alloys with real technological potential is important to the development of viable MCE-based products. We report results of a computational and experimental investigation of candidate MCE materials, Ni-Mn-Ga alloys. The Wang-Landau statistical method is used in tandem with the Locally Self-consistent Multiple Scattering (LSMS) method to explore magnetic states of the system. A classical Heisenberg Hamiltonian is parametrized based on these states and used to obtain the density of magnetic states. The Curie temperature, isothermal entropy change, and adiabatic temperature change are then calculated from the density of states. Experiments to observe the structural and magnetic phase transformations were performed at the Spallation Neutron Source (SNS) at Oak Ridge National Laboratory (ORNL) on alloys of Ni-Mn-Ga and Fe-Ni-Mn-Ga-Cu. Data from the observations are discussed in comparison with the computational studies. This work was sponsored by the Laboratory Directed Research and Development Program (ORNL); by the Mathematical, Information, and Computational Sciences Division, Office of Advanced Scientific Computing Research (US DOE); and by the Materials Sciences and Engineering Division, Office of Basic Energy Sciences (US DOE).
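
    The Wang-Landau step above builds up ln g(E), the log density of states, by penalizing each energy every time it is visited; thermodynamic quantities then follow from g(E). The following toy sketch substitutes a small 2-D Ising model for the study's parametrized Heisenberg Hamiltonian, with all sizes and schedules invented for the example:

    ```python
    # Illustrative Wang-Landau sampler on a 4x4 Ising model (J = 1), as a
    # stand-in for the Heisenberg Hamiltonian used in the study.
    import math
    import random

    L = 4
    s = [[random.choice((-1, 1)) for _ in range(L)] for _ in range(L)]

    def total_energy():
        e = 0
        for i in range(L):
            for j in range(L):
                e -= s[i][j] * (s[(i + 1) % L][j] + s[i][(j + 1) % L])
        return e

    log_g = {}                          # running estimate of ln g(E)
    E = total_energy()
    f = 1.0                             # ln of the modification factor
    while f > 1e-4:
        for _ in range(20000):
            i, j = random.randrange(L), random.randrange(L)
            nb = (s[(i + 1) % L][j] + s[(i - 1) % L][j]
                  + s[i][(j + 1) % L] + s[i][(j - 1) % L])
            E_new = E + 2 * s[i][j] * nb        # energy if spin (i, j) flips
            # Accept with probability min(1, g(E)/g(E_new)).
            diff = log_g.get(E, 0.0) - log_g.get(E_new, 0.0)
            if diff >= 0 or random.random() < math.exp(diff):
                s[i][j] *= -1
                E = E_new
            log_g[E] = log_g.get(E, 0.0) + f    # penalize the visited energy
        f /= 2.0                        # tighten (flatness check omitted)

    # Thermodynamics follow from g(E); e.g., an (unnormalized) partition sum:
    for T in (1.0, 2.27, 4.0):
        lw = [log_g[e] - min(log_g.values()) - e / T for e in sorted(log_g)]
        m = max(lw)
        print(f"T={T}: ln Z ~ {m + math.log(sum(math.exp(x - m) for x in lw)):.2f}")
    ```

    From ln g(E) one can form free energies and entropies at any temperature, which is how the Curie temperature and isothermal entropy change are extracted in the study.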

  6. Combining high performance simulation, data acquisition, and graphics display computers

    NASA Technical Reports Server (NTRS)

    Hickman, Robert J.

    1989-01-01

    Issues involved in the continuing development of an advanced simulation complex are discussed. This approach provides the capability to perform the majority of tests on advanced systems non-destructively. The controlled test environments can be replicated to examine the response of the systems under test to alternative treatments of the system control design, or to test the function and qualification of specific hardware. Field tests verify that the elements simulated in the laboratories are sufficient. The digital computer is hosted by a Digital Equipment Corp. MicroVAX computer, with an Aptec Computer Systems Model 24 I/O computer performing the communication function. An Applied Dynamics International AD100 performs the high-speed simulation computing, and an Evans and Sutherland PS350 performs on-line graphics display. A Scientific Computer Systems SCS40 acts as a high-performance FORTRAN program processor to support the complex by generating the numerous large files, produced from programs coded in FORTRAN, that are required for the real-time processing. Four programming languages are involved in the process: FORTRAN, ADSIM, ADRIO, and STAPLE. FORTRAN is employed on the MicroVAX host to initialize and terminate the simulation runs on the system. The generation of the data files on the SCS40 also is performed with FORTRAN programs. ADSIM and ADRIO are used to program the processing elements of the AD100 and its IOCP processor. STAPLE is used to program the Aptec DIP and DIA processors.

  7. Be a Mentor and Experience the Excitement of Rediscovery | Poster

    Cancer.gov

    You don’t really know something until you can teach it to someone. Raul Cachau said he believes this is not only true in academia, but in research laboratories as well. He said that being a mentor means rediscovering things long taken for granted. “It really forces you to rethink some of the things you do,” said Cachau, Ph.D., principal scientist, Advanced Biomedical Computing

  8. Analysis of Alternatives (AoA) of Open Collaboration and Research Capabilities: Collaboration in Research and Engineering in Advanced Technology and Education and High-Performance Computing Innovation Center (HPCIC) on the LVOC.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vrieling, P. Douglas

    2016-01-01

    The Livermore Valley Open Campus (LVOC), a joint initiative of the National Nuclear Security Administration (NNSA), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL), enhances the national security missions of NNSA by promoting greater collaboration between world-class scientists at the national security laboratories and their partners in industry and academia. Strengthening the science, technology, and engineering (ST&E) base of our nation is one of the NNSA's top goals. By conducting coordinated and collaborative programs, LVOC enhances both the NNSA and the broader national science and technology base, and helps to ensure the health of core capabilities at LLNL and SNL. These capabilities must remain strong to enable the laboratories to execute their primary mission for NNSA.

  9. Experimental, Theoretical and Computational Studies of Plasma-Based Concepts for Future High Energy Accelerators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joshi, Chan; Mori, W.

    2013-10-21

    This is the final report on DOE grant number DE-FG02-92ER40727, titled "Experimental, Theoretical and Computational Studies of Plasma-Based Concepts for Future High Energy Accelerators." During this grant period, the UCLA program on Advanced Plasma Based Accelerators, headed by Professor C. Joshi, made many key scientific advances and trained a generation of students, many of whom have stayed in this research field and even started research programs of their own. In this final report, however, we focus on the last three years of the grant and report on the scientific progress made in each of the four tasks listed under the grant: Plasma Wakefield Accelerator Research at FACET, SLAC National Accelerator Laboratory; In-House Research at UCLA's Neptune and 20 TW Laser Laboratories; Laser-Wakefield Acceleration (LWFA) in the Self-Guided Regime: Experiments at the Callisto Laser at LLNL; and Theory and Simulations. Major scientific results have been obtained in each of the four tasks described in this report. These have led to publications in prestigious scientific journals, the graduation and continued training of high-quality Ph.D. students, and a continued U.S. presence at the forefront of the plasma-based accelerator research field.

  10. Clinical chemistry through Clinical Chemistry: a journal timeline.

    PubMed

    Rej, Robert

    2004-12-01

    The establishment of the modern discipline of clinical chemistry was concurrent with the foundation of the journal Clinical Chemistry and that of the American Association for Clinical Chemistry in the late 1940s and early 1950s. To mark the 50th volume of this Journal, I chronicle and highlight scientific milestones, and those within the discipline, as documented in the pages of Clinical Chemistry. Amazing progress has been made in the field of laboratory diagnostics over these five decades, in many cases paralleling, as well as being bolstered by, the rapid pace of development in computer technologies. Specific areas of laboratory medicine particularly well represented in Clinical Chemistry include lipids, endocrinology, protein markers, quality of laboratory measurements, molecular diagnostics, and general advances in methodology and instrumentation.

  11. On the Reaction Mechanism of Acetaldehyde Decomposition on Mo(110)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mei, Donghai; Karim, Ayman M.; Wang, Yong

    2012-02-16

    The strong Mo-O bond provides promising reactivity of Mo-based catalysts for the deoxygenation of biomass-derived oxygenates. Combining the dimer saddle-point searching method with periodic spin-polarized density functional theory calculations, we investigated the reaction pathways of acetaldehyde decomposition on the clean Mo(110) surface. Two reaction pathways were identified: a selective deoxygenation pathway and a nonselective fragmentation pathway. We found that acetaldehyde preferentially adsorbs at the pseudo 3-fold hollow site in the η2(C,O) configuration on Mo(110). Among the four possible bond cleavages (β-C-H, γ-C-H, C-O, and C-C), the initial decomposition of adsorbed acetaldehyde produces either ethylidene via C-O bond scission or acetyl via β-C-H bond scission, while the C-C and γ-C-H bond cleavages, which would lead to methyl (and formyl) and formylmethyl, are unlikely. Further dehydrogenations of ethylidene into either ethylidyne or vinyl are competing and very facile, with low activation barriers of 0.24 and 0.31 eV, respectively. Concurrently, the formed acetyl would deoxygenate into ethylidyne via C-O cleavage rather than breaking the C-C or C-H bonds. The selective deoxygenation of acetaldehyde to ethylene is inhibited by the relatively weak hydrogenation capability of the Mo(110) surface. Instead, the nonselective pathway via vinyl and vinylidene dehydrogenations to ethynyl as the final hydrocarbon fragment is kinetically favorable. The strong interaction between ethylene and the Mo(110) surface also leads to ethylene decomposition rather than desorption into the gas phase. This work was financially supported by the National Advanced Biofuels Consortium (NABC). Computing time was granted by a user project (emsl42292) at the Molecular Science Computing Facility in the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL). The EMSL is a U.S. Department of Energy (DOE) national scientific user facility located at Pacific Northwest National Laboratory (PNNL) and supported by the DOE Office of Biological and Environmental Research. Pacific Northwest National Laboratory is operated by Battelle for the U.S. Department of Energy.
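
    A quick back-of-envelope check shows why the two ethylidene dehydrogenation channels (0.24 eV vs. 0.31 eV) are described as competing. Assuming similar pre-exponential factors (an assumption for illustration, not stated in the paper), the Arrhenius rate ratio depends only on the barrier difference:

    ```python
    # Arrhenius rate ratio for the two barriers reported above, assuming
    # equal prefactors (an illustrative assumption).
    import math

    kB_eV = 8.617333e-5                 # Boltzmann constant, eV/K

    def rate_ratio(Ea1, Ea2, T):
        """k1/k2 = exp(-(Ea1 - Ea2) / (kB * T)) for equal prefactors."""
        return math.exp(-(Ea1 - Ea2) / (kB_eV * T))

    for T in (300.0, 500.0, 700.0):
        r = rate_ratio(0.24, 0.31, T)
        print(f"T = {T:5.0f} K: k(0.24 eV) / k(0.31 eV) = {r:5.2f}")
    ```

    The ratio falls from roughly 15 at 300 K to about 3 at 700 K, so both channels remain kinetically relevant at typical reaction temperatures, consistent with the "competing and very facile" description.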

  12. A hardware/software environment to support R&D in intelligent machines and mobile robotic systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mann, R.C.

    1990-01-01

    The Center for Engineering Systems Advanced Research (CESAR) serves as a focal point at the Oak Ridge National Laboratory (ORNL) for basic and applied research in intelligent machines. R&D at CESAR addresses issues related to autonomous systems, unstructured (i.e., incompletely known) operational environments, and multiple performing agents. Two mobile robot prototypes (HERMIES-IIB and HERMIES-III) are being used to test new developments in several robot component technologies. This paper briefly introduces the computing environment at CESAR, which includes three hypercube concurrent computers (two on-board the mobile robots), a graphics workstation, a VAX, and multiple VME-based systems (several on-board the mobile robots). The current software environment at CESAR is intended to satisfy several goals, e.g., code portability; re-usability in different experimental scenarios; modularity; concurrent computer hardware that is transparent to the applications programmer; future support for multiple mobile robots; support for human-machine interface modules; and support for integration of software from other, geographically disparate laboratories with different hardware set-ups. 6 refs., 1 fig.

  13. Teaching hydrogeology: a review of current practice

    NASA Astrophysics Data System (ADS)

    Gleeson, T.; Allen, D. M.; Ferguson, G.

    2012-07-01

    Hydrogeology is now taught in a broad spectrum of departments and institutions to students with diverse backgrounds. Successful instruction in hydrogeology thus requires a variety of pedagogical approaches depending on desired learning outcomes and the background of students. We review the pedagogical literature in hydrogeology to highlight recent advances and analyze a 2005 survey among 68 hydrogeology instructors. The literature and survey results suggest there are only ~15 topics that are considered crucial by most hydrogeologists and >100 other topics that are considered crucial by some hydrogeologists. The crucial topics focus on properties of aquifers and fundamentals of groundwater flow, and should likely be part of all undergraduate hydrogeology courses. Other topics can supplement and support these crucial topics, depending on desired learning outcomes. Classroom settings continue to provide a venue for emphasizing fundamental knowledge. However, recent pedagogical advances are biased towards field and laboratory instruction with a goal of bolstering experiential learning. Field methods build on the fundamentals taught in the classroom and emphasize the collection of data, data uncertainty, and the development of vocational skills. Laboratory and computer-based exercises similarly build on theory, and offer an opportunity for data analysis and integration. The literature suggests curricula at all levels should ideally balance field, laboratory, and classroom pedagogy into an iterative and integrative whole. An integrated, iterative and balanced approach leads to greater student motivation and advancement of theoretical and vocational knowledge.
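
    As a concrete illustration of the computer-based exercises the review describes, the fundamentals of groundwater flow reduce to Darcy's law, which a student can explore in a few lines. The parameter values below are invented for the example:

    ```python
    # Minimal Darcy's-law exercise of the kind described above; one-dimensional
    # flow between two observation wells, with illustrative parameters.
    K = 1.0e-5              # hydraulic conductivity, m/s
    h1, h2 = 102.0, 100.0   # hydraulic heads at the two wells, m
    L_dist = 500.0          # distance between the wells, m
    n_e = 0.25              # effective porosity

    gradient = (h1 - h2) / L_dist   # dimensionless hydraulic gradient
    q = K * gradient                # specific discharge (Darcy flux), m/s
    v = q / n_e                     # average linear (seepage) velocity, m/s

    print(f"gradient = {gradient:.4f}, q = {q:.2e} m/s, v = {v:.2e} m/s")
    ```

    Extending such a script to real well data, with its measurement uncertainty, is exactly the data-analysis-and-integration step the survey's instructors emphasize.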

  14. Flowing with the changing needs of hydrogeology instruction

    NASA Astrophysics Data System (ADS)

    Gleeson, T.; Allen, D. M.; Ferguson, G.

    2012-01-01

    Hydrogeology is now taught in a broad spectrum of departments and institutions to students with diverse backgrounds. Successful instruction in hydrogeology thus requires a variety of pedagogical approaches depending on desired learning outcomes and the diverse background of students. We review the pedagogical literature in hydrogeology to highlight recent advances and analyze a 2005 survey of 68 hydrogeology instructors. The literature and survey results suggest there are ~15 topics that are considered crucial by most hydrogeologists and >100 other topics that are considered crucial by some hydrogeologists. The crucial topics focus on properties of aquifers and fundamentals of groundwater flow, and should likely be part of all undergraduate hydrogeology courses. Other topics can supplement and support these crucial topics, depending on desired learning outcomes. Classroom settings continue to provide a venue for emphasizing fundamental knowledge. However, recent pedagogical advances are biased towards field and laboratory instruction with a goal of bolstering experiential learning. Field methods build on the fundamentals taught in the classroom and emphasize the collection of data, data uncertainty, and the development of vocational skills. Laboratory and computer-based exercises similarly build on theory, and offer an opportunity for data analysis and integration. The literature suggests curricula at all levels should ideally balance field, laboratory, and classroom pedagogy into an iterative and integrative whole. An integrated approach leads to greater student motivation and advancement of theoretical and vocational knowledge.

  15. NASA Applications of Molecular Nanotechnology

    NASA Technical Reports Server (NTRS)

    Globus, Al; Bailey, David; Han, Jie; Jaffe, Richard; Levit, Creon; Merkle, Ralph; Srivastava, Deepak

    1998-01-01

    Laboratories throughout the world are rapidly gaining atomically precise control over matter. As this control extends to an ever wider variety of materials, processes and devices, opportunities for applications relevant to NASA's missions will be created. This document surveys a number of future molecular nanotechnology capabilities of aerospace interest. Computer applications, launch vehicle improvements, and active materials appear to be of particular interest. We also list a number of applications for each of NASA's enterprises. If advanced molecular nanotechnology can be developed, almost all of NASA's endeavors will be radically improved. In particular, a sufficiently advanced molecular nanotechnology can arguably bring large scale space colonization within our grasp.

  16. Computer Security for Commercial Nuclear Power Plants - Literature Review for Korea Hydro Nuclear Power Central Research Institute

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duran, Felicia Angelica; Waymire, Russell L.

    2013-10-01

    Sandia National Laboratories (SNL) is providing training and consultation activities on security planning and design for the Korea Hydro and Nuclear Power Central Research Institute (KHNP-CRI). As part of this effort, SNL performed a literature review on computer security requirements, guidance, and best practices that are applicable to an advanced nuclear power plant. This report documents the review of reports generated by SNL and other organizations [U.S. Nuclear Regulatory Commission, Nuclear Energy Institute, and International Atomic Energy Agency] related to protection of information technology resources, primarily digital controls and computer resources and their data networks. Copies of the key documents have also been provided to KHNP-CRI.

  17. Intelligent redundant actuation system requirements and preliminary system design

    NASA Technical Reports Server (NTRS)

    Defeo, P.; Geiger, L. J.; Harris, J.

    1985-01-01

    Several redundant actuation system configurations were designed and demonstrated to satisfy the stringent operational requirements of advanced flight control systems. However, this has been accomplished largely through brute-force hardware redundancy, resulting in significantly increased computational requirements on the flight control computers, which perform the failure analysis and reconfiguration management. Modern technology now provides powerful, low-cost microprocessors that are effective in performing failure isolation and configuration management at the local actuator level. One such concept, called an Intelligent Redundant Actuation System (IRAS), significantly reduces the flight control computer requirements and performs the local tasks more comprehensively than previously feasible. The requirements and preliminary design of an experimental laboratory system capable of demonstrating the concept and sufficiently flexible to explore a variety of configurations are discussed.

  18. Reinventing patient-centered computing for the twenty-first century.

    PubMed

    Goldberg, H S; Morales, A; Gottlieb, L; Meador, L; Safran, C

    2001-01-01

    Despite evidence over the past decade that patients like and will use patient-centered computing systems in managing their health, patients have remained forgotten stakeholders in advances in clinical computing systems. We present a framework for patient empowerment and the technical realization of that framework in an architecture called CareLink. In an evaluation of the initial deployment of CareLink in the support of neonatal intensive care, we have demonstrated a reduction in the length of stay for very-low-birthweight infants and an improvement in family satisfaction with care delivery. With the ubiquitous adoption of the Internet into the general culture, patient-centered computing provides the opportunity to mend broken health care relationships and reconnect patients to the care delivery process. CareLink itself provides functionality to support both clinical care and research, and provides a living laboratory for the further study of patient-centered computing.

  19. Computed intraoperative navigation guidance--a preliminary report on a new technique.

    PubMed

    Enislidis, G; Wagner, A; Ploder, O; Ewers, R

    1997-08-01

    Objective: To assess the value of a computer-assisted three-dimensional guidance system (Virtual Patient System) in maxillofacial operations. Design: Laboratory and open clinical study. Setting: Teaching hospital, Austria. Patients: 6 patients undergoing various procedures, including removal of a foreign body (n=3) and biopsy, maxillary advancement, and insertion of implants (n=1 each). Interventions: Storage of computed tomographic (CT) pictures on an optical disc, and imposition of intraoperative video images on to these. The resulting display is shown to the surgeon on a micromonitor in his head-up display for guidance during the operation. Aim: To improve orientation during complex or minimally invasive maxillofacial procedures and to make such operations easier and less traumatic. Results: Successful transferral of computed navigation technology into an operating room environment and positive evaluation of the method by the surgeons involved. Conclusion: Computer-assisted three-dimensional guidance systems have the potential to make complex or minimally invasive procedures easier to perform, thereby reducing postoperative morbidity.

  20. Review of wireless and wearable electroencephalogram systems and brain-computer interfaces--a mini-review.

    PubMed

    Lin, Chin-Teng; Ko, Li-Wei; Chang, Meng-Hsiu; Duann, Jeng-Ren; Chen, Jing-Ying; Su, Tung-Ping; Jung, Tzyy-Ping

    2010-01-01

    Biomedical signal monitoring systems have rapidly advanced in recent years, propelled by significant advances in electronic and information technologies. The brain-computer interface (BCI) is one of the important research branches and has become a hot topic in the study of neural engineering, rehabilitation, and brain science. Traditionally, most BCI systems use bulky, wired, laboratory-oriented sensing equipment to measure brain activity under well-controlled conditions within a confined space. Using bulky sensing equipment is not only uncomfortable and inconvenient for users, but also impedes their ability to perform routine tasks in daily operational environments. Furthermore, owing to large data volumes, signal processing of BCI systems is often performed off-line using high-end personal computers, hindering the application of BCI in real-world environments. To be practical for routine use by unconstrained, freely-moving users, BCI systems must be noninvasive, nonintrusive, lightweight, and capable of online signal processing. This work reviews recent online BCI systems, focusing especially on wearable, wireless, and real-time systems. Copyright 2009 S. Karger AG, Basel.

  1. PREFACE: 16th International workshop on Advanced Computing and Analysis Techniques in physics research (ACAT2014)

    NASA Astrophysics Data System (ADS)

    Fiala, L.; Lokajicek, M.; Tumova, N.

    2015-05-01

    This volume of the IOP Conference Series is dedicated to scientific contributions presented at the 16th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2014), whose motto this year was "bridging disciplines". The conference took place on September 1-5, 2014, at the Faculty of Civil Engineering, Czech Technical University in Prague, Czech Republic. The 16th edition of ACAT explored the boundaries of computing system architectures, data analysis algorithmics, automatic calculations, and theoretical calculation technologies. It provided a forum for confronting and exchanging ideas among these fields, where new approaches in computing technologies for scientific research were explored and promoted. This year's edition of the workshop brought together over 140 participants from all over the world. The workshop's 16 invited speakers presented key topics on advanced computing and analysis techniques in physics. During the workshop, 60 talks and 40 posters were presented in three tracks: Computing Technology for Physics Research; Data Analysis - Algorithms and Tools; and Computations in Theoretical Physics: Techniques and Methods. The round table enabled discussions on expanding software, knowledge sharing, and scientific collaboration in the respective areas. ACAT 2014 was generously sponsored by Western Digital, Brookhaven National Laboratory, Hewlett Packard, DataDirect Networks, M Computers, Bright Computing, Huawei and PDV-Systemhaus. Special appreciation goes to the track liaisons Lorenzo Moneta, Axel Naumann and Grigory Rubtsov for their work on the scientific program and the publication preparation. ACAT's IACC would also like to express its gratitude to all referees for their work on making sure the contributions are published in the proceedings. Our thanks extend to the conference liaisons Andrei Kataev and Jerome Lauret, who worked with the local contacts and made this conference possible, as well as to the program coordinator Federico Carminati and the conference chair Denis Perret-Gallix for their global supervision. Further information on ACAT 2014 can be found at http://www.particle.cz/acat2014

  2. Summary Report of Working Group 2: Computation

    NASA Astrophysics Data System (ADS)

    Stoltz, P. H.; Tsung, R. S.

    2009-01-01

    The working group on computation addressed three physics areas: (i) plasma-based accelerators (laser-driven and beam-driven), (ii) high-gradient structure-based accelerators, and (iii) electron beam sources and transport [1]. Highlights of the talks in these areas included new models of breakdown on the microscopic scale, new three-dimensional multipacting calculations with both finite difference and finite element codes, and detailed comparisons of new electron gun models with standard models such as PARMELA. The group also addressed two areas of advances in computation: (i) new algorithms, including simulation in a Lorentz-boosted frame that can reduce computation time by orders of magnitude, and (ii) new hardware architectures, like graphics processing units and Cell processors, that promise dramatic increases in computing power. Highlights of the talks in these areas included results from the first large-scale parallel finite element particle-in-cell (PIC) code, a many-order-of-magnitude speedup of the VPIC code, and details of porting it to the Roadrunner supercomputer. The working group featured two plenary talks, one by Brian Albright of Los Alamos National Laboratory on the performance of the VPIC code on the Roadrunner supercomputer, and one by David Bruhwiler of Tech-X Corporation on recent advances in computation for advanced accelerators. Highlights of the talk by Albright included the first one-trillion-particle simulations, a sustained performance of 0.3 petaflops, and an eight-times speedup of science calculations, including back-scatter in laser-plasma interaction. Highlights of the talk by Bruhwiler included simulations of 10 GeV laser wakefield accelerator stages including external injection, and new developments in electromagnetic simulations of electron guns using finite difference and finite element approaches.
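
    The boosted-frame speedup mentioned above has a simple origin: transforming to a frame moving with relativistic factor gamma shrinks the disparity between the laser wavelength and the acceleration length, and the commonly cited estimate for the resulting reduction in time steps scales as ((1 + beta) * gamma)**2. The sketch below evaluates that textbook scaling; it is an estimate from the general literature, not a result from the workshop talks.

    ```python
    # Rough estimate of the Lorentz-boosted-frame speedup, using the commonly
    # cited ((1 + beta) * gamma)**2 scaling for laser-wakefield simulations.
    import math

    def boosted_frame_speedup(gamma_boost):
        beta = math.sqrt(1.0 - 1.0 / gamma_boost**2)
        return ((1.0 + beta) * gamma_boost) ** 2

    for g in (2, 5, 10, 20):
        print(f"gamma_boost = {g:2d}: estimated speedup ~ "
              f"{boosted_frame_speedup(g):8.0f}x")
    ```

    Even a modest boost of gamma = 10 yields a speedup near 400x, which is why the technique is credited with reducing computation time by orders of magnitude.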

  3. Summary Report of Working Group 2: Computation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stoltz, P. H.; Tsung, R. S.

    2009-01-22

    The working group on computation addressed three physics areas: (i) plasma-based accelerators (laser-driven and beam-driven), (ii) high-gradient structure-based accelerators, and (iii) electron beam sources and transport [1]. Highlights of the talks in these areas included new models of breakdown on the microscopic scale, new three-dimensional multipacting calculations with both finite difference and finite element codes, and detailed comparisons of new electron gun models with standard models such as PARMELA. The group also addressed two areas of advances in computation: (i) new algorithms, including simulation in a Lorentz-boosted frame that can reduce computation time by orders of magnitude, and (ii) new hardware architectures, like graphics processing units and Cell processors, that promise dramatic increases in computing power. Highlights of the talks in these areas included results from the first large-scale parallel finite element particle-in-cell (PIC) code, a many-order-of-magnitude speedup of the VPIC code, and details of porting it to the Roadrunner supercomputer. The working group featured two plenary talks, one by Brian Albright of Los Alamos National Laboratory on the performance of the VPIC code on the Roadrunner supercomputer, and one by David Bruhwiler of Tech-X Corporation on recent advances in computation for advanced accelerators. Highlights of the talk by Albright included the first one-trillion-particle simulations, a sustained performance of 0.3 petaflops, and an eight-times speedup of science calculations, including back-scatter in laser-plasma interaction. Highlights of the talk by Bruhwiler included simulations of 10 GeV laser wakefield accelerator stages including external injection, and new developments in electromagnetic simulations of electron guns using finite difference and finite element approaches.

  4. Strategic directions of computing at Fermilab

    NASA Astrophysics Data System (ADS)

    Wolbers, Stephen

    1998-05-01

    Fermilab computing has changed a great deal over the years, driven by the demands of the Fermilab experimental community to record and analyze larger and larger datasets, by the desire to take advantage of advances in computing hardware and software, and by the advances coming from the R&D efforts of the Fermilab Computing Division. The strategic directions of Fermilab Computing continue to be driven by the needs of the experimental program. The current fixed-target run will produce over 100 TBytes of raw data and systems must be in place to allow the timely analysis of the data. The collider run II, beginning in 1999, is projected to produce of order 1 PByte of data per year. There will be a major change in methodology and software language as the experiments move away from FORTRAN and into object-oriented languages. Increased use of automation and the reduction of operator-assisted tape mounts will be required to meet the needs of the large experiments and large data sets. Work will continue on higher-rate data acquisition systems for future experiments and projects. R&D projects will be pursued as necessary to provide software, tools, or systems which cannot be purchased or acquired elsewhere. A closer working relation with other high energy laboratories will be pursued to reduce duplication of effort and to allow effective collaboration on many aspects of HEP computing.

  5. National Storage Laboratory: a collaborative research project

    NASA Astrophysics Data System (ADS)

    Coyne, Robert A.; Hulen, Harry; Watson, Richard W.

    1993-01-01

    The grand challenges of science and industry that are driving computing and communications have created corresponding challenges in information storage and retrieval. An industry-led collaborative project has been organized to investigate technology for storage systems that will be the future repositories of national information assets. Industry participants are IBM Federal Systems Company, Ampex Recording Systems Corporation, General Atomics DISCOS Division, IBM ADSTAR, Maximum Strategy Corporation, Network Systems Corporation, and Zitel Corporation. Industry members of the collaborative project are funding their own participation. Lawrence Livermore National Laboratory, through its National Energy Research Supercomputer Center (NERSC), will participate in the project as the operational site and provider of applications. The expected result is the creation of a National Storage Laboratory to serve as a prototype and demonstration facility. It is expected that this prototype will represent a significant advance in the technology for distributed storage systems capable of handling gigabyte-class files at gigabit-per-second data rates. Specifically, the collaboration expects to make significant advances in hardware, software, and systems technology in four areas of need: (1) network-attached high-performance storage; (2) multiple, dynamic, distributed storage hierarchies; (3) layered access to storage system services; and (4) storage system management.

  6. Langley applications experiments data management system study. [for space shuttles

    NASA Technical Reports Server (NTRS)

    Lanham, C. C., Jr.

    1975-01-01

    A data management system study is presented that defines, in functional terms, the most cost-effective ground data management system to support Advanced Technology Laboratory (ATL) flights of the space shuttle. Results from each subtask performed and the recommended system configuration for reformatting the experiment instrumentation tapes to computer-compatible tape are examined. Included are cost factors for development of a mini control center for real-time support of the ATL flights.

  7. Computational mechanics research and support for aerodynamics and hydraulics at TFHRC, year 2 quarter 1 progress report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lottes, S.A.; Bojanowski, C.; Shen, J.

    2012-04-09

    The computational fluid dynamics (CFD) and computational structural mechanics (CSM) focus areas at Argonne's Transportation Research and Analysis Computing Center (TRACC) initiated a project to support and complement the experimental programs at the Turner-Fairbank Highway Research Center (TFHRC) with high-performance-computing-based analysis capabilities in August 2010. The project was established with a new interagency agreement between the Department of Energy and the Department of Transportation to provide collaborative research, development, and benchmarking of advanced three-dimensional computational mechanics analysis methods to the aerodynamics and hydraulics laboratories at TFHRC for a period of five years, beginning in October 2010. The analysis methods employ well-benchmarked and supported commercial computational mechanics software. Computational mechanics encompasses the areas of Computational Fluid Dynamics (CFD), Computational Wind Engineering (CWE), Computational Structural Mechanics (CSM), and Computational Multiphysics Mechanics (CMM) applied in Fluid-Structure Interaction (FSI) problems. The major areas of focus of the project are wind and water effects on bridges - superstructure, deck, cables, and substructure (including soil), primarily during storms and flood events - and the risks that these loads pose to structural failure. For flood events at bridges, another major focus of the work is assessment of the risk to bridges caused by scour of stream and riverbed material away from the foundations of a bridge. Other areas of current research include modeling of flow through culverts to improve designs allowing for fish passage, modeling of salt spray transport into bridge girders to address the suitability of using weathering steel in bridges, and CFD analysis of the operation of the wind tunnel in the TFHRC wind engineering laboratory. This quarterly report documents technical progress on the project tasks for the period of October through December 2011.

  8. Computational mechanics research and support for aerodynamics and hydraulics at TFHRC, year 2 quarter 2 progress report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lottes, S.A.; Bojanowski, C.; Shen, J.

    2012-06-28

    The computational fluid dynamics (CFD) and computational structural mechanics (CSM) focus areas at Argonne's Transportation Research and Analysis Computing Center (TRACC) initiated a project to support and complement the experimental programs at the Turner-Fairbank Highway Research Center (TFHRC) with high-performance-computing-based analysis capabilities in August 2010. The project was established with a new interagency agreement between the Department of Energy and the Department of Transportation to provide collaborative research, development, and benchmarking of advanced three-dimensional computational mechanics analysis methods to the aerodynamics and hydraulics laboratories at TFHRC for a period of five years, beginning in October 2010. The analysis methods employ well-benchmarked and supported commercial computational mechanics software. Computational mechanics encompasses the areas of Computational Fluid Dynamics (CFD), Computational Wind Engineering (CWE), Computational Structural Mechanics (CSM), and Computational Multiphysics Mechanics (CMM) applied in Fluid-Structure Interaction (FSI) problems. The major areas of focus of the project are wind and water effects on bridges - superstructure, deck, cables, and substructure (including soil), primarily during storms and flood events - and the risks that these loads pose to structural failure. For flood events at bridges, another major focus of the work is assessment of the risk to bridges caused by scour of stream and riverbed material away from the foundations of a bridge. Other areas of current research include modeling of flow through culverts to improve designs allowing for fish passage, modeling of salt spray transport into bridge girders to address the suitability of using weathering steel in bridges, and CFD analysis of the operation of the wind tunnel in the TFHRC wind engineering laboratory. This quarterly report documents technical progress on the project tasks for the period of January through March 2012.

  9. Advanced Engineering Environment FY09/10 pilot project.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lamph, Jane Ann; Kiba, Grant W.; Pomplun, Alan R.

    2010-06-01

    The Advanced Engineering Environment (AEE) project identifies emerging engineering environment tools and assesses their value to Sandia National Laboratories and our partners in the Nuclear Security Enterprise (NSE) by testing them in our design environment. This project accomplished several pilot activities, including: the preliminary definition of an engineering bill of materials (BOM) based product structure in the Windchill PDMLink 9.0 application; an evaluation of the Mentor Graphics Data Management System (DMS) application for electrical computer-aided design (ECAD) library administration; and implementation and documentation of a Windchill 9.1 application upgrade. The project also supported the migration of legacy data from existing corporate product lifecycle management systems into new classified and unclassified Windchill PDMLink 9.0 systems. The project included two infrastructure modernization efforts: the replacement of two aging AEE development servers with reliable platforms for ongoing AEE project work, and the replacement of four critical application and license servers that support design and engineering work at the Sandia National Laboratories/California site.

  10. Virtual medicine: Utilization of the advanced cardiac imaging patient avatar for procedural planning and facilitation.

    PubMed

    Shinbane, Jerold S; Saxon, Leslie A

    Advances in imaging technology have led to a paradigm shift from planning of cardiovascular procedures and surgeries requiring the actual patient in a "brick and mortar" hospital to utilization of the digitalized patient in the virtual hospital. Cardiovascular computed tomographic angiography (CCTA) and cardiovascular magnetic resonance (CMR) provide a digitalized 3-D representation of individual patient anatomy and physiology that serves as an avatar, allowing for virtual delineation of the most optimal approaches to cardiovascular procedures and surgeries prior to actual hospitalization. Pre-hospitalization reconstruction and analysis of anatomy and pathophysiology previously only accessible during the actual procedure could potentially limit the intrinsic risks related to time in the operating room, cardiac procedural laboratory, and overall hospital environment. Although applications are specific to areas of cardiovascular specialty focus, there are unifying themes related to the utilization of these technologies. The virtual patient avatar can also be used for procedural planning, computational modeling of anatomy, simulation of the predicted therapeutic result, printing of 3-D models, and augmentation of real-time procedural performance. Examples of the above techniques are at various stages of development for application to the spectrum of cardiovascular disease processes, including percutaneous, surgical, and hybrid minimally invasive interventions. A multidisciplinary approach within medicine and engineering is necessary for the creation of robust algorithms for maximal utilization of the virtual patient avatar in the digital medical center. Utilization of the virtual advanced cardiac imaging patient avatar will play an important role in the virtual health care system. Although there has been a rapid proliferation of early data, advanced imaging applications require further assessment and validation of accuracy, reproducibility, standardization, safety, efficacy, quality, cost effectiveness, and overall value to medical care. Copyright © 2018 Society of Cardiovascular Computed Tomography. Published by Elsevier Inc. All rights reserved.

  11. Crosscut report: Exascale Requirements Reviews, March 9–10, 2017 – Tysons Corner, Virginia. An Office of Science review sponsored by: Advanced Scientific Computing Research, Basic Energy Sciences, Biological and Environmental Research, Fusion Energy Sciences, High Energy Physics, Nuclear Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerber, Richard; Hack, James; Riley, Katherine

    The mission of the U.S. Department of Energy Office of Science (DOE SC) is the delivery of scientific discoveries and major scientific tools to transform our understanding of nature and to advance the energy, economic, and national security missions of the United States. To achieve these goals in today’s world requires investments in not only the traditional scientific endeavors of theory and experiment, but also in computational science and the facilities that support large-scale simulation and data analysis. The Advanced Scientific Computing Research (ASCR) program addresses these challenges in the Office of Science. ASCR’s mission is to discover, develop, and deploy computational and networking capabilities to analyze, model, simulate, and predict complex phenomena important to DOE. ASCR supports research in computational science, three high-performance computing (HPC) facilities — the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory and Leadership Computing Facilities at Argonne (ALCF) and Oak Ridge (OLCF) National Laboratories — and the Energy Sciences Network (ESnet) at Berkeley Lab. ASCR is guided by science needs as it develops research programs, computers, and networks at the leading edge of technologies. As we approach the era of exascale computing, technology changes are creating challenges for science programs in SC for those who need to use high performance computing and data systems effectively. Numerous significant modifications to today’s tools and techniques will be needed to realize the full potential of emerging computing systems and other novel computing architectures. To assess these needs and challenges, ASCR held a series of Exascale Requirements Reviews in 2015–2017, one with each of the six SC program offices, and a subsequent Crosscut Review that sought to integrate the findings from each. Participants at the reviews were drawn from the communities of leading domain scientists, experts in computer science and applied mathematics, ASCR facility staff, and DOE program managers in ASCR and the respective program offices. The purpose of these reviews was to identify mission-critical scientific problems within the DOE Office of Science (including experimental facilities) and determine the requirements for the exascale ecosystem that would be needed to address those challenges. The exascale ecosystem includes exascale computing systems, high-end data capabilities, efficient software at scale, libraries, tools, and other capabilities. This effort will contribute to the development of a strategic roadmap for ASCR compute and data facility investments and will help the ASCR Facility Division establish partnerships with Office of Science stakeholders. It will also inform the Office of Science research needs and agenda. The results of the six reviews have been published in reports available on the web at http://exascaleage.org/. This report presents a summary of the individual reports and of common and crosscutting findings, and it identifies opportunities for productive collaborations among the DOE SC program offices.

  12. Autonomous space processor for orbital debris

    NASA Technical Reports Server (NTRS)

    Ramohalli, Kumar; Campbell, David; Brockman, Jeff P.; Carter, Bruce; Donelson, Leslie; John, Lawrence E.; Marine, Micky C.; Rodina, Dan D.

    1989-01-01

    This work continues to develop advanced designs toward the ultimate goal of a GETAWAY SPECIAL to demonstrate economical removal of orbital debris utilizing local resources in orbit. The fundamental technical feasibility was demonstrated last year through theoretical calculations, quantitative computer animation, a solar focal point cutter, a robotic arm design and a subscale model. During this reporting period, several improvements are made in the solar cutter, such as auto track capabilities, better quality reflectors and a more versatile framework. The major advance has been in the design, fabrication and working demonstration of a ROBOTIC ARM that has several degrees of freedom. The functions were specifically tailored for the orbital debris handling. These advances are discussed here. Also a small fraction of the resources were allocated towards research in flame augmentation in SCRAMJETS for the NASP. Here, the fundamental advance was the attainment of Mach numbers up to 0.6 in the flame zone and a vastly improved injection system; the current work is expected to achieve supersonic combustion in the laboratory and an advanced monitoring system.

  13. A cost-effective approach to establishing a surgical skills laboratory.

    PubMed

    Berg, David A; Milner, Richard E; Fisher, Carol A; Goldberg, Amy J; Dempsey, Daniel T; Grewal, Harsh

    2007-11-01

    Recent studies comparing inexpensive low-fidelity box trainers to expensive computer-based virtual reality systems demonstrate similar acquisition of surgical skills and transferability to the clinical setting. With new mandates emerging that all surgical residency programs have access to a surgical skills laboratory, we describe our cost-effective approach to teaching basic and advanced open and laparoscopic skills utilizing inexpensive bench models, box trainers, and animate models. Open models (basic skills, bowel anastomosis, vascular anastomosis, trauma skills) and laparoscopic models (basic skills, cholecystectomy, Nissen fundoplication, suturing and knot tying, advanced in vivo skills) are constructed using a combination of materials found in our surgical research laboratories, retail stores, or donated by industry. Expired surgical materials are obtained from our hospital operating room and animal organs from food-processing plants. In vivo models are performed in an approved research facility. Operation, maintenance, and administration of the surgical skills laboratory are coordinated by a salaried manager, and instruction is the responsibility of all surgical faculty from our institution. Overall, the cost analyses of our initial startup costs and operational expenditures over a 3-year period revealed a progressive decrease in yearly cost per resident (2002-2003, $1,151; 2003-2004, $1,049; and 2004-2005, $982). Our approach to surgical skills education can serve as a template for any surgery program with limited financial resources.

  14. Critical issues using brain-computer interfaces for augmentative and alternative communication.

    PubMed

    Hill, Katya; Kovacs, Thomas; Shin, Sangeun

    2015-03-01

    Brain-computer interfaces (BCIs) may potentially be of significant practical value to patients in advanced stages of amyotrophic lateral sclerosis and locked-in syndrome for whom conventional augmentative and alternative communication (AAC) systems, which require some measure of consistent voluntary muscle control, are not satisfactory options. However, BCIs have primarily been used for communication in laboratory research settings. This article discusses 4 critical issues that should be addressed as BCIs are translated out of laboratory settings to become fully functional BCI/AAC systems that may be implemented clinically. These issues include (1) identification of primary, secondary, and tertiary system features; (2) integrating BCI/AAC systems in the World Health Organization's International Classification of Functioning, Disability and Health framework; (3) implementing language-based assessment and intervention; and (4) performance measurement. A clinical demonstration project is presented as an example of research beginning to address these critical issues. Copyright © 2015 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  15. Mass Storage and Retrieval at Rome Laboratory

    NASA Technical Reports Server (NTRS)

    Kann, Joshua L.; Canfield, Brady W.; Jamberdino, Albert A.; Clarke, Bernard J.; Daniszewski, Ed; Sunada, Gary

    1996-01-01

    As the speed and power of modern digital computers continue to advance, the demands on secondary mass storage systems grow. In many cases, the limitations of existing mass storage reduce the overall effectiveness of the computing system. Image storage and retrieval is one important area where improved storage technologies are required. Three-dimensional optical memories offer the advantage of large data density, on the order of 1 Tb/cm^3, and faster transfer rates because of the parallel nature of optical recording. Such a system allows for the storage of multiple-Gbit sized images, which can be recorded and accessed at reasonable rates. Rome Laboratory is currently investigating several techniques to perform three-dimensional optical storage including holographic recording, two-photon recording, persistent spectral-hole burning, multi-wavelength DNA recording, and the use of bacteriorhodopsin as a recording material. In this paper, the current status of each of these on-going efforts is discussed. In particular, the potential payoffs as well as possible limitations are addressed.
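
    A quick back-of-the-envelope check of the quoted density (an illustrative sketch; the 4-Gbit image size below is an assumed example, not a value from the paper): the number of such images fitting in one cubic centimeter is

        $N = \dfrac{\rho V}{S} = \dfrac{10^{12}\,\mathrm{bit/cm^3} \times 1\,\mathrm{cm^3}}{4 \times 10^{9}\,\mathrm{bit}} \approx 250,$

    i.e., roughly 125 GB of raw capacity per cubic centimeter, consistent with the claim that multiple-Gbit images can be stored and accessed at reasonable rates.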

  16. The Sunrise project: An R&D project for a national information infrastructure prototype

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Juhnyoung

    1995-02-01

    Sunrise is a Los Alamos National Laboratory (LANL) project started in October 1993. It is intended to be a prototype National Information Infrastructure (NII) development project. A main focus of Sunrise is to tie together enabling technologies (networking, object-oriented distributed computing, graphical interfaces, security, multimedia technologies, and data mining technologies) with several specific applications. A diverse set of application areas was chosen to ensure that the solutions developed in the project are as generic as possible. Some of the application areas are materials modeling, medical records and image analysis, transportation simulations, and education. This paper provides a description of Sunrise and a view of the architecture and objectives of this evolving project. The primary objectives of Sunrise are three-fold: (1) To develop common information-enabling tools for advanced scientific research and its applications to industry; (2) To enhance the capabilities of important research programs at the Laboratory; and (3) To define a new way of collaboration between computer science and industrially relevant research.

  17. 2016 Energetic Materials Gordon Research Conference and Gordon Research Seminar Research Area 7: Chemical Sciences 7.0 Chemical Sciences (Dr. James K. Parker)

    DTIC Science & Technology

    2016-08-10

    The meeting addressed topics including thermal decomposition and mechanical damage of energetics. The program for the meeting included nine oral presentation sessions with discussion leaders; presenters included Vincent Baijot (Laboratory for Analysis and Architecture of Systems, CNRS). Session themes were synthesis of new materials, performance, advanced diagnostics, experimental techniques, theoretical approaches, and computational models.

  18. A visiting scientist program in atmospheric sciences for the Goddard Space Flight Center

    NASA Technical Reports Server (NTRS)

    Davis, M. H.

    1989-01-01

    A visiting scientist program was conducted in the atmospheric sciences and related areas at the Goddard Laboratory for Atmospheres. Research was performed in mathematical analysis as applied to computer modeling of the atmospheres; development of atmospheric modeling programs; analysis of remotely sensed atmospheric, surface, and oceanic data and its incorporation into atmospheric models; development of advanced remote sensing instrumentation; and related research areas. The specific research efforts are detailed by tasks.

  19. CASL Dakota Capabilities Summary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Brian M.; Simmons, Chris; Williams, Brian J.

    2017-10-10

    The Dakota software project serves the mission of Sandia National Laboratories and supports a worldwide user community by delivering state-of-the-art research and robust, usable software for optimization and uncertainty quantification. These capabilities enable advanced exploration and risk-informed prediction with a wide range of computational science and engineering models. Dakota is the verification and validation (V&V) / uncertainty quantification (UQ) software delivery vehicle for CASL, allowing analysts across focus areas to apply these capabilities to myriad nuclear engineering analyses.
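
    As a rough illustration of the forward uncertainty-propagation pattern that tools such as Dakota automate, the following minimal Python sketch samples assumed input distributions, runs a stand-in model, and summarizes the output statistics. The model, distributions, and numbers are invented for illustration; this is not Dakota input syntax or CASL data.

        import random
        import statistics

        def simulation_model(k, q):
            # Stand-in for an expensive physics code: a toy peak
            # temperature as a function of conductivity k and heat load q.
            return 600.0 + q / k

        N = 10_000
        samples = []
        for _ in range(N):
            k = random.gauss(5.0, 0.5)     # assumed uncertain input
            q = random.gauss(200.0, 20.0)  # assumed uncertain input
            samples.append(simulation_model(k, q))

        print(f"mean = {statistics.mean(samples):.1f}")
        print(f"stdev = {statistics.stdev(samples):.1f}")
        print(f"p95 = {sorted(samples)[int(0.95 * N)]:.1f}")

    Dakota layers far more capable sampling, surrogate, and optimization methods on top of this basic loop and manages the model runs in parallel.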

  20. A New Approach to Computing Information in Measurements of Non-Resolved Space Objects by the Falcon Telescope Network

    DTIC Science & Technology

    2014-09-01

    The work employs the Analysis Simulation for Advanced Tracking (TASAT) satellite modeling tool [8,9]. The method uses bidirectional reflectance distribution functions (BRDF); cited references include an Air Force Avionics Laboratory technical report, Bidirectional Reflectance Model Validation and Utilization, AFAL-TR-73-303, October 1973, and [10] Hall, D…

  1. Advances in space robotics

    NASA Technical Reports Server (NTRS)

    Varsi, Giulio

    1989-01-01

    The problem of the remote control of space operations is addressed by identifying the key technical challenge: the management of contact forces and the principal performance parameters. Three principal classes of devices for remote operation are identified: anthropomorphic exoskeletons, computer aided teleoperators, and supervised telerobots. Their fields of application are described, and areas in which progress has reached the level of system or subsystem laboratory demonstrations are indicated. Key test results, indicating performance at a level useful for design tradeoffs, are reported.

  2. A Divide-and-Conquer/Cellular-Decomposition Framework for Million-to-Billion Atom Simulations of Chemical Reactions

    DTIC Science & Technology

    2007-01-01

    as a function of the particle velocity that drives the shock [7]. The MD and experimental data agree very well. Furthermore, the simulation shows topological anomalies in multimillion-node chemical bond networks in materials [48], studied at the Collaboratory for Advanced Computing and Simulations. (Authors: Aiichiro Nakano, Rajiv K. Kalia, Ken-ichi Nomura, Ashish Sharma, Priya Vashishta, Fuyuki…)

  3. Advances in synthetic peptides reagent discovery

    NASA Astrophysics Data System (ADS)

    Adams, Bryn L.; Sarkes, Deborah A.; Finch, Amethist S.; Stratis-Cullum, Dimitra N.

    2013-05-01

    Bacterial display technology offers a number of advantages over competing display technologies (e.g., phage) for the rapid discovery and development of peptides with interaction targeted to materials ranging from biological hazards through inorganic metals. We have previously shown that discovery of synthetic peptide reagents utilizing bacterial display technology is relatively simple and rapid, making laboratory automation possible. This included extensive study of the protective antigen system of Bacillus anthracis, including development of discovery, characterization, and computational biology capabilities for in-silico optimization. Although the benefits towards CBD goals are evident, the impact is far-reaching due to our ability to understand and harness peptide interactions that are ultimately extendable to the hybrid biomaterials of the future. In this paper, we describe advances in peptide discovery including new target systems (e.g., non-biological materials), advanced library development, and clone analysis including integrated reporting.

  4. Design Considerations of a Virtual Laboratory for Advanced X-ray Sources

    NASA Astrophysics Data System (ADS)

    Luginsland, J. W.; Frese, M. H.; Frese, S. D.; Watrous, J. J.; Heileman, G. L.

    2004-11-01

    The field of scientific computation has greatly advanced in the last few years, resulting in the ability to perform complex computer simulations that can predict the performance of real-world experiments in a number of fields of study. Among the forces driving this new computational capability is the advent of parallel algorithms, allowing calculations in three-dimensional space with realistic time scales. Electromagnetic radiation sources driven by high-voltage, high-current electron beams offer an area to further push the state-of-the-art in high fidelity, first-principles simulation tools. The physics of these x-ray sources combines kinetic plasma physics (electron beams) with dense fluid-like plasma physics (anode plasmas) and x-ray generation (bremsstrahlung). There are a number of mature techniques and software packages for dealing with the individual aspects of these sources, such as Particle-In-Cell (PIC), Magneto-Hydrodynamics (MHD), and radiation transport codes. The current effort is focused on developing an object-oriented software environment using the Rational Unified Process and the Unified Modeling Language (UML) to provide a framework where multiple 3D parallel physics packages, such as a PIC code (ICEPIC), an MHD code (MACH), and an x-ray transport code (ITS), can co-exist in a system-of-systems approach to modeling advanced x-ray sources. Initial software design and assessments of the various physics algorithms' fidelity will be presented.
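
    A minimal sketch of what such a system-of-systems coupling layer can look like, in Python for brevity. The interface and names are hypothetical illustrations; the actual ICEPIC/MACH/ITS integration and its UML design are not reproduced in this record.

        from abc import ABC, abstractmethod

        class PhysicsPackage(ABC):
            # Hypothetical common interface each wrapped code implements.
            @abstractmethod
            def advance(self, dt: float) -> None: ...
            @abstractmethod
            def export_state(self) -> dict: ...
            @abstractmethod
            def import_state(self, state: dict) -> None: ...

        class Coordinator:
            # Steps each package and hands shared fields between them.
            def __init__(self, packages):
                self.packages = packages

            def run(self, t_end: float, dt: float) -> None:
                t = 0.0
                while t < t_end:
                    shared = {}
                    for p in self.packages:
                        shared.update(p.export_state())  # gather fields
                    for p in self.packages:
                        p.import_state(shared)           # scatter fields
                        p.advance(dt)                    # advance physics
                    t += dt

    The design choice illustrated here, a shared abstract interface plus a thin coordinator, is one common way to let independently developed solvers co-exist without entangling their internals.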

  5. Effectiveness of educational technology to improve patient care in pharmacy curricula.

    PubMed

    Smith, Michael A; Benedict, Neal

    2015-02-17

    A review of the literature on the effectiveness of educational technologies to teach patient care skills to pharmacy students was conducted. Nineteen articles met inclusion criteria for the review. Seven of the articles included computer-aided instruction, four utilized human-patient simulation, one used both computer-aided instruction and human-patient simulation, and seven utilized virtual patients. Educational technology was employed with more than 2700 students at 12 colleges and schools of pharmacy in courses including pharmacotherapeutics, skills and patient care laboratories, drug diversion, and advanced pharmacy practice experience (APPE) orientation. Students who learned by means of human-patient simulation and virtual patients reported enjoying the learning activity, whereas the results with computer-aided instruction were mixed. Moreover, the effect on learning was significant in the human-patient simulation and virtual patient studies, while conflicting data emerged on the effectiveness of computer-aided instruction.

  6. Advanced turboprop aircraft flyover noise: Annoyance to counter-rotating-propeller configurations with an equal number of blades on each rotor, preliminary results

    NASA Technical Reports Server (NTRS)

    Mccurdy, David A.

    1988-01-01

    A laboratory experiment was conducted to quantify the annoyance of people to the flyover noise of advanced turboprop aircraft with counter-rotating propellers (CRP) having an equal number of blades on each rotor. The objectives were to determine the effects of tonal content on annoyance and to compare annoyance to n x n CRP advanced turboprop aircraft with annoyance to conventional turboprop and jet aircraft. A computer synthesis system was used to generate 27 realistic, time-varying simulations of advanced turboprop takeoff noise in which the tonal content was systematically varied to represent the factorial combinations of nine fundamental frequencies and three tone-to-broadband noise ratios. These advanced turboprop simulations along with recordings of five conventional turboprop takeoffs and five conventional jet takeoffs were presented at three D-weighted sound pressure levels to 64 subjects in an anechoic chamber. Analyses of the subjects' annoyance judgments compared the three aircraft types and examined the effects of the differences in tonal content among the advanced turboprop noises. The annoyance prediction ability of various noise metrics is also examined.
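
    The synthesis described above amounts to summing a harmonic series on a chosen fundamental and adding broadband noise scaled to a target tone-to-broadband ratio. A minimal Python sketch under assumed parameters (the actual NASA synthesis system also imposed time-varying level, duration, and Doppler effects, which are omitted here):

        import numpy as np

        def synthesize_takeoff(f0=135.0, n_harmonics=10, tnr_db=15.0,
                               fs=44100, dur=2.0, seed=0):
            # Harmonic complex with an assumed 1/k amplitude rolloff.
            t = np.arange(int(fs * dur)) / fs
            tones = sum(np.sin(2 * np.pi * k * f0 * t) / k
                        for k in range(1, n_harmonics + 1))
            noise = np.random.default_rng(seed).standard_normal(t.size)
            # Scale noise so tone RMS exceeds noise RMS by tnr_db dB.
            gain = (np.sqrt(np.mean(tones ** 2))
                    / np.sqrt(np.mean(noise ** 2))
                    / 10 ** (tnr_db / 20))
            return tones + gain * noise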

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zitney, S.E.; McCorkle, D.; Yang, C.

    Process modeling and simulation tools are widely used for the design and operation of advanced power generation systems. These tools enable engineers to solve the critical process systems engineering problems that arise throughout the lifecycle of a power plant, such as designing a new process, troubleshooting a process unit or optimizing operations of the full process. To analyze the impact of complex thermal and fluid flow phenomena on overall power plant performance, the Department of Energy’s (DOE) National Energy Technology Laboratory (NETL) has developed the Advanced Process Engineering Co-Simulator (APECS). The APECS system is an integrated software suite that combines process simulation (e.g., Aspen Plus) and high-fidelity equipment simulations such as those based on computational fluid dynamics (CFD), together with advanced analysis capabilities including case studies, sensitivity analysis, stochastic simulation for risk/uncertainty analysis, and multi-objective optimization. In this paper we discuss the initial phases of the integration of the APECS system with the immersive and interactive virtual engineering software, VE-Suite, developed at Iowa State University and Ames Laboratory. VE-Suite uses the ActiveX (OLE Automation) controls in the Aspen Plus process simulator wrapped by the CASI library developed by Reaction Engineering International to run process/CFD co-simulations and query for results. This integration represents a necessary step in the development of virtual power plant co-simulations that will ultimately reduce the time, cost, and technical risk of developing advanced power generation systems.

  8. Cytobank: providing an analytics platform for community cytometry data analysis and collaboration.

    PubMed

    Chen, Tiffany J; Kotecha, Nikesh

    2014-01-01

    Cytometry is used extensively in clinical and laboratory settings to diagnose and track cell subsets in blood and tissue. High-throughput, single-cell approaches leveraging cytometry are developed and applied in the computational and systems biology communities by researchers, who seek to improve the diagnosis of human diseases, map the structures of cell signaling networks, and identify new cell types. Data analysis and management present a bottleneck in the flow of knowledge from bench to clinic. Multi-parameter flow and mass cytometry enable identification of signaling profiles of patient cell samples. Currently, this process is manual, requiring hours of work to summarize multi-dimensional data and translate these data for input into other analysis programs. In addition, the increase in the number and size of collaborative cytometry studies as well as the computational complexity of analytical tools require the ability to assemble sufficient and appropriately configured computing capacity on demand. There is a critical need for platforms that can be used by both clinical and basic researchers who routinely rely on cytometry. Recent advances provide a unique opportunity to facilitate collaboration and analysis and management of cytometry data. Specifically, advances in cloud computing and virtualization are enabling efficient use of large computing resources for analysis and backup. An example is Cytobank, a platform that allows researchers to annotate, analyze, and share results along with the underlying single-cell data.
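
    One concrete example of the routine preprocessing such platforms support is the arcsinh scaling commonly applied to cytometry intensities before visualization or clustering. The sketch below shows the community convention in generic Python; it is not Cytobank's internal code.

        import numpy as np

        def arcsinh_transform(counts, cofactor=5.0):
            # Variance-stabilizing transform for cytometry intensities;
            # a cofactor of 5 is a common convention for mass cytometry.
            return np.arcsinh(np.asarray(counts, dtype=float) / cofactor)

        # Example: raw ion counts for one marker across six cells.
        print(arcsinh_transform([0, 1, 10, 100, 1000, 10000]))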

  9. Laboratory directed research and development program FY 1999

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hansen, Todd; Levy, Karin

    2000-03-08

    The Ernest Orlando Lawrence Berkeley National Laboratory (Berkeley Lab or LBNL) is a multi-program national research facility operated by the University of California for the Department of Energy (DOE). As an integral element of DOE's National Laboratory System, Berkeley Lab supports DOE's missions in fundamental science, energy resources, and environmental quality. Berkeley Lab programs advance four distinct goals for DOE and the nation: (1) To perform leading multidisciplinary research in the computing sciences, physical sciences, energy sciences, biosciences, and general sciences in a manner that ensures employee and public safety and protection of the environment. (2) To develop and operate unique national experimental facilities for qualified investigators. (3) To educate and train future generations of scientists and engineers to promote national science and education goals. (4) To transfer knowledge and technological innovations and to foster productive relationships among Berkeley Lab's research programs, universities, and industry in order to promote national economic competitiveness. This is the annual report on the Laboratory Directed Research and Development (LDRD) program for FY99.

  10. Mouse Genome Informatics (MGI) Resource: Genetic, Genomic, and Biological Knowledgebase for the Laboratory Mouse.

    PubMed

    Eppig, Janan T

    2017-07-01

    The Mouse Genome Informatics (MGI) Resource supports basic, translational, and computational research by providing high-quality, integrated data on the genetics, genomics, and biology of the laboratory mouse. MGI serves a strategic role for the scientific community in facilitating biomedical, experimental, and computational studies investigating the genetics and processes of diseases and enabling the development and testing of new disease models and therapeutic interventions. This review describes the nexus of the body of growing genetic and biological data and the advances in computer technology in the late 1980s, including the World Wide Web, that together launched the beginnings of MGI. MGI develops and maintains a gold-standard resource that reflects the current state of knowledge, provides semantic and contextual data integration that fosters hypothesis testing, continually develops new and improved tools for searching and analysis, and partners with the scientific community to assure research data needs are met. Here we describe one slice of MGI relating to the development of community-wide large-scale mutagenesis and phenotyping projects and introduce ways to access and use these MGI data. References and links to additional MGI aspects are provided. © The Author 2017. Published by Oxford University Press.

  11. Mouse Genome Informatics (MGI) Resource: Genetic, Genomic, and Biological Knowledgebase for the Laboratory Mouse

    PubMed Central

    Eppig, Janan T.

    2017-01-01

    The Mouse Genome Informatics (MGI) Resource supports basic, translational, and computational research by providing high-quality, integrated data on the genetics, genomics, and biology of the laboratory mouse. MGI serves a strategic role for the scientific community in facilitating biomedical, experimental, and computational studies investigating the genetics and processes of diseases and enabling the development and testing of new disease models and therapeutic interventions. This review describes the nexus of the body of growing genetic and biological data and the advances in computer technology in the late 1980s, including the World Wide Web, that together launched the beginnings of MGI. MGI develops and maintains a gold-standard resource that reflects the current state of knowledge, provides semantic and contextual data integration that fosters hypothesis testing, continually develops new and improved tools for searching and analysis, and partners with the scientific community to assure research data needs are met. Here we describe one slice of MGI relating to the development of community-wide large-scale mutagenesis and phenotyping projects and introduce ways to access and use these MGI data. References and links to additional MGI aspects are provided. PMID:28838066

  12. The AAPT Advanced Laboratory Task Force Report

    NASA Astrophysics Data System (ADS)

    Dunham, Jeffrey

    2008-04-01

    In late 2005, the American Association of Physics Teachers (AAPT) assembled a seven-member Advanced Laboratory Task Force^ to recommend ways that AAPT could increase the degree and effectiveness of its interactions with physics teachers of upper-division physics laboratories, with the ultimate goal of improving the teaching of advanced laboratories. The task force completed its work during the first half of 2006 and its recommendations were presented to the AAPT Executive Committee in July 2006. This talk will present the recommendations of the task force and actions taken by AAPT in response to them. The curricular goals of the advanced laboratory course at various institutions will also be discussed. The talk will conclude with an appeal to the APS membership to support ongoing efforts to revitalize advanced laboratory course instruction. ^Members of the Advanced Laboratory Task Force: Van Bistrow, University of Chicago; Bob DeSerio, University of Florida; Jeff Dunham, Middlebury College (Chair); Elizabeth George, Wittenberg University; Daryl Preston, California State University, East Bay; Patricia Sparks, Harvey Mudd College; Gerald Taylor, James Madison University; and David Van Baak, Calvin College.

  13. Neural networks: Application to medical imaging

    NASA Technical Reports Server (NTRS)

    Clarke, Laurence P.

    1994-01-01

    The research mission is the development of computer assisted diagnostic (CAD) methods for improved diagnosis of medical images including digital x-ray sensors and tomographic imaging modalities. The CAD algorithms include advanced methods for adaptive nonlinear filters for image noise suppression, hybrid wavelet methods for feature segmentation and enhancement, and high convergence neural networks for feature detection and VLSI implementation of neural networks for real time analysis. Other missions include (1) implementation of CAD methods on hospital based picture archiving computer systems (PACS) and information networks for central and remote diagnosis and (2) collaboration with defense and medical industry, NASA, and federal laboratories in the area of dual use technology conversion from defense or aerospace to medicine.

  14. Advanced Computational Thermal Fluid Physics (CTFP) and Its Assessment for Light Water Reactors and Supercritical Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    D.M. McEligot; K. G. Condie; G. E. McCreery

    2005-10-01

    Background: The ultimate goal of the study is the improvement of predictive methods for safety analyses and design of Generation IV reactor systems such as supercritical water reactors (SCWR) for higher efficiency, improved performance and operation, design simplification, enhanced safety and reduced waste and cost. The objective of this Korean / US / laboratory / university collaboration of coupled fundamental computational and experimental studies is to develop the supporting knowledge needed for improved predictive techniques for use in the technology development of Generation IV reactor concepts and their passive safety systems. The present study emphasizes SCWR concepts in the Generation IV program.

  15. Mesoscale Science with High Energy X-ray Diffraction Microscopy at the Advanced Photon Source

    NASA Astrophysics Data System (ADS)

    Suter, Robert

    2014-03-01

    Spatially resolved diffraction of monochromatic high energy (> 50 keV) x-rays is used to map microstructural quantities inside of bulk polycrystalline materials. The non-destructive nature of High Energy Diffraction Microscopy (HEDM) measurements allows tracking of responses as samples undergo thermo-mechanical or other treatments. Volumes of the order of a cubic millimeter are probed with micron scale spatial resolution. Data sets allow direct comparisons to computational models of responses that frequently involve long-ranged, multi-grain interactions; such direct comparisons have only become possible with the development of HEDM and other high energy x-ray methods. Near-field measurements map the crystallographic orientation field within and between grains using a computational reconstruction method that simulates the experimental geometry and matches orientations in micron sized volume elements to experimental data containing projected grain images in large numbers of Bragg peaks. Far-field measurements yield elastic strain tensors through indexing schemes that sort observed diffraction peaks into sets associated with individual crystals and detect small radial motions in large numbers of such peaks. Combined measurements, facilitated by a new end station hutch at Advanced Photon Source beamline 1-ID, are mutually beneficial and result in accelerated data reduction. Further, absorption tomography yields density contrast that locates secondary phases, void clusters, and cracks, and tracks sample shape during deformation. A collaboration led by the Air Force Research Laboratory and including the Advanced Photon Source, Lawrence Livermore National Laboratory, Carnegie Mellon University, Petra-III, and Cornell University and CHESS is developing software and hardware for combined measurements. Examples of these capabilities include tracking of grain boundary migrations during thermal annealing, tensile deformation of zirconium, and combined measurements of nickel superalloys and a titanium alloy under tensile forces. Work supported by NSF grant DMR-1105173
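
    In its simplest scalar form, the far-field strain determination rests on Bragg's law: a small shift in a peak's Bragg angle maps to lattice strain along the scattering vector. The textbook relation (not the full tensor-fitting procedure used in HEDM) is

        $\lambda = 2d\sin\theta \quad\Rightarrow\quad \varepsilon = \frac{d - d_0}{d_0} = \frac{\sin\theta_0}{\sin\theta} - 1,$

    where $d_0$ and $\theta_0$ are the unstrained lattice spacing and Bragg angle. Sampling $\varepsilon$ along many scattering directions over-determines the six independent components of each grain's elastic strain tensor.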

  16. Laboratory Directed Research and Development Program FY 2006

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hansen

    2007-03-08

    The Ernest Orlando Lawrence Berkeley National Laboratory (Berkeley Lab or LBNL) is a multi-program national research facility operated by the University of California for the Department of Energy (DOE). As an integral element of DOE's National Laboratory System, Berkeley Lab supports DOE's missions in fundamental science, energy resources, and environmental quality. Berkeley Lab programs advance four distinct goals for DOE and the nation: (1) To perform leading multidisciplinary research in the computing sciences, physical sciences, energy sciences, biosciences, and general sciences in a manner that ensures employee and public safety and protection of the environment. (2) To develop and operate unique national experimental facilities for qualified investigators. (3) To educate and train future generations of scientists and engineers to promote national science and education goals. (4) To transfer knowledge and technological innovations and to foster productive relationships among Berkeley Lab's research programs, universities, and industry in order to promote national economic competitiveness.

  17. Testing activities at the National Battery Test Laboratory

    NASA Astrophysics Data System (ADS)

    Hornstra, F.; Deluca, W. H.; Mulcahey, T. P.

    The National Battery Test Laboratory (NBTL) is an Argonne National Laboratory facility for testing, evaluating, and studying advanced electric storage batteries. The facility tests batteries developed under Department of Energy programs and from private industry. These include batteries intended for future electric vehicle (EV) propulsion, electric utility load leveling (LL), and solar energy storage. Since becoming operational, the NBTL has evaluated well over 1400 cells (generally in the form of three- to six-cell modules, but up to 140-cell batteries) of various technologies. Performance characterization assessments are conducted under a series of charge/discharge cycles with constant current, constant power, peak power, and computer simulated dynamic load profile conditions. Flexible charging algorithms are provided to accommodate the specific needs of each battery under test. Special studies are conducted to explore and optimize charge procedures, to investigate the impact of unique load demands on battery performance, and to analyze the thermal management requirements of battery systems.
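
    The bookkeeping behind such performance characterization is essentially numerical integration of sampled current and power over a discharge. A minimal Python sketch with illustrative values (not NBTL data):

        import numpy as np

        def discharge_metrics(t_s, current_a, voltage_v):
            # Capacity (Ah) and energy (Wh) from a sampled discharge record.
            capacity_ah = np.trapz(current_a, t_s) / 3600.0
            energy_wh = np.trapz(current_a * voltage_v, t_s) / 3600.0
            return capacity_ah, energy_wh

        # Constant-current example: 100 A for 1 hour at about 2 V per cell.
        t = np.linspace(0.0, 3600.0, 361)
        i = np.full_like(t, 100.0)
        v = 2.1 - 0.2 * (t / 3600.0)         # simple linear voltage sag
        print(discharge_metrics(t, i, v))    # about (100.0, 200.0)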

  18. Aquatic models, genomics and chemical risk management.

    PubMed

    Cheng, Keith C; Hinton, David E; Mattingly, Carolyn J; Planchart, Antonio

    2012-01-01

    The 5th Aquatic Animal Models for Human Disease meeting follows four previous meetings (Nairn et al., 2001; Schmale, 2004; Schmale et al., 2007; Hinton et al., 2009) in which advances in aquatic animal models for human disease research were reported, and community discussion of future direction was pursued. At this meeting, discussion at a workshop entitled Bioinformatics and Computational Biology with Web-based Resources (20 September 2010) led to an important conclusion: Aquatic model research using feral and experimental fish, in combination with web-based access to annotated anatomical atlases and toxicological databases, yields data that advance our understanding of human gene function, and can be used to facilitate environmental management and drug development. We propose here that the effects of genes and environment are best appreciated within an anatomical context - the specifically affected cells and organs in the whole animal. We envision the use of automated, whole-animal imaging at cellular resolution and computational morphometry facilitated by high-performance computing and automated entry into toxicological databases, as anchors for genetic and toxicological data, and as connectors between human and model system data. These principles should be applied to both laboratory and feral fish populations, which have been virtually irreplaceable sentinels for environmental contamination that results in human morbidity and mortality. We conclude that automation, database generation, and web-based accessibility, facilitated by genomic/transcriptomic data and high-performance and cloud computing, will potentiate the unique and potentially key roles that aquatic models play in advancing systems biology, drug development, and environmental risk management. Copyright © 2011 Elsevier Inc. All rights reserved.

  19. Software for visualization, analysis, and manipulation of laser scan images

    NASA Astrophysics Data System (ADS)

    Burnsides, Dennis B.

    1997-03-01

    The recent introduction of laser surface scanning to scientific applications presents a challenge to computer scientists and engineers. Full utilization of this two- dimensional (2-D) and three-dimensional (3-D) data requires advances in techniques and methods for data processing and visualization. This paper explores the development of software to support the visualization, analysis and manipulation of laser scan images. Specific examples presented are from on-going efforts at the Air Force Computerized Anthropometric Research and Design (CARD) Laboratory.

  20. Artwork Separation

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Under a grant from California Institute of Technology, Jet Propulsion Laboratory (JPL) and LACMA (Los Angeles County Museum of Art) used image enhancement techniques to separate x-ray images of paintings when one had been painted on top of another. The technique is derived from computer processing of spacecraft-acquired imagery, and will allow earlier paintings, some of which have been covered for centuries, to be evaluated. JPL developed the program for "subtracting" the top painting and enhancing the bottom one, and believes an even more advanced system is possible.
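
    The general principle, subtracting an estimate of the overpainting's x-ray contribution and then stretching the contrast of the residual, can be sketched in a few lines of Python. This illustrates the idea only; the record does not describe JPL's actual processing chain in detail.

        import numpy as np

        def reveal_underpainting(xray, top_estimate, eps=1e-6):
            # Subtract the top painting's estimated contribution, then
            # contrast-stretch the residual between robust percentiles.
            residual = xray.astype(float) - top_estimate.astype(float)
            lo, hi = np.percentile(residual, [2, 98])
            stretched = (residual - lo) / max(hi - lo, eps)
            return np.clip(stretched, 0.0, 1.0)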

  1. Framework for Flux Qubit Design

    NASA Astrophysics Data System (ADS)

    Yan, Fei; Kamal, Archana; Krantz, Philip; Campbell, Daniel; Kim, David; Yoder, Jonilyn; Orlando, Terry; Gustavsson, Simon; Oliver, William; Engineering Quantum Systems Team

    Designing a qubit for higher performance relies on understanding how various qubit properties are related to design parameters. We construct a framework for understanding qubit design in the flux regime. We explore different parameter regimes, looking for features desirable for certain purposes in the context of quantum computing. This research was funded by the Office of the Director of National Intelligence (ODNI), Intelligence Advanced Research Projects Activity (IARPA) via MIT Lincoln Laboratory under Air Force Contract No. FA8721-05-C-0002.

  2. Astronomy Aid

    NASA Technical Reports Server (NTRS)

    1995-01-01

    As a Jet Propulsion Laboratory astronomer, John D. Callahan developed a computer program called Multimission Interactive Planner (MIP) to help astronomers analyze scientific and optical data collected on the Voyager's Grand Tour. The commercial version of the program called XonVu is published by XonTech, Inc. Callahan has since developed two more advanced programs based on MIP technology, Grand Tour and Jovian Traveler, which simulate Voyager and Giotto missions. The software allows astronomers and space novices to view the objects seen by the spacecraft, manipulating perspective, distance and field of vision.

  3. Advancements in silicon web technology

    NASA Technical Reports Server (NTRS)

    Hopkins, R. H.; Easoz, J.; Mchugh, J. P.; Piotrowski, P.; Hundal, R.

    1987-01-01

    Low defect density silicon web crystals up to 7 cm wide are produced from systems whose thermal environments are designed for low stress conditions using computer techniques. During growth, the average silicon melt temperature, the lateral melt temperature distribution, and the melt level are each controlled by digital closed loop systems to maintain thermal steady state and to minimize the labor content of the process. Web solar cell efficiencies of 17.2 pct AM1 have been obtained in the laboratory while 15 pct efficiencies are common in pilot production.
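
    The digital closed-loop control mentioned above is, at its core, feedback of a measured quantity into an actuator command. A minimal discrete PID sketch in Python (the gains, setpoint, and names are illustrative assumptions, not details from the paper):

        def pid_step(setpoint, measured, state,
                     kp=2.0, ki=0.1, kd=0.5, dt=1.0):
            # One update of a discrete PID loop; returns the new command.
            error = setpoint - measured
            state["integral"] += error * dt
            derivative = (error - state["prev_error"]) / dt
            state["prev_error"] = error
            return kp * error + ki * state["integral"] + kd * derivative

        state = {"integral": 0.0, "prev_error": 0.0}
        # Example: regulate melt temperature toward ~1412 C (illustrative).
        heater_cmd = pid_step(setpoint=1412.0, measured=1409.5, state=state)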

  4. Diagnostic Pathology and Laboratory Medicine in the Age of “Omics”

    PubMed Central

    Finn, William G.

    2007-01-01

    Functional genomics and proteomics involve the simultaneous analysis of hundreds or thousands of expressed genes or proteins and have spawned the modern discipline of computational biology. Novel informatic applications, including sophisticated dimensionality reduction strategies and cancer outlier profile analysis, can distill clinically exploitable biomarkers from enormous experimental datasets. Diagnostic pathologists are now charged with translating the knowledge generated by the “omics” revolution into clinical practice. Food and Drug Administration-approved proprietary testing platforms based on microarray technologies already exist and will expand greatly in the coming years. However, for diagnostic pathology, the greatest promise of the “omics” age resides in the explosion in information technology (IT). IT applications allow for the digitization of histological slides, transforming them into minable data and enabling content-based searching and archiving of histological materials. IT will also allow for the optimization of existing (and often underused) clinical laboratory technologies such as flow cytometry and high-throughput core laboratory functions. The state of pathology practice does not always keep up with the pace of technological advancement. However, to use fully the potential of these emerging technologies for the benefit of patients, pathologists and clinical scientists must embrace the changes and transformational advances that will characterize this new era. PMID:17652635

  5. Innovative approach towards understanding optics

    NASA Astrophysics Data System (ADS)

    Garg, Amit; Bharadwaj, Sadashiv Raj; Kumar, Raj; Shudhanshu, Avinash Kumar; Verma, Deepak Kumar

    2016-01-01

    Over the last few years, there has been a decline in students' interest in science and optics. Use of technology in the form of various types of sensors and data acquisition systems has come as a saviour. To date, manual routine tools and techniques are used to perform various experimental procedures in most of the science and optics laboratories in our country. The manual tools are cumbersome, whereas the automated ones are costly, and neither enthuses young researchers towards the science laboratories. There is a need to develop applications that can be easily integrated and tailored to school and undergraduate laboratories and that are economical at the same time. Equipment with advanced technology is available, but it is uneconomical and has a complicated working principle with a black-box approach. The present work describes the development of portable tools and applications that are user-friendly. This is being implemented using an open-source physical computing platform based on a simple low-cost microcontroller board and a development environment for writing software. The present paper reports the development of an automated spectrometer, an instrument used in almost all optics experiments at the undergraduate level, and students' response to this innovation. These tools will inspire young researchers towards science and facilitate the development of advanced low-cost equipment, making life easier for India as well as other developing nations.
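
    On the host side, acquiring data from such a microcontroller-based spectrometer typically reduces to reading a serial stream. A minimal sketch using the pyserial package (the port name and the "angle,intensity" message format are assumptions for illustration, not details from the paper):

        import serial  # pyserial

        # Assumed setup: the board streams "angle_deg,intensity" lines.
        with serial.Serial("/dev/ttyACM0", 9600, timeout=2) as port:
            readings = []
            for _ in range(360):
                line = port.readline().decode("ascii", errors="ignore")
                line = line.strip()
                if line:
                    angle, intensity = map(float, line.split(","))
                    readings.append((angle, intensity))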

  6. Constitutive Modeling of the Thermomechanical Behavior of Rock Salt

    NASA Astrophysics Data System (ADS)

    Hampel, A.

    2016-12-01

    For the safe disposal of heat-generating high-level radioactive waste in rock salt formations, highly reliable numerical simulations of the thermomechanical and hydraulic behavior of the host rock have to be performed. Today, the huge progress in computer technology has enabled experts to calculate large and detailed computer models of underground repositories. However, the big advances in computer technology are only beneficial when the applied material models and modeling procedures also meet very high demands. These demands result from the fact that the evaluation of the long-term integrity of the geological barrier requires an extrapolation of a highly nonlinear deformation behavior to up to 1 million years, while the underlying experimental investigations in the laboratory or in situ have a duration of only days, weeks or at most some years. Several advanced constitutive models were developed and continuously improved to describe the dependences of various deformation phenomena in rock salt on in-situ relevant boundary conditions: transient and steady-state creep, evolution of damage and dilatancy in the DRZ, failure, post-failure behavior, residual strength, damage and dilatancy reduction, and healing. In a joint project series between 2004 and 2016, fundamental features of the advanced models were investigated and compared in detail with benchmark calculations. The study included procedures for the determination of characteristic salt-type-specific model parameter values and for the performance of numerical calculations of underground structures. Based on the results of this work and on specific laboratory investigations, the rock mechanical modeling is currently being developed further in a common research project of experts from Germany and the United States. In this presentation, an overview of the work and results of the project series is given and the current joint research project WEIMOS is introduced.
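
    For orientation, steady-state creep of rock salt is commonly described by a Norton-type power law, shown here as a standard baseline; the advanced constitutive models discussed in the project go well beyond it, adding transient creep, damage, dilatancy, and healing terms:

        $\dot{\varepsilon}_{ss} = A\,\sigma^{n}\exp\!\left(-\frac{Q}{RT}\right),$

    with prefactor $A$, stress exponent $n$, activation energy $Q$, gas constant $R$, and absolute temperature $T$.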

  7. Air Flow Modeling in the Wind Tunnel of the FHWA Aerodynamics Laboratory at Turner-Fairbank Highway Research Center

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sitek, M. A.; Lottes, S. A.; Bojanowski, C.

    Computational fluid dynamics (CFD) modeling is widely used in industry for design and in the research community to support, complement, and extend the scope of experimental studies. Analysis of transportation infrastructure using high performance cluster computing with CFD and structural mechanics software is done at the Transportation Research and Analysis Computing Center (TRACC) at Argonne National Laboratory. These resources, available at TRACC, were used to perform advanced three-dimensional computational simulations of the wind tunnel laboratory at the Turner-Fairbank Highway Research Center (TFHRC). The goals were to verify the CFD model of the laboratory wind tunnel and then to use versions of the model to provide the capability to (1) perform larger parametric series of tests than can be easily done in the laboratory with available budget and time, (2) to extend testing to wind speeds that cannot be achieved in the laboratory, and (3) to run types of tests that are very difficult or impossible to run in the laboratory. Modern CFD software has many physics models and domain meshing options. Models, including the choice of turbulence and other physics models and settings, the computational mesh, and the solver settings, need to be validated against measurements to verify that the results are sufficiently accurate for use in engineering applications. The wind tunnel model was built and tested, by comparing to experimental measurements, to provide a valuable tool to perform these types of studies in the future as a complement and extension to TFHRC’s experimental capabilities. Wind tunnel testing at TFHRC is conducted in a subsonic open-jet wind tunnel with a 1.83 m (6 foot) by 1.83 m (6 foot) cross section. A three component dual force-balance system is used to measure forces acting on tested models, and a three degree of freedom suspension system is used for dynamic response tests. Pictures of the room are shown in Figure 1-1 to Figure 1-4. A detailed CAD geometry and CFD model of the wind tunnel laboratory at TFHRC was built and tested. Results were compared against experimental wind velocity measurements at a large number of locations around the room. This testing included an assessment of the air flow uniformity provided by the tunnel to the test zone and assessment of room geometry effects, such as the influence of the proximity of the room walls, the non-symmetrical position of the tunnel in the room, and the influence of the room setup on the air flow in the room. This information is useful both for simplifying the computational model and in deciding whether or not moving, or removing, some of the furniture or other movable objects in the room will change the flow in the test zone.

  8. Benefit from NASA

    NASA Image and Video Library

    2001-09-01

    The high-tech art of digital signal processing (DSP) was pioneered at NASA's Jet Propulsion Laboratory (JPL) in the mid-1960s for use in the Apollo Lunar Landing Program. Designed to computer-enhance pictures of the Moon, this technology became the basis for the Landsat Earth resources satellites and subsequently has been incorporated into a broad range of Earthbound medical and diagnostic tools. DSP is employed in advanced body imaging techniques including Computer-Aided Tomography, also known as CT and CAT scan, and Magnetic Resonance Imaging (MRI). CT images are collected by irradiating a thin slice of the body with a fan-shaped x-ray beam from a number of directions around the body's perimeter. A tomographic (slice-like) picture is reconstructed from these multiple views by a computer. MRI employs a magnetic field and radio waves, rather than x-rays, to create images.

  9. Optimization of Diode Laser System to Treat Benign Prostate Hyperplasia Final Report CRADA No. TSB-1154-95

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    London, Richard A; Byrne, Mark

    Benign prostate hyperplasia (BPH) is a pervasive condition of enlargement of the male prostate gland which leads to several urinary difficulties ranging from hesitancy to incontinence to kidney dysfunction in severe cases. Currently the most common therapy is transurethral resection of the prostate (TURP) utilizing an electrosurgical device. Although TURP is largely successful, new BPH therapy methods are desired to reduce the cost and recovery time, improve the success rate, and reduce side effects. Recently, lasers have been introduced for this purpose. Indigo Medical Inc. is currently engaged in the development, testing, and preparation for sales of a new diode laser based BPH therapy system. The development is based on laboratory experiments, animal studies, and a limited FDA-approved clinical trial in the US and in other countries. The addition of sophisticated numerical modeling, of the sort that has been highly developed at Lawrence Livermore National Laboratory, can greatly aid in the design of the system and treatment protocol. The benefits to DOE include the maintenance and advancement of numerical modeling expertise in radiation-matter interactions of the sort essential for the stockpile stewardship, inertial confinement fusion, and advanced manufacturing, and the push on advanced scientific computational methods, ultimately in areas such as 3-D transport.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zitney, S.E.

    This presentation will examine process systems engineering R&D needs for application to advanced fossil energy (FE) systems and highlight ongoing research activities at the National Energy Technology Laboratory (NETL) under the auspices of a recently launched Collaboratory for Process & Dynamic Systems Research. The three current technology focus areas include: 1) High-fidelity systems with NETL's award-winning Advanced Process Engineering Co-Simulator (APECS) technology for integrating process simulation with computational fluid dynamics (CFD) and virtual engineering concepts, 2) Dynamic systems with R&D on plant-wide IGCC dynamic simulation, control, and real-time training applications, and 3) Systems optimization including large-scale process optimization, stochastic simulation for risk/uncertainty analysis, and cost estimation. Continued R&D aimed at these and other key process systems engineering models, methods, and tools will accelerate the development of advanced gasification-based FE systems and produce increasingly valuable outcomes for DOE and the Nation.

  11. AST Combustion Workshop: Diagnostics Working Group Report

    NASA Technical Reports Server (NTRS)

    Locke, Randy J.; Hicks, Yolanda R.; Hanson, Ronald K.

    1996-01-01

    A workshop was convened under NASA's Advanced Subsonic Technology (AST) Program. Many of the principal combustion diagnosticians from industry, academia, and government laboratories were assembled in the Diagnostics/Testing Subsection of this workshop to discuss the requirements and obstacles to the successful implementation of advanced diagnostic techniques to the test environment of the proposed AST combustor. The participants, who represented the major relevant areas of advanced diagnostic methods currently applied to combustion and related fields, first established the anticipated AST combustor flowfield conditions. Critical flow parameters were then examined and prioritized as to their importance to combustor/fuel injector design and manufacture, environmental concerns, and computational interests. Diagnostic techniques were then evaluated in terms of current status, merits and obstacles for each flow parameter. All evaluations are presented in tabular form and recommendations are made on the best-suited diagnostic method to implement for each flow parameter in order of applicability and intrinsic value.

  12. Special issue on the "Consortium for Advanced Simulation of Light Water Reactors Research and Development Progress"

    NASA Astrophysics Data System (ADS)

    Turinsky, Paul J.; Martin, William R.

    2017-04-01

    In this special issue of the Journal of Computational Physics, the research and development completed at the time of manuscript submission by the Consortium for Advanced Simulation of Light Water Reactors (CASL) is presented. CASL is the first of several Energy Innovation Hubs that have been created by the Department of Energy. The Hubs are modeled after the strong scientific management characteristics of the Manhattan Project and AT&T Bell Laboratories, and function as integrated research centers that combine basic and applied research with engineering to accelerate scientific discovery that addresses critical energy issues. Lifetime of a Hub is expected to be five or ten years depending upon performance, with CASL being granted a ten year lifetime.

  13. Redirecting Under-Utilised Computer Laboratories into Cluster Computing Facilities

    ERIC Educational Resources Information Center

    Atkinson, John S.; Spenneman, Dirk H. R.; Cornforth, David

    2005-01-01

    Purpose: To provide administrators at an Australian university with data on the feasibility of redirecting under-utilised computer laboratory facilities into a distributed high performance computing facility. Design/methodology/approach: The individual log-in records for each computer located in the computer laboratories at the university were…

  14. Comparison of advanced turboprop and conventional jet and propeller aircraft flyover noise annoyance: Preliminary results

    NASA Technical Reports Server (NTRS)

    Mccurdy, D. A.

    1985-01-01

    A laboratory experiment was conducted to compare the flyover noise annoyance of proposed advanced turboprop aircraft with that of conventional turboprop and jet aircraft. The effects of fundamental frequency and tone-to-broadband noise ratio on advanced turboprop annoyance were also examined. A computer synthesis system is used to generate 18 realistic, time-varying simulations of propeller aircraft takeoff noise in which the harmonic content is systematically varied to represent the factorial combinations of six fundamental frequencies ranging from 67.5 Hz to 292.5 Hz and three tone-to-broadband noise ratios of 0, 15, and 30 dB. These advanced turboprop simulations along with recordings of five conventional turboprop takeoffs and five conventional jet takeoffs are presented at D-weighted sound pressure levels of 70, 80, and 90 dB to 32 subjects in an anechoic chamber. Analyses of the subjects' annoyance judgments compare the three categories of aircraft and examine the effects of the differences in harmonic content among the advanced turboprop noises. The annoyance prediction ability of various noise measurement procedures and corrections is also examined.
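
    As an aside for readers reconstructing the stimulus set, the sketch below enumerates the 6 x 3 factorial design described above. The intermediate frequency values are an assumption (uniform 45 Hz steps between the stated 67.5 Hz and 292.5 Hz endpoints); the abstract gives only the endpoints.

    ```python
    # Hypothetical enumeration of the 6 x 3 factorial stimulus design.
    from itertools import product

    fundamental_hz = [67.5, 112.5, 157.5, 202.5, 247.5, 292.5]  # assumed uniform steps
    tone_to_broadband_db = [0, 15, 30]

    # Cross the two factors to enumerate the full factorial design.
    stimuli = [
        {"fundamental_hz": f0, "tone_ratio_db": r}
        for f0, r in product(fundamental_hz, tone_to_broadband_db)
    ]
    assert len(stimuli) == 18  # matches the 18 synthesized takeoff simulations
    print(stimuli[0])
    ```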

  15. Comparison of advanced turboprop and conventional jet and propeller aircraft flyover noise annoyance - Preliminary results

    NASA Technical Reports Server (NTRS)

    Mccurdy, D. A.

    1985-01-01

    A laboratory experiment was conducted to compare the flyover noise annoyance of proposed advanced turboprop aircraft with that of conventional turboprop and jet aircraft. The effects of fundamental frequency and tone-to-broadband noise ratio on advanced turboprop annoyance were also examined. A computer synthesis system was used to generate 18 realistic, time-varying simulations of propeller aircraft takeoff noise in which the harmonic content was systematically varied to represent the factorial combinations of six fundamental frequencies ranging from 67.5 Hz to 292.5 Hz and three tone-to-broadband noise ratios of 0, 15, and 30 dB. These advanced turboprop simulations along with recordings of five conventional turboprop takeoffs and five conventional jet takeoffs were presented at D-weighted sound pressure levels of 70, 80, and 90 dB to 32 subjects in an anechoic chamber. Analyses of the subjects' annoyance judgments compare the three categories of aircraft and examine the effects of the differences in harmonic content among the advanced turboprop noises. The annoyance prediction ability of various noise measurement procedures and corrections is also examined.

  16. An Update on Improvements to NiCE Support for PROTEUS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bennett, Andrew; McCaskey, Alexander J.; Billings, Jay Jay

    2015-09-01

    The Department of Energy Office of Nuclear Energy's Nuclear Energy Advanced Modeling and Simulation (NEAMS) program has supported the development of the NEAMS Integrated Computational Environment (NiCE), a modeling and simulation workflow environment that provides services and plugins to facilitate tasks such as code execution, model input construction, visualization, and data analysis. This report details the development of workflows for the reactor core neutronics application, PROTEUS. This advanced neutronics application (primarily developed at Argonne National Laboratory) aims to improve nuclear reactor design and analysis by providing an extensible and massively parallel, finite-element solver for current and advanced reactor fuel neutronics modeling. The integration of PROTEUS-specific tools into NiCE is intended to make the advanced capabilities that PROTEUS provides more accessible to the nuclear energy research and development community. This report will detail the work done to improve existing PROTEUS workflow support in NiCE. We will demonstrate and discuss these improvements, including the development of flexible IO services, an improved interface for input generation, and the addition of advanced Fortran development tools natively in the platform.

  17. Pathfinder radar development at Sandia National Laboratories

    NASA Astrophysics Data System (ADS)

    Castillo, Steven

    2016-05-01

    Since the invention of Synthetic Aperture Radar (SAR) imaging in the 1950s, users or potential users have sought to exploit SAR imagery for a variety of applications including the earth sciences and defense. At Sandia Laboratories, SAR research and development and associated defense applications grew out of the nuclear weapons program in the 1980s, and over the years SAR has become a highly viable ISR sensor for a variety of tactical applications. Sandia SAR systems excel where real-time, high-resolution, all-weather, day or night surveillance is required for developing situational awareness. This presentation will discuss the various aspects of Sandia's airborne ISR capability with respect to issues related to current operational success as well as the future direction of the capability as Sandia seeks to improve the SAR capability it delivers into multiple mission scenarios. Issues discussed include fundamental radar capabilities, advanced exploitation techniques, and human-machine interface (HMI) challenges that are part of the advances required to maintain Sandia's ability to continue to support ever changing and demanding mission challenges.

  18. Improved Algorithms Speed It Up for Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hazi, A

    2005-09-20

    Huge computers, huge codes, complex problems to solve. The longer it takes to run a code, the more it costs. One way to speed things up and save time and money is through hardware improvements--faster processors, different system designs, bigger computers. But another side of supercomputing can reap savings in time and speed: software improvements to make codes--particularly the mathematical algorithms that form them--run faster and more efficiently. Speed up math? Is that really possible? According to Livermore physicist Eugene Brooks, the answer is a resounding yes. ''Sure, you get great speed-ups by improving hardware,'' says Brooks, the deputy leader for Computational Physics in N Division, which is part of Livermore's Physics and Advanced Technologies (PAT) Directorate. ''But the real bonus comes on the software side, where improvements in software can lead to orders of magnitude improvement in run times.'' Brooks knows whereof he speaks. Working with Laboratory physicist Abraham Szoeke and others, he has been instrumental in devising ways to shrink the running time of what has, historically, been a tough computational nut to crack: radiation transport codes based on the statistical or Monte Carlo method of calculation. And Brooks is not the only one. Others around the Laboratory, including physicists Andrew Williamson, Randolph Hood, and Jeff Grossman, have come up with innovative ways to speed up Monte Carlo calculations using pure mathematics.
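
    To make the point concrete, the sketch below contrasts a plain Monte Carlo estimator with an antithetic-variates estimator on a toy integral for the same sample budget. This is a generic, textbook variance-reduction technique chosen for illustration; it is not the Livermore radiation-transport methods described above.

    ```python
    # Generic illustration of an algorithmic Monte Carlo speed-up: antithetic
    # variates reduce variance at the same number of function evaluations.
    import numpy as np

    rng = np.random.default_rng(0)
    f = np.exp                      # toy integrand: integrate exp(u) over [0, 1]
    n = 100_000
    exact = np.e - 1.0

    u = rng.random(n)
    plain = f(u).mean()                                    # n evaluations
    pairs = u[: n // 2]
    antithetic = 0.5 * (f(pairs) + f(1.0 - pairs)).mean()  # also n evaluations

    print(f"plain error:      {abs(plain - exact):.2e}")
    print(f"antithetic error: {abs(antithetic - exact):.2e}")
    ```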

  19. A short course on measure and probability theories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pebay, Philippe Pierre

    2004-02-01

    This brief introduction to measure theory, and its applications to probabilities, corresponds to the lecture notes of a seminar series given at Sandia National Laboratories in Livermore during the spring of 2003. The goal of these seminars was to provide a minimal background to computational combustion scientists interested in using more advanced stochastic concepts and methods, e.g., in the context of uncertainty quantification. Indeed, most mechanical engineering curricula do not provide students with formal training in the field of probability, and even less in measure theory. However, stochastic methods have been used more and more extensively in the past decade, and have provided more successful computational tools. Scientists at the Combustion Research Facility of Sandia National Laboratories have been using computational stochastic methods for years. Addressing more and more complex applications, and facing difficult problems that arose in applications, showed the need for a better understanding of theoretical foundations. This is why the seminar series was launched, and these notes summarize most of the concepts which have been discussed. The goal of the seminars was to bring a group of mechanical engineers and computational combustion scientists to a full understanding of N. Wiener's polynomial chaos theory. Therefore, these lecture notes are built along those lines, and are not intended to be exhaustive. In particular, the author welcomes any comments or criticisms.

  20. Computational mechanics research and support for aerodynamics and hydraulics at TFHRC year 1 quarter 4 progress report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lottes, S.A.; Kulak, R.F.; Bojanowski, C.

    2011-12-09

    The computational fluid dynamics (CFD) and computational structural mechanics (CSM) focus areas at Argonne's Transportation Research and Analysis Computing Center (TRACC) initiated a project to support and complement the experimental programs at the Turner-Fairbank Highway Research Center (TFHRC) with high performance computing based analysis capabilities in August 2010. The project was established with a new interagency agreement between the Department of Energy and the Department of Transportation to provide collaborative research, development, and benchmarking of advanced three-dimensional computational mechanics analysis methods to the aerodynamics and hydraulics laboratories at TFHRC for a period of five years, beginning in October 2010. The analysis methods employ well-benchmarked and supported commercial computational mechanics software. Computational mechanics encompasses the areas of Computational Fluid Dynamics (CFD), Computational Wind Engineering (CWE), Computational Structural Mechanics (CSM), and Computational Multiphysics Mechanics (CMM) applied in Fluid-Structure Interaction (FSI) problems. The major areas of focus of the project are wind and water effects on bridges - superstructure, deck, cables, and substructure (including soil), primarily during storms and flood events - and the risks that these loads pose to structural failure. For flood events at bridges, another major focus of the work is assessment of the risk to bridges caused by scour of stream and riverbed material away from the foundations of a bridge. Other areas of current research include modeling of flow through culverts to assess them for fish passage, modeling of salt spray transport into bridge girders to address the suitability of using weathering steel in bridges, CFD analysis of the operation of the wind tunnel in the TFHRC wind engineering laboratory, vehicle stability under high wind loading, and the use of electromagnetic shock absorbers to improve vehicle stability under high wind conditions. This quarterly report documents technical progress on the project tasks for the period of July through September 2011.

  1. Pulmonary atelectasis and survival in advanced non-small cell lung carcinoma.

    PubMed

    Bulbul, Yilmaz; Eris, Bulent; Orem, Asim; Gulsoy, Ayhan; Oztuna, Funda; Ozlu, Tevfik; Ozsu, Savas

    2010-08-01

    Atelectasis was reported as a favorable prognostic sign of pulmonary carcinoma; however, the underlying mechanism in those patients is not known. In this study, we aimed to investigate prospectively the potential impact of atelectasis and/or obstructive pneumonitis (AO) on survival and the relation between atelectasis and some laboratory blood parameters. The study was conducted on 87 advanced stage non-small cell lung cancer (NSCLC) patients. Clinical and laboratory parameters of patients at first presentation were recorded, and patients were divided into two groups according to the presence of AO in thorax computed tomography (CT). Survival was calculated using Kaplan-Meier and univariate Cox's regression analyses. Laboratory parameters that might be related with prolonged survival in atelectasis were compared using chi-square, Student's t, and Mann-Whitney U tests. Of the patients, 54% had stage IV disease, and AO was detected in 48.3% of all cases. Overall median survival was 13.2 months for all cases, 10.9 months for patients without AO, and 13.9 months for patients with AO (P=0.067). Survival was significantly longer in stage III patients with AO (14.5 months versus 9.2 months, P=0.032), but not in stage IV patients. Patients with AO in stage III had significantly lower platelet counts (P=0.032) and blood sedimentation rates than did those with no AO (P=0.045). We concluded that atelectasis and/or obstructive pneumonitis was associated with prolonged survival in locally advanced NSCLC. There was also a clear association between atelectasis and/or obstructive pneumonitis and platelets and blood sedimentation rate.
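
    For readers who want to reproduce this style of analysis, here is a minimal sketch using the open-source lifelines library. The DataFrame columns (months, event, ao) and the input file name are hypothetical stand-ins, not the study's data; the study itself reports Kaplan-Meier estimates and univariate Cox regression, and the log-rank comparison shown here is a common companion to Kaplan-Meier curves rather than a test taken from the paper.

    ```python
    # Minimal Kaplan-Meier comparison of two groups, assuming hypothetical
    # columns: months (survival time), event (1 = death observed), ao (AO on CT).
    import pandas as pd
    from lifelines import KaplanMeierFitter
    from lifelines.statistics import logrank_test

    df = pd.read_csv("nsclc_cohort.csv")  # hypothetical input file

    km = KaplanMeierFitter()
    for label, grp in df.groupby("ao"):
        km.fit(grp["months"], event_observed=grp["event"], label=f"AO={label}")
        print(f"AO={label}: median survival {km.median_survival_time_:.1f} months")

    with_ao, without_ao = df[df.ao == 1], df[df.ao == 0]
    result = logrank_test(with_ao["months"], without_ao["months"],
                          event_observed_A=with_ao["event"],
                          event_observed_B=without_ao["event"])
    print("log-rank p =", result.p_value)
    ```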

  2. Pulmonary atelectasis and survival in advanced non-small cell lung carcinoma

    PubMed Central

    2010-01-01

    Atelectasis was reported as a favorable prognostic sign of pulmonary carcinoma; however, the underlying mechanism in those patients is not known. In this study, we aimed to investigate prospectively the potential impact of atelectasis and/or obstructive pneumonitis (AO) on survival and the relation between atelectasis and some laboratory blood parameters. The study was conducted on 87 advanced stage non-small cell lung cancer (NSCLC) patients. Clinical and laboratory parameters of patients at first presentation were recorded, and patients were divided into two groups according to the presence of AO in thorax computed tomography (CT). Survival was calculated using Kaplan-Meier and univariate Cox's regression analyses. Laboratory parameters that might be related with prolonged survival in atelectasis were compared using chi-square, Student's t, and Mann-Whitney U tests. Of the patients, 54% had stage IV disease, and AO was detected in 48.3% of all cases. Overall median survival was 13.2 months for all cases, 10.9 months for patients without AO, and 13.9 months for patients with AO (P = 0.067). Survival was significantly longer in stage III patients with AO (14.5 months versus 9.2 months, P = 0.032), but not in stage IV patients. Patients with AO in stage III had significantly lower platelet counts (P = 0.032) and blood sedimentation rates than did those with no AO (P = 0.045). We concluded that atelectasis and/or obstructive pneumonitis was associated with prolonged survival in locally advanced NSCLC. There was also a clear association between atelectasis and/or obstructive pneumonitis and platelets and blood sedimentation rate. PMID:20636252

  3. Comparison of measured temperatures, thermal stresses and creep residues with predictions on a built-up titanium structure

    NASA Technical Reports Server (NTRS)

    Jenkins, Jerald M.

    1987-01-01

    Temperature, thermal stresses, and residual creep stresses were studied by comparing laboratory values measured on a built-up titanium structure with values calculated from finite-element models. Several such models were used to examine the relationship between computational thermal stresses and thermal stresses measured on a built-up structure. Element suitability, element density, and computational temperature discrepancies were studied to determine their impact on measured and calculated thermal stress. The optimum number of elements is established from a balance between element density and suitable safety margins, such that the answer is acceptably safe yet is economical from a computational viewpoint. It is noted that situations exist where relatively small excursions of calculated temperatures from measured values result in far more than proportional increases in thermal stress values. Measured residual stresses due to creep significantly exceeded the values computed by the piecewise linear elastic strain analogy approach. The most important element in the computation is the correct definition of the creep law. Computational methodology advances in predicting residual stresses due to creep require significantly more viscoelastic material characterization.

  4. A detailed experimental study of a DNA computer with two endonucleases.

    PubMed

    Sakowski, Sebastian; Krasiński, Tadeusz; Sarnik, Joanna; Blasiak, Janusz; Waldmajer, Jacek; Poplawski, Tomasz

    2017-07-14

    Great advances in biotechnology have allowed the construction of a computer from DNA. One of the proposed solutions is a biomolecular finite automaton, a simple two-state DNA computer without memory, which was presented by Ehud Shapiro's group at the Weizmann Institute of Science. The main problem with this computer, in which biomolecules carry out logical operations, is scaling up its complexity, i.e., increasing the number of states of biomolecular automata. In this study, we constructed (in laboratory conditions) a six-state DNA computer that uses two endonucleases (e.g. AcuI and BbvI) and a ligase. We have presented a detailed experimental verification of its feasibility. We described the effect of the number of states, the length of input data, and the nondeterminism on the computing process. We also tested different automata (with three, four, and six states) running on various accepted input words of different lengths, such as ab, aab, aaab, and ababa, and on an unaccepted word, ba. Moreover, this article presents the reaction optimization and the methods of eliminating certain biochemical problems occurring in the implementation of a biomolecular DNA automaton based on two endonucleases.
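
    In software terms, such a device behaves like a finite automaton over the alphabet {a, b}. The sketch below runs a six-state deterministic automaton on the example words from the study; the transition table is a hypothetical illustration (here it simply accepts words beginning with 'a'), not the grammar encoded biochemically in the paper.

    ```python
    # Software analogue of a six-state DFA over {a, b}; the table is illustrative.
    TRANSITIONS = {
        ("s0", "a"): "s1", ("s0", "b"): "s3",
        ("s1", "a"): "s2", ("s1", "b"): "s5",
        ("s2", "a"): "s2", ("s2", "b"): "s5",
        ("s3", "a"): "s4", ("s3", "b"): "s3",
        ("s4", "a"): "s4", ("s4", "b"): "s3",
        ("s5", "a"): "s2", ("s5", "b"): "s5",
    }
    ACCEPTING = {"s1", "s2", "s5"}

    def accepts(word: str, start: str = "s0") -> bool:
        """Run the word through the transition table and test the final state."""
        state = start
        for symbol in word:
            state = TRANSITIONS[(state, symbol)]
        return state in ACCEPTING

    for w in ["ab", "aab", "aaab", "ababa", "ba"]:
        print(w, "accepted" if accepts(w) else "rejected")
    ```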

  5. Recovery Act. Development and Validation of an Advanced Stimulation Prediction Model for Enhanced Geothermal System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gutierrez, Marte

    The research project aims to develop and validate an advanced computer model that can be used in the planning and design of stimulation techniques to create engineered reservoirs for Enhanced Geothermal Systems. The specific objectives of the proposal are to: 1) Develop a true three-dimensional hydro-thermal fracturing simulator that is particularly suited for EGS reservoir creation. 2) Perform laboratory scale model tests of hydraulic fracturing and proppant flow/transport using a polyaxial loading device, and use the laboratory results to test and validate the 3D simulator. 3) Perform discrete element/particulate modeling of proppant transport in hydraulic fractures, and use the results to improve understanding of proppant flow and transport. 4) Test and validate the 3D hydro-thermal fracturing simulator against case histories of EGS energy production. 5) Develop a plan to commercialize the 3D fracturing and proppant flow/transport simulator. The project is expected to yield several specific results and benefits. Major technical products from the proposal include: 1) A true-3D hydro-thermal fracturing computer code that is particularly suited to EGS, 2) Documented results of scale model tests on hydro-thermal fracturing and fracture propping in an analogue crystalline rock, 3) Documented procedures and results of discrete element/particulate modeling of flow and transport of proppants for EGS applications, and 4) A database of monitoring data, with a focus on Acoustic Emissions (AE) from lab scale modeling and field case histories of EGS reservoir creation.

  6. Recovery Act. Development and Validation of an Advanced Stimulation Prediction Model for Enhanced Geothermal Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gutierrez, Marte

    2013-12-31

    This research project aims to develop and validate an advanced computer model that can be used in the planning and design of stimulation techniques to create engineered reservoirs for Enhanced Geothermal Systems. The specific objectives of the proposal are to: develop a true three-dimensional hydro-thermal fracturing simulator that is particularly suited for EGS reservoir creation; perform laboratory scale model tests of hydraulic fracturing and proppant flow/transport using a polyaxial loading device, and use the laboratory results to test and validate the 3D simulator; perform discrete element/particulate modeling of proppant transport in hydraulic fractures, and use the results to improve understanding of proppant flow and transport; test and validate the 3D hydro-thermal fracturing simulator against case histories of EGS energy production; and develop a plan to commercialize the 3D fracturing and proppant flow/transport simulator. The project is expected to yield several specific results and benefits. Major technical products from the proposal include: a true-3D hydro-thermal fracturing computer code that is particularly suited to EGS; documented results of scale model tests on hydro-thermal fracturing and fracture propping in an analogue crystalline rock; documented procedures and results of discrete element/particulate modeling of flow and transport of proppants for EGS applications; and a database of monitoring data, with a focus on Acoustic Emissions (AE) from lab scale modeling and field case histories of EGS reservoir creation.

  7. Systems engineering and integration: Advanced avionics laboratories

    NASA Technical Reports Server (NTRS)

    1990-01-01

    In order to develop the new generation of avionics which will be necessary for upcoming programs such as the Lunar/Mars Initiative, Advanced Launch System, and the National Aerospace Plane, new Advanced Avionics Laboratories are required. To minimize costs and maximize benefits, these laboratories should be capable of supporting multiple avionics development efforts at a single location, and should be of a common design to support and encourage data sharing. Recent technological advances make it possible for designers and analysts to perform simulation and testing in an environment similar to their engineering environment, and these features should be incorporated into the new laboratories. Existing and emerging hardware and software standards must be incorporated wherever possible to provide additional cost savings and compatibility. Special care must be taken to design the laboratories such that real-time hardware-in-the-loop performance is not sacrificed in the pursuit of these goals. A special program-independent funding source should be identified for the development of Advanced Avionics Laboratories as resources supporting a wide range of upcoming NASA programs.

  8. 76 FR 41234 - Advanced Scientific Computing Advisory Committee Charter Renewal

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-13

    ... Secretariat, General Services Administration, notice is hereby given that the Advanced Scientific Computing... advice and recommendations concerning the Advanced Scientific Computing program in response only to... Advanced Scientific Computing Research program and recommendations based thereon; --Advice on the computing...

  9. Supercomputing Sheds Light on the Dark Universe

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Habib, Salman; Heitmann, Katrin

    2012-11-15

    At Argonne National Laboratory, scientists are using supercomputers to shed light on one of the great mysteries in science today, the Dark Universe. With Mira, a petascale supercomputer at the Argonne Leadership Computing Facility, a team led by physicists Salman Habib and Katrin Heitmann will run the largest, most complex simulation of the universe ever attempted. By contrasting the results from Mira with state-of-the-art telescope surveys, the scientists hope to gain new insights into the distribution of matter in the universe, advancing future investigations of dark energy and dark matter into a new realm. The team's research was named a finalist for the 2012 Gordon Bell Prize, an award recognizing outstanding achievement in high-performance computing.

  10. Recent Advancements in the Numerical Simulation of Surface Irradiance for Solar Energy Applications: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xie, Yu; Sengupta, Manajit; Deline, Chris

    This paper briefly reviews the National Renewable Energy Laboratory's recent efforts on developing all-sky solar irradiance models for solar energy applications. The Fast All-sky Radiation Model for Solar applications (FARMS) utilizes the simulation of clear-sky transmittance and reflectance and a parameterization of cloud transmittance and reflectance to rapidly compute broadband irradiances on horizontal surfaces. FARMS delivers accuracy that is comparable to the two-stream approximation, but it is approximately 1,000 times faster. A FARMS-Narrowband Irradiance over Tilted surfaces (FARMS-NIT) has been developed to compute spectral irradiances on photovoltaic (PV) panels in 2002 wavelength bands. Further, FARMS-NIT has been extended for bifacial PV panels.
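
    For context, the broadband quantities such models deliver are related by the standard decomposition of global horizontal irradiance. The sketch below is that textbook identity, not FARMS internals; the sample input values are arbitrary.

    ```python
    # Standard relation between global horizontal, direct-normal, and diffuse
    # irradiance components; a well-known identity, not the FARMS algorithm.
    import math

    def ghi(dni, dhi, zenith_deg):
        """Global horizontal irradiance (W/m^2) from direct and diffuse parts."""
        return dni * math.cos(math.radians(zenith_deg)) + dhi

    print(ghi(dni=800.0, dhi=100.0, zenith_deg=30.0))
    ```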

  11. Computational Science in Armenia (Invited Talk)

    NASA Astrophysics Data System (ADS)

    Marandjian, H.; Shoukourian, Yu.

    This survey is devoted to the development of informatics and computer science in Armenia. The results in theoretical computer science (algebraic models, solutions to systems of general-form recursive equations, methods of coding theory, pattern recognition, and image processing) constitute the theoretical basis for developing problem-solving-oriented environments. Examples include a synthesizer of optimized distributed recursive programs, software tools for cluster-oriented implementations of two-dimensional cellular automata, and a grid-aware web interface with advanced service trading for linear algebra calculations. In the direction of solving scientific problems that require high-performance computing resources, completed projects include work in physics (parallel computing of complex quantum systems), astrophysics (Armenian virtual laboratory), biology (molecular dynamics study of the human red blood cell membrane), and meteorology (implementing and evaluating the Weather Research and Forecast Model for the territory of Armenia). The overview also notes that the Institute for Informatics and Automation Problems of the National Academy of Sciences of Armenia has established a scientific and educational infrastructure uniting computing clusters of scientific and educational institutions of the country and provides the scientific community with access to local and international computational resources, which is strong support for computational science in Armenia.

  12. Mechanisms and Dynamics of Abiotic and Biotic Interactions at Environmental Interfaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rosso, Kevin M.

    The Stanford EMSI (SEMSI) was established in 2004 through joint funding by the National Science Foundation and the OBER-ERSD. It encompasses a number of universities and national laboratories. The PNNL component of the SEMSI is funded by ERSD and is the focus of this report. This component has the objective of providing theory support to the SEMSI by bringing computational capabilities and expertise to bear on important electron transfer problems at mineral/water and mineral/microbe interfaces. PNNL staff member Dr. Kevin Rosso, who is also "matrixed" into the Environmental Molecular Sciences Laboratory (EMSL) at PNNL, is a co-PI on the SEMSI project and the PNNL lead. The EMSL computational facilities being applied to the SEMSI project include the 11.8 teraflop massively-parallel supercomputer. Science goals of this EMSL/SEMSI partnership include advancing our understanding of: (1) the kinetics of U(VI) and Cr(VI) reduction by aqueous and solid-phase Fe(II), (2) the structure of mineral surfaces in equilibrium with solution, and (3) mechanisms of bacterial electron transfer to iron oxide surfaces via outer-membrane cytochromes.

  13. Transitioning EEG experiments away from the laboratory using a Raspberry Pi 2.

    PubMed

    Kuziek, Jonathan W P; Shienh, Axita; Mathewson, Kyle E

    2017-02-01

    Electroencephalography (EEG) experiments are typically performed in controlled laboratory settings to minimise noise and produce reliable measurements. These controlled conditions also reduce the applicability of the obtained results to more varied environments and may limit their relevance to everyday situations. Advances in computer portability may increase the mobility and applicability of EEG results while decreasing costs. In this experiment we show that stimulus presentation using a Raspberry Pi 2 computer provides a low cost, reliable alternative to a traditional desktop PC in the administration of EEG experimental tasks. Significant and reliable MMN and P3 activity, typical event-related potentials (ERPs) associated with an auditory oddball paradigm, were measured while experiments were administered using the Raspberry Pi 2. While latency differences in ERP triggering were observed between systems, these differences reduced power only marginally, likely due to the reduced processing power of the Raspberry Pi 2. An auditory oddball task administered using the Raspberry Pi 2 produced similar ERPs to those derived from a desktop PC in a laboratory setting. Despite temporal differences and slight increases in trials needed for similar statistical power, the Raspberry Pi 2 can be used to design and present auditory experiments comparable to a PC. Our results show that the Raspberry Pi 2 is a low cost alternative to the desktop PC when administering EEG experiments and, due to its small size and low power consumption, will enable mobile EEG experiments unconstrained by a traditional laboratory setting. Copyright © 2016 Elsevier B.V. All rights reserved.
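
    A minimal, dependency-free sketch of the kind of auditory oddball sequence such a system must present is given below. The 80/20 standard-to-target split, tone frequencies, trial count, and inter-stimulus interval are common defaults assumed here for illustration, not parameters reported in the paper, and the tone-playback call is left as a placeholder since the audio backend is platform-specific.

    ```python
    # Sketch of an auditory oddball trial sequence with ERP trigger timestamps.
    import random
    import time

    STANDARD_HZ, TARGET_HZ = 1000, 1500   # assumed tone frequencies
    N_TRIALS, P_TARGET = 250, 0.2         # assumed trial count and target rate

    def make_sequence(n=N_TRIALS, p=P_TARGET, seed=42):
        """Draw a reproducible sequence of standard/target tones."""
        rng = random.Random(seed)
        return [TARGET_HZ if rng.random() < p else STANDARD_HZ for _ in range(n)]

    def run(sequence, isi_s=1.0):
        """Present each tone and log a timestamp usable as an ERP trigger."""
        for freq in sequence:
            t = time.monotonic()
            # play_tone(freq) would go here; hypothetical, platform-specific
            kind = "target" if freq == TARGET_HZ else "standard"
            print(f"{t:.4f}\t{kind}")
            time.sleep(isi_s)

    run(make_sequence()[:5])  # demo on the first five trials
    ```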

  14. A curriculum for real-time computer and control systems engineering

    NASA Technical Reports Server (NTRS)

    Halang, Wolfgang A.

    1990-01-01

    An outline of a syllabus for the education of real-time-systems engineers is given. This comprises the treatment of basic concepts; real-time software engineering and programming in high-level real-time languages; real-time operating systems, with special emphasis on topics such as task scheduling; hardware architectures, especially distributed automation structures; process interfacing; system reliability and fault tolerance; and integrated project development support systems. Accompanying course material and laboratory work are outlined, and suggestions for establishing a laboratory with advanced, but low-cost, hardware and software are provided. How the curriculum can be extended into a second semester is discussed, and areas for possible graduate research are listed. The suitable selection of a high-level real-time language and supporting operating system for teaching purposes is considered.

  15. A straightforward graphical user interface for basic and advanced signal processing of thermographic infrared sequences

    NASA Astrophysics Data System (ADS)

    Klein, Matthieu T.; Ibarra-Castanedo, Clemente; Maldague, Xavier P.; Bendada, Abdelhakim

    2008-03-01

    IR-View is a free and open source Matlab software package that was released in 1998 at the Computer Vision and Systems Laboratory (CVSL) at Université Laval, Canada, as an answer to many common and recurrent needs in infrared thermography. IR-View has proven to be a useful tool at CVSL for the past 10 years. The software by itself and/or its concept and functions may be of interest for other laboratories and companies working in research in the IR NDT field. This article describes the functions and processing techniques integrated into IR-View, freely downloadable under the GNU license at http://mivim.gel.ulaval.ca. Demonstration of IR-View functionalities will also be done during the DSS08 SPIE Defense and Security Symposium.

  16. Propel: A Discontinuous-Galerkin Finite Element Code for Solving the Reacting Navier-Stokes Equations

    NASA Astrophysics Data System (ADS)

    Johnson, Ryan; Kercher, Andrew; Schwer, Douglas; Corrigan, Andrew; Kailasanath, Kazhikathra

    2017-11-01

    This presentation focuses on the development of a Discontinuous Galerkin (DG) method for application to chemically reacting flows. The in-house code, called Propel, was developed by the Laboratory of Computational Physics and Fluid Dynamics at the Naval Research Laboratory. It was designed specifically for developing advanced multi-dimensional algorithms to run efficiently on new and innovative architectures such as GPUs. For these results, Propel solves for convection and diffusion simultaneously with detailed transport and thermodynamics. Chemistry is currently solved in a time-split approach using Strang-splitting with finite element DG time integration of chemical source terms. Results presented here show canonical unsteady reacting flow cases, such as co-flow and splitter plate, and we report performance for higher order DG on CPU and GPUs.

  17. Electromyographic studies of motor control in humans.

    PubMed

    Shahani, B T; Wierzbicka, M M

    1987-11-01

    Electromyography and electroneurography have proved to be useful in the investigation and understanding of a variety of neurologic disorders. In most laboratories, however, these electrodiagnostic techniques have been used to help in the diagnosis of diseases that affect the peripheral nerves, neuromuscular junctions, or skeletal muscle fibers. Although major advances in electronic and computer technology have made it possible to study, quantitate, and document reflex activity in intact human subjects, most neurologists still rely on gross clinical observations and most electromyographers continue to use conventional techniques of EMG and nerve conduction studies to differentiate "myopathy" from "neuropathy." This article is a review of some of the electromyographic techniques that have been used in the authors' laboratory for the study of normal and abnormal motor control in man and the treatment of patients with disorders of motor control.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    East, D. R.; Sexton, J.

    This was a collaborative effort between Lawrence Livermore National Security, LLC as manager and operator of Lawrence Livermore National Laboratory (LLNL) and IBM TJ Watson Research Center to research, assess feasibility and develop an implementation plan for a High Performance Computing Innovation Center (HPCIC) in the Livermore Valley Open Campus (LVOC). The ultimate goal of this work was to help advance the State of California and U.S. commercial competitiveness in the arena of High Performance Computing (HPC) by accelerating the adoption of computational science solutions, consistent with recent DOE strategy directives. The desired result of this CRADA was a well-researched, carefully analyzed market evaluation that would identify those firms in core sectors of the US economy seeking to adopt or expand their use of HPC to become more competitive globally, and to define how those firms could be helped by the HPCIC with IBM as an integral partner.

  19. Robot Design

    NASA Technical Reports Server (NTRS)

    1988-01-01

    Martin Marietta Aero and Naval Systems has advanced the CAD art to a very high level at its Robotics Laboratory. One of the company's major projects is construction of a huge Field Material Handling Robot for the Army's Human Engineering Lab. Design of the FMR, intended to move heavy and dangerous material such as ammunition, was a triumph in CAD engineering. Separate computer programs modeled the robot's kinematics and dynamics, yielding such parameters as the strength of materials required for each component, the length of the arms, their degrees of freedom, and the power of the hydraulic system needed. The Robotics Lab went a step further and added data enabling computer simulation and animation of the robot's total operational capability under various loading and unloading conditions. A NASA computer program, the Integrated Analysis Capability (IAC) engineering database, was used. The program contains a series of modules that can stand alone or be integrated with data from sensors or software tools.

  20. Benefit from NASA

    NASA Image and Video Library

    2001-01-01

    The high-tech art of digital signal processing (DSP) was pioneered at NASA's Jet Propulsion Laboratory (JPL) in the mid-1960s for use in the Apollo Lunar Landing Program. Designed to computer-enhance pictures of the Moon, this technology became the basis for the Landsat Earth resources satellites and subsequently has been incorporated into a broad range of Earthbound medical and diagnostic tools. DSP is employed in advanced body imaging techniques including Computer-Aided Tomography, also known as CT and CATScan, and Magnetic Resonance Imaging (MRI). CT images are collected by irradiating a thin slice of the body with a fan-shaped x-ray beam from a number of directions around the body's perimeter. A tomographic (slice-like) picture is reconstructed from these multiple views by a computer. MRI employs a magnetic field and radio waves, rather than x-rays, to create images. In this photograph, a patient undergoes an open MRI.

  1. Personal Computer-less (PC-less) Microcontroller Training Kit

    NASA Astrophysics Data System (ADS)

    Somantri, Y.; Wahyudin, D.; Fushilat, I.

    2018-02-01

    A microcontroller training kit is necessary for the practical work of students in electrical engineering education. However, available training kits are not only costly but also often fail to meet laboratory requirements. An affordable and portable microcontroller kit could solve this problem. This paper explains the design and development of a Personal Computer-less (PC-less) Microcontroller Training Kit. It was developed based on a Lattepanda processor with an Arduino microcontroller as the target. The training kit is equipped with advanced input-output interfaces and adopts a low-cost, low-power design. Preliminary usability testing proved that the device can be used as a tool for microcontroller programming and industrial automation training. By adopting the concept of portability, the device can be operated in rural areas where electricity and computer infrastructure are limited. Furthermore, the training kit is suitable for electrical engineering students from universities and vocational high schools.

  2. Data and results of a laboratory investigation of microprocessor upset caused by simulated lightning-induced analog transients

    NASA Technical Reports Server (NTRS)

    Belcastro, C. M.

    1984-01-01

    Advanced composite aircraft designs include fault-tolerant computer-based digital control systems with high reliability requirements for adverse as well as optimum operating environments. Since aircraft penetrate intense electromagnetic fields during thunderstorms, onboard computer systems may be subjected to field-induced transient voltages and currents resulting in functional error modes which are collectively referred to as digital system upset. A methodology was developed for assessing the upset susceptibility of a computer system onboard an aircraft flying through a lightning environment. Upset error modes in a general-purpose microprocessor were studied via tests which involved the random input of analog transients, which model lightning-induced signals, onto interface lines of an 8080-based microcomputer, from which upset error data were recorded. The application of Markov modeling to upset susceptibility estimation is discussed and a stochastic model is developed.
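
    To illustrate the Markov-modeling idea mentioned above, the sketch below computes the probability that a processor has entered an upset state after n induced transients, using a two-state discrete-time chain. The per-transient upset probability and the absorbing treatment of the upset state are illustrative assumptions, not estimates from the experiment.

    ```python
    # Toy two-state Markov model of digital upset: state 0 = nominal,
    # state 1 = upset (treated as absorbing until an external reset).
    import numpy as np

    p_upset = 0.02                            # illustrative P(upset | transient)
    P = np.array([[1.0 - p_upset, p_upset],   # transitions from nominal
                  [0.0,           1.0]])      # transitions from upset

    start = np.array([1.0, 0.0])              # begin in the nominal state
    for n in (1, 10, 100):
        prob_upset = (start @ np.linalg.matrix_power(P, n))[1]
        print(f"P(upset after {n:3d} transients) = {prob_upset:.4f}")
    ```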

  3. Computational studies of horizontal axis wind turbines in high wind speed condition using advanced turbulence models

    NASA Astrophysics Data System (ADS)

    Benjanirat, Sarun

    Next generation horizontal-axis wind turbines (HAWTs) will operate at very high wind speeds. Existing engineering approaches for modeling the flow phenomena are based on blade element theory, and cannot adequately account for 3-D separated, unsteady flow effects. Therefore, researchers around the world are beginning to model these flows using first principles-based computational fluid dynamics (CFD) approaches. In this study, an existing first principles-based Navier-Stokes approach is being enhanced to model HAWTs at high wind speeds. The enhancements include improved grid topology, implicit time-marching algorithms, and advanced turbulence models. The advanced turbulence models include the Spalart-Allmaras one-equation model and the k-ε, k-ω, and Shear Stress Transport (k-ω SST) models. These models are also integrated with detached eddy simulation (DES) models. Results are presented for a range of wind speeds, for a configuration termed National Renewable Energy Laboratory Phase VI rotor, tested at NASA Ames Research Center. Grid sensitivity studies are also presented. Additionally, effects of existing transition models on the predictions are assessed. Data presented include power/torque production, radial distribution of normal and tangential pressure forces, root bending moments, and surface pressure fields. Good agreement was obtained between the predictions and experiments for most of the conditions, particularly with the Spalart-Allmaras-DES model.

  4. Rapid execution of fan beam image reconstruction algorithms using efficient computational techniques and special-purpose processors

    NASA Astrophysics Data System (ADS)

    Gilbert, B. K.; Robb, R. A.; Chu, A.; Kenue, S. K.; Lent, A. H.; Swartzlander, E. E., Jr.

    1981-02-01

    Rapid advances during the past ten years of several forms of computer-assisted tomography (CT) have resulted in the development of numerous algorithms to convert raw projection data into cross-sectional images. These reconstruction algorithms are either 'iterative,' in which a large matrix algebraic equation is solved by successive approximation techniques; or 'closed form'. Continuing evolution of the closed form algorithms has allowed the newest versions to produce excellent reconstructed images in most applications. This paper will review several computer software and special-purpose digital hardware implementations of closed form algorithms, either proposed during the past several years by a number of workers or actually implemented in commercial or research CT scanners. The discussion will also cover a number of recently investigated algorithmic modifications which reduce the amount of computation required to execute the reconstruction process, as well as several new special-purpose digital hardware implementations under development in laboratories at the Mayo Clinic.
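
    A compact illustration of the closed-form family (filtered backprojection) is given below using scikit-image. Note that iradon implements the parallel-beam geometry, a simplification of the fan-beam case discussed above, and the filter_name keyword assumes a reasonably recent scikit-image release.

    ```python
    # Closed-form reconstruction demo: project a phantom, then filter and
    # backproject it; parallel-beam geometry, not fan-beam.
    import numpy as np
    from skimage.data import shepp_logan_phantom
    from skimage.transform import radon, iradon

    image = shepp_logan_phantom()
    angles = np.linspace(0.0, 180.0, max(image.shape), endpoint=False)

    sinogram = radon(image, theta=angles)                       # forward projection
    recon = iradon(sinogram, theta=angles, filter_name="ramp")  # filtered backprojection

    print("RMS error:", np.sqrt(np.mean((recon - image) ** 2)))
    ```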

  5. Using the Computer as a Laboratory Instrument.

    ERIC Educational Resources Information Center

    Collings, Peter J.; Greenslade, Thomas B., Jr.

    1989-01-01

    Reports experiences during a two-year period in introducing the computer to the laboratory and students to the computer as a laboratory instrument. Describes a working philosophy, data acquisition system, and experiments. Summarizes the laboratory procedures of nine experiments, covering mechanics, heat, electromagnetism, and optics. (YP)

  6. 3D Printing in the Laboratory: Maximize Time and Funds with Customized and Open-Source Labware.

    PubMed

    Coakley, Meghan; Hurt, Darrell E

    2016-08-01

    3D printing, also known as additive manufacturing, is the computer-guided process of fabricating physical objects by depositing successive layers of material. It has transformed manufacturing across virtually every industry, bringing about incredible advances in research and medicine. The rapidly growing consumer market now includes convenient and affordable "desktop" 3D printers. These are being used in the laboratory to create custom 3D-printed equipment, and a growing community of designers are contributing open-source, cost-effective innovations that can be used by both professionals and enthusiasts. User stories from investigators at the National Institutes of Health and the biomedical research community demonstrate the power of 3D printing to save valuable time and funding. While adoption of 3D printing has been slow in the biosciences to date, the potential is vast. The market predicts that within several years, 3D printers could be commonplace within the home; with so many practical uses for 3D printing, we anticipate that the technology will also play an increasingly important role in the laboratory. © 2016 Society for Laboratory Automation and Screening.

  7. Use of the computational-informational web-GIS system for the development of climatology students' skills in modeling and understanding climate change

    NASA Astrophysics Data System (ADS)

    Gordova, Yulia; Martynova, Yulia; Shulgina, Tamara

    2015-04-01

    The current situation with the training of specialists in the environmental sciences is complicated by the fact that the field itself is experiencing a period of rapid development. Global change has driven the development of techniques for measuring and modeling environmental characteristics, accompanied by an expansion of the conceptual and mathematical apparatus. Understanding and forecasting processes in the Earth system requires extensive use of mathematical modeling and advanced computing technologies. As a rule, available training programs in the environmental sciences cannot adapt quickly enough to such rapid changes in domain content. As a result, graduates do not understand the processes and mechanisms of global change and have only superficial knowledge of mathematical modeling of processes in the environment. They lack the required skills in numerical modeling and in processing and analyzing observations and computation outputs, and they are not prepared to work with meteorological data. For adequate training of future specialists in the environmental sciences, we propose the following approach, which reflects the new "research" paradigm in education. We believe that such specialists should be trained not in an artificial learning environment but on actual operating information-computational systems used in environmental studies, in so-called virtual research environments, via the development of virtual research and learning laboratories. The report discusses the results of using the computational-informational web-GIS system "Climate" (http://climate.scert.ru/) as a prototype of such a laboratory. The approach has been implemented at Tomsk State University to prepare bachelors in meteorology. A student survey shows that students' knowledge became deeper and more systematic after training in the virtual learning laboratory. The scientific team plans to assist any educators who wish to use the system in earth science education. This work is partially supported by SB RAS project VIII.80.2.1 and RFBR grants 13-05-12034 and 14-05-00502.

  8. Laboratory Computing Resource Center

    Science.gov Websites


  9. An Overview of High Performance Computing and Challenges for the Future

    ScienceCinema

    Google Tech Talks

    2017-12-09

    In this talk we examine how high performance computing has changed over the last 10 years and look toward the future in terms of trends. These changes have had and will continue to have a major impact on our software. A new generation of software libraries and algorithms is needed for the effective and reliable use of (wide area) dynamic, distributed and parallel environments. Some of the software and algorithm challenges have already been encountered, such as management of communication and memory hierarchies through a combination of compile-time and run-time techniques, but the increased scale of computation, depth of memory hierarchies, range of latencies, and increased run-time environment variability will make these problems much harder. We will focus on the redesign of software to fit multicore architectures. Speaker: Jack Dongarra, University of Tennessee, Oak Ridge National Laboratory, University of Manchester. Jack Dongarra received a Bachelor of Science in Mathematics from Chicago State University in 1972 and a Master of Science in Computer Science from the Illinois Institute of Technology in 1973. He received his Ph.D. in Applied Mathematics from the University of New Mexico in 1980. He worked at the Argonne National Laboratory until 1989, becoming a senior scientist. He now holds an appointment as University Distinguished Professor of Computer Science in the Electrical Engineering and Computer Science Department at the University of Tennessee, has the position of a Distinguished Research Staff member in the Computer Science and Mathematics Division at Oak Ridge National Laboratory (ORNL), Turing Fellow in the Computer Science and Mathematics Schools at the University of Manchester, and an Adjunct Professor in the Computer Science Department at Rice University. He specializes in numerical algorithms in linear algebra, parallel computing, the use of advanced-computer architectures, programming methodology, and tools for parallel computers. His research includes the development, testing and documentation of high quality mathematical software. He has contributed to the design and implementation of the following open source software packages and systems: EISPACK, LINPACK, the BLAS, LAPACK, ScaLAPACK, Netlib, PVM, MPI, NetSolve, Top500, ATLAS, and PAPI. He has published approximately 200 articles, papers, reports and technical memoranda and he is coauthor of several books. He was awarded the IEEE Sid Fernbach Award in 2004 for his contributions in the application of high performance computers using innovative approaches. He is a Fellow of the AAAS, ACM, and the IEEE and a member of the National Academy of Engineering.

  10. An Overview of High Performance Computing and Challenges for the Future

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Google Tech Talks

    In this talk we examine how high performance computing has changed over the last 10 years and look toward the future in terms of trends. These changes have had and will continue to have a major impact on our software. A new generation of software libraries and algorithms is needed for the effective and reliable use of (wide area) dynamic, distributed and parallel environments. Some of the software and algorithm challenges have already been encountered, such as management of communication and memory hierarchies through a combination of compile-time and run-time techniques, but the increased scale of computation, depth of memory hierarchies, range of latencies, and increased run-time environment variability will make these problems much harder. We will focus on the redesign of software to fit multicore architectures. Speaker: Jack Dongarra, University of Tennessee, Oak Ridge National Laboratory, University of Manchester. Jack Dongarra received a Bachelor of Science in Mathematics from Chicago State University in 1972 and a Master of Science in Computer Science from the Illinois Institute of Technology in 1973. He received his Ph.D. in Applied Mathematics from the University of New Mexico in 1980. He worked at the Argonne National Laboratory until 1989, becoming a senior scientist. He now holds an appointment as University Distinguished Professor of Computer Science in the Electrical Engineering and Computer Science Department at the University of Tennessee, has the position of a Distinguished Research Staff member in the Computer Science and Mathematics Division at Oak Ridge National Laboratory (ORNL), Turing Fellow in the Computer Science and Mathematics Schools at the University of Manchester, and an Adjunct Professor in the Computer Science Department at Rice University. He specializes in numerical algorithms in linear algebra, parallel computing, the use of advanced-computer architectures, programming methodology, and tools for parallel computers. His research includes the development, testing and documentation of high quality mathematical software. He has contributed to the design and implementation of the following open source software packages and systems: EISPACK, LINPACK, the BLAS, LAPACK, ScaLAPACK, Netlib, PVM, MPI, NetSolve, Top500, ATLAS, and PAPI. He has published approximately 200 articles, papers, reports and technical memoranda and he is coauthor of several books. He was awarded the IEEE Sid Fernbach Award in 2004 for his contributions in the application of high performance computers using innovative approaches. He is a Fellow of the AAAS, ACM, and the IEEE and a member of the National Academy of Engineering.

  11. An Overview of the Computational Physics and Methods Group at Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, Randal Scott

    CCS Division was formed to strengthen the visibility and impact of computer science and computational physics research on strategic directions for the Laboratory. Both computer science and computational science are now central to scientific discovery and innovation. They have become indispensable tools for all other scientific missions at the Laboratory. CCS Division forms a bridge between external partners and Laboratory programs, bringing new ideas and technologies to bear on today's important problems and attracting high-quality technical staff members to the Laboratory. The Computational Physics and Methods Group CCS-2 conducts methods research and develops scientific software aimed at the latest and emerging HPC systems.

  12. Advanced engineering environment collaboration project.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lamph, Jane Ann; Pomplun, Alan R.; Kiba, Grant W.

    2008-12-01

    The Advanced Engineering Environment (AEE) is a model for an engineering design and communications system that will enhance project collaboration throughout the nuclear weapons complex (NWC). Sandia National Laboratories and Parametric Technology Corporation (PTC) worked together on a prototype project to evaluate the suitability of a portion of PTC's Windchill 9.0 suite of data management, design and collaboration tools as the basis for an AEE. The AEE project team implemented Windchill 9.0 development servers in both classified and unclassified domains and used them to test and evaluate the Windchill tool suite relative to the needs of the NWC using weapons project use cases. A primary deliverable was the development of a new real time collaborative desktop design and engineering process using PDMLink (data management tool), Pro/Engineer (mechanical computer aided design tool) and ProductView Lite (visualization tool). Additional project activities included evaluations of PTC's electrical computer aided design, visualization, and engineering calculations applications. This report documents the AEE project work to share information and lessons learned with other NWC sites. It also provides PTC with recommendations for improving their products for NWC applications.

  13. Assessing your competitors' application of CIM/CIP. [Computer Integrated Manufacturing/Processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    King, M.J.; Evans, H.N.

    1993-07-01

    As part of the authors' consulting assignments, they are frequently asked to describe what is best industry practice in the area of computer integrated manufacturing/processing (CIM/CIP). This might be specific to a particular piece, such as advanced controls or a laboratory system. Often it is in response to the enormous publicity given to CIM/CIP--begging the question, 'Who in the hydrocarbon industry is actually doing it?' Although much of this information is available to consultants, client confidentiality precludes its release. Instead, included is a questionnaire intended to be completed by representatives of manufacturing sites. The data gathered will be analyzed and reported in a future issue. The intent is to give anyone who has completed the questionnaire the opportunity to assess the position of his or her site with respect to the competition. To show how this might work, a prototype study was completed. This included an estimate of the advanced control benefits achieved in 68 refineries in Western Europe. So that sites could be compared, these were expressed as a percentage of the maximum economically achievable.

  14. NIMROD: A computational laboratory for studying nonlinear fusion magnetohydrodynamics

    NASA Astrophysics Data System (ADS)

    Sovinec, C. R.; Gianakon, T. A.; Held, E. D.; Kruger, S. E.; Schnack, D. D.

    2003-05-01

    Nonlinear numerical studies of macroscopic modes in a variety of magnetic fusion experiments are made possible by the flexible high-order accurate spatial representation and semi-implicit time advance in the NIMROD simulation code [A. H. Glasser et al., Plasma Phys. Controlled Fusion 41, A747 (1999)]. Simulation of a resistive magnetohydrodynamics mode in a shaped toroidal tokamak equilibrium demonstrates computation with disparate time scales, simulations of discharge 87009 in the DIII-D tokamak [J. L. Luxon et al., Plasma Physics and Controlled Nuclear Fusion Research 1986 (International Atomic Energy Agency, Vienna, 1987), Vol. I, p. 159] confirm an analytic scaling for the temporal evolution of an ideal mode subject to plasma-β increasing beyond marginality, and a spherical torus simulation demonstrates nonlinear free-boundary capabilities. A comparison of numerical results on magnetic relaxation finds the n=1 mode and flux amplification in spheromaks to be very closely related to the m=1 dynamo modes and magnetic reversal in reversed-field pinch configurations. Advances in local and nonlocal closure relations developed for modeling kinetic effects in fluid simulation are also described.
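
    For orientation, the single-fluid resistive MHD system that codes such as NIMROD advance can be written schematically as below. This is a standard textbook form, our summary rather than the paper's own statement; NIMROD's semi-implicit treatment adds an operator to the momentum equation so that time steps are not limited by the fastest wave speeds.

      \begin{aligned}
      &\partial_t \rho + \nabla\cdot(\rho\mathbf{v}) = 0, \\
      &\rho\,(\partial_t \mathbf{v} + \mathbf{v}\cdot\nabla\mathbf{v}) = \mathbf{J}\times\mathbf{B} - \nabla p, \\
      &\partial_t \mathbf{B} = -\nabla\times\mathbf{E}, \qquad
       \mathbf{E} = -\mathbf{v}\times\mathbf{B} + \eta\,\mathbf{J}, \qquad
       \mu_0\,\mathbf{J} = \nabla\times\mathbf{B}.
      \end{aligned}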

  15. 76 FR 31945 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-02

    ... DEPARTMENT OF ENERGY Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy... teleconference meeting of the Advanced Scientific Computing Advisory Committee (ASCAC). The Federal [email protected] . FOR FURTHER INFORMATION CONTACT: Melea Baker, Office of Advanced Scientific Computing...

  16. Incorporating 3D-printing technology in the design of head-caps and electrode drives for recording neurons in multiple brain regions.

    PubMed

    Headley, Drew B; DeLucca, Michael V; Haufler, Darrell; Paré, Denis

    2015-04-01

    Recent advances in recording and computing hardware have enabled laboratories to record the electrical activity of multiple brain regions simultaneously. Lagging behind these technical advances, however, are the methods needed to rapidly produce microdrives and head-caps that can flexibly accommodate different recording configurations. Indeed, most available designs target single or adjacent brain regions, and, if multiple sites are targeted, specially constructed head-caps are used. Here, we present a novel design style, for both microdrives and head-caps, which takes advantage of three-dimensional printing technology. This design facilitates targeting of multiple brain regions in various configurations. Moreover, the parts are easily fabricated in large quantities, with only minor hand-tooling and finishing required. Copyright © 2015 the American Physiological Society.

  17. Sandia Technology engineering and science accomplishments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    This report briefly discusses the following research being conducted at Sandia Laboratories: Advanced Manufacturing -- Sandia technology helps keep US industry in the lead; Microelectronics -- Sandia's unique facilities transform research advances into manufacturable products; Energy -- Sandia's energy programs focus on strengthening industrial growth and political decision making; Environment -- Sandia is a leader in environmentally conscious manufacturing and hazardous waste reduction; Health Care -- New biomedical technologies help reduce cost and improve quality of health care; Information & Computation -- Sandia aims to help make the information age a reality; Transportation -- This new initiative at the Labs will help improve transportation safety, efficiency, and economy; Nonproliferation -- Dismantlement and arms control are major areas of emphasis at Sandia; and Awards and Patents -- Talented, dedicated employees are the backbone of Sandia's success.

  18. Advanced TIL system for laser beam focusing in a turbulent regime

    NASA Astrophysics Data System (ADS)

    Sprangle, Phillip A.; Ting, Antonio C.; Kaganovich, Dmitry; Khizhnyak, Anatoliy I.; Tomov, Ivan V.; Markov, Vladimir B.; Korobkin, Dmitriy V.

    2014-10-01

    This paper discusses an advanced target-in-the-loop (ATIL) system whose performance is based on a nonlinear phase conjugation scheme that rapidly adjusts the laser beam wavefront to mitigate effects associated with atmospheric turbulence along the propagation path. The ATIL method allows positional control of the laser spot (the beacon) on a remote image-resolved target. The size of this beacon is governed by the reciprocity of two counterpropagating beams (one towards the target and another scattered by the target) and the fidelity of the phase conjugation scheme. We present the results of a thorough analysis of ATIL operation, the factors that affect its performance and focusing efficiency, and a comparison of laboratory experimental validation with computer simulation results.

  19. Incorporating 3D-printing technology in the design of head-caps and electrode drives for recording neurons in multiple brain regions

    PubMed Central

    DeLucca, Michael V.; Haufler, Darrell; Paré, Denis

    2015-01-01

    Recent advances in recording and computing hardware have enabled laboratories to record the electrical activity of multiple brain regions simultaneously. Lagging behind these technical advances, however, are the methods needed to rapidly produce microdrives and head-caps that can flexibly accommodate different recording configurations. Indeed, most available designs target single or adjacent brain regions, and, if multiple sites are targeted, specially constructed head-caps are used. Here, we present a novel design style, for both microdrives and head-caps, which takes advantage of three-dimensional printing technology. This design facilitates targeting of multiple brain regions in various configurations. Moreover, the parts are easily fabricated in large quantities, with only minor hand-tooling and finishing required. PMID:25652930

  20. How to Quickly Import CAD Geometry into Thermal Desktop

    NASA Technical Reports Server (NTRS)

    Wright, Shonte; Beltran, Emilio

    2002-01-01

    There are several groups at JPL (Jet Propulsion Laboratory) that are committed to concurrent design efforts; two are featured here. The Center for Space Mission Architecture and Design (CSMAD) enables the practical application of advanced process technologies in JPL's mission architecture process. Team I functions as an incubator for projects that are in the Discovery, and even pre-Discovery, proposal stages. JPL's concurrent design environment is to a large extent centered on the CAD (Computer Aided Design) file. During concurrent design sessions, CAD geometry is ported to other, more specialized engineering design packages.

  1. Millstone: software for multiplex microbial genome analysis and engineering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goodman, Daniel B.; Kuznetsov, Gleb; Lajoie, Marc J.

    Inexpensive DNA sequencing and advances in genome editing have made computational analysis a major rate-limiting step in adaptive laboratory evolution and microbial genome engineering. Here, we describe Millstone, a web-based platform that automates genotype comparison and visualization for projects with up to hundreds of genomic samples. To enable iterative genome engineering, Millstone allows users to design oligonucleotide libraries and create successive versions of reference genomes. Millstone is open source and easily deployable to a cloud platform, local cluster, or desktop, making it a scalable solution for any lab.
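
    As a rough illustration of the kind of genotype comparison such a platform automates, the following sketch contrasts variant calls across samples. It is our own toy example, not Millstone code; sample names and variants are hypothetical.

      # Toy genotype comparison: which variants are shared by all samples,
      # and which are unique to one sample. (Illustrative only.)
      from collections import defaultdict

      # Hypothetical per-sample variant calls as (position, ref, alt) tuples.
      calls = {
          "strain_1": {(1042, "A", "G"), (77421, "C", "T")},
          "strain_2": {(1042, "A", "G"), (90113, "G", "A")},
      }

      # Invert the mapping: each variant -> set of samples carrying it.
      by_variant = defaultdict(set)
      for sample, variants in calls.items():
          for v in variants:
              by_variant[v].add(sample)

      shared = {v for v, s in by_variant.items() if len(s) == len(calls)}
      unique = {v for v, s in by_variant.items() if len(s) == 1}
      print("shared by all samples:", shared)
      print("unique to one sample:", unique)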

  2. Millstone: software for multiplex microbial genome analysis and engineering.

    PubMed

    Goodman, Daniel B; Kuznetsov, Gleb; Lajoie, Marc J; Ahern, Brian W; Napolitano, Michael G; Chen, Kevin Y; Chen, Changping; Church, George M

    2017-05-25

    Inexpensive DNA sequencing and advances in genome editing have made computational analysis a major rate-limiting step in adaptive laboratory evolution and microbial genome engineering. We describe Millstone, a web-based platform that automates genotype comparison and visualization for projects with up to hundreds of genomic samples. To enable iterative genome engineering, Millstone allows users to design oligonucleotide libraries and create successive versions of reference genomes. Millstone is open source and easily deployable to a cloud platform, local cluster, or desktop, making it a scalable solution for any lab.

  3. Millstone: software for multiplex microbial genome analysis and engineering

    DOE PAGES

    Goodman, Daniel B.; Kuznetsov, Gleb; Lajoie, Marc J.; ...

    2017-05-25

    Inexpensive DNA sequencing and advances in genome editing have made computational analysis a major rate-limiting step in adaptive laboratory evolution and microbial genome engineering. Here, we describe Millstone, a web-based platform that automates genotype comparison and visualization for projects with up to hundreds of genomic samples. To enable iterative genome engineering, Millstone allows users to design oligonucleotide libraries and create successive versions of reference genomes. Millstone is open source and easily deployable to a cloud platform, local cluster, or desktop, making it a scalable solution for any lab.

  4. The evolving trend in spacecraft health analysis

    NASA Technical Reports Server (NTRS)

    Kirkpatrick, Russell L.

    1993-01-01

    At the Jet Propulsion Laboratory, the Space Flight Operations Center inaugurated the concept of a central repository for spacecraft data, with computing power distributed to the end users for that data's analysis. The Advanced Multimission Operations System is continuing the evolution of this concept as new technologies emerge. Constant improvements in data management tools, data visualization, and hardware lead to ever expanding ideas for improving the analysis of spacecraft health in an era of budget-constrained mission operations systems. The foundation of this evolution, its history, and its current plans will be discussed.

  5. Whale Identification

    NASA Technical Reports Server (NTRS)

    1991-01-01

    R:BASE for DOS, a computer program developed under NASA contract, has been adapted by the National Marine Mammal Laboratory and the College of the Atlantic to provide an advanced computerized photo-matching technique for identification of humpback whales. The program compares photos with stored digitized descriptions, enabling researchers to track whales and determine distribution and migration patterns. R:BASE is a spinoff of RIM (Relational Information Manager), which was used to store data for analyzing heat-shielding tiles on the Space Shuttle Orbiter. It is now the world's second largest-selling line of microcomputer database management software.

  6. Current Capabilities at SNL for the Integration of Small Modular Reactors onto Smart Microgrids Using Sandia's Smart Microgrid Technology High Performance Computing and Advanced Manufacturing.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rodriguez, Salvador B.

    Smart grids are a crucial component for enabling the nation's future energy needs, as part of a modernization effort led by the Department of Energy. Smart grids and smart microgrids are being considered in niche applications, and as part of a comprehensive energy strategy to help manage the nation's growing energy demands, for critical infrastructures, military installations, small rural communities, and large populations with limited water supplies. As part of a far-reaching strategic initiative, Sandia National Laboratories (SNL) presents herein a unique, three-pronged approach to integrate small modular reactors (SMRs) into microgrids, with the goal of providing economically competitive, reliable, and secure energy to meet the nation's needs. SNL's triad methodology involves an innovative blend of smart microgrid technology, high performance computing (HPC), and advanced manufacturing (AM). In this report, Sandia's current capabilities in those areas are summarized, as well as paths forward that will enable DOE to achieve its energy goals. In the area of smart grid/microgrid technology, Sandia's current computational capabilities can model the entire grid, including temporal aspects and cyber security issues. Our tools include system development, integration, testing and evaluation, monitoring, and sustainment.

  7. A wireless potentiostat for mobile chemical sensing and biosensing.

    PubMed

    Steinberg, Matthew D; Kassal, Petar; Kereković, Irena; Steinberg, Ivana Murković

    2015-10-01

    Wireless chemical sensors are used as analytical devices in homeland defence, home-based healthcare, food logistics, and more generally for the Sensor Internet of Things (SIoT). Presented here is a battery-powered and highly portable credit-card-size potentiostat that is suitable for performing mobile and wearable amperometric electrochemical measurements with seamless wireless data transfer to mobile computing devices. The mobile electrochemical analytical system has been evaluated in the laboratory with a model redox system - the reduction of hexacyanoferrate(III) - and also with commercially available enzymatic blood-glucose test strips. The potentiostat communicates wirelessly with mobile devices such as tablets or smartphones by near-field communication (NFC) or with personal computers by radio-frequency identification (RFID), and thus provides a solution to the 'missing link' in connectivity that often exists between low-cost mobile and wearable chemical sensors and ubiquitous mobile computing products. The mobile potentiostat has been evaluated in the laboratory with a set of proof-of-concept experiments, and its analytical performance compared with a commercial laboratory potentiostat (R² = 0.9999). These first experimental results demonstrate the functionality of the wireless potentiostat and suggest that the device could be suitable for wearable and point-of-sample analytical measurements. We conclude that the wireless potentiostat could contribute significantly to the advancement of mobile chemical sensor research and adoption, in particular for wearable sensors in healthcare and sport physiology, for wound monitoring, and in mobile point-of-sample diagnostics, as well as more generally as a part of the Sensor Internet of Things. Copyright © 2015 Elsevier B.V. All rights reserved.
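
    To make the reported figure of merit concrete: an R² near 1 from regressing the wireless device's readings against a reference instrument indicates near-perfect linear agreement. A minimal sketch of that comparison follows, with hypothetical currents; it is not the authors' analysis.

      # Linear agreement between a reference potentiostat and the wireless one.
      import numpy as np

      reference = np.array([1.0, 2.0, 4.0, 8.0, 16.0])       # reference currents (uA), hypothetical
      wireless = np.array([1.02, 1.98, 4.05, 7.96, 16.04])   # wireless readings (uA), hypothetical

      slope, intercept = np.polyfit(reference, wireless, 1)  # least-squares line
      predicted = slope * reference + intercept
      ss_res = np.sum((wireless - predicted) ** 2)           # residual sum of squares
      ss_tot = np.sum((wireless - wireless.mean()) ** 2)     # total sum of squares
      print(f"R^2 = {1 - ss_res / ss_tot:.4f}")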

  8. ANL statement of site strategy for computing workstations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fenske, K.R.; Boxberger, L.M.; Amiot, L.W.

    1991-11-01

    This Statement of Site Strategy describes the procedure at Argonne National Laboratory for defining, acquiring, using, and evaluating scientific and office workstations and related equipment and software in accord with DOE Order 1360.1A (5-30-85) and Laboratory policy. It is Laboratory policy to promote the installation and use of computing workstations to improve productivity and communications for both programmatic and support personnel, to ensure that computing workstation acquisitions meet the expressed need in a cost-effective manner, and to ensure that acquisitions of computing workstations are in accord with Laboratory and DOE policies. The overall computing site strategy at ANL is to develop a hierarchy of integrated computing system resources to address the current and future computing needs of the laboratory. The major system components of this hierarchical strategy are: supercomputers, parallel computers, centralized general-purpose computers, distributed multipurpose minicomputers, and computing workstations and office automation support systems. Computing workstations include personal computers, scientific and engineering workstations, computer terminals, microcomputers, word processing and office automation electronic workstations, and associated software and peripheral devices costing less than $25,000 per item.

  9. 75 FR 9887 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-04

    ... DEPARTMENT OF ENERGY Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy... Advanced Scientific Computing Advisory Committee (ASCAC). Federal Advisory Committee Act (Pub. L. 92-463... INFORMATION CONTACT: Melea Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building...

  10. 76 FR 9765 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-22

    ... DEPARTMENT OF ENERGY Advanced Scientific Computing Advisory Committee AGENCY: Office of Science... Advanced Scientific Computing Advisory Committee (ASCAC). The Federal Advisory Committee Act (Pub. L. 92... INFORMATION CONTACT: Melea Baker, Office of Advanced Scientific Computing Research, SC-21/Germantown Building...

  11. 77 FR 45345 - DOE/Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-31

    ... Recompetition results for Scientific Discovery through Advanced Computing (SciDAC) applications Co-design Public... DEPARTMENT OF ENERGY DOE/Advanced Scientific Computing Advisory Committee AGENCY: Office of... the Advanced Scientific Computing Advisory Committee (ASCAC). The Federal Advisory Committee Act (Pub...

  12. 75 FR 64720 - DOE/Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-20

    ... DEPARTMENT OF ENERGY DOE/Advanced Scientific Computing Advisory Committee AGENCY: Department of... the Advanced Scientific Computing Advisory Committee (ASCAC). Federal Advisory Committee Act (Pub. L.... FOR FURTHER INFORMATION CONTACT: Melea Baker, Office of Advanced Scientific Computing Research; SC-21...

  13. Computing Cosmic Cataclysms

    NASA Technical Reports Server (NTRS)

    Centrella, Joan M.

    2010-01-01

    The final merger of two black holes releases a tremendous amount of energy, more than the combined light from all the stars in the visible universe. This energy is emitted in the form of gravitational waves, and observing these sources with gravitational wave detectors requires that we know the pattern or fingerprint of the radiation emitted. Since black hole mergers take place in regions of extreme gravitational fields, we need to solve Einstein's equations of general relativity on a computer in order to calculate these wave patterns. For more than 30 years, scientists have tried to compute these wave patterns. However, their computer codes have been plagued by problems that caused them to crash. This situation has changed dramatically in the past few years, with a series of amazing breakthroughs. This talk will take you on this quest for these gravitational wave patterns, showing how a spacetime is constructed on a computer to build a simulation laboratory for binary black hole mergers. We will focus on the recent advances that are revealing these waveforms, and the dramatic new potential for discoveries that arises when these sources will be observed.

  14. Towards Reproducibility in Computational Hydrology

    NASA Astrophysics Data System (ADS)

    Hutton, Christopher; Wagener, Thorsten; Freer, Jim; Han, Dawei; Duffy, Chris; Arheimer, Berit

    2017-04-01

    Reproducibility is a foundational principle in scientific research. The ability to independently re-run an experiment helps to verify the legitimacy of individual findings, to evolve (or reject) hypotheses and models of how environmental systems function, and to move them from specific circumstances to more general theory. Yet in computational hydrology (and in environmental science more widely) the code and data that produce published results are not regularly made available, and even when they are, there remains a multitude of generally unreported choices made by an individual scientist that affect the study result. This situation strongly inhibits the ability of our community to reproduce and verify previous findings, as all the information and boundary conditions required to set up a computational experiment simply cannot be reported in an article's text alone. In Hutton et al. 2016 [1], we argue that a cultural change is required in the computational hydrology community in order to advance, and make more robust, the process of knowledge creation and hypothesis testing. We need to adopt common standards and infrastructures to: (1) make code readable and re-usable; (2) create well-documented workflows that combine re-usable code with data to enable published scientific findings to be reproduced; (3) make code and workflows available, easy to find, and easy to interpret, using code and code-metadata repositories. To create change we argue for improved graduate training in these areas. In this talk we reflect on our progress in achieving reproducible, open science in computational hydrology; the lessons are relevant to the broader computational geoscience community. In particular, we draw on our experience in the Switch-On (EU-funded) virtual water science laboratory (http://www.switch-on-vwsl.eu/participate/), an open platform for collaboration in hydrological experiments (e.g. [2]). While we use computational hydrology as the example application area, we believe that our conclusions are of value to the wider environmental and geoscience community as far as the use of code and models for scientific advancement is concerned. References: [1] Hutton, C., T. Wagener, J. Freer, D. Han, C. Duffy, and B. Arheimer (2016), Most computational hydrology is not reproducible, so is it really science?, Water Resour. Res., 52, 7548-7555, doi:10.1002/2016WR019285. [2] Ceola, S., et al. (2015), Virtual laboratories: New opportunities for collaborative water science, Hydrol. Earth Syst. Sci. Discuss., 11(12), 13443-13478, doi:10.5194/hessd-11-13443-2014.
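
    One low-cost step toward the well-documented workflows the authors call for is to record the exact software environment alongside each model run. A minimal sketch under stated assumptions (git and pip are available on the system; the file name is illustrative):

      # Capture provenance metadata so a model run can be re-created later.
      import json, platform, subprocess, sys
      from datetime import datetime, timezone

      def run_metadata():
          """Record when, where, and with which code/packages a run was made."""
          return {
              "timestamp": datetime.now(timezone.utc).isoformat(),
              "python": sys.version,
              "platform": platform.platform(),
              # Assumes the run happens inside a git repository.
              "commit": subprocess.run(["git", "rev-parse", "HEAD"],
                                       capture_output=True, text=True).stdout.strip(),
              "packages": subprocess.run([sys.executable, "-m", "pip", "freeze"],
                                         capture_output=True, text=True).stdout.splitlines(),
          }

      with open("run_metadata.json", "w") as f:
          json.dump(run_metadata(), f, indent=2)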

  15. PREFACE: 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT2013)

    NASA Astrophysics Data System (ADS)

    Wang, Jianxiong

    2014-06-01

    This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2013), which took place on 16-21 May 2013 at the Institute of High Energy Physics, Chinese Academy of Sciences, Beijing, China. The workshop series brings together computer science researchers and practitioners, and researchers from particle physics and related fields, to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. This year's edition of the workshop brought together over 120 participants from all over the world. 18 invited speakers presented key topics on the universe in the computer, computing in Earth sciences, multivariate data analysis, and automated computation in quantum field theory, as well as computing and data analysis challenges in many fields. Over 70 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. The round-table discussions on open source, knowledge sharing, and scientific collaboration stimulated reflection on these issues in the respective areas. ACAT 2013 was generously sponsored by the Chinese Academy of Sciences (CAS), the National Natural Science Foundation of China (NSFC), Brookhaven National Laboratory in the USA (BNL), Peking University (PKU), the Theoretical Physics Center for Science Facilities of CAS (TPCSF-CAS), and Sugon. We would like to thank all the participants for their scientific contributions and for their enthusiastic participation in all the activities of the workshop. Further information on ACAT 2013 can be found at http://acat2013.ihep.ac.cn. Professor Jianxiong Wang, Institute of High Energy Physics, Chinese Academy of Sciences. Details of committees and sponsors are available in the PDF.

  16. PREFACE: 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011)

    NASA Astrophysics Data System (ADS)

    Teodorescu, Liliana; Britton, David; Glover, Nigel; Heinrich, Gudrun; Lauret, Jérôme; Naumann, Axel; Speer, Thomas; Teixeira-Dias, Pedro

    2012-06-01

    ACAT 2011. This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011), which took place on 5-7 September 2011 at Brunel University, UK. The workshop series, which began in 1990 in Lyon, France, brings together computer science researchers and practitioners, and researchers from particle physics and related fields, in order to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. It is a forum for the exchange of ideas among the fields, exploring and promoting cutting-edge computing, data analysis and theoretical calculation techniques in fundamental physics research. This year's edition of the workshop brought together over 100 participants from all over the world. 14 invited speakers presented key topics on computing ecosystems, cloud computing, multivariate data analysis, and symbolic and automatic theoretical calculations, as well as computing and data analysis challenges in astrophysics, bioinformatics and musicology. Over 80 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. Panel and round-table discussions on data management and multivariate data analysis uncovered new ideas and collaboration opportunities in the respective areas. This edition of ACAT was generously sponsored by the Science and Technology Facilities Council (STFC), the Institute for Particle Physics Phenomenology (IPPP) at Durham University, Brookhaven National Laboratory in the USA, and Dell. We would like to thank all the participants of the workshop for the high level of their scientific contributions and for their enthusiastic participation in all its activities, which were, ultimately, the key factors in the success of the workshop. Further information on ACAT 2011 can be found at http://acat2011.cern.ch. Dr Liliana Teodorescu, Brunel University, ACAT group. The PDF also contains details of the workshop's committees and sponsors.

  17. Improved detonation modeling with CHEETAH

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heller, A.

    1997-11-01

    A Livermore software program called CHEETAH, an important, even indispensable tool for energetic materials researchers worldwide, was made more powerful in the summer of 1997 with the release of CHEETAH 2.0, an advanced version that simulates a wider variety of detonations. Derived from more than 40 years of experiments on high explosives at Lawrence Livermore and Los Alamos national laboratories, CHEETAH predicts the results of detonating a mixture of specified reactants. It operates by solving thermodynamic equations to predict detonation products and such properties as temperature, pressure, volume, and total energy released. The code is prized by synthesis chemists and other researchers because it allows them to vary the starting molecules and conditions to optimize the desired performance properties. One of the Laboratory's most popular computer codes, CHEETAH is used at more than 200 sites worldwide, including sites in England, Canada, Sweden, Switzerland, and France. Most sites are defense-related, although a few users, such as Japanese fireworks researchers, are in the civilian sector.
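
    Thermochemical codes of this kind typically determine the detonation-product mixture by minimizing the Gibbs free energy of the candidate product species subject to element conservation; schematically (our summary of the standard formulation, not CHEETAH's internal equations):

      \min_{n_i \ge 0}\; G(T,p,\{n_i\}) = \sum_i n_i\,\mu_i(T,p,\{n_i\})
      \qquad \text{subject to} \qquad \sum_i a_{ij}\,n_i = b_j \;\;\text{for each element } j,

    where n_i are the moles of product species i, a_ij counts the atoms of element j in species i, and b_j is fixed by the reactant mixture.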

  18. Modeling Laboratory Astrophysics Experiments using the CRASH code

    NASA Astrophysics Data System (ADS)

    Trantham, Matthew; Drake, R. P.; Grosskopf, Michael; Bauerle, Matthew; Kuranz, Carolyn; Keiter, Paul; Malamud, Guy; Crash Team

    2013-10-01

    The understanding of high-energy-density systems can be advanced by laboratory astrophysics experiments. Computer simulations can assist in the design and analysis of these experiments. The Center for Radiative Shock Hydrodynamics (CRASH) at the University of Michigan developed a code that has been used to design and analyze high-energy-density experiments on OMEGA, NIF, and other large laser facilities. This Eulerian code uses block-adaptive mesh refinement (AMR) with implicit multigroup radiation transport and electron heat conduction. This poster/talk will demonstrate some of the experiments the CRASH code has helped design or analyze, including radiative shock experiments, Kelvin-Helmholtz experiments, Rayleigh-Taylor experiments, plasma sheets, and interacting jets experiments. This work is funded by the Predictive Sciences Academic Alliances Program in NNSA-ASC via grant DE-FC52-08NA28616, by the NNSA-DS and SC-OFES Joint Program in High-Energy-Density Laboratory Plasmas, grant number DE-FG52-09NA29548, and by the National Laser User Facility Program, grant number DE-NA0000850.

  19. Six dimensional X-ray Tensor Tomography with a compact laboratory setup

    NASA Astrophysics Data System (ADS)

    Sharma, Y.; Wieczorek, M.; Schaff, F.; Seyyedi, S.; Prade, F.; Pfeiffer, F.; Lasser, T.

    2016-09-01

    Attenuation-based X-ray micro computed tomography (XCT) provides three-dimensional images with micrometer resolution. However, there is a trade-off between the smallest size of the structures that can be resolved and the measurable sample size. In this letter, we present an imaging method using a compact laboratory setup that reveals information about micrometer-sized structures within samples that are several orders of magnitude larger. We combine the anisotropic dark-field signal obtained in a grating interferometer and advanced tomographic reconstruction methods to reconstruct a six-dimensional scattering tensor at every spatial location in three dimensions. The scattering tensor, thus obtained, encodes information about the orientation of micron-sized structures such as fibres in composite materials or dentinal tubules in human teeth. The sparse acquisition schemes presented in this letter enable the measurement of the full scattering tensor at every spatial location and can be easily incorporated in a practical, commercially feasible laboratory setup using conventional X-ray tubes, thus allowing for widespread industrial applications.
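
    In anisotropic dark-field tomography of this kind, a common forward model treats the negative logarithm of the dark-field signal for a grating sensitivity direction t̂ as a line integral of a direction-dependent scattering coefficient (our paraphrase of the general approach, not necessarily this paper's exact formulation):

      -\ln d_{\hat{t}} = \int_{L} \epsilon(\mathbf{r},\hat{t})\,\mathrm{d}l ,

    with ε(r, t̂) expanded at each voxel in a small angular basis; the expansion coefficients form the per-voxel tensor that is reconstructed, from which the local structure orientation is read off.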

  20. Laboratory Directed Research and Development FY2011 Annual Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Craig, W; Sketchley, J; Kotta, P

    2012-03-22

    A premier applied-science laboratory, Lawrence Livermore National Laboratory (LLNL) has earned a reputation as a leader in providing science and technology solutions to the most pressing national and global security problems. The LDRD Program, established by Congress at all DOE national laboratories in 1991, is LLNL's most important single resource for fostering excellent science and technology for today's needs and tomorrow's challenges. The LDRD internally directed research and development funding at LLNL enables high-risk, potentially high-payoff projects at the forefront of science and technology. The LDRD Program at Livermore serves to: (1) support the Laboratory's missions, strategic plan, and foundational science; (2) maintain the Laboratory's science and technology vitality; (3) promote recruiting and retention; (4) pursue collaborations; (5) generate intellectual property; and (6) strengthen the U.S. economy. Myriad LDRD projects over the years have made important contributions to every facet of the Laboratory's mission and strategic plan, including its commitment to nuclear, global, and energy and environmental security, as well as cutting-edge science, technology, and engineering in high-energy-density matter, high-performance computing and simulation, materials and chemistry at the extremes, information systems, measurements and experimental science, and energy manipulation. A summary of each project was submitted by the principal investigator. Project summaries include the scope, motivation, goals, relevance to DOE/NNSA and LLNL mission areas, the technical progress achieved in FY11, and a list of publications that resulted from the research. The project areas are: (1) Nuclear Threat Reduction; (2) Biosecurity; (3) High-Performance Computing and Simulation; (4) Intelligence; (5) Cybersecurity; (6) Energy Security; (7) Carbon Capture; (8) Material Properties, Theory, and Design; (9) Radiochemistry; (10) High-Energy-Density Science; (11) Laser Inertial-Fusion Energy; (12) Advanced Laser Optical Systems and Applications; (13) Space Security; (14) Stockpile Stewardship Science; (15) National Security; (16) Alternative Energy; and (17) Climatic Change.

  1. GC/IR computer-aided identification of anaerobic bacteria

    NASA Astrophysics Data System (ADS)

    Ye, Hunian; Zhang, Feng S.; Yang, Hua; Li, Zhu; Ye, Song

    1993-09-01

    A new method was developed to identify anaerobic bacteria by using pattern recognition. The method depends on GC/IR data. The system is intended for use as a precise, rapid, and reproducible aid in the identification of unknown isolates. Key words: anaerobic bacteria, pattern recognition, computer-aided identification, GC/IR. 1. INTRODUCTION. A major problem in the field of anaerobic bacteriology is the difficulty in accurately, precisely, and rapidly identifying unknown isolates. In the proceedings of the Third International Symposium on Rapid Methods and Automation in Microbiology, C. M. Moss said: "Chromatographic analysis is a new future for clinical microbiology." Twelve years have passed, and so far it seems that this is an idea whose time has not yet come, but it is close. Two major advances that have brought the technology forward, in terms of making it appropriate for use in the clinical laboratory, can now be cited. One is the development and implementation of fused silica capillary columns. In contrast to packed columns and those of greater width, these columns allow reproducible recovery of hydroxy fatty acids with the same carbon chain length. The second advance is the efficient data processing afforded by modern microcomputer systems. On the other hand, the practical steps for sample preparation are also an advance in the clinical laboratory. Chromatographic analysis means mainly the analysis of fatty acids. The most common…
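
    As a toy illustration of the pattern-recognition step described above, an unknown isolate's fatty-acid profile can be matched to the nearest reference profile in a library. This sketch is ours, not the paper's method; species names and peak values are hypothetical.

      # Nearest-centroid identification from normalized GC peak-area profiles.
      import numpy as np

      # Hypothetical reference library of mean fatty-acid profiles.
      library = {
          "Bacteroides sp.": np.array([0.42, 0.10, 0.31, 0.17]),
          "Clostridium sp.": np.array([0.05, 0.55, 0.22, 0.18]),
      }

      def identify(profile):
          """Return the library entry closest to the unknown profile."""
          profile = profile / profile.sum()  # normalize raw peak areas
          return min(library, key=lambda sp: np.linalg.norm(profile - library[sp]))

      print(identify(np.array([4.0, 1.2, 3.0, 1.8])))  # -> "Bacteroides sp."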

  2. Radiation Detection Center on the Front Lines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hazi, A

    2005-09-20

    Many of today's radiation detection tools were developed in the 1960s. For years, the Laboratory's expertise in radiation detection resided mostly within its nuclear test program. When nuclear testing was halted in the 1990s, many of Livermore's radiation detection experts were dispersed to other parts of the Laboratory, including the directorates of Chemistry and Materials Science (CMS); Physics and Advanced Technologies (PAT); Defense and Nuclear Technologies (DNT); and Nonproliferation, Arms Control, and International Security (NAI). The RDC was formed to maximize the benefit of radiation detection technologies being developed in 15 to 20 research and development (R&D) programs. These efforts involve more than 200 Laboratory employees across eight directorates, in areas that range from electronics to computer simulations. The RDC's primary focus is the detection, identification, and analysis of nuclear materials and weapons. A newly formed outreach program within the RDC is responsible for conducting radiation detection workshops and seminars across the country and for coordinating university student internships. Simon Labov, director of the RDC, says, "Virtually all of the Laboratory's programs use radiation detection devices in some way. For example, DNT uses radiation detection to create radiographs for their work in stockpile stewardship and in diagnosing explosives; CMS uses it to develop technology for advancing the detection, diagnosis, and treatment of cancer; and the Energy and Environment Directorate uses radiation detection in the Marshall Islands to monitor the aftermath of nuclear testing in the Pacific. In the future, the National Ignition Facility will use radiation detection to probe laser targets and study shock dynamics."

  3. NEAMS Update. Quarterly Report for October - December 2011.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bradley, K.

    2012-02-16

    The Advanced Modeling and Simulation Office within the DOE Office of Nuclear Energy (NE) has been charged with revolutionizing the design tools used to build nuclear power plants during the next 10 years. To accomplish this, the DOE has brought together the national laboratories, U.S. universities, and the nuclear energy industry to establish the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Program. The mission of NEAMS is to modernize computer modeling of nuclear energy systems and improve the fidelity and validity of modeling results using contemporary software environments and high-performance computers. NEAMS will create a set of engineering-level codes aimed at designing and analyzing the performance and safety of nuclear power plants and reactor fuels. The truly predictive nature of these codes will be achieved by modeling the governing phenomena at the spatial and temporal scales that dominate the behavior. These codes will be executed within a simulation environment that orchestrates code integration with respect to spatial meshing, computational resources, and execution to give the user a common 'look and feel' for setting up problems and displaying results. NEAMS is building upon a suite of existing simulation tools, including those developed by the federal Scientific Discovery through Advanced Computing and Advanced Simulation and Computing programs. NEAMS also draws upon existing simulation tools for materials and nuclear systems, although many of these are limited in terms of scale, applicability, and portability (their ability to be integrated into contemporary software and hardware architectures). NEAMS investments have directly and indirectly supported additional NE research and development programs, including those devoted to waste repositories, safeguarded separations systems, and long-term storage of used nuclear fuel. NEAMS is organized into two broad efforts, each comprising four elements. The quarterly highlights for October-December 2011 are: (1) Version 1.0 of AMP, the fuel assembly performance code, was tested on the JAGUAR supercomputer and released on November 1, 2011; a detailed discussion of this new simulation tool is given. (2) A coolant sub-channel model and a preliminary UO{sub 2} smeared-cracking model were implemented in BISON, the single-pin fuel code; more information on how these models were developed and benchmarked is given. (3) The Object Kinetic Monte Carlo model was implemented to account for nucleation events in meso-scale simulations, and a discussion of the significance of this advance is given. (4) The SHARP neutronics module, PROTEUS, was expanded to be applicable to all types of reactors, and a discussion of the importance of PROTEUS is given. (5) A plan was finalized for integrating the high-fidelity, three-dimensional reactor code SHARP with both the systems-level code RELAP7 and the fuel assembly code AMP; this is a new initiative. (6) Work began to evaluate the applicability of AMP to the problem of dry storage of used fuel and to define a relevant test problem. (7) A code to obtain phonon spectra from the force-constant matrix for a crystalline lattice was completed; this important bridge between subcontinuum and continuum phenomena is discussed (see the sketch after this entry). (8) Benchmarking began on the meso-scale, finite-element fuels code MARMOT to validate its new variable-splitting algorithm. (9) A very computationally demanding simulation of diffusion-driven nucleation of new microstructural features was completed; an explanation of the difficulty of this simulation is given. (10) Experiments were conducted with deformed steel to validate a crystal plasticity finite-element code for body-centered cubic iron. (11) The Capability Transfer Roadmap was completed and published as an internal laboratory technical report. (12) The AMP fuel assembly code input generator was integrated into the NEAMS Integrated Computational Environment (NiCE); more detail on the planned NEAMS computing environment is given. (13) The NEAMS program website (neams.energy.gov) is nearly ready to launch.
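
    Regarding highlight (7): phonon spectra follow from the force-constant matrix by standard lattice dynamics, summarized here in our own notation. The force constants Φ are assembled into the mass-weighted dynamical matrix at each wavevector q, whose eigenvalues give the squared phonon frequencies:

      D_{\alpha\beta}(\kappa\kappa';\mathbf{q}) = \frac{1}{\sqrt{m_\kappa m_{\kappa'}}}
      \sum_{l'} \Phi_{\alpha\beta}(0\kappa; l'\kappa')\,
      e^{i\mathbf{q}\cdot(\mathbf{r}_{l'\kappa'}-\mathbf{r}_{0\kappa})},
      \qquad \det\!\left[D(\mathbf{q}) - \omega^2 I\right] = 0.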

  4. Advanced Engineering Fibers.

    ERIC Educational Resources Information Center

    Edie, Dan D.; Dunham, Michael G.

    1987-01-01

    Describes Clemson University's Advanced Engineered Fibers Laboratory, which was established to provide national leadership and expertise in developing the processing equipment and advanced fibers necessary for the chemical, fiber, and textile industries to enter the composite materials market. Discusses some of the laboratory's activities in…

  5. STRUCTURED LEARNING AND TRAINING ENVIRONMENTS--A PREPARATION LABORATORY FOR ADVANCED MAMMALIAN PHYSIOLOGY.

    ERIC Educational Resources Information Center

    FIEL, NICHOLAS J.; JOHNSTON, RAYMOND F.

    A PREPARATION LABORATORY WAS DESIGNED TO FAMILIARIZE STUDENTS IN ADVANCED MAMMALIAN PHYSIOLOGY WITH LABORATORY SKILLS AND TECHNIQUES AND THUS SHORTEN THE TIME THEY SPEND IN SETTING UP ACTUAL EXPERIMENTS. THE LABORATORY LASTS 30 MINUTES, IS FLEXIBLE AND SIMPLE TO OPERATE, AND DOES NOT REQUIRE A PROFESSOR'S PRESENCE. THE BASIC TRAINING UNIT IS THE…

  6. Murder, insanity, and medical expert witnesses.

    PubMed

    Ciccone, J R

    1992-06-01

    Recent advances in the ability to study brain anatomy and function, and attempts to link these findings with human behavior, have captured the attention of the legal system. This has led to the increasing use of the "neurological defense" to support a plea of not guilty by reason of insanity. This article traces the history of the insanity defense and explores the role of medical expert witnesses in integrating clinical and laboratory findings, e.g., computed tomographic scans, magnetic resonance scans, and single-photon emission computed tomographic scans. Three cases involving murder and brain dysfunction are discussed: the first case involves a subarachnoid hemorrhage resulting in visual perceptual and memory impairment; the second, a diagnosis of Alzheimer's disease; and the third, the controverted diagnosis of complex partial seizures in a serial killer.

  7. Crashworthiness: Planes, trains, and automobiles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Logan, R.W.; Tokarz, F.J.; Whirley, R.G.

    The powerful DYNA3D computer code simulates the dynamic effects of stress traveling through structures. It is the most advanced modeling tool available to study crashworthiness problems and to analyze impacts. Now used by some 1,000 companies, government research laboratories, and universities in the U.S. and abroad, DYNA3D is also a preeminent example of successful technology transfer. The initial interest in such a code was to simulate the structural response of weapons systems. The need was to model not the explosive or nuclear events themselves but rather the impacts of weapons systems with the ground, tracking the stress waves as they move through the object. This type of computer simulation augmented or, in certain cases, reduced the need for expensive and time-consuming crash testing.

  8. Real-time structured light intraoral 3D measurement pipeline

    NASA Astrophysics Data System (ADS)

    Gheorghe, Radu; Tchouprakov, Andrei; Sokolov, Roman

    2013-02-01

    Computer-aided design and manufacturing (CAD/CAM) is increasingly becoming a standard feature and service provided to patients in dentist offices and denture manufacturing laboratories. Although the quality of the tools and data has slowly improved in recent years, due to various surface measurement challenges, practical, accurate, in-vivo, real-time, high-quality 3D data acquisition and processing still need improvement. Advances in GPU computational power have made near real-time 3D intraoral in-vivo scanning of patients' teeth achievable. We explore in this paper, from a real-time perspective, a hardware-software-GPU solution that addresses all the requirements mentioned above. Moreover, we exemplify and quantify the hard and soft deadlines required by such a system and illustrate how they are supported in our implementation.

  9. The computational structural mechanics testbed architecture. Volume 2: Directives

    NASA Technical Reports Server (NTRS)

    Felippa, Carlos A.

    1989-01-01

    This is the second of a set of five volumes which describe the software architecture for the Computational Structural Mechanics Testbed. Derived from NICE, an integrated software system developed at Lockheed Palo Alto Research Laboratory, the architecture is composed of the command language (CLAMP), the command language interpreter (CLIP), and the data manager (GAL). Volumes 1, 2, and 3 (NASA CR's 178384, 178385, and 178386, respectively) describe CLAMP and CLIP and the CLIP-processor interface. Volumes 4 and 5 (NASA CR's 178387 and 178388, respectively) describe GAL and its low-level I/O. CLAMP, an acronym for Command Language for Applied Mechanics Processors, is designed to control the flow of execution of processors written for NICE. Volume 2 describes the CLIP directives in detail. It is intended for intermediate and advanced users.

  10. Center for Computational Structures Technology

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Perry, Ferman W.

    1995-01-01

    The Center for Computational Structures Technology (CST) is intended to serve as a focal point for the diverse CST research activities. The CST activities include the use of numerical simulation and artificial intelligence methods in modeling, analysis, sensitivity studies, and optimization of flight-vehicle structures. The Center is located at NASA Langley and is an integral part of the School of Engineering and Applied Science of the University of Virginia. The key elements of the Center are: (1) conducting innovative research on advanced topics of CST; (2) acting as pathfinder by demonstrating to the research community what can be done (high-potential, high-risk research); (3) strong collaboration with NASA scientists and researchers from universities and other government laboratories; and (4) rapid dissemination of CST to industry, through integration of industrial personnel into the ongoing research efforts.

  11. Evaluation of reliability modeling tools for advanced fault tolerant systems

    NASA Technical Reports Server (NTRS)

    Baker, Robert; Scheper, Charlotte

    1986-01-01

    The Computer Aided Reliability Estimation (CARE III) and Automated Reliability Interactive Estimation System (ARIES 82) reliability tools were evaluated for application to advanced fault-tolerant aerospace systems. To determine reliability modeling requirements, the evaluation focused on the Draper Laboratories' Advanced Information Processing System (AIPS) architecture as an example architecture for fault-tolerant aerospace systems. Advantages and limitations were identified for each reliability evaluation tool. The CARE III program was designed primarily for analyzing ultrareliable flight control systems. The ARIES 82 program's primary use was to support university research and teaching. Neither CARE III nor ARIES 82 was suited for determining the reliability of complex nodal networks of the type used to interconnect processing sites in the AIPS architecture. It was concluded that ARIES is not suitable for modeling advanced fault-tolerant systems. It was further concluded that, subject to some limitations (difficulty in modeling systems with unpowered spare modules, systems where equipment maintenance must be considered, systems where failure depends on the sequence in which faults occurred, and systems where multiple faults beyond double near-coincident faults must be considered), CARE III is best suited for evaluating the reliability of advanced fault-tolerant systems for air transport.

  12. A Computational Methodology for Simulating Thermal Loss Testing of the Advanced Stirling Convertor

    NASA Technical Reports Server (NTRS)

    Reid, Terry V.; Wilson, Scott D.; Schifer, Nicholas A.; Briggs, Maxwell H.

    2012-01-01

    The U.S. Department of Energy (DOE) and Lockheed Martin Space Systems Company (LMSSC) have been developing the Advanced Stirling Radioisotope Generator (ASRG) for use as a power system for space science missions. This generator would use two high-efficiency Advanced Stirling Convertors (ASCs), developed by Sunpower Inc. and NASA Glenn Research Center (GRC). The ASCs convert thermal energy from a radioisotope heat source into electricity. As part of ground testing of these ASCs, different operating conditions are used to simulate expected mission conditions. These conditions require achieving a particular operating frequency, hot-end and cold-end temperatures, and specified electrical power output for a given net heat input. In an effort to improve net heat input predictions, numerous tasks have been performed that provided a more accurate value for net heat input into the ASCs, including the use of multidimensional numerical models. Validation test hardware has also been used to provide a direct comparison of numerical results and to validate the multidimensional numerical models used to predict convertor net heat input and efficiency. These validation tests were designed to simulate the temperature profile of an operating Stirling convertor and resulted in a measured net heat input of 244.4 W. The methodology was applied to the multidimensional numerical model, which resulted in a net heat input of 240.3 W. The computational methodology thus yielded a value of net heat input 1.7 percent less than that measured during laboratory testing. The computational methodology and results are discussed.
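
    The quoted 1.7 percent follows directly from the two net-heat-input values:

      \frac{244.4\,\mathrm{W} - 240.3\,\mathrm{W}}{244.4\,\mathrm{W}} \times 100\% \approx 1.7\%.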

  13. User's guide to the Residual Gas Analyzer (RGA)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Artman, S.A.

    1988-08-04

    The Residual Gas Analyzer (RGA), a Model 100C UTI quadrupole mass spectrometer, measures the concentrations of selected masses in the Fusion Energy Division's (FED) Advanced Toroidal Facility (ATF). The RGA software is a VAX FORTRAN computer program that controls the experimental apparatus, records the raw data, performs data reduction, and plots the data. The RGA program allows data to be collected from an RGA on ATF or from either of two RGAs in the laboratory. In the laboratory, the RGA diagnostic plays an important role in outgassing studies on various candidate materials for fusion experiments. One such material, graphite, is being used more often in fusion experiments due to its ability to withstand high power loads. One function of the RGA diagnostic is to aid in determining the best grade of graphite to be used in these experiments and to study the procedures used to condition it. A procedure of particular interest involves baking the graphite sample in order to remove impurities that may be present in it. These impurities can be studied in the ATF plasma or while being baked and outgassed in the laboratory. The Residual Gas Analyzer is a quadrupole mass spectrometer capable of scanning masses ranging from 1 atomic mass unit (amu) to 300 amu while under computer control. The procedure for collecting data for a particular mass is outlined.

  14. Use of artificial intelligence in analytical systems for the clinical laboratory

    PubMed Central

    Truchaud, Alain; Ozawa, Kyoichi; Pardue, Harry; Schnipelsky, Paul

    1995-01-01

    The incorporation of information-processing technology into analytical systems in the form of standard computing software has recently been advanced by the introduction of artificial intelligence (AI), both as expert systems and as neural networks. This paper considers the role of software in system operation, control and automation, and attempts to define intelligence. AI is characterized by its ability to deal with incomplete and imprecise information and to accumulate knowledge. Expert systems, building on standard computing techniques, depend heavily on the domain experts and knowledge engineers that have programmed them to represent the real world. Neural networks are intended to emulate the pattern-recognition and parallel-processing capabilities of the human brain and are taught rather than programmed. The future may lie in a combination of the recognition ability of the neural network and the rationalization capability of the expert system. In the second part of the paper, examples are given of applications of AI in stand-alone systems for knowledge engineering and medical diagnosis and in embedded systems for failure detection, image analysis, user interfacing, natural language processing, robotics, and machine learning, as related to clinical laboratories. It is concluded that AI constitutes a collective form of intellectual property, and that there is a need for better documentation, evaluation, and regulation of the systems already being used in clinical laboratories. PMID:18924784

  15. Effects of Combined Hands-on Laboratory and Computer Modeling on Student Learning of Gas Laws: A Quasi-Experimental Study

    ERIC Educational Resources Information Center

    Liu, Xiufeng

    2006-01-01

    Based on current theories of chemistry learning, this study intends to test a hypothesis that computer modeling enhanced hands-on chemistry laboratories are more effective than hands-on laboratories or computer modeling laboratories alone in facilitating high school students' understanding of chemistry concepts. Thirty-three high school chemistry…

  16. Use of Computed X-ray Tomographic Data for Analyzing the Thermodynamics of a Dissociating Porous Sand/Hydrate Mixture

    DOE R&D Accomplishments Database

    Freifeld, Barry M.; Kneafsey, Timothy J.; Tomutsa, Liviu; Stern, Laura A.; Kirby, Stephen H.

    2002-02-28

    X-ray computed tomography (CT) is a method that has been used extensively in laboratory experiments for measuring rock properties and fluid transport behavior. More recently, CT scanning has been applied successfully to detect the presence and study the behavior of naturally occurring hydrates. In this study, we used a modified medical CT scanner to image and analyze the progression of a dissociation front in a synthetic methane hydrate/sand mixture. The sample was initially scanned under conditions at which the hydrate is stable (atmospheric pressure and liquid nitrogen temperature, 77 K). The end of the sample holder was then exposed to the ambient air, and the core was continuously scanned as dissociation occurred in response to the rising temperature. CT imaging captured the advancing dissociation front clearly and accurately, and the evolved gas volume was monitored as a function of time. The advancing hydrate dissociation front, as measured by CT, was modeled as a thermal conduction problem explicitly incorporating the enthalpy of dissociation, using the Stefan moving-boundary-value approach. The assumptions needed to perform the analysis consisted of temperatures at the model boundaries. The estimated value of 2.6 W/m K for the thermal conductivity of the remaining water ice/sand mixture is higher than expected based on conduction alone; this high value may represent a lumped parameter that incorporates the processes of heat conduction, methane gas convection, and any kinetic effects that occur during dissociation. The technique presented here has broad implications for future laboratory and field testing that incorporates geophysical techniques to monitor gas hydrate dissociation.
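
    For reference, the Stefan moving-boundary formulation couples heat conduction to front motion through an energy balance at the interface. In one dimension, in standard form and our notation (sign conventions depend on geometry):

      \rho\,L\,\frac{ds}{dt} = \left[\,k\,\frac{\partial T}{\partial x}\,\right]_{x=s(t)^-}^{x=s(t)^+},
      \qquad T(s(t),t) = T_d ,

    i.e., the jump in conductive heat flux across the front x = s(t) supplies the enthalpy of dissociation L per unit mass, and the front sits at the dissociation temperature T_d.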

  17. Computational modeling of drug-resistant bacteria. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MacDougall, Preston

    2015-03-12

    Initial proposal summary: The evolution of antibiotic-resistant mutants among bacteria (superbugs) is a persistent and growing threat to public health. In many ways, we are engaged in a war with these microorganisms, where the corresponding arms race involves chemical weapons and biological targets. Just as advances in microelectronics, imaging technology and feature recognition software have turned conventional munitions into smart bombs, the long-term objectives of this proposal are to develop highly effective antibiotics using next-generation biomolecular modeling capabilities in tandem with novel subatomic feature detection software. Using model compounds and targets, our design methodology will be validated with correspondingly ultra-high resolution structure-determination methods at premier DOE facilities (single-crystal X-ray diffraction at Argonne National Laboratory, and neutron diffraction at Oak Ridge National Laboratory). The objectives and accomplishments are summarized.

  18. 2008 ALCF annual report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Drugan, C.

    2009-12-07

    The word 'breakthrough' aptly describes the transformational science and milestones achieved at the Argonne Leadership Computing Facility (ALCF) throughout 2008. The number of research endeavors undertaken at the ALCF through the U.S. Department of Energy's (DOE) Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program grew from 9 in 2007 to 20 in 2008. The allocation of computer time awarded to researchers on the Blue Gene/P also spiked significantly - from nearly 10 million processor hours in 2007 to 111 million in 2008. To support this research, we expanded the capabilities of Intrepid, an IBM Blue Gene/P system at the ALCF, to 557 teraflops (TF) for production use. Furthermore, we enabled breakthrough levels of productivity and capability in visualization and data analysis with Eureka, a powerful installation of NVIDIA Quadro Plex S4 external graphics processing units. Eureka delivered a quantum leap in visual compute density, providing more than 111 TF and more than 3.2 terabytes of RAM. On April 21, 2008, the dedication of the ALCF realized DOE's vision to bring the power of the Department's high performance computing to open scientific research. In June, the IBM Blue Gene/P supercomputer at the ALCF debuted as the world's fastest for open science and third fastest overall. There is no question that the science benefited from this growth and system improvement. Four research projects spearheaded by Argonne National Laboratory computer scientists and ALCF users were named to the list of top ten scientific accomplishments supported by DOE's Advanced Scientific Computing Research (ASCR) program. Three of the top ten projects used extensive grants of computing time on the ALCF's Blue Gene/P to model the molecular basis of Parkinson's disease, design proteins at atomic scale, and create enzymes. As the year came to a close, the ALCF was recognized with several prestigious awards at SC08 in November. We provided resources for Linear Scaling Divide-and-Conquer Electronic Structure Calculations for Thousand Atom Nanostructures, a collaborative effort between Argonne, Lawrence Berkeley National Laboratory, and Oak Ridge National Laboratory that received the ACM Gordon Bell Prize Special Award for Algorithmic Innovation. The ALCF also was named a winner in two of the four categories in the HPC Challenge best performance benchmark competition.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schecker, Jay A

    After a prolonged absence, the word 'nuclear' has returned to the lexicon of sustainable domestic energy resources. Due in no small part to its demonstrated reliability, nuclear power is poised to play a greater role in the nation's energy future, producing clean, carbon-neutral electricity and contributing even more to our energy security. To nuclear scientists, the resurgence presents an opportunity to inject new technologies into the industry to maximize the benefits that nuclear energy can provide. 'By developing new options for waste management and exploiting new materials to make key technological advances, we can significantly impact the use of nuclear energy in our future energy mix,' says Chris Stanek, a materials scientist at Los Alamos National Laboratory. Stanek approaches the big technology challenges by thinking way small, all the way down to the atoms. He and his colleagues are using cutting edge atomic-scale simulations to address a difficult aspect of nuclear waste -- predicting its behavior far into the future. Their research is part of a broader, coordinated effort on the part of the Laboratory to use its considerable experimental, theoretical, and computational capabilities to explore advanced materials central to not only waste issues, but to nuclear fuels as well.

  20. Contributions of CCLM to advances in quality control.

    PubMed

    Kazmierczak, Steven C

    2013-01-01

    The discipline of laboratory medicine is relatively young when considered in the context of the history of medicine itself. Quality control, within the context of laboratory medicine, also enjoys a relatively brief but rich history. Laboratory quality control continues to evolve along with advances in automation, measurement techniques and information technology. Clinical Chemistry and Laboratory Medicine (CCLM) has played a key role in helping disseminate information about the proper use and utility of quality control. Publication of important advances in quality control techniques and dissemination of guidelines concerned with laboratory quality control has undoubtedly helped readers of this journal keep up to date on the most recent developments in this field.

  1. Conventional Microscopy vs. Computer Imagery in Chiropractic Education.

    PubMed

    Cunningham, Christine M; Larzelere, Elizabeth D; Arar, Ilija

    2008-01-01

    As human tissue pathology slides become increasingly difficult to obtain, other methods of teaching microscopy in educational laboratories must be considered. The purpose of this study was to evaluate our students' satisfaction with newly implemented computer imagery based laboratory instruction and to obtain input from their perspective on the advantages and disadvantages of computerized vs. traditional microscope laboratories. This undertaking involved the creation of a new computer laboratory. Robbins and Cotran Pathologic Basis of Disease, 7th ed., was chosen as the required text, which gave students access to the Robbins Pathology website, including the complete content of the text, the Interactive Case Study Companion, and the Virtual Microscope. Students had experience with traditional microscopes in their histology and microbiology laboratory courses. Student satisfaction with computer based learning was assessed using a 28-question survey administered to three successive trimesters of pathology students (n=193) using the computer survey website Zoomerang. Answers were given on a scale of 1-5 and statistically analyzed using weighted averages. The survey data indicated that students were satisfied with computer based learning activities during pathology laboratory instruction. The most favorable aspect of computer imagery was 24-7 availability (weighted avg. 4.16), followed by the clarification offered by accompanying text and captions (weighted avg. 4.08). Although advantages and disadvantages exist in using conventional microscopy and computer imagery, current pathology teaching environments warrant investigation of replacing traditional microscope exercises with computer applications. Chiropractic students supported the adoption of computer-assisted instruction in pathology laboratories.

  2. Preface: SciDAC 2008

    NASA Astrophysics Data System (ADS)

    Stevens, Rick

    2008-07-01

    The fourth annual Scientific Discovery through Advanced Computing (SciDAC) Conference was held June 13-18, 2008, in Seattle, Washington. The SciDAC conference series is the premier communitywide venue for presentation of results from the DOE Office of Science's interdisciplinary computational science program. Started in 2001 and renewed in 2006, the DOE SciDAC program is the country's - and arguably the world's - most significant interdisciplinary research program supporting the development of advanced scientific computing methods and their application to fundamental and applied areas of science. SciDAC supports computational science across many disciplines, including astrophysics, biology, chemistry, fusion sciences, and nuclear physics. Moreover, the program actively encourages the creation of long-term partnerships among scientists focused on challenging problems and computer scientists and applied mathematicians developing the technology and tools needed to address those problems. The SciDAC program has played an increasingly important role in scientific research by allowing scientists to create more accurate models of complex processes, simulate problems once thought to be impossible, and analyze the growing amount of data generated by experiments. To help further the research community's ability to tap into the capabilities of current and future supercomputers, Under Secretary for Science, Raymond Orbach, launched the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program in 2003. The INCITE program was conceived specifically to seek out computationally intensive, large-scale research projects with the potential to significantly advance key areas in science and engineering. The program encourages proposals from universities, other research institutions, and industry. During the first two years of the INCITE program, 10 percent of the resources at NERSC were allocated to INCITE awardees. However, demand for supercomputing resources far exceeded available systems; and in 2003, the Office of Science identified increasing computing capability by a factor of 100 as the second priority on its Facilities of the Future list. The goal was to establish leadership-class computing resources to support open science. As a result of a peer reviewed competition, the first leadership computing facility was established at Oak Ridge National Laboratory in 2004. A second leadership computing facility was established at Argonne National Laboratory in 2006. This expansion of computational resources led to a corresponding expansion of the INCITE program. In 2008, Argonne, Lawrence Berkeley, Oak Ridge, and Pacific Northwest national laboratories all provided resources for INCITE. By awarding large blocks of computer time on the DOE leadership computing facilities, the INCITE program enables the largest-scale computations to be pursued. In 2009, INCITE will award over half a billion node-hours of time. The SciDAC conference celebrates progress in advancing science through large-scale modeling and simulation. Over 350 participants attended this year's talks, poster sessions, and tutorials, spanning the disciplines supported by DOE. While the principal focus was on SciDAC accomplishments, this year's conference also included invited presentations and posters from DOE INCITE awardees. 
Another new feature in the SciDAC conference series was an electronic theater and video poster session, which provided an opportunity for the community to see over 50 scientific visualizations in a venue equipped with many high-resolution large-format displays. To highlight the growing international interest in petascale computing, this year's SciDAC conference included a keynote presentation by Herman Lederer from the Max Planck Institut, one of the leaders of the DEISA (Distributed European Infrastructure for Supercomputing Applications) project and a member of the PRACE consortium, Europe's main petascale project. We also heard excellent talks from several European groups, including Laurent Gicquel of CERFACS, who spoke on 'Large-Eddy Simulations of Turbulent Reacting Flows of Real Burners: Status and Challenges', and Jean-Francois Hamelin from EDF, who presented a talk on 'Getting Ready for Petaflop Capacities and Beyond: A Utility Perspective'. Two other compelling addresses gave attendees a glimpse into the future. Tomas Diaz de la Rubia of Lawrence Livermore National Laboratory spoke on a vision for a fusion/fission hybrid reactor known as the 'LIFE Engine' and discussed some of the materials and modeling challenges that need to be overcome to realize the vision for a 1000-year greenhouse-gas-free power source. Dan Reed from Microsoft gave a capstone talk on the convergence of technology, architecture, and infrastructure for cloud computing, data-intensive computing, and exascale computing (10^18 flops/sec). High-performance computing is making rapid strides. The SciDAC community's computational resources are expanding dramatically. In the summer of 2008 the first general purpose petascale system (IBM Cell-based RoadRunner at Los Alamos National Laboratory) was recognized in the Top 500 list of fastest machines, heralding the dawn of the petascale era. The DOE's leadership computing facility at Argonne reached number three on the Top 500 and is at the moment the most capable open science machine, based on an IBM BG/P system with a peak performance of over 550 teraflops/sec. Later this year Oak Ridge is expected to deploy a 1 petaflops/sec Cray XT system. And even before the scientific community has had an opportunity to make significant use of petascale systems, the computer science research community is forging ahead with ideas and strategies for development of systems that may by the end of the next decade sustain exascale performance. Several talks addressed barriers to, and strategies for, achieving exascale capabilities. The last day of the conference was devoted to tutorials hosted by Microsoft Research at a new conference facility in Redmond, Washington. Over 90 people attended the tutorials, which covered topics ranging from an introduction to BG/P programming to advanced numerical libraries. The SciDAC and INCITE programs and the DOE Office of Advanced Scientific Computing Research core program investments in applied mathematics, computer science, and computational and networking facilities provide a nearly optimum framework for advancing computational science for DOE's Office of Science. At a broader level this framework also is benefiting the entire American scientific enterprise. As we look forward, it is clear that computational approaches will play an increasingly significant role in addressing challenging problems in basic science, energy, and environmental research. 
It takes many people to organize and support the SciDAC conference, and I would like to thank as many of them as possible. The backbone of the conference is the technical program; and the task of selecting, vetting, and recruiting speakers is the job of the organizing committee. I thank the members of this committee for all the hard work and the many tens of conference calls that enabled a wonderful program to be assembled. This year the following people served on the organizing committee: Jim Ahrens, LANL; David Bader, LLNL; Bryan Barnett, Microsoft; Peter Beckman, ANL; Vincent Chan, GA; Jackie Chen, SNL; Lori Diachin, LLNL; Dan Fay, Microsoft; Ian Foster, ANL; Mark Gordon, Ames; Mohammad Khaleel, PNNL; David Keyes, Columbia University; Bob Lucas, University of Southern California; Tony Mezzacappa, ORNL; Jeff Nichols, ORNL; David Nowak, ANL; Michael Papka, ANL; Thomas Schulthess, ORNL; Horst Simon, LBNL; David Skinner, LBNL; Panagiotis Spentzouris, Fermilab; Bob Sugar, UCSB; and Kathy Yelick, LBNL. I owe a special thanks to Mike Papka and Jim Ahrens for handling the electronic theater. I also thank all those who submitted videos. It was a highly successful experiment. Behind the scenes an enormous amount of work is required to make a large conference go smoothly. First I thank Cheryl Zidel for her tireless efforts as organizing committee liaison and posters chair and, in general, handling all of my end of the program and keeping me calm. I also thank Gail Pieper for her work in editing the proceedings, Beth Cerny Patino for her work on the Organizing Committee website and electronic theater, and Ken Raffenetti for his work in keeping that website working. Jon Bashor and John Hules did an excellent job in handling conference communications. I thank Caitlin Youngquist for the striking graphic design; Dan Fay for tutorials arrangements; and Lynn Dory, Suzanne Stevenson, Sarah Pebelske and Sarah Zidel for on-site registration and conference support. We all owe Yeen Mankin an extra-special thanks for choosing the hotel, handling contracts, arranging menus, securing venues, and reassuring the chair that everything was under control. We are pleased to have obtained corporate sponsorship from Cray, IBM, Intel, HP, and SiCortex. I thank all the speakers and panel presenters. I also thank the former conference chairs Tony Mezzacappa, Bill Tang, and David Keyes, who were never far away for advice and encouragement. Finally, I offer my thanks to Michael Strayer, without whose leadership, vision, and persistence the SciDAC program would not have come into being and flourished. I am honored to be part of his program and his friend. Rick Stevens Seattle, Washington July 18, 2008

  3. Distribution of man-machine controls in space teleoperation

    NASA Technical Reports Server (NTRS)

    Bejczy, A. K.

    1982-01-01

    The distribution of control between man and machine is dependent on the tasks, available technology, human performance characteristics and control goals. This dependency has very specific projections on systems designed for teleoperation in space. This paper gives a brief outline of the space-related issues and presents the results of advanced teleoperator research and development at the Jet Propulsion Laboratory (JPL). The research and development work includes smart sensors, flexible computer controls and intelligent man-machine interface devices in the area of visual displays and kinesthetic man-machine coupling in remote control of manipulators. Some of the development results have been tested at the Johnson Space Center (JSC) using the simulated full-scale Shuttle Remote Manipulator System (RMS). The research and development work for advanced space teleoperation is far from complete and poses many interdisciplinary challenges.

  4. Simulator platform for fast reactor operation and safety technology demonstration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vilim, R. B.; Park, Y. S.; Grandy, C.

    2012-07-30

    A simulator platform for visualization and demonstration of innovative concepts in fast reactor technology is described. The objective is to make more accessible the workings of fast reactor technology innovations and to do so in a human factors environment that uses state-of-the-art visualization technologies. In this work the computer codes in use at Argonne National Laboratory (ANL) for the design of fast reactor systems are being integrated to run on this platform. This includes linking reactor systems codes with mechanical structures codes and using advanced graphics to depict the thermo-hydraulic-structure interactions that give rise to an inherently safe response to upsets. It also includes visualization of mechanical systems operation including advanced concepts that make use of robotics for operations, in-service inspection, and maintenance.

  5. Integration of a Communicating Science Module into an Advanced Chemistry Laboratory Course

    ERIC Educational Resources Information Center

    Renaud, Jessica; Squier, Christopher; Larsen, Sarah C.

    2006-01-01

    A communicating science module was introduced into an advanced undergraduate physical chemistry laboratory course. The module was integrated into the course such that students received formal instruction in communicating science interwoven with the chemistry laboratory curriculum. The content of the communicating science module included three…

  6. A Multistep Synthesis for an Advanced Undergraduate Organic Chemistry Laboratory

    ERIC Educational Resources Information Center

    Chang Ji; Peters, Dennis G.

    2006-01-01

    Multistep syntheses are often important components of the undergraduate organic laboratory experience and a three-step synthesis of 5-(2-sulfhydrylethyl) salicylaldehyde was described. The experiment is useful as a special project for an advanced undergraduate organic chemistry laboratory course and offers opportunities for students to master a…

  7. Results and Analysis of the Infrastructure Request for Information (DE-SOL-0008318)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heidrich, Brenden John

    2015-07-01

    The Department of Energy (DOE) Office of Nuclear Energy (NE) released a request for information (RFI) (DE-SOL-0008318) for “University, National Laboratory, Industry and International Input on Potential Office of Nuclear Energy Infrastructure Investments” on April 13, 2015. DOE-NE solicited information on five specific types of capabilities as well as any others suggested by the community. The RFI proposal period closed on June 19, 2015. From the 26 responses, 34 individual proposals were extracted. Eighteen were associated with a DOE national laboratory, including Argonne National Laboratory (ANL), Brookhaven National Laboratory (BNL), Idaho National Laboratory (INL), Los Alamos National Laboratory (LANL), Pacific Northwest National Laboratory (PNNL) and Sandia National Laboratory (SNL). Oak Ridge National Laboratory (ORNL) was referenced in a proposal as a proposed capability location, although the proposal did not originate with ORNL. Five US universities submitted proposals (Massachusetts Institute of Technology, Pennsylvania State University, Rensselaer Polytechnic Institute, University of Houston and the University of Michigan). Three industrial/commercial institutions submitted proposals (AREVA NP, Babcock and Wilcox (B&W) and the Electric Power Research Institute (EPRI)). Eight major themes emerged from the submissions as areas needing additional capability or support for existing capabilities. Two submissions supported multiple areas. The major themes are: Advanced Manufacturing (AM), High Performance Computing (HPC), Ion Irradiation with X-Ray Diagnostics (IIX), Ion Irradiation with TEM Visualization (IIT), Radiochemistry Laboratories (RCL), Test Reactors, Neutron Sources and Critical Facilities (RX), Sample Preparation and Post-Irradiation Examination (PIE) and Thermal-Hydraulics Test Facilities (THF).

  8. Using a Cloud Computing System to Reduce Door-to-Balloon Time in Acute ST-Elevation Myocardial Infarction Patients Transferred for Percutaneous Coronary Intervention.

    PubMed

    Ho, Chi-Kung; Chen, Fu-Cheng; Chen, Yung-Lung; Wang, Hui-Ting; Lee, Chien-Ho; Chung, Wen-Jung; Lin, Cheng-Jui; Hsueh, Shu-Kai; Hung, Shin-Chiang; Wu, Kuan-Han; Liu, Chu-Feng; Kung, Chia-Te; Cheng, Cheng-I

    2017-01-01

    This study evaluated the impact on clinical outcomes of using a cloud computing system to reduce percutaneous coronary intervention hospital door-to-balloon (DTB) time for ST segment elevation myocardial infarction (STEMI). A total of 369 patients before and after implementation of the transfer protocol were enrolled. Of these patients, 262 were transferred through the protocol while the other 107 were transferred through the traditional referral process. There were no significant differences in DTB time, pain to door of STEMI receiving center arrival time, and pain to balloon time between the two groups. Pain to electrocardiography time in patients with Killip I/II and catheterization laboratory to balloon time in patients with Killip III/IV were significantly reduced in the protocol group compared with the traditional referral group (both p < 0.05). There were also no remarkable differences in complication rate and 30-day mortality between the two groups. The multivariate analysis revealed that the independent predictors of 30-day mortality were older age, advanced Killip score, and higher troponin-I level. The present protocol thus reduced pain to electrocardiography time in Killip I/II patients and catheterization laboratory to balloon time in Killip III/IV patients. However, using a cloud computing system in our present protocol did not reduce DTB time.

  9. The Center for Nanophase Materials Sciences

    NASA Astrophysics Data System (ADS)

    Lowndes, Douglas

    2005-03-01

    The Center for Nanophase Materials Sciences (CNMS) located at Oak Ridge National Laboratory (ORNL) will be the first DOE Nanoscale Science Research Center to begin operation, with construction to be completed in April 2005 and initial operations in October 2005. The CNMS' scientific program has been developed through workshops with the national community, with the goal of creating a highly collaborative research environment to accelerate discovery and drive technological advances. Research at the CNMS is organized under seven Scientific Themes selected to address challenges to understanding and to exploit particular ORNL strengths (see http://cnms.ornl.gov). These include extensive synthesis and characterization capabilities for soft, hard, nanostructured, magnetic and catalytic materials and their composites; neutron scattering at the Spallation Neutron Source and High Flux Isotope Reactor; computational nanoscience in the CNMS' Nanomaterials Theory Institute and utilizing facilities and expertise of the Center for Computational Sciences and the new Leadership Scientific Computing Facility at ORNL; a new CNMS Nanofabrication Research Laboratory; and a suite of unique and state-of-the-art instruments to be made reliably available to the national community for imaging, manipulation, and properties measurements on nanoscale materials in controlled environments. The new research facilities will be described together with the planned operation of the user research program, the latter illustrated by the current "jump start" user program that utilizes existing ORNL/CNMS facilities.

  10. PGOPHER in the Classroom and the Laboratory

    NASA Astrophysics Data System (ADS)

    Western, Colin

    2015-06-01

    PGOPHER is a general purpose program for simulating and fitting rotational, vibrational and electronic spectra. As it uses a graphical user interface, the basic operation is sufficiently straightforward to make it suitable for use in undergraduate practicals and computer based classes. This talk will present two experiments that have been in regular use by Bristol undergraduates for some years, based on the analysis of infra-red spectra of cigarette smoke and, for more advanced students, visible and near ultra-violet spectra of a nitrogen discharge and a hydrocarbon flame. For all of these the rotational structure is analysed and used to explore ideas of bonding. The talk will discuss the requirements for the apparatus and the support required. Ideas for other possible experiments and computer based exercises will also be presented, including a group exercise. The PGOPHER program is open source, and is available for Microsoft Windows, Apple Mac and Linux. It can be freely downloaded from the supporting website http://pgopher.chm.bris.ac.uk. The program does not require any installation process, so it can be run on students' own machines or easily set up on classroom or laboratory computers. PGOPHER, a Program for Simulating Rotational, Vibrational and Electronic Structure, C. M. Western, University of Bristol, http://pgopher.chm.bris.ac.uk PGOPHER version 8.0, C M Western, 2014, University of Bristol Research Data Repository, doi:10.5523/bris.huflggvpcuc1zvliqed497r2
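
    For a flavor of the rotational analysis such exercises involve, here is a minimal rigid-rotor line list of the kind PGOPHER fits far more generally; the rotational constant and band origin are rough assumed values (loosely CO-like), not output from the program.

        # Rigid-rotor P/R-branch line positions with Boltzmann weights.
        # B and nu0 are illustrative assumptions, not fitted constants.
        import math

        B, nu0 = 1.93, 2143.0   # cm^-1 (roughly CO-like, assumed)
        kT = 207.0              # kT at about 298 K, in cm^-1

        for J in range(8):
            weight = (2 * J + 1) * math.exp(-B * J * (J + 1) / kT)  # lower-state population
            print(f"R({J}): {nu0 + 2 * B * (J + 1):8.2f} cm^-1  weight {weight:5.2f}")
            if J > 0:
                print(f"P({J}): {nu0 - 2 * B * J:8.2f} cm^-1  weight {weight:5.2f}")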

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Jun

    Our group has been working with ANL collaborators on bridging the gap between parallel file systems and local file systems during this project period. We visited Argonne National Lab -- Dr. Robert Ross's group -- for one week in summer 2007, reviewed our project progress, and planned activities for the coming years 2008-09. The PI met Dr. Robert Ross several times, including at the HEC FSIO workshop 2008, SC08, and SC10. We explored opportunities to develop a production system by leveraging our current prototype (SOGP+PVFS) into a new PVFS version, and we delivered the SOGP+PVFS codes to the ANL PVFS2 group in 2008. We also discussed a potential project on developing new parallel programming models and runtime systems for data-intensive scalable computing (DISC); the methodology is to evolve MPI towards DISC by incorporating some functions of the Google MapReduce parallel programming model. More recently, we have together been exploring how to leverage existing work to perform (1) coordination/aggregation of local I/O operations prior to movement over the WAN, (2) efficient bulk data movement over the WAN, and (3) latency-hiding techniques for latency-intensive operations. Since 2009, we have been applying Hadoop/MapReduce to some HEC applications with LANL scientists John Bent and Salman Habib. Another ongoing effort is improving checkpoint performance at the I/O forwarding layer for the RoadRunner supercomputer with James Nunez and Gary Grider at LANL. Two senior undergraduates from our research group have held summer internships on high-performance file and storage system projects at LANL for three consecutive years since 2008. Both are now pursuing Ph.D. degrees in our group, will be in their fourth year of the Ph.D. program in Fall 2011, and will go to LANL during the winter break to advance the two above-mentioned efforts. Since 2009, we have been collaborating with several computer scientists (Gary Grider, John Bent, Parks Fields, James Nunez, Hsing-Bung Chen, etc.) from HPC5 and with James Ahrens from the Advanced Computing Laboratory at Los Alamos National Laboratory. We hold weekly conference and/or video meetings to advance work on two fronts: the hardware/software infrastructure for building large-scale data-intensive clusters, and research publications. Our group members assist in constructing several on-site LANL data-intensive clusters, and the two parties have been developing software and research papers together using both sides' resources.

  12. BioVeL: a virtual laboratory for data analysis and modelling in biodiversity science and ecology.

    PubMed

    Hardisty, Alex R; Bacall, Finn; Beard, Niall; Balcázar-Vargas, Maria-Paula; Balech, Bachir; Barcza, Zoltán; Bourlat, Sarah J; De Giovanni, Renato; de Jong, Yde; De Leo, Francesca; Dobor, Laura; Donvito, Giacinto; Fellows, Donal; Guerra, Antonio Fernandez; Ferreira, Nuno; Fetyukova, Yuliya; Fosso, Bruno; Giddy, Jonathan; Goble, Carole; Güntsch, Anton; Haines, Robert; Ernst, Vera Hernández; Hettling, Hannes; Hidy, Dóra; Horváth, Ferenc; Ittzés, Dóra; Ittzés, Péter; Jones, Andrew; Kottmann, Renzo; Kulawik, Robert; Leidenberger, Sonja; Lyytikäinen-Saarenmaa, Päivi; Mathew, Cherian; Morrison, Norman; Nenadic, Aleksandra; de la Hidalga, Abraham Nieva; Obst, Matthias; Oostermeijer, Gerard; Paymal, Elisabeth; Pesole, Graziano; Pinto, Salvatore; Poigné, Axel; Fernandez, Francisco Quevedo; Santamaria, Monica; Saarenmaa, Hannu; Sipos, Gergely; Sylla, Karl-Heinz; Tähtinen, Marko; Vicario, Saverio; Vos, Rutger Aldo; Williams, Alan R; Yilmaz, Pelin

    2016-10-20

    Making forecasts about biodiversity and giving support to policy relies increasingly on large collections of data held electronically, and on substantial computational capability and capacity to analyse, model, simulate and predict using such data. However, the physically distributed nature of data resources and of expertise in advanced analytical tools creates many challenges for the modern scientist. Across the wider biological sciences, presenting such capabilities on the Internet (as "Web services") and using scientific workflow systems to compose them for particular tasks is a practical way to carry out robust "in silico" science. However, use of this approach in biodiversity science and ecology has thus far been quite limited. BioVeL is a virtual laboratory for data analysis and modelling in biodiversity science and ecology, freely accessible via the Internet. BioVeL includes functions for accessing and analysing data through curated Web services; for performing complex in silico analysis through exposure of R programs, workflows, and batch processing functions; for on-line collaboration through sharing of workflows and workflow runs; for experiment documentation through reproducibility and repeatability; and for computational support via seamless connections to supporting computing infrastructures. We developed and improved more than 60 Web services with significant potential in many different kinds of data analysis and modelling tasks. We composed reusable workflows using these Web services, also incorporating R programs. Deploying these tools into an easy-to-use and accessible 'virtual laboratory', free via the Internet, we applied the workflows in several diverse case studies. We opened the virtual laboratory for public use and through a programme of external engagement we actively encouraged scientists and third party application and tool developers to try out the services and contribute to the activity. Our work shows we can deliver an operational, scalable and flexible Internet-based virtual laboratory to meet new demands for data processing and analysis in biodiversity science and ecology. In particular, we have successfully integrated existing and popular tools and practices from different scientific disciplines to be used in biodiversity and ecological research.
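
    At its core, the pattern described above, composing curated Web services over HTTP, reduces to calls like the sketch below. The endpoint URL and JSON fields are hypothetical placeholders, not the actual BioVeL service interface.

        # Sketch of invoking a biodiversity Web service; the URL and payload
        # fields are hypothetical, for illustration of the pattern only.
        import requests

        payload = {"species": "Puma concolor", "model": "ENM", "resolution": "10km"}
        resp = requests.post("https://example.org/biovel/niche-model",  # hypothetical
                             json=payload, timeout=60)
        resp.raise_for_status()
        print(resp.json().get("job_id"))  # workflow engines poll such job ids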

  13. Quantifying the debonding of inclusions through tomography and computational homology.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Wei-Yang; Johnson, George C.; Mota, Alejandro

    2010-09-01

    This report describes a Laboratory Directed Research and Development (LDRD) project to use synchrotron-radiation computed tomography (SRCT) data to determine the conditions and mechanisms that lead to void nucleation in rolled alloys. The Advanced Light Source (ALS) at Lawrence Berkeley National Laboratory (LBNL) has provided SRCT data of a few specimens of 7075-T7351 aluminum plate (widely used for aerospace applications) stretched to failure, loaded in directions perpendicular and parallel to the rolling direction. The resolution of the SRCT data is 900 nm, which allows elucidation of the mechanisms governing void growth and coalescence. This resolution is not fine enough, however, for nucleation. We propose the use of statistics and image processing techniques to obtain sub-resolution scale information from these data, and thus determine where in the specimen and when during the loading program nucleation occurs and the mechanisms that lead to it. Quantitative analysis of the tomography data, however, leads to the conclusion that the reconstruction process compromises the information obtained from the scans. Alternate, more powerful reconstruction algorithms are needed to address this problem, but those fall beyond the scope of this project.
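
    As a much-simplified stand-in for the computational-homology step named in the title, the sketch below thresholds a (here synthetic) 3-D volume and counts connected void regions, i.e., the zeroth Betti number of the void phase; the array and cutoff are assumptions, not values from the SRCT data.

        # Count connected void regions in a thresholded 3-D volume; a crude
        # proxy for the homology computation. Data and cutoff are synthetic.
        import numpy as np
        from scipy import ndimage

        rng = np.random.default_rng(0)
        volume = rng.normal(1.0, 0.2, size=(64, 64, 64))  # synthetic attenuation
        voids = volume < 0.55                             # assumed void cutoff

        labels, n_voids = ndimage.label(voids)
        sizes = ndimage.sum(voids, labels, range(1, n_voids + 1))
        largest = int(sizes.max()) if n_voids else 0
        print(f"{n_voids} candidate voids; largest = {largest} voxels")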

  14. A Software Laboratory Environment for Computer-Based Problem Solving.

    ERIC Educational Resources Information Center

    Kurtz, Barry L.; O'Neal, Micheal B.

    This paper describes a National Science Foundation-sponsored project at Louisiana Technological University to develop computer-based laboratories for "hands-on" introductions to major topics of computer science. The underlying strategy is to develop structured laboratory environments that present abstract concepts through the use of…

  15. Telerobotic control of the seven-degree-of-freedom CESAR manipulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Babcock, S.M.; Dubey, R.V.; Euler, J.A.

    1988-01-01

    The application of a computationally efficient kinematic control scheme for manipulators with redundant degrees of freedom to the unilateral telerobotic control of the seven-degree-of-freedom manipulator (CESARM) at the Oak Ridge National Laboratory Center for Engineering Systems Advanced Research is presented. The kinematic control scheme uses a gradient projection optimization method, which eliminates the need to determine the generalized inverse of the Jacobian when solving for joint velocities, given Cartesian end-effector velocities. A six-degree-of-freedom (nonreplica) master controller is used. Performance indices for redundancy resolution are discussed. 5 ref., 6 figs.
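
    For orientation, the sketch below shows the textbook gradient-projection step for a 7-DOF arm in its plain pseudoinverse form, q_dot = J+ x_dot + k (I - J+ J) grad_H. Note that the ORNL scheme cited above is a more efficient variant that specifically avoids forming the generalized inverse, so this is only a reference formulation with a random placeholder Jacobian.

        # Reference gradient-projection redundancy resolution (pseudoinverse
        # form). The Jacobian here is a random placeholder, not CESARM's.
        import numpy as np

        def redundancy_step(J, x_dot, grad_H, k=1.0):
            J_pinv = np.linalg.pinv(J)                   # 7x6 generalized inverse
            null_proj = np.eye(J.shape[1]) - J_pinv @ J  # null-space projector
            return J_pinv @ x_dot + k * null_proj @ grad_H

        rng = np.random.default_rng(1)
        J = rng.normal(size=(6, 7))                      # placeholder 6x7 Jacobian
        q_dot = redundancy_step(J, np.ones(6), rng.normal(size=7))
        print(np.round(J @ q_dot, 6))                    # recovers the commanded twist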

  16. Two Cases of Lethal Complications Following Ultrasound-Guided Percutaneous Fine-Needle Biopsy of the Liver

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Drinkovic, Ivan; Brkljacic, Boris

    1996-09-15

    Two cases with lethal complications are reported among 1750 ultrasound (US)-guided percutaneous fine-needle liver biopsies performed in our department. The first patient had angiosarcoma of the liver which was not suspected after computed tomography (CT) and US studies had been performed. The other patient had hepatocellular carcinoma in advanced hepatic cirrhosis. Death was due to bleeding in both cases. Pre-procedure laboratory tests did not reveal the existence of major bleeding disorders in either case. Normal liver tissue was interposed in the needle track between the liver capsule and the lesions which were targeted.

  17. Astronomy and astrophysics for the 1980's, volume 1

    NASA Technical Reports Server (NTRS)

    1982-01-01

    The programs recommended address the most significant questions that confront contemporary astronomy and fall into three general categories: prerequisites for research initiatives, including instrumentation and detectors, theory and data analysis, computational facilities, laboratory astrophysics, and technical support at ground-based observatories; programs including an Advanced X-ray Astrophysics Facility, a Very-Long Baseline Array, a Technology Telescope and a Large Deployable Reflector; and programs for study and development, including X-ray observatories in space, instruments for the detection of gravitational waves from astronomical objects, and long duration spaceflights of infrared telescopes. Estimated costs of these programs are provided.

  18. Be a Mentor and Experience the Excitement of Rediscovery | Poster

    Cancer.gov

    You don’t really know something until you can teach it to someone. Raul Cachau said he believes this is not only true in academia, but in research laboratories as well. He said that being a mentor means rediscovering things long taken for granted. “It really forces you to rethink some of the things you do,” said Cachau, Ph.D., principal scientist, Advanced Biomedical Computing Center (ABCC). “It brings focus to many of the things that happen on a daily basis … There’s a positive impact to taking a fresh look at something.”

  19. Spacelab user implementation assessment study (software requirements analysis). Volume 1: Executive study

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The primary objective of this study was to develop an integrated approach for the development, implementation, and utilization of all software that is required to efficiently and cost-effectively support advanced technology laboratory flight and ground operations. It was recognized that certain aspects of the operations would be mandatory computerized services; computerization of other aspects would be optional. Thus, the analyses encompassed not only alternate computer utilization and implementations but trade studies of the programmatic effects of non-computerized versus computerized approaches to the operations. A general overview of the study is presented.

  1. A Distributed Laboratory for Event-Driven Coastal Prediction and Hazard Planning

    NASA Astrophysics Data System (ADS)

    Bogden, P.; Allen, G.; MacLaren, J.; Creager, G. J.; Flournoy, L.; Sheng, Y. P.; Graber, H.; Graves, S.; Conover, H.; Luettich, R.; Perrie, W.; Ramakrishnan, L.; Reed, D. A.; Wang, H. V.

    2006-12-01

    The 2005 Atlantic hurricane season was the most active in recorded history. Collectively, 2005 hurricanes caused more than 2,280 deaths and record damages of over 100 billion dollars. Of the storms that made landfall, Dennis, Emily, Katrina, Rita, and Wilma caused most of the destruction. Accurate predictions of storm-driven surge, wave height, and inundation can save lives and help keep recovery costs down, provided the information gets to emergency response managers in time. The information must be available well in advance of landfall so that responders can weigh the costs of unnecessary evacuation against the costs of inadequate preparation. The SURA Coastal Ocean Observing and Prediction (SCOOP) Program is a multi-institution collaboration implementing a modular, distributed service-oriented architecture for real time prediction and visualization of the impacts of extreme atmospheric events. The modular infrastructure enables real-time prediction of multi- scale, multi-model, dynamic, data-driven applications. SURA institutions are working together to create a virtual and distributed laboratory integrating coastal models, simulation data, and observations with computational resources and high speed networks. The loosely coupled architecture allows teams of computer and coastal scientists at multiple institutions to innovate complex system components that are interconnected with relatively stable interfaces. The operational system standardizes at the interface level to enable substantial innovation by complementary communities of coastal and computer scientists. This architectural philosophy solves a long-standing problem associated with the transition from research to operations. The SCOOP Program thereby implements a prototype laboratory consistent with the vision of a national, multi-agency initiative called the Integrated Ocean Observing System (IOOS). Several service- oriented components of the SCOOP enterprise architecture have already been designed and implemented, including data archive and transport services, metadata registry and retrieval (catalog), resource management, and portal interfaces. SCOOP partners are integrating these at the service level and implementing reconfigurable workflows for several kinds of user scenarios, and are working with resource providers to prototype new policies and technologies for on-demand computing.

  2. Meta-analysis of the effectiveness of computer-based laboratory versus traditional hands-on laboratory in college and pre-college science instructions

    NASA Astrophysics Data System (ADS)

    Onuoha, Cajetan O.

    The purpose of this research study was to determine the overall effectiveness of computer-based laboratory compared with the traditional hands-on laboratory for improving students' science academic achievement and attitudes towards science subjects at the college and pre-college levels of education in the United States. Meta-analysis was used to synthesize the findings from 38 primary research studies conducted and/or reported in the United States between 1996 and 2006 that compared the effectiveness of computer-based laboratory with the traditional hands-on laboratory on measures related to science academic achievements and attitudes towards science subjects. The 38 primary research studies, with a total of 3,824 subjects, generated a total of 67 weighted individual effect sizes that were used in this meta-analysis. The study found that computer-based laboratory had a small positive effect size over the traditional hands-on laboratory on measures related to students' science academic achievements (ES = +0.26) and attitudes towards science subjects (ES = +0.22). It was also found that computer-based laboratory produced more significant effects on physical science subjects compared to biological sciences (ES = +0.34 vs. +0.17).
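
    The core computation behind such a synthesis is an inverse-variance weighted mean of study effect sizes; the sketch below shows the fixed-effect version with three invented (effect size, variance) pairs, not data from the 38 studies.

        # Fixed-effect pooling of effect sizes with inverse-variance weights.
        # The (d_i, var_i) pairs are invented for illustration.
        import math

        studies = [(0.31, 0.02), (0.18, 0.05), (0.27, 0.01)]  # assumed values

        w = [1.0 / var for _, var in studies]
        pooled = sum(wi * d for wi, (d, _) in zip(w, studies)) / sum(w)
        se = math.sqrt(1.0 / sum(w))
        lo, hi = pooled - 1.96 * se, pooled + 1.96 * se
        print(f"pooled ES = {pooled:+.2f} (95% CI {lo:+.2f} to {hi:+.2f})")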

  3. Investigations in Computer-Aided Instruction and Computer-Aided Controls. Final Report.

    ERIC Educational Resources Information Center

    Rosenberg, R.C.; And Others

    These research projects, designed to delve into certain relationships between humans and computers, are focused on computer-assisted instruction and on man-computer interaction. One study demonstrates that within the limits of formal engineering theory, a computer simulated laboratory (Dynamic Systems Laboratory) can be built in which freshmen…

  4. Simple and Rapid System for Two-Dimensional Gel Electrophoresis Technique: A Laboratory Exercise for High School Students

    ERIC Educational Resources Information Center

    Maurye, Praveen; Basu, Arpita; Biswas, Jayanta Kumar; Bandyopadhyay, Tapas Kumar; Naskar, Malay

    2018-01-01

    Polyacrylamide gel electrophoresis (PAGE) is the most classical technique favored worldwide for resolution of macromolecules in many biochemistry laboratories, due to its continual advances and wide range of modifications. These ever-growing advancements in basic laboratory equipment have led to the emergence of many expensive, complex, and tricky…

  5. Laboratory Demonstrations for PDE and Metals Combustion at NASA MSFC's Advanced Propulsion Laboratory

    NASA Technical Reports Server (NTRS)

    2000-01-01

    This report provides status on activities under order no. H-30549 for the period December 1 through December 31, 1999. It details contract activities coordinating the planned conduct of experiments at the MSFC Advanced Propulsion Laboratory in pulse detonation MHD power production and metals combustion.

  6. Knowledge management: Role of the Radiation Safety Information Computational Center (RSICC)

    NASA Astrophysics Data System (ADS)

    Valentine, Timothy

    2017-09-01

    The Radiation Safety Information Computational Center (RSICC) at Oak Ridge National Laboratory (ORNL) is an information analysis center that collects, archives, evaluates, synthesizes and distributes information, data and codes that are used in various nuclear technology applications. RSICC retains more than 2,000 software packages that have been provided by code developers from various federal and international agencies. RSICC's customers (scientists, engineers, and students from around the world) obtain access to such computing codes (source and/or executable versions) and processed nuclear data files to promote on-going research, to ensure nuclear and radiological safety, and to advance nuclear technology. The role of such information analysis centers is critical for supporting and sustaining nuclear education and training programs both domestically and internationally, as the majority of RSICC's customers are students attending U.S. universities. Additionally, RSICC operates a secure CLOUD computing system to provide access to sensitive export-controlled modeling and simulation (M&S) tools that support both domestic and international activities. This presentation will provide a general review of RSICC's activities, services, and systems that support knowledge management and education and training in the nuclear field.

  7. A novel concept for smart trepanation.

    PubMed

    Follmann, Axel; Korff, Alexander; Fuertjes, Tobias; Kunze, Sandra C; Schmieder, Kirsten; Radermacher, Klaus

    2012-01-01

    Trepanation of the skull is a common procedure in craniofacial and neurosurgical interventions, allowing access to the innermost cranial structures. Despite careful advancement, injury of the dura mater is a frequent complication during these cranial openings. The technology of computer-assisted surgery offers different support systems such as navigated tools and surgical robots. This article presents a novel technical approach toward an image- and sensor-based synergistic control of the cutting depth of a manually guided soft-tissue-preserving saw. Feasibility studies in a laboratory setup modeling relevant skull tissue parameters demonstrate that errors due to computed tomography or magnetic resonance image segmentation and registration, optical tracking, and mechanical tolerances of up to 2.5 mm, inherent in many computer-assisted surgery systems, can be compensated for by the cutting tool characteristics without damaging the dura. In conclusion, the feasibility of a computer-controlled trepanation system providing a safer and more efficient trepanation has been demonstrated. Injuries of the dura mater can be avoided, and the bone cutting gap can be reduced to 0.5 mm, with potential benefits for the reintegration of the bone flap.

  8. Binary Black Holes, Gravitational Waves, and Numerical Relativity

    NASA Technical Reports Server (NTRS)

    Centrella, Joan

    2007-01-01

    Massive black hole (MBH) binaries are found at the centers of most galaxies. MBH mergers trace galaxy mergers and are strong sources of gravitational waves. Observing these sources with gravitational wave detectors requires that we know the radiation waveforms they emit. Since these mergers take place in regions of very strong gravitational fields, we need to solve Einstein's equations of general relativity on a computer in order to calculate these waveforms. For more than 30 years, scientists have tried to compute these waveforms using the methods of numerical relativity. The resulting computer codes have been plagued by instabilities, causing them to crash well before the black holes in the binary could complete even a single orbit. Recently this situation has changed dramatically, with a series of amazing breakthroughs. This presentation shows how a spacetime is constructed on a computer to build a simulation laboratory for binary black hole mergers. Focus is on the recent advances that reveal these waveforms, and the potential for discoveries that arises when these sources are observed by LIGO and LISA.
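
    Evolving Einstein's equations is far beyond a snippet, but the core idea named above, discretizing field equations on a grid and evolving them to extract a waveform, can be hinted at with a toy 1-D wave equation; everything here is a pedagogical stand-in, not numerical relativity.

        # Toy 1-D wave equation evolved by leapfrog finite differences, as a
        # cartoon of waveform extraction from a gridded field evolution.
        import numpy as np

        nx, nt, c = 200, 400, 1.0
        dx = 1.0 / nx
        dt = 0.5 * dx / c                         # CFL-stable time step
        x = np.linspace(0.0, 1.0, nx)

        u_prev = np.exp(-200.0 * (x - 0.5) ** 2)  # initial Gaussian pulse
        u = u_prev.copy()                         # zero initial velocity
        for _ in range(nt):
            u_next = np.zeros_like(u)             # fixed (zero) boundaries
            u_next[1:-1] = (2 * u[1:-1] - u_prev[1:-1]
                            + (c * dt / dx) ** 2 * (u[2:] - 2 * u[1:-1] + u[:-2]))
            u_prev, u = u, u_next
        print(f"field sampled at x = 0.25: {u[nx // 4]:+.4f}")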

  9. Determination of Absolute Zero Using a Computer-Based Laboratory

    ERIC Educational Resources Information Center

    Amrani, D.

    2007-01-01

    We present a simple computer-based laboratory experiment for evaluating absolute zero in degrees Celsius, which can be performed in college and undergraduate physical sciences laboratory courses. With a computer, the absolute zero apparatus can help demonstrators or students observe the relationship between temperature and pressure and use…
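
    The underlying analysis is a linear fit of pressure against temperature at fixed volume, extrapolated to zero pressure; the sketch below uses made-up readings chosen to land near the accepted value, not data from the apparatus.

        # Extrapolate the linear P(T) relation of a fixed-volume gas to P = 0.
        # The readings below are hypothetical, not measured values.
        import numpy as np

        T_celsius = np.array([0.0, 20.0, 40.0, 60.0, 80.0])
        P_kpa = np.array([101.3, 108.7, 116.2, 123.6, 131.0])

        slope, intercept = np.polyfit(T_celsius, P_kpa, 1)
        print(f"estimated absolute zero: {-intercept / slope:.1f} deg C")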

  10. Final Report: Quantification of Uncertainty in Extreme Scale Computations (QUEST)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marzouk, Youssef; Conrad, Patrick; Bigoni, Daniele

    QUEST (\\url{www.quest-scidac.org}) is a SciDAC Institute that is focused on uncertainty quantification (UQ) in large-scale scientific computations. Our goals are to (1) advance the state of the art in UQ mathematics, algorithms, and software; and (2) provide modeling, algorithmic, and general UQ expertise, together with software tools, to other SciDAC projects, thereby enabling and guiding a broad range of UQ activities in their respective contexts. QUEST is a collaboration among six institutions (Sandia National Laboratories, Los Alamos National Laboratory, the University of Southern California, Massachusetts Institute of Technology, the University of Texas at Austin, and Duke University) with a historymore » of joint UQ research. Our vision encompasses all aspects of UQ in leadership-class computing. This includes the well-founded setup of UQ problems; characterization of the input space given available data/information; local and global sensitivity analysis; adaptive dimensionality and order reduction; forward and inverse propagation of uncertainty; handling of application code failures, missing data, and hardware/software fault tolerance; and model inadequacy, comparison, validation, selection, and averaging. The nature of the UQ problem requires the seamless combination of data, models, and information across this landscape in a manner that provides a self-consistent quantification of requisite uncertainties in predictions from computational models. Accordingly, our UQ methods and tools span an interdisciplinary space across applied math, information theory, and statistics. The MIT QUEST effort centers on statistical inference and methods for surrogate or reduced-order modeling. MIT personnel have been responsible for the development of adaptive sampling methods, methods for approximating computationally intensive models, and software for both forward uncertainty propagation and statistical inverse problems. A key software product of the MIT QUEST effort is the MIT Uncertainty Quantification library, called MUQ (\\url{muq.mit.edu}).« less

  11. In silico assessment of the acute toxicity of chemicals: recent advances and new model for multitasking prediction of toxic effect.

    PubMed

    Kleandrova, Valeria V; Luan, Feng; Speck-Planche, Alejandro; Cordeiro, M Natália D S

    2015-01-01

    The assessment of acute toxicity is one of the most important stages to ensure the safety of chemicals with potential applications in pharmaceutical sciences, biomedical research, or any other industrial branch. A huge and indiscriminate number of toxicity assays have been carried out on laboratory animals. In this sense, computational approaches involving models based on quantitative-structure activity/toxicity relationships (QSAR/QSTR) can help to rationalize time and financial costs. Here, we discuss the most significant advances in the last 6 years focused on the use of QSAR/QSTR models to predict acute toxicity of drugs/chemicals in laboratory animals, employing large and heterogeneous datasets. The advantages and drawbacks of the different QSAR/QSTR models are analyzed. As a contribution to the field, we introduce the first multitasking (mtk) QSTR model for simultaneous prediction of acute toxicity of compounds by considering different routes of administration, diverse breeds of laboratory animals, and the reliability of the experimental conditions. The mtk-QSTR model was based on artificial neural networks (ANN), allowing the classification of compounds as toxic or non-toxic. This model correctly classified more than 94% of the 1646 cases present in the whole dataset, and its applicability was demonstrated by performing predictions of different chemicals such as drugs, dietary supplements, and molecules which could serve as nanocarriers for drug delivery. The predictions given by the mtk-QSTR model are in very good agreement with the experimental results.
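
    Structurally, the mtk-QSTR model described above is an ANN mapping descriptor vectors to a toxic/non-toxic label; the sketch below shows that shape generically with scikit-learn on synthetic descriptors. None of the descriptors, labels, or network topology here are the authors'.

        # Generic ANN classifier over synthetic molecular descriptors; a shape
        # analogy to the mtk-QSTR model, not a reproduction of it.
        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPClassifier

        rng = np.random.default_rng(7)
        X = rng.normal(size=(1646, 12))                # 12 invented descriptors
        y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)  # synthetic toxicity label

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
        clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
        clf.fit(X_tr, y_tr)
        print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")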

  12. University of Rochester, Laboratory for Laser Energetics

    NASA Astrophysics Data System (ADS)

    1987-01-01

    In FY86 the Laboratory produced a list of accomplishments in which it takes pride. LLE has met every laser-fusion program milestone to date in a program of research for direct-drive ultraviolet laser fusion originally formulated in 1981. LLE scientists authored or co-authored 135 scientific papers during 1985 to 1986. The collaborative experiments with NRL, LANL, and LLNL have led to a number of important ICF results. The cryogenic target system developed by KMS Fusion for LLE will be used in future high-density experiments on OMEGA to demonstrate the compression of thermonuclear fuel to 100 to 200 times that of solid (20 to 40 g/cm³) in a test of the direct-drive concept, as noted in the National Academy of Sciences' report. The excellence of the advanced technology efforts at LLE is illustrated by the establishment of the Ultrafast Science Center by the Department of Defense through the Air Force Office of Scientific Research. Research in the Center will concentrate on bridging the gap between high-speed electronics and ultrafast optics by providing education, research, and development in areas critical to future communications and high-speed computer systems. The Laboratory for Laser Energetics continues its pioneering work on the interaction of intense radiation with matter. This includes inertial-fusion and advanced optical and optical electronics research; training people in the technology and applications of high-power, short-pulse lasers; and interacting with the scientific community, business, industry, and government to promote the growth of laser technology.

  13. 75 FR 43518 - Advanced Scientific Computing Advisory Committee; Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-26

    ... DEPARTMENT OF ENERGY Advanced Scientific Computing Advisory Committee; Meeting AGENCY: Office of... Scientific Computing Advisory Committee (ASCAC). Federal Advisory Committee Act (Pub. L. 92-463, 86 Stat. 770...: Melea Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U. S...

  14. New computing systems, future computing environment, and their implications on structural analysis and design

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Housner, Jerrold M.

    1993-01-01

    Recent advances in computer technology that are likely to impact structural analysis and design of flight vehicles are reviewed. A brief summary is given of the advances in microelectronics, networking technologies, and in the user-interface hardware and software. The major features of new and projected computing systems, including high performance computers, parallel processing machines, and small systems, are described. Advances in programming environments, numerical algorithms, and computational strategies for new computing systems are reviewed. The impact of the advances in computer technology on structural analysis and the design of flight vehicles is described. A scenario for future computing paradigms is presented, and the near-term needs in the computational structures area are outlined.

  15. Accelerating Technology Development through Integrated Computation and Experimentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shekhawat, Dushyant; Srivastava, Rameshwar D.; Ciferno, Jared

    2013-08-15

This special section of Energy & Fuels comprises a selection of papers presented at the topical conference “Accelerating Technology Development through Integrated Computation and Experimentation”, sponsored and organized by the United States Department of Energy’s National Energy Technology Laboratory (NETL) as part of the 2012 American Institute of Chemical Engineers (AIChE) Annual Meeting held in Pittsburgh, PA, Oct 28-Nov 2, 2012. That topical conference focused on the latest research and development efforts in five main areas related to fossil energy, with each area focusing on the utilization of both experimental and computational approaches: (1) gas separations (membranes, sorbents, and solvents for CO₂, H₂, and O₂ production), (2) CO₂ utilization (enhanced oil recovery, chemical production, mineralization, etc.), (3) carbon sequestration (flow in natural systems), (4) advanced power cycles (oxy-combustion, chemical looping, gasification, etc.), and (5) fuel processing (H₂ production for fuel cells).

  16. Diagnostic imaging applications; Proceedings of the Meeting, Amsterdam, Netherlands, October 8, 9, 1984

    NASA Technical Reports Server (NTRS)

    Beckenbach, E. S. (Editor)

    1984-01-01

It is more important than ever that engineers have an understanding of the future needs of clinical and research medicine, and that physicians know something about probable future developments in instrumentation capabilities. Only by maintaining such a dialog can the most effective application of technological advances to medicine be achieved. This workshop attempted to provide this kind of information transfer in the limited field of diagnostic imaging. Biomedical research at the Jet Propulsion Laboratory is discussed, taking into account imaging results from space exploration missions, as well as biomedical research tasks based on these technologies. Attention is also given to current and future indications for magnetic resonance in medicine, high speed quantitative digital microscopy, computer processing of radiographic images, computed tomography and its modern applications, positron emission tomography, and developments related to medical ultrasound.

17. Lawrence Livermore National Laboratory's Computer Security Short Subjects Videos: Hidden Password, The Incident, Dangerous Games and The Mess; Computer Security Awareness Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

A video on computer security is described. Lonnie Moore, the Computer Security Manager, CSSM/CPPM at Lawrence Livermore National Laboratory (LLNL), and Gale Warshawsky, the Coordinator for Computer Security Education and Awareness at LLNL, wanted to share topics such as computer ethics, software piracy, privacy issues, and protecting information in a format that would capture and hold an audience's attention. Four Computer Security Short Subject videos were produced, ranging from 1 to 3 minutes each. These videos are very effective education and awareness tools that can be used to generate discussions about computer security concerns and good computing practices.

  18. Live broadcast of laparoscopic surgery to handheld computers.

    PubMed

    Gandsas, A; McIntire, K; Park, A

    2004-06-01

Thanks to advances in computer power and miniaturization technology, portable electronic devices are now being used to assist physicians with various applications that extend far beyond Web browsing or sending e-mail. Handheld computers are used for electronic medical records, billing, and coding, and to enable convenient access to electronic journals for reference purposes. The results of diagnostic investigations, such as laboratory results, study reports, and still radiographic pictures, can also be downloaded into portable devices for later viewing. Handheld computer technology, combined with wireless protocols and streaming video technology, has the added potential to become a powerful educational tool for medical students and residents. The purpose of this study was to assess the feasibility of transferring multimedia data in real time to a handheld computer via a wireless network and displaying them on the computer screens of clients at remote locations. A laparoscopic splenectomy was transmitted live to eight handheld computers simultaneously through our institution's wireless network. All eight viewers were able to view the procedure and to hear the surgeon's comments throughout the entire duration of the operation. Handheld computer technology can play a key role in surgical education by delivering information to surgical residents or students when they are geographically distant from the actual event. Validation of this new technology through clinical research is still needed to determine whether resident physicians or medical students can benefit from the use of handheld computers.

  19. Three Decades of Research on Computer Applications in Health Care

    PubMed Central

    Michael Fitzmaurice, J.; Adams, Karen; Eisenberg, John M.

    2002-01-01

    The Agency for Healthcare Research and Quality and its predecessor organizations—collectively referred to here as AHRQ—have a productive history of funding research and development in the field of medical informatics, with grant investments since 1968 totaling $107 million. Many computerized interventions that are commonplace today, such as drug interaction alerts, had their genesis in early AHRQ initiatives. This review provides a historical perspective on AHRQ investment in medical informatics research. It shows that grants provided by AHRQ resulted in achievements that include advancing automation in the clinical laboratory and radiology, assisting in technology development (computer languages, software, and hardware), evaluating the effectiveness of computer-based medical information systems, facilitating the evolution of computer-aided decision making, promoting computer-initiated quality assurance programs, backing the formation and application of comprehensive data banks, enhancing the management of specific conditions such as HIV infection, and supporting health data coding and standards initiatives. Other federal agencies and private organizations have also supported research in medical informatics, some earlier and to a greater degree than AHRQ. The results and relative roles of these related efforts are beyond the scope of this review. PMID:11861630

  20. A professional development model for medical laboratory scientists working in the immunohematology laboratory.

    PubMed

    Garza, Melinda N; Pulido, Lila A; Amerson, Megan; Ali, Faheem A; Greenhill, Brandy A; Griffin, Gary; Alvarez, Enrique; Whatley, Marsha; Hu, Peter C

    2012-01-01

Transfusion medicine, a section of the Department of Laboratory Medicine at The University of Texas MD Anderson Cancer Center, is committed to the education and advancement of its health care professionals. It is our belief that giving medical laboratory professionals a path for advancement leads to excellence and increases overall professionalism in the Immunohematology Laboratory. As a result of this strong commitment to excellence and professionalism, the Immunohematology Laboratory has instituted a Professional Development Model (PDM) that aims to create Medical Laboratory Scientists (MLS) who are not only more knowledgeable but are continually striving for excellence. In addition, these MLS are poised for advancement in their careers. The professional development model consists of four levels: Discovery, Application, Maturation, and Expert. The model was formulated to serve as a detailed path to the mastery of all processes and methods in the Immunohematology Laboratory. Each level in the professional development model consists of tasks that optimize the laboratory workflow and allow for concurrent training. Completion of a level in the PDM is rewarded with financial incentive and further advancement in the field. The PDM for Medical Laboratory Scientists in the Immunohematology Laboratory fosters personal development, rewards growth and competency, and sets high standards for all services and skills provided. This model is a vital component of the Immunohematology Laboratory and aims to ensure the highest quality of care and standards in its testing. It is because of the success of this model and the robustness of its content that we hope other medical laboratories will aim for the same level of excellence and professionalism and adapt this model to their own environments.

  1. Application Modernization at LLNL and the Sierra Center of Excellence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neely, J. Robert; de Supinski, Bronis R.

We report that in 2014, Lawrence Livermore National Laboratory began acquisition of Sierra, a pre-exascale system from IBM and Nvidia. It marks a significant shift in direction for LLNL by introducing the concept of heterogeneous computing via GPUs. LLNL’s mission requires application teams to prepare for this paradigm shift. Thus, the Sierra procurement required a proposed Center of Excellence that would align the expertise of the chosen vendors with laboratory personnel representing the application developers, system software, and tool providers in a concentrated effort to prepare the laboratory’s codes in advance of the system transitioning to production in 2018. Finally, this article presents LLNL’s overall application strategy, with a focus on how LLNL is collaborating with IBM and Nvidia to ensure a successful transition of its mission-oriented applications into the exascale era.

  2. Shrink-film microfluidic education modules: Complete devices within minutes

    PubMed Central

    Nguyen, Diep; McLane, Jolie; Lew, Valerie; Pegan, Jonathan; Khine, Michelle

    2011-01-01

    As advances in microfluidics continue to make contributions to diagnostics and life sciences, broader awareness of this expanding field becomes necessary. By leveraging low-cost microfabrication techniques that require no capital equipment or infrastructure, simple, accessible, and effective educational modules can be made available for a broad range of educational needs from middle school demonstrations to college laboratory classes. These modules demonstrate key microfluidic concepts such as diffusion and separation as well as “laboratory on-chip” applications including chemical reactions and biological assays. These modules are intended to provide an interdisciplinary hands-on experience, including chip design, fabrication of functional devices, and experiments at the microscale. Consequently, students will be able to conceptualize physics at small scales, gain experience in computer-aided design and microfabrication, and perform experiments—all in the context of addressing real-world challenges by making their own lab-on-chip devices. PMID:21799715

  3. Shrink-film microfluidic education modules: Complete devices within minutes.

    PubMed

    Nguyen, Diep; McLane, Jolie; Lew, Valerie; Pegan, Jonathan; Khine, Michelle

    2011-06-01

    As advances in microfluidics continue to make contributions to diagnostics and life sciences, broader awareness of this expanding field becomes necessary. By leveraging low-cost microfabrication techniques that require no capital equipment or infrastructure, simple, accessible, and effective educational modules can be made available for a broad range of educational needs from middle school demonstrations to college laboratory classes. These modules demonstrate key microfluidic concepts such as diffusion and separation as well as "laboratory on-chip" applications including chemical reactions and biological assays. These modules are intended to provide an interdisciplinary hands-on experience, including chip design, fabrication of functional devices, and experiments at the microscale. Consequently, students will be able to conceptualize physics at small scales, gain experience in computer-aided design and microfabrication, and perform experiments-all in the context of addressing real-world challenges by making their own lab-on-chip devices.

  4. Application Modernization at LLNL and the Sierra Center of Excellence

    DOE PAGES

    Neely, J. Robert; de Supinski, Bronis R.

    2017-09-01

We report that in 2014, Lawrence Livermore National Laboratory began acquisition of Sierra, a pre-exascale system from IBM and Nvidia. It marks a significant shift in direction for LLNL by introducing the concept of heterogeneous computing via GPUs. LLNL’s mission requires application teams to prepare for this paradigm shift. Thus, the Sierra procurement required a proposed Center of Excellence that would align the expertise of the chosen vendors with laboratory personnel representing the application developers, system software, and tool providers in a concentrated effort to prepare the laboratory’s codes in advance of the system transitioning to production in 2018. Finally, this article presents LLNL’s overall application strategy, with a focus on how LLNL is collaborating with IBM and Nvidia to ensure a successful transition of its mission-oriented applications into the exascale era.

  5. A Comparison of the Apple Macintosh and IBM PC in Laboratory Applications.

    ERIC Educational Resources Information Center

    Williams, Ron

    1986-01-01

    Compares Apple Macintosh and IBM PC microcomputers in terms of their usefulness in the laboratory. No attempt is made to equalize the two computer systems since they represent opposite ends of the computer spectrum. Indicates that the IBM PC is the most useful general-purpose personal computer for laboratory applications. (JN)

  6. Voting with Their Seats: Computer Laboratory Design and the Casual User

    ERIC Educational Resources Information Center

    Spennemann, Dirk H. R.; Atkinson, John; Cornforth, David

    2007-01-01

    Student computer laboratories are provided by most teaching institutions around the world; however, what is the most effective layout for such facilities? The log-in data files from computer laboratories at a regional university in Australia were analysed to determine whether there was a pattern in student seating. In particular, it was…

  7. Examining Student Outcomes in University Computer Laboratory Environments: Issues for Educational Management

    ERIC Educational Resources Information Center

    Newby, Michael; Marcoulides, Laura D.

    2008-01-01

    Purpose: The purpose of this paper is to model the relationship between student performance, student attitudes, and computer laboratory environments. Design/methodology/approach: Data were collected from 234 college students enrolled in courses that involved the use of a computer to solve problems and provided the laboratory experience by means of…

  8. Creating and Using a Computer Networking and Systems Administration Laboratory Built under Relaxed Financial Constraints

    ERIC Educational Resources Information Center

    Conlon, Michael P.; Mullins, Paul

    2011-01-01

    The Computer Science Department at Slippery Rock University created a laboratory for its Computer Networks and System Administration and Security courses under relaxed financial constraints. This paper describes the department's experience designing and using this laboratory, including lessons learned and descriptions of some student projects…

  9. Developing Materials Processing to Performance Modeling Capabilities and the Need for Exascale Computing Architectures (and Beyond)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schraad, Mark William; Luscher, Darby Jon

Additive Manufacturing techniques are presenting the Department of Energy and the NNSA Laboratories with new opportunities to consider novel component production and repair processes, and to manufacture materials with tailored response and optimized performance characteristics. Additive Manufacturing technologies already are being applied to primary NNSA mission areas, including Nuclear Weapons. These mission areas are adapting to these new manufacturing methods because of potential advantages, such as smaller manufacturing footprints, reduced needs for specialized tooling, an ability to embed sensing, novel part repair options, an ability to accommodate complex geometries, and lighter weight materials. To realize the full potential of Additive Manufacturing as a game-changing technology for the NNSA’s national security missions, however, significant progress must be made in several key technical areas. In addition to advances in engineering design, process optimization and automation, and accelerated feedstock design and manufacture, significant progress must be made in modeling and simulation. First and foremost, a more mature understanding of the process-structure-property-performance relationships must be developed. Because Additive Manufacturing processes change the nature of a material’s structure below the engineering scale, new models are required to predict materials response across the spectrum of relevant length scales, from the atomistic to the continuum. New diagnostics will be required to characterize materials response across these scales. And not just models, but advanced algorithms, next-generation codes, and advanced computer architectures will be required to complement the associated modeling activities. Based on preliminary work in each of these areas, a strong argument can be made for the need for Exascale computing architectures, if a legitimate predictive capability is to be developed.

  10. A Comprehensive Microfluidics Device Construction and Characterization Module for the Advanced Undergraduate Analytical Chemistry Laboratory

    ERIC Educational Resources Information Center

    Piunno, Paul A. E.; Zetina, Adrian; Chu, Norman; Tavares, Anthony J.; Noor, M. Omair; Petryayeva, Eleonora; Uddayasankar, Uvaraj; Veglio, Andrew

    2014-01-01

    An advanced analytical chemistry undergraduate laboratory module on microfluidics that spans 4 weeks (4 h per week) is presented. The laboratory module focuses on comprehensive experiential learning of microfluidic device fabrication and the core characteristics of microfluidic devices as they pertain to fluid flow and the manipulation of samples.…

  11. 78 FR 59927 - Next Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-30

    ... Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational, and Systems Biology..., Computational, and Systems Biology [External Review Draft]'' (EPA/600/R-13/214A). EPA is also announcing that... Advances in Molecular, Computational, and Systems Biology [External Review Draft]'' is available primarily...

  12. 78 FR 68058 - Next Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-13

    ... Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational, and Systems Biology... Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational, and Systems Biology..., computational, and systems biology data can better inform risk assessment. This draft document is available for...

  13. Making Advanced Computer Science Topics More Accessible through Interactive Technologies

    ERIC Educational Resources Information Center

    Shao, Kun; Maher, Peter

    2012-01-01

    Purpose: Teaching advanced technical concepts in a computer science program to students of different technical backgrounds presents many challenges. The purpose of this paper is to present a detailed experimental pedagogy in teaching advanced computer science topics, such as computer networking, telecommunications and data structures using…

  14. An iLab for Teaching Advanced Logic Concepts with Hardware Descriptive Languages

    ERIC Educational Resources Information Center

    Ayodele, Kayode P.; Inyang, Isaac A.; Kehinde, Lawrence O.

    2015-01-01

    One of the more interesting approaches to teaching advanced logic concepts is the use of online laboratory frameworks to provide student access to remote field-programmable devices. There is as yet, however, no conclusive evidence of the effectiveness of such an approach. This paper presents the Advanced Digital Lab, a remote laboratory based on…

  15. Multi-modality molecular imaging: pre-clinical laboratory configuration

    NASA Astrophysics Data System (ADS)

    Wu, Yanjun; Wellen, Jeremy W.; Sarkar, Susanta K.

    2006-02-01

In recent years, the prevalence of in vivo molecular imaging applications has rapidly increased. Here we report on the construction of a multi-modality imaging facility in a pharmaceutical setting that is expected to further advance existing capabilities for in vivo imaging of drug distribution and drug-target interactions. The imaging instrumentation in our facility includes a microPET scanner, a four-wavelength time-domain optical imaging scanner, a 9.4T/30cm MRI scanner, and a SPECT/X-ray CT scanner. An electronics shop and a computer room dedicated to image analysis are additional features of the facility. The layout of the facility was designed with a central animal preparation room surrounded by separate laboratory rooms for each of the major imaging modalities to accommodate the work-flow of simultaneous in vivo imaging experiments. This report will focus on the design of and anticipated applications for our microPET and optical imaging laboratory spaces. Additionally, we will discuss efforts to maximize the daily throughput of animal scans through the development of efficient experimental work-flows and the use of multiple animals in a single scanning session.

  16. A Computerized Data-Capture System for Animal Biosafety Level 4 Laboratories

    PubMed Central

    Bente, Dennis A; Friesen, Jeremy; White, Kyle; Koll, Jordan; Kobinger, Gary P

    2011-01-01

    The restrictive nature of an Animal Biosafety Level 4 (ABSL4) laboratory complicates even simple clinical evaluation including data capture. Typically, clinical data are recorded on paper during procedures, faxed out of the ABSL4, and subsequently manually entered into a computer. This system has many disadvantages including transcriptional errors. Here, we describe the development of a highly customizable, tablet-PC-based computerized data-capture system, allowing reliable collection of observational and clinical data from experimental animals in a restrictive biocontainment setting. A multidisciplinary team with skills in containment laboratory animal science, database design, and software engineering collaborated on the development of this system. The goals were to design an easy-to-use and flexible user interface on a touch-screen tablet PC with user-supportable processes for recovery, full auditing capabilities, and cost effectiveness. The system simplifies data capture, reduces the necessary time in an ABSL4 environment, offers timely reporting and review of data, facilitates statistical analysis, reduces potential of erroneous data entry, improves quality assurance of animal care, and advances the use and refinement of humane endpoints. PMID:22330712
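
    A minimal sketch of the auditing idea described above, assuming nothing about the actual system's schema: each observation is appended with user and timestamp metadata and hash-chained to the previous entry so that retrospective edits are detectable. All field names and values are illustrative.

      import datetime, hashlib, json

      # Append-only, hash-chained observation log (illustrative fields only).
      log = []

      def record(animal_id, observation, user):
          prev = log[-1]["hash"] if log else ""
          entry = {"animal": animal_id, "obs": observation, "user": user,
                   "time": datetime.datetime.now(datetime.timezone.utc).isoformat(),
                   "prev": prev}
          payload = json.dumps(entry, sort_keys=True).encode()
          entry["hash"] = hashlib.sha256(payload).hexdigest()  # chains to prev
          log.append(entry)

      record("animal-014", {"temp_C": 38.9, "activity": "normal"}, "observer1")
      record("animal-014", {"temp_C": 39.6, "activity": "reduced"}, "observer2")
      print(len(log), "audited entries; last hash:", log[-1]["hash"][:12])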

  17. 3D Printing in the Laboratory: Maximize Time and Funds with Customized and Open-Source Labware

    PubMed Central

    Coakley, Meghan; Hurt, Darrell E.

    2016-01-01

    3D printing, also known as additive manufacturing, is the computer-guided process of fabricating physical objects by depositing successive layers of material. It has transformed manufacturing across virtually every industry, bringing about incredible advances in research and medicine. The rapidly growing consumer market now includes convenient and affordable “desktop” 3D printers. These are being used in the laboratory to create custom 3D-printed equipment, and a growing community of designers are contributing open-source, cost-effective innovations that can be used by both professionals and enthusiasts. User stories from investigators at the National Institutes of Health and the biomedical research community demonstrate the power of 3D printing to save valuable time and funding. While adoption of 3D printing has been slow in the biosciences to date, the potential is vast. The market predicts that within several years, 3D printers could be commonplace within the home; with so many practical uses for 3D printing, we anticipate that the technology will also play an increasingly important role in the laboratory. PMID:27197798

  18. Introduction of optical tweezers in advanced physics laboratory

    NASA Astrophysics Data System (ADS)

    Wang, Gang

    2017-08-01

Laboratories are an essential part of undergraduate optoelectronics and photonics education. Of particular interest is the sequence of laboratories that offers students meaningful research experience within a reasonable time-frame limited by regular laboratory hours. We will present our introduction of optical tweezers into the upper-level physics laboratory. We developed the sequence of experiments in the Advanced Lab to offer students sufficient freedom to explore, rather than simply setting up a demonstration following certain recipes. We will also present its impact on our current curriculum of the optoelectronics concentration within the physics program.

  19. Interactive virtual optical laboratories

    NASA Astrophysics Data System (ADS)

    Liu, Xuan; Yang, Yi

    2017-08-01

Laboratory experiences are essential for optics education. However, college students have limited access to advanced optical equipment, which is generally expensive and complicated. Hence there is a need for innovative solutions to expose students to advanced optics laboratories. Here we describe a novel approach, the interactive virtual optical laboratory (IVOL), which allows an unlimited number of students to participate in the lab session remotely through the internet, to improve laboratory education in photonics. Although students are not physically conducting the experiment, IVOL is designed to engage students by actively involving them in the decision making process throughout the experiment.

  20. Assessment of Sensor Technologies for Advanced Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Korsah, Kofi; Ramuhalli, Pradeep; Vlim, R.

    2016-10-01

Sensors and measurement technologies provide information on processes, support operations and provide indications of component health. They are therefore crucial to plant operations and to commercialization of advanced reactors (AdvRx). This report, developed by a three-laboratory team consisting of Argonne National Laboratory (ANL), Oak Ridge National Laboratory (ORNL) and Pacific Northwest National Laboratory (PNNL), provides an assessment of sensor technologies and a determination of measurement needs for AdvRx. It provides the technical basis for identifying and prioritizing research targets within the instrumentation and control (I&C) Technology Area under the Department of Energy’s (DOE’s) Advanced Reactor Technology (ART) program and contributes to the design and implementation of AdvRx concepts.

  1. Advanced Propulsion Physics Lab: Eagleworks Investigations

    NASA Technical Reports Server (NTRS)

    Scogin, Tyler

    2014-01-01

Eagleworks Laboratory is an advanced propulsion physics laboratory with two primary investigations currently underway. The first is a Quantum Vacuum Plasma Thruster (QVPT or Q-thruster), an advanced electric propulsion technology in the development and demonstration phase. The second investigation is in Warp Field Interferometry (WFI). This is an investigation of Dr. Harold "Sonny" White's theoretical physics models for warp field equations using optical experiments in the Electro Optical Laboratory (EOL) at Johnson Space Center. These investigations are pursuing technology necessary to enable human exploration of the solar system and beyond.

  2. Status Report on NEAMS System Analysis Module Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hu, R.; Fanning, T. H.; Sumner, T.

    2015-12-01

Under the Reactor Product Line (RPL) of DOE-NE’s Nuclear Energy Advanced Modeling and Simulation (NEAMS) program, an advanced SFR System Analysis Module (SAM) is being developed at Argonne National Laboratory. The goal of the SAM development is to provide fast-running, improved-fidelity, whole-plant transient analysis capabilities. SAM utilizes an object-oriented application framework (MOOSE), its underlying meshing and finite-element library libMesh, and the linear and nonlinear solvers of PETSc, to leverage modern advanced software environments and numerical methods. It also incorporates advances in physical and empirical models and seeks closure models based on information from high-fidelity simulations and experiments. This report provides an update on the SAM development, and summarizes the activities performed in FY15 and the first quarter of FY16. The tasks include: (1) implement the support of 2nd-order finite elements in SAM components for improved accuracy and computational efficiency; (2) improve the conjugate heat transfer modeling and develop pseudo 3-D full-core reactor heat transfer capabilities; (3) perform verification and validation tests as well as demonstration simulations; (4) develop the coupling requirements for SAS4A/SASSYS-1 and SAM integration.

  3. The Advanced Software Development and Commercialization Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gallopoulos, E.; Canfield, T.R.; Minkoff, M.

    1990-09-01

This is the first of a series of reports pertaining to progress in the Advanced Software Development and Commercialization Project, a joint collaborative effort between the Center for Supercomputing Research and Development of the University of Illinois and the Computing and Telecommunications Division of Argonne National Laboratory. The purpose of this work is to apply techniques of parallel computing that were pioneered by University of Illinois researchers to mature computational fluid dynamics (CFD) and structural dynamics (SD) computer codes developed at Argonne. The collaboration in this project will bring this unique combination of expertise to bear, for the first time, on industrially important problems. By so doing, it will expose the strengths and weaknesses of existing techniques for parallelizing programs and will identify those problems that need to be solved in order to enable widespread production use of parallel computers. Secondly, the increased efficiency of the CFD and SD codes themselves will enable the simulation of larger, more accurate engineering models that involve fluid and structural dynamics. In order to realize the above two goals, we are considering two production codes that have been developed at ANL and are widely used by both industry and universities. These are COMMIX and WHAMS-3D. The first is a computational fluid dynamics code that is used for both nuclear reactor design and safety and as a design tool for the casting industry. The second is a three-dimensional structural dynamics code used in nuclear reactor safety as well as crashworthiness studies. These codes are currently available for both sequential and vector computers only. Our main goal is to port and optimize these two codes on shared memory multiprocessors. In so doing, we shall establish a process that can be followed in optimizing other sequential or vector engineering codes for parallel processors.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zitney, S.E.

This paper highlights the use of the CAPE-OPEN (CO) standard interfaces in the Advanced Process Engineering Co-Simulator (APECS) developed at the National Energy Technology Laboratory (NETL). The APECS system uses the CO unit operation, thermodynamic, and reaction interfaces to provide its plug-and-play co-simulation capabilities, including the integration of process simulation with computational fluid dynamics (CFD) simulation. APECS also relies heavily on the use of a CO COM/CORBA bridge for running process/CFD co-simulations on multiple operating systems. For process optimization in the face of multiple and sometimes conflicting objectives, APECS offers stochastic modeling and multi-objective optimization capabilities developed to comply with the CO software standard. At NETL, system analysts are applying APECS to a wide variety of advanced power generation systems, ranging from small fuel cell systems to commercial-scale power plants including the coal-fired, gasification-based FutureGen power and hydrogen production plant.

  5. Recent Advances in Targeted and Untargeted Metabolomics by NMR and MS/NMR Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bingol, Kerem

Metabolomics has made significant progress on multiple fronts in the last 18 months. This minireview aims to give an overview of these advancements in light of their contribution to targeted and untargeted metabolomics. New computational approaches have emerged to overcome the manual absolute quantitation step of metabolites in 1D 1H NMR spectra. This provides more consistency in inter-laboratory comparisons. Integration of 2D NMR metabolomics databases under a unified web server has allowed very accurate identification of the metabolites that have been catalogued in these databases. For the remaining uncatalogued and unknown metabolites, new cheminformatics approaches have been developed by combining NMR and mass spectrometry. These hybrid NMR/MS approaches have accelerated the identification of unknowns in untargeted studies, and they now allow ever larger numbers of metabolites to be profiled in application studies.
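
    For readers unfamiliar with the quantitation step being automated, the toy sketch below integrates a synthetic metabolite peak in a 1D 1H spectrum against an internal standard of known concentration; peak positions, widths, and proton counts are illustrative assumptions, not values from the review.

      import numpy as np

      # Synthetic 1D spectrum: one "metabolite" peak plus a reference standard.
      ppm = np.linspace(-0.5, 10.0, 5000)
      lorentz = lambda x0, amp: amp / (1.0 + ((ppm - x0) / 0.01) ** 2)
      spectrum = lorentz(1.33, 1.0) + lorentz(0.0, 0.5)   # "lactate" + reference

      def area(lo, hi):
          sel = (ppm >= lo) & (ppm <= hi)
          return np.trapz(spectrum[sel], ppm[sel])        # numerical integration

      c_ref, n_ref, n_peak = 0.5, 9, 3              # mM and protons per signal
      c_met = c_ref * (area(1.2, 1.45) / n_peak) / (area(-0.1, 0.1) / n_ref)
      print(f"estimated concentration: {c_met:.2f} mM")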

  6. Implementation, capabilities, and benchmarking of Shift, a massively parallel Monte Carlo radiation transport code

    DOE PAGES

    Pandya, Tara M.; Johnson, Seth R.; Evans, Thomas M.; ...

    2015-12-21

This paper discusses the implementation, capabilities, and validation of Shift, a massively parallel Monte Carlo radiation transport package developed and maintained at Oak Ridge National Laboratory. It has been developed to scale well from laptops to small computing clusters to advanced supercomputers. Special features of Shift include hybrid capabilities for variance reduction such as CADIS and FW-CADIS, and advanced parallel decomposition and tally methods optimized for scalability on supercomputing architectures. Shift has been validated and verified against various reactor physics benchmarks and compares well to other state-of-the-art Monte Carlo radiation transport codes such as MCNP5, CE KENO-VI, and OpenMC. Some specific benchmarks used for verification and validation include the CASL VERA criticality test suite and several Westinghouse AP1000® problems. These benchmark and scaling studies show promising results.
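
    As a schematic of what a Monte Carlo transport code computes (a toy analog model, not Shift or its variance-reduction machinery), the sketch below follows particles through a 1-D slab with made-up cross-section data and tallies the transmission fraction.

      import numpy as np

      # Analog Monte Carlo transport through a 1-D slab (toy problem data).
      rng = np.random.default_rng(2)
      T, sigma_t, p_scatter, N = 2.0, 1.0, 0.3, 100_000
      transmitted = 0
      for _ in range(N):
          x, mu = 0.0, 1.0                          # position, direction cosine
          while True:
              x += mu * rng.exponential(1.0 / sigma_t)    # distance to collision
              if x >= T:
                  transmitted += 1                  # escaped through the far face
                  break
              if x < 0:
                  break                             # leaked back out the near face
              if rng.random() > p_scatter:
                  break                             # absorbed
              mu = 2.0 * rng.random() - 1.0         # isotropic scattering
      print(f"transmission fraction: {transmitted / N:.4f}")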

  7. Systems Biology-Driven Hypotheses Tested In Vivo: The Need to Advance Molecular Imaging Tools.

    PubMed

    Verma, Garima; Palombo, Alessandro; Grigioni, Mauro; La Monaca, Morena; D'Avenio, Giuseppe

    2018-01-01

Processing and interpretation of biological images may provide invaluable insights on complex, living systems because images capture the overall dynamics as a "whole." Therefore, "extraction" of key, quantitative morphological parameters could be, at least in principle, helpful in building a reliable systems biology approach to understanding living objects. Molecular imaging tools for systems biology models have attained widespread usage in modern experimental laboratories. Here, we provide an overview of advances in computational technology and the different instrumentation focused on molecular image processing and analysis. Quantitative data analysis through various open-source software packages and algorithmic protocols will provide a novel approach for modeling the experimental research program. Besides this, we also highlight predictable future trends in methods for automatically analyzing biological data. Such tools will be very useful for understanding the detailed biological and mathematical expressions underlying in-silico systems biology models.

  8. NPL scoops £25m for advanced metrology centre

    NASA Astrophysics Data System (ADS)

    Singh Chadha, Kulvinder

    2013-03-01

    The National Physical Laboratory (NPL) in Teddington, UK, is to receive £25m towards the construction of an Advanced Metrology Laboratory (AML) that will contain up to 20 labs and be complete by 2017.

  9. Final report and recommendations of the ESnet Authentication Pilot Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, G.R.; Moore, J.P.; Athey, C.L.

    1995-01-01

To conduct their work, U.S. Department of Energy (DOE) researchers require access to a wide range of computing systems and information resources outside of their respective laboratories. Electronically communicating with peers using the global Internet has become a necessity for effective collaboration with university, industrial, and other government partners. DOE's Energy Sciences Network (ESnet) needs to be engineered to facilitate this "collaboratory" while ensuring the protection of government computing resources from unauthorized use. Sensitive information and intellectual properties must be protected from unauthorized disclosure, modification, or destruction. In August 1993, DOE funded four ESnet sites (Argonne National Laboratory, Lawrence Livermore National Laboratory, the National Energy Research Supercomputer Center, and Pacific Northwest Laboratory) to begin implementing and evaluating authenticated ESnet services using the advanced Kerberos Version 5. The purpose of this project was to identify, understand, and resolve the technical, procedural, cultural, and policy issues surrounding peer-to-peer authentication in an inter-organization internet. The investigators have concluded that, with certain conditions, Kerberos Version 5 is a suitable technology to enable ESnet users to freely share resources and information without compromising the integrity of their systems and data. The pilot project has demonstrated that Kerberos Version 5 is capable of supporting trusted third-party authentication across an inter-organization internet and that Kerberos Version 5 would be practical to implement across the ESnet community within the U.S. The investigators made several modifications to the Kerberos Version 5 system that are necessary for operation in the current Internet environment and have documented other technical shortcomings that must be addressed before large-scale deployment is attempted.

  10. Advanced Computer Typography.

    DTIC Science & Technology

    1981-12-01

ADVANCED COMPUTER TYPOGRAPHY, by A. V. Hershey. Final report NPS012-81-005, Naval Postgraduate School, Monterey, California, December 1981; reporting period December 1979 to December 1981. Unclassified; approved for public release.

  11. CUBE (Computer Use By Engineers) symposium abstracts. [LASL, October 4-6, 1978]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruminer, J.J.

    1978-07-01

This report presents the abstracts for the CUBE (Computer Use by Engineers) Symposium, October 4 through 6, 1978. Contributors are from Lawrence Livermore Laboratory, Los Alamos Scientific Laboratory, and Sandia Laboratories.

  12. Report on Computing and Networking in the Space Science Laboratory by the SSL Computer Committee

    NASA Technical Reports Server (NTRS)

    Gallagher, D. L. (Editor)

    1993-01-01

    The Space Science Laboratory (SSL) at Marshall Space Flight Center is a multiprogram facility. Scientific research is conducted in four discipline areas: earth science and applications, solar-terrestrial physics, astrophysics, and microgravity science and applications. Representatives from each of these discipline areas participate in a Laboratory computer requirements committee, which developed this document. The purpose is to establish and discuss Laboratory objectives for computing and networking in support of science. The purpose is also to lay the foundation for a collective, multiprogram approach to providing these services. Special recognition is given to the importance of the national and international efforts of our research communities toward the development of interoperable, network-based computer applications.

  13. Installation of Computerized Procedure System and Advanced Alarm System in the Human Systems Simulation Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Le Blanc, Katya Lee; Spielman, Zachary Alexander; Rice, Brandon Charles

    2016-04-01

    This report describes the installation of two advanced control room technologies, an advanced alarm system and a computerized procedure system, into the Human Systems Simulation Laboratory (HSSL). Installation of these technologies enables future phases of this research by providing a platform to systematically evaluate the effect of these technologies on operator and plant performance.

  14. Advanced Laboratory NMR Spectrometer with Applications.

    ERIC Educational Resources Information Center

    Biscegli, Clovis; And Others

    1982-01-01

    A description is given of an inexpensive nuclear magnetic resonance (NMR) spectrometer suitable for use in advanced laboratory courses. Applications to the nondestructive analysis of the oil content in corn seeds and in monitoring the crystallization of polymers are presented. (SK)

  15. Real-Time, Sensor-Based Computing in the Laboratory.

    ERIC Educational Resources Information Center

    Badmus, O. O.; And Others

    1996-01-01

    Demonstrates the importance of Real-Time, Sensor-Based (RTSB) computing and how it can be easily and effectively integrated into university student laboratories. Describes the experimental processes, the process instrumentation and process-computer interface, the computer and communications systems, and typical software. Provides much technical…

  16. Material Protection, Accounting, and Control Technologies (MPACT): Modeling and Simulation Roadmap

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cipiti, Benjamin; Dunn, Timothy; Durbin, Samual

The development of sustainable advanced nuclear fuel cycles is a long-term goal of the Office of Nuclear Energy’s (DOE-NE) Fuel Cycle Technologies program. The Material Protection, Accounting, and Control Technologies (MPACT) campaign is supporting research and development (R&D) of advanced instrumentation, analysis tools, and integration methodologies to meet this goal. This advanced R&D is intended to facilitate safeguards and security by design of fuel cycle facilities. The lab-scale demonstration of a virtual facility, a distributed test bed that connects the individual tools being developed at National Laboratories and university research establishments, is a key program milestone for 2020. These tools will consist of instrumentation and devices as well as computer software for modeling. To aid in framing its long-term goal, during FY16 a modeling and simulation roadmap is being developed for three major areas of investigation: (1) radiation transport and sensors, (2) process and chemical models, and (3) shock physics and assessments. For each area, current modeling approaches are described, and gaps and needs are identified.

  17. Increasing the power of accelerated molecular dynamics methods and plans to exploit the coming exascale

    NASA Astrophysics Data System (ADS)

    Voter, Arthur

    Many important materials processes take place on time scales that far exceed the roughly one microsecond accessible to molecular dynamics simulation. Typically, this long-time evolution is characterized by a succession of thermally activated infrequent events involving defects in the material. In the accelerated molecular dynamics (AMD) methodology, known characteristics of infrequent-event systems are exploited to make reactive events take place more frequently, in a dynamically correct way. For certain processes, this approach has been remarkably successful, offering a view of complex dynamical evolution on time scales of microseconds, milliseconds, and sometimes beyond. We have recently made advances in all three of the basic AMD methods (hyperdynamics, parallel replica dynamics, and temperature accelerated dynamics (TAD)), exploiting both algorithmic advances and novel parallelization approaches. I will describe these advances, present some examples of our latest results, and discuss what should be possible when exascale computing arrives in roughly five years. Funded by the U.S. Department of Energy, Office of Basic Energy Sciences, Materials Sciences and Engineering Division, and by the Los Alamos Laboratory Directed Research and Development program.
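
    A small numerical sketch of the statistics behind one of the AMD methods named above, parallel replica dynamics, under the simplifying assumption that escape from a state is a Poisson process with a single rate: running M independent replicas and advancing the clock by roughly M times the first-escape time preserves the correct escape-time statistics while cutting wall-clock cost.

      import numpy as np

      # Toy parallel-replica bookkeeping with a hypothetical escape rate k_true.
      rng = np.random.default_rng(1)
      k_true, M, n_events = 0.01, 8, 2000
      boosted_times = []
      for _ in range(n_events):
          t_escape = rng.exponential(1.0 / k_true, M)   # per-replica trajectories
          boosted_times.append(M * t_escape.min())      # accumulated system time
      # min of M exponentials(k) is exponential(M*k), so M*min is exponential(k):
      print(f"inferred 1/k = {np.mean(boosted_times):.1f}  (true {1.0 / k_true:.1f})")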

  18. Kinetics and Photochemistry of Ruthenium Bisbipyridine Diacetonitrile Complexes: An Interdisciplinary Inorganic and Physical Chemistry Laboratory Exercise.

    PubMed

    Rapp, Teresa L; Phillips, Susan R; Dmochowski, Ivan J

    2016-12-13

The study of ruthenium polypyridyl complexes can be widely applied across disciplines in the undergraduate curriculum. Ruthenium photochemistry has advanced many fields including dye-sensitized solar cells, photoredox catalysis, light-driven water oxidation, and biological electron transfer. Equally promising are ruthenium polypyridyl complexes that provide a sterically bulky, photolabile moiety for transiently "caging" biologically active molecules. Photouncaging involves the use of visible (1-photon) or near-IR (2-photon) light to break one or more bonds between ruthenium and coordinated ligand(s), which can occur on short time scales and in high quantum yields. In this work we demonstrate the use of a model "caged" acetonitrile complex, Ru(2,2'-bipyridine)2(acetonitrile)2, or RuMeCN in an advanced synthesis and physical chemistry laboratory. Students made RuMeCN in an advanced synthesis laboratory course and performed UV-vis spectroscopy and electrochemistry. The following semester students investigated RuMeCN photolysis kinetics in a physical chemistry laboratory. These two exercises may also be combined to create a 2-week module in an advanced undergraduate laboratory course.
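
    For instructors adapting the kinetics portion, a minimal analysis sketch is shown below: if the photolysis is first order, a semilog fit of the absorbance decay yields the observed rate constant. The times, absorbances, and end point are invented illustration values, not data from the exercise.

      import numpy as np

      # First-order analysis: for A(t) = A_inf + (A_0 - A_inf)e^(-kt),
      # ln(A - A_inf) is linear in t with slope -k_obs.
      t = np.array([0.0, 30.0, 60.0, 120.0, 240.0, 480.0])   # irradiation time, s
      A = np.array([0.82, 0.73, 0.65, 0.52, 0.35, 0.21])     # made-up absorbances
      A_inf = 0.15                                           # assumed end point
      slope, intercept = np.polyfit(t, np.log(A - A_inf), 1)
      print(f"k_obs = {-slope:.2e} s^-1")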

  19. Kinetics and Photochemistry of Ruthenium Bisbipyridine Diacetonitrile Complexes: An Interdisciplinary Inorganic and Physical Chemistry Laboratory Exercise

    PubMed Central

    2016-01-01

    The study of ruthenium polypyridyl complexes can be widely applied across disciplines in the undergraduate curriculum. Ruthenium photochemistry has advanced many fields including dye-sensitized solar cells, photoredox catalysis, light-driven water oxidation, and biological electron transfer. Equally promising are ruthenium polypyridyl complexes that provide a sterically bulky, photolabile moiety for transiently “caging” biologically active molecules. Photouncaging involves the use of visible (1-photon) or near-IR (2-photon) light to break one or more bonds between ruthenium and coordinated ligand(s), which can occur on short time scales and in high quantum yields. In this work we demonstrate the use of a model “caged” acetonitrile complex, Ru(2,2′-bipyridine)2(acetonitrile)2, or RuMeCN in an advanced synthesis and physical chemistry laboratory. Students made RuMeCN in an advanced synthesis laboratory course and performed UV–vis spectroscopy and electrochemistry. The following semester students investigated RuMeCN photolysis kinetics in a physical chemistry laboratory. These two exercises may also be combined to create a 2-week module in an advanced undergraduate laboratory course. PMID:28649139

  20. Strategic Computing. New-Generation Computing Technology: A Strategic Plan for Its Development and Application to Critical Problems in Defense

    DTIC Science & Technology

    1983-10-28

By seizing an opportunity to leverage recent advances in artificial intelligence, computer science, and microelectronics, the Agency plans ... [advances have] occurred in many separate areas of artificial intelligence, computer science, and microelectronics. Advances in "expert system" technology now ... and expert knowledge. Advances in Artificial Intelligence: mechanization of speech recognition, vision, and natural language understanding.

  1. Using a Cloud Computing System to Reduce Door-to-Balloon Time in Acute ST-Elevation Myocardial Infarction Patients Transferred for Percutaneous Coronary Intervention

    PubMed Central

    Ho, Chi-Kung; Wang, Hui-Ting; Lee, Chien-Ho; Chung, Wen-Jung; Lin, Cheng-Jui; Hsueh, Shu-Kai; Hung, Shin-Chiang; Wu, Kuan-Han; Liu, Chu-Feng; Kung, Chia-Te

    2017-01-01

Background: This study evaluated the impact on clinical outcomes of using a cloud computing system to reduce percutaneous coronary intervention hospital door-to-balloon (DTB) time for ST segment elevation myocardial infarction (STEMI). Methods: A total of 369 patients before and after implementation of the transfer protocol were enrolled. Of these patients, 262 were transferred through the protocol while the other 107 patients were transferred through the traditional referral process. Results: There were no significant differences in DTB time, pain to door of STEMI receiving center arrival time, and pain to balloon time between the two groups. Pain to electrocardiography time in patients with Killip I/II and catheterization laboratory to balloon time in patients with Killip III/IV were significantly reduced in the protocol group compared with the traditional referral group (both p < 0.05). There were also no remarkable differences in the complication rate and 30-day mortality between the two groups. The multivariate analysis revealed that the independent predictors of 30-day mortality were elderly patients, advanced Killip score, and higher level of troponin-I. Conclusions: This study showed that patients transferred through our present protocol had reduced pain to electrocardiography and catheterization laboratory to balloon times in Killip I/II and III/IV patients, respectively. However, using a cloud computing system in our present protocol did not reduce DTB time. PMID:28900621

  2. The Computer Bulletin Board.

    ERIC Educational Resources Information Center

    Batt, Russell H., Ed.

    1990-01-01

    Four applications of microcomputers in the chemical laboratory are presented. Included are "Mass Spectrometer Interface with an Apple II Computer,""Interfacing the Spectronic 20 to a Computer,""A pH-Monitoring and Control System for Teaching Laboratories," and "A Computer-Aided Optical Melting Point Device." Software, instrumentation, and uses are…

  3. Development of a Reduced-Order Three-Dimensional Flow Model for Thermal Mixing and Stratification Simulation during Reactor Transients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hu, Rui

    2017-09-03

Mixing, thermal-stratification, and mass transport phenomena in large pools or enclosures play major roles in the safety of reactor systems. Depending on the fidelity requirement and computational resources, various modeling methods, from the 0-D perfect mixing model to 3-D Computational Fluid Dynamics (CFD) models, are available. Each is associated with its own advantages and shortcomings. It is very desirable to develop an advanced and efficient thermal mixing and stratification modeling capability embedded in a modern system analysis code to improve the accuracy of reactor safety analyses and to reduce modeling uncertainties. An advanced system analysis tool, SAM, is being developed at Argonne National Laboratory for advanced non-LWR reactor safety analysis. While SAM is being developed as a system-level modeling and simulation tool, a reduced-order three-dimensional module is under development to model the multi-dimensional flow and thermal mixing and stratification in large enclosures of reactor systems. This paper provides an overview of the three-dimensional finite element flow model in SAM, including the governing equations, stabilization scheme, and solution methods. Additionally, several verification and validation tests are presented, including lid-driven cavity flow, natural convection inside a cavity, and laminar flow in a channel between parallel plates. Based on comparisons with analytical solutions and experimental results, it is demonstrated that the developed 3-D fluid model performs very well for a wide range of flow problems.
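
    The channel-flow verification mentioned above can be mimicked in a few lines. The sketch below (an independent illustration, not SAM code) solves the steady plane-Poiseuille problem with central differences and compares it against the exact parabolic profile; since central differences are exact for quadratics, the error sits at round-off level. All parameter values are arbitrary.

      import numpy as np

      # Solve  mu * u''(y) = dp/dx  with no-slip walls u(0) = u(H) = 0.
      mu, dpdx, H, n = 1.0e-3, -1.0e-2, 0.02, 41    # illustrative values
      y = np.linspace(0.0, H, n)
      h = y[1] - y[0]

      A = np.zeros((n, n))
      b = np.full(n, dpdx / mu * h**2)              # discretized right-hand side
      A[0, 0] = A[-1, -1] = 1.0                     # Dirichlet (no-slip) rows
      b[0] = b[-1] = 0.0
      for i in range(1, n - 1):
          A[i, i - 1], A[i, i], A[i, i + 1] = 1.0, -2.0, 1.0
      u = np.linalg.solve(A, b)

      u_exact = dpdx / (2.0 * mu) * y * (y - H)     # analytical parabola
      print(f"max error: {np.abs(u - u_exact).max():.3e}")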

  4. The Scanning Electron Microscope As An Accelerator For The Undergraduate Advanced Physics Laboratory

    NASA Astrophysics Data System (ADS)

    Peterson, Randolph S.; Berggren, Karl K.; Mondol, Mark

    2011-06-01

    Few universities or colleges have an accelerator for use with advanced physics laboratories, but many of these institutions have a scanning electron microscope (SEM) on site, often in the biology department. As an accelerator for the undergraduate, advanced physics laboratory, the SEM is an excellent substitute for an ion accelerator. Although there are no nuclear physics experiments that can be performed with a typical 30 kV SEM, there is an opportunity for experimental work on accelerator physics, atomic physics, electron-solid interactions, and the basics of modern e-beam lithography.

  5. Post-Genomics and Vaccine Improvement for Leishmania

    PubMed Central

    Seyed, Negar; Taheri, Tahereh; Rafati, Sima

    2016-01-01

    Leishmaniasis is a parasitic disease that primarily affects Asia, Africa, South America, and the Mediterranean basin. Despite extensive efforts to develop an effective prophylactic vaccine, no promising vaccine is available yet. However, recent advancements in computational vaccinology on the one hand and genome sequencing approaches on the other have generated new hopes in vaccine development. Computational genome mining for new vaccine candidates is known as reverse vaccinology and is believed to further extend the current list of Leishmania vaccine candidates. Reverse vaccinology can also reduce the intrinsic risks associated with live attenuated vaccines. Individual epitopes arranged in tandem as polytopes are also a possible outcome of reverse genome mining. Here, we will briefly compare reverse vaccinology with conventional vaccinology in respect to Leishmania vaccine, and we will discuss how it influences the aforementioned topics. We will also introduce new in vivo models that will bridge the gap between human and laboratory animal models in future studies. PMID:27092123

  6. Land use, water and Mediterranean landscapes: modelling long-term dynamics of complex socio-ecological systems.

    PubMed

    Barton, C Michael; Ullah, Isaac I; Bergin, Sean

    2010-11-28

    The evolution of Mediterranean landscapes during the Holocene has been increasingly governed by the complex interactions of water and human land use. Different land-use practices change the amount of water flowing across the surface and infiltrating the soil, and change water's ability to move surface sediments. Conversely, water amplifies the impacts of human land use and extends the ecological footprint of human activities far beyond the borders of towns and fields. Advances in computational modelling offer new tools to study the complex feedbacks between land use, land cover, topography and surface water. The Mediterranean Landscape Dynamics project (MedLand) is building a modelling laboratory where experiments can be carried out on the long-term impacts of agropastoral land use, and whose results can be tested against the archaeological record. These computational experiments are providing new insights into the socio-ecological consequences of human decisions at varying temporal and spatial scales.

  7. Hypersonic simulations using open-source CFD and DSMC solvers

    NASA Astrophysics Data System (ADS)

    Casseau, V.; Scanlon, T. J.; John, B.; Emerson, D. R.; Brown, R. E.

    2016-11-01

    Hypersonic hybrid hydrodynamic-molecular gas flow solvers must satisfy the two essential requirements of any high-speed reacting code: physical accuracy and computational efficiency. The James Weir Fluids Laboratory at the University of Strathclyde is currently developing an open-source hybrid code which will eventually reconcile the direct simulation Monte-Carlo method, making use of the OpenFOAM application called dsmcFoam, and the newly coded open-source two-temperature computational fluid dynamics solver named hy2Foam. In conjunction with employing the CVDV chemistry-vibration model in hy2Foam, novel use is made of the QK rates in a CFD solver. In this paper, further testing is performed, in particular with the CFD solver, to ensure its efficacy before considering more advanced test cases. The hy2Foam and dsmcFoam codes have been shown to compare reasonably well, thus providing a useful basis for other codes to compare against.
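
    For context, two-temperature solvers of this kind typically close the vibrational energy equation with a Landau-Teller relaxation term (whether hy2Foam uses exactly this source term should be checked against its documentation; the form below is the generic one):

        \frac{D e_v}{D t} = \frac{e_v^{*}(T_{tr}) - e_v}{\tau_v},

    where e_v is the vibrational energy, e_v^{*}(T_{tr}) its equilibrium value at the trans-rotational temperature, and tau_v a relaxation time, commonly evaluated from the Millikan-White correlation with Park's high-temperature correction.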

  8. A DAFT DL_POLY distributed memory adaptation of the Smoothed Particle Mesh Ewald method

    NASA Astrophysics Data System (ADS)

    Bush, I. J.; Todorov, I. T.; Smith, W.

    2006-09-01

    The Smoothed Particle Mesh Ewald method [U. Essmann, L. Perera, M.L. Berkowitz, T. Darden, H. Lee, L.G. Pedersen, J. Chem. Phys. 103 (1995) 8577] for calculating long ranged forces in molecular simulation has been adapted for the parallel molecular dynamics code DL_POLY_3 [I.T. Todorov, W. Smith, Philos. Trans. Roy. Soc. London 362 (2004) 1835], making use of a novel 3D Fast Fourier Transform (DAFT) [I.J. Bush, The Daresbury Advanced Fourier transform, Daresbury Laboratory, 1999] that perfectly matches the Domain Decomposition (DD) parallelisation strategy [W. Smith, Comput. Phys. Comm. 62 (1991) 229; M.R.S. Pinches, D. Tildesley, W. Smith, Mol. Sim. 6 (1991) 51; D. Rapaport, Comput. Phys. Comm. 62 (1991) 217] of the DL_POLY_3 code. In this article we describe software adaptations undertaken to import this functionality and provide a review of its performance.
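
    A minimal sketch of the reciprocal-space machinery involved. It uses nearest-grid-point charge assignment rather than the cardinal B-spline interpolation that defines true SPME, but the FFT-based convolution it performs is exactly the structure that a distributed 3D FFT such as DAFT parallelises; all parameters are illustrative.

        import numpy as np

        def pm_ewald_recip_energy(pos, q, L, n=32, alpha=0.3):
            # Spread point charges onto an n^3 mesh (nearest grid point).
            rho = np.zeros((n, n, n))
            idx = np.floor(pos / L * n).astype(int) % n
            for (i, j, k), qi in zip(idx, q):
                rho[i, j, k] += qi
            rho_k = np.fft.fftn(rho)
            kf = 2.0 * np.pi * np.fft.fftfreq(n, d=L / n)
            kx, ky, kz = np.meshgrid(kf, kf, kf, indexing="ij")
            k2 = kx**2 + ky**2 + kz**2
            k2[0, 0, 0] = np.inf                  # drop the k = 0 term
            green = 4.0 * np.pi * np.exp(-k2 / (4.0 * alpha**2)) / k2
            # Reciprocal-space energy, Gaussian units:
            # E = (1/2V) sum_k G(k) |rho(k)|^2
            return (np.abs(rho_k) ** 2 * green).sum() / (2.0 * L**3)

        # A +1/-1 charge pair in a periodic box (arbitrary units).
        pos = np.array([[1.0, 1.0, 1.0], [2.5, 1.0, 1.0]])
        q = np.array([1.0, -1.0])
        print(pm_ewald_recip_energy(pos, q, L=10.0))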

  9. Large-eddy simulation of a boundary layer with concave streamwise curvature

    NASA Technical Reports Server (NTRS)

    Lund, Thomas S.

    1994-01-01

    Turbulence modeling continues to be one of the most difficult problems in fluid mechanics. Existing prediction methods are well developed for certain classes of simple equilibrium flows, but are still not entirely satisfactory for a large category of complex non-equilibrium flows found in engineering practice. Direct and large-eddy simulation (LES) approaches have long been believed to have great potential for the accurate prediction of difficult turbulent flows, but the associated computational cost has been prohibitive for practical problems. This remains true for direct simulation but is no longer clear for large-eddy simulation. Advances in computer hardware, numerical methods, and subgrid-scale modeling have made it possible to conduct LES for flows of practical interest at Reynolds numbers in the range of laboratory experiments. The objective of this work is to apply LES and the dynamic subgrid-scale model to the flow of a boundary layer over a concave surface.
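
    The dynamic subgrid-scale model referred to here is, in its standard form, the Smagorinsky closure with a coefficient computed on the fly from the Germano identity in Lilly's least-squares form (sign conventions vary between authors):

        \tau_{ij} - \tfrac{1}{3}\delta_{ij}\tau_{kk} = -2\,C\,\bar{\Delta}^{2}\,|\bar{S}|\,\bar{S}_{ij},
        \qquad |\bar{S}| = \left(2\,\bar{S}_{ij}\bar{S}_{ij}\right)^{1/2},

        C = \frac{\langle L_{ij} M_{ij} \rangle}{\langle M_{ij} M_{ij} \rangle},
        \qquad L_{ij} = \widehat{\bar{u}_i \bar{u}_j} - \hat{\bar{u}}_i \hat{\bar{u}}_j,
        \qquad M_{ij} = 2\bar{\Delta}^{2}\left(\widehat{|\bar{S}|\bar{S}_{ij}} - 4\,|\hat{\bar{S}}|\,\hat{\bar{S}}_{ij}\right),

    with hats denoting a test filter of width twice the grid filter and the angle brackets an average over statistically homogeneous directions (for this boundary layer, the spanwise direction).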

  10. Scalar transport across the turbulent/non-turbulent interface in jets: Schmidt number effects

    NASA Astrophysics Data System (ADS)

    Silva, Tiago S.; B. da Silva, Carlos; Idmec Team

    2016-11-01

    The dynamics of a passive scalar field near a turbulent/non-turbulent interface (TNTI) is analysed through direct numerical simulations (DNS) of turbulent planar jets, with Reynolds numbers in the range 142 ≤ Reλ ≤ 246 and Schmidt numbers 0.07 ≤ Sc ≤ 7. The steepness of the scalar gradient, as observed from conditional profiles near the TNTI, increases with the Schmidt number. Conditional scalar gradient budgets show that for low and moderate Schmidt numbers a diffusive superlayer emerges at the TNTI, where scalar gradient diffusion dominates while production is negligible. For low Schmidt numbers the growth of the turbulent front is governed by molecular diffusion, whereas scalar gradient convection is negligible. The authors acknowledge the Laboratory for Advanced Computing at the University of Coimbra (http://www.lca.uc.pt) for providing the HPC, computing, and consulting resources that contributed to the research results reported in this paper.
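
    The budget underlying those conditional profiles follows from the advection-diffusion equation for the passive scalar c: differentiating \partial_t c + u_j \partial_j c = D\nabla^2 c gives a transport equation for the gradient G_i = \partial c/\partial x_i,

        \frac{\partial G_i}{\partial t} + u_j \frac{\partial G_i}{\partial x_j}
        = -\,G_j \frac{\partial u_j}{\partial x_i} + D\,\nabla^{2} G_i,

    whose right-hand side separates production by velocity gradients from molecular diffusion, with D = \nu/Sc; the diffusive superlayer is the region where the second term dominates the first.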

  11. Patch planting of hard spin-glass problems: Getting ready for the next generation of optimization approaches

    NASA Astrophysics Data System (ADS)

    Wang, Wenlong; Mandrà, Salvatore; Katzgraber, Helmut

    We propose a patch planting heuristic that allows us to create arbitrarily large Ising spin-glass instances on any topology and with any type of disorder, and where the exact ground-state energy of the problem is known by construction. By breaking up the problem into patches that can be treated either with exact or heuristic solvers, we can reconstruct the optimum of the original, considerably larger, problem. The scaling of the computational complexity of these instances with various patch numbers and sizes is investigated and compared with random instances using population annealing Monte Carlo and quantum annealing on the D-Wave 2X quantum annealer. The method can be useful for benchmarking novel computing technologies and algorithms. This work was supported by NSF Grant No. DMR-1208046 and by the Office of the Director of National Intelligence (ODNI), Intelligence Advanced Research Projects Activity (IARPA), via MIT Lincoln Laboratory Air Force Contract No. FA8721-05-C-0002.
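
    A minimal one-dimensional sketch of the construction as we read it from the abstract: solve each patch exactly, then join patches with couplings whose signs the planted patch configurations satisfy, so the concatenated state attains the sum of the patch minima plus fully satisfied links and is therefore a global ground state by construction. Sizes and couplings below are illustrative.

        import itertools
        import numpy as np

        rng = np.random.default_rng(1)

        def patch_ground_state(n_spins, bonds):
            # Exhaustive ground-state search over a small patch; bonds maps
            # (i, j) index pairs to couplings J_ij, H = -sum J_ij s_i s_j.
            best_e, best_s = np.inf, None
            for s in itertools.product((-1, 1), repeat=n_spins):
                e = -sum(J * s[i] * s[j] for (i, j), J in bonds.items())
                if e < best_e:
                    best_e, best_s = e, s
            return best_e, np.array(best_s)

        # Two patches of a +-J chain, 8 spins each, joined by one link.
        n = 8
        patches = []
        for _ in range(2):
            bonds = {(i, i + 1): float(rng.choice((-1, 1))) for i in range(n - 1)}
            patches.append((bonds,) + patch_ground_state(n, bonds))

        (bonds_a, e_a, s_a), (bonds_b, e_b, s_b) = patches

        # Choose the inter-patch coupling so the planted state satisfies it:
        # with sign(J_link) = s_a[-1] * s_b[0] the link contributes -|J_link|,
        # its minimum, so the concatenated state is a global ground state.
        J_link = float(s_a[-1] * s_b[0])
        print("planted ground-state energy:", e_a + e_b - abs(J_link))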

  12. The computational structural mechanics testbed architecture. Volume 5: The Input-Output Manager DMGASP

    NASA Technical Reports Server (NTRS)

    Felippa, Carlos A.

    1989-01-01

    This is the fifth of a set of five volumes which describe the software architecture for the Computational Structural Mechanics Testbed. Derived from NICE, an integrated software system developed at Lockheed Palo Alto Research Laboratory, the architecture is composed of the command language (CLAMP), the command language interpreter (CLIP), and the data manager (GAL). Volumes 1, 2, and 3 (NASA CR's 178384, 178385, and 178386, respectively) describe CLAMP and CLIP and the CLIP-processor interface. Volumes 4 and 5 (NASA CR's 178387 and 178388, respectively) describe GAL and its low-level I/O. CLAMP, an acronym for Command Language for Applied Mechanics Processors, is designed to control the flow of execution of processors written for NICE. Volume 5 describes the low-level data management component of the NICE software. It is intended only for advanced programmers involved in maintenance of the software.

  13. FOREWORD: 3rd International Workshop on New Computational Methods for Inverse Problems (NCMIP 2013)

    NASA Astrophysics Data System (ADS)

    Blanc-Féraud, Laure; Joubert, Pierre-Yves

    2013-10-01

    This volume of Journal of Physics: Conference Series is dedicated to the scientific contributions presented during the 3rd International Workshop on New Computational Methods for Inverse Problems, NCMIP 2013 (http://www.farman.ens-cachan.fr/NCMIP_2013.html). This workshop took place at Ecole Normale Supérieure de Cachan, in Cachan, France, on 22 May 2013, at the initiative of Institut Farman. The prior editions of NCMIP also took place in Cachan, France, firstly within the scope of the ValueTools Conference, in May 2011 (http://www.ncmip.org/2011/), and secondly at the initiative of Institut Farman, in May 2012 (http://www.farman.ens-cachan.fr/NCMIP_2012.html). The NCMIP Workshop focused on recent advances in the resolution of inverse problems. Indeed inverse problems appear in numerous scientific areas such as geophysics, biological and medical imaging, material and structure characterization, electrical, mechanical and civil engineering, and finance. The resolution of inverse problems consists of estimating the parameters of the observed system or structure from data collected by an instrumental sensing or imaging device. Its success firstly requires the collection of relevant observation data. It also requires accurate models describing the physical interactions between the instrumental device and the observed system, as well as the intrinsic properties of the solution itself. Finally, it requires the design of robust, accurate and efficient inversion algorithms. Advanced sensor arrays and imaging devices provide high rate and high volume data; in this context, the efficient resolution of the inverse problem requires the joint development of new models and inversion methods, taking computational and implementation aspects into account. During this one-day workshop, researchers had the opportunity to bring to light and share new techniques and results in the field of inverse problems. The topics of the workshop were: algorithms and computational aspects of inversion, Bayesian estimation, kernel methods, learning methods, convex optimization, free discontinuity problems, metamodels, proper orthogonal decomposition, reduced models for the inversion, non-linear inverse scattering, image reconstruction and restoration, and applications (bio-medical imaging, non-destructive evaluation...). NCMIP 2013 was a one-day workshop held in May 2013 which attracted around 60 attendees. Each of the submitted papers has been reviewed by three reviewers. Among the accepted papers, there are seven oral presentations, five posters and one invited poster (On a deconvolution challenge presented by C Vonesch from EPFL, Switzerland). In addition, three international speakers were invited to present a longer talk. The workshop was supported by Institut Farman (ENS Cachan, CNRS) and endorsed by the following French research networks (GDR ISIS, GDR Ondes, GDR MOA, GDR MSPC). The program committee acknowledges the following research laboratories: CMLA, LMT, LSV, LURPA, SATIE.
Workshop co-chairs: Laure Blanc-Féraud, I3S laboratory and INRIA Nice Sophia-Antipolis, France; Pierre-Yves Joubert, IEF, Paris-Sud University, CNRS, France.
Technical program committee: Gilles Aubert, J-A Dieudonné Laboratory, CNRS and University of Nice-Sophia Antipolis, France; Nabil Anwer, LURPA, ENS Cachan, France; Alexandre Baussard, ENSTA Bretagne, Lab-STICC, France; Marc Bonnet, ENSTA, ParisTech, France; Antonin Chambolle, CMAP, Ecole Polytechnique, CNRS, France; Oliver Dorn, School of Mathematics, University of Manchester, UK; Cécile Durieu, SATIE, ENS Cachan, CNRS, France; Gérard Favier, I3S Laboratory, University of Nice Sophia-Antipolis, France; Mário Figueiredo, Instituto Superior Técnico, Lisbon, Portugal; Laurent Fribourg, LSV, ENS Cachan, CNRS, France; Marc Lambert, L2S Laboratory, CNRS, SupElec, Paris-Sud University, France; Dominique Lesselier, L2S Laboratory, CNRS, SupElec, Paris-Sud University, France; Matteo Pastorino, DIBE, University of Genoa, Italy; Christian Rey, LMT, ENS Cachan, CNRS, France; Simon Setzer, Saarland University, Germany; Cedric Vonesch, EPFL, Switzerland.
Local chairs: Sophie Abriet, SATIE Laboratory, ENS Cachan, France; Béatrice Bacquet, SATIE Laboratory, ENS Cachan, France; Lydia Matijevic, LMT Laboratory, ENS Cachan, France.
Invited speakers: Jérôme Idier, IRCCyN (UMR CNRS 6597), Ecole Centrale de Nantes, France; Massimo Fornasier, Faculty of Mathematics, Technical University of Munich, Germany; Matthias Fink, Institut Langevin, ESPCI, Université Paris Diderot, France.

  14. Operator Station Design System - A computer aided design approach to work station layout

    NASA Technical Reports Server (NTRS)

    Lewis, J. L.

    1979-01-01

    The Operator Station Design System is resident in NASA's Johnson Space Center Spacecraft Design Division Performance Laboratory. It includes stand-alone minicomputer hardware and Panel Layout Automated Interactive Design and Crew Station Assessment of Reach software. The data base consists of the Shuttle Transportation System Orbiter Crew Compartment (in part), the Orbiter payload bay and remote manipulator (in part), and various anthropometric populations. The system is utilized to provide panel layouts, assess reach and vision, determine interference and fit problems early in the design phase, study design applications as a function of anthropometric and mission requirements, and to accomplish conceptual design to support advanced study efforts.

  15. Engineering studies of vectorcardiographs in blood pressure measuring systems, appendix 1

    NASA Technical Reports Server (NTRS)

    Mark, R. G.

    1975-01-01

    A small, portable, relatively inexpensive computer system was developed for on-line use in clinical or laboratory situations. The system features an integrated hardware-software package that permits use of all peripherals, such as analog-to-digital converter, oscilloscope, plotter, digital bus, with an interpreter constructed around the BASIC programming language. The system is conceptually similar to the LINC system developed in 1962, but is more compact and powerful due to intervening advances in integrated circuit technology. A description of the hardware of the system was given. A reference manual, user manual, and programming guides were also presented. Finally, a stereo display system for vectorcardiograms was described.

  16. Microgravity

    NASA Image and Video Library

    1999-11-10

    Space Vacuum Epitaxy Center works with industry and government laboratories to develop advanced thin film materials and devices by utilizing the most abundant free resource in orbit: the vacuum of space. SVEC, along with its affiliates, is developing semiconductor mid-IR lasers for environmental sensing and defense applications, high efficiency solar cells for space satellite applications, oxide thin films for computer memory applications, and ultra-hard thin film coatings for wear resistance in micro devices. Performance of these vacuum deposited thin film materials and devices can be enhanced by using the ultra-vacuum of space for which SVEC has developed the Wake Shield Facility---a free flying research platform dedicated to thin film materials development in space.

  17. Microgravity

    NASA Image and Video Library

    2000-11-10

    Space Vacuum Epitaxy Center works with industry and government laboratories to develop advanced thin film materials and devices by utilizing the most abundant free resource in orbit: the vacuum of space. SVEC, along with its affiliates, is developing semiconductor mid-IR lasers for environmental sensing and defense applications, high efficiency solar cells for space satellite applications, oxide thin films for computer memory applications, and ultra-hard thin film coatings for wear resistance in micro devices. Performance of these vacuum deposited thin film materials and devices can be enhanced by using the ultra-vacuum of space for which SVEC has developed the Wake Shield Facility---a free flying research platform dedicated to thin film materials development in space.

  18. Assessment of the MHD capability in the ATHENA code using data from the ALEX facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roth, P.A.

    1989-03-01

    The ATHENA (Advanced Thermal Hydraulic Energy Network Analyzer) code is a system transient analysis code with multi-loop, multi-fluid capabilities, which is available to the fusion community at the National Magnetic Fusion Energy Computing Center (NMFECC). The work reported here assesses the ATHENA magnetohydrodynamic (MHD) pressure drop model for liquid metals flowing through a strong magnetic field. An ATHENA model was developed for two simple-geometry, adiabatic test sections used in the Argonne Liquid Metal Experiment (ALEX) at Argonne National Laboratory (ANL). The pressure drops calculated by ATHENA agreed well with the experimental results from the ALEX facility.
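
    For orientation, the quantity such a model must reproduce is the MHD pressure gradient of fully developed laminar duct flow at high Hartmann number. For thin conducting walls, a form commonly used in fusion-blanket analyses (presented here as the generic textbook result, not a statement of ATHENA's internal correlation) is

        -\frac{dp}{dx} \;\approx\; \sigma\,u\,B^{2}\,\frac{c_w}{1 + c_w},
        \qquad c_w = \frac{\sigma_w t_w}{\sigma\,a},
        \qquad Ha = B\,a\sqrt{\frac{\sigma}{\rho\nu}},

    where sigma and sigma_w are the fluid and wall electrical conductivities, u the mean velocity, B the field strength, t_w the wall thickness, and a the duct half-width normal to the field; the expression assumes Ha >> 1 and c_w << 1.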

  19. Automated documentation generator for advanced protein crystal growth

    NASA Technical Reports Server (NTRS)

    Maddux, Gary A.; Provancha, Anna; Chattam, David; Ford, Ronald

    1993-01-01

    The System Management and Production Laboratory at the Research Institute, the University of Alabama in Huntsville (UAH), was tasked by the Microgravity Experiment Projects (MEP) Office of the Payload Projects Office (PPO) at Marshall Space Flight Center (MSFC) to conduct research in the current methods of written documentation control and retrieval. The goals of this research were to determine the logical interrelationships within selected NASA documentation, and to expand on a previously developed prototype system to deliver a distributable, electronic knowledge-based system. This computer application would then be used to provide a paperless interface between the appropriate parties for the required NASA document.

  20. Computer Security Awareness Guide for Department of Energy Laboratories, Government Agencies, and others for use with Lawrence Livermore National Laboratory`s (LLNL): Computer security short subjects videos

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    Lonnie Moore, the Computer Security Manager, CSSM/CPPM at Lawrence Livermore National Laboratory (LLNL) and Gale Warshawsky, the Coordinator for Computer Security Education & Awareness at LLNL, wanted to share topics such as computer ethics, software piracy, privacy issues, and protecting information in a format that would capture and hold an audience's attention. Four Computer Security Short Subject videos were produced which ranged from 1-3 minutes each. These videos are very effective education and awareness tools that can be used to generate discussions about computer security concerns and good computing practices. Leaders may incorporate the Short Subjects into presentations. After talking about a subject area, one of the Short Subjects may be shown to highlight that subject matter. Another method for sharing them could be to show a Short Subject first and then lead a discussion about its topic. The cast of characters and a bit of information about their personalities in the LLNL Computer Security Short Subjects is included in this report.

  1. Reproducibility of risk figures in 2nd-trimester maternal serum screening for down syndrome: comparison of 2 laboratories.

    PubMed

    Benn, Peter A; Makowski, Gregory S; Egan, James F X; Wright, Dave

    2006-11-01

    Analytical error affects 2nd-trimester maternal serum screening for Down syndrome risk estimation. We analyzed the between-laboratory reproducibility of risk estimates from 2 laboratories. Laboratory 1 used Bayer ACS180 immunoassays for alpha-fetoprotein (AFP) and human chorionic gonadotropin (hCG), Diagnostic Systems Laboratories (DSL) RIA for unconjugated estriol (uE3), and DSL enzyme immunoassay for inhibin-A (INH-A). Laboratory 2 used Beckman immunoassays for AFP, hCG, and uE3, and DSL enzyme immunoassay for INH-A. Analyte medians were separately established for each laboratory. We used the same computational algorithm for all risk calculations, and we used Monte Carlo methods for computer modeling. For 462 samples tested, risk figures from the 2 laboratories differed >2-fold for 44.7%, >5-fold for 7.1%, and >10-fold for 1.7%. Between-laboratory differences in analytes were greatest for uE3 and INH-A. The screen-positive rates were 9.3% for laboratory 1 and 11.5% for laboratory 2, with a significant difference in the patients identified as screen-positive vs screen-negative (McNemar test, P<0.001). Computer modeling confirmed the large between-laboratory risk differences. Differences in performance of assays and laboratory procedures can have a large effect on patient-specific risks. Screening laboratories should minimize test imprecision and ensure that each assay performs in a manner similar to that assumed in the risk computational algorithm.
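
    A hedged sketch of the kind of Monte Carlo comparison the study describes, reduced to a single analyte: the distribution parameters, prior, and analytical error below are hypothetical placeholders, not the published values or the authors' algorithm.

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical single-analyte model in log10 MoM space.
        mu_unaff, mu_aff, sigma = 0.0, 0.25, 0.20
        prior = 1.0 / 700.0        # illustrative age-related prior risk

        def npdf(x, mu, s):
            return np.exp(-0.5 * ((x - mu) / s) ** 2) / (s * np.sqrt(2 * np.pi))

        def risk(log_mom):
            # Likelihood ratio of affected vs unaffected densities,
            # converted from prior odds to a posterior risk.
            lr = npdf(log_mom, mu_aff, sigma) / npdf(log_mom, mu_unaff, sigma)
            odds = lr * prior / (1.0 - prior)
            return odds / (1.0 + odds)

        # Two labs assay the same specimens with independent analytical error.
        true_log_mom = rng.normal(mu_unaff, sigma, 100_000)
        sd_assay = 0.05            # assumed between-lab analytical SD (log10)
        r1 = risk(true_log_mom + rng.normal(0, sd_assay, true_log_mom.size))
        r2 = risk(true_log_mom + rng.normal(0, sd_assay, true_log_mom.size))

        fold = np.maximum(r1, r2) / np.minimum(r1, r2)
        print("specimens with >2-fold risk disagreement:", (fold > 2).mean())

    Shrinking sd_assay shrinks the disagreement fraction, which is the paper's point about minimizing test imprecision.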

  2. The Workstation Approach to Laboratory Computing

    PubMed Central

    Crosby, P.A.; Malachowski, G.C.; Hall, B.R.; Stevens, V.; Gunn, B.J.; Hudson, S.; Schlosser, D.

    1985-01-01

    There is a need for a Laboratory Workstation which specifically addresses the problems associated with computing in the scientific laboratory. A workstation based on the IBM PC architecture and including a front end data acquisition system which communicates with a host computer via a high speed communications link; a new graphics display controller with hardware window management and window scrolling; and an integrated software package is described.

  3. A computational model of the human hand 93-ERI-053

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hollerbach, K.; Axelrod, T.

    1996-03-01

    The objectives of the Computational Hand Modeling project were to prove the feasibility of applying the Laboratory's NIKE3D finite element code to orthopaedic problems. Because of the great complexity of anatomical structures and the nonlinearity of their behavior, we have focused on a subset of joints of the hand and lower extremity and have developed algorithms to model their behavior. The algorithms developed here solve fundamental problems in computational biomechanics and can be expanded to describe any other joints of the human body. This kind of computational modeling has never successfully been attempted before, due in part to a lack of biomaterials data and a lack of computational resources. With the computational resources available at the National Laboratories and the collaborative relationships we have established with experimental and other modeling laboratories, we have been in a position to pursue our innovative approach to biomechanical and orthopedic modeling.

  4. Earth System Grid II (ESG): Turning Climate Model Datasets Into Community Resources

    NASA Astrophysics Data System (ADS)

    Williams, D.; Middleton, D.; Foster, I.; Nevedova, V.; Kesselman, C.; Chervenak, A.; Bharathi, S.; Drach, B.; Cinquni, L.; Brown, D.; Strand, G.; Fox, P.; Garcia, J.; Bernholdte, D.; Chanchio, K.; Pouchard, L.; Chen, M.; Shoshani, A.; Sim, A.

    2003-12-01

    High-resolution, long-duration simulations performed with advanced DOE SciDAC/NCAR climate models will produce tens of petabytes of output. To be useful, this output must be made available to global change impacts researchers nationwide, both at national laboratories and at universities, other research laboratories, and other institutions. To this end, we propose to create a new Earth System Grid, ESG-II - a virtual collaborative environment that links distributed centers, users, models, and data. ESG-II will provide scientists with virtual proximity to the distributed data and resources that they require to perform their research. The creation of this environment will significantly increase the scientific productivity of U.S. climate researchers by turning climate datasets into community resources. In creating ESG-II, we will integrate and extend a range of Grid and collaboratory technologies, including the DODS remote access protocols for environmental data, Globus Toolkit technologies for authentication, resource discovery, and resource access, and Data Grid technologies developed in other projects. We will develop new technologies for (1) creating and operating "filtering servers" capable of performing sophisticated analyses, and (2) delivering results to users. In so doing, we will simultaneously contribute to climate science and advance the state of the art in collaboratory technology. We expect our results to be useful to numerous other DOE projects. The three-year R&D program will be undertaken by a talented and experienced team of computer scientists at five laboratories (ANL, LBNL, LLNL, NCAR, ORNL) and one university (ISI), working in close collaboration with climate scientists at several sites.

  5. Strengthening laboratory systems in resource-limited settings.

    PubMed

    Olmsted, Stuart S; Moore, Melinda; Meili, Robin C; Duber, Herbert C; Wasserman, Jeffrey; Sama, Preethi; Mundell, Ben; Hilborne, Lee H

    2010-09-01

    Considerable resources have been invested in recent years to improve laboratory systems in resource-limited settings. We reviewed published reports, interviewed major donor organizations, and conducted case studies of laboratory systems in 3 countries to assess how countries and donors have worked together to improve laboratory services. While infrastructure and the provision of services have seen improvement, important opportunities remain for further advancement. Implementation of national laboratory plans is inconsistent, human resources are limited, and quality laboratory services rarely extend to lower tier laboratories (eg, health clinics, district hospitals). Coordination within, between, and among governments and donor organizations is also frequently problematic. Laboratory standardization and quality control are improving but remain challenging, making accreditation a difficult goal. Host country governments and their external funding partners should coordinate their efforts effectively around a host country's own national laboratory plan to advance sustainable capacity development throughout a country's laboratory system.

  6. Custom electronic subsystems for the laboratory telerobotic manipulator

    NASA Technical Reports Server (NTRS)

    Glassell, R. L.; Butler, P. L.; Rowe, J. C.; Zimmermann, S. D.

    1990-01-01

    The National Aeronautics and Space Administration (NASA) Space Station Program presents new opportunities for the application of telerobotic and robotic systems. The Laboratory Telerobotic Manipulator (LTM) is a highly advanced 7 degrees-of-freedom (DOF) telerobotic/robotic manipulator. It was developed and built for the Automation Technology Branch at NASA's Langley Research Center (LaRC) for work in research and to demonstrate ground-based telerobotic manipulator system hardware and software systems for future NASA applications in the hazardous environment of space. The LTM manipulator uses an embedded wiring design with all electronics, motor power, and control and communication cables passing through the pitch-yaw differential joints. This design requires the number of cables passing through the pitch/yaw joint to be kept to a minimum. To eliminate the cables needed to carry each pitch-yaw joint's sensor data to the VME control computers, a custom-embedded electronics package for each manipulator joint was developed. The electronics package collects and sends the joint's sensor data to the VME control computers over a fiber optic cable. The electronics package consists of five individual subsystems: the VME Link Processor, the Joint Processor, and the Joint Processor power supply in the joint module, the fiber optics communications system, and the electronics and motor power cabling.

  7. DOE planning workshop advanced biomedical technology initiative

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1994-06-01

    The Department of Energy has made major contributions in the biomedical sciences with programs in medical applications and instrumentation development, molecular biology, human genome, and computational sciences. In an effort to help determine DOE's role in applying these capabilities to the nation's health care needs, a planning workshop was held on January 11-12, 1994. The workshop was co-sponsored by the Department's Office of Energy Research and Defense Programs organizations. Participants represented industry, medical research institutions, national laboratories, and several government agencies. They attempted to define the needs of the health care industry, identify DOE laboratory capabilities that address these needs, and determine how DOE, in cooperation with other team members, could begin an initiative with the goals of reducing health care costs while improving the quality of health care delivery through the proper application of technology and computational systems. This document is a report of that workshop. Seven major technology development thrust areas were considered. Each involves development of various aspects of imaging, optical, sensor, and data processing and storage technologies. The thrust areas as prioritized for DOE are: (1) Minimally Invasive Procedures; (2) Technologies for Individual Self Care; (3) Outcomes Research; (4) Telemedicine; (5) Decision Support Systems; (6) Assistive Technology; (7) Prevention and Education.

  8. 2016 Final Reports from the Los Alamos National Laboratory Computational Physics Student Summer Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Runnels, Scott Robert; Bachrach, Harrison Ian; Carlson, Nils

    The two primary purposes of LANL's Computational Physics Student Summer Workshop are (1) to educate graduate and exceptional undergraduate students in the challenges and applications of computational physics of interest to LANL, and (2) to entice their interest toward those challenges. Computational physics is emerging as a discipline in its own right, combining expertise in mathematics, physics, and computer science. The mathematical aspects focus on numerical methods for solving equations on the computer as well as developing test problems with analytical solutions. The physics aspects are very broad, ranging from low-temperature material modeling to extremely high temperature plasma physics, radiation transport, and neutron transport. The computer science issues are concerned with matching numerical algorithms to emerging architectures and maintaining the quality of extremely large codes built to perform multi-physics calculations. Although graduate programs associated with computational physics are emerging, it is apparent that the pool of U.S. citizens in this multi-disciplinary field is relatively small and is typically not focused on the aspects that are of primary interest to LANL. Furthermore, more structured foundations for LANL interaction with universities in computational physics are needed; historically, interactions rely heavily on individuals' personalities and personal contacts. Thus a tertiary purpose of the Summer Workshop is to build an educational network of LANL researchers, university professors, and emerging students to advance the field and LANL's involvement in it.

  9. The Effect of Birthrate Granularity on the Release- to- Birth Ratio for the AGR-1 In-core Experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dawn Scates; John Walter

    The AGR-1 Advanced Gas Reactor (AGR) tristructural-isotropic-particle fuel experiment underwent 13 irradiation intervals from December 2006 until November 2009 within the Idaho National Laboratory Advanced Test Reactor in support of the Next Generation Nuclear Power Plant program. During this multi-year experiment, release-to-birth rate ratios were computed at the end of each operating interval to provide information about fuel performance. Fission products released during irradiation were tracked daily by the Fission Product Monitoring System using 8-hour measurements. Birth rates calculated by MCNP with ORIGEN for as-run conditions were computed at the end of each irradiation interval. Each time step in MCNP provided neutron flux, reaction rates, and AGR-1 compact composition, which were used to determine birth rates using ORIGEN. The initial birth-rate data, consisting of four values for each irradiation interval at the beginning, end, and two intermediate times, were interpolated to obtain values for each 8-hour activity. The problem with this method is that any daily changes in heat rates or perturbations, such as shim control movement or core/lobe power fluctuations, would not be reflected in the interpolated data and a true picture of the system would not be presented. At the conclusion of the AGR-1 experiment, great efforts were put forth to compute daily birthrates, which were reprocessed with the 8-hour release activity. The results of this study are presented in this paper.
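
    The granularity issue is easy to see in miniature. The sketch below interpolates four per-interval birth-rate anchor points onto an 8-hour grid; the report does not state the interpolation scheme, so linear is assumed here, and all numbers are hypothetical.

        import numpy as np

        # Hypothetical birth rates (atoms/s) at the four anchor times of one
        # irradiation interval: start, two intermediate points, end (days).
        t_known = np.array([0.0, 15.0, 30.0, 45.0])
        b_known = np.array([1.00e9, 1.05e9, 1.02e9, 0.98e9])

        # 8-hour grid matching the Fission Product Monitoring System cadence.
        t_8h = np.arange(0.0, 45.0 + 1e-9, 8.0 / 24.0)
        b_8h = np.interp(t_8h, t_known, b_known)

        # Any shim movement or lobe-power swing between anchor points is
        # invisible here -- the motivation for recomputing daily birthrates.
        print(len(t_8h), "interpolated points;", b_8h.min(), "to", b_8h.max())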

  10. The effect of birthrate granularity on the release-to-birth ratio for the AGR-1 in-core experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    D. M. Scates; J. B. Walter; J. T. Maki

    The AGR-1 Advanced Gas Reactor (AGR) tristructural-isotropic-particle fuel experiment underwent 13 irradiation intervals from December 2006 until November 2009 within the Idaho National Laboratory Advanced Test Reactor in support of the Next Generation Nuclear Power Plant program. During this multi-year experiment, release-to-birth rate ratios were computed at the end of each operating interval to provide information about fuel performance. Fission products released during irradiation were tracked daily by the Fission Product Monitoring System using 8-h measurements. Birth rates calculated by MCNP with ORIGEN for as-run conditions were computed at the end of each irradiation interval. Each time step in MCNP provided neutron flux, reaction rates, and AGR-1 compact composition, which were used to determine birth rates using ORIGEN. The initial birth-rate data, consisting of four values for each irradiation interval at the beginning, end, and two intermediate times, were interpolated to obtain values for each 8-h activity. The problem with this method is that any daily changes in heat rates or perturbations, such as shim control movement or core/lobe power fluctuations, would not be reflected in the interpolated data and a true picture of the system would not be presented. At the conclusion of the AGR-1 experiment, great efforts were put forth to compute daily birthrates, which were reprocessed with the 8-h release activity. The results of this study are presented in this paper.

  11. The Petascale Data Storage Institute

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gibson, Garth; Long, Darrell; Honeyman, Peter

    2013-07-01

    Petascale computing infrastructures for scientific discovery make petascale demands on information storage capacity, performance, concurrency, reliability, availability, and manageability. The Petascale Data Storage Institute focuses on the data storage problems found in petascale scientific computing environments, with special attention to community issues such as interoperability, community buy-in, and shared tools. The Petascale Data Storage Institute is a collaboration between researchers at Carnegie Mellon University, National Energy Research Scientific Computing Center, Pacific Northwest National Laboratory, Oak Ridge National Laboratory, Sandia National Laboratory, Los Alamos National Laboratory, University of Michigan, and the University of California at Santa Cruz.

  12. Student perceptions and learning outcomes of computer-assisted versus traditional instruction in physiology.

    PubMed

    Richardson, D

    1997-12-01

    This study compared student perceptions and learning outcomes of computer-assisted instruction against those of traditional didactic lectures. Components of Quantitative Circulatory Physiology (Biological Simulators) and Mechanical Properties of Active Muscle (Trinity Software) were used to teach regulation of tissue blood flow and muscle mechanics, respectively, in the course Medical Physiology. These topics were each taught, in part, by 1) standard didactic lectures, 2) computer-assisted lectures, and 3) computer laboratory assignment. Subjective evaluation was derived from a questionnaire assessing student opinions of the effectiveness of each method. Objective evaluation consisted of comparing scores on examination questions generated from each method. On a 1-10 scale, effectiveness ratings were higher (P < 0.0001) for the didactic lectures (7.7) compared with either computer-assisted lecture (3.8) or computer laboratory (4.2) methods. A follow-up discussion with representatives from the class indicated that students did not perceive computer instruction as being time effective. However, examination scores from computer laboratory questions (94.3%) were significantly higher compared with ones from either computer-assisted (89.9%; P < 0.025) or didactic (86.6%; P < 0.001) lectures. Thus computer laboratory instruction enhanced learning outcomes in medical physiology despite student perceptions to the contrary.

  13. 78 FR 41046 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-09

    ... Services Administration, notice is hereby given that the Advanced Scientific Computing Advisory Committee will be renewed for a two-year period beginning on July 1, 2013. The Committee will provide advice to the Director, Office of Science (DOE), on the Advanced Scientific Computing Research Program managed...

  14. SERA -- An advanced treatment planning system for neutron therapy and BNCT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nigg, D.W.; Wemple, C.A.; Wessol, D.E.

    1999-09-01

    Detailed treatment planning calculations on a patient-specific basis are required for boron neutron capture therapy (BNCT). Two integrated treatment planning systems developed specifically for BNCT have been in clinical use in the United States over the past few years. The MacNCTPLAN BNCT treatment planning system is used in the clinical BNCT trials that are underway at the Massachusetts Institute of Technology. A second system, BNCT_rtpe (BNCT radiation therapy planning environment), developed independently by the Idaho National Engineering and Environmental Laboratory (INEEL) in collaboration with Montana State University (MSU), is used for treatment planning in the current series of BNCT clinical trials for glioblastoma at Brookhaven National Laboratory (BNL). This latter system is also licensed for use at several other BNCT research facilities worldwide. Although the currently available BNCT planning systems have served their purpose well, they suffer from somewhat long computation times (2 to 3 CPU-hours or more per field) relative to standard photon therapy planning software. This is largely due to the need for explicit three-dimensional solutions to the relevant transport equations. The simplifying approximations that work well for photon transport computations are not generally applicable to neutron transport computations. Greater computational speeds for BNCT treatment planning must therefore generally be achieved through the application of improved numerical techniques rather than by simplification of the governing equations. Recent efforts at INEEL and MSU have been directed toward this goal. This has resulted in a new paradigm for this type of calculation and the subsequent creation of the new simulation environment for radiotherapy applications (SERA) treatment planning system for BNCT. SERA is currently in initial clinical testing in connection with the trials at BNL, and it is expected to replace the present BNCT_rtpe system upon general release during 1999.
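
    The "relevant transport equations" are, generically, the steady linear Boltzmann equation for the angular neutron flux (stated here in its textbook form, not as a SERA-specific formulation):

        \mathbf{\Omega}\cdot\nabla\psi(\mathbf{r},\mathbf{\Omega},E) + \Sigma_t(\mathbf{r},E)\,\psi
        = \int_{0}^{\infty}\!\int_{4\pi} \Sigma_s(\mathbf{r},E'\!\to E,\mathbf{\Omega}'\!\cdot\mathbf{\Omega})\,
        \psi(\mathbf{r},\mathbf{\Omega}',E')\,d\Omega'\,dE' + S(\mathbf{r},\mathbf{\Omega},E),

    and the point made above is that, unlike the photon case, no diffusion-style collapse of the angular and energy dependence is accurate for neutrons in tissue, so the full phase-space problem must be solved, deterministically or by Monte Carlo, for every treatment field.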

  15. Practical experience with graphical user interfaces and object-oriented design in the clinical laboratory.

    PubMed

    Wells, I G; Cartwright, R Y; Farnan, L P

    1993-12-15

    The computing strategy in our laboratories evolved from research in Artificial Intelligence, and is based on powerful software tools running on high performance desktop computers with a graphical user interface. This allows most tasks to be regarded as design problems rather than implementation projects, and both rapid prototyping and an object-oriented approach to be employed during the in-house development and enhancement of the laboratory information systems. The practical application of this strategy is discussed, with particular reference to the system designer, the laboratory user and the laboratory customer. Routine operation covers five departments, and the systems are stable, flexible and well accepted by the users. Client-server computing, currently undergoing final trials, is seen as the key to further development, and this approach to Pathology computing has considerable potential for the future.

  16. ASCR Cybersecurity for Scientific Computing Integrity - Research Pathways and Ideas Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peisert, Sean; Potok, Thomas E.; Jones, Todd

    At the request of the U.S. Department of Energy's (DOE) Office of Science (SC) Advanced Scientific Computing Research (ASCR) program office, a workshop was held June 2-3, 2015, in Gaithersburg, MD, to identify potential long-term (10 to 20+ year) cybersecurity fundamental research and development challenges, strategies, and roadmaps facing future high performance computing (HPC), networks, data centers, and extreme-scale scientific user facilities. This workshop was a follow-on to the workshop held January 7-9, 2015, in Rockville, MD, that examined higher level ideas about scientific computing integrity specific to the mission of the DOE Office of Science. Issues included research computation and simulation that takes place on ASCR computing facilities and networks, as well as network-connected scientific instruments, such as those run by various DOE Office of Science programs. Workshop participants included researchers and operational staff from DOE national laboratories, as well as academic researchers and industry experts. Participants were selected based on the submission of abstracts relating to the topics discussed in the previous workshop report [1] and also from other ASCR reports, including "Abstract Machine Models and Proxy Architectures for Exascale Computing" [27], the DOE "Preliminary Conceptual Design for an Exascale Computing Initiative" [28], and the January 2015 machine learning workshop [29]. The workshop was also attended by several observers from DOE and other government agencies. The workshop was divided into three topic areas: (1) Trustworthy Supercomputing, (2) Extreme-Scale Data, Knowledge, and Analytics for Understanding and Improving Cybersecurity, and (3) Trust within High-end Networking and Data Centers. Participants were divided into three corresponding teams based on the category of their abstracts. The workshop began with a series of talks from the program manager and workshop chair, followed by the leaders for each of the three topics and a representative of each of the four major DOE Office of Science Advanced Scientific Computing Research Facilities: the Argonne Leadership Computing Facility (ALCF), the Energy Sciences Network (ESnet), the National Energy Research Scientific Computing Center (NERSC), and the Oak Ridge Leadership Computing Facility (OLCF). The rest of the workshop consisted of topical breakout discussions and focused writing periods that produced much of this report.

  17. Harnessing the power of emerging petascale platforms

    NASA Astrophysics Data System (ADS)

    Mellor-Crummey, John

    2007-07-01

    As part of the US Department of Energy's Scientific Discovery through Advanced Computing (SciDAC-2) program, science teams are tackling problems that require computational simulation and modeling at the petascale. A grand challenge for computer science is to develop software technology that makes it easier to harness the power of these systems to aid scientific discovery. As part of its activities, the SciDAC-2 Center for Scalable Application Development Software (CScADS) is building open source software tools to support efficient scientific computing on the emerging leadership-class platforms. In this paper, we describe two tools for performance analysis and tuning that are being developed as part of CScADS: a tool for analyzing scalability and performance, and a tool for optimizing loop nests for better node performance. We motivate these tools by showing how they apply to S3D, a turbulent combustion code under development at Sandia National Laboratory. For S3D, our node performance analysis tool helped uncover several performance bottlenecks. Using our loop nest optimization tool, we transformed S3D's most costly loop nest to reduce execution time by a factor of 2.94 for a processor working on a 50³ domain.
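
    A toy illustration of the class of loop-nest transformation involved, written in Python/NumPy for consistency with the other sketches here; the CScADS tool operates on source code of production applications like S3D, and the tile size below is an arbitrary placeholder.

        import numpy as np

        def transpose_naive(a, b):
            # One element at a time: strides through `a` row-wise but `b`
            # column-wise, so one array is always accessed cache-unfriendly.
            n = a.shape[0]
            for i in range(n):
                for j in range(n):
                    b[j, i] = a[i, j]

        def transpose_tiled(a, b, tile=64):
            # Blocked (tiled) loop nest: each tile-by-tile block of `a` and
            # `b` stays cache-resident while it is reused, the same idea a
            # loop-nest optimizer applies automatically to Fortran/C loops.
            n = a.shape[0]
            for ii in range(0, n, tile):
                for jj in range(0, n, tile):
                    b[jj:jj + tile, ii:ii + tile] = a[ii:ii + tile, jj:jj + tile].T

        a = np.arange(256 * 256, dtype=float).reshape(256, 256)
        b = np.empty_like(a)
        transpose_tiled(a, b)
        assert np.array_equal(b, a.T)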

  18. Binary Black Holes, Gravitational Waves, and Numerical Relativity

    NASA Technical Reports Server (NTRS)

    Centrella, Joan

    2008-01-01

    The final merger of two black holes releases a tremendous amount of energy and is one of the brightest sources in the gravitational wave sky. Observing these sources with gravitational wave detectors requires that we know the radiation waveforms they emit. Since these mergers take place in regions of very strong gravitational fields, we need to solve Einstein's equations of general relativity on a computer in order to calculate these waveforms. For more than 30 years, scientists have tried to compute these waveforms using the methods of numerical relativity. The resulting computer codes have been plagued by instabilities, causing them to crash well before the black holes in the binary could complete even a single orbit. Recently this situation has changed dramatically, with a series of amazing breakthroughs. This talk will take you on this quest for the holy grail of numerical relativity, showing how a spacetime is constructed on a computer to build a simulation laboratory for binary black hole mergers. We will focus on the recent advances that are revealing these waveforms, and the dramatic new potential for discoveries that arises when these sources will be observed by LIGO and LISA.

  19. Binary Black Holes, Gravitational Waves, and Numerical Relativity

    NASA Technical Reports Server (NTRS)

    Centrella, Joan

    2008-01-01

    The final merger of two black holes releases a tremendous amount of energy and is one of the brightest sources in the gravitational wave sky. Observing these sources with gravitational wave detectors requires that we know the radiation waveforms they emit. Since these mergers take place in regions of very strong gravitational fields, we need to solve Einstein's equations of general relativity on a computer in order to calculate these waveforms. For more than 30 years, scientists have tried to compute these waveforms using the methods of numerical relativity. The resulting computer codes have been plagued by instabilities, causing them to crash well before the black holes in the binary could complete even a single orbit. Recently this situation has changed dramatically, with a series of amazing breakthroughs. This talk will take you on this quest for the holy grail of numerical relativity, showing how a spacetime is constructed on a computer to build a simulation laboratory for binary black hole mergers. We will focus on the recent advances that are revealing these waveforms, and the dramatic new potential for discoveries that arises when these sources will be observed by LIGO and LISA.

  20. Binary Black Holes, Gravitational Waves, and Numerical Relativity

    NASA Technical Reports Server (NTRS)

    Centrella, Joan

    2008-01-01

    The final merger of two black holes releases a tremendous amount of energy and is one of the brightest sources in the gravitational wave sky. Observing these sources with gravitational wave detectors requires that we know the radiation waveforms they emit. Since these mergers take place in regions of very strong gravitational fields, we need to solve Einstein's equations of general relativity on a computer in order to calculate these waveforms. For more than 30 years, scientists have tried to compute these waveforms using the methods of numerical relativity. The resulting computer codes have been plagued by instabilities, causing them to crash well before the black holes in the binary could complete even a single orbit. Recently this situation has changed dramatically, with a series of amazing breakthroughs. This talk will take you on this quest for the holy grail of numerical relativity, showing how a spacetime is constructed on a computer to build a simulation laboratory for binary black hole mergers. We will focus on the recent advances that are revealing these waveforms, and the dramatic new potential for discoveries that arises when these sources will be observed by LIGO and LISA.

  1. Binary Black Holes, Gravitational Waves, and Numerical Relativity

    NASA Technical Reports Server (NTRS)

    Centrella, Joan

    2009-01-01

    The final merger of two black holes releases a tremendous amount of energy and is one of the brightest sources in the gravitational wave sky. Observing these sources with gravitational wave detectors requires that we know the radiation waveforms they emit. Since these mergers take place in regions of very strong gravitational fields, we need to solve Einstein's equations of general relativity on a computer in order to calculate these waveforms. For more than 30 years, scientists have tried to compute these waveforms using the methods of numerical relativity. The resulting computer codes have been plagued by instabilities, causing them to crash well before the black holes in the binary could complete even a single orbit. Recently this situation has changed dramatically, with a series of amazing breakthroughs. This talk will take you on this quest for the holy grail of numerical relativity, showing how a spacetime is constructed on a computer to build a simulation laboratory for binary black hole mergers. We will focus on the recent advances that are revealing these waveforms, and the dramatic new potential for discoveries that arises when these sources will be observed by LIGO and LISA.

  2. Binary Black Holes, Gravitational Waves, and Numerical Relativity

    NASA Technical Reports Server (NTRS)

    Centrella, Joan

    2007-01-01

    The final merger of two black holes releases a tremendous amount of energy and is one of the brightest sources in the gravitational wave sky. Observing these sources with gravitational wave detectors requires that we know the radiation waveforms they emit. Since these mergers take place in regions of very strong gravitational fields, we need to solve Einstein's equations of general relativity on a computer in order to calculate these waveforms. For more than 30 years, scientists have tried to compute these waveforms using the methods of numerical relativity. The resulting computer codes have been plagued by instabilities, causing them to crash well before the black holes in the binary could complete even a single orbit. Recently this situation has changed dramatically, with a series of amazing breakthroughs. This talk will take you on this quest for the holy grail of numerical relativity, showing how a spacetime is constructed on a computer to build a simulation laboratory for binary black hole mergers. We will focus on the recent advances that are revealing these waveforms, and the dramatic new potential for discoveries that arises when these sources will be observed by LIGO and LISA.

  3. Binary Black Holes, Gravitational Waves, and Numerical Relativity

    NASA Technical Reports Server (NTRS)

    Centrella, Joan

    2007-01-01

    The final merger of two black holes releases a tremendous amount of energy and is one of the brightest sources in the gravitational wave sky. Observing these sources with gravitational wave detectors requires that we know the radiation waveforms they emit. Since these mergers take place in regions of very strong gravitational fields, we need to solve Einstein's equations of general relativity on a computer in order to calculate these waveforms. For more than 30 years, scientists have tried to compute these waveforms using the methods of numerical relativity. The resulting computer codes have been plagued by instabilities, causing them to crash well before the black holes in the binary could complete even a single orbit. Recently this situation has changed dramatically, with a series of amazing breakthroughs. This talk will take you on this quest for the holy grail of numerical relativity, showing how a spacetime is constructed on a computer to build a simulation laboratory for binary black hole mergers. We will focus on the recent advances that are revealing these waveforms, and the dramatic new potential for discoveries that arises when these sources will be observed by LIGO and LISA.

  4. Binary Black Holes, Gravitational Waves, and Numerical Relativity

    NASA Technical Reports Server (NTRS)

    Centrella, Joan

    2006-01-01

    The final merger of two black holes releases a tremendous amount of energy and is one of the brightest sources in the gravitational wave sky. Observing these sources with gravitational wave detectors requires that we know the radiation waveforms they emit. Since these mergers take place in regions of extreme gravity, we need to solve Einstein's equations of general relativity on a computer in order to calculate these waveforms. For more than 30 years, scientists have tried to compute these waveforms using the methods of numerical relativity. The resulting computer codes have been plagued by instabilities, causing them to crash well before the black holes in the binary could complete even a single orbit. This situation has changed dramatically in the past year, with a series of amazing breakthroughs. This talk will take you on this quest for the holy grail of numerical relativity, showing how a spacetime is constructed on a computer to build a simulation laboratory for binary black hole mergers. We will focus on the recent advances that are revealing these waveforms, and the dramatic new potential for discoveries that arises when these sources will be observed by LISA and LIGO.

  5. Binary Black Holes and Gravitational Waves

    NASA Technical Reports Server (NTRS)

    Centrella, Joan

    2007-01-01

    The final merger of two black holes releases a tremendous amount of energy, more than the combined light from all the stars in the visible universe. This energy is emitted in the form of gravitational waves, and observing these sources with gravitational wave detectors such as LIGO and LISA requires that we know the pattern or fingerprint of the radiation emitted. Since black hole mergers take place in regions of extreme gravitational fields, we need to solve Einstein's equations of general relativity on a computer in order to calculate these wave patterns. For more than 30 years, scientists have tried to compute these wave patterns. However, their computer codes have been plagued by problems that caused them to crash. This situation has changed dramatically in the past 2 years, with a series of amazing breakthroughs. This discussion examines these gravitational patterns, showing how a spacetime is constructed on a computer to build a simulation laboratory for binary black hole mergers. The focus is on recent advances that are revealing these waveforms, and the dramatic new potential for discoveries that arises when these sources will be observed by the space-based gravitational wave detector LISA.

  6. Binary Black Holes, Numerical Relativity, and Gravitational Waves

    NASA Technical Reports Server (NTRS)

    Centrella, Joan

    2007-01-01

    The final merger of two black holes releases a tremendous amount of energy, more than the combined light from all the stars in the visible universe. This energy is emitted in the form of gravitational waves, and observing these sources with gravitational wave detectors such as LISA requires that we know the pattern or fingerprint of the radiation emitted. Since black hole mergers take place in regions of extreme gravitational fields, we need to solve Einstein's equations of general relativity on a computer in order to calculate these wave patterns. For more than 30 years, scientists have tried to compute these wave patterns. However, their computer codes have been plagued by problems that caused them to crash. This situation has changed dramatically in the past 2 years, with a series of amazing breakthroughs. This talk will take you on this quest for these gravitational wave patterns, showing how a spacetime is constructed on a computer to build a simulation laboratory for binary black hole mergers. We will focus on the recent advances that are revealing these waveforms, and the dramatic new potential for discoveries that arises when these sources will be observed by LISA.

  7. Cosmic Messengers: Binary Black Holes and Gravitational Waves

    NASA Technical Reports Server (NTRS)

    Centrella, Joan

    2007-01-01

    The final merger of two black holes releases a tremendous amount of energy, more than the combined light from all the stars in the visible universe. This energy is emitted in the form of gravitational waves, and observing these sources with gravitational wave detectors such as LISA requires that we know the pattern or fingerprint of the radiation emitted. Since black hole mergers take place in regions of extreme gravitational fields, we need to solve Einstein's equations of general relativity on a computer in order to calculate these wave patterns. For more than 30 years, scientists have tried to compute these wave patterns. However, their computer codes have been plagued by problems that caused them to crash. This situation has changed dramatically in the past 2 years, with a series of amazing breakthroughs. This talk will take you on this quest for these gravitational wave patterns, showing how a spacetime is constructed on a computer to build a simulation laboratory for binary black hole mergers. We will focus on the recent advances that are revealing these waveforms, and the dramatic new potential for discoveries that arises when these sources are observed by LISA.

  8. 75 FR 15675 - Professional Research Experience Program in Chemical Science and Technology Laboratory...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-30

    ... in physics, chemistry, mathematics, computer science, or engineering. Institutions should have a 4..., mathematics, computer science, or engineering with work experiences in laboratories or other settings...-0141-01] Professional Research Experience Program in Chemical Science and Technology Laboratory...

  9. Polybrominated Diphenyl Ethers in Dryer Lint: An Advanced Analysis Laboratory

    ERIC Educational Resources Information Center

    Thompson, Robert Q.

    2008-01-01

    An advanced analytical chemistry laboratory experiment is described that involves environmental analysis and gas chromatography-mass spectrometry. Students analyze lint from clothes dryers for traces of flame retardant chemicals, polybrominated diphenylethers (PBDEs), compounds receiving much attention recently. In a typical experiment, ng/g…
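
    The quantification step such an experiment relies on can be sketched in a few lines: an external-standard calibration curve is fit to GC-MS peak areas, and a sample's peak area is converted to ng of analyte per gram of lint. A minimal sketch, with entirely hypothetical calibration data and instrument response:

        import numpy as np

        # Hypothetical standards: PBDE concentration (ng/mL) vs. GC-MS peak area
        conc = np.array([1.0, 5.0, 10.0, 50.0, 100.0])
        area = np.array([210.0, 1050.0, 2120.0, 10400.0, 20900.0])

        slope, intercept = np.polyfit(conc, area, 1)   # linear calibration fit

        def quantify(sample_area, extract_volume_ml, lint_mass_g):
            """Convert a sample peak area to ng of PBDE per gram of lint."""
            c = (sample_area - intercept) / slope      # ng/mL in the extract
            return c * extract_volume_ml / lint_mass_g # ng/g of lint

        print(f"{quantify(sample_area=3150.0, extract_volume_ml=1.0, lint_mass_g=0.5):.1f} ng/g")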

  10. An Advanced Undergraduate Chemistry Laboratory Experiment Exploring NIR Spectroscopy and Chemometrics

    ERIC Educational Resources Information Center

    Wanke, Randall; Stauffer, Jennifer

    2007-01-01

    An advanced undergraduate chemistry laboratory experiment to study the advantages and hazards of the coupling of NIR spectroscopy and chemometrics is described. The combination is commonly used for analysis and process control of various ingredients used in agriculture, petroleum and food products.
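
    The "advantages and hazards" pairing lends itself to a small demonstration: partial least squares, a standard chemometric workhorse, recovers concentrations from broad, overlapping NIR bands, but too many latent variables fit noise, a hazard that cross-validation exposes. A sketch on simulated spectra, assuming scikit-learn is available; all data are synthetic:

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        n_samples, n_wavelengths = 60, 200
        concentration = rng.uniform(0.0, 1.0, n_samples)
        # One broad Gaussian absorption band whose height tracks concentration
        band = np.exp(-0.5 * ((np.arange(n_wavelengths) - 80) / 10.0) ** 2)
        spectra = (np.outer(concentration, band)
                   + 0.02 * rng.standard_normal((n_samples, n_wavelengths)))

        # Too many latent variables fit noise; cross-validated R^2 reveals it.
        for n_comp in (1, 2, 10, 30):
            pls = PLSRegression(n_components=n_comp)
            r2 = cross_val_score(pls, spectra, concentration, cv=5).mean()
            print(f"{n_comp:2d} latent variables: mean CV R^2 = {r2:.3f}")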

  11. Computer Exercises in Systems and Fields Experiments

    ERIC Educational Resources Information Center

    Bacon, C. M.; McDougal, J. R.

    1971-01-01

    Laboratory activities give students an opportunity to interact with computers in modes ranging from remote terminal use in laboratory experimentation to the direct hands-on use of a small digital computer with disk memory and on-line plotter, and finally to the use of a large computer under closed-shop operation. (Author/TS)

  12. Civil propulsion technology for the next twenty-five years

    NASA Technical Reports Server (NTRS)

    Rosen, Robert; Facey, John R.

    1987-01-01

    The next twenty-five years will see major advances in civil propulsion technology that will result in completely new aircraft systems for domestic, international, commuter and high-speed transports. These aircraft will include advanced aerodynamic, structural, and avionic technologies resulting in major new system capabilities and economic improvements. Propulsion technologies will include high-speed turboprops in the near term, very high bypass ratio turbofans, high efficiency small engines and advanced cycles utilizing high temperature materials for high-speed propulsion. Key fundamental enabling technologies include increased temperature capability and advanced design methods. Increased temperature capability will be based on improved composite materials such as metal matrix, intermetallics, ceramics, and carbon/carbon as well as advanced heat transfer techniques. Advanced design methods will make use of advances in internal computational fluid mechanics, reacting flow computation, computational structural mechanics and computational chemistry. The combination of advanced enabling technologies, new propulsion concepts and advanced control approaches will provide major improvements in civil aircraft.

  13. Computer Forensics Education - the Open Source Approach

    NASA Astrophysics Data System (ADS)

    Huebner, Ewa; Bem, Derek; Cheung, Hon

    In this chapter we discuss the application of open source software tools in computer forensics education at the tertiary level. We argue that open source tools are more suitable than commercial tools, as they provide the opportunity for students to gain in-depth understanding and appreciation of the computer forensic process as opposed to familiarity with one software product, however complex and multi-functional. With access to all source programs the students become more than just consumers of the tools as future forensic investigators. They can also examine the code, understand the relationship between the binary images and relevant data structures, and in the process gain the necessary background to become the future creators of new and improved forensic software tools. As a case study we present an advanced subject, Computer Forensics Workshop, which we designed for the Bachelor's degree in computer science at the University of Western Sydney. We based all laboratory work and the main take-home project in this subject on open source software tools. We found that, without exception, more than one suitable tool can be found to cover each topic in the curriculum adequately. We argue that this approach prepares students better for forensic field work, as they gain confidence to use a variety of tools, not just a single product they are familiar with.
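
    Much of the open-source workflow described here comes down to scriptable primitives that students can inspect end to end. One representative example is evidence-integrity verification: hashing a disk image in streamed chunks so that arbitrarily large images never need to fit in memory. A minimal Python sketch (file names hypothetical):

        import hashlib

        def hash_image(path, algorithm="sha256", chunk_size=1 << 20):
            """Stream a disk image through a cryptographic hash, 1 MiB at a time."""
            h = hashlib.new(algorithm)
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(chunk_size), b""):
                    h.update(chunk)
            return h.hexdigest()

        # Verifying a working copy against the original acquisition (paths hypothetical):
        # assert hash_image("evidence_copy.dd") == hash_image("evidence_original.dd")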

  14. An autonomous rendezvous and docking system using cruise missile technologies

    NASA Technical Reports Server (NTRS)

    Jones, Ruel Edwin

    1991-01-01

    In November 1990 the Autonomous Rendezvous & Docking (AR&D) system was first demonstrated for members of NASA's Strategic Avionics Technology Working Group. This simulation utilized prototype hardware from the Cruise Missile and Advanced Centaur Avionics systems. The objective was to show that all the accuracy, reliability and operational requirements established for a spacecraft to dock with Space Station Freedom could be met by the proposed system. The rapid prototyping capabilities of the Advanced Avionics Systems Development Laboratory were used to evaluate the proposed system in a real-time, hardware-in-the-loop simulation of the rendezvous and docking reference mission. The simulation permits manual, supervised automatic and fully autonomous operations to be evaluated. It is also being upgraded to be able to test an Autonomous Approach and Landing (AA&L) system. The AA&L and AR&D systems are very similar. Both use inertial guidance and control systems supplemented by GPS. Both use an Image Processing System (IPS) for target recognition and tracking. The IPS includes a general purpose multiprocessor computer and a selected suite of sensors that will provide the required relative position and orientation data. Graphic displays can also be generated by the computer, providing the astronaut/operator with real-time guidance and navigation data with enhanced video or sensor imagery.
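
    The inertial-plus-GPS arrangement mentioned above can be illustrated, in a drastically reduced 1-D form, by a complementary filter: dead-reckon position from accelerometer data (which drifts with sensor bias) and nudge the estimate toward each low-rate GPS fix. This sketch is purely illustrative and bears no relation to the actual AR&D avionics; all values and names are synthetic:

        import numpy as np

        def fuse(accel, gps_pos, dt=0.1, gps_every=10, gain=0.2):
            """Toy 1-D GPS/inertial blend: integrate acceleration, then
            pull the position toward each GPS fix to bound the drift."""
            pos, vel, out = 0.0, 0.0, []
            for k, a in enumerate(accel):
                vel += a * dt                  # inertial integration (drifts with bias)
                pos += vel * dt
                if k % gps_every == 0:         # low-rate GPS fix corrects the drift
                    pos += gain * (gps_pos[k] - pos)
                out.append(pos)
            return np.array(out)

        # Synthetic data: true 1 m/s^2 motion, accelerometer with 0.05 m/s^2 bias
        t = np.arange(0.0, 10.0, 0.1)
        true_pos = 0.5 * t ** 2
        est = fuse(accel=np.full(t.size, 1.05), gps_pos=true_pos)
        print(f"fused position error after 10 s: {abs(est[-1] - true_pos[-1]):.2f} m")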

  15. University of Washington/ Northwest National Marine Renewable Energy Center Tidal Current Technology Test Protocol, Instrumentation, Design Code, and Oceanographic Modeling Collaboration: Cooperative Research and Development Final Report, CRADA Number CRD-11-452

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Driscoll, Frederick R.

    The University of Washington (UW) - Northwest National Marine Renewable Energy Center (UW-NNMREC) and the National Renewable Energy Laboratory (NREL) will collaborate to advance research and development (R&D) of Marine Hydrokinetic (MHK) renewable energy technology, specifically renewable energy captured from ocean tidal currents. UW-NNMREC is endeavoring to establish infrastructure, capabilities and tools to support in-water testing of marine energy technology. NREL is leveraging its experience and capabilities in field testing of wind systems to develop protocols and instrumentation to advance field testing of MHK systems. Under this work, UW-NNMREC and NREL will work together to develop a common instrumentation system and testing methodologies, standards and protocols. UW-NNMREC is also establishing simulation capabilities for MHK turbines and turbine arrays. NREL has extensive experience in wind turbine array modeling and is developing several computer-based numerical simulation capabilities for MHK systems. Under this CRADA, UW-NNMREC and NREL will work together to augment single device and array modeling codes. As part of this effort, UW-NNMREC will also work with NREL to run simulations on NREL's high-performance computing system.

  16. A Reverse Osmosis System for an Advanced Separation Process Laboratory.

    ERIC Educational Resources Information Center

    Slater, C. S.; Paccione, J. D.

    1987-01-01

    Focuses on the development of a pilot unit for use in an advanced separations process laboratory in an effort to develop experiments on such processes as reverse osmosis, ultrafiltration, adsorption, and chromatography. Discusses reverse osmosis principles, the experimental system design, and some experimental studies. (TW)
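
    For the reverse osmosis experiments, the first-order design calculation is the solution-diffusion flux estimate: water flux scales with the applied pressure in excess of the osmotic pressure, the latter approximated here by the van't Hoff relation. A sketch with illustrative membrane permeability and feed conditions, not parameters from the unit described above:

        def vant_hoff_osmotic_pressure(molarity, temp_k=298.15, ions=2):
            """Osmotic pressure (bar) via van't Hoff: pi = i * M * R * T."""
            R = 0.08314  # L*bar/(mol*K)
            return ions * molarity * R * temp_k

        def water_flux(dp_bar, feed_molarity, permeability=3.0):
            """Solution-diffusion estimate Jw = A*(dP - dpi) in L/(m^2*h),
            assuming negligible osmotic pressure on the permeate side."""
            dpi = vant_hoff_osmotic_pressure(feed_molarity)
            return permeability * (dp_bar - dpi)

        # E.g., a 0.05 M NaCl feed at 15 bar applied pressure (values illustrative):
        print(f"{water_flux(15.0, 0.05):.1f} L/(m^2*h)")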

  17. Implementing an Affordable High-Performance Computing for Teaching-Oriented Computer Science Curriculum

    ERIC Educational Resources Information Center

    Abuzaghleh, Omar; Goldschmidt, Kathleen; Elleithy, Yasser; Lee, Jeongkyu

    2013-01-01

    With the advances in computing power, high-performance computing (HPC) platforms have had an impact on not only scientific research in advanced organizations but also computer science curriculum in the educational community. For example, multicore programming and parallel systems are highly desired courses in the computer science major. However,…

  18. Image analysis in cytology: DNA-histogramming versus cervical smear prescreening.

    PubMed

    Bengtsson, E W; Nordin, B

    1993-01-01

    The visual inspection of cellular specimens and histological sections through a light microscope plays an important role in clinical medicine and biomedical research. The human visual system is very good at the recognition of various patterns but less efficient at quantitative assessment of these patterns. Some samples are prepared in great numbers, most notably the screening for cervical cancer, the so-called Pap smears, which results in hundreds of millions of samples each year, creating a tedious mass inspection task. Numerous attempts have been made over the last 40 years to create systems that solve these two tasks, the quantitative supplement to the human visual system and the automation of mass screening. The most difficult task, total automation, has received the greatest attention, with many large-scale projects over the decades. In spite of all these efforts, still no generally accepted automated prescreening device exists on the market. The main reason for this failure is the great pattern recognition capability needed to distinguish between cancer cells and all other kinds of objects found in the specimens: cellular clusters, debris, degenerate cells, etc. Improved algorithms, the ever-increasing processing power of computers and progress in biochemical specimen preparation techniques make it likely that useful automated prescreening systems will eventually become available. Meanwhile, much less effort has been put into the development of interactive cell image analysis systems. Still, some such systems have been developed and put into use at thousands of laboratories worldwide. In these systems, the human pattern-recognition capability is used to select the fields and objects that are to be analysed while the computational power of the computer is used for the quantitative analysis of cellular DNA content or other relevant markers. Numerous studies have shown that quantitative information about the distribution of cellular DNA content is of prognostic significance in many types of cancer. Several laboratories are therefore putting these techniques into routine clinical use. The more advanced systems can also study many other markers and cellular features, some known to be of clinical interest, others useful in research. The advances in computer technology are making these systems more generally available through decreasing cost, increasing computational power and improved user interfaces. We have been involved in research and development of both automated and interactive cell analysis systems during the last 20 years. Here, some experiences and conclusions from this work are presented, as well as some predictions about what can be expected in the near future.
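
    The DNA-histogramming half of this work reduces, computationally, to integrating optical density per nucleus, normalizing to the diploid reference peak, and histogramming the resulting DNA index. A sketch on synthetic measurements; all population parameters below are invented for illustration:

        import numpy as np

        rng = np.random.default_rng(1)
        # Hypothetical integrated optical density (IOD) per nucleus: a diploid (2c)
        # reference population plus a smaller population at higher DNA content.
        iod = np.concatenate([rng.normal(100.0, 8.0, 500),    # normal cells near 2c
                              rng.normal(160.0, 12.0, 80)])   # abnormal cells

        # Normalize so the diploid reference peak sits at DNA index 1.0 (2c).
        diploid_peak = np.median(iod[:500])
        dna_index = iod / diploid_peak

        counts, edges = np.histogram(dna_index, bins=40, range=(0.5, 2.5))
        modal_bin = edges[np.argmax(counts)]
        aneuploid_fraction = float(np.mean(dna_index > 1.25))
        print(f"modal DNA index bin starts at {modal_bin:.2f}; "
              f"fraction above 1.25x diploid: {aneuploid_fraction:.2%}")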

  19. High Precision Prediction of Functional Sites in Protein Structures

    PubMed Central

    Buturovic, Ljubomir; Wong, Mike; Tang, Grace W.; Altman, Russ B.; Petkovic, Dragutin

    2014-01-01

    We address the problem of assigning biological function to solved protein structures. Computational tools play a critical role in identifying potential active sites and informing screening decisions for further lab analysis. A critical parameter in the practical application of computational methods is the precision, or positive predictive value. Precision measures the level of confidence the user should have in a particular computed functional assignment. Low precision annotations lead to futile laboratory investigations and waste scarce research resources. In this paper we describe an advanced version of the protein function annotation system FEATURE, which achieved 99% precision and average recall of 95% across 20 representative functional sites. The system uses a Support Vector Machine classifier operating on the microenvironment of physicochemical features around an amino acid. We also compared performance of our method with state-of-the-art sequence-level annotator Pfam in terms of precision, recall and localization. To our knowledge, no other functional site annotator has been rigorously evaluated against these key criteria. The software and predictive models are incorporated into the WebFEATURE service at http://feature.stanford.edu/wf4.0-beta. PMID:24632601
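
    The classification setup is straightforward to emulate at toy scale: feature vectors describing the physicochemical microenvironment around a residue, a support vector machine, and precision (positive predictive value) as the figure of merit. The sketch below uses scikit-learn on synthetic data; it is not the published FEATURE models and makes no claim about their performance:

        import numpy as np
        from sklearn.svm import SVC
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import precision_score, recall_score

        rng = np.random.default_rng(0)
        n, d = 1000, 40                  # residues x physicochemical features (synthetic)
        X = rng.standard_normal((n, d))
        w = rng.standard_normal(d)
        # Positive (functional-site) labels form the minority class, roughly 10%.
        y = (X @ w + 0.5 * rng.standard_normal(n) > 8.0).astype(int)

        X_tr, X_te, y_tr, y_te = train_test_split(
            X, y, test_size=0.3, random_state=0, stratify=y)
        clf = SVC(kernel="rbf", class_weight="balanced").fit(X_tr, y_tr)
        pred = clf.predict(X_te)
        print(f"precision = {precision_score(y_te, pred):.2f}, "
              f"recall = {recall_score(y_te, pred):.2f}")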

  20. Plain abdominal radiography in acute abdominal pain; past, present, and future

    PubMed Central

    Gans, Sarah L; Stoker, Jaap; Boermeester, Marja A

    2012-01-01

    Several studies have demonstrated that a diagnosis based solely on a patient’s medical history, physical examination, and laboratory tests is not reliable enough, despite the fact that these aspects are essential parts of the workup of a patient presenting with acute abdominal pain. Traditionally, imaging workup starts with abdominal radiography. However, numerous studies have demonstrated low sensitivity and accuracy for plain abdominal radiography in the evaluation of acute abdominal pain as well as various specific diseases such as perforated viscus, bowel obstruction, ingested foreign body, and ureteral stones. Computed tomography, and in particular computed tomography after negative ultrasonography, provides a better workup than plain abdominal radiography alone. The benefits of computed tomography lie in decision-making for management, planning of a surgical strategy, and possibly even avoidance of negative laparotomies. Based on abundant available evidence, major advances in diagnostic imaging, and changes in the management of certain diseases, we can conclude that there is no place for plain abdominal radiography in the workup of adult patients with acute abdominal pain presenting in the emergency department in current practice. PMID:22807640
