Science.gov

Sample records for advanced computing laboratory

  1. Using an Advanced Computational Laboratory Experiment to Extend and Deepen Physical Chemistry Students' Understanding of Atomic Structure

    ERIC Educational Resources Information Center

    Hoffman, Gary G.

    2015-01-01

    A computational laboratory experiment is described, which involves the advanced study of an atomic system. The students use concepts and techniques typically covered in a physical chemistry course but extend those concepts and techniques to more complex situations. The students get a chance to explore the study of atomic states and perform…

  2. Elementary and Advanced Computer Projects for the Physics Classroom and Laboratory

    DTIC Science & Technology

    1992-12-01

development initiated by Dr. Pat Bronson) begins by spelling out the words Monte Carlo on the screen using the BASIC random number generator. After a title ... development initiated by Chris Brueningsen) allows students the opportunity to be introduced to and/or review scientific notation, basic algebra ... laboratory experiments include: vector and sumzy for a force table experiment; kinem for a linear air track experiment; plotit, linzer, quazer, linlst ...
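
    The report's programs themselves were written in BASIC and survive only as the fragments above. As a rough modern illustration of the kind of Monte Carlo classroom exercise described, the Python sketch below estimates π from random points; it is an assumed example, not one of the report's programs.

    ```python
    import random

    def estimate_pi(n_samples: int = 100_000) -> float:
        """Estimate pi by sampling points in the unit square and counting
        the fraction that land inside the inscribed quarter circle."""
        inside = 0
        for _ in range(n_samples):
            x, y = random.random(), random.random()
            if x * x + y * y <= 1.0:
                inside += 1
        return 4.0 * inside / n_samples

    print(f"pi is approximately {estimate_pi():.4f}")
    ```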

  3. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan : ASC software quality engineering practices Version 3.0.

    SciTech Connect

    Turgeon, Jennifer L.; Minana, Molly A.; Hackney, Patricia; Pilch, Martin M.

    2009-01-01

The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in the US Department of Energy/National Nuclear Security Administration (DOE/NNSA) Quality Criteria, Revision 10 (QC-1) as 'conformance to customer requirements and expectations'. This quality plan defines the SNL ASC Program software quality engineering (SQE) practices and provides a mapping of these practices to the SNL Corporate Process Requirement (CPR) 001.3.6, 'Corporate Software Engineering Excellence'. This plan also identifies ASC management's and the software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals. This SNL ASC Software Quality Plan establishes the signatories' commitments to improving software products by applying cost-effective SQE practices. This plan enumerates the SQE practices that comprise the development of SNL ASC's software products and explains the project teams' opportunities for tailoring and implementing the practices.

  4. An Advanced Chemistry Laboratory Program.

    ERIC Educational Resources Information Center

    Wise, John H.

    The Advanced Chemistry Laboratory Program is a project designed to devise experiments to coordinate the use of instruments in the laboratory programs of physical chemistry, instrumental analysis, and inorganic chemistry at the advanced undergraduate level. It is intended that such experiments would incorporate an introduction to the instrument…

  5. Advanced Computer Typography.

    DTIC Science & Technology

    1981-12-01

Advanced Computer Typography. A. V. Hershey, Naval Postgraduate School, Monterey, California. Report NPS012-81-005, December 1981; unclassified, approved for public release. Final report covering the period December 1979 - December 1981.

  6. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan part 2 mappings for the ASC software quality engineering practices, version 2.0.

    SciTech Connect

    Heaphy, Robert; Sturtevant, Judith E.; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Minana, Molly A.; Hackney, Patricia; Forsythe, Christi A.; Schofield, Joseph Richard, Jr.; Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2006-09-01

The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR001.3.2 and CPR001.3.6 and to a Department of Energy document, 'ASCI Software Quality Engineering: Goals, Principles, and Guidelines'. This document also identifies ASC management and software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals.

  7. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan. Part 1: ASC software quality engineering practices, Version 2.0.

    SciTech Connect

    Sturtevant, Judith E.; Heaphy, Robert; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Minana, Molly A.; Hackney, Patricia; Forsythe, Christi A.; Schofield, Joseph Richard, Jr.; Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2006-09-01

The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR 1.3.2 and 1.3.6 and to a Department of Energy document, 'ASCI Software Quality Engineering: Goals, Principles, and Guidelines'. This document also identifies ASC management and software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals.

  8. Managing a Computer Teaching Laboratory.

    ERIC Educational Resources Information Center

    Macey, Susan M.

    1998-01-01

    Examines issues concerning the initial setup and the everyday operational problems of managing a computer teaching laboratory. Addresses such issues as setting policies on laboratory access, dealing with a high student-per-machine ratio, provisions for maintenance, obtaining hardware and software upgrades, staffing, data security, and networking…

  9. Advanced computations in plasma physics

    NASA Astrophysics Data System (ADS)

    Tang, W. M.

    2002-05-01

Scientific simulation in tandem with theory and experiment is an essential tool for understanding complex plasma behavior. In this paper we review recent progress and future directions for advanced simulations in magnetically confined plasmas with illustrative examples chosen from magnetic confinement research areas such as microturbulence, magnetohydrodynamics, magnetic reconnection, and others. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales together with access to powerful new computational resources. In particular, the fusion energy science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPP's). A good example is the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPP's to produce three-dimensional, general geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of turbulence self-regulation by zonal flows. It should be emphasized that these calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. The associated scientific excitement should serve to…
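
    To see why multi-teraflop MPPs were essential at this scale, a back-of-envelope cost estimate can be run in a few lines of Python; every number below is an illustrative assumption, not a figure from the paper.

    ```python
    # Rough cost of a particle run: work = particles x steps x flops per update.
    # All inputs are assumed, order-of-magnitude values.
    particles = 1e9           # "billions of particles"
    steps = 1e4               # "thousands of time-steps"
    flops_per_update = 200.0  # assumed gather/push/scatter cost per particle step
    machine_flops = 5e12      # sustained rate of a multi-teraflop MPP (assumed)

    total = particles * steps * flops_per_update
    print(f"total work: {total:.1e} flops")
    print(f"wall time:  {total / machine_flops / 3600:.1f} hours")
    ```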

  10. Advanced Computation in Plasma Physics

    NASA Astrophysics Data System (ADS)

    Tang, William

    2001-10-01

Scientific simulation in tandem with theory and experiment is an essential tool for understanding complex plasma behavior. This talk will review recent progress and future directions for advanced simulations in magnetically-confined plasmas with illustrative examples chosen from areas such as microturbulence, magnetohydrodynamics, magnetic reconnection, and others. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales together with access to powerful new computational resources. In particular, the fusion energy science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPP's). A good example is the effective usage of the full power of multi-teraflop MPP's to produce 3-dimensional, general geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of turbulence self-regulation by zonal flows. It should be emphasized that these calculations, which typically utilized billions of particles for tens of thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. The associated scientific excitement should serve to stimulate improved cross-cutting collaborations with other fields and also to help attract…

  11. Computer-Assisted Laboratory Stations.

    ERIC Educational Resources Information Center

Snyder, William J.; Hanyak, Michael E.

    1985-01-01

    Describes the advantages and features of computer-assisted laboratory stations for use in a chemical engineering program. Also describes a typical experiment at such a station: determining the response times of a solid state humidity sensor at various humidity conditions and developing an empirical model for the sensor. (JN)
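
    A minimal sketch of the empirical-modeling step described above, assuming hypothetical step-response readings and a first-order response model (neither is taken from the article):

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Hypothetical normalized step-response readings from a humidity sensor.
    t = np.array([0.0, 5.0, 10.0, 20.0, 40.0, 60.0, 90.0, 120.0])  # seconds
    v = np.array([0.02, 0.35, 0.58, 0.82, 0.95, 0.98, 0.99, 1.00])

    def first_order(t, tau):
        """First-order step response: v(t) = 1 - exp(-t / tau)."""
        return 1.0 - np.exp(-t / tau)

    (tau,), _ = curve_fit(first_order, t, v, p0=[10.0])
    print(f"fitted response time constant: {tau:.1f} s")
    ```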

  12. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan. Part 1 : ASC software quality engineering practices version 1.0.

    SciTech Connect

    Minana, Molly A.; Sturtevant, Judith E.; Heaphy, Robert; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Forsythe, Christi A.; Schofield, Joseph Richard, Jr.; Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2005-01-01

The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in DOE/AL Quality Criteria (QC-1) as conformance to customer requirements and expectations. This quality plan defines the ASC program software quality practices and provides mappings of these practices to the SNL Corporate Process Requirements (CPR 1.3.2 and CPR 1.3.6) and the Department of Energy (DOE) document, ASCI Software Quality Engineering: Goals, Principles, and Guidelines (GP&G). This quality plan identifies ASC management and software project teams' responsibilities for cost-effective software engineering quality practices. The SNL ASC Software Quality Plan establishes the signatories' commitment to improving software products by applying cost-effective software engineering quality practices. This document explains the project teams' opportunities for tailoring and implementing the practices; enumerates the practices that compose the development of SNL ASC's software products; and includes a sample assessment checklist that was developed based upon the practices in this document.

  13. The Computation Directorate at Lawrence Livermore National Laboratory

    SciTech Connect

    Cook, L

    2006-09-07

    The Computation Directorate at Lawrence Livermore National Laboratory has four major areas of work: (1) Programmatic Support -- Programs are areas which receive funding to develop solutions to problems or advance basic science in their areas (Stockpile Stewardship, Homeland Security, the Human Genome project). Computer scientists are 'matrixed' to these programs to provide computer science support. (2) Livermore Computer Center (LCC) -- Development, support and advanced planning for the large, massively parallel computers, networks and storage facilities used throughout the laboratory. (3) Research -- Computer scientists research advanced solutions for programmatic work and for external contracts and research new HPC hardware solutions. (4) Infrastructure -- Support for thousands of desktop computers and numerous LANs, labwide unclassified networks, computer security, computer-use policy.

  14. Computer laboratory in medical education for medical students.

    PubMed

    Hercigonja-Szekeres, Mira; Marinović, Darko; Kern, Josipa

    2009-01-01

    Five generations of second year students at the Zagreb University School of Medicine were interviewed through an anonymous questionnaire on their use of personal computers, Internet, computer laboratories and computer-assisted education in general. Results show an advance in students' usage of information and communication technology during the period from 1998/99 to 2002/03. However, their positive opinion about computer laboratory depends on installed capacities: the better the computer laboratory technology, the better the students' acceptance and use of it.

  15. Advances in Computational Astrophysics

    SciTech Connect

    Calder, Alan C.; Kouzes, Richard T.

    2009-03-01

Along with a colleague from Stony Brook, I was invited to serve as guest editor for a special issue of Computing in Science and Engineering. This is the guest editors' introduction to that special issue; Alan and I wrote the introduction and edited the four papers published in the issue.

  16. Three-dimensional registration of synchrotron radiation-based micro-computed tomography images with advanced laboratory micro-computed tomography data from murine kidney casts

    NASA Astrophysics Data System (ADS)

    Thalmann, Peter; Hieber, Simone E.; Schulz, Georg; Deyhle, Hans; Khimchenko, Anna; Kurtcuoglu, Vartan; Olgac, Ufuk; Marmaras, Anastasios; Kuo, Willy; Meyer, Eric P.; Beckmann, Felix; Herzen, Julia; Ehrbar, Stefanie; Müller, Bert

    2014-09-01

Malfunction of oxygen regulation in kidney and liver may lead to the pathogenesis of chronic diseases. The underlying mechanisms are poorly understood. In kidney, it is hypothesized that renal gas shunting from arteries to veins eliminates excess oxygen. Such shunting is highly dependent on the structure of the renal vascular network. The vascular tree has so far not been quantified under maintenance of its connectivity, as three-dimensional imaging of the vessel tree down to the smallest capillaries, which in the mouse model are smaller than 5 μm in diameter, is a challenging task. An established protocol uses corrosion casts and applies synchrotron radiation-based micro-computed tomography (SRμCT), which provides the desired spatial resolution with the necessary contrast. However, SRμCT is expensive and beamtime access is limited. We show here that measurements with a phoenix nanotom®m (General Electric, Wunstorf, Germany) can provide results comparable to those obtained with SRμCT, except for regions with small vessel structures, where the signal-to-noise level was significantly reduced. For this purpose the nanotom®m measurement was compared with its corresponding measurement acquired at the beamline P05 at PETRA III at DESY, Hamburg, Germany.
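
    Full rigid 3D registration of the two tomograms is beyond a short snippet, but the core matching step can be illustrated with a toy translation-only registration based on FFT cross-correlation; the volumes below are synthetic, and this is not the authors' registration pipeline.

    ```python
    import numpy as np

    def register_translation(fixed, moving):
        """Estimate the integer voxel shift of `moving` relative to `fixed`
        from the peak of their circular cross-correlation (via FFT)."""
        corr = np.fft.ifftn(np.conj(np.fft.fftn(fixed)) * np.fft.fftn(moving)).real
        peak = np.unravel_index(np.argmax(corr), corr.shape)
        # Fold peaks in the upper half of each axis back to negative shifts.
        return tuple(p - n if p > n // 2 else p for p, n in zip(peak, corr.shape))

    # Demo: displace a synthetic volume and recover the known offset.
    rng = np.random.default_rng(0)
    fixed = rng.random((32, 32, 32))
    moving = np.roll(fixed, shift=(3, -5, 2), axis=(0, 1, 2))
    print(register_translation(fixed, moving))  # -> (3, -5, 2)
    ```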

  17. Using the Computer as a Laboratory Instrument.

    ERIC Educational Resources Information Center

    Collings, Peter J.; Greenslade, Thomas B., Jr.

    1989-01-01

    Reports experiences during a two-year period in introducing the computer to the laboratory and students to the computer as a laboratory instrument. Describes a working philosophy, data acquisition system, and experiments. Summarizes the laboratory procedures of nine experiments, covering mechanics, heat, electromagnetism, and optics. (YP)

  18. Simulation methods for advanced scientific computing

    SciTech Connect

    Booth, T.E.; Carlson, J.A.; Forster, R.A.

    1998-11-01

    This is the final report of a three-year, Laboratory Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). The objective of the project was to create effective new algorithms for solving N-body problems by computer simulation. The authors concentrated on developing advanced classical and quantum Monte Carlo techniques. For simulations of phase transitions in classical systems, they produced a framework generalizing the famous Swendsen-Wang cluster algorithms for Ising and Potts models. For spin-glass-like problems, they demonstrated the effectiveness of an extension of the multicanonical method for the two-dimensional, random bond Ising model. For quantum mechanical systems, they generated a new method to compute the ground-state energy of systems of interacting electrons. They also improved methods to compute excited states when the diffusion quantum Monte Carlo method is used and to compute longer time dynamics when the stationary phase quantum Monte Carlo method is used.
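
    For readers unfamiliar with the cluster methods named above, a minimal Swendsen-Wang update for the standard 2D Ising model is sketched below (the textbook algorithm, not the authors' generalized framework):

    ```python
    import numpy as np

    def swendsen_wang_step(spins, beta, rng):
        """One Swendsen-Wang cluster update for the 2D Ising model
        (J = 1, periodic boundaries); modifies `spins` in place."""
        L = spins.shape[0]
        p_bond = 1.0 - np.exp(-2.0 * beta)
        parent = list(range(L * L))

        def find(i):                      # union-find with path halving
            while parent[i] != i:
                parent[i] = parent[parent[i]]
                i = parent[i]
            return i

        # Freeze bonds between aligned neighbours with probability p_bond.
        for x in range(L):
            for y in range(L):
                i = x * L + y
                for j in (((x + 1) % L) * L + y, x * L + (y + 1) % L):
                    if spins.flat[i] == spins.flat[j] and rng.random() < p_bond:
                        parent[find(i)] = find(j)

        # Flip every cluster to a fresh random orientation.
        new_spin = {}
        for i in range(L * L):
            r = find(i)
            if r not in new_spin:
                new_spin[r] = rng.choice([-1, 1])
            spins.flat[i] = new_spin[r]

    rng = np.random.default_rng(1)
    spins = rng.choice([-1, 1], size=(16, 16))
    for _ in range(100):
        swendsen_wang_step(spins, beta=0.44, rng=rng)  # near criticality
    print("magnetization:", spins.mean())
    ```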

  19. Redirecting Under-Utilised Computer Laboratories into Cluster Computing Facilities

    ERIC Educational Resources Information Center

    Atkinson, John S.; Spenneman, Dirk H. R.; Cornforth, David

    2005-01-01

    Purpose: To provide administrators at an Australian university with data on the feasibility of redirecting under-utilised computer laboratories facilities into a distributed high performance computing facility. Design/methodology/approach: The individual log-in records for each computer located in the computer laboratories at the university were…
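
    The feasibility question reduces to how many machine-hours sit idle. A sketch of that utilization arithmetic, assuming a hypothetical log format (machine id, login time, logout time) since the paper's data layout is not given:

    ```python
    import csv
    from datetime import datetime

    FMT = "%Y-%m-%dT%H:%M:%S"  # assumed timestamp format in the log file

    def idle_fraction(log_path, n_machines, span_hours):
        """Fraction of total machine-hours left idle over the survey window."""
        used = 0.0
        with open(log_path, newline="") as f:
            for machine_id, login, logout in csv.reader(f):
                session = datetime.strptime(logout, FMT) - datetime.strptime(login, FMT)
                used += session.total_seconds() / 3600.0
        return 1.0 - used / (n_machines * span_hours)

    # e.g. idle_fraction("logins.csv", n_machines=200, span_hours=24 * 30)
    ```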

  20. Advances and challenges in computational plasma science

    NASA Astrophysics Data System (ADS)

    Tang, W. M.

    2005-02-01

Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behaviour. Recent advances in simulations of magnetically confined plasmas are reviewed in this paper, with illustrative examples chosen from associated research areas such as microturbulence, magnetohydrodynamics and other topics. Progress has been stimulated, in particular, by the exponential growth of computer speed along with significant improvements in computer technology. The advances in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics have produced increasingly good agreement between experimental observations and computational modelling. This was enabled by two key factors: (a) innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales and (b) access to powerful new computational resources. Excellent progress has been made in developing codes for which computer run-time and problem-size scale well with the number of processors on massively parallel processors (MPPs). Examples include the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPPs to produce three-dimensional, general geometry, nonlinear particle simulations that have accelerated advances in understanding the nature of turbulence self-regulation by zonal flows. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In looking towards the future, the current results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. This…

  1. Argonne's Laboratory computing center - 2007 annual report.

    SciTech Connect

    Bair, R.; Pieper, G. W.

    2008-05-28

Argonne National Laboratory founded the Laboratory Computing Resource Center (LCRC) in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. In September 2002 the LCRC deployed a 350-node computing cluster from Linux NetworX to address Laboratory needs for mid-range supercomputing. This cluster, named 'Jazz', achieved over a teraflop of computing power (10^12 floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the 50 fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2007, there were over 60 active projects representing a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to foster growth in the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure providers to offer more scientific data management capabilities, expanding Argonne staff use of national computing facilities, and improving the scientific reach and…
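
    For scale, the teraflop figure can be sanity-checked with simple peak-performance arithmetic; the per-node numbers below are assumptions for the sketch, not published Jazz specifications.

    ```python
    # Theoretical peak = nodes x clock x floating-point operations per cycle.
    nodes = 350              # from the report
    clock_hz = 2.4e9         # assumed per-node clock rate
    flops_per_cycle = 2      # assumed (e.g. a multiply-add pipeline)

    peak_tf = nodes * clock_hz * flops_per_cycle / 1e12
    print(f"theoretical peak: {peak_tf:.2f} teraflops")
    ```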

  2. Software Security in the University Computer Laboratories.

    ERIC Educational Resources Information Center

    Kung, Mable T.

    1989-01-01

    Discussion of software security in university computer laboratories focuses on the causes of computer viruses. Possible ways to detect an infected disk are described; strategies for professors, students, and computer personnel to eradicate the spread of a computer virus are proposed; and two resources for further information are given. (LRW)

  3. Focusing on the laboratory, not the computer.

    PubMed

    Floering, D A; Seaman, G J

    1983-05-01

    Much time and effort is spent comparing laboratory computer systems; the process is not unlike that of shopping for an automated chemistry analyzer. But the authors believe too little effort is expended in examining and understanding the laboratory functions that are to be computerized and, as a consequence, computer purchases fail to reach expectations. Here, they emphasize the importance of evaluating laboratory functions, particularly the area of communication and record keeping.

  4. The Advanced Manufacturing Laboratory at RPI.

    ERIC Educational Resources Information Center

    Desrochers, A.; DeRusso, P. M.

    1984-01-01

    An Advanced Manufacturing Laboratory (AML) has been established at Rensselaer Polytechnic Institute (RPI). AML courses, course objectives, instructional strategies, student experiences in design and manufacturing, and AML equipment are discussed. Overall recommendations based on student and instructor experiences are also presented. (JN)

  5. Advanced Laboratory NMR Spectrometer with Applications.

    ERIC Educational Resources Information Center

    Biscegli, Clovis; And Others

    1982-01-01

    A description is given of an inexpensive nuclear magnetic resonance (NMR) spectrometer suitable for use in advanced laboratory courses. Applications to the nondestructive analysis of the oil content in corn seeds and in monitoring the crystallization of polymers are presented. (SK)

  6. Center for Advanced Computational Technology

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    2000-01-01

    The Center for Advanced Computational Technology (ACT) was established to serve as a focal point for diverse research activities pertaining to application of advanced computational technology to future aerospace systems. These activities include the use of numerical simulations, artificial intelligence methods, multimedia and synthetic environments, and computational intelligence, in the modeling, analysis, sensitivity studies, optimization, design and operation of future aerospace systems. The Center is located at NASA Langley and is an integral part of the School of Engineering and Applied Science of the University of Virginia. The Center has four specific objectives: 1) conduct innovative research on applications of advanced computational technology to aerospace systems; 2) act as pathfinder by demonstrating to the research community what can be done (high-potential, high-risk research); 3) help in identifying future directions of research in support of the aeronautical and space missions of the twenty-first century; and 4) help in the rapid transfer of research results to industry and in broadening awareness among researchers and engineers of the state-of-the-art in applications of advanced computational technology to the analysis, design prototyping and operations of aerospace and other high-performance engineering systems. In addition to research, Center activities include helping in the planning and coordination of the activities of a multi-center team of NASA and JPL researchers who are developing an intelligent synthesis environment for future aerospace systems; organizing workshops and national symposia; as well as writing state-of-the-art monographs and NASA special publications on timely topics.

  7. Advanced flight computer. Special study

    NASA Technical Reports Server (NTRS)

    Coo, Dennis

    1995-01-01

This report documents a special study to define a 32-bit radiation hardened, SEU tolerant flight computer architecture, and to investigate current or near-term technologies and development efforts that contribute to the Advanced Flight Computer (AFC) design and development. An AFC processing node architecture is defined. Each node may consist of a multi-chip processor as needed. The modular, building block approach uses VLSI technology and packaging methods that demonstrate a feasible AFC module in 1998 that meets the AFC goals. The defined architecture and approach demonstrate a clear low-risk, low-cost path to the 1998 production goal, with intermediate prototypes in 1996.

  8. Development of Research Projects in Advanced Laboratory

    NASA Astrophysics Data System (ADS)

    Yu, Ping; Guha, Suchi

    2008-04-01

The advanced laboratory serves as a bridge between the introductory physics laboratory and scientific research or industrial work for undergraduate students. Students not only study modern physics experiments and techniques but also acquire knowledge of advanced instrumentation. It is therefore of interest to encourage students to apply this knowledge to research projects at a later stage of the course. We have designed several scientific projects for the advanced laboratory to promote students' ability to do independent research. Students work as a team to select a project, search the literature, perform experiments, and give presentations. During the research project, the instructor only provides the necessary equipment, without any pre-knowledge of results, giving students a real flavor of scientific research. Our initial attempt has shown some interesting results. We found that students were strongly motivated by these projects, and student performance exceeded our expectations. Almost all the students in our first offering of the course have now joined graduate programs in physics and materials science. In the future we will also arrange for graduate students to work with undergraduate students to build a collaborative environment. In addition, a more comprehensive method will be used to evaluate student achievement.

  9. The Laboratory for Information and Computer Science.

    ERIC Educational Resources Information Center

    Jensen, Alton P.; Slamecka, Vladimir

    This document briefly explains the relationship between the School of Information Science and the Laboratory for Information and Computer Science at the Georgia Institute of Technology. The explicit purposes of the information science laboratory are spelled out as well as the specific objectives for the 1969/70, 1970/71, and 1971/72 school years.…

  10. Teaching Cardiovascular Integrations with Computer Laboratories.

    ERIC Educational Resources Information Center

    Peterson, Nils S.; Campbell, Kenneth B.

    1985-01-01

Describes a computer-based instructional unit in cardiovascular physiology. The program, which employs simulated laboratory experimental techniques within a problem-solving format, is designed to supplement an animal laboratory and to offer students an integrative approach to physiology through use of microcomputers. Also presents an overview of the…

  11. Pulmonary Testing Laboratory Computer Application

    PubMed Central

    Johnson, Martin E.

    1980-01-01

    An interactive computer application reporting patient pulmonary function data has been developed by Washington, D.C. VA Medical Center staff. A permanent on-line data base of patient demographics, lung capacity, flows, diffusion, arterial blood gases and physician interpretation is maintained by a minicomputer at the hospital. A user oriented application program resulted from development in concert with the clinical users. Rapid program development resulted from employing a newly developed time saving technique that has found wide application at other VA Medical Centers. Careful attention to user interaction has resulted in an application program requiring little training and which has been satisfactorily used by a number of clinicians.

  12. Los Alamos National Laboratory computer benchmarking 1982

    SciTech Connect

    Martin, J.L.

    1983-06-01

    Evaluating the performance of computing machinery is a continual effort of the Computer Research and Applications Group of the Los Alamos National Laboratory. This report summarizes the results of the group's benchmarking activities performed between October 1981 and September 1982, presenting compilation and execution times as well as megaflop rates for a set of benchmark codes. Tests were performed on the following computers: Cray Research, Inc. (CRI) Cray-1S; Control Data Corporation (CDC) 7600, 6600, Cyber 73, Cyber 825, Cyber 835, Cyber 855, and Cyber 205; Digital Equipment Corporation (DEC) VAX 11/780 and VAX 11/782; and Apollo Computer, Inc., Apollo.
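
    In the same spirit as the benchmark codes summarized in the report, the harness below measures a sustained megaflop rate for a DAXPY-style kernel; it is an illustrative stand-in, not one of the LANL benchmark codes.

    ```python
    import time
    import numpy as np

    def daxpy_mflops(n=10_000_000, reps=10):
        """Sustained rate for y = a*x + y, which costs 2 flops per element."""
        x = np.random.rand(n)
        y = np.random.rand(n)
        a = 1.000001
        t0 = time.perf_counter()
        for _ in range(reps):
            y = a * x + y
        elapsed = time.perf_counter() - t0
        return 2.0 * n * reps / elapsed / 1e6

    print(f"{daxpy_mflops():.0f} megaflops sustained")
    ```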

  13. Computers in the General Physics Laboratory.

    ERIC Educational Resources Information Center

    Preston, Daryl W.; Good, R. H.

    1996-01-01

    Provides ideas and outcomes for nine computer laboratory experiments using a commercial eight-bit analog to digital (ADC) interface. Experiments cover statistics; rotation; harmonic motion; voltage, current, and resistance; ADC conversions; temperature measurement; single slit diffraction; and radioactive decay. Includes necessary schematics. (MVL)
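
    One of the listed experiments, radioactive decay, comes down to Poisson counting statistics; the sketch below runs the standard mean-equals-variance check on simulated counts (not data from the article).

    ```python
    import numpy as np

    # Counts recorded in fixed time intervals are Poisson distributed, so the
    # variance of repeated counts should match their mean.
    rng = np.random.default_rng(42)
    rate = 12.5                              # assumed mean counts per interval
    counts = rng.poisson(rate, size=1000)    # simulated Geiger-counter runs
    print(f"mean = {counts.mean():.2f}, variance = {counts.var():.2f}")
    ```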

  14. Computer Based Simulation of Laboratory Experiments.

    ERIC Educational Resources Information Center

    Edward, Norrie S.

    1997-01-01

    Examines computer based simulations of practical laboratory experiments in engineering. Discusses the aims and achievements of lab work (cognitive, process, psychomotor, and affective); types of simulations (model building and behavioral); and the strengths and weaknesses of simulations. Describes the development of a centrifugal pump simulation,…

  15. JPL Robotics Laboratory computer vision software library

    NASA Technical Reports Server (NTRS)

    Cunningham, R.

    1984-01-01

The past ten years of research on computer vision have matured into a powerful real time system comprised of standardized commercial hardware, computers, and pipeline processing laboratory prototypes, supported by an extensive set of image processing algorithms. The software system was constructed to be transportable via the choice of a popular high level language (PASCAL) and a widely used computer (VAX-11/750). It comprises a whole realm of low level and high level processing software that has proven to be versatile for applications ranging from factory automation to space satellite tracking and grappling.

  16. Advanced Materials Laboratory User Test Planning Guide

    NASA Technical Reports Server (NTRS)

    Orndoff, Evelyne

    2012-01-01

    Test process, milestones and inputs are unknowns to first-time users of the Advanced Materials Laboratory. The User Test Planning Guide aids in establishing expectations for both NASA and non-NASA facility customers. The potential audience for this guide includes both internal and commercial spaceflight hardware/software developers. It is intended to assist their test engineering personnel in test planning and execution. Material covered includes a roadmap of the test process, roles and responsibilities of facility and user, major milestones, facility capabilities, and inputs required by the facility. Samples of deliverables, test article interfaces, and inputs necessary to define test scope, cost, and schedule are included as an appendix to the guide.

  17. Computer technology forecasting at the National Laboratories

    SciTech Connect

    Peskin, A M

    1980-01-01

    The DOE Office of ADP Management organized a group of scientists and computer professionals, mostly from their own national laboratories, to prepare an annually updated technology forecast to accompany the Department's five-year ADP Plan. The activities of the task force were originally reported in an informal presentation made at the ACM Conference in 1978. This presentation represents an update of that report. It also deals with the process of applying the results obtained at a particular computing center, Brookhaven National Laboratory. Computer technology forecasting is a difficult and hazardous endeavor, but it can reap considerable advantage. The forecast performed on an industry-wide basis can be applied to the particular needs of a given installation, and thus give installation managers considerable guidance in planning. A beneficial side effect of this process is that it forces installation managers, who might otherwise tend to preoccupy themselves with immediate problems, to focus on longer term goals and means to their ends. (RWR)

  18. Leadership Computing at Oak Ridge National Laboratory

    SciTech Connect

    Kuehn, Jeffery A; Studham, Scott; White III, James B; Fahey, Mark R; Carter, Steven M; Nichols, Jeffrey A

    2005-05-01

Oak Ridge National Laboratory is running the world's largest Cray X1, the world's largest unclassified Cray XT3, and a Cray XD1. In this report we provide an overview of the applications requiring leadership computing and the performance characteristics of the various platforms at ORNL. We then discuss ways in which we are working with Cray to establish a roadmap that will provide hundreds of teraflops of sustained performance while integrating a balance of vector and scalar processors.

  19. Eagleworks Laboratories: Advanced Propulsion Physics Research

    NASA Technical Reports Server (NTRS)

    White, Harold; March, Paul; Williams, Nehemiah; ONeill, William

    2011-01-01

NASA/JSC is implementing an advanced propulsion physics laboratory, informally known as "Eagleworks", to pursue propulsion technologies necessary to enable human exploration of the solar system over the next 50 years, and to enable interstellar spaceflight by the end of the century. This work directly supports the "Breakthrough Propulsion" objectives detailed in the NASA OCT TA02 In-space Propulsion Roadmap, and aligns with the #10 Top Technical Challenge identified in the report. Since the work being pursued by this laboratory is applied scientific research in the areas of the quantum vacuum, gravitation, nature of space-time, and other fundamental physical phenomena, high fidelity testing facilities are needed. The lab will first implement a low-thrust torsion pendulum (<1 uN), and commission the facility with an existing Quantum Vacuum Plasma Thruster. To date, the QVPT line of research has produced data suggesting very high specific impulse coupled with high specific force. If the physics and engineering models can be explored and understood in the lab to allow scaling to power levels pertinent for human spaceflight, 400kW SEP human missions to Mars may become a possibility, and at power levels of 2MW, 1-year transit to Neptune may also be possible. Additionally, the lab is implementing a warp field interferometer that will be able to measure spacetime disturbances down to 150nm. Recent work published by White [1] [2] [3] suggests that it may be possible to engineer spacetime creating conditions similar to what drives the expansion of the cosmos. Although the expected magnitude of the effect would be tiny, it may be a "Chicago pile" moment for this area of physics.
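
    The torsion pendulum converts a tiny steady thrust into a measurable angular deflection: calibrating the torsion constant from the free oscillation period and moment of inertia gives kappa = 4*pi^2*I/T^2, and a force applied at lever arm r then satisfies F = kappa*theta/r. The numbers in the sketch below are illustrative assumptions, not Eagleworks data.

    ```python
    import math

    I = 0.05        # pendulum moment of inertia, kg m^2 (assumed)
    T = 120.0       # free oscillation period, s (assumed)
    r = 0.4         # thruster lever arm, m (assumed)
    theta = 2.0e-6  # measured steady deflection, rad (assumed)

    kappa = 4.0 * math.pi**2 * I / T**2   # torsion constant, N m / rad
    thrust = kappa * theta / r
    print(f"inferred thrust: {thrust * 1e6:.4f} micronewtons")
    ```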

  20. Sandia National Laboratories Advanced Simulation and Computing (ASC) : appraisal method for the implementation of the ASC software quality engineering practices: Version 1.0.

    SciTech Connect

    Turgeon, Jennifer; Minana, Molly A.

    2008-02-01

    This document provides a guide to the process of conducting software appraisals under the Sandia National Laboratories (SNL) ASC Program. The goal of this document is to describe a common methodology for planning, conducting, and reporting results of software appraisals thereby enabling: development of an objective baseline on implementation of the software quality engineering (SQE) practices identified in the ASC Software Quality Plan across the ASC Program; feedback from project teams on SQE opportunities for improvement; identification of strengths and opportunities for improvement for individual project teams; guidance to the ASC Program on the focus of future SQE activities Document contents include process descriptions, templates to promote consistent conduct of appraisals, and an explanation of the relationship of this procedure to the SNL ASC software program.

  1. Computer system for a hospital microbiology laboratory.

    PubMed

    Delorme, J; Cournoyer, G

    1980-07-01

An online computer system has been developed for a university hospital laboratory in microbiology that processes more than 125,000 specimens yearly. The system performs activities such as the printing of reports, fiscal and administrative tasks, quality control of data and techniques, epidemiologic assistance, germ identification, and teaching and research in the different subspecialties of microbiology. Features of interest are smooth sequential transmission of clinical microbiologic test results from the laboratory to medical records, instantaneous display of all results for as long as 16 months, and updating of patient status, room number, and attending physician before the printing of reports. All data stored in the computer file can be retrieved by any data item or combination of such items. The reports are normally produced in the laboratory area by a teleprinter, or by batch at night in case of mechanical failure of the terminal. If the system breaks down, the manually completed request forms can be sent to medical records. Programs were written in COBOL and ASSEMBLY languages.

  2. Computational geomechanics & applications at Sandia National Laboratories.

    SciTech Connect

    Arguello, Jose Guadalupe, Jr.

    2010-04-01

Sandia National Laboratories (SNL) is a multi-program national laboratory in the business of national security, whose primary mission is nuclear weapons (NW). It is a prime contractor to the USDOE, operating under the NNSA, and is one of the three NW national laboratories. It has a long history of involvement in the area of geomechanics, starting with some of the earliest weapons tests at Nevada. Projects in which geomechanics support (in general) and computational geomechanics support (in particular) are at the forefront at Sandia range from those associated with civilian programs to those in the defense programs. SNL has had significant involvement and participation in the Waste Isolation Pilot Plant (low-level defense nuclear waste), the Yucca Mountain Project (formerly proposed for commercial spent fuel and high-level nuclear waste), and the Strategic Petroleum Reserve (the nation's emergency petroleum store). In addition, numerous industrial partners seek out our computational geomechanics expertise, and there are efforts in compressed air and natural gas storage, as well as in CO2 sequestration. Likewise, there have also been collaborative past efforts in the areas of compactable reservoir response, the response of salt structures associated with reservoirs, and basin modeling for the Oil & Gas industry. There are also efforts on the defense front, ranging from assessment of the vulnerability of infrastructure to the defeat of hardened targets, which require an understanding and application of computational geomechanics. Several examples from some of these areas will be described and discussed to give the audience a flavor of the type of work currently being performed at Sandia in the general area of geomechanics.

  3. [Advanced data analysis and visualization for clinical laboratory].

    PubMed

    Inada, Masanori; Yoneyama, Akiko

    2011-01-01

This paper describes visualization techniques that help identify hidden structures in clinical laboratory data. The visualization of data is helpful for a rapid and better understanding of the characteristics of data sets. Various charts help the user identify trends in data. Scatter plots help prevent misinterpretations due to invalid data by identifying outliers. The representation of experimental data in figures is always useful for communicating results to others. Currently, flexible methods such as smoothing methods and latent structure analysis are available owing to the presence of advanced hardware and software. Principal component analysis, which is a well-known technique used to reduce multidimensional data sets, can be carried out on a personal computer. These methods could lead to advanced visualization with regard to exploratory data analysis. In this paper, we present 3 examples in order to introduce advanced data analysis. In the first example, a smoothing spline was fitted to a time-series from a control chart which is not in a state of statistical control. The trend line was clearly extracted from the daily measurements of the control samples. In the second example, principal component analysis was used to identify a new diagnostic indicator for Graves' disease. The multi-dimensional data obtained from patients were reduced to lower dimensions, and the principal components thus obtained summarized the variation in the data set. In the final example, a latent structure analysis for a Gaussian mixture model was used to draw complex density functions suitable for actual laboratory data. As a result, 5 clusters were extracted. The mixed density function of these clusters represented the data distribution graphically. The methods used in the above examples make the creation of complicated models for clinical laboratories simpler and more flexible.
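
    A compact sketch of two of the techniques discussed, principal component analysis followed by a Gaussian mixture, run on synthetic stand-in data rather than clinical results:

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.mixture import GaussianMixture

    # Synthetic stand-in for a multi-analyte panel (rows = specimens,
    # columns = assays); two overlapping populations, not patient data.
    rng = np.random.default_rng(7)
    data = np.vstack([rng.normal(0.0, 1.0, (200, 6)),
                      rng.normal(2.5, 0.8, (100, 6))])

    # Reduce the panel to two principal components, then model the score
    # distribution as a mixture to expose latent clusters.
    scores = PCA(n_components=2).fit_transform(data)
    gmm = GaussianMixture(n_components=2, random_state=0).fit(scores)
    print("estimated cluster weights:", gmm.weights_.round(2))
    ```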

  4. National Laboratory for Advanced Scientific Visualization at UNAM - Mexico

    NASA Astrophysics Data System (ADS)

    Manea, Marina; Constantin Manea, Vlad; Varela, Alfredo

    2016-04-01

In 2015, the National Autonomous University of Mexico (UNAM) joined the family of universities and research centers where advanced visualization and computing play a key role in promoting and advancing missions in research, education, community outreach, and business-oriented consulting. This initiative provides access to a great variety of advanced hardware and software resources and offers a range of consulting services spanning areas related to scientific visualization, among which are: neuroanatomy, embryonic development, genome-related studies, geosciences, geography, and physics- and mathematics-related disciplines. The National Laboratory for Advanced Scientific Visualization delivers services through three main infrastructure environments: the 3D fully immersive display system Cave, the high-resolution parallel visualization system Powerwall, and the high-resolution spherical display Earth Simulator. The entire visualization infrastructure is interconnected to a high-performance computing cluster (HPCC) called ADA, in honor of Ada Lovelace, considered to be the first computer programmer. The Cave is an extra-large room, 3.6 m wide, with images projected on the front, left, and right walls as well as the floor. Specialized crystal-eyes LCD-shutter glasses provide strong stereo depth perception, and a variety of tracking devices allow software to track the position of a user's hand, head, and wand. The Powerwall is designed to bring large amounts of complex data together through parallel computing for team interaction and collaboration. This system is composed of 24 (6x4) high-resolution ultra-thin (2 mm) bezel monitors connected to a high-performance GPU cluster. The Earth Simulator is a large (60") high-resolution spherical display used for global-scale data visualization, such as geophysical, meteorological, climate, and ecology data. The HPCC ADA is a 1000+ computing core system, which offers parallel computing resources to applications that require…

  5. Aerodynamic Analyses Requiring Advanced Computers, part 2

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Papers given at the conference present the results of theoretical research on aerodynamic flow problems requiring the use of advanced computers. Topics discussed include two-dimensional configurations, three-dimensional configurations, transonic aircraft, and the space shuttle.

  6. Aerodynamic Analyses Requiring Advanced Computers, Part 1

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Papers are presented which deal with results of theoretical research on aerodynamic flow problems requiring the use of advanced computers. Topics discussed include: viscous flows, boundary layer equations, turbulence modeling and Navier-Stokes equations, and internal flows.

  7. Bringing Advanced Computational Techniques to Energy Research

    SciTech Connect

    Mitchell, Julie C

    2012-11-17

    Please find attached our final technical report for the BACTER Institute award. BACTER was created as a graduate and postdoctoral training program for the advancement of computational biology applied to questions of relevance to bioenergy research.

  8. Quantum chromodynamics with advanced computing

    SciTech Connect

    Kronfeld, Andreas S.; /Fermilab

    2008-07-01

    We survey results in lattice quantum chromodynamics from groups in the USQCD Collaboration. The main focus is on physics, but many aspects of the discussion are aimed at an audience of computational physicists.

  9. The Particle Beam Optics Interactive Computer Laboratory

    SciTech Connect

Gillespie, G.H.; Hill, B.W.; Brown, N.A.; Babcock, R.C.; Martono, H.; Carey, D.C.

    1997-02-01

The Particle Beam Optics Interactive Computer Laboratory (PBO Lab) is an educational software concept to aid students and professionals in learning about charged particle beams and particle beam optical systems. The PBO Lab is being developed as a cross-platform application and includes four key elements. The first is a graphic user interface shell that provides for a highly interactive learning session. The second is a knowledge database containing information on electric and magnetic optics transport elements. The knowledge database provides interactive tutorials on the fundamental physics of charged particle optics and on the technology used in particle optics hardware. The third element is a graphical construction kit that provides tools for students to interactively and visually construct optical beamlines. The final element is a set of charged particle optics computational engines that compute trajectories, transport beam envelopes, fit parameters to optical constraints and carry out similar calculations for the student designed beamlines. The primary computational engine is provided by the third-order TRANSPORT code. Augmenting TRANSPORT is the multiple ray tracing program TURTLE and a first-order matrix program that includes a space charge model and support for calculating single particle trajectories in the presence of the beam space charge. This paper describes progress on the development of the PBO Lab. © 1997 American Institute of Physics.
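
    The first-order matrix picture behind TRANSPORT-style engines fits in a few lines: each element maps the state (x, x') by a 2x2 matrix, and a beamline is the product of its element matrices. The beamline below is hypothetical, and the sketch is reduced to one transverse plane.

    ```python
    import numpy as np

    def drift(L):
        """Field-free drift of length L (metres)."""
        return np.array([[1.0, L], [0.0, 1.0]])

    def thin_quad(f):
        """Thin-lens quadrupole of focal length f (focusing for f > 0)."""
        return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

    # Hypothetical line: 1 m drift, focusing quad, 2 m drift.
    # The first element traversed appears rightmost in the product.
    beamline = drift(2.0) @ thin_quad(1.5) @ drift(1.0)

    ray = np.array([0.002, 0.001])  # x = 2 mm, x' = 1 mrad
    print("ray at exit:", beamline @ ray)
    ```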

  10. The Particle Beam Optics Interactive Computer Laboratory

    SciTech Connect

    Gillespie, George H.; Hill, Barrey W.; Brown, Nathan A.; Babcock, R. Chris; Martono, Hendy; Carey, David C.

    1997-02-01

    The Particle Beam Optics Interactive Computer Laboratory (PBO Lab) is an educational software concept to aid students and professionals in learning about charged particle beams and particle beam optical systems. The PBO Lab is being developed as a cross-platform application and includes four key elements. The first is a graphic user interface shell that provides for a highly interactive learning session. The second is a knowledge database containing information on electric and magnetic optics transport elements. The knowledge database provides interactive tutorials on the fundamental physics of charged particle optics and on the technology used in particle optics hardware. The third element is a graphical construction kit that provides tools for students to interactively and visually construct optical beamlines. The final element is a set of charged particle optics computational engines that compute trajectories, transport beam envelopes, fit parameters to optical constraints and carry out similar calculations for the student designed beamlines. The primary computational engine is provided by the third-order TRANSPORT code. Augmenting TRANSPORT is the multiple ray tracing program TURTLE and a first-order matrix program that includes a space charge model and support for calculating single particle trajectories in the presence of the beam space charge. This paper describes progress on the development of the PBO Lab.

  11. Applied human factors research at the NASA Johnson Space Center Human-Computer Interaction Laboratory

    NASA Technical Reports Server (NTRS)

    Rudisill, Marianne; Mckay, Timothy D.

    1990-01-01

    The applied human factors research program performed at the NASA Johnson Space Center's Human-Computer Interaction Laboratory is discussed. Research is conducted to advance knowledge in human interaction with computer systems during space crew tasks. In addition, the Laboratory is directly involved in the specification of the human-computer interface (HCI) for space systems in development (e.g., Space Station Freedom) and is providing guidelines and support for HCI design to current and future space missions.

  12. Advanced Biomedical Computing Center (ABCC) | DSITP

    Cancer.gov

The Advanced Biomedical Computing Center (ABCC), located in Frederick, Maryland (MD), provides HPC resources for both NIH/NCI intramural scientists and the extramural biomedical research community. Its mission is to provide HPC support, to engage in collaborative research, and to conduct in-house research in various areas of computational biology and biomedical research.

  13. Advanced laptop and small personal computer technology

    NASA Technical Reports Server (NTRS)

    Johnson, Roger L.

    1991-01-01

Advanced laptop and small personal computer technology is presented in the form of viewgraphs. The following areas of hand-carried computer and mobile workstation technology are covered: background, applications, high-end products, technology trends, requirements for the Control Center application, and recommendations for the future.

  14. Advancing crime scene computer forensics techniques

    NASA Astrophysics Data System (ADS)

    Hosmer, Chet; Feldman, John; Giordano, Joe

    1999-02-01

Computers and network technology have become inexpensive and powerful tools that can be applied to a wide range of criminal activity. Computers have changed the world's view of evidence because computers are used more and more as tools in committing 'traditional crimes' such as embezzlement, theft, extortion and murder. This paper will focus on reviewing the current state-of-the-art of the data recovery and evidence construction tools used in both the field and laboratory for prosecution purposes.

  15. Opportunities in computational mechanics: Advances in parallel computing

    SciTech Connect

    Lesar, R.A.

    1999-02-01

    In this paper, the authors will discuss recent advances in computing power and the prospects for using these new capabilities for studying plasticity and failure. They will first review the new capabilities made available with parallel computing. They will discuss how these machines perform and how well their architecture might work on materials issues. Finally, they will give some estimates on the size of problems possible using these computers.

  16. Role of HPC in Advancing Computational Aeroelasticity

    NASA Technical Reports Server (NTRS)

    Guruswamy, Guru P.

    2004-01-01

On behalf of the High Performance Computing Modernization Program (HPCMP) and the NASA Advanced Supercomputing Division (NAS), a study is conducted to assess the role of supercomputers in computational aeroelasticity of aerospace vehicles. The study is mostly based on responses to a web-based questionnaire that was designed to capture the nuances of high performance computational aeroelasticity, particularly on parallel computers. A procedure is presented to assign a fidelity-complexity index to each application. Case studies based on major applications using HPCMP resources are presented.

  17. Advances and trends in computational structural mechanics

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Atluri, Satya N.

    1987-01-01

    The development status and applicational range of techniques in computational structural mechanics (CSM) are evaluated with a view to advances in computational models for material behavior, discrete-element technology, quality assessment, the control of numerical simulations of structural response, hybrid analysis techniques, techniques for large-scale optimization, and the impact of new computing systems on CSM. Primary pacers of CSM development encompass prediction and analysis of novel materials for structural components, computational strategies for large-scale structural calculations, and the assessment of response prediction reliability together with its adaptive improvement.

  18. Results of Laboratory Testing of Advanced Power Strips: Preprint

    SciTech Connect

    Earle, L.; Sparn, B.

    2012-08-01

    This paper describes the results of a laboratory investigation to evaluate the technical performance of advanced power strip (APS) devices when subjected to a range of home entertainment center and home office usage scenarios.

  19. OPENING REMARKS: Scientific Discovery through Advanced Computing

    NASA Astrophysics Data System (ADS)

    Strayer, Michael

    2006-01-01

    as the national and regional electricity grid, carbon sequestration, virtual engineering, and the nuclear fuel cycle. The successes of the first five years of SciDAC have demonstrated the power of using advanced computing to enable scientific discovery. One measure of this success could be found in the President’s State of the Union address in which President Bush identified ‘supercomputing’ as a major focus area of the American Competitiveness Initiative. Funds were provided in the FY 2007 President’s Budget request to increase the size of the NERSC-5 procurement to between 100 and 150 teraflops, to upgrade the LCF Cray XT3 at Oak Ridge to 250 teraflops, and to acquire a 100 teraflop IBM BlueGene/P to establish the Leadership Computing Facility at Argonne. We believe that we are on a path to establish a petascale computing resource for open science by 2009. We must develop software tools, packages, and libraries as well as the scientific application software that will scale to hundreds of thousands of processors. Computer scientists from universities and the DOE’s national laboratories will be asked to collaborate on the development of the critical system software components such as compilers, light-weight operating systems and file systems. Standing up these large machines will not be business as usual for ASCR. We intend to develop a series of interconnected projects that identify cost, schedule, risks, and scope for the upgrades at the LCF at Oak Ridge, the establishment of the LCF at Argonne, and the development of the software to support these high-end computers. The critical first step in defining the scope of the project is to identify a set of early application codes for each leadership class computing facility. These codes will have access to the resources during the commissioning phase of the facility projects and will be part of the acceptance tests for the machines. Applications will be selected, in part, by breakthrough science, scalability, and

  20. Advanced computational tools for 3-D seismic analysis

    SciTech Connect

    Barhen, J.; Glover, C.W.; Protopopescu, V.A.

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis and to test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY 1993-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the computational innovations that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  1. TOPICAL REVIEW: Advances and challenges in computational plasma science

    NASA Astrophysics Data System (ADS)

    Tang, W. M.; Chan, V. S.

    2005-02-01

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behaviour. Recent advances in simulations of magnetically confined plasmas are reviewed in this paper, with illustrative examples, chosen from associated research areas such as microturbulence, magnetohydrodynamics and other topics. Progress has been stimulated, in particular, by the exponential growth of computer speed along with significant improvements in computer technology. The advances in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics have produced increasingly good agreement between experimental observations and computational modelling. This was enabled by two key factors: (a) innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales and (b) access to powerful new computational resources. Excellent progress has been made in developing codes for which computer run-time and problem-size scale well with the number of processors on massively parallel processors (MPPs). Examples include the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPPs to produce three-dimensional, general geometry, nonlinear particle simulations that have accelerated advances in understanding the nature of turbulence self-regulation by zonal flows. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In looking towards the future, the current results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. This
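
    The quoted scale, billions of particles advanced for thousands of time-steps, implies memory and operation counts that are easy to bound. The arithmetic sketch below is illustrative only; the per-particle storage and work figures are assumptions, not numbers taken from the review.

      # Order-of-magnitude costs for the particle simulations described
      # above: ~1e9 particles for ~1e4 time-steps on a multi-teraflop MPP.
      # Per-particle storage and work are illustrative assumptions.
      particles = 1e9
      steps = 1e4
      doubles_per_particle = 10        # phase-space coords, weights, fields
      flops_per_particle_step = 200    # push + gather/scatter, assumed

      memory_gb = particles * doubles_per_particle * 8 / 1e9
      total_flops = particles * steps * flops_per_particle_step
      hours = total_flops / (0.10 * 5e12) / 3600   # 10% of a 5 TF machine
      print(f"memory ~ {memory_gb:.0f} GB, work ~ {total_flops:.1e} flops,"
            f" ~ {hours:.1f} h at 10% of 5 TF peak")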

  2. A Multistep Synthesis for an Advanced Undergraduate Organic Chemistry Laboratory

    ERIC Educational Resources Information Center

    Chang Ji; Peters, Dennis G.

    2006-01-01

    Multistep syntheses are often important components of the undergraduate organic laboratory experience, and a three-step synthesis of 5-(2-sulfhydrylethyl)salicylaldehyde is described. The experiment is useful as a special project for an advanced undergraduate organic chemistry laboratory course and offers opportunities for students to master a…

  3. Advances and Challenges in Computational Plasma Science

    SciTech Connect

    W.M. Tang; V.S. Chan

    2005-01-03

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behavior. Recent advances in simulations of magnetically-confined plasmas are reviewed in this paper with illustrative examples chosen from associated research areas such as microturbulence, magnetohydrodynamics, and other topics. Progress has been stimulated in particular by the exponential growth of computer speed along with significant improvements in computer technology.

  4. A Virtual Laboratory on Natural Computing: A Learning Experiment

    ERIC Educational Resources Information Center

    de Castro, Leandro Nunes; Muñoz, Yupanqui Julho; de Freitas, Leandro Rubim; El-Hani, Charbel Niño

    2008-01-01

    Natural computing is a terminology used to describe computational algorithms developed by taking inspiration from information processing mechanisms in nature, methods to synthesize natural phenomena in computers, and novel computational approaches based on natural materials. The virtual laboratory on natural computing (LVCoN) is a Web environment…

  5. An advanced laboratory course that emphasizes communication

    NASA Astrophysics Data System (ADS)

    Rieger, Georg

    2012-10-01

    I will introduce a fourth-year laboratory course that has a strong focus on communication skills. The course is meant to give students a preview of how experimental physics is performed in an academic or industrial research lab. The design is such that the course approximates the experience of a graduate student in a research group, which I regard as an ideal learning environment. I will contrast this with the learning experience in a typical first- or second-year lab. Results from a small survey are also presented.

  6. Advanced Computing Architectures for Cognitive Processing

    DTIC Science & Technology

    2009-07-01

  7. Predictive Dynamic Security Assessment through Advanced Computing

    SciTech Connect

    Huang, Zhenyu; Diao, Ruisheng; Jin, Shuangshuang; Chen, Yousu

    2014-11-30

    Traditional dynamic security assessment is limited by several factors and thus falls short of providing real-time, predictive information for power system operation. These factors include the steady-state assumption of current operating points, static transfer limits, and low computational speed. This paper addresses these factors and frames predictive dynamic security assessment. The primary objective of predictive dynamic security assessment is to enhance the functionality and computational process of dynamic security assessment through the use of high-speed phasor measurements and the application of advanced computing technologies for faster-than-real-time simulation. This paper presents algorithms, computing platforms, and simulation frameworks that constitute the predictive dynamic security assessment capability. Examples of phasor application and fast computation for dynamic security assessment are included to demonstrate the feasibility and speed enhancement for real-time applications.
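
    "Faster than real time" has a concrete operational meaning here: simulating T seconds of grid dynamics must take well under T seconds of wall clock. The following is a minimal single-machine sketch of that idea, using the classical swing equation with hypothetical per-unit constants; the paper's parallel, measurement-driven framework is of course far beyond this.

      # Illustration of faster-than-real-time dynamic simulation: integrate
      # a one-machine swing equation over a 10 s horizon and compare wall
      # clock with simulated time. All machine constants are hypothetical.
      import math
      import time

      H, D, Pm, Pmax, f0 = 5.0, 1.0, 0.9, 1.8, 60.0
      delta = math.asin(Pm / Pmax) + 0.2   # rotor angle, perturbed
      omega = 0.0                          # speed deviation (rad/s)
      dt, horizon = 1e-4, 10.0             # step size, simulated seconds

      t0 = time.perf_counter()
      for _ in range(int(horizon / dt)):
          Pe = Pmax * math.sin(delta)                               # electrical power
          delta += omega * dt                                       # explicit Euler
          omega += (math.pi * f0 / H) * (Pm - Pe - D * omega) * dt  # swing equation
      wall = time.perf_counter() - t0
      print(f"simulated {horizon:.0f} s in {wall:.2f} s"
            f" (speedup x{horizon / wall:.0f})")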

  8. Advances in laboratory geophysics using bender elements

    NASA Astrophysics Data System (ADS)

    Rio, Joao Filipe Meneses Espinheira

    Bender element transducers are used to determine the small-strain shear stiffness, G0, of soil by measuring the velocity of propagation of mechanical waves through tested samples. They are normally used in the laboratory, on their own or incorporated in geotechnical equipment such as triaxial cells or oedometers. Different excitation signals and interpretation methods are presently applied, each producing different results. The initial assumptions of unbounded wave propagation, generally used in bender element testing and inherited from seismic cross-hole testing, are quite crude and do not account for specific boundary conditions, which might explain the lack of reliability of the results. The main objective of this study is to establish the influence of sample and transducer geometry on the behaviour of a typical bender element test system. Laboratory and numerical tests, supported by a theoretical analytical study, are conducted and the results presented in order to achieve this goal. An independent monitoring of the dynamic behaviour of the bender elements and samples is also carried out. Using a laser velocimeter, capable of recording the motion of the subjects without interference, their dynamic responses can be obtained and their mechanical properties verified. A parametric study dealing with sample geometry is presented, in which 24 samples with different geometries are tested. Synthetic rubber is used as a substitute for soft clay, owing to the great number of samples involved and the need to guarantee the constancy of their properties. The numerical analysis makes use of three-dimensional finite difference models with different geometries. A regressive analysis is possible since the elastic properties of the system are pre-determined and used to evaluate the results. A numerical analysis also has the benefit of providing the response not only at a single receiving point but at any node in the model.
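
    The quantity at the heart of the test lends itself to a compact worked example. Below is a minimal sketch of the standard reduction, assuming the usual plane-wave relation G0 = rho * Vs^2 (precisely the idealization the thesis interrogates); the tip-to-tip distance, travel time and density are hypothetical, not values from the study.

      # Minimal sketch of the standard bender-element reduction: shear-wave
      # velocity from tip-to-tip distance and picked arrival time, then
      # small-strain shear stiffness G0 = rho * Vs**2.
      # All numbers below are hypothetical, for illustration only.
      def small_strain_stiffness(tip_to_tip_m, travel_time_s, density_kg_m3):
          """Return (Vs in m/s, G0 in Pa) for a single bender-element test."""
          vs = tip_to_tip_m / travel_time_s   # plane-wave assumption
          g0 = density_kg_m3 * vs ** 2        # G0 = rho * Vs^2
          return vs, g0

      vs, g0 = small_strain_stiffness(0.076, 4.2e-4, 1900.0)
      print(f"Vs = {vs:.1f} m/s, G0 = {g0 / 1e6:.1f} MPa")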

  9. Argonne's Laboratory Computing Resource Center: 2006 annual report.

    SciTech Connect

    Bair, R. B.; Kaushik, D. K.; Riley, K. R.; Valdes, J. V.; Drugan, C. D.; Pieper, G. P.

    2007-05-31

    Argonne National Laboratory founded the Laboratory Computing Resource Center (LCRC) in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. In September 2002 the LCRC deployed a 350-node computing cluster from Linux NetworX to address Laboratory needs for mid-range supercomputing. This cluster, named 'Jazz', achieved over a teraflop of computing power (10{sup 12} floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the 50 fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2006, there were 76 active projects on Jazz involving over 380 scientists and engineers. These projects represent a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to foster growth in the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure providers to offer more scientific data management capabilities, expanding Argonne staff use of national

  10. Advances in laboratory diagnosis of hereditary spherocytosis.

    PubMed

    Farias, Mariela Granero

    2016-11-12

    Among the red cell membrane disorders, hereditary spherocytosis (HS) is one of the most common causes of inherited hemolytic anemia. HS results from the deficiency or dysfunction of red blood cell membrane proteins, such as α-spectrin, β-spectrin, ankyrin, anion channel protein (Band 3 protein), protein 4.1 and protein 4.2. Conventionally, HS diagnosis is established through a series of tests, which include identification of spherocytes in the peripheral smear, reticulocyte count, osmotic fragility, etc. Currently, different hematological analyzers provide erythrocyte indicators that estimate the presence of spherocytes and correlate it with HS, which can be useful for disease screening. The most traditional method is the osmotic fragility (OF) test, which is labor-intensive and time-consuming to perform and has low sensitivity and specificity. Thus, new methods have been developed for HS diagnosis, such as flow cytometry. Current guidelines recommend the use of flow cytometry as a screening test for HS diagnosis using the eosin-5'-maleimide (EMA) binding test. Thus, HS diagnosis is the result of a collaboration between clinicians and laboratories, which should take into account the family history and the exclusion of other causes of secondary spherocytosis.

  11. Argonne Laboratory Computing Resource Center - FY2004 Report.

    SciTech Connect

    Bair, R.

    2005-04-14

    In the spring of 2002, Argonne National Laboratory founded the Laboratory Computing Resource Center, and in April 2003 LCRC began full operations with Argonne's first teraflops computing cluster. The LCRC's driving mission is to enable and promote computational science and engineering across the Laboratory, primarily by operating computing facilities and supporting application use and development. This report describes the scientific activities, computing facilities, and usage in the first eighteen months of LCRC operation. In this short time LCRC has had broad impact on programs across the Laboratory. The LCRC computing facility, Jazz, is available to the entire Laboratory community. In addition, the LCRC staff provides training in high-performance computing and guidance on application usage, code porting, and algorithm development. All Argonne personnel and collaborators are encouraged to take advantage of this computing resource and to provide input into the vision and plans for computing and computational analysis at Argonne. Steering for LCRC comes from the Computational Science Advisory Committee, composed of computing experts from many Laboratory divisions. The CSAC Allocations Committee makes decisions on individual project allocations for Jazz.

  12. Technological advances in the hemostasis laboratory.

    PubMed

    Lippi, Giuseppe; Plebani, Mario; Favaloro, Emmanuel J

    2014-03-01

    Automation is conventionally defined as the use of machines, control systems, and information technologies to optimize productivity. Although automation is now commonplace in several areas of diagnostic testing, especially in clinical chemistry and immunochemistry, the concept of extending this process to hemostasis testing has only recently been advanced. The leading drawbacks are still represented by the almost unique biological matrix, because citrated plasma can be used only for clotting assays and few other notable exceptions, and by the highly specific pretreatment of samples, which is quite distinct from that of other test systems. Despite these important limitations, a certain degree of automation is also now embracing hemostasis testing. The more relevant developments include the growing integration of routine hemostasis analyzers with track line systems and workcells, the development of specific instrumentation tools to enhance reliability of testing (i.e., signal detection with different technologies to increase test panels, plasma indices for preanalytical checks of interfering substances, failure-pattern sensors for identifying insufficient volume, clots or bubbles, cap-piercing for enhancing operator safety, automatic reflex testing, automatic redilution of samples, and laser barcode readers), preanalytical features (e.g., positive identification, automatic systems for tube labeling, transillumination devices), and postphlebotomy tools (pneumatic tube systems for reducing turnaround time, sample transport boxes for ensuring stability of specimens, monitoring systems for identifying unsuitable conditions of transport). Regardless of these important innovations, coagulation/hemostasis testing still requires specific technical and clinical expertise, not only in terms of measurement procedures but also for interpreting and then appropriately utilizing the derived information. Thus, additional and special caution has to be used when designing projects of

  13. [Recent advances in immunologic laboratory testing for rheumatic diseases].

    PubMed

    Akahoshi, Tohru

    2004-09-01

    Immunologic laboratory tests serve critical roles in the care of patients with various rheumatic diseases. These tests can provide relevant information about rheumatic diseases based on their immunopathophysiological condition. Although immunologic laboratory tests are quite useful for determining diagnosis and estimating disease activity, organ involvement and prognosis, they are frequently misused, resulting in inappropriate diagnosis and treatment. Appropriate use of immunologic laboratory tests and accurate clinical interpretation of the test results can prevent false diagnosis and unnecessary treatment. In order to improve the clinical care of patients with rheumatic diseases, clinicians caring for such patients should recognize the meanings, characteristics and limitations of each immunologic laboratory test result. This article reviews recent advances in immunologic laboratory testing, such as antinuclear antibody, anti-DNA antibody and anti-neutrophil cytoplasmic antibody testing, and introduces guidelines for these tests. These evidence-based guidelines may contribute to the appropriate use of immunologic laboratory tests for rheumatic diseases.

  14. Advances in Computational Capabilities for Hypersonic Flows

    NASA Technical Reports Server (NTRS)

    Kumar, Ajay; Gnoffo, Peter A.; Moss, James N.; Drummond, J. Philip

    1997-01-01

    The paper reviews the growth and advances in computational capabilities for hypersonic applications over the period from the mid-1980's to the present day. The current status of the code development issues such as surface and field grid generation, algorithms, physical and chemical modeling, and validation is provided. A brief description of some of the major codes being used at NASA Langley Research Center for hypersonic continuum and rarefied flows is provided, along with their capabilities and deficiencies. A number of application examples are presented, and future areas of research to enhance accuracy, reliability, efficiency, and robustness of computational codes are discussed.

  15. The Advanced Controls Program at Oak Ridge National Laboratory

    SciTech Connect

    Knee, H.E.; White, J.D.

    1990-01-01

    The Oak Ridge National Laboratory (ORNL), under sponsorship of the US Department of Energy (DOE), is conducting research that will lead to advanced, automated control of new liquid-metal-reactor (LMR) nuclear power plants. Although this program of research (entitled the "Advanced Controls Program") is focused on LMR technology, it will be capable of providing control design, test, and qualification capability for other advanced reactor designs (e.g., the advanced light water reactor (ALWR) and high temperature gas-cooled reactor (HTGR) designs), while also benefiting existing nuclear plants. The Program will also have applicability to complex, non-nuclear process control environments (e.g., petrochemical, aerospace, etc.). The Advanced Controls Program will support capabilities throughout the entire plant design life cycle, i.e., from the initial interactive first-principle dynamic model development for the process, systems, components, and instruments through advanced control room qualification. The current program involves five principal areas of research activities: (1) demonstrations of advanced control system designs, (2) development of an advanced controls design environment, (3) development of advanced control strategies, (4) research and development (R&D) in human-system integration for advanced control system designs, and (5) testing and validation of advanced control system designs. Discussion of the research in these five areas forms the basis of this paper. Also included is a description of the research directions of the program. 8 refs.

  16. Determination of Absolute Zero Using a Computer-Based Laboratory

    ERIC Educational Resources Information Center

    Amrani, D.

    2007-01-01

    We present a simple computer-based laboratory experiment for evaluating absolute zero in degrees Celsius, which can be performed in college and undergraduate physical sciences laboratory courses. With a computer, the absolute zero apparatus can help demonstrators or students observe the relationship between temperature and pressure and use…
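
    The underlying reduction is a linear extrapolation: at constant volume the pressure of a (nearly ideal) gas falls linearly with temperature, and the intercept at P = 0 estimates absolute zero. Below is a minimal sketch with hypothetical readings; the article's apparatus and data are not reproduced here.

      # Estimate absolute zero by extrapolating constant-volume P-T data
      # to P = 0. The (T, P) pairs below are hypothetical illustration
      # data, not the article's measurements.
      import numpy as np

      temps_c = np.array([0.0, 20.0, 40.0, 60.0, 80.0])              # deg C
      pressures_kpa = np.array([101.3, 108.7, 116.1, 123.6, 131.0])  # kPa

      slope, intercept = np.polyfit(temps_c, pressures_kpa, 1)
      abs_zero_c = -intercept / slope   # temperature where the fit hits P = 0
      print(f"Estimated absolute zero: {abs_zero_c:.1f} deg C")
      # -> close to -273 deg C for near-ideal data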

  17. A Software Laboratory Environment for Computer-Based Problem Solving.

    ERIC Educational Resources Information Center

    Kurtz, Barry L.; O'Neal, Micheal B.

    This paper describes a National Science Foundation-sponsored project at Louisiana Technological University to develop computer-based laboratories for "hands-on" introductions to major topics of computer science. The underlying strategy is to develop structured laboratory environments that present abstract concepts through the use of…

  18. Mentoring for retention and advancement in the multigenerational clinical laboratory.

    PubMed

    Laudicina, R J

    2001-01-01

    Retention of recent graduates and other laboratory practitioners in the workplace will play a key role in addressing current and projected shortages of clinical laboratory scientists (CLS) and technicians (CLT). In addition, with overrepresentation of the aging Baby Boomer generation in laboratory supervisory and management positions, it is crucial not only to retain younger practitioners, but to prepare them for assuming these important functions in the future. Mentoring, a practice commonly employed in other professions, is widely considered to be useful in employee retention and career advancement. Mentoring has probably been used in the clinical laboratory profession, but has not been well documented. In the clinical laboratory environment, potential mentors are in the Veteran and Baby Boomer generations, and new practitioners who could benefit from mentoring are in Generation X. Generational differences among these groups may present challenges to the use of mentoring. This article will attempt to provide a better understanding of generational differences and show how mentoring can be applied in the setting of the clinical laboratory in order to increase retention and promote career advancement of younger practitioners. A panel of five laboratory managers provided examples of mentoring strategies. Definitions, benefits, and examples of mentoring are addressed in the accompanying article, "Passing the Torch: Mentoring the Next Generation of Laboratory Professionals".

  19. Advanced Propulsion Concepts at the Jet Propulsion Laboratory

    NASA Technical Reports Server (NTRS)

    Brophy, J. R.

    1997-01-01

    Current interest in advanced propulsion within NASA and research activities in advanced propulsion concepts at the Jet Propulsion Laboratory are reviewed. The concepts, which include high power plasma thrusters such as lithium-fueled Lorentz-Force-Accelerators, MEMS-scale propulsion systems, in-situ propellant utilization techniques, fusion propulsion systems, and methods of using antimatter, offer the potential for either significantly enhancing space transportation capability as compared with that of traditional chemical propulsion, or enabling ambitious new missions.

  20. Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2000-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center. It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. Ames has been designated NASA's Center of Excellence in Information Technology. In this capacity, Ames is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA Ames and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth; (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities; (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply information technology research to a

  1. Airborne Advanced Reconfigurable Computer System (ARCS)

    NASA Technical Reports Server (NTRS)

    Bjurman, B. E.; Jenkins, G. M.; Masreliez, C. J.; Mcclellan, K. L.; Templeman, J. E.

    1976-01-01

    A digital computer subsystem fault-tolerant concept was defined, and the potential benefits and costs of such a subsystem were assessed when used as the central element of a new transport's flight control system. The derived advanced reconfigurable computer system (ARCS) is a triple-redundant computer subsystem that automatically reconfigures, under multiple fault conditions, from triplex to duplex to simplex operation, with redundancy recovery if the fault condition is transient. The study included criteria development covering factors at the aircraft's operation level that would influence the design of a fault-tolerant system for commercial airline use. A new reliability analysis tool was developed for evaluating redundant, fault-tolerant system availability and survivability; and a stringent digital system software design methodology was used to achieve design/implementation visibility.
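
    The survivability claim invites a back-of-envelope check. The following is a minimal Monte Carlo sketch, assuming perfect fault detection and reconfiguration (triplex to duplex to simplex, so the subsystem survives while any one channel still works) and exponential channel failures; the failure rate and mission time are hypothetical, and the study's actual reliability analysis tool is not reproduced here.

      # Monte Carlo sketch of ARCS-style survivability: a triple-redundant
      # subsystem that reconfigures triplex -> duplex -> simplex survives
      # as long as at least one channel is still working. Assumes perfect
      # coverage and exponential failures; rate and mission are hypothetical.
      import random

      def mission_survival(lam_per_hr=1e-4, mission_hr=10.0, trials=200_000):
          """P(at least one of three channels outlives the mission)."""
          survived = 0
          for _ in range(trials):
              fails = [random.expovariate(lam_per_hr) for _ in range(3)]
              if max(fails) > mission_hr:   # last channel outlives mission
                  survived += 1
          return survived / trials

      print(f"estimated survival probability: {mission_survival():.6f}")
      # analytic check with perfect coverage: 1 - (1 - exp(-lam*t))**3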

  2. Advances in Measurement Technology at NIST's Physical Measurement Laboratory

    NASA Astrophysics Data System (ADS)

    Dehmer, Joseph

    2014-03-01

    The NIST mission is to promote U.S. innovation and industrial competitiveness by advancing measurement science, standards, and technology. The Physical Measurement Laboratory (PML) has responsibility for maintaining national standards for two dozen physical quantities needed for international trade; and, importantly, it carries out advanced research at the frontiers of measurement science to enable extending innovation into new realms and new markets. This talk will highlight advances being made across several sectors of technology; and it will describe how PML interacts with its many collaborators and clients in industry, government, and academe.

  3. A Hierarchical Computer Network: An Alternative Approach to Clinical Laboratory Computerization in a Large Hospital

    PubMed Central

    Miller, Robert E.; Steinbach, Glen L.; Dayhoff, Ruth E.

    1980-01-01

    Computerized data handling is recognized as an essential aspect of the modern clinical laboratory in medium and large sized hospitals. However, most currently installed proprietary or “turnkey” systems are often hardware/software-constrained and based on outmoded design concepts which seriously limit the use of the laboratory computer system as an effective patient care, research, and management tool. These shortcomings are particularly serious in the large university teaching hospital. Recent improvements in the price/performance ratio for computer hardware, the availability of specialized high-level applications-oriented languages, and advances in data communications have permitted development of powerful computer networks which are economically feasible in the large hospital setting. An operational three-tiered hierarchical network for clinical laboratory data processing is described. The integration of the clinical laboratory data processing function into overall institutional information processing, details of the computer system configuration, and the benefits realized are discussed.

  4. Exploring Electronics Laboratory Experiments Using Computer Software

    ERIC Educational Resources Information Center

    Gandole, Yogendra Babarao

    2011-01-01

    The roles of teachers and students are changing, and there are undoubtedly ways of learning not yet discovered. However, computer and software technology may play a significant role in identifying problems, presenting solutions, and supporting life-long learning. It is clear that computer-based educational technology has reached the point where…

  5. Polybrominated Diphenyl Ethers in Dryer Lint: An Advanced Analysis Laboratory

    ERIC Educational Resources Information Center

    Thompson, Robert Q.

    2008-01-01

    An advanced analytical chemistry laboratory experiment is described that involves environmental analysis and gas chromatography-mass spectrometry. Students analyze lint from clothes dryers for traces of flame retardant chemicals, polybrominated diphenylethers (PBDEs), compounds receiving much attention recently. In a typical experiment, ng/g…

  6. Results of Laboratory Testing of Advanced Power Strips

    SciTech Connect

    Earle, L.; Sparn, B.

    2012-08-01

    Presented at the ACEEE Summer Study on Energy Efficiency in Buildings on August 12-17, 2012, this presentation reports on laboratory tests of 20 currently available advanced power strip products, which reduce wasteful electricity use of miscellaneous electric loads in buildings.

  7. A Simultaneous Analysis Problem for Advanced General Chemistry Laboratories.

    ERIC Educational Resources Information Center

    Leary, J. J.; Gallaher, T. N.

    1983-01-01

    Oxidation of magnesium metal in air has been used as an introductory experiment for determining the formula of a compound. The experiment described employs essentially the same laboratory procedure but is significantly more advanced in terms of information sought. Procedures and sample calculations/results are provided. (JN)

  8. A Reverse Osmosis System for an Advanced Separation Process Laboratory.

    ERIC Educational Resources Information Center

    Slater, C. S.; Paccione, J. D.

    1987-01-01

    Focuses on the development of a pilot unit for use in an advanced separations process laboratory in an effort to develop experiments on such processes as reverse osmosis, ultrafiltration, adsorption, and chromatography. Discusses reverse osmosis principles, the experimental system design, and some experimental studies. (TW)

  9. Argonne's Laboratory Computing Resource Center: 2005 annual report.

    SciTech Connect

    Bair, R. B.; Coghlan, S. C; Kaushik, D. K.; Riley, K. R.; Valdes, J. V.; Pieper, G. P.

    2007-06-30

    Argonne National Laboratory founded the Laboratory Computing Resource Center in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. The first goal of the LCRC was to deploy a mid-range supercomputing facility to support the unmet computational needs of the Laboratory. To this end, in September 2002, the Laboratory purchased a 350-node computing cluster from Linux NetworX. This cluster, named 'Jazz', achieved over a teraflop of computing power (10{sup 12} floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the fifty fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2005, there were 62 active projects on Jazz involving over 320 scientists and engineers. These projects represent a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to improve the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure providers to develop

  10. Analog Computer Laboratory with Biological Examples.

    ERIC Educational Resources Information Center

    Strebel, Donald E.

    1979-01-01

    The use of biological examples in teaching applications of the analog computer is discussed and several examples from mathematical ecology, enzyme kinetics, and tracer dynamics are described. (Author/GA)
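
    The biological models patched on an analog computer translate directly into numerical integration. Below is a minimal digital counterpart for one of the abstract's example domains, mathematical ecology: forward-Euler integration of the Lotka-Volterra predator-prey equations, with hypothetical coefficients (the article's own examples are not reproduced here).

      # Digital counterpart of an analog-computer exercise in mathematical
      # ecology: forward-Euler integration of the Lotka-Volterra
      # predator-prey equations. Coefficients and initial conditions are
      # hypothetical illustration values.
      def lotka_volterra(prey=10.0, pred=5.0, a=1.0, b=0.1, c=1.5, d=0.075,
                         dt=0.001, steps=20_000):
          """Integrate dx/dt = ax - bxy, dy/dt = dxy - cy; sample the path."""
          path = []
          for i in range(steps):
              dx = (a * prey - b * prey * pred) * dt
              dy = (d * prey * pred - c * pred) * dt
              prey, pred = prey + dx, pred + dy
              if i % 2_000 == 0:
                  path.append((i * dt, prey, pred))
          return path

      for t, x, y in lotka_volterra():
          print(f"t={t:5.2f}  prey={x:7.2f}  predator={y:7.2f}")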

  11. Controlling Laboratory Processes From A Personal Computer

    NASA Technical Reports Server (NTRS)

    Will, H.; Mackin, M. A.

    1991-01-01

    Computer program provides natural-language process control from IBM PC or compatible computer. Sets up process-control system that either runs without operator or run by workers who have limited programming skills. Includes three smaller programs. Two of them, written in FORTRAN 77, record data and control research processes. Third program, written in Pascal, generates FORTRAN subroutines used by other two programs to identify user commands with device-driving routines written by user. Also includes set of input data allowing user to define user commands to be executed by computer. Requires personal computer operating under MS-DOS with suitable hardware interfaces to all controlled devices. Also requires FORTRAN 77 compiler and device drivers written by user.

  12. 75 FR 64720 - DOE/Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-20

    .../Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing... Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U. S. Department...

  13. 75 FR 9887 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-04

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing... Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U.S. Department...

  14. 78 FR 6087 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-29

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing..., Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U. S. Department of...

  15. 75 FR 43518 - Advanced Scientific Computing Advisory Committee; Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-26

    ... Advanced Scientific Computing Advisory Committee; Meeting AGENCY: Office of Science, DOE. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing Advisory..., Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U. S. Department of...

  16. 76 FR 41234 - Advanced Scientific Computing Advisory Committee Charter Renewal

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-13

    ... Advanced Scientific Computing Advisory Committee Charter Renewal AGENCY: Department of Energy, Office of... Administration, notice is hereby given that the Advanced Scientific Computing Advisory Committee will be renewed... concerning the Advanced Scientific Computing program in response only to charges from the Director of...

  17. 76 FR 9765 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-22

    ... Advanced Scientific Computing Advisory Committee AGENCY: Office of Science, Department of Energy. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing..., Office of Advanced Scientific Computing Research, SC-21/Germantown Building, U.S. Department of...

  18. 77 FR 45345 - DOE/Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-31

    .../Advanced Scientific Computing Advisory Committee AGENCY: Office of Science, Department of Energy. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing... Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U.S. Department...

  19. 78 FR 41046 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-09

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION... hereby given that the Advanced Scientific Computing Advisory Committee will be renewed for a two-year... (DOE), on the Advanced Scientific Computing Research Program managed by the Office of...

  20. Computational Design of Advanced Nuclear Fuels

    SciTech Connect

    Savrasov, Sergey; Kotliar, Gabriel; Haule, Kristjan

    2014-06-03

    The objective of the project was to develop a method for theoretical understanding of nuclear fuel materials whose physical and thermophysical properties can be predicted from first principles using a novel dynamical mean field method for electronic structure calculations. We concentrated our study on uranium, plutonium, their oxides, nitrides, and carbides, as well as some rare earth materials whose 4f electrons provide a simplified framework for understanding the complex behavior of the f electrons. We addressed issues connected to the electronic structure, lattice instabilities, phonon and magnon dynamics, as well as thermal conductivity. This allowed us to evaluate characteristics of advanced nuclear fuel systems using computer-based simulations and avoid costly experiments.

  1. ATCA for Machines-- Advanced Telecommunications Computing Architecture

    SciTech Connect

    Larsen, R.S.; /SLAC

    2008-04-22

    The Advanced Telecommunications Computing Architecture is a new industry open standard for electronics instrument modules and shelves being evaluated for the International Linear Collider (ILC). It is the first industrial standard designed for High Availability (HA). ILC availability simulations have shown clearly that the capabilities of ATCA are needed in order to achieve acceptable integrated luminosity. The ATCA architecture looks attractive for beam instruments and detector applications as well. This paper provides an overview of ongoing R&D including application of HA principles to power electronics systems.
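
    The high-availability argument rests on standard availability algebra: a module's steady-state availability is MTBF/(MTBF + MTTR), and N modules required in series multiply. Below is a hedged sketch with hypothetical numbers showing why fast repair (hot-swap) dominates; the ILC availability simulations cited above are far more detailed.

      # Why HA electronics matter at accelerator scale: availability of N
      # modules in series is A_module**N, with A = MTBF / (MTBF + MTTR).
      # MTBF, MTTR, and module count are hypothetical illustration values.
      def system_availability(mtbf_hr, mttr_hr, n_modules):
          a_module = mtbf_hr / (mtbf_hr + mttr_hr)
          return a_module ** n_modules

      for mttr in (1.0, 0.1):   # repair in an hour vs. hot-swap in minutes
          a = system_availability(mtbf_hr=100_000.0, mttr_hr=mttr,
                                  n_modules=2_000)
          print(f"MTTR = {mttr:4.1f} h -> system availability = {a:.3f}")
      # shrinking MTTR tenfold lifts availability from ~0.98 to ~0.998 here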

  2. National facility for advanced computational science: A sustainable path to scientific discovery

    SciTech Connect

    Simon, Horst; Kramer, William; Saphir, William; Shalf, John; Bailey, David; Oliker, Leonid; Banda, Michael; McCurdy, C. William; Hules, John; Canning, Andrew; Day, Marc; Colella, Philip; Serafini, David; Wehner, Michael; Nugent, Peter

    2004-04-02

    Lawrence Berkeley National Laboratory (Berkeley Lab) proposes to create a National Facility for Advanced Computational Science (NFACS) and to establish a new partnership between the American computer industry and a national consortium of laboratories, universities, and computing facilities. NFACS will provide leadership-class scientific computing capability to scientists and engineers nationwide, independent of their institutional affiliation or source of funding. This partnership will bring into existence a new class of computational capability in the United States that is optimal for science and will create a sustainable path towards petaflops performance.

  3. The Modern Mini-Computer in Laboratory Automation

    ERIC Educational Resources Information Center

    Castellan, N. John, Jr.

    1975-01-01

    A report of the growth and present status of the mini-computer based, time sharing laboratory at the University of Indiana, which describes the system hardware, software, and applications in psychological experimentation. (EH)

  4. High resolution computed tomography of advanced composite and ceramic materials

    NASA Technical Reports Server (NTRS)

    Yancey, R. N.; Klima, S. J.

    1991-01-01

    Advanced composite and ceramic materials are being developed for use in many new defense and commercial applications. In order to achieve the desired mechanical properties of these materials, the structural elements must be carefully analyzed and engineered. A study was conducted to evaluate the use of high resolution computed tomography (CT) as a macrostructural analysis tool for advanced composite and ceramic materials. Several samples were scanned using a laboratory high resolution CT scanner. Samples were also destructively analyzed at the locations of the scans and the nondestructive and destructive results were compared. The study provides useful information outlining the strengths and limitations of this technique and the prospects for further research in this area.

  5. Argonne's Laboratory Computing Resource Center 2009 annual report.

    SciTech Connect

    Bair, R. B.

    2011-05-13

    Now in its seventh year of operation, the Laboratory Computing Resource Center (LCRC) continues to be an integral component of science and engineering research at Argonne, supporting a diverse portfolio of projects for the U.S. Department of Energy and other sponsors. The LCRC's ongoing mission is to enable and promote computational science and engineering across the Laboratory, primarily by operating computing facilities and supporting high-performance computing application use and development. This report describes scientific activities carried out with LCRC resources in 2009 and the broad impact on programs across the Laboratory. The LCRC computing facility, Jazz, is available to the entire Laboratory community. In addition, the LCRC staff provides training in high-performance computing and guidance on application usage, code porting, and algorithm development. All Argonne personnel and collaborators are encouraged to take advantage of this computing resource and to provide input into the vision and plans for computing and computational analysis at Argonne. The LCRC Allocations Committee makes decisions on individual project allocations for Jazz. Committee members are appointed by the Associate Laboratory Directors and span a range of computational disciplines. The 350-node LCRC cluster, Jazz, began production service in April 2003 and has been a research workhorse ever since. Hosting a wealth of software tools and applications and achieving high availability year after year, Jazz has enabled researchers to meet project milestones and achieve breakthroughs. Over the years, many projects have achieved results that would have been unobtainable without such a computing resource. In fiscal year 2009, there were 49 active projects representing a wide cross-section of Laboratory research and almost all research divisions.

  6. Computational Electromagnetics (CEM) Laboratory: Simulation Planning Guide

    NASA Technical Reports Server (NTRS)

    Khayat, Michael A.

    2011-01-01

    The simulation process, milestones and inputs are unknowns to first-time users of the CEM Laboratory. The Simulation Planning Guide aids in establishing expectations for both NASA and non-NASA facility customers. The potential audience for this guide includes both internal and commercial spaceflight hardware/software developers. It is intended to assist their engineering personnel in simulation planning and execution. Material covered includes a roadmap of the simulation process, roles and responsibilities of facility and user, major milestones, facility capabilities, and inputs required by the facility. Samples of deliverables, facility interfaces, and inputs necessary to define scope, cost, and schedule are included as an appendix to the guide.

  7. Advanced Computational Thermal Studies and their Assessment for Supercritical-Pressure Reactors (SCRs)

    SciTech Connect

    D. M. McEligot; J. Y. Yoo; J. S. Lee; S. T. Ro; E. Lurien; S. O. Park; R. H. Pletcher; B. L. Smith; P. Vukoslavcevic; J. M. Wallace

    2009-04-01

    The goal of this laboratory/university collaboration of coupled computational and experimental studies is the improvement of predictive methods for supercritical-pressure reactors (SCRs). The general objective is to develop the supporting knowledge of advanced computational techniques needed for the technology development of the concepts and their safety systems.

  8. Teaching Advanced Concepts in Computer Networks: VNUML-UM Virtualization Tool

    ERIC Educational Resources Information Center

    Ruiz-Martinez, A.; Pereniguez-Garcia, F.; Marin-Lopez, R.; Ruiz-Martinez, P. M.; Skarmeta-Gomez, A. F.

    2013-01-01

    In the teaching of computer networks the main problem that arises is the high price and limited number of network devices the students can work with in the laboratories. Nowadays, with virtualization we can overcome this limitation. In this paper, we present a methodology that allows students to learn advanced computer network concepts through…

  9. A Choice of Terminals: Spatial Patterning in Computer Laboratories

    ERIC Educational Resources Information Center

    Spennemann, Dirk; Cornforth, David; Atkinson, John

    2007-01-01

    Purpose: This paper seeks to examine the spatial patterns of student use of machines in each laboratory to determine whether there are underlying commonalities. Design/methodology/approach: The research was carried out by assessing user behaviour in 16 computer laboratories at a regional university in Australia. Findings: The study found that computers…

  10. A Low Cost Microcomputer Laboratory for Investigating Computer Architecture.

    ERIC Educational Resources Information Center

    Mitchell, Eugene E., Ed.

    1980-01-01

    Described is a microcomputer laboratory at the United States Military Academy at West Point, New York, which provides easy access to non-volatile memory and a single input/output file system for 16 microcomputer laboratory positions. A microcomputer network that has a centralized data base is implemented using the concepts of computer network…

  11. Laboratory Demonstrations for PDE and Metals Combustion at NASA MSFC's Advanced Propulsion Laboratory

    NASA Technical Reports Server (NTRS)

    2000-01-01

    This report provides status reporting on activities under order no. H-30549 for the period December 1 through December 31, 1999. It details the activities of the contract in coordinating the planned conduct of experiments at the MSFC Advanced Propulsion Laboratory in pulse detonation MHD power production and metals combustion.

  12. Advanced CNC and CAM Series. Educational Resources for the Machine Tool Industry. Course Syllabi, Instructor's Handbook [and] Student Laboratory Manual.

    ERIC Educational Resources Information Center

    Texas State Technical Coll. System, Waco.

    This package consists of course syllabi, an instructor's handbook, and a student laboratory manual for a 1-year vocational training program to prepare students for entry-level positions as advanced computer numerical control (CNC) and computer-assisted manufacturing (CAM) technicians. The program was developed through a modification of the DACUM…

  13. Performance evaluation of the Oak Ridge National Laboratory's advanced servomanipulator

    SciTech Connect

    Draper, J.V.; Schrock, S.L.; Handel, S.J.

    1988-01-01

    The Consolidated Fuel Reprocessing Program (CFRP) at the Oak Ridge National Laboratory (ORNL) is developing technology for future nuclear fuel reprocessing facilities. This responsibility includes developing advanced telerobotic systems for repair and maintenance of such facilities. In response to a requirement for a highly reliable, remotely maintainable manipulator system, CFRP designed and built the advanced servomanipulator (ASM). This paper reports results of a recent comparison of ASM's performance to that of another highly dexterous manipulator, the Sargeant Industries/Central Research Laboratory's (CRL's) model M-2. Operators using ASM were able to complete tasks in about the same amount of time required to complete tasks with the CRL M-2. 13 refs., 5 figs., 2 tabs.

  14. The advanced computational testing and simulation toolkit (ACTS)

    SciTech Connect

    Drummond, L.A.; Marques, O.

    2002-05-21

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

  15. Managing Laboratory Data Using Cloud Computing as an Organizational Tool

    ERIC Educational Resources Information Center

    Bennett, Jacqueline; Pence, Harry E.

    2011-01-01

    One of the most significant difficulties encountered when directing undergraduate research and developing new laboratory experiments is how to efficiently manage the data generated by a number of students. Cloud computing, where both software and computer files reside online, offers a solution to this data-management problem and allows researchers…

  16. Computers in a Teaching Laboratory: Just Another Piece of Apparatus.

    ERIC Educational Resources Information Center

    Harrison, David; Pitre, John M.

    1988-01-01

    Describes computer use in the undergraduate physics laboratory at the University of Toronto. Topics discussed include user interfaces; local area networking; data analysis and acquisition; other computer applications, including a programmable calculator and word processing; and an example of an experiment involving gravity. (LRW)

  17. The Computer Connection: Four Approaches to Microcomputer Laboratory Interfacing.

    ERIC Educational Resources Information Center

    Graef, Jean L.

    1983-01-01

    Four ways in which microcomputers can be turned into laboratory instruments are discussed. These include adding an analog/digital (A/D) converter on a printed circuit board, adding an external A/D converter using the computer's serial port, attaching transducers to the game paddle ports, or connecting an instrument to the computer. (JN)
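
    Of the four approaches, the external A/D converter on the serial port maps most directly onto modern hardware. The following is a minimal sketch using the pyserial library, assuming a hypothetical converter that streams one ASCII reading per line; the port name, baud rate and framing are assumptions, not the article's hardware.

      # Sketch of the serial-port approach to lab interfacing: read ASCII
      # samples from a hypothetical external A/D converter, one value per
      # line. Port, baud rate, and framing are assumptions, not the
      # article's equipment. Requires the pyserial package.
      import serial

      def read_samples(port="/dev/ttyUSB0", baud=9600, n=100):
          """Collect n numeric readings from the converter."""
          samples = []
          with serial.Serial(port, baud, timeout=1.0) as adc:
              while len(samples) < n:
                  line = adc.readline().decode("ascii", errors="ignore").strip()
                  if line:
                      try:
                          samples.append(float(line))
                      except ValueError:
                          pass  # skip garbled frames
          return samples

      data = read_samples()
      print(f"mean of {len(data)} readings: {sum(data) / len(data):.3f}")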

  18. Advanced Manufacturing Processes Laboratory Building 878 hazards assessment document

    SciTech Connect

    Wood, C.; Thornton, W.; Swihart, A.; Gilman, T.

    1994-07-01

    The purpose of the hazards assessment process is to document the impact of the release of hazards at the Advanced Manufacturing Processes Laboratory (AMPL) that are significant enough to warrant consideration in Sandia National Laboratories' operational emergency management program. This hazards assessment is prepared in accordance with the Department of Energy Order 5500.3A requirement that facility-specific hazards assessments be prepared, maintained, and used for emergency planning purposes. This hazards assessment provides an analysis of the potential airborne release of chemicals associated with the operations and processes at the AMPL. This research and development laboratory develops advanced manufacturing technologies, practices, and unique equipment and provides the fabrication of prototype hardware to meet the needs of Sandia National Laboratories, Albuquerque, New Mexico (SNL/NM). The focus of the hazards assessment is the airborne release of materials because this requires the most rapid, coordinated emergency response on the part of the AMPL, SNL/NM, collocated facilities, and surrounding jurisdictions to protect workers, the public, and the environment.

  19. Interfacing laboratory instruments to multiuser, virtual memory computers

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R.; Roth, Don J.; Stang, David B.

    1990-01-01

    Incentives, problems and solutions associated with interfacing laboratory equipment with multiuser, virtual memory computers are presented. The major difficulty concerns how to utilize these computers effectively in a medium sized research group. This entails optimization of hardware interconnections and software to facilitate multiple instrument control, data acquisition and processing. The architecture of the system that was devised, and associated programming and subroutines are described. An example program involving computer controlled hardware for ultrasonic scan imaging is provided to illustrate the operational features.
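
    The instrument-control pattern described here survives in modern laboratory software; the sketch below shows it with the PyVISA library, where the GPIB address and SCPI commands are hypothetical examples rather than the report's actual interface.

        import pyvisa

        rm = pyvisa.ResourceManager()
        scope = rm.open_resource("GPIB0::12::INSTR")    # assumed instrument address
        print(scope.query("*IDN?"))                     # ask the instrument to identify itself
        scope.write(":RUN")                             # start acquisition (example SCPI command)
        trace = scope.query_ascii_values(":WAV:DATA?")  # fetch one waveform (example SCPI query)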

  20. Interfacing laboratory instruments to multiuser, virtual memory computers

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R.; Stang, David B.; Roth, Don J.

    1989-01-01

    Incentives, problems and solutions associated with interfacing laboratory equipment with multiuser, virtual memory computers are presented. The major difficulty concerns how to utilize these computers effectively in a medium sized research group. This entails optimization of hardware interconnections and software to facilitate multiple instrument control, data acquisition and processing. The architecture of the system that was devised, and associated programming and subroutines are described. An example program involving computer controlled hardware for ultrasonic scan imaging is provided to illustrate the operational features.

  1. Defense Science Board Report on Advanced Computing

    DTIC Science & Technology

    2009-03-01

    complex computational issues are pursued, and that several vendors remain at the leading edge of supercomputing capability in the U.S. In... pursuing the ASC program to help assure that HPC advances are available to the broad national security community. As in the past, many...apply HPC to technical problems related to weapons physics, but that are entirely unclassified. Examples include explosive astrophysical

  2. Advanced high-performance computer system architectures

    NASA Astrophysics Data System (ADS)

    Vinogradov, V. I.

    2007-02-01

    Convergence of computer systems and communication technologies is moving toward switched high-performance modular system architectures built on high-speed switched interconnections. Multi-core processors have become a more promising route to high-performance systems, and traditional parallel bus system architectures (VME/VXI, cPCI/PXI) are giving way to newer, higher-speed serial switched interconnections. The fundamentals of this system architecture development are a compact modular component strategy, low-power processors, new serial high-speed interface chips on the board, and high-speed switched fabrics for SAN architectures. An overview of advanced modular concepts and new international standards for developing high-performance embedded and compact modular systems for real-time applications is presented.

  3. Making Advanced Computer Science Topics More Accessible through Interactive Technologies

    ERIC Educational Resources Information Center

    Shao, Kun; Maher, Peter

    2012-01-01

    Purpose: Teaching advanced technical concepts in a computer science program to students of different technical backgrounds presents many challenges. The purpose of this paper is to present a detailed experimental pedagogy in teaching advanced computer science topics, such as computer networking, telecommunications and data structures using…

  4. 75 FR 57742 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-22

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION... Scientific Computing Advisory Committee (ASCAC). Federal Advisory Committee Act (Pub. L. 92-463, 86 Stat. 770...: Melea Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building;...

  5. 76 FR 45786 - Advanced Scientific Computing Advisory Committee; Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-01

    ... Advanced Scientific Computing Advisory Committee; Meeting AGENCY: Office of Science, Department of Energy... Computing Advisory Committee (ASCAC). Federal Advisory Committee Act (Pub. L. 92-463, 86 Stat. 770) requires... INFORMATION CONTACT: Melea Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown...

  6. Use of computers in quality assurance of laboratory testing.

    PubMed

    Tan, I K; Jacob, E; Lim, S H

    1990-09-01

    Implementation of comprehensive internal quality control programmes and participation in external quality assessment schemes to monitor the analytical performance of laboratory tests have been widely accepted as an essential and integral part of good laboratory practice. As these programmes involve a great deal of repetitive statistical calculation and graphic presentation of data on quality control materials, many laboratories and practically all organisers of inter-laboratory quality assessment schemes increasingly rely on computers to handle the burdensome processing of data and to provide timely feedback in a manner that is easily understood and readily interpreted by analytical staff. However, in spite of the best efforts to ensure reliable analytical performance, spurious and misleading results can still occur as a result of non-analytical errors, which are not readily detected by methods designed to monitor the quality of the analytical process. The use of a sophisticated computer system has enabled our laboratory to check for the existence of some of these errors. This paper describes the application of computers in a variety of internal and external quality assessment programmes and demonstrates the usefulness of retrieving patients' cumulative test results and at the same time performing delta or percentage difference checks on such data in the detection of non-analytical errors and unexpected variations in results. The role of the computer in minimising transcription errors, reducing the turn-around time of testing and reporting, and improving the quality of laboratory reports is also mentioned.
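
    A delta or percentage-difference check of the kind described is straightforward to express in code. The sketch below is a minimal illustration; the analyte, limits, and values are invented for the example and are not from the paper.

        def delta_check(previous, current, abs_limit=None, pct_limit=None):
            """Flag a result whose change from the prior value exceeds a limit."""
            delta = current - previous
            flags = []
            if abs_limit is not None and abs(delta) > abs_limit:
                flags.append(f"absolute delta {delta:+.2f} exceeds {abs_limit}")
            if pct_limit is not None and previous != 0:
                pct = 100.0 * delta / previous
                if abs(pct) > pct_limit:
                    flags.append(f"percentage delta {pct:+.1f}% exceeds {pct_limit}%")
            return flags

        # Example: serum potassium in mmol/L; a jump from 4.1 to 6.3 between
        # two collections is flagged for review as a possible non-analytical error.
        print(delta_check(previous=4.1, current=6.3, abs_limit=1.0, pct_limit=20.0))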

  7. Two Crystallographic Laboratory and Computational Exercises for Undergraduates.

    ERIC Educational Resources Information Center

    Lessinger, Leslie

    1988-01-01

    Describes two introductory exercises designed to teach the fundamental ideas and methods of crystallography, and to convey some important features of inorganic and organic crystal structures to students in an advanced laboratory course. Exercises include "The Crystal Structure of NiO" and "The Crystal Structure of Beta-Fumaric Acid." (CW)

  8. Application of advanced electronics to a future spacecraft computer design

    NASA Technical Reports Server (NTRS)

    Carney, P. C.

    1980-01-01

    Advancements in hardware and software technology are summarized with specific emphasis on spacecraft computer capabilities. Available state of the art technology is reviewed and candidate architectures are defined.

  9. Renewable Energy Laboratory Development for Biofuels Advanced Combustion Studies

    SciTech Connect

    Soloiu, Valentin A.

    2012-03-31

    The research advanced fundamental science and applied engineering for increasing the efficiency of internal combustion engines and meeting emissions regulations with biofuels. The project developed a laboratory with new experiments and allowed investigation of new fuels and their combustion and emissions. This project supports a sustainable domestic biofuels and automotive industry, creating economic opportunities across the nation, reducing the dependence on foreign oil, and enhancing U.S. energy security. The one-year period of research developed fundamental knowledge and applied technology in advanced combustion, emissions, and biofuels formulation to increase vehicle efficiency. Biofuels combustion was investigated in a Compression Ignition Direct Injection (DI) engine to develop idling strategies with biofuels, and in an Indirect Diesel Injection (IDI) engine intended for auxiliary power units.

  10. ALPhA: The Advanced Laboratory Physics Association

    NASA Astrophysics Data System (ADS)

    Black, Eric; McCann, Lowell; Reichert, Jonathan; Spalding, Gabe; Essick, John; van Baak, David; Wonnell, Steve

    2011-03-01

    The Advanced Laboratory Physics Association (ALPhA) is a group of people with a shared interest in teaching physics labs at the advanced undergraduate or graduate level. ALPhA works closely with the American Physical Society (APS), the Optical Society of America (OSA), and the American Association of Physics Teachers (AAPT) to develop new methods for teaching modern experimental physics. In the summer of 2010 we initiated the ALPhA Immersion Program, a three-day short course where instructors visit a lab, do one or more of the local experiments (home-built or commercial) with the local instructor, and learn the experiments well enough to incorporate them into their own programs. These immersions were very well received, with attendees filling up all available slots. In this talk I will describe ALPhA and the Immersions Program and solicit input from the broader community.

  11. Description of a digital computer simulation of an Annular Momentum Control Device (AMCD) laboratory test model

    NASA Technical Reports Server (NTRS)

    Woolley, C. T.; Groom, N. J.

    1981-01-01

    A description of a digital computer simulation of an Annular Momentum Control Device (AMCD) laboratory model is presented. The AMCD is a momentum exchange device which is under development as an advanced control effector for spacecraft attitude control systems. The digital computer simulation of this device incorporates the following models: six degree of freedom rigid body dynamics; rim warp; controller dynamics; nonlinear distributed element axial bearings; as well as power driver and power supply current limits. An annotated FORTRAN IV source code listing of the computer program is included.

  12. 77 FR 12823 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-02

    ... final report; Advanced Networking update; Status from Computer Science COV; Early Career technical talks; Summary of Applied Math and Computer Science Workshops; ASCR's new SBIR awards; Data-intensive...

  13. Developing an Advanced Environment for Collaborative Computing

    NASA Technical Reports Server (NTRS)

    Becerra-Fernandez, Irma; Stewart, Helen; DelAlto, Martha; DelAlto, Martha; Knight, Chris

    1999-01-01

    Knowledge management, in general, tries to organize and make available important know-how whenever and wherever it is needed. Today, organizations rely on decision-makers to produce "mission critical" decisions that are based on inputs from multiple domains. The ideal decision-maker has a profound understanding of the specific domains that influence the decision-making process, coupled with the experience that allows them to act quickly and decisively on the information. In addition, learning companies benefit by not repeating costly mistakes, and by reducing time-to-market in Research & Development projects. Group decision-making tools can help companies make better decisions by capturing the knowledge from groups of experts. Furthermore, companies that capture their customers' preferences can improve their customer service, which translates to larger profits. Therefore, collaborative computing provides a common communication space, improves sharing of knowledge, provides a mechanism for real-time feedback on the tasks being performed, helps to optimize processes, and results in a centralized knowledge warehouse. This paper presents the research directions of a project that seeks to augment an advanced collaborative web-based environment called Postdoc with workflow capabilities. Postdoc is a "government-off-the-shelf" document management software package developed at NASA-Ames Research Center (ARC).

  14. Computational and design methods for advanced imaging

    NASA Astrophysics Data System (ADS)

    Birch, Gabriel C.

    This dissertation merges the optical design and computational aspects of imaging systems to create novel devices that solve engineering problems in optical science, and attempts to expand the solution space available to the optical designer. This dissertation is divided into two parts: the first discusses a new active illumination depth sensing modality, while the second part discusses a passive illumination system called plenoptic, or lightfield, imaging. The new depth sensing modality introduced in part one is called depth through controlled aberration. This technique illuminates a target with a known, aberrated projected pattern and takes an image using a traditional, unmodified imaging system. Knowing how the added aberration in the projected pattern changes as a function of depth, we are able to quantitatively determine the depth of a series of points from the camera. A major advantage this method permits is the ability for the illumination and imaging axes to be coincident. Plenoptic cameras capture both spatial and angular data simultaneously. This dissertation presents a new set of parameters that permit the design and comparison of plenoptic devices outside the traditionally published plenoptic 1.0 and plenoptic 2.0 configurations. Additionally, a series of engineering advancements are presented, including full system raytraces of raw plenoptic images, Zernike compression techniques for raw image files, and non-uniform lenslet arrays to compensate for plenoptic system aberrations. Finally, a new snapshot imaging spectrometer is proposed based on the plenoptic configuration.

  15. Conceptual change in an organic chemistry laboratory: A comparison of computer simulations and traditional laboratory experiments

    NASA Astrophysics Data System (ADS)

    Gaddis, Barbara A.

    2001-12-01

    This quasi-experimental research study examined the effect of computer simulations and hands-on laboratory experiments in enhancing conceptual understanding and alleviating misconceptions of organic chemistry reaction mechanisms. Subjects were sixty-nine sophomore-level organic chemistry students enrolled in four laboratory sections. Laboratory sections were stratified across instructor and randomly assigned to serve as a control or treatment laboratory. Students in the control group performed all hands-on experiments. Students in the treatment group performed hands-on experiments for the first and last part of the semester but performed computer simulations for a five-week period in the middle of the semester. Prior to treatment, groups were equivalent with respect to academic orientation, motivation, formal reasoning ability, and spatial visualization ability. Fifteen common misconceptions held by beginning organic chemistry students were identified from the Covalent Bonding and Structures Test. At the end of the semester, thirteen of these misconceptions persisted. Molecular geometry was the only category of misconceptions that significantly improved as a result of computer simulations, F(1,58) = 6.309, p = .015. No significant differential change was observed in misconceptions about bond polarity, molecular polarity, intermolecular forces, lattice structures, or the octet rule. Computer simulations were found to result in significantly greater conceptual understanding of organic chemistry reactions on two of the experiments, Stereochemistry, F(1,55) = 6.174, p = .016, and Nucleophilic Substitution, F(1,57) = 6.093, p = .017. The other three experiments, Infrared Spectroscopy, Elimination, and Oxymercuration, did not show a significant differential effect between types of laboratory experiences. No significant differences were observed on long-term retention of concepts. Overall conclusions from the study are that neither computer simulations nor hands-on…
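
    For readers unfamiliar with the F statistics quoted above, the sketch below shows the corresponding one-way ANOVA computation in Python; the two groups of scores are synthetic placeholders, not data from the study.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        control = rng.normal(70, 10, 30)    # hypothetical hands-on group scores
        treatment = rng.normal(76, 10, 30)  # hypothetical simulation group scores

        # One-way ANOVA: F(1, n1 + n2 - 2) tests whether the group means differ.
        f_stat, p_value = stats.f_oneway(control, treatment)
        print(f"F(1,{len(control) + len(treatment) - 2}) = {f_stat:.3f}, p = {p_value:.3f}")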

  16. Computer protection plan for the Superconducting Super Collider Laboratory

    SciTech Connect

    Hunter, S.

    1992-04-15

    The purpose of this document is to describe the current unclassified computer security program practices, policies, and procedures for the Superconducting Super Collider Laboratory (SSCL). This document includes or references all related policies and procedures currently implemented throughout the SSCL. The document also includes security practices that are planned for when the facility is fully operational.

  17. Computer Simulation and Laboratory Work in the Teaching of Mechanics.

    ERIC Educational Resources Information Center

    Borghi, L.; And Others

    1987-01-01

    Describes a teaching strategy designed to help high school students learn mechanics by involving them in simple experimental work, observing didactic films, running computer simulations, and executing more complex laboratory experiments. Provides an example of the strategy as it is applied to the topic of projectile motion. (TW)

  18. Computer Simulation of Laboratory Experiments: An Unrealized Potential.

    ERIC Educational Resources Information Center

    Magin, D. J.; Reizes, J. A.

    1990-01-01

    Discussion of the use of computer simulation for laboratory experiments in undergraduate engineering education focuses on work at the University of New South Wales in the instructional design and software development of a package simulating a heat exchange device. The importance of integrating theory, design, and experimentation is also discussed.…

  19. An Easily Assembled Laboratory Exercise in Computed Tomography

    ERIC Educational Resources Information Center

    Mylott, Elliot; Klepetka, Ryan; Dunlap, Justin C.; Widenhorn, Ralf

    2011-01-01

    In this paper, we present a laboratory activity in computed tomography (CT) primarily composed of a photogate and a rotary motion sensor that can be assembled quickly and partially automates data collection and analysis. We use an enclosure made with a light filter that is largely opaque in the visible spectrum but mostly transparent to the near…

  20. Advanced flight computers for planetary exploration

    NASA Technical Reports Server (NTRS)

    Stephenson, R. Rhoads

    1988-01-01

    Research concerning flight computers for use on interplanetary probes is reviewed. The history of these computers from the Viking mission to the present is outlined. The differences between ground commercial computers and computers for planetary exploration are listed. The development of a computer for the Mariner Mark II comet rendezvous asteroid flyby mission is described. Various aspects of recently developed computer systems are examined, including the Max real-time embedded computer, a hypercube distributed supercomputer, a SAR data processor, a processor for the High Resolution IR Imaging Spectrometer, and a robotic vision multiresolution pyramid machine for processing images obtained by a Mars Rover.

  1. Lab4CE: A Remote Laboratory for Computer Education

    ERIC Educational Resources Information Center

    Broisin, Julien; Venant, Rémi; Vidal, Philippe

    2017-01-01

    Remote practical activities have been demonstrated to be efficient when learners come to acquire inquiry skills. In computer science education, virtualization technologies are gaining popularity as this technological advance enables instructors to implement realistic practical learning activities, and learners to engage in authentic and…

  2. Brookhaven National Laboratory's capabilities for advanced analyses of cyber threats

    SciTech Connect

    DePhillips, M. P.

    2014-01-01

    BNL has several ongoing, mature, and successful programs and areas of core scientific expertise that readily could be modified to address problems facing national security and efforts by the IC related to securing our nation’s computer networks. In supporting these programs, BNL houses an expansive, scalable infrastructure built exclusively for transporting, storing, and analyzing large disparate data-sets. Our ongoing research projects on various infrastructural issues in computer science undoubtedly would be relevant to national security. Furthermore, BNL frequently partners with researchers in academia and industry worldwide to foster unique and innovative ideas for expanding research opportunities and extending our insights. Because the basic science conducted at BNL is unique, such projects have led to advanced techniques, unlike any others, to support our mission of discovery. Many of them are modular techniques, thus making them ideal for abstraction and retrofitting to other uses including those facing national security, specifically the safety of the nation’s cyber space.

  3. An easily assembled laboratory exercise in computed tomography

    NASA Astrophysics Data System (ADS)

    Mylott, Elliot; Klepetka, Ryan; Dunlap, Justin C.; Widenhorn, Ralf

    2011-09-01

    In this paper, we present a laboratory activity in computed tomography (CT) primarily composed of a photogate and a rotary motion sensor that can be assembled quickly and partially automates data collection and analysis. We use an enclosure made with a light filter that is largely opaque in the visible spectrum but mostly transparent to the near IR light of the photogate (880 nm) to scan objects hidden from the human eye. This experiment effectively conveys how an image is formed during a CT scan and highlights the important physical and imaging concepts behind CT such as electromagnetic radiation, the interaction of light and matter, artefacts and windowing. Like our setup, previous undergraduate level laboratory activities which teach the basics of CT have also utilized light sources rather than x-rays; however, they required a more extensive setup and used devices not always easily found in undergraduate laboratories. Our setup is easily implemented with equipment found in many teaching laboratories.
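
    The reconstruction idea behind the activity can be demonstrated in a few lines of Python. The sketch below implements a crude Radon transform and unfiltered back-projection with NumPy and SciPy; a real CT reconstruction would also apply a ramp filter, and the phantom here is an arbitrary test object, not the apparatus described above.

        import numpy as np
        from scipy import ndimage

        def radon(image, angles_deg):
            """Forward-project: rotate the image and sum columns at each angle."""
            return np.array([ndimage.rotate(image, a, reshape=False, order=1).sum(axis=0)
                             for a in angles_deg])

        def backproject(sinogram, angles_deg):
            """Unfiltered back-projection: smear each profile and rotate it into place."""
            size = sinogram.shape[1]
            recon = np.zeros((size, size))
            for profile, a in zip(sinogram, angles_deg):
                smear = np.tile(profile, (size, 1))      # constant along the beam direction
                recon += ndimage.rotate(smear, -a, reshape=False, order=1)
            return recon / len(angles_deg)

        phantom = np.zeros((64, 64)); phantom[24:40, 28:36] = 1.0   # hidden test object
        angles = np.arange(0, 180, 5)
        image = backproject(radon(phantom, angles), angles)          # blurred reconstruction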

  4. COMPUTATIONAL TOXICOLOGY ADVANCES: EMERGING CAPABILITIES FOR DATA EXPLORATION AND SAR MODEL DEVELOPMENT

    EPA Science Inventory

    Computational Toxicology Advances: Emerging capabilities for data exploration and SAR model development
    Ann M. Richard and ClarLynda R. Williams, National Health & Environmental Effects Research Laboratory, US EPA, Research Triangle Park, NC, USA; email: richard.ann@epa.gov

  5. Some Recent Advances in Computer Graphics.

    ERIC Educational Resources Information Center

    Whitted, Turner

    1982-01-01

    General principles of computer graphics are reviewed, including discussions of display hardware, geometric modeling, algorithms, and applications in science, computer-aided design, flight training, communications, business, art, and entertainment. (JN)

  6. Computing Advances in the Teaching of Chemistry.

    ERIC Educational Resources Information Center

    Baskett, W. P.; Matthews, G. P.

    1984-01-01

    Discusses three trends in computer-oriented chemistry instruction: (1) availability of interfaces to integrate computers with experiments; (2) impact of the development of higher resolution graphics and greater memory capacity; and (3) role of videodisc technology on computer assisted instruction. Includes program listings for auto-titration and…

  7. The advanced light source at the Lawrence Berkeley laboratory

    NASA Astrophysics Data System (ADS)

    Jackson, Alan

    1991-05-01

    The Advanced Light Source (ALS), a national facility currently under construction at the Lawrence Berkeley Laboratory (LBL), is a third-generation synchrotron light source designed to produce extremely bright beams of synchrotron radiation, in the energy range from a few eV to 10 keV. The design is based on a 1-1.9 GeV electron storage ring (optimized at 1.5 GeV), and utilizes special magnets, known as undulators and wigglers (collectively referred to as insertion devices), to generate the radiation. In this paper we describe the main accelerator components of the ALS, the variety of insertion devices, the radiation spectra expected from these devices, and the complement of experiments that have been approved for initial operation, starting in April 1993.
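
    A worked number helps fix the scale of the quoted spectral range. The critical photon energy of bend-magnet radiation follows the standard formula below; the 1.3 T bend field used in the evaluation is an assumed, representative value, not a figure from this record.

        % Critical energy of bend-magnet synchrotron radiation (beam energy E, field B)
        \varepsilon_c\,[\mathrm{keV}] \approx 0.665\; E^2\,[\mathrm{GeV}^2]\; B\,[\mathrm{T}]
        \quad\Rightarrow\quad
        \varepsilon_c \approx 0.665 \times (1.5)^2 \times 1.3 \approx 1.9\ \mathrm{keV}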

  8. The Advanced Light Source at Lawrence Berkeley Laboratory

    NASA Astrophysics Data System (ADS)

    Robinson, A. L.; Perera, R. C. C.; Schlachter, A. S.

    1992-01-01

    The Advanced Light Source (ALS) at the Lawrence Berkeley Laboratory (LBL), scheduled to be operational in the spring of 1993 as a U.S. Department of Energy national user facility, will be a next-generation source of soft x-ray and ultraviolet (XUV) synchrotron radiation. Undulators will provide the world's brightest synchrotron radiation at photon energies from below 10 eV to above 2 keV; wiggler and bend-magnet radiation will extend the spectral coverage with high fluxes above 10 keV. These capabilities will support an extensive research program in a broad spectrum of scientific and technological areas in which XUV radiation is used to study and manipulate matter in all its varied gaseous, liquid, and solid forms. The ALS will also serve those interested in developing the fabrication technology for microstructures and nanostructures, as well as for characterizing them.

  9. ADVANCED COMPUTATIONAL METHODS IN DOSE MODELING: APPLICATION OF COMPUTATIONAL BIOPHYSICAL TRANSPORT, COMPUTATIONAL CHEMISTRY, AND COMPUTATIONAL BIOLOGY

    EPA Science Inventory

    Computational toxicology (CompTox) leverages the significant gains in computing power and computational techniques (e.g., numerical approaches, structure-activity relationships, bioinformatics) realized over the last few years, thereby reducing costs and increasing efficiency i...

  10. Virtual Computing Laboratories: A Case Study with Comparisons to Physical Computing Laboratories

    ERIC Educational Resources Information Center

    Burd, Stephen D.; Seazzu, Alessandro F.; Conway, Christopher

    2009-01-01

    Current technology enables schools to provide remote or virtual computing labs that can be implemented in multiple ways ranging from remote access to banks of dedicated workstations to sophisticated access to large-scale servers hosting virtualized workstations. This paper reports on the implementation of a specific lab using remote access to…

  11. Sandia Laboratories hybrid computer and motion simulator facilities

    SciTech Connect

    Curry, W. H.; French, R. E.

    1980-05-01

    Hybrid computer and motion simulator facilities at Sandia National Laboratories include an AD/FIVE-AD10-PDP11/60, an AD/FIVE-PDP11/45, an EAI7800-EAI640, an EAI580/TR48-Nova 800, and two Carco S-45OR-3/R-493A three-axis motion simulators. An EAI680 is used in the analog mode only. This report describes the current equipment.

  12. Application of advanced computational technology to propulsion CFD

    NASA Astrophysics Data System (ADS)

    Szuch, John R.

    The Internal Fluid Mechanics Division of the NASA Lewis Research Center is combining the key elements of computational fluid dynamics, aerothermodynamic experiments, and advanced computational technology to bring internal computational fluid dynamics (ICFM) to a state of practical application for aerospace propulsion system design. This paper presents an overview of efforts underway at NASA Lewis to advance and apply computational technology to ICFM. These efforts include the use of modern, software engineering principles for code development, the development of an AI-based user-interface for large codes, the establishment of a high-performance, data communications network to link ICFM researchers and facilities, and the application of parallel processing to speed up computationally intensive and/or time-critical ICFM problems. A multistage compressor flow physics program is cited as an example of efforts to use advanced computational technology to enhance a current NASA Lewis ICFM research program.

  13. Computation of Viscous Flow about Advanced Projectiles.

    DTIC Science & Technology

    1983-09-09

    Domain". Journal of Comp. Physics, Vol. 8, 1971, pp. 392-408. 10. Thompson , J . F ., Thames, F. C., and Mastin, C. M., "Automatic Numerical Generation of...computations, USSR Comput. Math. Math. Phys., 12, 2 (1972), 182-195. I~~ll A - - 18. Thompson , J . F ., F. C. Thames, and C. M. Mastin, Automatic

  14. Final Report - Advanced Ion Trap Mass Spectrometry Program - Oak Ridge National Laboratory - Sandia National Laboratory

    SciTech Connect

    Whitten, W.B.

    2002-12-18

    This report covers the three main projects that collectively comprised the Advanced Ion Trap Mass Spectrometry Program. Chapter 1 describes the direct interrogation of individual particles by laser desorption within the ion trap mass spectrometer analyzer. The goals were (1) to develop an ''intelligent trigger'' capable of distinguishing particles of biological origin from those of nonbiological origin in the background and interferent particles and (2) to explore the capability for individual particle identification. Direct interrogation of particles by laser ablation and ion trap mass spectrometry was shown to have good promise for discriminating between particles of biological origin and those of nonbiological origin, although detailed protocols and operating conditions were not worked out. A library of more than 20,000 spectra of various types of biological particles has been assembled. Methods based on multivariate analysis and on neural networks were used to discriminate between particles of biological origin and those of nonbiological origin. It was possible to discriminate between at least some species of bacteria if mass spectra of several hundred similar particles were obtained. Chapter 2 addresses the development of a new ion trap mass analyzer geometry that offers the potential for a significant increase in ion storage capacity for a given set of analyzer operating conditions. This geometry may lead to the development of smaller, lower-power field-portable ion trap mass spectrometers while retaining laboratory-scale analytical performance. A novel ion trap mass spectrometer based on toroidal ion storage geometry has been developed. The analyzer geometry is based on the edge rotation of a quadrupolar ion trap cross section into the shape of a torus. Initial performance of this device was poor, however, due to the significant contribution of nonlinear fields introduced by the rotation of the symmetric ion-trapping geometry. These nonlinear resonances

  15. Jonathan F. Reichert and Barbara Wolff-Reichert Award for Excellence in Advanced Laboratory Instruction: Advanced Instructional Labs: Why Bother?

    NASA Astrophysics Data System (ADS)

    Bistrow, Van

    What aren't we teaching about physics in the traditional lecture course? Plenty! By offering the Advanced Laboratory Course, we hope to shed light on the following questions: How do we develop a systematic process of doing experiments? How do we record procedures and results? How should we interpret theoretical concepts in the real world? What experimental and computational techniques are available for producing and analyzing data? With what degree of confidence can we trust our measurements and interpretations? How well does a theory represent physical reality? How do we collaborate with experimental partners? How do we best communicate our findings to others? These questions are of fundamental importance to experimental physics, yet are not generally addressed by reading textbooks, attending lectures or doing homework problems. Thus, to provide a more complete understanding of physics, we offer laboratory exercises as a supplement to the other modes of learning. The speaker will describe some examples of experiments, and outline the history, structure and student impressions of the Advanced Lab course at the University of Chicago Department of Physics.

  16. Advanced robotic technologies for transfer at Sandia National Laboratories

    SciTech Connect

    Bennett, P.C.

    1994-10-01

    Hazardous operations which have in the past been completed by technicians are under increased scrutiny due to the high costs and low productivity associated with providing protective clothing and environments. As a result, remote systems are needed to accomplish many hazardous materials handling tasks such as the clean-up of waste sites in which the exposure of personnel to radiation, chemical, explosive and other hazardous constituents is unacceptable. Computer models augmented by sensing, and structured, modular computing environments are proving effective in automating many unstructured hazardous tasks. Work at Sandia National Laboratories (SNL) has focused on applying flexible automation (robotics) to meet the needs of the U.S. Department of Energy (USDOE). Dismantling facilities, environmental remediation, and materials handling in changing, hazardous environments lead to many technical challenges. Computer planning, monitoring and operator assistance shorten training cycles, reduce errors, and speed execution of operations. Robotic systems that re-use well-understood generic technologies can be much better characterized than robotic systems developed for a particular application, leading to more reliable and safer systems. Further safety in robotic operations results from use of environmental sensors and knowledge of the task and environment. Collision detection and avoidance is achieved from such sensor integration and model-based control. This paper discusses selected technologies developed at SNL for use within the USDOE complex that have been or are ready for transfer to government and industrial suppliers. These technologies include sensors, sub-systems, and the design philosophy applied to quickly integrate them into a working robotic system. This paper represents the work of many people at the Intelligent Systems and Robotics Center at SNL, to whom the credit belongs.

  17. Computing Algorithms for Nuffield Advanced Physics.

    ERIC Educational Resources Information Center

    Summers, M. K.

    1978-01-01

    Defines all recurrence relations used in the Nuffield course, to solve first- and second-order differential equations, and describes a typical algorithm for computer generation of solutions. (Author/GA)
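
    The flavor of these recurrence relations is easy to show. Below is a minimal sketch, in Python rather than the course's original notation, stepping the second-order equation x'' = -(k/m)x with the velocity-first (semi-implicit Euler) update; the constants are illustrative.

        # Simple harmonic motion by recurrence: v[n+1] = v[n] + a[n]*dt,
        # then x[n+1] = x[n] + v[n+1]*dt (semi-implicit Euler).
        k, m = 4.0, 1.0            # illustrative spring constant and mass
        x, v, dt = 1.0, 0.0, 0.01  # initial displacement, velocity, time step
        for step in range(500):
            a = -(k / m) * x       # acceleration from the equation of motion
            v += a * dt            # velocity recurrence
            x += v * dt            # position recurrence
        print(x, v)                # state after 5 s of simulated motion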

  18. Aerodynamic optimization studies on advanced architecture computers

    NASA Technical Reports Server (NTRS)

    Chawla, Kalpana

    1995-01-01

    The approach to carrying out multi-discipline aerospace design studies in the future, especially in massively parallel computing environments, comprises of choosing (1) suitable solvers to compute solutions to equations characterizing a discipline, and (2) efficient optimization methods. In addition, for aerodynamic optimization problems, (3) smart methodologies must be selected to modify the surface shape. In this research effort, a 'direct' optimization method is implemented on the Cray C-90 to improve aerodynamic design. It is coupled with an existing implicit Navier-Stokes solver, OVERFLOW, to compute flow solutions. The optimization method is chosen such that it can accomodate multi-discipline optimization in future computations. In the work , however, only single discipline aerodynamic optimization will be included.
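
    The coupling pattern described, an optimizer driving repeated runs of a flow solver, can be sketched generically. In the fragment below, run_solver() is a stand-in for an expensive CFD evaluation such as an OVERFLOW run, not that code's real interface; the quadratic surrogate objective is purely illustrative.

        import numpy as np
        from scipy.optimize import minimize

        def run_solver(shape_params):
            """Placeholder objective: a real study would mesh the new surface
            shape, run the Navier-Stokes solver, and return a drag value."""
            target = np.array([0.3, -0.1, 0.05])   # optimum unknown to the optimizer
            return float(np.sum((shape_params - target) ** 2))

        # Gradient-free "direct" search over three surface-shape parameters.
        result = minimize(run_solver, x0=np.zeros(3), method="Nelder-Mead")
        print(result.x)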

  19. Advanced Computational Techniques for Power Tube Design.

    DTIC Science & Technology

    1986-07-01

    fixturing applications, in addition to the existing computer-aided engineering capabilities. Helix TWT Manufacturing has implemented a tooling and fixturing...illustrates the major features of this computer network. The backbone of our system is a Sytek Broadband Network (LAN) which interconnects terminals and...automatic network analyzer (FANA) which electrically characterizes the slow-wave helices of traveling-wave tubes (TWTs) -- both for engineering design

  20. Advanced Crew Personal Support Computer (CPSC) task

    NASA Technical Reports Server (NTRS)

    Muratore, Debra

    1991-01-01

    The topics are presented in view graph form and include: background; objectives of task; benefits to the Space Station Freedom (SSF) Program; technical approach; baseline integration; and growth and evolution options. The objective is to: (1) introduce new computer technology into the SSF Program; (2) augment core computer capabilities to meet additional mission requirements; (3) minimize risk in upgrading technology; and (4) provide a low cost way to enhance crew and ground operations support.

  1. Frontiers of research in advanced computations

    SciTech Connect

    1996-07-01

    The principal mission of the Institute for Scientific Computing Research is to foster interactions among LLNL researchers, universities, and industry on selected topics in scientific computing. In the area of computational physics, the Institute has developed a new algorithm, GaPH, to help scientists understand the chemistry of turbulent and driven plasmas or gases at far less cost than other methods. New low-frequency electromagnetic models better describe the plasma etching and deposition characteristics of a computer chip in the making. A new method for modeling realistic curved boundaries within an orthogonal mesh is resulting in a better understanding of the physics associated with such boundaries and much quicker solutions. All these capabilities are being developed for massively parallel implementation, which is an ongoing focus of Institute researchers. Other groups within the Institute are developing novel computational methods to address a range of other problems. Examples include feature detection and motion recognition by computer, improved monitoring of blood oxygen levels, and entirely new models of human joint mechanics and prosthetic devices.

  2. Use of the Berkeley Physics Laboratory to Teach an Advanced Physics Course

    ERIC Educational Resources Information Center

    Logan, James David

    1973-01-01

    Discusses a course, centered around 32 experiments taught for advanced students, designed to develop a laboratory strongly suggestive of contemporary research using relatively sophisticated apparatus. Its unique advantage lies in enriching advanced physics curriculum. (DF)

  3. Report on Computing and Networking in the Space Science Laboratory by the SSL Computer Committee

    NASA Technical Reports Server (NTRS)

    Gallagher, D. L. (Editor)

    1993-01-01

    The Space Science Laboratory (SSL) at Marshall Space Flight Center is a multiprogram facility. Scientific research is conducted in four discipline areas: earth science and applications, solar-terrestrial physics, astrophysics, and microgravity science and applications. Representatives from each of these discipline areas participate in a Laboratory computer requirements committee, which developed this document. The purpose is to establish and discuss Laboratory objectives for computing and networking in support of science. The purpose is also to lay the foundation for a collective, multiprogram approach to providing these services. Special recognition is given to the importance of the national and international efforts of our research communities toward the development of interoperable, network-based computer applications.

  4. Advanced Computational Techniques in Regional Wave Studies

    DTIC Science & Technology

    1990-01-03

    the new GERESS data. The dissertation work emphasized the development and use of advanced computational techniques for studying regional seismic...hand, the possibility of new data sources at regional distances permits using previously ignored signals. Unfortunately, these regional signals will...the Green's function G_nk(x, t; r, t) around this new reference point, containing the propagation effects, and V is the source volume where f_jk

  5. Advances in Computer-Supported Learning

    ERIC Educational Resources Information Center

    Neto, Francisco; Brasileiro, Francisco

    2007-01-01

    The Internet and growth of computer networks have eliminated geographic barriers, creating an environment where education can be brought to a student no matter where that student may be. The success of distance learning programs and the availability of many Web-supported applications and multimedia resources have increased the effectiveness of…

  6. GeoBrain Computational Cyber-laboratory for Earth Science Studies

    NASA Astrophysics Data System (ADS)

    Deng, M.; di, L.

    2009-12-01

    Computational approaches (e.g., computer-based data visualization, analysis, and modeling) are critical for conducting increasingly data-intensive Earth science (ES) studies to understand functions and changes of the Earth system. However, Earth scientists, educators, and students currently face two major barriers that prevent them from effectively using computational approaches in their learning, research, and application activities: 1) difficulties in finding, obtaining, and using multi-source ES data; and 2) a lack of analytic functions and computing resources (e.g., analysis software, computing models, and high performance computing systems) to analyze the data. Taking advantage of recent advances in cyberinfrastructure, Web service, and geospatial interoperability technologies, GeoBrain, a project funded by NASA, has developed a prototype computational cyber-laboratory to effectively remove the two barriers. The cyber-laboratory makes ES data and computational resources at large organizations in distributed locations available to, and easily usable by, the Earth science community through 1) enabling seamless discovery, access, and retrieval of distributed data, 2) federating and enhancing data discovery with a catalogue federation service and a semantically-augmented catalogue service, 3) customizing data access and retrieval at user request with interoperable, personalized, and on-demand data access and services, 4) automating or semi-automating multi-source geospatial data integration, 5) developing a large number of analytic functions as value-added, interoperable, and dynamically chainable geospatial Web services and deploying them in high-performance computing facilities, 6) enabling online geospatial process modeling and execution, and 7) building a user-friendly, extensible web portal for users to access the cyber-laboratory resources. Users can interactively discover the needed data and perform on-demand data analysis and…
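
    The service-chaining idea at the heart of the cyber-laboratory can be sketched as two dependent HTTP calls, where a catalogue query feeds an analysis service. Every URL, parameter, and response field below is a hypothetical placeholder, not part of the GeoBrain interface.

        import requests

        # Step 1: discover a dataset through an (assumed) catalogue service.
        catalog = requests.get("https://example.org/catalog/search",
                               params={"keyword": "NDVI", "bbox": "-100,30,-90,40"}).json()
        data_url = catalog["results"][0]["access_url"]   # assumed response layout

        # Step 2: chain the result into an (assumed) analysis service.
        analysis = requests.post("https://example.org/wps/subset_and_average",
                                 json={"source": data_url}).json()
        print(analysis["mean"])                          # assumed output field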

  7. Advanced Computing Tools and Models for Accelerator Physics

    SciTech Connect

    Ryne, Robert; Ryne, Robert D.

    2008-06-11

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics.

  8. Evaluation of Advanced Computing Techniques and Technologies: Reconfigurable Computing

    NASA Technical Reports Server (NTRS)

    Wells, B. Earl

    2003-01-01

    The focus of this project was to survey the technology of reconfigurable computing and determine its level of maturity and suitability for NASA applications; to better understand and assess the effectiveness of the reconfigurable design paradigm utilized within the HAL-15 reconfigurable computer system, which was made available to NASA MSFC for this purpose by Star Bridge Systems, Inc.; and to implement at least one application that would benefit from the performance levels possible with reconfigurable hardware. It was originally proposed that experiments in fault tolerance and dynamic reconfigurability would be performed, but time constraints mandated that these be pursued as future research.

  9. Advancements toward matter-antimatter pair plasmas in the laboratory

    NASA Astrophysics Data System (ADS)

    Stenson, E. V.; Hergenhahn, U.; Niemann, H.; Paschkowski, N.; Sunn Pedersen, T.; Saitoh, H.; Stanja, J.; Stoneking, M. R.; Hugenschmidt, C.; Piochacz, C.; Vohburger, S.; Schweikhard, L.; Danielson, J. R.; Surko, C. M.

    2015-11-01

    APEX/PAX (A Positron Electron Experiment/Positron Accumulation Experiment) has as its overarching goal the creation and magnetic confinement of a laboratory electron-positron pair plasma, thereby enabling experimental investigations of a topic that has already been the subject of hundreds of analytical and computational studies. This goal involves several interdependent challenges: design and construction of a suitable magnetic confinement device, access to a sufficient number of sufficiently cool positrons, and refinement of methods for the transfer of the positrons (and an equal number of electrons) into the device. The latest results of the subprojects addressing these challenges will be summarized here. Highlights include efficient (40 percent) injection of the NEPOMUC (Neutron-Induced Positron Source Munich) positron beam into the confinement region of a dipole magnetic field, characterization of the beam at energies from 5 eV to 1 keV, and hour-long electron plasma confinement in a high-field (2.3 Tesla) Penning-Malmberg trap. Presented on behalf of the APEX/PAX team and collaborators.

  10. ASDA - Advanced Suit Design Analyzer computer program

    NASA Technical Reports Server (NTRS)

    Bue, Grant C.; Conger, Bruce C.; Iovine, John V.; Chang, Chi-Min

    1992-01-01

    An ASDA model developed to evaluate the heat and mass transfer characteristics of advanced pressurized suit design concepts for low pressure or vacuum planetary applications is presented. The model is based on a generalized 3-layer suit that uses the Systems Improved Numerical Differencing Analyzer '85 in conjunction with a 41-node FORTRAN routine. The latter simulates the transient heat transfer and respiratory processes of a human body in a suited environment. The user options for the suit encompass a liquid cooled garment, a removable jacket, a CO2/H2O permeable layer, and a phase change layer.
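
    The nodal style of thermal model referred to here reduces to a simple update rule per time step. The sketch below shows a three-node toy version (body, suit layer, fixed environment); all capacitances, conductances, and temperatures are illustrative, not values from the ASDA model.

        import numpy as np

        T = np.array([37.0, 30.0, 20.0])  # node temperatures: body, suit layer, environment (C)
        C = np.array([5000.0, 800.0])     # thermal capacitances of nodes 0 and 1 (J/K)
        G = np.array([15.0, 5.0])         # conductances between adjacent nodes (W/K)
        dt = 1.0                          # time step (s)

        for _ in range(3600):             # one hour of simulated time
            q01 = G[0] * (T[0] - T[1])    # heat flow, body -> suit layer (W)
            q12 = G[1] * (T[1] - T[2])    # heat flow, suit layer -> environment (W)
            T[0] -= q01 / C[0] * dt       # explicit update; environment node held fixed
            T[1] += (q01 - q12) / C[1] * dt
        print(T)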

  11. The Nuclear Energy Advanced Modeling and Simulation Enabling Computational Technologies FY09 Report

    SciTech Connect

    Diachin, L F; Garaizar, F X; Henson, V E; Pope, G

    2009-10-12

    In this document we report on the status of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Enabling Computational Technologies (ECT) effort. In particular, we provide the context for ECT in the broader NEAMS program and describe the three pillars of the ECT effort, namely, (1) tools and libraries, (2) software quality assurance, and (3) computational facility (computers, storage, etc.) needs. We report on our FY09 deliverables to determine the needs of the integrated performance and safety codes (IPSCs) in these three areas and lay out the general plan for software quality assurance to meet the requirements of DOE and the DOE Advanced Fuel Cycle Initiative (AFCI). We conclude with a brief description of our interactions with the Idaho National Laboratory computer center to determine what is needed to expand their role as a NEAMS user facility.

  12. Activities and operations of the Advanced Computing Research Facility, July-October 1986

    SciTech Connect

    Pieper, G.W.

    1986-01-01

    Research activities and operations of the Advanced Computing Research Facility (ACRF) at Argonne National Laboratory are discussed for the period from July 1986 through October 1986. The facility is currently supported by the Department of Energy and is operated by the Mathematics and Computer Science Division at Argonne. Over the past four-month period, a new commercial multiprocessor, the Intel iPSC-VX/d4 hypercube, was installed. In addition, four other commercial multiprocessors continue to be available for research - an Encore Multimax, a Sequent Balance 21000, an Alliant FX/8, and an Intel iPSC/d5 - as well as a locally designed multiprocessor, the Lemur. These machines are being actively used by scientists at Argonne and throughout the nation in a wide variety of projects concerning computer systems with parallel and vector architectures. A variety of classes, workshops, and seminars have been sponsored to train researchers on computing techniques for the advanced computer systems at the facility. For example, courses were offered on writing programs for parallel computer systems, and the facility hosted the first annual Alliant users group meeting. A Sequent users group meeting and a two-day workshop on performance evaluation of parallel computers and programs are being organized.

  13. Intelligent Software Tools for Advanced Computing

    SciTech Connect

    Baumgart, C.W.

    2001-04-03

    Feature extraction and evaluation are two procedures common to the development of any pattern recognition application. These features are the primary pieces of information which are used to train the pattern recognition tool, whether that tool is a neural network, a fuzzy logic rulebase, or a genetic algorithm. Careful selection of the features to be used by the pattern recognition tool can significantly streamline the overall development and training of the solution for the pattern recognition application. This report summarizes the development of an integrated, computer-based software package called the Feature Extraction Toolbox (FET), which can be used for the development and deployment of solutions to generic pattern recognition problems. This toolbox integrates a number of software techniques for signal processing, feature extraction and evaluation, and pattern recognition, all under a single, user-friendly development environment. The toolbox has been developed to run on a laptop computer, so that it may be taken to a site and used to develop pattern recognition applications in the field. A prototype version of this toolbox has been completed and is currently being used for applications development on several projects in support of the Department of Energy.
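
    The extract-features-then-classify workflow the toolbox packages can be illustrated with an off-the-shelf stack; scikit-learn here is a stand-in for the FET's own environment, and the dataset is synthetic.

        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.neural_network import MLPClassifier

        rng = np.random.default_rng(1)
        X = rng.normal(size=(200, 10))           # synthetic "extracted features"
        y = (X[:, 0] + X[:, 3] > 0).astype(int)  # labels driven by two of the features

        # Scale the features, then train a small neural-network classifier.
        model = make_pipeline(StandardScaler(), MLPClassifier(max_iter=1000))
        model.fit(X[:150], y[:150])
        print("holdout accuracy:", model.score(X[150:], y[150:]))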

  14. Advances in computer imaging/applications in facial plastic surgery.

    PubMed

    Papel, I D; Jiannetto, D F

    1999-01-01

    Rapidly progressing computer technology, ever-increasing expectations of patients, and a confusing medicolegal environment require a clarification of the role of computer imaging/applications. Advances in computer technology and its applications are reviewed. A brief historical discussion is included for perspective. Improvements in both hardware and software with the advent of digital imaging have allowed great increases in speed and accuracy in patient imaging. This facilitates doctor-patient communication and possibly realistic patient expectations. Patients seeking cosmetic surgery now often expect preoperative imaging. Although society in general has become more litigious, a literature search up to 1998 reveals no lawsuits directly involving computer imaging. It appears that conservative utilization of computer imaging by the facial plastic surgeon may actually reduce liability and promote communication. Recent advances have significantly enhanced the value of computer imaging in the practice of facial plastic surgery. These technological advances in computer imaging appear to contribute a useful technique for the practice of facial plastic surgery. Inclusion of computer imaging should be given serious consideration as an adjunct to clinical practice.

  15. STRUCTURED LEARNING AND TRAINING ENVIRONMENTS--A PREPARATION LABORATORY FOR ADVANCED MAMMALIAN PHYSIOLOGY.

    ERIC Educational Resources Information Center

    FIEL, NICHOLAS J.; JOHNSTON, RAYMOND F.

    A preparation laboratory was designed to familiarize students in advanced mammalian physiology with laboratory skills and techniques and thus shorten the time they spend in setting up actual experiments. The laboratory lasts 30 minutes, is flexible and simple to operate, and does not require a professor's presence. The basic training unit is the…

  16. Computational tools for the evaluation of laboratory-engineered biocatalysts

    PubMed Central

    Romero-Rivera, Adrian; Garcia-Borràs, Marc

    2017-01-01

    Biocatalysis is based on the application of natural catalysts for new purposes, for which enzymes were not designed. Although the first examples of biocatalysis were reported more than a century ago, biocatalysis was revolutionized after the discovery of an in vitro version of Darwinian evolution called Directed Evolution (DE). Despite the recent advances in the field, major challenges remain to be addressed. Currently, the best experimental approach consists of creating multiple mutations simultaneously while limiting the choices using statistical methods. Still, tens of thousands of variants need to be tested experimentally, and little information is available on how these mutations lead to enhanced enzyme proficiency. This review aims to provide a brief description of the available computational techniques to unveil the molecular basis of improved catalysis achieved by DE. An overview of the strengths and weaknesses of current computational strategies is explored with some recent representative examples. The understanding of how this powerful technique is able to obtain highly active variants is important for the future development of more robust computational methods to predict amino-acid changes needed for activity. PMID:27812570

  17. Computer Security Awareness Guide for Department of Energy Laboratories, Government Agencies, and others for use with Lawrence Livermore National Laboratory's (LLNL): Computer security short subjects videos

    SciTech Connect

    Not Available

    1993-12-31

    Lonnie Moore, the Computer Security Manager, CSSM/CPPM at Lawrence Livermore National Laboratory (LLNL), and Gale Warshawsky, the Coordinator for Computer Security Education & Awareness at LLNL, wanted to share topics such as computer ethics, software piracy, privacy issues, and protecting information in a format that would capture and hold an audience's attention. Four Computer Security Short Subject videos were produced, which ranged from 1-3 minutes each. These videos are very effective education and awareness tools that can be used to generate discussions about computer security concerns and good computing practices. Leaders may incorporate the Short Subjects into presentations. After talking about a subject area, one of the Short Subjects may be shown to highlight that subject matter. Another method for sharing them could be to show a Short Subject first and then lead a discussion about its topic. The cast of characters and a bit of information about their personalities in the LLNL Computer Security Short Subjects is included in this report.

  18. The Role of the State Health Laboratories in Advancing Health Equity.

    PubMed

    King, Ewa; Vanner, Cynthia; Leibovitz, Henry; Smith, Robin

    2016-11-01

    While laboratories play an important and recognized role in many public health programs that require surveillance of disease spread or monitoring of environmental conditions, the role of public laboratories in assessing and advancing health equity is not well understood. Yet, public laboratories collect, provide or generate much of the data used to determine health equity status and monitor health equity trends in multiple settings and disciplines. RI State Health Laboratories, a division of the RI Department of Health, operates programs that help measure and address health disparities. Health equity themes are present in laboratory programs that measure environmental determinants of health and assure equal access to laboratory screening and diagnostic services. This article will review the role of laboratory programs in advancing health equity in the state. Specific examples of laboratory contributions to health equity programs will be provided and examined. Future trends and unmet needs will also be discussed. [Full article available at http://rimed.org/rimedicaljournal-2016-11.asp].

  19. Industry and Academic Consortium for Computer Based Subsurface Geology Laboratory

    NASA Astrophysics Data System (ADS)

    Brown, A. L.; Nunn, J. A.; Sears, S. O.

    2008-12-01

    Twenty-two licenses for Petrel software, acquired through a grant from Schlumberger, are being used to redesign the laboratory portion of Subsurface Geology at Louisiana State University. The course redesign is a cooperative effort between LSU's Geology and Geophysics and Petroleum Engineering Departments and Schlumberger's Technical Training Division. In spring 2008, two laboratory sections were taught with 22 students in each section. The class contained geology majors, petroleum engineering majors, and geology graduate students. Limited enrollments and 3-hour labs make it possible to incorporate hands-on visualization, animation, manipulation of data and images, and access to geological data available online. 24/7 access to the laboratory and step-by-step instructions for Petrel exercises strongly promoted peer instruction and individual learning. Goals of the course redesign include: enhancing visualization of earth materials; strengthening students' ability to acquire, manage, and interpret multifaceted geological information; fostering critical thinking and the scientific method; improving student communication skills; providing cross-training between geologists and engineers; and increasing the quantity, quality, and diversity of students pursuing Earth Science and Petroleum Engineering careers. IT resources available in the laboratory provide students with sophisticated visualization tools, allowing them to switch between 2-D and 3-D reconstructions more seamlessly and enabling them to manipulate larger integrated data-sets, thus permitting more time for critical thinking and hypothesis testing. IT resources also enable faculty and students to work simultaneously with the software to visually interrogate a 3-D data set and immediately test hypotheses formulated in class. Preliminary evaluation of class results indicates that students found the MS-Windows-based Petrel easy to learn. By the end of the semester, students were able to not only map horizons and faults

  20. Advanced Combustion and Fuels; NREL (National Renewable Energy Laboratory)

    SciTech Connect

    Zigler, Brad

    2015-06-08

    Presented at the U.S. Department of Energy Vehicle Technologies Office 2015 Annual Merit Review and Peer Evaluation Meeting, held June 8-12, 2015, in Arlington, Virginia. It addresses the technical barriers of inadequate data and predictive tools for fuel and lubricant effects on advanced combustion engines; the strategy is, through collaboration, to develop techniques, tools, and data to quantify critical fuel physico-chemical effects and so enable the development of advanced combustion engines that use alternative fuels.

  1. Computer-Assisted Foreign Language Teaching and Learning: Technological Advances

    ERIC Educational Resources Information Center

    Zou, Bin; Xing, Minjie; Wang, Yuping; Sun, Mingyu; Xiang, Catherine H.

    2013-01-01

    Computer-Assisted Foreign Language Teaching and Learning: Technological Advances highlights new research and an original framework that brings together foreign language teaching, experiments and testing practices that utilize the most recent and widely used e-learning resources. This comprehensive collection of research will offer linguistic…

  2. 76 FR 64330 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-18

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION... Reliability, Diffusion on Complex Networks, and Reversible Software Execution Systems Report from Applied Math... at: (301) 903-7486 or by email at: Melea.Baker@science.doe.gov . You must make your request for...

  3. 78 FR 56871 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-16

    ... Advanced Scientific Computing Advisory Committee AGENCY: Office of Science, Department of Energy. ACTION... Exascale technical approaches subcommittee Facilities update Report from Applied Math Committee of Visitors...: ( Melea.Baker@science.doe.gov ). You must make your request for an oral statement at least five...

  4. The Federal Government's Role in Advancing Computer Technology

    ERIC Educational Resources Information Center

    Information Hotline, 1978

    1978-01-01

    As part of the Federal Data Processing Reorganization Study submitted by the Science and Technology Team, the Federal Government's role in advancing and diffusing computer technology is discussed. Findings and conclusions assess the state-of-the-art in government and in industry, and five recommendations provide directions for government policy…

  5. Advanced computational research in materials processing for design and manufacturing

    SciTech Connect

    Zacharia, T.

    1994-12-31

    The computational requirements for the design and manufacture of automotive components have increased dramatically as manufacturers work to produce automobiles with three times the mileage. Automotive component design systems are becoming increasingly reliant on structural analysis, requiring larger and more complex analyses, more three-dimensional analyses, larger model sizes, and routine consideration of transient and non-linear effects. Such analyses must be performed rapidly to minimize delays in the design and development process, which drives the need for parallel computing. This paper briefly describes advanced computational research in superplastic forming and automotive crashworthiness.

  6. [Activities of Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2001-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center, Moffett Field, California. RIACS research focuses on the three cornerstones of IT research necessary to meet the future challenges of NASA missions: 1. Automated Reasoning for Autonomous Systems: techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise and fulfill their missions with minimum direction from Earth. 2. Human-Centered Computing: many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities. 3. High Performance Computing and Networking: advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to analysis of large scientific datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply IT research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, visiting scientist programs and student summer programs, designed to encourage and facilitate collaboration between the university and NASA IT research communities.

  7. Advances in Numerical Boundary Conditions for Computational Aeroacoustics

    NASA Technical Reports Server (NTRS)

    Tam, Christopher K. W.

    1997-01-01

    Advances in Computational Aeroacoustics (CAA) depend critically on the availability of accurate, nondispersive, minimally dissipative computational algorithms, as well as high-quality numerical boundary treatments. This paper focuses on recent developments in numerical boundary conditions. In a typical CAA problem, one often encounters two types of boundaries. Because a finite computational domain is used, there are external boundaries, on which boundary conditions simulating the solution outside the computational domain must be imposed. Inside the computational domain, there may be internal boundaries, on which boundary conditions simulating the presence of an object or surface with specific acoustic characteristics must be applied. Numerical boundary conditions, both external and internal, developed for simple model problems are reviewed and examined. Numerical boundary conditions for real aeroacoustic problems are also discussed through specific examples. The paper concludes with a description of some much-needed research in numerical boundary conditions for CAA.

  8. Conformational Analysis in an Advanced Integrated Laboratory Course

    ERIC Educational Resources Information Center

    Ball, David B.; Miller, Randy M.

    2004-01-01

    A series of sophisticated, combined laboratory experiments is described involving the use of various spectroscopic and other techniques in the conformational analysis of cyclohexane systems. The multi-system approach enables students to transcend the one-dimensional procedure and develops their synthetic and diagnostic skills.

  9. A Simple Photochemical Experiment for the Advanced Laboratory.

    ERIC Educational Resources Information Center

    Rosenfeld, Stuart M.

    1986-01-01

    Describes an experiment to provide students with: (1) an introduction to photochemical techniques and theory; (2) experience with semimicro techniques; (3) an application of carbon-13 nuclear magnetic resonance; and (4) a laboratory with some qualities of a genuine experiment. These criteria are met in the photooxidation of 9,…

  10. In Situ Techniques for Monitoring Electrochromism: An Advanced Laboratory Experiment

    ERIC Educational Resources Information Center

    Saricayir, Hakan; Uce, Musa; Koca, Atif

    2010-01-01

    This experiment employs current technology to enhance and extend existing lab content. The basic principles of spectroscopic and electroanalytical techniques and their use in determining material properties are covered in some detail in many undergraduate chemistry programs. However, there are limited examples of laboratory experiments with in…

  11. Advanced sensor-computer technology for urban runoff monitoring

    NASA Astrophysics Data System (ADS)

    Yu, Byunggu; Behera, Pradeep K.; Ramirez Rochac, Juan F.

    2011-04-01

    The paper presents the project team's advanced sensor-computer sphere technology for real-time, continuous monitoring of wastewater runoff at sewer discharge outfalls along the receiving water. This research significantly enhances and extends the previously proposed sensor-computer technology. It offers new computational models for an innovative use of the sensor-computer sphere, comprising an accelerometer, a programmable in-situ computer, solar power, and wireless communication, for real-time, online monitoring of runoff quantity. This innovation can enable more effective planning and decision-making in civil infrastructure, natural environment protection, and water-pollution-related emergencies. The paper presents: (i) the sensor-computer sphere technology; (ii) a significant enhancement to the previously proposed discrete runoff quantity model of this technology; and (iii) a new continuous runoff quantity model. A comparative study of the two models is presented. Based on this study, the paper further investigates: (1) energy-, memory-, and communication-efficient use of the technology for runoff monitoring; and (2) possible sensor extensions for runoff quality monitoring.
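
    The difference between the discrete and continuous runoff quantity models can be pictured with a short sketch: the discrete model treats each flow-rate sample as constant over its logging interval, while the continuous model integrates the sampled hydrograph. The flow-rate values and logging interval below are hypothetical, not the authors' actual formulation.

      import numpy as np

      # Hypothetical flow-rate samples (m^3/s) from an outfall sensor,
      # logged every 60 seconds during a storm event.
      t = np.arange(0, 600, 60)  # seconds
      q = np.array([0.0, 0.2, 0.8, 1.5, 2.1, 1.9, 1.2, 0.6, 0.2, 0.0])

      # Discrete model: each sample held constant over its interval.
      discrete_volume = np.sum(q[:-1] * np.diff(t))

      # Continuous model: trapezoidal integration of the hydrograph.
      continuous_volume = np.sum(0.5 * (q[1:] + q[:-1]) * np.diff(t))

      print(f"discrete:   {discrete_volume:.1f} m^3")
      print(f"continuous: {continuous_volume:.1f} m^3")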

  12. Lawrence Livermore National Laboratory's Computer Security Short Subjects Videos: Hidden Password, The Incident, Dangerous Games and The Mess; Computer Security Awareness Guide

    SciTech Connect

    1993-12-31

    A video on computer security is described. Lonnie Moore, the Computer Security Manager, CSSM/CPPM at Lawrence Livermore National Laboratory (LLNL), and Gale Warshawsky, the Coordinator for Computer Security Education and Awareness at LLNL, wanted to share topics such as computer ethics, software piracy, privacy issues, and protecting information in a format that would capture and hold an audience's attention. Four Computer Security Short Subject videos were produced which ranged from 1-3 minutes each. These videos are very effective education and awareness tools that can be used to generate discussions about computer security concerns and good computing practices.

  13. Cavity Ring down Spectroscopy Experiment for an Advanced Undergraduate Laboratory

    ERIC Educational Resources Information Center

    Stacewicz, T.; Wasylczyk, P.; Kowalczyk, P.; Semczuk, M.

    2007-01-01

    A simple experiment is described that permits advanced undergraduates to learn the principles and applications of the cavity ring down spectroscopy technique. The apparatus is used for measurements of low concentrations of NO₂ produced in air by an electric discharge. We present the setup, experimental procedure, data analysis and some…

  14. Acoustic resonance spectroscopy for the advanced undergraduate laboratory

    NASA Astrophysics Data System (ADS)

    Franco-Villafañe, J. A.; Flores-Olmedo, E.; Báez, G.; Gandarilla-Carrillo, O.; Méndez-Sánchez, R. A.

    2012-11-01

    We present a simple experiment that allows advanced undergraduates to learn the principles and applications of spectroscopy. The technique, known as acoustic resonance spectroscopy, is applied to study a vibrating rod. The setup includes electromagnetic-acoustic transducers, an audio amplifier and a vector network analyzer. Typical results of compressional, torsional and bending waves are analyzed and compared with analytical results.
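
    For a free-free rod, the compressional and torsional resonances follow simple harmonic-series formulas, and the bending resonances follow Euler-Bernoulli beam theory. The sketch below computes all three families for an assumed aluminium rod; the dimensions and material constants are illustrative, not those of the published setup.

      import numpy as np

      # Assumed aluminium rod: length L, diameter d, Young's modulus E,
      # shear modulus G, density rho.
      L, d = 0.50, 0.01                # m
      E, G, rho = 69e9, 26e9, 2700.0   # Pa, Pa, kg/m^3

      c_L, c_T = np.sqrt(E / rho), np.sqrt(G / rho)  # wave speeds
      n = np.arange(1, 4)

      # Free-free rod: compressional and torsional modes are harmonic.
      f_comp = n * c_L / (2 * L)
      f_tors = n * c_T / (2 * L)

      # Bending (Euler-Bernoulli, free-free boundary conditions):
      # f = (beta*L)^2 / (2*pi*L^2) * sqrt(E*I / (rho*A))
      A, I = np.pi * d**2 / 4, np.pi * d**4 / 64
      bL = np.array([4.730, 7.853, 10.996])  # free-free frequency roots
      f_bend = bL**2 / (2 * np.pi * L**2) * np.sqrt(E * I / (rho * A))

      print("compressional (Hz):", f_comp.round(0))
      print("torsional     (Hz):", f_tors.round(0))
      print("bending       (Hz):", f_bend.round(0))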

  15. A Comprehensive Microfluidics Device Construction and Characterization Module for the Advanced Undergraduate Analytical Chemistry Laboratory

    ERIC Educational Resources Information Center

    Piunno, Paul A. E.; Zetina, Adrian; Chu, Norman; Tavares, Anthony J.; Noor, M. Omair; Petryayeva, Eleonora; Uddayasankar, Uvaraj; Veglio, Andrew

    2014-01-01

    An advanced analytical chemistry undergraduate laboratory module on microfluidics that spans 4 weeks (4 h per week) is presented. The laboratory module focuses on comprehensive experiential learning of microfluidic device fabrication and the core characteristics of microfluidic devices as they pertain to fluid flow and the manipulation of samples.…

  16. Integration of a Communicating Science Module into an Advanced Chemistry Laboratory Course

    ERIC Educational Resources Information Center

    Renaud, Jessica; Squier, Christopher; Larsen, Sarah C.

    2006-01-01

    A communicating science module was introduced into an advanced undergraduate physical chemistry laboratory course. The module was integrated into the course such that students received formal instruction in communicating science interwoven with the chemistry laboratory curriculum. The content of the communicating science module included three…

  17. Computer-Based Laboratory For Engine-System Monitors

    NASA Technical Reports Server (NTRS)

    Aguilar, Robert B.; Garcia, Raul C.

    1992-01-01

    Laboratory evaluates artificially intelligent engine-system monitors without potentially hazardous measurements on actual engines. Monitor enhances engine controller by detecting undesirable trends and counteracting them. Once proved in laboratory, monitor will then be tried on real engine.

  18. Advanced Benchmarking for Complex Building Types: Laboratories as an Exemplar

    SciTech Connect

    Mathew, Paul A.; Clear, Robert; Kircher, Kevin; Webster, Tom; Lee, Kwang Ho; Hoyt, Tyler

    2010-08-01

    Complex buildings such as laboratories, data centers and cleanrooms present particular challenges for energy benchmarking because it is difficult to normalize special requirements such as health and safety in laboratories and reliability (i.e., system redundancy to maintain uptime) in data centers, which significantly impact energy use. For example, air change requirements vary widely based on the type of work being performed in each laboratory space. We present methods and tools for energy benchmarking in laboratories, as an exemplar of a complex building type. First, we address whole-building energy metrics and normalization parameters. We present empirical methods based on simple data filtering as well as multivariate regression analysis on the Labs21 database. The regression analysis showed lab type, lab-area ratio and occupancy hours to be significant variables. Yet the dataset did not allow analysis of factors such as plug loads and air change rates, both of which are critical to lab energy use. The simulation-based method uses an EnergyPlus model to generate a benchmark energy intensity normalized for a wider range of parameters. We suggest that these two methods have complementary strengths and limitations. Second, we present "action-oriented" benchmarking, which extends whole-building benchmarking by utilizing system-level features and metrics, such as airflow (W/cfm), to quickly identify a list of potential efficiency actions which can then be used as the basis for a more detailed audit. While action-oriented benchmarking is not an "audit in a box" and is not intended to provide the same degree of accuracy afforded by an energy audit, we demonstrate how it can be used to focus and prioritize audit activity and track performance at the system level. We conclude with key principles that are more broadly applicable to other complex building types.
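
    A minimal sketch of the regression style of whole-building benchmarking described above: fit measured energy use intensity against normalization parameters, then use the fitted model as a peer baseline for a candidate building. The records, units, and variable names here are hypothetical and do not reflect the Labs21 schema.

      import numpy as np

      # Hypothetical lab-building records: lab-area ratio, weekly
      # occupancy hours, and measured energy use intensity (kWh/m^2/yr).
      lab_ratio = np.array([0.30, 0.50, 0.70, 0.40, 0.60, 0.80, 0.20, 0.55])
      occ_hours = np.array([60,   80,   168,  70,   100,  168,  50,   90])
      eui       = np.array([310,  420,  720,  365,  500,  790,  260,  455])

      # Ordinary least squares: eui ~ b0 + b1*lab_ratio + b2*occ_hours
      X = np.column_stack([np.ones_like(lab_ratio), lab_ratio, occ_hours])
      beta, *_ = np.linalg.lstsq(X, eui, rcond=None)

      # Benchmark a candidate building: the predicted EUI is its peer
      # baseline, and the actual-to-predicted ratio flags over-consumers.
      candidate = np.array([1.0, 0.65, 120])
      print(f"peer-baseline EUI: {candidate @ beta:.0f} kWh/m^2/yr")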

  19. The Design and Implementation of NASA's Advanced Flight Computing Module

    NASA Technical Reports Server (NTRS)

    Alkakaj, Leon; Straedy, Richard; Jarvis, Bruce

    1995-01-01

    This paper describes a working flight computer multichip module (MCM) developed jointly, and in collaborative fashion, by JPL and TRW under their respective research programs. The MCM is fabricated by nCHIP and is packaged within a 2 by 4 inch Al package from Coors. This flight computer module is one of three modules under development by NASA's Advanced Flight Computer (AFC) program; development of the Mass Memory and programmable I/O MCM modules will follow. The three building-block modules will then be stacked into a 3D MCM configuration. The mass and volume achieved for the flight computer MCM, 89 grams and 1.5 cubic inches respectively, represent a major enabling technology for future deep space as well as commercial remote sensing applications.

  20. Air Force Weapons Laboratory Computational Requirements for 1976 Through 1980

    DTIC Science & Technology

    1976-01-01

    This final report was prepared by the Air Force Weapons Laboratory (Attn: DYS), Kirtland Air Force Base, New Mexico 87117, under Job Order 06CB (Dr. Clifford E. Rhoades, Jr.), January 1976.

  1. Advanced Simulation and Computing FY17 Implementation Plan, Version 0

    SciTech Connect

    McCoy, Michel; Archer, Bill; Hendrickson, Bruce; Wade, Doug; Hoang, Thuc

    2016-08-29

    The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), and quantifying critical margins and uncertainties. Resolving each issue requires increasingly difficult analyses because the aging process has progressively moved the stockpile further away from the original test base. Where possible, the program also enables the use of high performance computing (HPC) and simulation tools to address broader national security needs, such as foreign nuclear weapon assessments and counter nuclear terrorism.

  2. Activities of the Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Oliger, Joseph

    1994-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under contract with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. Research at RIACS is currently being done in the following areas: (1) parallel computing; (2) advanced methods for scientific computing; (3) high performance networks; and (4) learning systems. RIACS technical reports are usually preprints of manuscripts that have been submitted to research journals or conference proceedings. A list of these reports for the period January 1, 1994 through December 31, 1994 is in the Reports and Abstracts section of this report.

  3. Soft computing in design and manufacturing of advanced materials

    NASA Technical Reports Server (NTRS)

    Cios, Krzysztof J.; Baaklini, George Y.; Vary, Alex

    1993-01-01

    The potential of fuzzy sets and neural networks, often referred to as soft computing, for aiding in all aspects of manufacturing of advanced materials like ceramics is addressed. In design and manufacturing of advanced materials, it is desirable to find which of the many processing variables contribute most to the desired properties of the material. There is also interest in real time quality control of parameters that govern material properties during processing stages. The concepts of fuzzy sets and neural networks are briefly introduced and it is shown how they can be used in the design and manufacturing processes. These two computational methods are alternatives to other methods such as the Taguchi method. The two methods are demonstrated by using data collected at NASA Lewis Research Center. Future research directions are also discussed.

  4. Advanced computer modeling techniques expand belt conveyor technology

    SciTech Connect

    Alspaugh, M.

    1998-07-01

    Increased mining production is continuing to challenge engineers and manufacturers to keep up. The pressure to produce larger and more versatile equipment is increasing. This paper will show some recent major projects in the belt conveyor industry that have pushed the limits of design and engineering technology. Also, it will discuss the systems engineering discipline and advanced computer modeling tools that have helped make these achievements possible. Several examples of technologically advanced designs will be reviewed. However, new technology can sometimes produce increased problems with equipment availability and reliability if not carefully developed. Computer modeling techniques that help one design larger equipment can also compound operational headaches if engineering processes and algorithms are not carefully analyzed every step of the way.

  5. Advances in Cross-Cutting Ideas for Computational Climate Science

    SciTech Connect

    Ng, Esmond; Evans, Katherine J.; Caldwell, Peter; Hoffman, Forrest M.; Jackson, Charles; Van Dam, Kerstin; Leung, Ruby; Martin, Daniel F.; Ostrouchov, George; Tuminaro, Raymond; Ullrich, Paul; Wild, S.; Williams, Samuel

    2017-01-01

    This report presents results from the DOE-sponsored workshop titled "Advancing X-Cutting Ideas for Computational Climate Science Workshop," known as AXICCS, held on September 12-13, 2016 in Rockville, MD. The workshop brought together experts in climate science, computational climate science, computer science, and mathematics to discuss interesting but unsolved science questions regarding climate modeling and simulation, promoted collaboration among the diverse scientists in attendance, and brainstormed about possible tools and capabilities that could be developed to help address them. Several research opportunities that the group felt could advance climate science significantly emerged from discussions at the workshop. These include (1) process-resolving models to provide insight into important processes and features of interest and to inform the development of advanced physical parameterizations, (2) a community effort to develop and provide integrated model credibility, (3) including, organizing, and managing increasingly connected model components that increase model fidelity yet also complexity, and (4) treating Earth system models as one interconnected organism without numerical or data-based boundaries that limit interactions. The group also identified several cross-cutting advances in mathematics, computer science, and computational science that would be needed to enable one or more of these big ideas. It is critical to address the need for organized, verified, and optimized software, which enables the models to grow and continue to provide solutions in which the community can have confidence. Effectively utilizing the newest computer hardware enables simulation efficiency and the ability to handle output from increasingly complex and detailed models. This will be accomplished through hierarchical multiscale algorithms in tandem with new strategies for data handling, analysis, and storage. These big ideas and cross-cutting technologies for enabling

  6. Advances in Laboratory Methods for Detection and Typing of Norovirus

    PubMed Central

    2014-01-01

    Human noroviruses are the leading cause of epidemic and sporadic gastroenteritis across all age groups. Although the disease is usually self-limiting, in the United States norovirus gastroenteritis causes an estimated 56,000 to 71,000 hospitalizations and 570 to 800 deaths each year. This minireview describes the latest data on laboratory methods (molecular, immunological) for norovirus detection, including real-time reverse transcription-quantitative PCR (RT-qPCR) and commercially available immunological assays as well as the latest FDA-cleared multi-gastrointestinal-pathogen platforms. In addition, an overview is provided on the latest nomenclature and molecular epidemiology of human noroviruses. PMID:24989606

  7. Advancing Materials Science using Neutrons at Oak Ridge National Laboratory

    ScienceCinema

    Carpenter, John

    2016-07-12

    Jack Carpenter, pioneer of accelerator-based pulsed spallation neutron sources, talks about neutron science at Oak Ridge National Laboratory (ORNL) and the need for a second target station at the Spallation Neutron Source (SNS). ORNL is the Department of Energy's largest multiprogram science and energy laboratory, and is home to two scientific user facilities serving the neutron science research community: the High Flux Isotope Reactor (HFIR) and SNS. HFIR and SNS provide researchers with unmatched capabilities for understanding the structure and properties of materials, macromolecular and biological systems, and the fundamental physics of the neutron. Neutrons provide a window through which to view materials at a microscopic level, allowing researchers to develop better materials and better products. Neutrons enable us to understand the materials we use in everyday life. Carpenter explains the need for another station to produce long-wavelength, or cold, neutrons to answer questions that can be addressed only with cold neutrons; the second target station is optimized for that purpose. Modern technology depends more and more upon intimate atomic knowledge of materials, and neutrons are an ideal probe.

  8. A Timesharing Computer Program for a General Chemistry Laboratory

    ERIC Educational Resources Information Center

    Cutler, Gary L.; Drum, Donald A.

    1975-01-01

    Describes an experiment in which general and physical chemistry students can determine the heat of vaporization of a volatile substance from experimental laboratory data using timesharing techniques. (MLH)
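
    The underlying analysis is the classic Clausius-Clapeyron treatment: a plot of ln P against 1/T is linear with slope -ΔHvap/R. A minimal modern equivalent of the timesharing program, with hypothetical vapor-pressure readings standing in for student data:

      import numpy as np

      R = 8.314  # J/(mol K)

      # Hypothetical vapor-pressure data for a volatile liquid
      # (T in kelvin, P in kPa); students would use their own readings.
      T = np.array([283.0, 293.0, 303.0, 313.0, 323.0])
      P = np.array([5.2, 9.8, 17.5, 30.0, 49.3])

      # Clausius-Clapeyron: ln P = -(dHvap/R) * (1/T) + C, so a linear
      # fit of ln P against 1/T has slope -dHvap/R.
      slope, intercept = np.polyfit(1.0 / T, np.log(P), 1)
      dH_vap = -slope * R

      print(f"heat of vaporization: {dH_vap / 1000:.1f} kJ/mol")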

  9. High-Performance Computing for Advanced Smart Grid Applications

    SciTech Connect

    Huang, Zhenyu; Chen, Yousu

    2012-07-06

    The power grid is becoming far more complex as a result of the grid evolution meeting an information revolution. With the penetration of smart grid technologies, the grid is evolving at an unprecedented speed, and the information infrastructure is fundamentally improved by large numbers of smart meters and sensors that produce orders of magnitude more data. How to pull data in, perform analysis, and put information out in a real-time manner is a fundamental challenge in smart grid operation and planning. The future power grid requires high performance computing to be one of the foundational technologies in developing the algorithms and tools for the significantly increased complexity. New techniques and computational capabilities are required to meet the demands for higher reliability and better asset utilization, including advanced algorithms and computing hardware for large-scale modeling, simulation, and analysis. This chapter summarizes the computational challenges in smart grid and the need for high performance computing, and presents examples of how high performance computing might be used for future smart grid operation and planning.
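
    The data-parallel pattern behind much of this analysis is simple to illustrate: independent per-feeder computations are mapped across processor cores. The sketch below uses Python's multiprocessing module and synthetic meter readings as stand-ins; production analytics at grid scale would run on HPC clusters with specialized frameworks.

      import numpy as np
      from multiprocessing import Pool

      def feeder_stats(readings):
          """Per-feeder aggregate computed independently, so the work can
          be farmed out across cores (or, at scale, cluster nodes)."""
          r = np.asarray(readings)
          return {"peak_kw": float(r.max()), "mean_kw": float(r.mean())}

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          # Synthetic data: 100 feeders x 1440 one-minute readings.
          feeders = [rng.gamma(2.0, 1.5, 1440) for _ in range(100)]

          with Pool() as pool:  # one worker per CPU core by default
              stats = pool.map(feeder_stats, feeders)

          worst = max(stats, key=lambda s: s["peak_kw"])
          print("highest feeder peak:", round(worst["peak_kw"], 2), "kW")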

  10. Physical and Chemical Properties of the Copper-Alanine System: An Advanced Laboratory Project

    ERIC Educational Resources Information Center

    Farrell, John J.

    1977-01-01

    An integrated physical-analytical-inorganic chemistry laboratory procedure for use with undergraduate biology majors is described. The procedure requires five to six laboratory periods and includes acid-base standardizations, potentiometric determinations, computer usage, spectrophotometric determinations of crystal-field splitting…

  11. Voting with Their Seats: Computer Laboratory Design and the Casual User

    ERIC Educational Resources Information Center

    Spennemann, Dirk H. R.; Atkinson, John; Cornforth, David

    2007-01-01

    Student computer laboratories are provided by most teaching institutions around the world; however, what is the most effective layout for such facilities? The log-in data files from computer laboratories at a regional university in Australia were analysed to determine whether there was a pattern in student seating. In particular, it was…

  12. SEED: A Suite of Instructional Laboratories for Computer Security Education

    ERIC Educational Resources Information Center

    Du, Wenliang; Wang, Ronghua

    2008-01-01

    The security and assurance of our computing infrastructure has become a national priority. To address this priority, higher education has gradually incorporated the principles of computer and information security into the mainstream undergraduate and graduate computer science curricula. To achieve effective education, learning security principles…

  13. Computation of the tip vortex flowfield for advanced aircraft propellers

    NASA Technical Reports Server (NTRS)

    Tsai, Tommy M.; Dejong, Frederick J.; Levy, Ralph

    1988-01-01

    The tip vortex flowfield plays a significant role in the performance of advanced aircraft propellers. The flowfield in the tip region is complex, three-dimensional and viscous with large secondary velocities. An analysis is presented using an approximate set of equations which contains the physics required by the tip vortex flowfield, but which does not require the resources of the full Navier-Stokes equations. A computer code was developed to predict the tip vortex flowfield of advanced aircraft propellers. A grid generation package was developed to allow specification of a variety of advanced aircraft propeller shapes. Calculations of the tip vortex generation on an SR3 type blade at high Reynolds numbers were made using this code and a parametric study was performed to show the effect of tip thickness on tip vortex intensity. In addition, calculations of the tip vortex generation on a NACA 0012 type blade were made, including the flowfield downstream of the blade trailing edge. Comparison of flowfield calculations with experimental data from an F4 blade was made. A user's manual was also prepared for the computer code (NASA CR-182178).

  14. XII Advanced Computing and Analysis Techniques in Physics Research

    NASA Astrophysics Data System (ADS)

    Speer, Thomas; Carminati, Federico; Werlen, Monique

    November 2008 will be a few months after the official start of LHC, when the highest quantum energy ever produced by mankind will be observed by the most complex piece of scientific equipment ever built. LHC will open a new era in physics research and push the frontier of knowledge further. This achievement has been made possible by new technological developments in many fields, but computing is certainly the technology that has made this whole enterprise possible. Accelerator and detector design, construction management, data acquisition, detector monitoring, data analysis, event simulation and theoretical interpretation are all computing-based HEP activities, but they also occur in many other research fields. Computing is everywhere and forms the common link between all the scientists and engineers involved. The ACAT workshop series, created back in 1990 as AIHENP (Artificial Intelligence in High Energy and Nuclear Physics), has been covering the tremendous evolution of computing in its most advanced topics, trying to set up bridges between computer science and experimental and theoretical physics. Conference web-site: http://acat2008.cern.ch/ Programme and presentations: http://indico.cern.ch/conferenceDisplay.py?confId=34666

  15. 'Dry Laboratories' in Science Education; Computer-Based Practical Work.

    ERIC Educational Resources Information Center

    Kirschner, Paul; Huisman, Willibrord

    1998-01-01

    Identifies the problems associated with the use of dry laboratories in science education, presents design considerations for the use of such practicals in science education, and presents examples of innovative nontraditional practicals. Contains 23 references. (DDR)

  16. Condition monitoring through advanced sensor and computational technology : final report (January 2002 to May 2005).

    SciTech Connect

    Kim, Jung-Taek; Luk, Vincent K.

    2005-05-01

    The overall goal of this joint research project was to develop and demonstrate advanced sensors and computational technology for continuous monitoring of the condition of components, structures, and systems in advanced and next-generation nuclear power plants (NPPs). This project included investigating and adapting several advanced sensor technologies from the Korean and US national laboratory research communities, some of which were developed and applied in non-nuclear industries. The project team investigated and developed sophisticated signal processing, noise reduction, and pattern recognition techniques and algorithms. The researchers installed sensors and conducted condition monitoring tests on two test loops, a check valve (an active component) and a piping elbow (a passive component), to demonstrate the feasibility of using advanced sensors and computational technology to achieve the project goal. Acoustic emission (AE) devices, optical fiber sensors, accelerometers, and ultrasonic transducers (UTs) were used to detect the mechanical vibratory response of the check valve and piping elbow in normal and degraded configurations. Chemical sensors were also installed to monitor the water chemistry in the piping elbow test loop. Analysis of the processed sensor data indicates that it is feasible to differentiate between the normal and degraded (with selected degradation mechanisms) configurations of these two components from the acquired sensor signals, but it is questionable whether these methods can reliably identify the level and type of degradation. Additional research and development efforts are needed to refine the differentiation techniques and to reduce the level of uncertainty.
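
    A toy version of the differentiation task shows the signal-processing idea: extract a spectral feature from each vibration trace and compare configurations. The synthetic signals, sampling rate, and single-peak feature below are illustrative assumptions, far simpler than the project's actual algorithms.

      import numpy as np

      fs = 2000.0                      # sampling rate, Hz (assumed)
      t = np.arange(0, 1.0, 1.0 / fs)

      def dominant_frequency(signal):
          """Simple spectral feature: frequency of the largest FFT peak."""
          windowed = signal * np.hanning(len(signal))
          spectrum = np.abs(np.fft.rfft(windowed))
          freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)
          return freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin

      rng = np.random.default_rng(1)
      # Synthetic stand-ins for accelerometer traces: degradation is
      # modeled as a shifted resonance plus extra broadband noise.
      normal = np.sin(2 * np.pi * 120 * t) + 0.1 * rng.standard_normal(t.size)
      degraded = np.sin(2 * np.pi * 147 * t) + 0.4 * rng.standard_normal(t.size)

      for label, sig in [("normal", normal), ("degraded", degraded)]:
          print(label, dominant_frequency(sig), "Hz")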

  17. The use of computers in a materials science laboratory

    NASA Technical Reports Server (NTRS)

    Neville, J. P.

    1990-01-01

    The objective is to make available a method of easily recording the microstructure of a sample by means of a computer. The method requires a minimum investment and little or no instruction on the operation of a computer. An outline of the setup involving a black and white TV camera, a digitizer control box, a metallurgical microscope and a computer screen, printer, and keyboard is shown.

  18. PREFACE: 16th International workshop on Advanced Computing and Analysis Techniques in physics research (ACAT2014)

    NASA Astrophysics Data System (ADS)

    Fiala, L.; Lokajicek, M.; Tumova, N.

    2015-05-01

    This volume of the IOP Conference Series is dedicated to scientific contributions presented at the 16th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2014); this year the motto was "bridging disciplines". The conference took place on September 1-5, 2014, at the Faculty of Civil Engineering, Czech Technical University in Prague, Czech Republic. The 16th edition of ACAT explored the boundaries of computing system architectures, data analysis algorithmics, automatic calculations, and theoretical calculation technologies. It provided a forum for confronting and exchanging ideas among these fields, where new approaches in computing technologies for scientific research were explored and promoted. This year's edition of the workshop brought together over 140 participants from all over the world. The workshop's 16 invited speakers presented key topics on advanced computing and analysis techniques in physics. During the workshop, 60 talks and 40 posters were presented in three tracks: Computing Technology for Physics Research; Data Analysis - Algorithms and Tools; and Computations in Theoretical Physics: Techniques and Methods. The round table enabled discussions on expanding software, knowledge sharing and scientific collaboration in the respective areas. ACAT 2014 was generously sponsored by Western Digital, Brookhaven National Laboratory, Hewlett Packard, DataDirect Networks, M Computers, Bright Computing, Huawei and PDV-Systemhaus. Special appreciation goes to the track liaisons Lorenzo Moneta, Axel Naumann and Grigory Rubtsov for their work on the scientific program and the publication preparation. ACAT's IACC would also like to express its gratitude to all referees for their work on making sure the contributions are published in the proceedings. Our thanks extend to the conference liaisons Andrei Kataev and Jerome Lauret who worked with the local contacts and made this conference possible as well as to the program

  19. Advances in Computational Stability Analysis of Composite Aerospace Structures

    SciTech Connect

    Degenhardt, R.; Araujo, F. C. de

    2010-09-30

    The European aircraft industry demands reduced development and operating costs. Structural weight reduction by exploitation of structural reserves in composite aerospace structures contributes to this aim; however, it requires accurate and experimentally validated stability analysis of real structures under realistic loading conditions. This paper presents several advances in the computational stability analysis of composite aerospace structures that contribute to this field. For stringer-stiffened panels, the main results of the completed EU project COCOMAT are given; it investigated the exploitation of reserves in primary fibre composite fuselage structures through accurate and reliable simulation of postbuckling and collapse. For unstiffened cylindrical composite shells, a new design method is proposed.

  20. Recent Advances in Computed Tomographic Technology: Cardiopulmonary Imaging Applications.

    PubMed

    Tabari, Azadeh; Lo Gullo, Roberto; Murugan, Venkatesh; Otrakji, Alexi; Digumarthy, Subba; Kalra, Mannudeep

    2017-03-01

    Cardiothoracic diseases result in substantial morbidity and mortality. Chest computed tomography (CT) has been an imaging modality of choice for assessing a host of chest diseases, and technologic advances have enabled the emergence of coronary CT angiography as a robust noninvasive test for cardiac imaging. Technologic developments in CT have also enabled the application of dual-energy CT scanning for assessing pulmonary vascular and neoplastic processes. Concerns over increasing radiation dose from CT scanning are being addressed with introduction of more dose-efficient wide-area detector arrays and iterative reconstruction techniques. This review article discusses the technologic innovations in CT and their effect on cardiothoracic applications.

  1. Advanced Computer Science on Internal Ballistics of Solid Rocket Motors

    NASA Astrophysics Data System (ADS)

    Shimada, Toru; Kato, Kazushige; Sekino, Nobuhiro; Tsuboi, Nobuyuki; Seike, Yoshio; Fukunaga, Mihoko; Daimon, Yu; Hasegawa, Hiroshi; Asakawa, Hiroya

    This paper describes the development of a numerical simulation system, called “Advanced Computer Science on SRM Internal Ballistics (ACSSIB)”, for the purpose of improving the performance and reliability of solid rocket motors (SRMs). The ACSSIB system consists of a casting simulation code for solid propellant slurry, a correlation database relating the local burning rate of cured propellant to local slurry flow characteristics, and a numerical code for the internal ballistics of SRMs, as well as relevant hardware. The paper describes mainly the objectives, the contents of this R&D, and the output of fiscal year 2008.
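
    Internal ballistics codes of this kind are built around a chamber mass balance between propellant gas generation and nozzle outflow. The lumped-parameter toy model below sketches that balance; the propellant properties, geometry, and burn-rate coefficients (r = a*P^n) are assumed round numbers for illustration, not values from ACSSIB.

      # Lumped-parameter chamber-pressure model (illustrative only).
      rho_p = 1800.0         # propellant density, kg/m^3 (assumed)
      a, n = 5.0e-5, 0.35    # burn-rate law r = a * P**n (assumed)
      c_star = 1500.0        # characteristic exhaust velocity, m/s
      A_b, A_t = 0.5, 0.002  # burning and throat areas, m^2
      V, RT = 0.05, 8.7e5    # chamber free volume (m^3), gas R*T (J/kg)

      P, dt = 1.0e5, 1.0e-4  # start at ambient pressure; Euler step, s
      for step in range(5001):
          r = a * P**n               # local burning rate, m/s
          m_gen = rho_p * A_b * r    # gas generation, kg/s
          m_out = P * A_t / c_star   # choked nozzle outflow, kg/s
          P += dt * (RT / V) * (m_gen - m_out)
          if step % 1000 == 0:
              print(f"t = {step * dt:.2f} s   P = {P / 1e6:.2f} MPa")

      # Quasi-steady equilibrium pressure, for comparison:
      P_eq = (rho_p * a * c_star * A_b / A_t) ** (1.0 / (1.0 - n))
      print(f"equilibrium  P = {P_eq / 1e6:.2f} MPa")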

  2. Advances in Electromagnetic Modelling through High Performance Computing

    SciTech Connect

    Ko, K.; Folwell, N.; Ge, L.; Guetz, A.; Lee, L.; Li, Z.; Ng, C.; Prudencio, E.; Schussman, G.; Uplenchwar, R.; Xiao, L.; /SLAC

    2006-03-29

    Under the DOE SciDAC project on Accelerator Science and Technology, a suite of electromagnetic codes has been under development at SLAC that are based on unstructured grids for higher accuracy, and use parallel processing to enable large-scale simulation. The new modeling capability is supported by SciDAC collaborations on meshing, solvers, refinement, optimization and visualization. These advances in computational science are described and the application of the parallel eigensolver Omega3P to the cavity design for the International Linear Collider is discussed.

  3. The Scanning Electron Microscope As An Accelerator For The Undergraduate Advanced Physics Laboratory

    SciTech Connect

    Peterson, Randolph S.; Berggren, Karl K.; Mondol, Mark

    2011-06-01

    Few universities or colleges have an accelerator for use with advanced physics laboratories, but many of these institutions have a scanning electron microscope (SEM) on site, often in the biology department. As an accelerator for the undergraduate advanced physics laboratory, the SEM is an excellent substitute for an ion accelerator. Although there are no nuclear physics experiments that can be performed with a typical 30 kV SEM, there is an opportunity for experimental work on accelerator physics, atomic physics, electron-solid interactions, and the basics of modern e-beam lithography.
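
    At 30 kV the beam electrons are already mildly relativistic, which makes even the beam kinematics a worthwhile accelerator-physics exercise. A short sketch using standard constants (30 kV is simply the typical maximum voltage quoted above):

      import math

      e = 1.602176634e-19    # elementary charge, C
      me = 9.1093837015e-31  # electron rest mass, kg
      c = 2.99792458e8       # speed of light, m/s
      h = 6.62607015e-34     # Planck constant, J s

      V_acc = 30e3                             # accelerating voltage, V
      gamma = 1.0 + e * V_acc / (me * c**2)    # Lorentz factor
      v = c * math.sqrt(1.0 - 1.0 / gamma**2)  # electron speed
      lam = h / (gamma * me * v)               # de Broglie wavelength

      print(f"speed: {v / c:.3f} c")
      print(f"de Broglie wavelength: {lam * 1e12:.2f} pm")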

  4. Computational methods of the Advanced Fluid Dynamics Model

    SciTech Connect

    Bohl, W.R.; Wilhelm, D.; Parker, F.R.; Berthier, J.; Maudlin, P.J.; Schmuck, P.; Goutagny, L.; Ichikawa, S.; Ninokata, H.; Luck, L.B.

    1987-01-01

    To more accurately treat severe accidents in fast reactors, a program has been set up to investigate new computational models and approaches. The product of this effort is a computer code, the Advanced Fluid Dynamics Model (AFDM). This paper describes some of the basic features of the numerical algorithm used in AFDM. Aspects receiving particular emphasis are the fractional-step method of time integration, the semi-implicit pressure iteration, the virtual mass inertial terms, the use of three velocity fields, higher order differencing, convection of interfacial area with source and sink terms, multicomponent diffusion processes in heat and mass transfer, the SESAME equation of state, and vectorized programming. A calculated comparison with an isothermal tetralin/ammonia experiment is performed. We conclude that significant improvements are possible in reliably calculating the progression of severe accidents with further development.
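
    Of the algorithmic ingredients listed, the fractional-step method is the easiest to show in miniature: each time step is split into sub-steps that each advance one piece of the physics. The sketch below applies the idea to 1D advection-diffusion on a periodic domain (explicit upwind advection, then implicit diffusion), a drastic simplification of AFDM's three-field, multicomponent setting; all parameters are arbitrary.

      import numpy as np

      N, L = 100, 1.0
      dx = L / N
      x = np.linspace(0.0, L - dx, N)
      u = np.exp(-200.0 * (x - 0.3) ** 2)  # initial pulse
      vel, D, dt = 1.0, 1.0e-3, 2.0e-3     # speed, diffusivity, time step
      m0 = u.sum() * dx                    # initial mass, for a check

      # Implicit diffusion matrix, periodic domain: (I - dt*D*Lap) u = u*
      A = np.eye(N) * (1.0 + 2.0 * D * dt / dx**2)
      for i in range(N):
          A[i, (i - 1) % N] -= D * dt / dx**2
          A[i, (i + 1) % N] -= D * dt / dx**2

      for _ in range(200):
          u = u - vel * dt / dx * (u - np.roll(u, 1))  # step 1: advection
          u = np.linalg.solve(A, u)                    # step 2: diffusion

      print(f"mass drift after 200 steps: {u.sum() * dx - m0:.2e}")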

  5. Accomplishment Summary 1968-1969. Biological Computer Laboratory.

    ERIC Educational Resources Information Center

    Von Foerster, Heinz; And Others

    This report summarizes theoretical, applied, and experimental studies in the areas of computational principles in complex intelligent systems, cybernetics, multivalued logic, and the mechanization of cognitive processes. This work is summarized under the following topic headings: properties of complex dynamic systems; computers and the language…

  6. Computational ocean acoustics: Advances in 3D ocean acoustic modeling

    NASA Astrophysics Data System (ADS)

    Schmidt, Henrik; Jensen, Finn B.

    2012-11-01

    The numerical models of ocean acoustic propagation developed in the 1980s are still in widespread use today, and the field of computational ocean acoustics is often considered mature. However, the explosive increase in computational power available to the community has created opportunities for modeling phenomena that were previously beyond reach. Most notably, three-dimensional propagation and scattering problems have been computationally prohibitive, but are now addressed routinely using brute-force numerical approaches such as the Finite Element Method, in particular for target scattering problems, where they are being combined with traditional wave-theory propagation models in hybrid modeling frameworks. Recent years have also seen the development of hybrid approaches coupling oceanographic circulation models with acoustic propagation models, enabling the forecasting of sonar performance uncertainty in dynamic ocean environments. These and other advances made over the last couple of decades support the notion that the field of computational ocean acoustics is far from mature. [Work supported by the Office of Naval Research, Code 321OA].

  7. Brain-computer interaction research at the Computer Vision and Multimedia Laboratory, University of Geneva.

    PubMed

    Pun, Thierry; Alecu, Teodor Iulian; Chanel, Guillaume; Kronegg, Julien; Voloshynovskiy, Sviatoslav

    2006-06-01

    This paper describes the work being conducted in the domain of brain-computer interaction (BCI) at the Multimodal Interaction Group, Computer Vision and Multimedia Laboratory, University of Geneva, Geneva, Switzerland. The application focus of this work is on multimodal interaction rather than on rehabilitation, that is, on how to augment classical interaction by means of physiological measurements. Three main research topics are addressed. The first concerns the more general problem of recognizing brain source activity from EEGs. In contrast with classical deterministic approaches, we studied iterative, robust, stochastic reconstruction procedures that model source and noise statistics, to overcome known limitations of current techniques. We also developed procedures for optimal electroencephalogram (EEG) sensor system design in terms of placement and number of electrodes. The second topic is the study of BCI protocols and performance from an information-theoretic point of view; various information rate measurements have been compared for assessing BCI abilities. The third research topic concerns the use of EEG and other physiological signals for assessing a user's emotional status.
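
    One widely used information rate measure of the kind compared in this work is the Wolpaw information transfer rate. The sketch below computes it for an assumed protocol; the class count, accuracy, and trial rate are hypothetical examples, not figures from the paper.

      import math

      def wolpaw_itr(n_classes, accuracy, trials_per_min):
          """Wolpaw information transfer rate: bits per trial, scaled
          to bits per minute."""
          n, p = n_classes, accuracy
          if p >= 1.0:
              bits = math.log2(n)
          else:
              bits = (math.log2(n) + p * math.log2(p)
                      + (1.0 - p) * math.log2((1.0 - p) / (n - 1)))
          return bits * trials_per_min

      # Example: 4-class motor-imagery protocol, 80% accuracy, 10 trials/min.
      print(f"{wolpaw_itr(4, 0.80, 10):.1f} bits/min")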

  8. Development, Evaluation and Use of a Student Experience Survey in Undergraduate Science Laboratories: The Advancing Science by Enhancing Learning in the Laboratory Student Laboratory Learning Experience Survey

    NASA Astrophysics Data System (ADS)

    Barrie, Simon C.; Bucat, Robert B.; Buntine, Mark A.; Burke da Silva, Karen; Crisp, Geoffrey T.; George, Adrian V.; Jamie, Ian M.; Kable, Scott H.; Lim, Kieran F.; Pyke, Simon M.; Read, Justin R.; Sharma, Manjula D.; Yeung, Alexandra

    2015-07-01

    Student experience surveys have become increasingly popular for probing various aspects of processes and outcomes in higher education, such as measuring student perceptions of the learning environment and identifying aspects that could be improved. This paper reports on a particular survey for evaluating individual experiments that has been developed over some 15 years as part of a large national Australian study of undergraduate laboratories: Advancing Science by Enhancing Learning in the Laboratory. This paper reports on the development of the survey instrument and its evaluation using student responses to experiments from different institutions in Australia, New Zealand and the USA. A total of 3153 student responses have been analysed using factor analysis. Three factors, motivation, assessment and resources, have been identified as contributing to improved student attitudes to laboratory activities. A central focus of the survey is to provide feedback to practitioners to iteratively improve experiments. Implications for practitioners and researchers are also discussed.

  9. Computational cardiology: how computer simulations could be used to develop new therapies and advance existing ones

    PubMed Central

    Trayanova, Natalia A.; O'Hara, Thomas; Bayer, Jason D.; Boyle, Patrick M.; McDowell, Kathleen S.; Constantino, Jason; Arevalo, Hermenegild J.; Hu, Yuxuan; Vadakkumpadan, Fijoy

    2012-01-01

    This article reviews the latest developments in computational cardiology. It focuses on the contribution of cardiac modelling to the development of new therapies as well as the advancement of existing ones for cardiac arrhythmias and pump dysfunction. Reviewed are cardiac modelling efforts aimed at advancing and optimizing existent therapies for cardiac disease (defibrillation, ablation of ventricular tachycardia, and cardiac resynchronization therapy) and at suggesting novel treatments, including novel molecular targets, as well as efforts to use cardiac models in stratification of patients likely to benefit from a given therapy, and the use of models in diagnostic procedures. PMID:23104919

  10. Advanced Simulation and Computing: A Summary Report to the Director's Review

    SciTech Connect

    McCoy, M G; Peck, T

    2003-06-01

    It has now been three years since the Advanced Simulation and Computing Program (ASCI), as managed by the Defense and Nuclear Technologies (DNT) Directorate, was reviewed by this Director's Review Committee (DRC). Since that time, there has been considerable progress in all components of the ASCI Program, and these developments will be highlighted in this document and in the presentations planned for June 9 and 10, 2003. There have also been some name changes. Today, the Program is called "Advanced Simulation and Computing." Although it retains the familiar acronym ASCI, the initiative nature of the effort has given way to sustained services as an integral part of the Stockpile Stewardship Program (SSP). All computing efforts at LLNL and the other two Defense Program (DP) laboratories are funded and managed under ASCI. This includes the so-called legacy codes, which remain essential tools in stockpile stewardship. The contract between the Department of Energy (DOE) and the University of California (UC) specifies an independent appraisal of Directorate technical work and programmatic management; that appraisal is the work of this DNT Review Committee. Beginning this year, the Laboratory is implementing a new review system. This process was negotiated between UC, the National Nuclear Security Administration (NNSA), and the Laboratory Directors. Central to this approach are eight performance objectives that focus on key programmatic and administrative goals. Associated with each of these objectives are a number of performance measures to more clearly characterize the attainment of the objectives. Each performance measure has a lead directorate and one or more contributing directorates. Each measure has an evaluation plan and identifies the documentation expected to be included in the "Assessment File".

  11. Computer Aided Design: Instructional Manual. The North Dakota High Technology Mobile Laboratory Project.

    ERIC Educational Resources Information Center

    Cheng, Wan-Lee

    This instructional manual contains 12 learning activity packets for use in a workshop on computer-aided design and drafting (CADD). The lessons cover the following topics: introduction to computer graphics and computer-aided design/drafting; coordinate systems; advanced space graphics hardware configuration and basic features of the IBM PC…

  12. A Computer-Assisted Laboratory Sequence for Petroleum Geology.

    ERIC Educational Resources Information Center

    Lumsden, David N.

    1979-01-01

    Describes a competitive oil-play game for petroleum geology students, accompanied by a computer program written in interactive Fortran. The program is not essential but adds interest. (SA)

  13. Computer aids for integrated circuit design at Sandia National Laboratories

    NASA Astrophysics Data System (ADS)

    Brown, G. W.

    A general framework for a hierarchical computer-aided design (CAD) system for VLSI design is described. The system supports both functional and physical design in the area of initial design specification, system synthesis, simulation, mask layout, verification, and documentation. The system is being implemented in phases within a user environment on a DECsystem 20-VAX 11/780 computer network. It supports evolutionary changes as new technologies, design strategies, and application programs are developed.

  14. DOE Advanced Scientific Computing Advisory Committee (ASCAC) Report: Exascale Computing Initiative Review

    SciTech Connect

    Reed, Daniel; Berzins, Martin; Pennington, Robert; Sarkar, Vivek; Taylor, Valerie

    2015-08-01

    On November 19, 2014, the Advanced Scientific Computing Advisory Committee (ASCAC) was charged with reviewing the Department of Energy’s conceptual design for the Exascale Computing Initiative (ECI). In particular, this included assessing whether there are significant gaps in the ECI plan or areas that need to be given priority or extra management attention. Given the breadth and depth of previous reviews of the technical challenges inherent in exascale system design and deployment, the subcommittee focused its assessment on organizational and management issues, considering technical issues only as they informed organizational or management priorities and structures. This report presents the observations and recommendations of the subcommittee.

  15. Recent advances in computational mechanics of the human knee joint.

    PubMed

    Kazemi, M; Dabiri, Y; Li, L P

    2013-01-01

    Computational mechanics has been advanced in every area of orthopedic biomechanics. The objective of this paper is to provide a general review of the computational models used in the analysis of the mechanical function of the knee joint in different loading and pathological conditions. Major review articles published in related areas are summarized first. The constitutive models for soft tissues of the knee are briefly discussed to facilitate understanding the joint modeling. A detailed review of the tibiofemoral joint models is presented thereafter. The geometry reconstruction procedures as well as some critical issues in finite element modeling are also discussed. Computational modeling can be a reliable and effective method for the study of mechanical behavior of the knee joint, if the model is constructed correctly. Single-phase material models have been used to predict the instantaneous load response for the healthy knees and repaired joints, such as total and partial meniscectomies, ACL and PCL reconstructions, and joint replacements. Recently, poromechanical models accounting for fluid pressurization in soft tissues have been proposed to study the viscoelastic response of the healthy and impaired knee joints. While the constitutive modeling has been considerably advanced at the tissue level, many challenges still exist in applying a good material model to three-dimensional joint simulations. A complete model validation at the joint level seems impossible presently, because only simple data can be obtained experimentally. Therefore, model validation may be concentrated on the constitutive laws using multiple mechanical tests of the tissues. Extensive model verifications at the joint level are still crucial for the accuracy of the modeling.

  16. Recent Advances in Computational Mechanics of the Human Knee Joint

    PubMed Central

    Kazemi, M.; Dabiri, Y.; Li, L. P.

    2013-01-01

    Computational mechanics has been advanced in every area of orthopedic biomechanics. The objective of this paper is to provide a general review of the computational models used in the analysis of the mechanical function of the knee joint in different loading and pathological conditions. Major review articles published in related areas are summarized first. The constitutive models for soft tissues of the knee are briefly discussed to facilitate understanding the joint modeling. A detailed review of the tibiofemoral joint models is presented thereafter. The geometry reconstruction procedures as well as some critical issues in finite element modeling are also discussed. Computational modeling can be a reliable and effective method for the study of mechanical behavior of the knee joint, if the model is constructed correctly. Single-phase material models have been used to predict the instantaneous load response for the healthy knees and repaired joints, such as total and partial meniscectomies, ACL and PCL reconstructions, and joint replacements. Recently, poromechanical models accounting for fluid pressurization in soft tissues have been proposed to study the viscoelastic response of the healthy and impaired knee joints. While the constitutive modeling has been considerably advanced at the tissue level, many challenges still exist in applying a good material model to three-dimensional joint simulations. A complete model validation at the joint level seems impossible presently, because only simple data can be obtained experimentally. Therefore, model validation may be concentrated on the constitutive laws using multiple mechanical tests of the tissues. Extensive model verifications at the joint level are still crucial for the accuracy of the modeling. PMID:23509602

  17. Ring-Closing Metathesis: An Advanced Guided-Inquiry Experiment for the Organic Laboratory

    ERIC Educational Resources Information Center

    Schepmann, Hala G.; Mynderse, Michelle

    2010-01-01

    The design and implementation of an advanced guided-inquiry experiment for the organic laboratory is described. Grubbs's second-generation catalyst is used to effect the ring-closing metathesis of diethyl diallylmalonate. The reaction is carried out under an inert atmosphere at room temperature and monitored by argentic TLC. The crude reaction is…

  18. Understanding Fluorescence Measurements through a Guided-Inquiry and Discovery Experiment in Advanced Analytical Laboratory

    ERIC Educational Resources Information Center

    Wilczek-Vera, Grazyna; Salin, Eric Dunbar

    2011-01-01

    An experiment on fluorescence spectroscopy suitable for an advanced analytical laboratory is presented. Its conceptual development used a combination of the expository and discovery styles. The "learn-as-you-go" and direct "hands-on" methodology applied ensures an active role for a student in the process of visualization and discovery of concepts.…

  19. Advanced Undergraduate-Laboratory Experiment on Electron Spin Resonance in Single-Crystal Ruby

    ERIC Educational Resources Information Center

    Collins, Lee A.; And Others

    1974-01-01

    An electron-spin-resonance experiment which has been successfully performed in an advanced undergraduate physics laboratory is described. A discussion of that part of the theory of magnetic resonance necessary for the understanding of the experiment is also provided in this article. (DT)

  20. An Advanced Undergraduate Chemistry Laboratory Experiment Exploring NIR Spectroscopy and Chemometrics

    ERIC Educational Resources Information Center

    Wanke, Randall; Stauffer, Jennifer

    2007-01-01

    An advanced undergraduate chemistry laboratory experiment to study the advantages and hazards of the coupling of NIR spectroscopy and chemometrics is described. The combination is commonly used for analysis and process control of various ingredients used in agriculture, petroleum and food products.

  1. Adapting Advanced Inorganic Chemistry Lecture and Laboratory Instruction for a Legally Blind Student

    ERIC Educational Resources Information Center

    Miecznikowski, John R.; Guberman-Pfeffer, Matthew J.; Butrick, Elizabeth E.; Colangelo, Julie A.; Donaruma, Cristine E.

    2015-01-01

    In this article, the strategies and techniques used to successfully teach advanced inorganic chemistry, in the lecture and laboratory, to a legally blind student are described. At Fairfield University, these separate courses, which have a physical chemistry corequisite or a prerequisite, are taught for junior and senior chemistry and biochemistry…

  2. NASA Trapezoidal Wing Computations Including Transition and Advanced Turbulence Modeling

    NASA Technical Reports Server (NTRS)

    Rumsey, C. L.; Lee-Rausch, E. M.

    2012-01-01

    Flow about the NASA Trapezoidal Wing is computed with several turbulence models by using grids from the first High Lift Prediction Workshop in an effort to advance understanding of computational fluid dynamics modeling for this type of flowfield. Transition is accounted for in many of the computations. In particular, a recently-developed 4-equation transition model is utilized and works well overall. Accounting for transition tends to increase lift and decrease moment, which improves the agreement with experiment. Upper surface flap separation is reduced, and agreement with experimental surface pressures and velocity profiles is improved. The predicted shape of wakes from upstream elements is strongly influenced by grid resolution in regions above the main and flap elements. Turbulence model enhancements to account for rotation and curvature have the general effect of increasing lift and improving the resolution of the wing tip vortex as it convects downstream. However, none of the models improve the prediction of surface pressures near the wing tip, where more grid resolution is needed.

  3. PREFACE: 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT2013)

    NASA Astrophysics Data System (ADS)

    Wang, Jianxiong

    2014-06-01

    This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2013), which took place on 16-21 May 2013 at the Institute of High Energy Physics, Chinese Academy of Sciences, Beijing, China. The workshop series brings together computer science researchers and practitioners, and researchers from particle physics and related fields, to explore and confront the boundaries of computing, automatic data analysis, and theoretical calculation techniques. This year's edition of the workshop brought together over 120 participants from all over the world. Eighteen invited speakers presented key topics ranging from the universe in a computer and computing in the Earth sciences to multivariate data analysis, automated computation in quantum field theory, and computing and data analysis challenges in many fields. Over 70 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. The round-table discussions on open source, knowledge sharing, and scientific collaboration prompted further reflection on these issues. ACAT 2013 was generously sponsored by the Chinese Academy of Sciences (CAS), the National Natural Science Foundation of China (NSFC), Brookhaven National Laboratory in the USA (BNL), Peking University (PKU), the Theoretical Physics Center for Science Facilities of CAS (TPCSF-CAS), and Sugon. We would like to thank all the participants for their scientific contributions and for their enthusiastic participation in all of the workshop's activities. Further information on ACAT 2013 can be found at http://acat2013.ihep.ac.cn. Professor Jianxiong Wang, Institute of High Energy Physics, Chinese Academy of Sciences. Details of committees and sponsors are available in the PDF

  4. PREFACE: 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011)

    NASA Astrophysics Data System (ADS)

    Teodorescu, Liliana; Britton, David; Glover, Nigel; Heinrich, Gudrun; Lauret, Jérôme; Naumann, Axel; Speer, Thomas; Teixeira-Dias, Pedro

    2012-06-01

    ACAT2011 This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011), which took place on 5-7 September 2011 at Brunel University, UK. The workshop series, which began in 1990 in Lyon, France, brings together computer science researchers and practitioners, and researchers from particle physics and related fields, in order to explore and confront the boundaries of computing, automatic data analysis, and theoretical calculation techniques. It is a forum for the exchange of ideas among the fields, exploring and promoting cutting-edge computing, data analysis, and theoretical calculation techniques in fundamental physics research. This year's edition of the workshop brought together over 100 participants from all over the world. Fourteen invited speakers presented key topics on computing ecosystems, cloud computing, multivariate data analysis, symbolic and automatic theoretical calculations, as well as computing and data analysis challenges in astrophysics, bioinformatics, and musicology. Over 80 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. Panel and round-table discussions on data management and multivariate data analysis uncovered new ideas and collaboration opportunities in the respective areas. This edition of ACAT was generously sponsored by the Science and Technology Facilities Council (STFC), the Institute for Particle Physics Phenomenology (IPPP) at Durham University, Brookhaven National Laboratory in the USA, and Dell. We would like to thank all the participants of the workshop for the high level of their scientific contributions and for the enthusiastic participation in all its activities which were, ultimately, the key factors in the

  5. History of computer-assisted data processing in the medical laboratory.

    PubMed

    Porth, A J; Lübke, B

    1996-03-01

    Computer-assisted processing of medical laboratory data started in the sixties. The earliest systems, which arose in English- and German-speaking laboratories, pointed the way for the development of laboratory data processing. The significance and evolution of the fundamental components of a laboratory information system, such as the placing of the request to the laboratory, identification of patients and samples, recording of data, quality control, plausibility control and results, are presented. The subject is given a wider perspective by the inclusion of a comprehensive (chronological) literature index.

  6. Laboratory for Computer Science Progress Report 18, July 1980-June 1981,

    DTIC Science & Technology

    1983-04-01

    Progress Report 18 of the MIT Laboratory for Computer Science, covering the period July 1980 - June 1981; prepared for the Defense Advanced Research Projects Agency (DARPA/DOD).

  7. Evolution of a Computer-Based Testing Laboratory

    ERIC Educational Resources Information Center

    Moskal, Patrick; Caldwell, Richard; Ellis, Taylor

    2009-01-01

    In 2003, faced with increasing growth in technology-based and large-enrollment courses, the College of Business Administration at the University of Central Florida opened a computer-based testing lab to facilitate administration of course examinations. Patrick Moskal, Richard Caldwell, and Taylor Ellis describe the development and evolution of the…

  8. Computer Assisted Laboratory Problems for Teaching Business and Economic Statistics.

    ERIC Educational Resources Information Center

    Moore, Charles N.

    A computer-based Statistical Program to Assist in Teaching Statistics (SPATS) has been successfully developed to aid the teaching of statistics to undergraduates with business and economics majors. SPATS simplifies the problem of experimentally creating and analyzing a variety of populations and of selecting and analyzing different kinds of random…

  9. Three Computer Programs for Use in Introductory Level Physics Laboratories.

    ERIC Educational Resources Information Center

    Kagan, David T.

    1984-01-01

    Describes three computer programs which operate on Apple II+ microcomputers: (1) a menu-driven graph drawing program; (2) a simulation of the Millikan oil drop experiment; and (3) a program used to study the half-life of silver. (Instructions for obtaining the programs from the author are included.) (JN)

  10. An expanded framework for the advanced computational testing and simulation toolkit

    SciTech Connect

    Marques, Osni A.; Drummond, Leroy A.

    2003-11-09

    The Advanced Computational Testing and Simulation (ACTS) Toolkit is a set of computational tools developed primarily at DOE laboratories and aimed at simplifying the solution of common and important computational problems. The use of the tools reduces the development time for new codes, and the tools provide functionality that might not otherwise be available. This document outlines an agenda for expanding the scope of the ACTS Project based on lessons learned from current activities. Highlights of this agenda include peer-reviewed certification of new tools; finding tools to solve problems that are not currently addressed by the Toolkit; working in collaboration with other software initiatives and DOE computer facilities; expanding outreach efforts; promoting interoperability; further developing the tools; and improving the functionality of the ACTS Information Center, among other tasks. The ultimate goal is to make the ACTS tools more widely used and more effective in solving DOE's and the nation's scientific problems through the creation of a reliable software infrastructure for scientific computing.

  11. High performance parallel computers for science: New developments at the Fermilab advanced computer program

    SciTech Connect

    Nash, T.; Areti, H.; Atac, R.; Biel, J.; Cook, A.; Deppe, J.; Edel, M.; Fischler, M.; Gaines, I.; Hance, R.

    1988-08-01

    Fermilab's Advanced Computer Program (ACP) has been developing highly cost-effective, yet practical, parallel computers for high energy physics since 1984. The ACP's latest developments are proceeding in two directions. A second-generation ACP Multiprocessor System for experiments will include $3500 RISC processors, each with performance over 15 VAX MIPS. To support such high performance, the new system allows parallel I/O, parallel interprocess communication, and parallel host processes. The ACP Multi-Array Processor has been developed for theoretical physics. Each $4000 node is a FORTRAN- or C-programmable pipelined 20 MFlops (peak), 10 MByte single-board computer. These are plugged into a 16-port crossbar switch crate which handles both inter- and intra-crate communication. The crates are connected in a hypercube. Site-oriented applications like lattice gauge theory are supported by system software called CANOPY, which makes the hardware virtually transparent to users. A 256-node, 5 GFlop system is under construction. 10 refs., 7 figs.
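
    For readers unfamiliar with the hypercube interconnect mentioned above, the short Python sketch below illustrates the basic idea; it is purely illustrative and is not ACP or CANOPY code. In a d-dimensional hypercube the 2^d nodes carry d-bit IDs, and each node links directly to the d nodes whose IDs differ from its own in exactly one bit, so any two nodes are at most d hops apart.

        def hypercube_neighbors(node: int, dim: int) -> list[int]:
            """Return the IDs of the nodes adjacent to `node` in a dim-cube."""
            return [node ^ (1 << bit) for bit in range(dim)]

        # Example: 8 crates form a 3-cube; crate 5 (0b101) talks directly
        # to crates 4 (0b100), 7 (0b111), and 1 (0b001).
        print(hypercube_neighbors(5, 3))  # -> [4, 7, 1]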

  12. Advanced information processing system: Inter-computer communication services

    NASA Technical Reports Server (NTRS)

    Burkhardt, Laura; Masotto, Tom; Sims, J. Terry; Whittredge, Roy; Alger, Linda S.

    1991-01-01

    The purpose is to document the functional requirements and detailed specifications for the Inter-Computer Communications Services (ICCS) of the Advanced Information Processing System (AIPS). An introductory section is provided to outline the overall architecture and functional requirements of the AIPS and to present an overview of the ICCS. An overview of the AIPS architecture as well as a brief description of the AIPS software is given. The guarantees of the ICCS are provided, and the ICCS is described as a seven-layered International Standards Organization (ISO) Model. The ICCS functional requirements, functional design, and detailed specifications as well as each layer of the ICCS are also described. A summary of results and suggestions for future work are presented.

  13. Computed laser backscattering from turbid liquids - Comparison with laboratory results

    NASA Technical Reports Server (NTRS)

    Poole, L. R.

    1982-01-01

    A recently developed semianalytic Monte Carlo radiative transfer model applicable to oceanographic lidar systems (SALMON) has been used to simulate a series of laboratory experiments studying the backscatter of laser light from dispersions of Teflon spheres. Results obtained with SALMON, using best estimates of pertinent optical parameters, compared quite well with experimental data in both a qualitative and quantitative sense, with the largest relative discrepancies being approximately 30%. The results firmly establish the validity of SALMON in studying the performance of real oceanographic lidar systems.

  14. Computed laser backscattering from turbid liquids - Comparison with laboratory results

    NASA Astrophysics Data System (ADS)

    Poole, L. R.

    1982-06-01

    A recently developed semianalytic Monte Carlo radiative transfer model applicable to oceanographic lidar systems (SALMON) has been used to simulate a series of laboratory experiments studying the backscatter of laser light from dispersions of Teflon spheres. Results obtained with SALMON, using best estimates of pertinent optical parameters, compared quite well with experimental data in both a qualitative and quantitative sense, with the largest relative discrepancies being approximately 30%. The results firmly establish the validity of SALMON in studying the performance of real oceanographic lidar systems.

  15. Students' Cognitive Focus during a Chemistry Laboratory Exercise: Effects of a Computer-Simulated Prelab

    ERIC Educational Resources Information Center

    Winberg, T. Mikael; Berg, C. Anders R.

    2007-01-01

    To enhance the learning outcomes achieved by students, learners undertook a computer-simulated activity based on an acid-base titration prior to a university-level chemistry laboratory activity. Students were categorized with respect to their attitudes toward learning. During the laboratory exercise, questions that students asked their assistant…

  16. Recent advances in computational structural reliability analysis methods

    NASA Technical Reports Server (NTRS)

    Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.

    1993-01-01

    The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given the fact that many different modes of failure are usually possible, achieving this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by the application of safety factors and conservative assumptions. However, the safety-factor approach lacks a quantitative basis, in that the level of reliability is never known, and it usually results in overly conservative designs because of compounding conservatisms. Furthermore, the problem parameters that control the reliability are not identified, nor is their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. There has recently been a significant level of activity in the research and development community, much of it directed towards the prediction of failure probabilities for single-mode failures. The focus is to present some early results and demonstrations of advanced reliability methods applied to structural system problems. This includes structures that can fail as a result of multiple component failures (e.g., a redundant truss), or structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonant vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.
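
    To make the failure-probability notion above concrete, here is a minimal Monte Carlo sketch in Python; the limit-state function and the distributions are made-up illustrations, not the advanced methods the paper surveys (which exist precisely because brute-force sampling becomes expensive for very small failure probabilities).

        import numpy as np

        # Toy limit state: failure occurs when load exceeds resistance,
        # i.e. when g = resistance - load < 0. Both distributions are assumed.
        rng = np.random.default_rng(0)
        n = 1_000_000
        resistance = rng.normal(loc=100.0, scale=10.0, size=n)
        load = rng.normal(loc=60.0, scale=15.0, size=n)

        p_failure = np.mean(resistance - load < 0.0)
        print(f"estimated P(failure) = {p_failure:.2e}")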

  17. Reliability of an interactive computer program for advance care planning.

    PubMed

    Schubart, Jane R; Levi, Benjamin H; Camacho, Fabian; Whitehead, Megan; Farace, Elana; Green, Michael J

    2012-06-01

    Despite widespread efforts to promote advance directives (ADs), completion rates remain low. Making Your Wishes Known: Planning Your Medical Future (MYWK) is an interactive computer program that guides individuals through the process of advance care planning, explaining health conditions and interventions that commonly involve life-or-death decisions, helping them articulate their values/goals, and translating users' preferences into a detailed AD document. The purpose of this study was to demonstrate that (in the absence of major life changes) the AD generated by MYWK reliably reflects an individual's values/preferences. English speakers ≥30 years old completed MYWK twice, 4 to 6 weeks apart. Reliability indices were assessed for three AD components: General Wishes; Specific Wishes for treatment; and Quality-of-Life values (QoL). Twenty-four participants completed the study. Both the Specific Wishes and QoL scales had high internal consistency in both time periods (Kuder-Richardson formula 20 [KR-20]=0.83-0.95, and 0.86-0.89). Test-retest reliability was perfect for General Wishes (κ=1), high for QoL (Pearson's correlation coefficient=0.83), but lower for Specific Wishes (Pearson's correlation coefficient=0.57). MYWK generates an AD whose General Wishes and QoL (but not Specific Wishes) statements remain consistent over time.
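
    For readers unfamiliar with the KR-20 statistic reported above, the Python sketch below computes it for dichotomous (yes/no) items; the response matrix is fabricated for illustration and has no connection to the study's data.

        import numpy as np

        def kr20(items: np.ndarray) -> float:
            """KR-20 internal consistency for an (n_respondents, k_items) 0/1 array."""
            k = items.shape[1]
            p = items.mean(axis=0)                      # proportion endorsing each item
            total_var = items.sum(axis=1).var(ddof=1)   # variance of total scores
            return (k / (k - 1)) * (1.0 - (p * (1.0 - p)).sum() / total_var)

        responses = np.array([[1, 1, 0, 1],
                              [1, 0, 0, 1],
                              [1, 1, 1, 1],
                              [0, 0, 0, 1],
                              [1, 1, 0, 0]])
        print(round(kr20(responses), 3))  # -> 0.595 for this toy data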

  18. Methods in Computational Neuroscience: Marine Biology Laboratory Student Projects

    DTIC Science & Technology

    1988-11-01

    Models of hippocampal subsystems: a model of single CA3 and CA1 pyramidal neurons was constructed with GENESIS, drawing on biophysical and electrophysiological data intrinsic to the hippocampus. The modeled circuit included a CA1 pyramidal cell and two inhibitory interneurons, one feed-forward and one feedback. Using known electrophysiological results obtained from detailed experiments for this visual system, the neural network simulator GENESIS has been used during the Computational

  19. Application of system simulation for engineering the technical computing environment of the Lawrence Livermore National Laboratory

    SciTech Connect

    Boyd, V; Edmunds, T; Minuzzo, K; Powell, E; Roche, L

    1998-09-15

    This report summarizes an investigation performed by Lawrence Livermore National Laboratory's (LLNL) Scientific Computing & Communications Department (SCCD) and the Garland location of Raytheon Systems Company (RSC) from April through August 1998. The study assessed the applicability and benefits of utilizing system simulation in architecting and deploying technical computing assets at LLNL, particularly in support of the ASCI program and associated scientific computing needs. The recommendations and other reported findings reflect the consensus of the investigation team. The investigation showed that there are potential benefits to performing component-level simulation within SCCD in support of the ASCI program. To illustrate this, a modeling exercise was conducted by the study team that generated results consistent with measured operational performance. This activity demonstrated that a relatively modest effort could improve the toolset for making architectural trades and improve levels of understanding for managing operational practices. The capability to evaluate architectural trades was demonstrated by evaluating some of the productivity impacts of changing one of the design parameters of an existing file transfer system. The use of system simulation should be tailored to the local context of resource requirements/limitations, technology plans/processes/issues, design and deployment schedule, and organizational factors. Taking these matters into account, we recommend that simulation modeling be employed within SCCD on a limited basis for targeted engineering studies, and that an overall performance engineering program be established to better equip the Systems Engineering organization to direct future architectural decisions and operational practices. The development of an end-to-end modeling capability and enterprise-level modeling system within SCCD is not warranted in view of the associated development requirements and difficulty in determining firm

  20. Using advanced computer vision algorithms on small mobile robots

    NASA Astrophysics Data System (ADS)

    Kogut, G.; Birchmore, F.; Biagtan Pacis, E.; Everett, H. R.

    2006-05-01

    The Technology Transfer project employs a spiral development process to enhance the functionality and autonomy of mobile robot systems in the Joint Robotics Program (JRP) Robotic Systems Pool by converging existing component technologies onto a transition platform for optimization. An example of this approach is the implementation of advanced computer vision algorithms on small mobile robots. We demonstrate the implementation and testing of the following two algorithms useful on mobile robots: 1) object classification using a boosted cascade of classifiers trained with the Adaboost training algorithm, and 2) human presence detection from a moving platform. Object classification is performed with an Adaboost training system developed at the University of California, San Diego (UCSD) Computer Vision Lab. This classification algorithm has been used to successfully detect the license plates of automobiles in motion in real time. While working towards a solution to increase the robustness of this system to perform generic object recognition, this paper demonstrates an extension of this application by detecting soda cans in a cluttered indoor environment. The human presence detection system uses a data fusion algorithm which combines results from a scanning laser and a thermal imager, and is able to detect humans while both the humans and the robot are moving simultaneously. In both systems, the algorithms were implemented on embedded hardware and optimized for use in real time. Test results are shown for a variety of environments.
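
    For readers who want to experiment with this class of detector, the sketch below runs a boosted cascade of the same general type in Python with OpenCV; OpenCV, the cascade file name, and the image files are illustrative stand-ins, not the UCSD system described in the paper.

        import cv2

        # A cascade slides a detection window across image positions and scales;
        # the early Adaboost-trained stages cheaply reject most windows, so the
        # expensive later stages run on few candidates (hence real-time speed).
        cascade = cv2.CascadeClassifier("soda_can_cascade.xml")  # hypothetical model
        frame = cv2.imread("frame.png")
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

        hits = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=4)
        for (x, y, w, h) in hits:
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.imwrite("detections.png", frame)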

  1. Advances in the computational study of language acquisition.

    PubMed

    Brent, M R

    1996-01-01

    This paper provides a tutorial introduction to computational studies of how children learn their native languages. Its aim is to make recent advances accessible to the broader research community and to place them in the context of current theoretical issues. The first section locates computational studies and behavioral studies within a common theoretical framework. The next two sections review two papers that appear in this volume: one on learning the meanings of words and one on learning the sounds of words. The following section highlights an idea which emerges independently in these two papers and which I have dubbed autonomous bootstrapping. Classical bootstrapping hypotheses propose that children begin to get a toe-hold in a particular linguistic domain, such as syntax, by exploiting information from another domain, such as semantics. Autonomous bootstrapping complements the cross-domain acquisition strategies of classical bootstrapping with strategies that apply within a single domain. Autonomous bootstrapping strategies work by representing partial and/or uncertain linguistic knowledge and using it to analyze the input. The next two sections review two more contributions to this special issue: one on learning word meanings via selectional preferences and one on algorithms for setting grammatical parameters. The final section suggests directions for future research.

  2. Knowledge Retention for Computer Simulations: A study comparing virtual and hands-on laboratories

    NASA Astrophysics Data System (ADS)

    Croom, John R., III

    The use of virtual laboratories has the potential to change physics education. These low-cost, interactive computer activities interest students, allow for easy setup, and give educators a way to teach laboratory-based online classes. This study investigated whether virtual laboratories could replace traditional hands-on laboratories and whether students could retain the same long-term knowledge from virtual laboratories as from hands-on laboratories. The study is a quantitative quasi-experiment that used a multiple posttest design to determine whether students using virtual laboratories would retain the same knowledge as students who performed hands-on laboratories after 9 weeks. The study comprised 336 students from 14 school districts. Students' performance on the laboratories and their retention of the material were measured with a pretest and two posttests, compared using a t test, and examined against a series of factors that might have affected retention. The results showed no significant difference in short-term learning between the hands-on and virtual laboratory groups. There was, however, a significant difference (p = .005) between the groups in long-term retention; students in the hands-on laboratory groups retained more information than those in the virtual laboratory groups. These results suggest that long-term learning is enhanced when a laboratory contains a hands-on component. Finally, the results showed that both groups of students felt their particular laboratory style was superior to the alternative method. The findings of this study can be used to improve the integration of virtual laboratories into science curricula.

  3. Integration of Computer Technology Into an Introductory-Level Neuroscience Laboratory

    ERIC Educational Resources Information Center

    Evert, Denise L.; Goodwin, Gregory; Stavnezer, Amy Jo

    2005-01-01

    We describe 3 computer-based neuroscience laboratories. In the first 2 labs, we used commercially available interactive software to enhance the study of functional and comparative neuroanatomy and neurophysiology. In the remaining lab, we used customized software and hardware in 2 psychophysiological experiments. With the use of the computer-based…

  4. A User Assessment of Workspaces in Selected Music Education Computer Laboratories.

    ERIC Educational Resources Information Center

    Badolato, Michael Jeremy

    A study of 120 students selected from the user populations of four music education computer laboratories was conducted to determine the applicability of current ergonomic and environmental design guidelines in satisfying the needs of users of educational computing workspaces. Eleven categories of workspace factors were organized into a…

  5. Individualizing Instruction in Large Undergraduate Biology Laboratories. II. Computers and Investigation

    ERIC Educational Resources Information Center

    Norberg, Ann Marie

    1975-01-01

    Describes the following uses of computers in college biology laboratories: (1) to organize and analyze research data and (2) to simulate biological systems. Also being developed are computer simulations to systematically prepare students for independent investigations. (See also SE 515 092.) (LS)

  6. Computation of Chemical Shifts for Paramagnetic Molecules: A Laboratory Experiment for the Undergraduate Curriculum

    ERIC Educational Resources Information Center

    Pritchard, Benjamin P.; Simpson, Scott; Zurek, Eva; Autschbach, Jochen

    2014-01-01

    A computational experiment investigating the [superscript 1]H and [superscript 13]C nuclear magnetic resonance (NMR) chemical shifts of molecules with unpaired electrons has been developed and implemented. This experiment is appropriate for an upper-level undergraduate laboratory course in computational, physical, or inorganic chemistry. The…

  7. Experimental and computing strategies in advanced material characterization problems

    SciTech Connect

    Bolzon, G.

    2015-10-28

    The mechanical characterization of materials relies more and more often on sophisticated experimental methods that permit the acquisition of large amounts of data while reducing the invasiveness of the tests. This evolution accompanies the growing demand for non-destructive diagnostic tools that assess the safety level of components in use in structures and infrastructures, for instance in the strategic energy sector. Advanced material systems and properties that are not amenable to traditional techniques, for instance thin layered structures and their adhesion to the relevant substrates, can also be characterized by means of combined experimental-numerical tools elaborating data acquired by full-field measurement techniques. In this context, parameter identification procedures involve the repeated simulation of the laboratory or in situ tests by sophisticated and usually expensive non-linear analyses while, in some situations, reliable and accurate results are required in real time. The effectiveness and the filtering capabilities of reduced models based on decomposition and interpolation techniques can be profitably used to meet these conflicting requirements. This communication summarizes some results recently achieved in this field by the author and her co-workers. The aim is to foster further interaction between the engineering and mathematical communities.
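
    As a concrete, if simplified, illustration of the decomposition-based reduced models mentioned above, the Python sketch below builds a proper-orthogonal-decomposition basis from simulation snapshots via a truncated SVD; the snapshot data are random placeholders, and this is one common technique of the family the abstract alludes to, not the author's specific method.

        import numpy as np

        rng = np.random.default_rng(0)
        # Columns are "snapshots": fields from 40 expensive full-order runs,
        # each with 1000 degrees of freedom (placeholder data here).
        snapshots = rng.standard_normal((1000, 40))

        U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
        basis = U[:, :5]                      # keep the 5 dominant modes

        # A new field is now summarized by 5 coefficients instead of 1000
        # values, which is what makes repeated identification runs affordable.
        new_field = rng.standard_normal(1000)
        coeffs = basis.T @ new_field
        approximation = basis @ coeffs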

  8. Process for selecting NEAMS applications for access to Idaho National Laboratory high performance computing resources

    SciTech Connect

    Michael Pernice

    2010-09-01

    INL has agreed to provide participants in the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program with access to its high performance computing (HPC) resources under sponsorship of the Enabling Computational Technologies (ECT) program element. This report documents the process used to select applications and the software stack in place at INL.

  9. A fission matrix based validation protocol for computed power distributions in the advanced test reactor

    SciTech Connect

    Nielsen, J. W.; Nigg, D. W.; LaPorta, A. W.

    2013-07-01

    The Idaho National Laboratory (INL) has been engaged in a significant multi-year effort to modernize the computational reactor physics tools and validation procedures used to support operations of the Advanced Test Reactor (ATR) and its companion critical facility (ATRC). Several new protocols for validation of computed neutron flux distributions and spectra, as well as for validation of computed fission power distributions, based on new experiments and well-recognized least-squares statistical analysis techniques, have been under development. In the case of power distributions, estimates of the a priori ATR-specific fuel element-to-element fission power correlation and covariance matrices are required for validation analysis. A practical method for generating these matrices using the element-to-element fission matrix is presented, along with a high-order scheme for estimating the underlying fission matrix itself. The proposed methodology is illustrated using the MCNP5 neutron transport code for the required neutronics calculations. The general approach is readily adaptable for implementation using any multidimensional stochastic or deterministic transport code that offers the required level of spatial, angular, and energy resolution in the computed solution for the neutron flux and fission source. (authors)
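
    For orientation, the sketch below shows the textbook property that makes a fission matrix useful: if entry F[i, j] is the expected number of fission neutrons born in element i per fission neutron born in element j, the steady-state fission source is the dominant eigenvector of F. The Python code and the toy matrix are illustrative only, not the INL protocol or its high-order estimation scheme.

        import numpy as np

        def fission_source(F: np.ndarray, tol: float = 1e-10) -> np.ndarray:
            """Dominant eigenvector of the fission matrix F via power iteration."""
            s = np.ones(F.shape[0]) / F.shape[0]
            while True:
                s_new = F @ s
                s_new /= s_new.sum()          # renormalize the source each sweep
                if np.abs(s_new - s).max() < tol:
                    return s_new
                s = s_new

        # Toy 3-element fission matrix (made-up numbers) for demonstration.
        F = np.array([[1.0, 0.3, 0.1],
                      [0.3, 0.9, 0.3],
                      [0.1, 0.3, 1.0]])
        print(fission_source(F))              # relative element fission powers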

  10. Reference site selection report for the advanced liquid metal reactor at the Idaho National Engineering Laboratory

    SciTech Connect

    Sivill, R.L.

    1990-03-01

    This Reference Site Selection Report was prepared by EG&G Idaho, Inc., for General Electric (GE) to provide information for use by the Department of Energy (DOE) in selecting a safety test site for an Advanced Liquid Metal Reactor (ALMR). Similar evaluation studies are planned at other potential DOE sites. The Power Reactor Innovative Small Module (PRISM) concept was developed for the ALMR by GE. An ALMR safety test is planned to be performed on a DOE site to demonstrate safety features and meet Nuclear Regulatory Commission requirements. This study considered possible locations at the Idaho National Engineering Laboratory that met the ALMR Prototype Site Selection Methodology and Criteria. Four sites were identified; after further evaluation, one site was eliminated. Each of the remaining three sites satisfied the criteria and was graded, and the results were relatively close. The study concludes that the Idaho National Engineering Laboratory is a suitable location for an Advanced Liquid Metal Reactor safety test. 23 refs., 13 figs., 9 tabs.

  11. Integrated safeguards testing laboratories in support of the advanced fuel cycle initiative

    SciTech Connect

    Santi, Peter A; Demuth, Scott F; Klasky, Kristen L; Lee, Haeok; Miller, Michael C; Sprinkle, James K; Tobin, Stephen J; Williams, Bradley

    2009-01-01

    A key enabler for advanced fuel cycle safeguards research and technology development for programs such as the Advanced Fuel Cycle Initiative (AFCI) is access to facilities and nuclear materials. This access is necessary in many cases in order to ensure that advanced safeguards techniques and technologies meet the measurement needs for which they were designed. One such crucial facility is a hot cell based laboratory which would allow developers from universities, national laboratories, and commercial companies to perform iterative research and development of advanced safeguards instrumentation under realistic operating conditions but not be subject to production schedule limitations. The need for such a facility arises from the requirement to accurately measure minor actinide and/or fission product bearing nuclear materials that cannot be adequately shielded in glove boxes. With the contraction of the DOE nuclear complex following the end of the cold war, many suitable facilities at DOE sites are increasingly costly to operate and are being evaluated for closure. A hot cell based laboratory that allowed developers to install and remove instrumentation from the hot cell would allow for both risk mitigation and performance optimization of the instrumentation prior to fielding equipment in facilities where maintenance and repair of the instrumentation is difficult or impossible. These benefits are accomplished by providing developers the opportunity to iterate between testing the performance of the instrumentation by measuring realistic types and amounts of nuclear material, and adjusting and refining the instrumentation based on the results of these measurements. In this paper, we review the requirements for such a facility using the Wing 9 hot cells in the Los Alamos National Laboratory's Chemistry and Metallurgy Research facility as a model for such a facility and describe recent use of these hot cells in support of AFCI.

  12. Precision laser range finder system design for Advanced Technology Laboratory applications

    NASA Technical Reports Server (NTRS)

    Golden, K. E.; Kohn, R. L.; Seib, D. H.

    1974-01-01

    A preliminary system design is presented for a pulsed precision ruby laser rangefinder system with a potential range resolution of 0.4 cm when atmospheric effects are negligible. The system proposed for flight testing on the Advanced Technology Laboratory (ATL) consists of a mode-locked ruby laser transmitter, coarse and vernier rangefinder receivers, an optical beacon retroreflector tracking system, and a network of ATL tracking retroreflectors. Performance calculations indicate that spacecraft-to-ground ranging accuracies of 1 to 2 cm are possible.

  13. Recovery Act: Advanced Direct Methanol Fuel Cell for Mobile Computing

    SciTech Connect

    Fletcher, James H.; Cox, Philip; Harrington, William J; Campbell, Joseph L

    2013-09-03

    PROJECT OBJECTIVE: The objective of the project was to advance portable fuel cell system technology towards the commercial targets of power density, energy density, and lifetime. These targets were laid out in the DOE's R&D roadmap to develop an advanced direct methanol fuel cell power supply that meets commercial entry requirements. Such a power supply will enable mobile computers to operate non-stop, unplugged from the wall power outlet, by using the high energy density of methanol fuel contained in a replaceable fuel cartridge. Specifically, this project focused on balance-of-plant component integration and miniaturization, as well as extensive component, subassembly, and integrated-system durability and validation testing. This design has resulted in a pre-production power supply design and a prototype that meet the rigorous demands of consumer electronic applications. PROJECT TASKS: The work plan was designed to meet the project objectives, which corresponded directly with the objectives outlined in the Funding Opportunity Announcement: to engineer the fuel cell balance-of-plant and packaging to meet the needs of consumer electronic systems, specifically at power levels required for mobile computing. UNF used existing balance-of-plant component technologies developed under its current US Army CERDEC project, as well as a previous DOE project completed by PolyFuel, refining them to both miniaturize and integrate their functionality and so increase the system power density and energy density. The benefits of UNF's novel passive water recycling MEA (membrane electrode assembly) and the simplified system architecture it enabled formed the foundation of the design approach. The package design was hardened to address orientation independence, shock, vibration, and environmental requirements. Fuel cartridge and fuel subsystems were improved to ensure effective fuel

  14. Advanced computational sensors technology: testing and evaluation in visible, SWIR, and LWIR imaging

    NASA Astrophysics Data System (ADS)

    Rizk, Charbel G.; Wilson, John P.; Pouliquen, Philippe

    2015-05-01

    The Advanced Computational Sensors Team at the Johns Hopkins University Applied Physics Laboratory and the Johns Hopkins University Department of Electrical and Computer Engineering has been developing advanced readout integrated circuit (ROIC) technology for more than 10 years, with a particular focus on the key challenges of dynamic range, sampling rate, system interface and bandwidth, and detector material or band dependencies. Because the pixel array offers parallel sampling by default, the team successfully demonstrated that adding smarts in the pixel and the chip can increase performance significantly. Each pixel becomes a smart sensor and can operate independently in collecting, processing, and sharing data. In addition, building on the digital circuit revolution, the effective well size can be increased by orders of magnitude over analog designs within the same pixel pitch. This research has yielded an innovative system-on-chip concept: the Flexible Readout and Integration Sensor (FRIS) architecture. All key parameters are programmable and/or can be adjusted dynamically, and the architecture can potentially be sensor- and application-agnostic. This paper reports on the testing and evaluation of one prototype that can support either detector polarity, and includes sample results with visible, short-wavelength infrared (SWIR), and long-wavelength infrared (LWIR) imaging.

  15. Advanced computational tools for optimization and uncertainty quantification of carbon capture processes

    SciTech Connect

    Miller, David C.; Ng, Brenda; Eslick, John

    2014-01-01

    Advanced multi-scale modeling and simulation has the potential to dramatically reduce development time, resulting in considerable cost savings. The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and universities that is developing, demonstrating, and deploying a suite of multi-scale modeling and simulation tools. One significant computational tool is FOQUS, a Framework for Optimization and Quantification of Uncertainty and Sensitivity, which enables basic data submodels, including thermodynamics and kinetics, to be used within detailed process models to rapidly synthesize and optimize a process and determine the level of uncertainty associated with the resulting process. The overall approach of CCSI is described with a more detailed discussion of FOQUS and its application to carbon capture systems.

  16. Quantitative Computed Tomography and Image Analysis for Advanced Muscle Assessment

    PubMed Central

    Edmunds, Kyle Joseph; Gíslason, Magnus K.; Arnadottir, Iris D.; Marcante, Andrea; Piccione, Francesco; Gargiulo, Paolo

    2016-01-01

    Medical imaging is of particular interest in the field of translational myology, as the extant literature describes the utilization of a wide variety of techniques to non-invasively recapitulate and quantify various internal and external tissue morphologies. In the clinical context, medical imaging remains a vital tool for diagnostics and investigative assessment. This review outlines the results from several investigations on the use of computed tomography (CT) and image analysis techniques to assess muscle conditions and degenerative processes due to aging or pathological conditions. Herein, we detail the acquisition of spiral CT images and the use of advanced image analysis tools to characterize muscles in 2D and 3D. Results from these studies recapitulate changes in tissue composition within muscles, as visualized by the association of tissue types with specified Hounsfield Unit (HU) values for fat, loose connective tissue or atrophic muscle, and normal muscle, including fascia and tendon. We show how results from these analyses can be presented as both average HU values and compositions with respect to total muscle volumes, demonstrating the reliability of these tools to monitor, assess, and characterize muscle degeneration. PMID:27478562
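
    The HU-based composition analysis described above reduces, at its core, to counting voxels inside tissue-specific HU windows. The Python sketch below shows that step; the threshold values are illustrative assumptions, not the cutoffs published by these investigators.

        import numpy as np

        def tissue_fractions(hu: np.ndarray) -> dict[str, float]:
            """Fraction of voxels per tissue class in a muscle CT volume."""
            windows = {                          # assumed illustrative HU windows
                "fat": (-200, -10),
                "connective/atrophic muscle": (-10, 40),
                "normal muscle": (40, 200),
            }
            return {name: float(np.mean((hu >= lo) & (hu < hi)))
                    for name, (lo, hi) in windows.items()}

        # Fake HU volume standing in for a segmented muscle region.
        volume = np.random.default_rng(1).normal(30.0, 60.0, size=(64, 64, 64))
        print(tissue_fractions(volume))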

  17. Computer validation in toxicology: historical review for FDA and EPA good laboratory practice.

    PubMed

    Brodish, D L

    1998-01-01

    The application of computer validation principles to Good Laboratory Practice is a fairly recent phenomenon. As automated data collection systems have become more common in toxicology facilities, the U.S. Food and Drug Administration and the U.S. Environmental Protection Agency have begun to focus inspections in this area. This historical review documents the development of regulatory guidance on computer validation in toxicology over the past several decades. An overview of the components of a computer life cycle is presented, including the development of systems descriptions, validation plans, validation testing, system maintenance, SOPs, change control, security considerations, and system retirement. Examples are provided for implementation of computer validation principles on laboratory computer systems in a toxicology facility.

  18. The Astromaterials X-Ray Computed Tomography Laboratory at Johnson Space Center

    NASA Technical Reports Server (NTRS)

    Zeigler, R. A.; Coleff, D. M.; McCubbin, F. M.

    2017-01-01

    The Astromaterials Acquisition and Curation Office at NASA's Johnson Space Center (hereafter JSC curation) is the past, present, and future home of all of NASA's astromaterials sample collections. JSC curation currently houses all or part of nine different sample collections: (1) Apollo samples (1969), (2) Luna samples (1972), (3) Antarctic meteorites (1976), (4) Cosmic Dust particles (1981), (5) Microparticle Impact Collection (1985), (6) Genesis solar wind atoms (2004), (7) Stardust comet Wild-2 particles (2006), (8) Stardust interstellar particles (2006), and (9) Hayabusa asteroid Itokawa particles (2010). Each sample collection is housed in a dedicated clean room, or suite of clean rooms, tailored to the requirements of that sample collection. Our primary goals are to maintain the long-term integrity of the samples and ensure that the samples are distributed for scientific study in a fair, timely, and responsible manner, thus maximizing the return on each sample. Part of the curation process is planning for the future, and we also perform fundamental research in advanced curation initiatives. Advanced Curation is tasked with developing the procedures, technology, and data sets necessary for curating new types of sample collections, or getting new results from existing sample collections [2]. We are (and have been) planning for future curation, including cold curation, extended curation of ices and volatiles, curation of samples with special chemical considerations such as perchlorate-rich samples, and curation of organically- and biologically-sensitive samples. As part of these advanced curation efforts we are augmenting our analytical facilities as well. A micro X-ray computed tomography (micro-XCT) laboratory dedicated to the study of astromaterials will be coming online this spring within the JSC curation office, and we plan to add additional facilities that will enable nondestructive (or minimally-destructive) analyses of astromaterials in the near

  19. Recent advances in data assimilation in computational geodynamic models

    NASA Astrophysics Data System (ADS)

    Ismail-Zadeh, Alik

    2010-05-01

    The QRV method was most recently introduced in geodynamic modelling (Ismail-Zadeh et al., 2007, 2008; Tantsyrev, 2008; Glisovic et al., 2009). The advances in computational geodynamics and in data assimilation attract the interest of the community dealing with lithosphere, mantle, and core dynamics.

  20. Grand Challenges of Advanced Computing for Energy Innovation Report from the Workshop Held July 31-August 2, 2012

    SciTech Connect

    Larzelere, Alex R.; Ashby, Steven F.; Christensen, Dana C.; Crawford, Dona L.; Khaleel, Mohammad A.; John, Grosh; Stults, B. Ray; Lee, Steven L.; Hammond, Steven W.; Grover, Benjamin T.; Neely, Rob; Dudney, Lee Ann; Goldstein, Noah C.; Wells, Jack; Peltz, Jim

    2013-03-06

    On July 31-August 2 of 2012, the U.S. Department of Energy (DOE) held a workshop entitled Grand Challenges of Advanced Computing for Energy Innovation. This workshop built on three earlier workshops that clearly identified the potential for the Department and its national laboratories to enable energy innovation. The specific goal of the workshop was to identify the key challenges that the nation must overcome to apply the full benefit of taxpayer-funded advanced computing technologies to U.S. energy innovation in the ways that the country produces, moves, stores, and uses energy. Perhaps more importantly, the workshop also developed a set of recommendations to help the Department overcome those challenges. These recommendations provide an action plan for what the Department can do in the coming years to improve the nation’s energy future.

  1. High Energy Laboratory Astrophysics Experiments using electron beam ion traps and advanced light sources

    NASA Astrophysics Data System (ADS)

    Brown, Gregory V.; Beiersdorfer, Peter; Bernitt, Sven; Eberle, Sita; Hell, Natalie; Kilbourne, Caroline; Kelley, Rich; Leutenegger, Maurice; Porter, F. Scott; Rudolph, Jan; Steinbrugge, Rene; Traebert, Elmar; Crespo-Lopez-Urritia, Jose R.

    2015-08-01

    We have used the Lawrence Livermore National Laboratory's EBIT-I electron beam ion trap, coupled with a NASA/GSFC microcalorimeter spectrometer instrument, to systematically address problems found in the analysis of high resolution X-ray spectra from celestial sources, and to benchmark atomic physics codes employed by high resolution spectral modeling packages. Our results include laboratory measurements of transition energies, absolute and relative electron impact excitation cross sections, charge exchange cross sections, and dielectronic recombination resonance strengths. More recently, we have coupled the Max Planck Institute for Nuclear Physics (Heidelberg) FLASH-EBIT electron beam ion trap to third- and fourth-generation advanced light sources to measure photoexcitation and photoionization cross sections, as well as natural line widths of X-ray transitions in highly charged iron ions. Selected results will be presented.

  2. The impact of recent advances in laboratory astrophysics on our understanding of the cosmos.

    PubMed

    Savin, D W; Brickhouse, N S; Cowan, J J; Drake, R P; Federman, S R; Ferland, G J; Frank, A; Gudipati, M S; Haxton, W C; Herbst, E; Profumo, S; Salama, F; Ziurys, L M; Zweibel, E G

    2012-03-01

    An emerging theme in modern astrophysics is the connection between astronomical observations and the underlying physical phenomena that drive our cosmos. Both the mechanisms responsible for the observed astrophysical phenomena and the tools used to probe such phenomena (the radiation and particle spectra we observe) have their roots in atomic, molecular, condensed matter, plasma, nuclear, and particle physics. Chemistry is implicitly included in both molecular and condensed matter physics. This connection is the theme of the present report, which provides a broad, though non-exhaustive, overview of progress in our understanding of the cosmos resulting from recent theoretical and experimental advances in what is commonly called laboratory astrophysics. This work, carried out by a diverse community of laboratory astrophysicists, is increasingly important as astrophysics transitions into an era of precise measurement and high-fidelity modeling.

  3. Request for Information from entities interested in commercializing Laboratory-developed advanced in vitro assessment technology

    SciTech Connect

    Intrator, Miranda Huang

    2016-03-30

    Los Alamos National Security, LLC (LANS) is the manager and operator of Los Alamos National Laboratory (Los Alamos) for the U.S. Department of Energy National Nuclear Security Administration under contract DE-AC52-06NA25396. Los Alamos is a mission-centric Federally Funded Research and Development Center focused on solving critical national security challenges through science and engineering for both government and private customers. LANS is opening this formal Request for Information (RFI) to gauge interest in engaging as an industry partner to LANS for collaboration in advancing the bio-assessment platform described below. Please see the last section for details on submitting a Letter of Interest.

  4. A landmark recognition and tracking experiment for flight on the Shuttle/Advanced Technology Laboratory (ATL)

    NASA Technical Reports Server (NTRS)

    Welch, J. D.

    1975-01-01

    The preliminary design of an experiment for landmark recognition and tracking from the Shuttle/Advanced Technology Laboratory is described. It makes use of parallel coherent optical processing to perform correlation tests between landmarks observed passively with a telescope and previously made holographic matched filters. The experimental equipment including the optics, the low power laser, the random access file of matched filters and the electro-optical readout device are described. A real time optically excited liquid crystal device is recommended for performing the input non-coherent optical to coherent optical interface function. A development program leading to a flight experiment in 1981 is outlined.

  5. The Design and Transfer of Advanced Command and Control (C2) Computer-Based Systems

    DTIC Science & Technology

    1980-03-31

    TECHNICAL REPORT 80-02. QUARTERLY TECHNICAL REPORT: THE DESIGN AND TRANSFER OF ADVANCED COMMAND AND CONTROL (C2) COMPUTER-BASED SYSTEMS (ARPA...). The tasks, objectives, and purposes of the overall project are connected with the design, development, demonstration, and transfer of advanced command and control (C2) computer-based systems; this report covers work in the computer-based design and transfer areas only. The Technical Problems thus

  6. First 3 years of operation of RIACS (Research Institute for Advanced Computer Science) (1983-1985)

    NASA Technical Reports Server (NTRS)

    Denning, P. J.

    1986-01-01

    The focus of the Research Institute for Advanced Computer Science (RIACS) is to explore matches between advanced computing architectures and the processes of scientific research. An architecture evaluation of the MIT static dataflow machine, specification of a graphical language for expressing distributed computations, and specification of an expert system for aiding in grid generation for two-dimensional flow problems were initiated. Research projects for 1984 and 1985 are summarized.

  7. Student teaching and research laboratory focusing on brain-computer interface paradigms--A creative environment for computer science students.

    PubMed

    Rutkowski, Tomasz M

    2015-08-01

    This paper presents an applied concept of a brain-computer interface (BCI) student research laboratory (BCI-LAB) at the Life Science Center of TARA, University of Tsukuba, Japan. Several successful case studies of the student projects are reviewed, together with the case of the BCI Research Award 2014 winner. The BCI-LAB design and project-based teaching philosophy are also explained. The review concludes with a summary of future teaching and research directions.

  8. Usnic Acid and the Intramolecular Hydrogen Bond: A Computational Experiment for the Organic Laboratory

    ERIC Educational Resources Information Center

    Green, Thomas K.; Lane, Charles A.

    2006-01-01

    A computational experiment is described for the organic chemistry laboratory that allows students to estimate the relative strengths of the intramolecular hydrogen bonds of usnic and isousnic acids, two related lichen secondary metabolites. Students first extract and purify usnic acid from common lichens and obtain ¹H NMR and IR…

  9. Motion of Electrons in Electric and Magnetic Fields: Introductory Laboratory and Computer Studies.

    ERIC Educational Resources Information Center

    Huggins, Elisha R.; Lelek, Jeffrey J.

    1979-01-01

    Describes a series of laboratory experiments and computer simulations of the motion of electrons in electric and magnetic fields. These experiments, which involve an inexpensive student-built electron gun, study the electron mean free path, magnetic focusing, and other aspects. (Author/HM)

  10. Annotated List of Chemistry Laboratory Experiments with Computer Access. Final Report.

    ERIC Educational Resources Information Center

    Bunce, S. C.; And Others

    Project Chemlab was designed to prepare an "Annotated List of Laboratory Experiments in Chemistry from the Journal of Chemical Education (1957-1979)" and to develop a computer file and program to search for specific types of experiments. Provided in this document are listings (photoreduced copies of printouts) of over 1500 entries…

  11. Journal of Chemical Education: Software: Abstract of "The Computer-Based Laboratory."

    ERIC Educational Resources Information Center

    Krause, Daniel C.; Divis, Lynne M., Ed.

    1988-01-01

    Describes a chemistry laboratory software package for interfacing the Apple IIe for high school and introductory college courses. Topics include: thermistor calibration, phase change, heat of reaction, freezing point depression, Beer's law, and color decay in crystal violet. Explains the computer interface and the tools needed. (MVL)

  12. BASIC and the Density of Glass. A First-Year Laboratory/Computer Experiment.

    ERIC Educational Resources Information Center

    Harris, Arlo D.

    1986-01-01

    Describes a first-year chemistry laboratory experiment which uses a simple computer program written in BASIC, to analyze data collected by students about the density of a set of marbles. A listing of the program is provided, along with a sample printout of the experiment's results. (TW)

  13. Computational studies of horizontal axis wind turbines in high wind speed condition using advanced turbulence models

    NASA Astrophysics Data System (ADS)

    Benjanirat, Sarun

    Next generation horizontal-axis wind turbines (HAWTs) will operate at very high wind speeds. Existing engineering approaches for modeling the flow phenomena are based on blade element theory, and cannot adequately account for 3-D separated, unsteady flow effects. Therefore, researchers around the world are beginning to model these flows using first principles-based computational fluid dynamics (CFD) approaches. In this study, an existing first principles-based Navier-Stokes approach is being enhanced to model HAWTs at high wind speeds. The enhancements include improved grid topology, implicit time-marching algorithms, and advanced turbulence models. The advanced turbulence models include the Spalart-Allmaras one-equation model, as well as the k-ε, k-ω, and Shear Stress Transport (k-ω SST) two-equation models. These models are also integrated with detached eddy simulation (DES) models. Results are presented for a range of wind speeds for the National Renewable Energy Laboratory Phase VI rotor, tested at NASA Ames Research Center. Grid sensitivity studies are also presented. Additionally, effects of existing transition models on the predictions are assessed. Data presented include power/torque production, radial distribution of normal and tangential pressure forces, root bending moments, and surface pressure fields. Good agreement was obtained between the predictions and experiments for most of the conditions, particularly with the Spalart-Allmaras DES model.
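
    To make the one-equation closure named above concrete: the Spalart-Allmaras model transports a single working variable from which the eddy viscosity is recovered through a damping function. The following is a minimal Python sketch of that standard relation, using the published model constant; it is illustrative only and is not the solver used in the cited study.

        # Minimal sketch: Spalart-Allmaras eddy-viscosity closure (illustrative only).
        # The model transports a working variable nu_tilde; the turbulent (eddy)
        # viscosity follows from the damping function f_v1. The constant is the
        # standard published value; nothing here is specific to the cited study.

        C_V1 = 7.1  # standard Spalart-Allmaras model constant

        def eddy_viscosity(nu_tilde: float, nu: float) -> float:
            """Return turbulent viscosity nu_t from the SA working variable."""
            chi = nu_tilde / nu                   # working variable / molecular viscosity
            f_v1 = chi**3 / (chi**3 + C_V1**3)    # near-wall damping function
            return nu_tilde * f_v1

        # Example: air (nu ~ 1.5e-5 m^2/s) with nu_tilde three times nu
        print(eddy_viscosity(3 * 1.5e-5, 1.5e-5))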

  14. Current research activities at the NASA-sponsored Illinois Computing Laboratory of Aerospace Systems and Software

    NASA Technical Reports Server (NTRS)

    Smith, Kathryn A.

    1994-01-01

    The Illinois Computing Laboratory of Aerospace Systems and Software (ICLASS) was established to: (1) pursue research in the areas of aerospace computing systems, software and applications of critical importance to NASA, and (2) develop and maintain close contacts between researchers at ICLASS and at various NASA centers to stimulate interaction and cooperation, and facilitate technology transfer. Current ICLASS activities are in the areas of parallel architectures and algorithms, reliable and fault tolerant computing, real time systems, distributed systems, software engineering and artificial intelligence.

  15. A Model Computing Laboratory for University Schools of Nursing: The Michigan Experience

    PubMed Central

    Schultz, Samuel

    1982-01-01

    This paper presents an historical view of a prototype four-phase developmental system for university level instruction in computing and data analysis for nursing curricula. Specific hardware, instrumented classrooms and computing laboratory designs are discussed as they relate to typical program growth. Although the historical view presents a system which spans more than 13 years of growth, the paper concludes with a presentation of the current state-of-the-art microcomputer system and an archetypical four-phase system model potentially useful for other health science computing curricula.

  16. Generalized Advanced Propeller Analysis System (GAPAS). Volume 2: Computer program user manual

    NASA Technical Reports Server (NTRS)

    Glatt, L.; Crawford, D. R.; Kosmatka, J. B.; Swigart, R. J.; Wong, E. W.

    1986-01-01

    The Generalized Advanced Propeller Analysis System (GAPAS) computer code is described. GAPAS was developed to analyze advanced technology multi-bladed propellers which operate on aircraft with speeds up to Mach 0.8 and altitudes up to 40,000 feet. GAPAS includes technology for analyzing aerodynamic, structural, and acoustic performance of propellers. The computer code was developed for the CDC 7600 computer and is currently available for industrial use on the NASA Langley computer. A description of all the analytical models incorporated in GAPAS is included. Sample calculations are also described, as well as user requirements for modifying the analysis system. Computer system core requirements and running times are also discussed.

  17. Recent advances in computational methods for nuclear magnetic resonance data processing.

    PubMed

    Gao, Xin

    2013-02-01

    Although three-dimensional protein structure determination using nuclear magnetic resonance (NMR) spectroscopy is a computationally costly and tedious process that would benefit from advanced computational techniques, it has not garnered much research attention from specialists in bioinformatics and computational biology. In this paper, we review recent advances in computational methods for NMR protein structure determination. We summarize the advantages of and bottlenecks in the existing methods and outline some open problems in the field. We also discuss current trends in NMR technology development and suggest directions for research on future computational methods for NMR.

  18. Integration of Computational Chemistry into the Undergraduate Organic Chemistry Laboratory Curriculum

    ERIC Educational Resources Information Center

    Esselman, Brian J.; Hill, Nicholas J.

    2016-01-01

    Advances in software and hardware have promoted the use of computational chemistry in all branches of chemical research to probe important chemical concepts and to support experimentation. Consequently, it has become imperative that students in the modern undergraduate curriculum become adept at performing simple calculations using computational…

  19. Audit of desktop computer acquisitions at the Idaho National Engineering and Environmental Laboratory

    SciTech Connect

    1997-08-25

    Federal and Department of Energy (Department) acquisition regulations, policies and procedures, as well as the terms of the current contract between the Idaho Operations Office (Idaho) and Lockheed Martin Idaho Technologies Company (Lockheed) require them to pay the lowest possible prices for desktop computers needed to support the overall mission at the Idaho National Engineering and Environmental Laboratory (Laboratory). The purpose of this audit was to determine Idaho's and Lockheed's success in achieving this price goal. Idaho and Lockheed have implemented numerous efficiency standards that are expected to reduce computer service and maintenance costs as well as increase employee productivity by approximately $3.6 million per year. However, the audit showed that Lockheed did not always pay the lowest possible prices for desktop computers because its standard desktop computer configuration was excessive. Additionally, some desktop computers that Lockheed acquired exceeded its established standard and were not fully justified in accordance with established policies and procedures. Further, Lockheed purchased desktop computers from a local vendor rather than a less costly alternative source and did not pursue the possibly more economical option of leasing computers.

  20. The effect of introducing computers into an introductory physics problem-solving laboratory

    NASA Astrophysics Data System (ADS)

    McCullough, Laura Ellen

    2000-10-01

    Computers are appearing in every type of classroom across the country, yet they often appear without the benefit of studies of their effects. The research that is available on computer use in classrooms has found mixed results, and often ignores the theoretical and instructional contexts of the computer in the classroom. The University of Minnesota's physics department employs a cooperative-group problem solving pedagogy, based on a cognitive apprenticeship instructional model, in its calculus-based introductory physics course. This study was designed to determine possible negative effects of introducing a computerized data-acquisition and analysis tool into this pedagogy as a problem-solving tool for students to use in laboratory. To determine the effects of the computer tool, two quasi-experimental treatment groups were selected. The computer-tool group (N = 170) used a tool designed for this study (VideoTool) to collect and analyze motion data in the laboratory. The control group (N = 170) used traditional non-computer equipment (spark tapes and Polaroid™ film). The curriculum was kept as similar as possible for the two groups. During the ten-week academic quarter, groups were examined for effects on performance on conceptual tests and grades, attitudes towards the laboratory and the laboratory tools, and behaviors within cooperative groups. Possible interactions with gender were also examined. Few differences were found between the control and computer-tool groups. The control group received slightly higher scores on one conceptual test, but this difference was not educationally significant. The computer-tool group had slightly more positive attitudes towards using the computer tool than their counterparts had towards the traditional tools. The computer-tool group also perceived that they spoke more frequently about physics misunderstandings, while the control group felt that they discussed equipment difficulties more often. This perceptual difference interacted

  1. An ultrafast optics undergraduate advanced laboratory with a mode-locked fiber laser

    NASA Astrophysics Data System (ADS)

    Schaffer, Andrew; Fredrick, Connor; Hoyt, Chad; Jones, Jason

    2015-05-01

    We describe an ultrafast optics undergraduate advanced laboratory comprising a mode-locked erbium fiber laser, auto-correlation measurements, and an external, free-space parallel grating dispersion compensation apparatus. The simple design of the stretched pulse laser uses nonlinear polarization rotation mode-locking to produce pulses at a repetition rate of 55 MHz and average power of 5.5 mW. Interferometric and intensity auto-correlation measurements are made using a Michelson interferometer that takes advantage of the two-photon nonlinear response of a common silicon photodiode for the second order correlation between 1550 nm laser pulses. After a pre-amplifier and compression, pulse widths as narrow as 108 fs are measured at 17 mW average power. A detailed parts list includes previously owned and common components used by the telecommunications industry, which may bring the cost of the lab within reach of many undergraduate and graduate departments. We also describe progress toward a relatively low-cost optical frequency comb advanced laboratory. NSF EIR #1208930.
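
    Because the pulse width is inferred from an autocorrelation trace rather than measured directly, a worked example helps: for a Gaussian pulse, the intensity autocorrelation is wider than the pulse by a factor of √2, which is the deconvolution factor applied to traces like the 108 fs measurement. A minimal Python sketch follows; the 108 fs value is taken from the abstract, and everything else is assumed for illustration.

        # Minimal sketch: intensity autocorrelation of a Gaussian pulse, showing
        # the sqrt(2) deconvolution factor used to infer pulse width from a
        # measured trace. Illustrative only; not the laboratory's analysis code.
        import numpy as np

        fwhm_pulse = 108e-15                          # assumed pulse FWHM, seconds
        sigma = fwhm_pulse / (2 * np.sqrt(2 * np.log(2)))
        t = np.linspace(-2e-12, 2e-12, 4001)          # 1 fs grid spacing
        intensity = np.exp(-t**2 / (2 * sigma**2))    # Gaussian intensity envelope

        # Autocorrelation A(tau) = integral of I(t) * I(t - tau) dt
        ac = np.correlate(intensity, intensity, mode="same")
        ac /= ac.max()

        # Measure the autocorrelation FWHM and compare with the pulse FWHM
        above = t[ac >= 0.5]
        fwhm_ac = above[-1] - above[0]
        print(fwhm_ac / fwhm_pulse)   # ~1.414: divide the AC width by sqrt(2)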

  2. A driver linac for the Advanced Exotic Beam Laboratory : physics design and beam dynamics simulations.

    SciTech Connect

    Ostroumov, P. N.; Mustapha, B.; Nolen, J.; Physics

    2007-01-01

    The Advanced Exotic Beam Laboratory (AEBL) being developed at ANL consists of an 833 MV heavy-ion driver linac capable of producing uranium ions up to 200 MeV/u and protons to 580 MeV with 400 kW beam power. We have designed all accelerator components including a two charge state LEBT, an RFQ, a MEBT, a superconducting linac, a stripper station and chicane. We present the results of an optimized linac design and end-to-end simulations including machine errors and detailed beam loss analysis. The Advanced Exotic Beam Laboratory (AEBL) has been proposed at ANL as a reduced-scale version of the original Rare Isotope Accelerator (RIA) project, with about half the cost but the same beam power. AEBL will address 90% or more of RIA physics but with reduced multi-user capabilities. The focus of this paper is the physics design and beam dynamics simulations of the AEBL driver linac. The reported results are for a multiple charge state ²³⁸U beam.
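
    As a rough consistency check on the quoted figures, beam power divided by energy per particle gives the implied particle rate. The Python sketch below is back-of-envelope arithmetic, not a design calculation; note that the electrical current for uranium would additionally depend on the charge state.

        # Minimal sketch: particle rates implied by the quoted 400 kW beam power
        # (illustrative arithmetic only).
        E_CHARGE = 1.602176634e-19  # joules per eV

        def particle_rate(power_w: float, energy_ev_per_particle: float) -> float:
            """Particles per second needed to deliver a given beam power."""
            return power_w / (energy_ev_per_particle * E_CHARGE)

        # Protons at 580 MeV
        print(particle_rate(400e3, 580e6))        # ~4.3e15 protons/s (~0.69 mA)

        # 238U at 200 MeV/u, i.e. ~47.6 GeV per ion
        print(particle_rate(400e3, 200e6 * 238))  # ~5.2e13 ions/s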

  3. Computer architectures for computational physics work done by Computational Research and Technology Branch and Advanced Computational Concepts Group

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Slides are reproduced that describe the importance of having high performance number crunching and graphics capability. They also indicate the types of research and development underway at Ames Research Center to ensure that, in the near term, Ames is a smart buyer and user, and in the long-term that Ames knows the best possible solutions for number crunching and graphics needs. The drivers for this research are real computational physics applications of interest to Ames and NASA. They are concerned with how to map the applications, and how to maximize the physics learned from the results of the calculations. The computer graphics activities are aimed at getting maximum information from the three-dimensional calculations by using the real time manipulation of three-dimensional data on the Silicon Graphics workstation. Work is underway on new algorithms that will permit the display of experimental results that are sparse and random, the same way that the dense and regular computed results are displayed.

  4. Configuration and Management of a Cluster Computing Facility in Undergraduate Student Computer Laboratories

    ERIC Educational Resources Information Center

    Cornforth, David; Atkinson, John; Spennemann, Dirk H. R.

    2006-01-01

    Purpose: Many researchers require access to computer facilities beyond those offered by desktop workstations. Traditionally, these are offered either through partnerships, to share the cost of supercomputing facilities, or through purpose-built cluster facilities. However, funds are not always available to satisfy either of these options, and…

  5. Cane Toad or Computer Mouse? Real and Computer-Simulated Laboratory Exercises in Physiology Classes

    ERIC Educational Resources Information Center

    West, Jan; Veenstra, Anneke

    2012-01-01

    Traditional practical classes in many countries are being rationalised to reduce costs. The challenge for university educators is to provide students with the opportunity to reinforce theoretical concepts by running something other than a traditional practical program. One alternative is to replace wet labs with comparable computer simulations.…

  6. A Computational Methodology for Simulating Thermal Loss Testing of the Advanced Stirling Convertor

    NASA Technical Reports Server (NTRS)

    Reid, Terry V.; Wilson, Scott D.; Schifer, Nicholas A.; Briggs, Maxwell H.

    2012-01-01

    The U.S. Department of Energy (DOE) and Lockheed Martin Space Systems Company (LMSSC) have been developing the Advanced Stirling Radioisotope Generator (ASRG) for use as a power system for space science missions. This generator would use two high-efficiency Advanced Stirling Convertors (ASCs), developed by Sunpower Inc. and NASA Glenn Research Center (GRC). The ASCs convert thermal energy from a radioisotope heat source into electricity. As part of ground testing of these ASCs, different operating conditions are used to simulate expected mission conditions. These conditions require achieving a particular operating frequency, hot end and cold end temperatures, and specified electrical power output for a given net heat input. In an effort to improve net heat input predictions, numerous tasks have been performed which provided a more accurate value for net heat input into the ASCs, including the use of multidimensional numerical models. Validation test hardware has also been used to provide a direct comparison of numerical results and validate the multi-dimensional numerical models used to predict convertor net heat input and efficiency. These validation tests were designed to simulate the temperature profile of an operating Stirling convertor and resulted in a measured net heat input of 244.4 W. The methodology was applied to the multi-dimensional numerical model, which resulted in a net heat input of 240.3 W. The computational methodology resulted in a value of net heat input that was 1.7 percent less than that measured during laboratory testing. The resulting computational methodology and results are discussed.
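
    The quoted 1.7 percent figure is a simple relative difference between the measured and computed net heat input; a two-line Python check using the values from the abstract follows.

        # Quick check of the quoted discrepancy between measured and computed
        # net heat input (illustrative arithmetic only; values from the abstract).
        measured, computed = 244.4, 240.3                 # watts
        print(100 * (measured - computed) / measured)     # ~1.68 -> "1.7 percent less"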

  7. Design and Laboratory Evaluation of Future Elongation and Diameter Measurements at the Advanced Test Reactor

    SciTech Connect

    K. L. Davis; D. L. Knudson; J. L. Rempe; J. C. Crepeau; S. Solstad

    2015-07-01

    New materials are being considered for fuel, cladding, and structures in next generation and existing nuclear reactors. Such materials can undergo significant dimensional and physical changes during high temperature irradiations. In order to accurately predict these changes, real-time data must be obtained under prototypic irradiation conditions for model development and validation. To provide such data, researchers at the Idaho National Laboratory (INL) High Temperature Test Laboratory (HTTL) are developing several instrumented test rigs to obtain real-time data from specimens irradiated in well-controlled pressurized water reactor (PWR) coolant conditions in the Advanced Test Reactor (ATR). This paper reports the status of INL efforts to develop and evaluate prototype test rigs that rely on Linear Variable Differential Transformers (LVDTs) in laboratory settings. Although similar LVDT-based test rigs have been deployed in lower flux Materials Testing Reactors (MTRs), this effort is unique because it relies on robust LVDTs that can withstand higher temperatures and higher fluxes than often found in other MTR irradiations. Specifically, the test rigs are designed for detecting changes in length and diameter of specimens irradiated in ATR PWR loops. Once implemented, these test rigs will provide ATR users with unique capabilities that are sorely needed to obtain measurements such as elongation caused by thermal expansion and/or creep loading and diameter changes associated with fuel and cladding swelling, pellet-clad interaction, and crud buildup.
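
    For readers unfamiliar with LVDTs: the sensor output is commonly treated as ratiometric in the excitation voltage and linear in core displacement over the working range. The Python sketch below shows the conversion from a reading to elongation and strain; the sensitivity and gauge length are illustrative assumptions, not INL calibration values.

        # Minimal sketch: converting a ratiometric LVDT reading into specimen
        # elongation and strain. Sensitivity and gauge length are assumed values.
        SENSITIVITY_MM_PER_VV = 2.5   # mm of core travel per (V_out / V_excitation)

        def elongation_strain(v_out: float, v_exc: float, gauge_length_mm: float):
            """Return (displacement in mm, strain) from a ratiometric reading."""
            displacement = SENSITIVITY_MM_PER_VV * (v_out / v_exc)
            return displacement, displacement / gauge_length_mm

        disp, strain = elongation_strain(v_out=0.012, v_exc=3.0, gauge_length_mm=100.0)
        print(disp, strain)   # 0.01 mm on a 100 mm specimen -> 1e-4 strain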

  8. Apoptosis: A Four-Week Laboratory Investigation for Advanced Molecular and Cellular Biology Students

    PubMed Central

    DiBartolomeis, Susan M.; Moné, James P.

    2003-01-01

    Over the past decade, apoptosis has emerged as an important field of study central to ongoing research in many diverse fields, from developmental biology to cancer research. Apoptosis proceeds by a highly coordinated series of events that includes enzyme activation, DNA fragmentation, and alterations in plasma membrane permeability. The detection of each of these phenotypic changes is accessible to advanced undergraduate cell and molecular biology students. We describe a 4-week laboratory sequence that integrates cell culture, fluorescence microscopy, DNA isolation and analysis, and western blotting (immunoblotting) to follow apoptosis in cultured human cells. Students working in teams chemically induce apoptosis, and harvest, process, and analyze cells, using their data to determine the order of events during apoptosis. We, as instructors, expose the students to an environment closely simulating what they would encounter in an active cell or molecular biology research laboratory by having students coordinate and perform multiple tasks simultaneously and by having them experience experimental design using current literature, data interpretation, and analysis to answer a single question. Students are assessed by examination of laboratory notebooks for completeness of experimental protocols and analysis of results and for completion of an assignment that includes questions pertaining to data interpretation and apoptosis. PMID:14673493

  9. Computer Assisted Fluid Power Instruction: A Comparison of Hands-On and Computer-Simulated Laboratory Experiences for Post-Secondary Students

    ERIC Educational Resources Information Center

    Wilson, Scott B.

    2005-01-01

    The primary purpose of this study was to examine the effectiveness of utilizing a combination of lecture and computer resources to train personnel to assume roles as hydraulic system technicians and specialists in the fluid power industry. This study compared computer simulated laboratory instruction to traditional hands-on laboratory instruction,…

  10. The Advance of Computing from the Ground to the Cloud

    ERIC Educational Resources Information Center

    Breeding, Marshall

    2009-01-01

    A trend toward the abstraction of computing platforms that has been developing in the broader IT arena over the last few years is just beginning to make inroads into the library technology scene. Cloud computing offers for libraries many interesting possibilities that may help reduce technology costs and increase capacity, reliability, and…

  11. Computational Enzyme Design: Advances, hurdles and possible ways forward

    PubMed Central

    Linder, Mats

    2012-01-01

    This mini review addresses recent developments in computational enzyme design. Successful protocols as well as known issues and limitations are discussed from an energetic perspective. It will be argued that improved results can be obtained by including a dynamic treatment in the design protocol. Finally, a molecular dynamics-based approach for evaluating and refining computational designs is presented. PMID:24688650

  12. An Instructional and Research Laboratory for Syndetic Analog-Digital Computation in Science and Engineering Education. Final Report.

    ERIC Educational Resources Information Center

    Jensen, A. P.; And Others

    Reports the results of a feasibility study concerned with the development of a computing laboratory which unified the concepts of analog and digital computing in information and computer science. Contents of the report include a project overview and definition of syndetic computation, the state of the art, some typical student projects, an…

  13. Computer-Assisted Photo Interpretation Research At United States Army Engineer Topographic Laboratories (USAETL)

    NASA Astrophysics Data System (ADS)

    Lukes, George E.

    1981-11-01

    A program in computer-assisted photo interpretation research (CAPIR) has been initiated at the U.S. Army Engineer Topographic Laboratories. In a new laboratory, a photo interpreter (PI) analyzing high-resolution, aerial photography interfaces directly to a digital computer and geographic information system (GIS). A modified analytical plotter enables the PI to transmit encoded three-dimensional spatial data from the stereomodel to the computer. Computer-generated graphics are displayed in the stereomodel for direct feedback of digital spatial data to the PI. Initial CAPIR capabilities include point positioning, mensuration, stereoscopic area search, GIS creation and playback, and elevation data extraction. New capabilities under development include stereo graphic superposition, a digital image workstation, and integration of panoramic Optical Bar Camera photography as a primary GIS data source. This project has been conceived as an evolutionary approach to the digital cartographic feature extraction problem. As a working feature extraction system, the CAPIR laboratory can serve as a testbed for new concepts emerging from image understanding and knowledge-based systems research.

  14. Comparison of nonmesonic hypernuclear decay rates computed in laboratory and center-of-mass coordinates

    SciTech Connect

    De Conti, C.; Barbero, C.; Galeão, A. P.; Krmpotić, F.

    2014-11-11

    In this work we compute the one-nucleon-induced nonmesonic hypernuclear decay rates of ⁵ΛHe, ¹²ΛC and ¹³ΛC using a formalism based on the independent particle shell model in terms of laboratory coordinates. To ascertain the correctness and precision of the method, these results are compared with those obtained using a formalism in terms of center-of-mass coordinates, which has been previously reported in the literature. The formalism in terms of laboratory coordinates will be useful in the shell-model approach to two-nucleon-induced transitions.

  15. Virtual earthquake engineering laboratory with physics-based degrading materials on parallel computers

    NASA Astrophysics Data System (ADS)

    Cho, In Ho

    For the last few decades, we have obtained tremendous insight into the underlying microscopic mechanisms of degrading quasi-brittle materials from persistent and near-saintly efforts in laboratories, and at the same time we have seen unprecedented evolution in computational technology such as massively parallel computers. The time is therefore ripe to embark on a novel approach to settling unanswered questions, especially for the earthquake engineering community, by harmoniously combining microphysical mechanisms with advanced parallel computing technology. At the outset, it should be stressed that we place a great deal of emphasis on preserving the clear meaning and physical counterparts of all the microscopic material models proposed herein, reflecting the belief that the more physical mechanisms we incorporate, the better predictions we can obtain. We begin by reviewing representative microscopic analysis methodologies and select the "fixed-type" multidirectional smeared crack model as the base framework for nonlinear quasi-brittle materials, since it is widely believed to best retain the physical nature of actual cracks. Microscopic stress functions, integrating well-received existing models, are proposed to update the normal stresses on the crack surfaces (three orthogonal surfaces are allowed to initiate herein) under cyclic loading. Unlike the normal stress update, special attention must be paid to the shear stress update on the crack surfaces, due primarily to the well-known pathological nature of the fixed-type smeared crack model: spurious large stress transfer over the open crack under nonproportional loading. In hopes of exploiting a physical mechanism to resolve this deleterious behavior of the fixed crack model, a tribology-inspired three-dimensional (3d) interlocking mechanism is proposed. Following the main trend of tribology (i.e., the science and engineering of interacting surfaces), we introduced the base fabric of solid

  16. Lawrence Livermore National Laboratories Perspective on Code Development and High Performance Computing Resources in Support of the National HED/ICF Effort

    SciTech Connect

    Clouse, C. J.; Edwards, M. J.; McCoy, M. G.; Marinak, M. M.; Verdon, C. P.

    2015-07-07

    Through its Advanced Scientific Computing (ASC) and Inertial Confinement Fusion (ICF) code development efforts, Lawrence Livermore National Laboratory (LLNL) provides a world-leading numerical simulation capability for the National HED/ICF program in support of the Stockpile Stewardship Program (SSP). In addition, the ASC effort provides the high performance computing platform capabilities upon which these codes are run. LLNL remains committed to, and will work with, the national HED/ICF program community to help ensure numerical simulation needs are met and to make those capabilities available, consistent with programmatic priorities and available resources.

  17. Computing environment for the ASSIST data warehouse at Lawrence Livermore National Laboratory

    SciTech Connect

    Shuk, K.

    1995-11-01

    The current computing environment for the ASSIST data warehouse at Lawrence Livermore National Laboratory is that of a central server that is accessed by a terminal or terminal emulator. The initiative to move to a client/server environment is strong, backed by desktop machines becoming more and more powerful. The desktop machines can now take on parts of tasks once run entirely on the central server, making the whole environment computationally more efficient as a result. Services are tasks that are repeated throughout the environment such that it makes sense to share them; tasks such as email, user authentication, and file transfer are services. The new client/server environment needs to determine which services must be included in the environment for basic functionality. These services then unify the computing environment, not only for the forthcoming ASSIST+, but for Administrative Information Systems as a whole, joining various server platforms with heterogeneous desktop computing platforms.

  18. A Study into Advanced Guidance Laws Using Computational Methods

    DTIC Science & Technology

    2011-12-01

    The available record text is fragmentary, consisting of a MATLAB routine header ("% computing aerodynamic forces and moments. Except where noted, all dimensions in MKS system. % Inputs...") and reference-list entries, including [9] R. L. Shaw, Fighter Combat: Tactics and Maneuvering. Annapolis, MD: Naval Institute Press, 1988; and [10] U. S. Shukla and P. R. Mahapatra

  19. RECENT ADVANCES IN COMPUTATIONAL MECHANICS FOR CIVIL ENGINEERING

    NASA Astrophysics Data System (ADS)

    Applied Mechanics Committee, Computational Mechanics Subcommittee,

    To clarify mechanical phenomena in civil engineering, computational theory and techniques must be improved with the particularities of the objects being analyzed in mind, and computational mechanics must be kept up to date with a focus on practical use. Beyond the analysis of infrastructure, damage prediction for natural disasters such as earthquakes, tsunamis, and floods must reflect the broad ranges of space and time scales inherent to civil engineering, as well as material properties, so it is important to develop new computational methods suited to the particularities of the field. In this context, this paper reviews research trends in computational mechanics that are noteworthy for resolving complex mechanics problems in civil engineering.

  20. When Learning about the Real World is Better Done Virtually: A Study of Substituting Computer Simulations for Laboratory Equipment

    ERIC Educational Resources Information Center

    Finkelstein, N. D.; Adams, W. K.; Keller, C. J.; Kohl, P. B.; Perkins, K. K.; Podolefsky, N. S.; Reid, S.; LeMaster, R.

    2005-01-01

    This paper examines the effects of substituting a computer simulation for real laboratory equipment in the second semester of a large-scale introductory physics course. The direct current circuit laboratory was modified to compare the effects of using computer simulations with the effects of using real light bulbs, meters, and wires. Two groups of…

  1. Advances in Domain Mapping of Massively Parallel Scientific Computations

    SciTech Connect

    Leland, Robert W.; Hendrickson, Bruce A.

    2015-10-01

    One of the most important concerns in parallel computing is the proper distribution of workload across processors. For most scientific applications on massively parallel machines, the best approach to this distribution is to employ data parallelism; that is, to break the data structures supporting a computation into pieces and then to assign those pieces to different processors. Collectively, these partitioning and assignment tasks comprise the domain mapping problem.
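
    One of the simplest domain-mapping heuristics in this family is recursive coordinate bisection: repeatedly split the data along its longest coordinate axis until there is one piece per processor. The Python sketch below is illustrative only; production partitioners (such as the authors' Chaco library) also account for graph connectivity, communication volume, and much more.

        # Minimal sketch: recursive coordinate bisection (RCB) for domain mapping.
        # Splits an (N, d) point set along its longest axis, recursing until the
        # requested number of pieces is reached. Illustrative, not production code.
        import numpy as np

        def rcb(points: np.ndarray, n_parts: int) -> list:
            """Partition an (N, d) array of coordinates into n_parts pieces."""
            if n_parts == 1:
                return [points]
            axis = np.argmax(points.max(axis=0) - points.min(axis=0))  # longest extent
            order = np.argsort(points[:, axis])
            half = n_parts // 2
            cut = len(points) * half // n_parts     # balance point counts
            left, right = points[order[:cut]], points[order[cut:]]
            return rcb(left, half) + rcb(right, n_parts - half)

        parts = rcb(np.random.rand(1000, 3), 8)
        print([len(p) for p in parts])   # ~125 points per processor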

  2. 2015 Final Reports from the Los Alamos National Laboratory Computational Physics Student Summer Workshop

    SciTech Connect

    Runnels, Scott Robert; Caldwell, Wendy; Brown, Barton Jed; Pederson, Clark; Brown, Justin; Burrill, Daniel; Feinblum, David; Hyde, David; Levick, Nathan; Lyngaas, Isaac; Maeng, Brad; Reed, Richard LeRoy; Sarno-Smith, Lois; Shohet, Gil; Skarda, Jinhie; Stevens, Josey; Zeppetello, Lucas; Grossman-Ponemon, Benjamin; Bottini, Joseph Larkin; Loudon, Tyson Shane; VanGessel, Francis Gilbert; Nagaraj, Sriram; Price, Jacob

    2015-10-15

    The two primary purposes of LANL's Computational Physics Student Summer Workshop are (1) to educate graduate and exceptional undergraduate students in the challenges and applications of computational physics of interest to LANL, and (2) to entice their interest toward those challenges. Computational physics is emerging as a discipline in its own right, combining expertise in mathematics, physics, and computer science. The mathematical aspects focus on numerical methods for solving equations on the computer as well as developing test problems with analytical solutions. The physics aspects are very broad, ranging from low-temperature material modeling to extremely high temperature plasma physics, radiation transport and neutron transport. The computer science issues are concerned with matching numerical algorithms to emerging architectures and maintaining the quality of extremely large codes built to perform multi-physics calculations. Although graduate programs associated with computational physics are emerging, it is apparent that the pool of U.S. citizens in this multi-disciplinary field is relatively small and is typically not focused on the aspects that are of primary interest to LANL. Furthermore, more structured foundations for LANL interaction with universities in computational physics are needed; historically, interactions have relied heavily on individuals' personalities and personal contacts. Thus a tertiary purpose of the Summer Workshop is to build an educational network of LANL researchers, university professors, and emerging students to advance the field and LANL's involvement in it. This report includes both the background for the program and the reports from the students.

  3. 2016 Final Reports from the Los Alamos National Laboratory Computational Physics Student Summer Workshop

    SciTech Connect

    Runnels, Scott Robert; Bachrach, Harrison Ian; Carlson, Nils; Collier, Angela; Dumas, William; Fankell, Douglas; Ferris, Natalie; Gonzalez, Francisco; Griffith, Alec; Guston, Brandon; Kenyon, Connor; Li, Benson; Mookerjee, Adaleena; Parkinson, Christian; Peck, Hailee; Peters, Evan; Poondla, Yasvanth; Rogers, Brandon; Shaffer, Nathaniel; Trettel, Andrew; Valaitis, Sonata Mae; Venzke, Joel Aaron; Black, Mason; Demircan, Samet; Holladay, Robert Tyler

    2016-09-22

    The two primary purposes of LANL's Computational Physics Student Summer Workshop are (1) to educate graduate and exceptional undergraduate students in the challenges and applications of computational physics of interest to LANL, and (2) to entice their interest toward those challenges. Computational physics is emerging as a discipline in its own right, combining expertise in mathematics, physics, and computer science. The mathematical aspects focus on numerical methods for solving equations on the computer as well as developing test problems with analytical solutions. The physics aspects are very broad, ranging from low-temperature material modeling to extremely high temperature plasma physics, radiation transport and neutron transport. The computer science issues are concerned with matching numerical algorithms to emerging architectures and maintaining the quality of extremely large codes built to perform multi-physics calculations. Although graduate programs associated with computational physics are emerging, it is apparent that the pool of U.S. citizens in this multi-disciplinary field is relatively small and is typically not focused on the aspects that are of primary interest to LANL. Furthermore, more structured foundations for LANL interaction with universities in computational physics are needed; historically, interactions have relied heavily on individuals' personalities and personal contacts. Thus a tertiary purpose of the Summer Workshop is to build an educational network of LANL researchers, university professors, and emerging students to advance the field and LANL's involvement in it.

  4. Advances in computational design and analysis of airbreathing propulsion systems

    NASA Technical Reports Server (NTRS)

    Klineberg, John M.

    1989-01-01

    The development of commercial and military aircraft depends, to a large extent, on engine manufacturers being able to achieve significant increases in propulsion capability through improved component aerodynamics, materials, and structures. The recent history of propulsion has been marked by efforts to develop computational techniques that can speed up the propulsion design process and produce superior designs. The availability of powerful supercomputers, such as the NASA Numerical Aerodynamic Simulator, and the potential for even higher performance offered by parallel computer architectures, have opened the door to the use of multi-dimensional simulations to study complex physical phenomena in propulsion systems that have previously defied analysis or experimental observation. An overview of several NASA Lewis research efforts is provided that are contributing toward the long-range goal of a numerical test-cell for the integrated, multidisciplinary design, analysis, and optimization of propulsion systems. Specific examples in Internal Computational Fluid Mechanics, Computational Structural Mechanics, Computational Materials Science, and High Performance Computing are cited and described in terms of current capabilities, technical challenges, and future research directions.

  5. 3D chemical imaging in the laboratory by hyperspectral X-ray computed tomography

    PubMed Central

    Egan, C. K.; Jacques, S. D. M.; Wilson, M. D.; Veale, M. C.; Seller, P.; Beale, A. M.; Pattrick, R. A. D.; Withers, P. J.; Cernik, R. J.

    2015-01-01

    We report the development of laboratory based hyperspectral X-ray computed tomography which allows the internal elemental chemistry of an object to be reconstructed and visualised in three dimensions. The method employs a spectroscopic X-ray imaging detector with sufficient energy resolution to distinguish individual elemental absorption edges. Elemental distributions can then be made by K-edge subtraction, or alternatively by voxel-wise spectral fitting to give relative atomic concentrations. We demonstrate its application to two material systems: studying the distribution of catalyst material on porous substrates for industrial scale chemical processing; and mapping of minerals and inclusion phases inside a mineralised ore sample. The method makes use of a standard laboratory X-ray source with measurement times similar to that required for conventional computed tomography. PMID:26514938
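
    K-edge subtraction, as used here, exploits the jump in attenuation at an element's absorption edge: subtracting a reconstruction built from energy bins just below the edge from one just above it leaves contrast dominated by that element. A minimal Python sketch follows; the array shapes, the iodine K-edge value, and the energy windowing are illustrative assumptions, not the authors' pipeline.

        # Minimal sketch: K-edge subtraction on hyperspectral CT reconstructions.
        # All names, shapes, and the edge energy are illustrative assumptions.
        import numpy as np

        def k_edge_map(volume: np.ndarray, energies: np.ndarray,
                       edge_kev: float, window_kev: float = 1.0) -> np.ndarray:
            """volume: (nx, ny, nz, n_energy) attenuation values per energy bin."""
            below = (energies >= edge_kev - window_kev) & (energies < edge_kev)
            above = (energies > edge_kev) & (energies <= edge_kev + window_kev)
            # Mean attenuation just above minus just below the edge
            return volume[..., above].mean(axis=-1) - volume[..., below].mean(axis=-1)

        energies = np.arange(5.0, 40.0, 0.5)                 # detector energy bins, keV
        volume = np.random.rand(32, 32, 32, energies.size)   # stand-in reconstruction
        iodine = k_edge_map(volume, energies, edge_kev=33.2) # iodine K-edge ~33.2 keV
        print(iodine.shape)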

  6. 3D chemical imaging in the laboratory by hyperspectral X-ray computed tomography

    NASA Astrophysics Data System (ADS)

    Egan, C. K.; Jacques, S. D. M.; Wilson, M. D.; Veale, M. C.; Seller, P.; Beale, A. M.; Pattrick, R. A. D.; Withers, P. J.; Cernik, R. J.

    2015-10-01

    We report the development of laboratory based hyperspectral X-ray computed tomography which allows the internal elemental chemistry of an object to be reconstructed and visualised in three dimensions. The method employs a spectroscopic X-ray imaging detector with sufficient energy resolution to distinguish individual elemental absorption edges. Elemental distributions can then be made by K-edge subtraction, or alternatively by voxel-wise spectral fitting to give relative atomic concentrations. We demonstrate its application to two material systems: studying the distribution of catalyst material on porous substrates for industrial scale chemical processing; and mapping of minerals and inclusion phases inside a mineralised ore sample. The method makes use of a standard laboratory X-ray source with measurement times similar to that required for conventional computed tomography.

  7. Strengthening LLNL Missions through Laboratory Directed Research and Development in High Performance Computing

    SciTech Connect

    Willis, D. K.

    2016-12-01

    High performance computing (HPC) has been a defining strength of Lawrence Livermore National Laboratory (LLNL) since its founding. Livermore scientists have designed and used some of the world’s most powerful computers to drive breakthroughs in nearly every mission area. Today, the Laboratory is recognized as a world leader in the application of HPC to complex science, technology, and engineering challenges. Most importantly, HPC has been integral to the National Nuclear Security Administration’s (NNSA’s) Stockpile Stewardship Program—designed to ensure the safety, security, and reliability of our nuclear deterrent without nuclear testing. A critical factor behind Lawrence Livermore’s preeminence in HPC is the ongoing investments made by the Laboratory Directed Research and Development (LDRD) Program in cutting-edge concepts to enable efficient utilization of these powerful machines. Congress established the LDRD Program in 1991 to maintain the technical vitality of the Department of Energy (DOE) national laboratories. Since then, LDRD has been, and continues to be, an essential tool for exploring anticipated needs that lie beyond the planning horizon of our programs and for attracting the next generation of talented visionaries. Through LDRD, Livermore researchers can examine future challenges, propose and explore innovative solutions, and deliver creative approaches to support our missions. The present scientific and technical strengths of the Laboratory are, in large part, a product of past LDRD investments in HPC. Here, we provide seven examples of LDRD projects from the past decade that have played a critical role in building LLNL’s HPC, computer science, mathematics, and data science research capabilities, and describe how they have impacted LLNL’s mission.

  8. Advances in Engine Test Capabilities at the NASA Glenn Research Center's Propulsion Systems Laboratory

    NASA Technical Reports Server (NTRS)

    Pachlhofer, Peter M.; Panek, Joseph W.; Dicki, Dennis J.; Piendl, Barry R.; Lizanich, Paul J.; Klann, Gary A.

    2006-01-01

    The Propulsion Systems Laboratory at the National Aeronautics and Space Administration (NASA) Glenn Research Center is one of the premier U.S. facilities for research on advanced aeropropulsion systems. The facility can simulate a wide range of altitude and Mach number conditions while supplying the aeropropulsion system with all the support services necessary to operate at those conditions. Test data are recorded on a combination of steady-state and high-speed data-acquisition systems. Recently a number of upgrades were made to the facility to meet demanding new requirements for the latest aeropropulsion concepts and to improve operational efficiency. Improvements were made to data-acquisition systems, facility and engine-control systems, test-condition simulation systems, video capture and display capabilities, and personnel training procedures. This paper discusses the facility's capabilities, recent upgrades, and planned future improvements.

  9. Temperature monitoring options available at the Idaho national laboratory advanced test reactor

    NASA Astrophysics Data System (ADS)

    Daw, J. E.; Rempe, J. L.; Knudson, D. L.; Unruh, T. C.; Chase, B. M.; Davis, K. L.; Palmer, A. J.

    2013-09-01

    As part of the Advanced Test Reactor National Scientific User Facility (ATR-NSUF) program, the Idaho National Laboratory (INL) has developed in-house capabilities to fabricate, test, and qualify new and enhanced temperature sensors for irradiation testing. Clearly, temperature sensor selection for irradiation tests will be determined based on the irradiation environment and budget. However, temperature sensors now offered by INL include a wide array of melt wires in small capsules, silicon carbide monitors, commercially available thermocouples, and specialized high temperature irradiation resistant thermocouples containing doped molybdenum and niobium alloy thermoelements. In addition, efforts have been initiated to develop and evaluate ultrasonic thermometers for irradiation testing. This array of temperature monitoring options now available to ATR and other Material and Test Reactor (MTR) users fulfills recent customer requests.

  10. Experiments in advanced control concepts for space robotics - An overview of the Stanford Aerospace Robotics Laboratory

    NASA Technical Reports Server (NTRS)

    Hollars, M. G.; Cannon, R. H., Jr.; Alexander, H. L.; Morse, D. F.

    1987-01-01

    The Stanford University Aerospace Robotics Laboratory is actively developing and experimentally testing advanced robot control strategies for space robotic applications. Early experiments focused on control of very lightweight one-link manipulators and other flexible structures. The results are being extended to position and force control of mini-manipulators attached to flexible manipulators and multilink manipulators with flexible drive trains. Experimental results show that end-point sensing and careful dynamic modeling or adaptive control are key to the success of these control strategies. Free-flying space robot simulators that operate on an air cushion table have been built to test control strategies in which the dynamics of the base of the robot and the payload are important.

  11. Hawaiian Electric Advanced Inverter Grid Support Function Laboratory Validation and Analysis

    SciTech Connect

    Nelson, Austin; Nagarajan, Adarsh; Prabakar, Kumar; Gevorgian, Vahan; Lundstrom, Blake; Nepal, Shaili; Hoke, Anderson; Asano, Marc; Ueda, Reid; Shindo, Jon; Kubojiri, Kandice; Ceria, Riley; Ifuku, Earle

    2016-12-01

    The objective for this test plan was to better understand how to utilize the performance capabilities of advanced inverter functions to allow the interconnection of distributed energy resource (DER) systems to support the new Customer Self-Supply, Customer Grid-Supply, and other future DER programs. The purpose of this project was: 1) to characterize how the tested grid supportive inverters performed the functions of interest, 2) to evaluate the grid supportive inverters in an environment that emulates the dynamics of O'ahu's electrical distribution system, and 3) to gain insight into the benefits of the grid support functions on selected O'ahu island distribution feeders. These goals were achieved through laboratory testing of photovoltaic inverters, including power hardware-in-the-loop testing.
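
    As an example of the kind of grid support function exercised in such testing, a volt-var droop curve commands reactive power as a piecewise-linear function of terminal voltage, with a deadband around nominal. The Python sketch below uses illustrative breakpoints, not the Hawaiian Electric settings evaluated in the report.

        # Minimal sketch: a volt-var droop curve, one widely used grid support
        # function. Breakpoints are illustrative, not utility settings.
        import numpy as np

        # (voltage in per-unit, reactive power as a fraction of rated VA)
        V_PTS = [0.90, 0.95, 1.00, 1.05, 1.10]
        Q_PTS = [0.44, 0.00, 0.00, 0.00, -0.44]  # inject below 0.95 pu, absorb above 1.05

        def volt_var(v_pu: float) -> float:
            """Reactive-power command (pu) for a measured terminal voltage (pu)."""
            return float(np.interp(v_pu, V_PTS, Q_PTS))

        for v in (0.92, 1.00, 1.08):
            print(v, volt_var(v))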

  12. The Advanced Light Source at the Lawrence Berkeley Laboratory (ALS, LBL)

    SciTech Connect

    Jackson, A.

    1990-08-01

    The Advanced Light Source (ALS), a national facility currently under construction at the Lawrence Berkeley Laboratory (LBL), is a third-generation synchrotron light source designed to produce extremely bright beams of synchrotron radiation, in the energy range from a few eV to 10 keV. The design is based on a 1-1.9 GeV electron storage ring (optimized at 1.5 GeV), and utilizes special magnets, known as undulators and wigglers (collectively referred to as insertion devices), to generate the radiation. In this paper we describe the main accelerator components of the ALS, the variety of insertion devices, the radiation spectra expected from these devices, and the complement of experiments that have been approved for initial operation, starting in April 1993.
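
    The photon energies quoted for the ALS follow from the standard on-axis undulator equation, λ = (λ_u / 2γ²)(1 + K²/2). The Python sketch below works the equation at the 1.5 GeV design energy; the 5 cm period and K = 1 are assumed for illustration and are not specific ALS device parameters.

        # Worked example of the on-axis undulator equation for a 1.5 GeV beam.
        # Period (5 cm) and deflection parameter K = 1 are assumed values.
        ELECTRON_REST_MEV = 0.511

        def undulator_wavelength(e_gev: float, period_m: float, K: float) -> float:
            gamma = e_gev * 1e3 / ELECTRON_REST_MEV        # Lorentz factor, ~2935
            return period_m / (2 * gamma**2) * (1 + K**2 / 2)

        lam = undulator_wavelength(1.5, 0.05, 1.0)
        print(lam)                # ~4.4e-9 m (soft X-ray)
        print(1239.84e-9 / lam)   # photon energy in eV, ~285 eV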

  13. Development of the Digital Engineering Laboratory Computer Network: Host-to-Node/Host-to-Host Protocols.

    DTIC Science & Technology

    1981-12-01

    The available record text is fragmentary front matter: thesis AFIT/GCS/EE/81D-8, "Development of the Digital Engineering Laboratory Computer Network: Host-to-Node/Host-to-Host Protocols," by Capt John W. Geist, USAF, presented to the Faculty of the School of Engineering, Air Force Institute of Technology (approved for public release; distribution unlimited), followed by acknowledgments thanking thesis advisor Dr. Gary B. Lamont for his valued support and

  14. An On-Line Tutorial for the Administrative Sciences Personal Computer Laboratory.

    DTIC Science & Technology

    1987-09-01

    Sciences Personal Computer Laboratory, by Karen M. Overall, Lieutenant, United States Navy; B.S., Eastern New Mexico University, 1979. Submitted in partial... PC Storyboard is a software package that generates automated presentations on an IBM PC or compatible. It allows creation of screen displays of text, figures, charts, or graphics, then lets you organize them into stories for presentation with a wide variety of special effects. PC Storyboard consists

  15. OPENING REMARKS: SciDAC: Scientific Discovery through Advanced Computing

    NASA Astrophysics Data System (ADS)

    Strayer, Michael

    2005-01-01

    Good morning. Welcome to SciDAC 2005 and San Francisco. SciDAC is all about computational science and scientific discovery. In a large sense, computational science characterizes SciDAC and its intent is change. It transforms both our approach and our understanding of science. It opens new doors and crosses traditional boundaries while seeking discovery. In terms of twentieth century methodologies, computational science may be said to be transformational. There are a number of examples to this point. First are the sciences that encompass climate modeling. The application of computational science has in essence created the field of climate modeling. This community is now international in scope and has provided precision results that are challenging our understanding of our environment. A second example is that of lattice quantum chromodynamics. Lattice QCD, while adding precision and insight to our fundamental understanding of strong interaction dynamics, has transformed our approach to particle and nuclear science. The individual investigator approach has evolved to teams of scientists from different disciplines working side-by-side towards a common goal. SciDAC is also undergoing a transformation. This meeting is a prime example. Last year it was a small programmatic meeting tracking progress in SciDAC. This year, we have a major computational science meeting with a variety of disciplines and enabling technologies represented. SciDAC 2005 should position itself as a new cornerstone for Computational Science and its impact on science. As we look to the immediate future, FY2006 will bring a new cycle to SciDAC: most of its program elements will be re-competed. The re-competition will involve new instruments for computational science, new approaches for collaboration, as well as new disciplines. There will be new opportunities for virtual experiments in carbon sequestration, fusion, and nuclear power and nuclear waste, as well as collaborations

  16. The CT Scanner Facility at Stellenbosch University: An open access X-ray computed tomography laboratory

    NASA Astrophysics Data System (ADS)

    du Plessis, Anton; le Roux, Stephan Gerhard; Guelpa, Anina

    2016-10-01

    The Stellenbosch University CT Scanner Facility is an open access laboratory providing non-destructive X-ray computed tomography (CT) and high-performance image analysis services as part of the Central Analytical Facilities (CAF) of the university. Based in Stellenbosch, South Africa, this facility offers open access to the general user community, including local researchers, companies and also remote users (both local and international, via sample shipment and data transfer). The laboratory hosts two CT instruments: a micro-CT system and a nano-CT system. A workstation-based Image Analysis Centre is equipped with numerous computers with data analysis software packages, which are at the disposal of facility users, along with expert supervision, if required. All research disciplines are accommodated at the X-ray CT laboratory, provided that non-destructive analysis will be beneficial. During its first four years, the facility has accommodated more than 400 unique users (33 in 2012; 86 in 2013; 154 in 2014; 140 in 2015; 75 in the first half of 2016), with diverse industrial and research applications using X-ray CT as a means. This paper summarises the laboratory's first four years by way of selected examples, both from published and unpublished projects. In the process a detailed description of the capabilities and facilities available to users is presented.

  17. Computational Protein Engineering: Bridging the Gap between Rational Design and Laboratory Evolution

    PubMed Central

    Barrozo, Alexandre; Borstnar, Rok; Marloie, Gaël; Kamerlin, Shina Caroline Lynn

    2012-01-01

    Enzymes are tremendously proficient catalysts that can be harnessed extracellularly for a whole host of processes, from chemical synthesis to the generation of novel biofuels. For them to be more amenable to the needs of biotechnology, however, it is often necessary to be able to manipulate their physico-chemical properties in an efficient and streamlined manner and, ideally, to be able to train them to catalyze completely new reactions. Recent years have seen an explosion of interest in different approaches to achieve this, both in the laboratory and in silico. There remains, however, a gap between current approaches to computational enzyme design, which have primarily focused on the early stages of the design process, and laboratory evolution, which is an extremely powerful tool for enzyme redesign but will always be limited by the vastness of sequence space combined with the low frequency of desirable mutations. This review discusses different approaches towards computational enzyme design and demonstrates how combining newly developed screening approaches that can rapidly predict potential mutation “hotspots” with approaches that can quantitatively and reliably dissect the catalytic step can bridge the gap that currently exists between computational enzyme design and laboratory evolution studies. PMID:23202907

  18. Nonlinear dynamics of high-power ultrashort laser pulses: exaflop computations on a laboratory computer station and subcycle light bullets

    NASA Astrophysics Data System (ADS)

    Voronin, A. A.; Zheltikov, A. M.

    2016-09-01

    The propagation of high-power ultrashort light pulses involves intricate nonlinear spatio-temporal dynamics in which various spectral-temporal field transformation effects are strongly coupled to the beam dynamics, which, in turn, varies from the leading to the trailing edge of the pulse. Analysis of this nonlinear dynamics, accompanied by spatial instabilities, beam breakup into multiple filaments, and unique phenomena leading to the generation of extremely short optical field waveforms, is equivalent in its computational complexity to a simulation of the time evolution of a few-billion-dimensional physical system. Such an analysis requires exaflop-scale computation and is usually performed on high-performance supercomputers. Here, we present methods of physical modeling and numerical analysis that allow problems of this class to be solved on a laboratory computer boosted by a cluster of graphics accelerators. Exaflop computations performed with these methods reveal new phenomena in the spatio-temporal dynamics of high-power ultrashort laser pulses. We demonstrate that unprecedentedly short light bullets can be generated as part of this dynamics, providing optical field localization in both space and time through a delicate balance between dispersion and nonlinearity, with simultaneous suppression of diffraction-induced beam divergence due to the joint effect of Kerr and ionization nonlinearities.
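
    A standard numerical backbone for this kind of pulse propagation is the split-step Fourier method, alternating linear (dispersive) sub-steps in the frequency domain with nonlinear (Kerr) sub-steps in the time domain. The abstract does not give the authors' scheme, so the following is only a minimal 1-D sketch with toy parameters, far from the paper's (3+1)-D GPU-accelerated model:

```python
import numpy as np

# Minimal 1-D split-step Fourier solver for the toy nonlinear Schrodinger
# equation dA/dz = -i*(beta2/2)*d2A/dt2 + i*gamma*|A|^2*A.  All parameters
# are illustrative, not values from the paper.
nt, T = 2048, 40.0
t = np.linspace(-T / 2, T / 2, nt, endpoint=False)
w = 2 * np.pi * np.fft.fftfreq(nt, d=T / nt)            # angular frequencies
beta2, gamma, dz, nz = -1.0, 1.0, 1e-3, 2000            # anomalous dispersion, Kerr

A = np.cosh(t) ** -1                                    # fundamental sech pulse
half_disp = np.exp(1j * (beta2 / 2) * w**2 * (dz / 2))  # half dispersion step

for _ in range(nz):
    A = np.fft.ifft(half_disp * np.fft.fft(A))          # linear half-step
    A *= np.exp(1j * gamma * np.abs(A) ** 2 * dz)       # nonlinear full step
    A = np.fft.ifft(half_disp * np.fft.fft(A))          # linear half-step

print("peak intensity after propagation:", np.abs(A).max() ** 2)
```

    The real problem couples this temporal dynamics to two transverse dimensions, ionization, and higher-order dispersion, which is what pushes the workload to exaflop scale and onto GPU clusters.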

  19. Recent Advances in Laboratory Infrared Spectroscopy of Polycyclic Aromatic Hydrocarbons: PAHs in the Far Infrared

    NASA Technical Reports Server (NTRS)

    Mattioda, Andrew L.; Ricca, Alessandra; Tucker, Jonathan; Boersma, Christiaan; Bauschlicher, Charles, Jr.; Allamandola, Louis J.

    2010-01-01

    Over 25 years of observations and laboratory work have shown that the mid-IR spectra of a majority of astronomical sources are dominated by emission features near 3.3, 6.2, 7.7, and 11.2 microns, which originate in free polycyclic aromatic hydrocarbon (PAH) molecules. PAHs dominate the mid-IR emission from many galactic and extragalactic objects. As such, this material tracks a wide variety of astronomical processes, making this spectrum a powerful probe of the cosmos. Apart from bands in the mid-IR, PAHs have bands spanning the far-IR (FIR), and emission from these FIR features should be present in astronomical sources showing the mid-IR PAH bands. However, with one exception, the FIR spectral characteristics are known only for a few neutral small PAHs trapped in salt pellets or oils at room temperature, data that are not relevant to astrophysics. Furthermore, since most emitting PAHs responsible for the mid-IR astronomical features are ionized, the absence of any experimental or theoretical PAH ion FIR spectra will make it impossible to correctly interpret the FIR data from these objects. In view of the upcoming Herschel space telescope mission and SOFIA's FIR airborne instrumentation, which will pioneer the FIR region, it is now urgent to obtain PAH FIR spectra. This talk will present an overview of recent advances in the laboratory spectroscopy of PAHs, highlighting the FIR spectroscopy along with some quantum calculations.

  20. RECENT ADVANCES IN HIGH TEMPERATURE ELECTROLYSIS AT IDAHO NATIONAL LABORATORY: STACK TESTS

    SciTech Connect

    X. Zhang; J. E. O'Brien; R. C. O'Brien; J. J. Hartvigsen; G. Tao; N. Petigny

    2012-07-01

    High-temperature steam electrolysis is a promising technology for efficient, sustainable, large-scale hydrogen production. Solid oxide electrolysis cells (SOECs) are able to utilize high-temperature heat and electric power from advanced high-temperature nuclear reactors or renewable sources to generate carbon-free hydrogen at large scale. However, the long-term durability of SOECs needs to be improved significantly before this technology can be commercialized. A degradation rate of 1%/khr or lower has been proposed as a threshold value for commercialization. Solid oxide electrolysis stack tests have been conducted at Idaho National Laboratory to demonstrate recent improvements in the long-term durability of SOECs. Electrolyte-supported and electrode-supported SOEC stacks were provided for these tests by Ceramatec Inc., Materials and Systems Research Inc. (MSRI), and Saint Gobain Advanced Materials (St. Gobain). Long-term durability tests were generally operated for a duration of 1000 hours or more. Stack tests based on technology developed at Ceramatec and MSRI have shown significant improvement in durability in the electrolysis mode. Long-term degradation rates of 3.2%/khr and 4.6%/khr were observed for the MSRI and Ceramatec stacks, respectively. One recent Ceramatec stack even showed negative degradation (performance improvement) over 1900 hours of operation. A three-cell short stack provided by St. Gobain, however, showed rapid degradation in the electrolysis mode. Improvements in electrode materials, interconnect coatings, and electrolyte-electrode interface microstructures contribute to the better durability of SOEC stacks.
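
    The %/khr figures quoted above follow the usual convention of normalizing the relative change in cell voltage to 1000 hours of operation (in electrolysis mode, degradation appears as a voltage increase). A minimal helper, with invented example numbers since the abstract does not give raw voltages, makes the arithmetic explicit:

```python
# Degradation rate in percent per 1000 h, using one common convention:
# percent change in cell voltage relative to the initial value,
# normalized to 1000 hours.  Example values are illustrative only.
def degradation_rate_pct_per_khr(v_start, v_end, hours):
    """Return the degradation rate in percent per 1000 h of operation."""
    return (v_end - v_start) / v_start * 100.0 / (hours / 1000.0)

# A stack whose mean cell voltage rises from 1.29 V to 1.33 V over
# 1000 h of electrolysis degrades at roughly 3.1 %/khr.
print(degradation_rate_pct_per_khr(1.29, 1.33, 1000.0))
```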

  1. International Society for Advancement of Cytometry (ISAC) flow cytometry shared resource laboratory (SRL) best practices.

    PubMed

    Barsky, Lora W; Black, Michele; Cochran, Matthew; Daniel, Benjamin J; Davies, Derek; DeLay, Monica; Gardner, Rui; Gregory, Michael; Kunkel, Desiree; Lannigan, Joanne; Marvin, James; Salomon, Robert; Torres, Carina; Walker, Rachael

    2016-11-01

    The purpose of this document is to define minimal standards for a flow cytometry shared resource laboratory (SRL) and provide guidance for best practices in several important areas. This effort is driven by the desire of International Society for Advancement of Cytometry (ISAC) members working in SRLs to define and maintain standards of excellence in flow cytometry, and to act as a repository for key elements of this information (e.g., example SOPs and training material). These best practices are not intended to define specifically how to implement these recommendations, but rather to establish minimal goals for an SRL to address in order to achieve excellence. It is hoped that once these best practices are established and implemented, they will serve as a template from which similar practices can be defined for other types of SRLs. The need for best practices was first identified through discussions at the CYTO 2013 SRL Forum, and the most important areas for which best practices should be defined were identified through several surveys and SRL track workshops as part of CYTO 2014. © 2016 International Society for Advancement of Cytometry.

  2. Advanced Simulation and Computing Co-Design Strategy

    SciTech Connect

    Ang, James A.; Hoang, Thuc T.; Kelly, Suzanne M.; McPherson, Allen; Neely, Rob

    2015-11-01

    This ASC Co-design Strategy lays out the full continuum and components of the co-design process, based on what we have experienced thus far and what we wish to do more of in the future to meet the program’s mission of providing high performance computing (HPC) and simulation capabilities for NNSA to carry out its stockpile stewardship responsibility.

  3. Connecting Performance Analysis and Visualization to Advance Extreme Scale Computing

    SciTech Connect

    Bremer, Peer-Timo; Mohr, Bernd; Schulz, Martin; Pascucci, Valerio; Gamblin, Todd; Brunst, Holger

    2015-07-29

    The characterization, modeling, analysis, and tuning of software performance has been a central topic in High Performance Computing (HPC) since its early beginnings. The overall goal is to make HPC software run faster on particular hardware, either through better scheduling, on-node resource utilization, or more efficient distributed communication.

  4. Advanced Computer Image Generation Techniques Exploiting Perceptual Characteristics. Final Report.

    ERIC Educational Resources Information Center

    Stenger, Anthony J.; And Others

    This study suggests and identifies computer image generation (CIG) algorithms for visual simulation that improve the training effectiveness of CIG simulators and identifies areas of basic research in visual perception that are significant for improving CIG technology. The first phase of the project entailed observing three existing CIG simulators.…

  5. MAX - An advanced parallel computer for space applications

    NASA Technical Reports Server (NTRS)

    Lewis, Blair F.; Bunker, Robert L.

    1991-01-01

    MAX is a fault-tolerant multicomputer hardware and software architecture designed to meet the needs of NASA spacecraft systems. It consists of conventional computing modules (computers) connected via a dual network topology. One network is used to transfer data among the computers and between computers and I/O devices. This network's topology is arbitrary. The second network operates as a broadcast medium for operating system synchronization messages and supports the operating system's Byzantine resilience. A fully distributed operating system supports multitasking in an asynchronous, event- and data-driven environment. A large-grain dataflow paradigm is used to coordinate the multitasking and provide easy control of concurrency. It is the basis of the system's fault tolerance and allows both static and dynamic allocation of tasks. Redundant execution of tasks with software voting of results may be specified for critical tasks. The dataflow paradigm also supports simplified software design, testing, and maintenance. A unique feature is a method for reliably patching code in an executing dataflow application.
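
    The "redundant execution of tasks with software voting" idea can be sketched in a few lines. The fragment below is a hypothetical Python illustration (MAX itself used neither Python nor these names); it runs a critical task several times and accepts a result only when a strict majority of the copies agree:

```python
from collections import Counter
from concurrent.futures import ThreadPoolExecutor

# Hypothetical sketch of redundant execution with software voting: run a
# critical task N times and accept the majority result.  The task and all
# names here are illustrative only.
def voted(task, inputs, copies=3):
    with ThreadPoolExecutor(max_workers=copies) as pool:
        results = list(pool.map(lambda _: task(inputs), range(copies)))
    winner, votes = Counter(results).most_common(1)[0]
    if votes <= copies // 2:
        raise RuntimeError("no majority: possible Byzantine fault")
    return winner

print(voted(lambda xs: sum(xs), (1, 2, 3)))   # -> 6
```

    In a real Byzantine-resilient system the copies would run on separate computers and exchange votes over the broadcast network; a thread pool only mimics the control flow.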

  6. Advanced Computational Aeroacoustics Methods for Fan Noise Prediction

    NASA Technical Reports Server (NTRS)

    Envia, Edmane (Technical Monitor); Tam, Christopher

    2003-01-01

    Direct computation of fan noise is presently not possible. One of the major difficulties is the geometrical complexity of the problem. In the case of fan noise, the blade geometry is critical to the loading on the blade and hence the intensity of the radiated noise. The precise geometry must be incorporated into the computation. In computational fluid dynamics (CFD), there are two general ways to handle problems with complex geometry. One way is to use unstructured grids. The other is to use body-fitted overset grids. In the overset grid method, accurate data transfer is of utmost importance. For acoustic computation, it is not clear that the currently used data transfer methods are sufficiently accurate so as not to contaminate the very small amplitude acoustic disturbances. In CFD, low-order schemes are invariably used in conjunction with unstructured grids. However, low-order schemes are known to be numerically dispersive and dissipative, and dissipative errors are extremely undesirable for acoustic wave problems. The objective of this project is to develop a high-order unstructured-grid Dispersion-Relation-Preserving (DRP) scheme that would minimize numerical dispersion and dissipation errors. This report contains the results of the funded portion of the project. A DRP scheme on an unstructured grid has been developed; it is constructed in the wave-number space. The characteristics of the scheme can be improved by the inclusion of additional constraints. Stability of the scheme has been investigated, and stability can be improved by adopting an upwinding strategy.
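
    For intuition about what "constructed in the wave-number space" means, the sketch below applies a 7-point antisymmetric stencil of the kind used by DRP schemes on a uniform 1-D grid. The coefficients are those commonly quoted for the Tam-Webb optimized scheme and should be verified against the original reference; note that the funded project targets unstructured grids, which this uniform-grid toy does not address:

```python
import numpy as np

# 7-point DRP-style first derivative on a uniform periodic grid.  The
# coefficients below are the optimized values commonly quoted in the
# literature for the Tam-Webb scheme (verify before serious use); they
# are chosen in wave-number space rather than for maximal formal order.
a = np.array([0.020843142770, -0.166705904415, 0.770882380518])

def drp_ddx(u, dx):
    """Antisymmetric 7-point approximation of du/dx (periodic grid)."""
    d = np.zeros_like(u)
    for j, c in zip((3, 2, 1), a):
        d += c * (np.roll(u, -j) - np.roll(u, j))
    return d / dx

x = np.linspace(0, 2 * np.pi, 64, endpoint=False)
err = np.abs(drp_ddx(np.sin(x), x[1] - x[0]) - np.cos(x)).max()
print(f"max error for sin(x): {err:.2e}")
```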

  7. Project T.E.A.M. (Technical Education Advancement Modules). Introduction to Computers.

    ERIC Educational Resources Information Center

    Ellis, Brenda

    This instructional guide, one of a series developed by the Technical Education Advancement Modules (TEAM) project, is a 3-hour introduction to computers. The purpose is to develop the following competencies: (1) orientation to data processing; (2) use of data entry devices; (3) use of computer menus; and (4) entry of data with accuracy and…

  8. Computer and laboratory simulation in the teaching of neonatal nursing: innovation and impact on learning 1

    PubMed Central

    Fonseca, Luciana Mara Monti; Aredes, Natália Del' Angelo; Fernandes, Ananda Maria; Batalha, Luís Manuel da Cunha; Apóstolo, Jorge Manuel Amado; Martins, José Carlos Amado; Rodrigues, Manuel Alves

    2016-01-01

    ABSTRACT Objectives: to evaluate the cognitive learning of nursing students in neonatal clinical evaluation from a blended course using computer and laboratory simulation; to compare the cognitive learning of students in control and experimental groups testing the laboratory simulation; and to assess the extracurricular blended course offered on the clinical assessment of preterm infants, according to the students. Method: a quasi-experimental study with 14 Portuguese students, containing a pretest, midterm test, and post-test. The technologies offered in the course were the serious game e-Baby, instructional software on semiology and semiotechnique, and laboratory simulation. Data collection tools developed for this study were used for the course evaluation and the characterization of the students. Nonparametric statistics were used: Mann-Whitney and Wilcoxon. Results: the use of validated digital technologies and laboratory simulation demonstrated a statistically significant difference (p = 0.001) in the learning of the participants. The students evaluated the course as very satisfactory. The laboratory simulation alone did not produce a significant difference in learning. Conclusions: the cognitive learning of participants increased significantly. The use of technology can be partly responsible for the course's success, showing it to be an important teaching tool for innovation and motivation of learning in healthcare. PMID:27737376

  9. IMPROVED COMPUTATIONAL NEUTRONICS METHODS AND VALIDATION PROTOCOLS FOR THE ADVANCED TEST REACTOR

    SciTech Connect

    David W. Nigg; Joseph W. Nielsen; Benjamin M. Chase; Ronnie K. Murray; Kevin A. Steuhm

    2012-04-01

    The Idaho National Laboratory (INL) is in the process of modernizing the various reactor physics modeling and simulation tools used to support operation and safety assurance of the Advanced Test Reactor (ATR). Key accomplishments so far have encompassed both computational and experimental work. A new suite of stochastic and deterministic transport theory based reactor physics codes and their supporting nuclear data libraries (HELIOS, KENO6/SCALE, NEWT/SCALE, ATTILA, and an extended implementation of MCNP5) has been installed at the INL. Corresponding models of the ATR and ATRC are now operational with all five codes, demonstrating the basic feasibility of the new code packages for their intended purpose. Of particular importance, a set of as-run core depletion HELIOS calculations for all ATR cycles since August 2009 was successfully completed during 2011. This demonstration supported a decision late in the year to proceed with the phased incorporation of the HELIOS methodology into the ATR fuel cycle management process beginning in 2012. On the experimental side of the project, new hardware was fabricated, measurement protocols were finalized, and the first four of six planned physics code validation experiments based on neutron activation spectrometry were conducted at the ATRC facility. Data analysis for the first three experiments, focused on characterization of the neutron spectrum in one of the ATR flux traps, has been completed. The six experiments will ultimately form the basis for a flexible, easily repeatable ATR physics code validation protocol that is consistent with applicable ASTM standards.

  10. Advances on modelling of ITER scenarios: physics and computational challenges

    NASA Astrophysics Data System (ADS)

    Giruzzi, G.; Garcia, J.; Artaud, J. F.; Basiuk, V.; Decker, J.; Imbeaux, F.; Peysson, Y.; Schneider, M.

    2011-12-01

    Methods and tools for design and modelling of tokamak operation scenarios are discussed with particular application to ITER advanced scenarios. Simulations of hybrid and steady-state scenarios performed with the integrated tokamak modelling suite of codes CRONOS are presented. The advantages of a possible steady-state scenario based on cyclic operations, alternating phases of positive and negative loop voltage, with no magnetic flux consumption on average, are discussed. For regimes in which current alignment is an issue, a general method for scenario design is presented, based on the characteristics of the poloidal current density profile.

  11. Using Advanced Computer Vision Algorithms on Small Mobile Robots

    DTIC Science & Technology

    2006-04-20

    Lab. This classification algorithm has been used to successfully detect the license plates of automobiles in motion in real-time. While working...use in real-time. Test results are shown for a variety of environments. KEYWORDS: robotics, computer vision, car/license plate detection, SIFT...when detecting the make and model of automobiles, SIFT can be used to achieve very high detection rates at the expense of a hefty performance cost when
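
    As a concrete reference point for the SIFT-based detection the excerpt alludes to, the sketch below shows generic SIFT keypoint matching with OpenCV's standard API. The file names are placeholders and the ratio-test pipeline is a textbook construction, not the report's actual system:

```python
import cv2

# Generic SIFT keypoint matching with OpenCV (>= 4.4).  The image paths
# are placeholders; substitute real photographs to run this sketch.
img1 = cv2.imread("query_car.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("reference_car.png", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# Lowe's ratio test keeps only distinctive correspondences.
matcher = cv2.BFMatcher(cv2.NORM_L2)
good = [m for m, n in matcher.knnMatch(des1, des2, k=2)
        if m.distance < 0.75 * n.distance]
print(f"{len(good)} good matches")
```

    The "hefty performance cost" the report mentions is visible here: SIFT extraction and brute-force matching are far more expensive per frame than the lightweight classifiers used for real-time plate detection.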

  12. Cogeneration computer model assessment: Advanced cogeneration research study

    NASA Technical Reports Server (NTRS)

    Rosenberg, L.

    1983-01-01

    Cogeneration computer simulation models were assessed in order to recommend the most desirable models or their components for use by the Southern California Edison Company (SCE) in evaluating potential cogeneration projects. Existing cogeneration modeling capabilities are described, preferred models are identified, and an approach to the development of a code which will best satisfy SCE requirements is recommended. Five models (CELCAP, COGEN 2, CPA, DEUS, and OASIS) are recommended for further consideration.

  13. Advanced Computational Methods for Security Constrained Financial Transmission Rights

    SciTech Connect

    Kalsi, Karanjit; Elbert, Stephen T.; Vlachopoulou, Maria; Zhou, Ning; Huang, Zhenyu

    2012-07-26

    Financial Transmission Rights (FTRs) are financial insurance tools that help power market participants reduce price risks associated with transmission congestion. FTRs are issued based on a process of solving a constrained optimization problem with the objective of maximizing the FTR social welfare under power flow security constraints. Security constraints for different FTR categories (monthly, seasonal, or annual) are usually coupled, and the number of constraints increases exponentially with the number of categories. Commercial software for FTR calculation can provide only limited categories of FTRs due to the inherent computational challenges mentioned above. In this paper, an innovative mathematical reformulation of the FTR problem is first presented, which dramatically improves the computational efficiency of the optimization problem. After re-formulating the problem, a novel non-linear dynamic system (NDS) approach is proposed to solve the optimization problem. The new formulation and the performance of the NDS solver are benchmarked against widely used linear programming (LP) solvers like CPLEX™ and tested on both standard IEEE test systems and large-scale systems using data from the Western Electricity Coordinating Council (WECC). The performance of the NDS is demonstrated to be comparable to, and in some cases better than, that of the widely used CPLEX algorithms. The proposed formulation and NDS-based solver are also easily parallelizable, enabling further computational improvement.
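
    The structure of the auction problem (maximize bid-weighted awards subject to linearized network limits) can be illustrated with a toy linear program. Everything numeric below (the PTDF matrix, bids, and limits) is invented for illustration; the paper's formulation, scale, and NDS solver are far beyond this sketch:

```python
import numpy as np
from scipy.optimize import linprog

# Toy FTR-auction structure: maximize bid-weighted awarded quantities
# subject to DC (linearized) line-flow limits in both directions.
bids = np.array([30.0, 25.0, 20.0])        # $/MW bid prices
ptdf = np.array([[0.6, -0.3, 0.2],         # line flow per MW of each FTR
                 [0.2,  0.5, -0.4]])
limit = np.array([40.0, 30.0])             # thermal limits (MW)

# linprog minimizes, so negate the welfare objective; awards are capped
# at a hypothetical 100 MW requested per bid.
res = linprog(c=-bids,
              A_ub=np.vstack([ptdf, -ptdf]),
              b_ub=np.concatenate([limit, limit]),
              bounds=[(0, 100)] * 3)
print("awarded MW:", np.round(res.x, 1), " welfare: $", round(-res.fun, 2))
```

    In the real problem the coupled monthly/seasonal/annual categories multiply the constraint set to sizes where such off-the-shelf LP calls become the bottleneck, which is what motivates the reformulation and the NDS solver.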

  14. Parallel computing in genomic research: advances and applications

    PubMed Central

    Ocaña, Kary; de Oliveira, Daniel

    2015-01-01

    Today’s genomic experiments have to process the so-called “biological big data” that is now reaching the size of terabytes and petabytes. To process this huge amount of data, scientists may require weeks or months if they use their own workstations. Parallelism techniques and high-performance computing (HPC) environments can be applied to reduce the total processing time and to ease the management, treatment, and analysis of this data. However, running bioinformatics experiments in HPC environments such as clouds, grids, clusters, and graphics processing units requires expertise from scientists to integrate computational, biological, and mathematical techniques and technologies. Several solutions have already been proposed to allow scientists to process their genomic experiments using HPC capabilities and parallelism techniques. This article presents a systematic review of the literature, surveying the most recently published research involving genomics and parallel computing. Our objective is to gather the main characteristics, benefits, and challenges that scientists can consider when running their genomic experiments to benefit from parallelism techniques and HPC capabilities. PMID:26604801

  15. Parallel computing in genomic research: advances and applications.

    PubMed

    Ocaña, Kary; de Oliveira, Daniel

    2015-01-01

    Today's genomic experiments have to process the so-called "biological big data" that is now reaching the size of terabytes and petabytes. To process this huge amount of data, scientists may require weeks or months if they use their own workstations. Parallelism techniques and high-performance computing (HPC) environments can be applied to reduce the total processing time and to ease the management, treatment, and analysis of this data. However, running bioinformatics experiments in HPC environments such as clouds, grids, clusters, and graphics processing units requires expertise from scientists to integrate computational, biological, and mathematical techniques and technologies. Several solutions have already been proposed to allow scientists to process their genomic experiments using HPC capabilities and parallelism techniques. This article presents a systematic review of the literature, surveying the most recently published research involving genomics and parallel computing. Our objective is to gather the main characteristics, benefits, and challenges that scientists can consider when running their genomic experiments to benefit from parallelism techniques and HPC capabilities.
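
    As a minimal illustration of the parallelism idea both versions of this review survey, the sketch below farms an embarrassingly parallel per-sequence computation out to worker processes with Python's standard library; the sequences and the GC-content task are invented for the demo:

```python
from multiprocessing import Pool

# Embarrassingly parallel per-sequence work distributed across worker
# processes.  The sequences and the GC-content task are made up.
def gc_content(seq):
    return (seq.count("G") + seq.count("C")) / len(seq)

if __name__ == "__main__":
    seqs = ["ATGCGC", "TTAACG", "GGGCCC", "ATATAT"] * 1000
    with Pool(processes=4) as pool:
        gc = pool.map(gc_content, seqs)
    print(f"mean GC content: {sum(gc) / len(gc):.3f}")
```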

  16. Vision 20/20: Automation and advanced computing in clinical radiation oncology

    SciTech Connect

    Moore, Kevin L.; Moiseenko, Vitali; Kagadis, George C.; McNutt, Todd R.; Mutic, Sasa

    2014-01-15

    This Vision 20/20 paper considers what computational advances are likely to be implemented in clinical radiation oncology in the coming years and how the adoption of these changes might alter the practice of radiotherapy. Four main areas of likely advancement are explored: cloud computing, aggregate data analyses, parallel computation, and automation. As these developments promise both new opportunities and new risks to clinicians and patients alike, the potential benefits are weighed against the hazards associated with each advance, with special considerations regarding patient safety under new computational platforms and methodologies. While the concerns of patient safety are legitimate, the authors contend that progress toward next-generation clinical informatics systems will bring about extremely valuable developments in quality improvement initiatives, clinical efficiency, outcomes analyses, data sharing, and adaptive radiotherapy.

  17. Computational Approaches to Enhance Nanosafety and Advance Nanomedicine

    NASA Astrophysics Data System (ADS)

    Mendoza, Eduardo R.

    With the increasing use of nanoparticles in food processing, filtration/purification and consumer products, as well as the huge potential of their use in nanomedicine, a quantitative understanding of the effects of nanoparticle uptake and transport is needed. We provide examples of novel methods for modeling complex bio-nano interactions which are based on stochastic process algebras. Since model construction presumes sufficient availability of experimental data, recent developments in "nanoinformatics", an emerging discipline analogous to bioinformatics, toward building an accessible information infrastructure are subsequently discussed. Both computational areas offer opportunities for Filipinos to engage in collaborative, cutting-edge research in this impactful field.

  18. First Responders Guide to Computer Forensics: Advanced Topics

    DTIC Science & Technology

    2005-09-01

    server of the sender, the mail server of the receiver, and the computer that receives the email. Assume that Alice wants to send an email to her friend...pleased to meet you MAIL FROM: alice.price@alphanet.com 250 alice.price@alphanet.com... Sender ok RCPT TO: bob.doe@betanet.com 250 bob.doe...betanet.com... Sender ok DATA 354 Please start mail input From: alice.price@alphanet.com To: bob.doe@betanet.com Subject: Lunch Bob, It was good
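
    For readers who want to generate such a session programmatically, Python's standard library issues the same SMTP verbs (MAIL FROM, RCPT TO, DATA) shown in the excerpt. The host name and addresses below simply mirror the report's fictitious example:

```python
import smtplib
from email.message import EmailMessage

# Re-creation of the excerpted SMTP exchange with the standard library.
# The host and addresses are the report's fictitious examples; smtplib
# sends the same MAIL FROM / RCPT TO / DATA commands shown above.
msg = EmailMessage()
msg["From"] = "alice.price@alphanet.com"
msg["To"] = "bob.doe@betanet.com"
msg["Subject"] = "Lunch"
msg.set_content("Bob, It was good ...")

with smtplib.SMTP("mail.alphanet.com") as smtp:   # sender's mail server
    smtp.send_message(msg)
```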

  19. Advanced Computational Methods for Thermal Radiative Heat Transfer

    SciTech Connect

    Tencer, John; Carlberg, Kevin Thomas; Larsen, Marvin E.; Hogan, Roy E.

    2016-10-01

    Participating media radiation (PMR) calculations in weapon safety analyses for abnormal thermal environments are too costly to perform routinely. This cost may be substantially reduced by applying reduced order modeling (ROM) techniques. The application of ROM to PMR is a new and unique approach for this class of problems. This approach was investigated by the authors and shown to provide significant reductions in the computational expense associated with typical PMR simulations. Once this technology is migrated into production heat transfer analysis codes, this capability will enable the routine use of PMR heat transfer in higher-fidelity simulations of weapon response in fire environments.
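
    The core ROM ingredient, projection onto a low-dimensional basis extracted from full-order snapshots, can be sketched generically. The snapshot data below are synthetic and the method shown is plain proper orthogonal decomposition (POD); the abstract does not disclose which ROM variant the authors used:

```python
import numpy as np

# Generic projection-based ROM sketch: build a POD basis from snapshots
# of a full-order simulation, then approximate states in the reduced
# space.  The snapshot field below is synthetic (rank 2 by construction).
x = np.linspace(0.0, 1.0, 2000)[:, None]      # spatial grid (column)
t = np.linspace(0.0, 1.0, 40)                 # 40 snapshot "times"
snapshots = (np.sin(2 * np.pi * x) * np.cos(2 * np.pi * t)
             + 0.3 * np.sin(6 * np.pi * x) * t)

U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)
r = int(np.searchsorted(energy, 0.999)) + 1   # modes for 99.9% of energy
basis = U[:, :r]                              # POD basis, n x r

state = snapshots[:, -1]                      # a "new" full-order state
approx = basis @ (basis.T @ state)            # reduced-space round trip
print(f"{r} modes, reconstruction error {np.linalg.norm(state - approx):.2e}")
```

    The cost saving comes from evolving only the r reduced coordinates instead of the full n-dimensional radiative state, with r typically orders of magnitude smaller than n.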

  20. Advanced Computational Framework for Environmental Management ZEM, Version 1.x

    SciTech Connect

    Vesselinov, Velimir V.; O'Malley, Daniel; Pandey, Sachin

    2016-11-04

    Typically, environmental management problems require analysis of large and complex data sets originating from concurrent data streams with different data collection frequencies and pedigrees. These big data sets require on-the-fly integration into a series of models of varying complexity for various types of model analyses, where the data are applied as soft and hard model constraints. This is needed to provide fast, iterative model analyses based on the latest available data to guide decision-making. Furthermore, the data and models are associated with uncertainties. The uncertainties are probabilistic (e.g., measurement errors) and non-probabilistic (unknowns, e.g., alternative conceptual models characterizing site conditions). To address all of these issues, we have developed ZEM, an integrated framework for real-time data and model analyses for environmental decision-making. The framework allows for seamless, on-the-fly integration of data and modeling results for robust and scientifically defensible decision-making, applying advanced decision analysis tools such as Bayesian Information-Gap Decision Theory (BIG-DT). The framework also includes advanced optimization methods capable of dealing with a large number of unknown model parameters, and surrogate (reduced order) modeling capabilities based on support vector regression techniques. The framework is coded in Julia, a state-of-the-art high-performance programming language (http://julialang.org). ZEM can be applied to any environmental management site and will be released as open source under the GPL v3 license.
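
    The surrogate-modeling ingredient mentioned above is straightforward to illustrate with support vector regression. ZEM itself is written in Julia; the Python/scikit-learn sketch below, with an invented stand-in for an expensive model, only demonstrates the technique:

```python
import numpy as np
from sklearn.svm import SVR

# Fit an SVR surrogate to input/output samples of an expensive model so
# that later evaluations are cheap.  The "expensive model" is a stand-in.
def expensive_model(x):
    return np.sin(3 * x) + 0.5 * x

rng = np.random.default_rng(1)
X = rng.uniform(0, 3, size=(200, 1))          # sampled inputs
y = expensive_model(X.ravel())                # expensive evaluations

surrogate = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X, y)
x_new = np.array([[1.7]])
print("surrogate:", surrogate.predict(x_new)[0],
      " truth:", expensive_model(1.7))
```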

  1. Computational Efforts in Support of Advanced Coal Research

    SciTech Connect

    Suljo Linic

    2006-08-17

    The focus of this project was to employ first-principles computational methods to study the underlying molecular elementary processes that govern hydrogen diffusion through Pd membranes, as well as the elementary processes that govern the CO- and S-poisoning of these membranes. Our computational methodology integrated a multiscale hierarchical modeling approach, wherein a molecular understanding of the interactions between various species is gained from ab initio quantum chemical Density Functional Theory (DFT) calculations, while a mesoscopic statistical mechanical model such as Kinetic Monte Carlo is employed to predict key macroscopic membrane properties such as permeability. The key developments are: (1) We have systematically coupled the ab initio calculations with Kinetic Monte Carlo (KMC) simulations to model hydrogen diffusion through Pd-based membranes. The predicted tracer diffusivity of hydrogen atoms through the bulk Pd lattice from KMC simulations is in excellent agreement with experiments. (2) The KMC simulations of dissociative adsorption of H₂ over the Pd(111) surface indicate that for thin membranes (less than 10 µm thick), the diffusion of hydrogen from the surface to the first subsurface layer is rate limiting. (3) Sulfur poisons the Pd surface by altering the electronic structure of the Pd atoms in the vicinity of the S atom. The KMC simulations indicate that increasing sulfur coverage drastically reduces the hydrogen coverage on the Pd surface and hence the driving force for diffusion through the membrane.
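
    The DFT-to-KMC coupling described in point (1) rests on converting ab initio barriers into hop rates via an Arrhenius expression and then advancing time stochastically. The sketch below is a deliberately minimal 1-D lattice version with placeholder barrier and prefactor values, not the project's model:

```python
import numpy as np

# Minimal kinetic Monte Carlo sketch: hop an atom between lattice sites
# with a rate k = nu * exp(-Ea / kT) derived from a (placeholder) barrier.
kB, T = 8.617e-5, 600.0            # Boltzmann constant (eV/K), temperature (K)
nu, Ea = 1e13, 0.15                # attempt frequency (1/s), barrier (eV)
rate = nu * np.exp(-Ea / (kB * T)) # hop rate to each neighboring site

rng = np.random.default_rng(2)
pos, t = 0, 0.0
for _ in range(100000):            # 1-D lattice walk, left/right hops
    total = 2 * rate               # total escape rate from current site
    t += rng.exponential(1.0 / total)        # KMC time increment
    pos += 1 if rng.random() < 0.5 else -1   # pick one of the two hops

a = 2.75e-10                       # site spacing (m), illustrative
D = (pos * a) ** 2 / (2 * t)       # crude single-trajectory estimate
print(f"elapsed {t:.2e} s, displacement {pos} sites, D ~ {D:.2e} m^2/s")
```

    A production study would average the mean-square displacement over many trajectories and a full 3-D site network, with the rate catalog populated from the DFT barriers.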

  2. Recent advances in direct methanol fuel cells at Los Alamos National Laboratory

    NASA Astrophysics Data System (ADS)

    Ren, Xiaoming; Zelenay, Piotr; Thomas, Sharon; Davey, John; Gottesfeld, Shimshon

    This paper describes recent advances in the science and technology of direct methanol fuel cells (DMFCs) made at Los Alamos National Laboratory (LANL). The effort on DMFCs at LANL includes work devoted to portable power applications, funded by the Defense Advanced Research Projects Agency (DARPA), and work devoted to potential transport applications, funded by the US DOE. We describe recent results with a new type of DMFC stack hardware that allows the pitch per cell to be lowered to 2 mm while permitting low air flow and air pressure drops. Such stack technology lends itself to both portable power and potential transport applications. Power densities of 300 W/l and 1 kW/l seem achievable under conditions applicable to portable power and transport applications, respectively. DMFC power system analysis based on the performance of this stack, under conditions applying to transport applications (a joint effort with U.C. Davis), has shown that, in terms of overall system efficiency and system packaging requirements, a power source for a passenger vehicle based on a DMFC could compete favorably with a hydrogen-fueled fuel cell system, as well as with fuel cell systems based on on-board fuel processing. As part of the more fundamental studies performed, we describe optimization of anode catalyst layers in terms of PtRu catalyst nature, loading, and catalyst layer composition and structure. We specifically show that the optimized content of recast ionic conductor added to the catalyst layer is a sensitive function of the nature of the catalyst. Other elements of membrane/electrode assembly (MEA) optimization efforts are also described, highlighting our ability to resolve, to a large degree, a well-documented problem of polymer electrolyte DMFCs, namely "methanol crossover". This was achieved by appropriate cell design, enabling fuel utilization as high as 90% in high-performing DMFCs.

  3. Computer based learning in an undergraduate physics laboratory: interfacing and instrument control using Matlab

    NASA Astrophysics Data System (ADS)

    Sharp, J. S.; Glover, P. M.; Moseley, W.

    2007-05-01

    In this paper we describe the recent changes to the curriculum of the second year practical laboratory course in the School of Physics and Astronomy at the University of Nottingham. In particular, we describe how Matlab has been implemented as a teaching tool and discuss both its pedagogical advantages and disadvantages in teaching undergraduate students about computer interfacing and instrument control techniques. We also discuss the motivation for converting the interfacing language that is used in the laboratory from LabView to Matlab. We describe an example of a typical experiment the students are required to complete and we conclude by briefly assessing how the recent curriculum changes have affected both student performance and compliance.

  4. Advances in x-ray computed microtomography at the NSLS

    SciTech Connect

    Dowd, B.A.; Andrews, A.B.; Marr, R.B.; Siddons, D.P.; Jones, K.W.; Peskin, A.M.

    1998-08-01

    The X-Ray Computed Microtomography workstation at beamline X27A at the NSLS has been utilized by scientists from a broad range of disciplines, from industrial materials processing to environmental science. The most recent applications are presented here, as well as a description of the facility that has evolved to accommodate a wide variety of materials and sample sizes. One of the most exciting new developments reported here resulted from a pursuit of faster reconstruction techniques. A Fast Filtered Back Transform (FFBT) reconstruction program has been developed and implemented, based on a refinement of the gridding algorithm first developed for use with radio astronomical data. This program has reduced the reconstruction time to 8.5 s for a 929 × 929 pixel² slice on an R10,000 CPU, more than an 8× reduction compared with the Filtered Back-Projection method.
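
    The FFBT/gridding code itself is not public, but the reconstruction task it accelerates can be reproduced with the standard filtered back-projection algorithm available in scikit-image, as in this generic sketch:

```python
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon

# Generic filtered back-projection demo: simulate projections of a test
# object, then reconstruct a slice.  This is the baseline FBP algorithm,
# not the NSLS gridding-based FFBT code.
image = shepp_logan_phantom()                     # 400 x 400 test object
theta = np.linspace(0.0, 180.0, 360, endpoint=False)
sinogram = radon(image, theta=theta)              # forward projections
recon = iradon(sinogram, theta=theta, filter_name="ramp")
print("RMS error:", np.sqrt(np.mean((recon - image) ** 2)))
```

    Gridding-style methods gain their speedup by replacing the per-angle back-projection with FFT-based interpolation in Fourier space, which is what cut the NSLS slice time to 8.5 s on 1990s hardware.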

  5. ADVANCES IN X-RAY COMPUTED MICROTOMOGRAPHY AT THE NSLS.

    SciTech Connect

    DOWD,B.A.

    1998-08-07

    The X-Ray Computed Microtomography workstation at beamline X27A at the NSLS has been utilized by scientists from a broad range of disciplines, from industrial materials processing to environmental science. The most recent applications are presented here, as well as a description of the facility that has evolved to accommodate a wide variety of materials and sample sizes. One of the most exciting new developments reported here resulted from a pursuit of faster reconstruction techniques. A Fast Filtered Back Transform (FFBT) reconstruction program has been developed and implemented, based on a refinement of the "gridding" algorithm first developed for use with radio astronomical data. This program has reduced the reconstruction time to 8.5 s for a 929 × 929 pixel² slice on an R10,000 CPU, more than an 8× reduction compared with the Filtered Back-Projection method.

  6. Summary of ground water and surface water flow and contaminant transport computer codes used at the Idaho National Engineering Laboratory (INEL)

    SciTech Connect

    Bandy, P.J.; Hall, L.F.

    1993-03-01

    This report presents information on computer codes for numerical and analytical models that have been used at the Idaho National Engineering Laboratory (INEL) to model ground water and surface water flow and contaminant transport. Organizations conducting modeling at the INEL include EG&G Idaho, Inc., the US Geological Survey, and Westinghouse Idaho Nuclear Company. The information provided for each computer code includes: the agency responsible for the modeling effort, the name of the computer code, the proprietor of the code (copyright holder or original author), validation and verification studies, applications of the model at INEL, the prime user of the model, a computer code description, computing environment requirements, and documentation and references for the computer code.

  7. National Energy Research Scientific Computing Center (NERSC): Advancing the frontiers of computational science and technology

    SciTech Connect

    Hules, J.

    1996-11-01

    National Energy Research Scientific Computing Center (NERSC) provides researchers with high-performance computing tools to tackle science's biggest and most challenging problems. Founded in 1974 by DOE/ER, the Controlled Thermonuclear Research Computer Center was the first unclassified supercomputer center and was the model for those that followed. Over the years the center's name was changed to the National Magnetic Fusion Energy Computer Center and then to NERSC; it was relocated to LBNL. NERSC, one of the largest unclassified scientific computing resources in the world, is the principal provider of general-purpose computing services to DOE/ER programs: Magnetic Fusion Energy, High Energy and Nuclear Physics, Basic Energy Sciences, Health and Environmental Research, and the Office of Computational and Technology Research. NERSC users are a diverse community located throughout the US and in several foreign countries. This brochure describes: the NERSC advantage, its computational resources and services, future technologies, scientific resources, and computational science of scale (interdisciplinary research over a decade or longer; examples: combustion in engines, waste management chemistry, global climate change modeling).

  8. Completion summary for borehole USGS 136 near the Advanced Test Reactor Complex, Idaho National Laboratory, Idaho

    USGS Publications Warehouse

    Twining, Brian V.; Bartholomay, Roy C.; Hodges, Mary K.V.

    2012-01-01

    In 2011, the U.S. Geological Survey, in cooperation with the U.S. Department of Energy, cored and completed borehole USGS 136 for stratigraphic framework analyses and long-term groundwater monitoring of the eastern Snake River Plain aquifer at the Idaho National Laboratory. The borehole was initially cored to a depth of 1,048 feet (ft) below land surface (BLS) to collect core, open-borehole water samples, and geophysical data. After these data were collected, borehole USGS 136 was cemented and backfilled between 560 and 1,048 ft BLS. The final construction of borehole USGS 136 required that the borehole be reamed to allow for installation of 6-inch (in.) diameter carbon-steel casing and 5-in. diameter stainless-steel screen; the screened monitoring interval was completed between 500 and 551 ft BLS. A dedicated pump and water-level access line were placed to allow for aquifer testing, for collecting periodic water samples, and for measuring water levels. Geophysical and borehole video logs were collected after coring and after the completion of the monitor well. Geophysical logs were examined in conjunction with the borehole core to describe borehole lithology and to identify primary flow paths for groundwater, which occur in intervals of fractured and vesicular basalt. A single-well aquifer test was used to define hydraulic characteristics for borehole USGS 136 in the eastern Snake River Plain aquifer. Specific capacity, transmissivity, and hydraulic conductivity from the aquifer test were at least 975 gallons per minute per foot, 1.4 × 10⁵ feet squared per day (ft²/d), and 254 feet per day, respectively. The amount of measurable drawdown during the aquifer test was about 0.02 ft. The transmissivity for borehole USGS 136 was in the range of values determined from previous aquifer tests conducted in other wells near the Advanced Test Reactor Complex: 9.5 × 10³ to 1.9 × 10⁵ ft²/d. Water samples were analyzed for cations, anions, metals, nutrients, total organic

  9. Cold Crucible Induction Melter Testing at The Idaho National Laboratory for the Advanced Remediation Technologies Program

    SciTech Connect

    Jay Roach; Nick Soelberg; Mike Ancho; Eric Tchemitcheff; John Richardson

    2009-03-01

    AREVA Federal Services (AFS) is performing a multi-year, multi-phase Advanced Remediation Technologies (ART) project, sponsored by the U.S. Department of Energy (DOE), to evaluate the feasibility and benefits of replacing the existing joule-heated melter (JHM) used to treat high level waste (HLW) in the Defense Waste Processing Facility (DWPF) at the Savannah River Site with a cold crucible induction melter (CCIM). The AFS ART CCIM project includes several collaborators from AREVA subsidiaries, French companies, and DOE national laboratories. The Savannah River National Laboratory and the Commissariat à l'Énergie Atomique (CEA) have performed laboratory-scale studies and testing to determine a suitable, high-waste-loading glass matrix. The Idaho National Laboratory (INL) and CEA are performing CCIM demonstrations at two different pilot scales to assess CCIM design and operation for treating SRS sludge wastes that are currently being treated in the DWPF. SGN is performing engineering studies to validate the feasibility of retrofitting CCIM technology into the DWPF Melter Cell. The long-term project plan includes more lab testing, pilot- and large-scale demonstrations, and engineering activities to be performed during subsequent project phases. This paper provides preliminary results of tests using the engineering-scale CCIM test system located at the INL. The CCIM test system was operated continuously over a time period of about 58 hours. As the DWPF simulant feed was continuously fed to the melter, the glass level gradually increased until a portion of the molten glass was drained from the melter. The glass drain was operated semi-continuously because the glass drain rate was higher than the glass feedrate. A cold cap of unmelted feed was controlled by adjusting the feedrate and melter power levels to obtain the target molten glass temperatures with varying cold cap levels. Three test conditions were performed per the test plan, during which the melter was

  10. Complementary Spectroscopic Assays for Investigating Protein-Ligand Binding Activity: A Project for the Advanced Chemistry Laboratory

    ERIC Educational Resources Information Center

    Mascotti, David P.; Waner, Mark J.

    2010-01-01

    A protein-ligand binding, guided-inquiry laboratory project with potential application across the advanced undergraduate curriculum is described. At the heart of the project are fluorescence and spectrophotometric assays utilizing biotin-4-fluorescein and streptavidin. The use of the same stock solutions for an assay that may be examined by two…

  11. The Advanced Interdisciplinary Research Laboratory: A Student Team Approach to the Fourth-Year Research Thesis Project Experience

    ERIC Educational Resources Information Center

    Piunno, Paul A. E.; Boyd, Cleo; Barzda, Virginijus; Gradinaru, Claudiu C.; Krull, Ulrich J.; Stefanovic, Sasa; Stewart, Bryan

    2014-01-01

    The advanced interdisciplinary research laboratory (AIRLab) represents a novel, effective, and motivational course designed from the interdisciplinary research interests of chemistry, physics, biology, and education development faculty members as an alternative to the independent thesis project experience. Student teams are assembled to work…

  12. Studying the Earth's Environment from Space: Computer Laboratory Exercises and Instructor Resources

    NASA Technical Reports Server (NTRS)

    Smith, Elizabeth A.; Alfultis, Michael

    1998-01-01

    Studying the Earth's Environment From Space is a two-year project to develop a suite of CD-ROMs containing Earth System Science curriculum modules for introductory undergraduate science classes. Lecture notes, slides, and computer laboratory exercises, including actual satellite data and software, are being developed in close collaboration with Carla Evans of NASA GSFC Earth Sciences Directorate Scientific and Educational Endeavors (SEE) project. Smith and Alfultis are responsible for the Oceanography and Sea Ice Processes Modules. The GSFC SEE project is responsible for Ozone and Land Vegetation Modules. This document constitutes a report on the first year of activities of Smith and Alfultis' project.

  13. Comparison of Mars Science Laboratory Reaction Control System Jet Computations With Flow Visualization and Velocimetry

    NASA Technical Reports Server (NTRS)

    Bathel, Brett F.; Danehy, Paul M.; Johansen, Craig T.; Ashcraft, Scott W.; Novak, Luke A.

    2013-01-01

    Numerical predictions of the Mars Science Laboratory reaction control system jets interacting with a Mach 10 hypersonic flow are compared to experimental nitric oxide planar laser-induced fluorescence data. The steady Reynolds Averaged Navier Stokes equations using the Baldwin-Barth one-equation turbulence model were solved using the OVERFLOW code. The experimental fluorescence data used for comparison consists of qualitative two-dimensional visualization images, qualitative reconstructed three-dimensional flow structures, and quantitative two-dimensional distributions of streamwise velocity. Through modeling of the fluorescence signal equation, computational flow images were produced and directly compared to the qualitative fluorescence data.

  14. RECENT ADVANCES IN HIGH TEMPERATURE ELECTROLYSIS AT IDAHO NATIONAL LABORATORY: SINGLE CELL TESTS

    SciTech Connect

    X. Zhang; J. E. O'Brien; R. C. O'Brien

    2012-07-01

    An experimental investigation on the performance and durability of single solid oxide electrolysis cells (SOECs) is under way at the Idaho National Laboratory. In order to understand and mitigate the degradation issues in high temperature electrolysis, single SOECs with different configurations from several manufacturers have been evaluated for initial performance and long-term durability. A new test apparatus has been developed for single cell and small stack tests from different vendors. Single cells from Ceramatec Inc. show improved durability compared to our previous stack tests. Single cells from Materials and Systems Research Inc. (MSRI) demonstrate low degradation both in fuel cell and electrolysis modes. Single cells from Saint Gobain Advanced Materials (St. Gobain) show stable performance in fuel cell mode, but rapid degradation in the electrolysis mode. Electrolyte-electrode delamination is found to have significant impact on degradation in some cases. Enhanced bonding between electrolyte and electrode and modification of the microstructure help to mitigate degradation. Polarization scans and AC impedance measurements are performed during the tests to characterize the cell performance and degradation.

  15. The advanced light source at Lawrence Berkeley laboratory: a new tool for research in atomic physics

    NASA Astrophysics Data System (ADS)

    Schlachter, Alfred S.; Robinson, Arthur L.

    1991-04-01

    The Advanced Light Source, a third-generation national synchrotron-radiation facility now under construction at the Lawrence Berkeley Laboratory, is scheduled to begin serving qualified users across a broad spectrum of research areas in the spring of 1993. Based on a low-emittance electron storage ring optimized to operate at 1.5 GeV, the ALS will have 10 long straight sections available for insertion devices (undulators and wigglers) and 24 high-quality bend-magnet ports. The short pulse width (30-50 ps) will be ideal for time-resolved measurements. Undulators will generate high-brightness partially coherent soft X-ray and ultraviolet (XUV) radiation from below 10 eV to above 2 keV; this radiation is plane polarized. Wigglers and bend magnets will extend the spectrum by generating high fluxes of X-rays to photon energies above 10 keV. The ALS will have an extensive research program in which XUV radiation is used to study matter in all its varied gaseous, liquid, and solid forms. The high brightness will open new areas of research in the materials sciences, such as spatially resolved spectroscopy (spectromicroscopy), and in biology, such as X-ray microscopy with element-specific sensitivity; the high flux will allow measurements in atomic physics and chemistry to be made with tenuous gas-phase targets. Technological applications could include lithography and nano-fabrication.

  16. Data Generation in the Discovery Sciences—Learning from the Practices in an Advanced Research Laboratory

    NASA Astrophysics Data System (ADS)

    Roth, Wolff-Michael

    2013-08-01

    General scientific literacy includes understanding the grounds on which scientific claims are based. The measurements scientists make and the data that they produce from them generally constitute these grounds. However, the nature of data generation has received relatively little attention from those interested in teaching science through inquiry. To inform curriculum designers about the process of data generation and its relation to the understanding of patterns as these may arise from graphs, this 5-year ethnographic study in one advanced research laboratory was designed to investigate how natural scientists make decisions about the inclusion/exclusion of certain measurements in/from their data sources. The study shows that scientists exclude measurements from their data sources even before attempting to mathematize and interpret the data. The excluded measurements therefore never even enter the ground from and against which the scientific phenomenon emerges and therefore remain invisible to it. I conclude by encouraging science educators to squarely address this aspect of the discovery sciences in their teaching, which has both methodological and ethical implications.

  17. The entrance system laboratory prototype for an advanced mass and ionic charge composition experiment.

    PubMed

    Allegrini, F; Desai, M I; Livi, R; Livi, S; McComas, D J; Randol, B

    2009-10-01

    Electrostatic analyzers (ESA) have been used extensively for the characterization of plasmas in a variety of space environments. They vary in shape, geometry, and size and are adapted to the specific particle population to be measured and the configuration of the spacecraft. Their main function is to select the energy per charge of the particles within a passband. An energy-per-charge range larger than that of the passband can be sampled by varying the voltage difference between the ESA electrodes. The voltage sweep takes time and reduces the duty cycle for a particular energy-per-charge passband. Our design approach for an advanced mass and ionic charge composition experiment (AMICCE) has a novel electrostatic analyzer that essentially serves as a spectrograph and selects ions simultaneously over a broad range of energy-per-charge (E/q). Only three voltage settings are required to cover the entire range from approximately 10 to 270 keV/q, thus dramatically increasing the product of the geometric factor times the duty cycle when compared with other instruments. In this paper, we describe the AMICCE concept with particular emphasis on the prototype of the entrance system (ESA and collimator), which we designed, developed, and tested. We also present comparisons of the laboratory results with electrostatic simulations.

  18. Strategies for casualty mitigation programs by using advanced tsunami computation

    NASA Astrophysics Data System (ADS)

    IMAI, K.; Imamura, F.

    2012-12-01

    1. Purpose of the study: In this study, based on scenarios of great earthquakes along the Nankai trough, we aim to estimate the run-up and the high-accuracy inundation process of tsunami in coastal areas, including rivers. Using a practical analytical tsunami model that takes into account detailed topography, land use, and climate change in a realistic present and expected future environment, we examined the run-up and tsunami inundation process. Using these results, we estimated the damage due to tsunami and obtained information for the mitigation of human casualties. Considering the time series from the occurrence of the earthquake and the risk of tsunami damage, we provide disaster-risk information displayed in a tsunami hazard and risk map in order to mitigate casualties. 2. Creating a tsunami hazard and risk map: From the practical analytical tsunami model (a long-wave approximation) and high-resolution topography (5 m) including detailed data on shorelines, rivers, buildings, and houses, we present an advanced analysis of tsunami inundation that accounts for land use. Based on the results of the tsunami inundation and its analysis, it is possible to draw a tsunami hazard and risk map with information on human casualties, building damage estimation, drifting vehicles, etc. 3. Contents of disaster prevention information: To improve the distribution of hazard, risk, and evacuation information, it is necessary to follow three steps. (1) Provide basic information such as tsunami attack information, areas and routes for evacuation, and locations of tsunami evacuation facilities. (2) Provide, as additional information, the time when inundation starts, past inundation results, locations of facilities with hazardous materials, and the public facilities and underground areas that require evacuation. (3) Provide information to support disaster response, such as infrastructure and traffic network damage prediction
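
    The "long-wave approximation" mentioned above refers to the shallow-water equations that practical tsunami codes integrate over detailed topography. As a purely illustrative sketch (uniform depth, linearized equations, an invented initial hump, reflective ends; real codes add nonlinearity, nested grids, and wetting/drying), a 1-D staggered-grid update looks like this:

```python
import numpy as np

# Toy 1-D linear long-wave (shallow water) solver on a staggered grid:
#   d(eta)/dt = -dq/dx,   dq/dt = -g*h*d(eta)/dx,   q = depth-integrated flux.
g, h = 9.81, 4000.0                   # gravity (m/s^2), uniform depth (m)
nx, dx = 400, 2000.0                  # grid cells, spacing (m)
dt = 0.5 * dx / np.sqrt(g * h)        # CFL-limited time step

x = np.arange(nx) * dx
eta = np.exp(-((x - 200e3) / 20e3) ** 2)   # initial sea-surface hump (m)
q = np.zeros(nx + 1)                       # fluxes at cell faces (walls at ends)

for _ in range(600):
    q[1:-1] -= dt * g * h * (eta[1:] - eta[:-1]) / dx  # momentum update
    eta -= dt * (q[1:] - q[:-1]) / dx                  # continuity update

print(f"max surface elevation after {600 * dt:.0f} s: {eta.max():.2f} m")
```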

  19. Computer and laboratory simulation of interactions between spacecraft surfaces and charged-particle environments

    NASA Technical Reports Server (NTRS)

    Stevens, N. J.

    1979-01-01

    Cases where the charged-particle environment acts on the spacecraft (e.g., spacecraft charging phenomena) and cases where a system on the spacecraft causes the interaction (e.g., high voltage space power systems) are considered. Both categories were studied in ground simulation facilities to understand the processes involved and to measure the pertinent parameters. Computer simulations are based on the NASA Charging Analyzer Program (NASCAP) code. Analytical models are developed in this code and verified against the experimental data. Extrapolations from the small test samples to space conditions are made with this code. Typical results from laboratory and computer simulations are presented for both types of interactions. Extrapolations from these simulations to performance in space environments are discussed.

  20. CART V: recent advancements in computer-aided camouflage assessment

    NASA Astrophysics Data System (ADS)

    Müller, Thomas; Müller, Markus

    2011-05-01

    In order to facilitate systematic, computer-aided improvement of camouflage and concealment assessment methods, the software system CART (Camouflage Assessment in Real-Time) was built for the camouflage assessment of objects in multispectral image sequences (see contributions to SPIE 2007-2010 [1], [2], [3], [4]). It comprises semi-automatic marking of target objects (ground truth generation), including their propagation over the image sequence, evaluation via user-defined feature extractors, and methods to assess an object's movement conspicuity. In this fifth part of an annual series at the SPIE conference in Orlando, this paper presents the enhancements of the past year and addresses the camouflage assessment of static and moving objects in multispectral image data that may show noise or image artefacts. The presented methods explore the correlations between image processing and camouflage assessment. A novel algorithm based on template matching is presented to assess the structural inconspicuity of an object objectively and quantitatively. The results can easily be combined with an MTI (moving target indication) based movement conspicuity assessment function in order to explore the influence of object movement on the camouflage effect in different environments. As the results show, the presented methods provide a significant benefit in the field of camouflage assessment.
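
    The template-matching idea behind the structural assessment can be illustrated with OpenCV's standard call; the file names are placeholders and the scoring shown is a generic construction, not the CART algorithm itself:

```python
import cv2
import numpy as np

# Generic template-matching conspicuity probe: a high normalized peak of
# an object template against the surrounding scene suggests the object's
# structure blends with the background.  Image paths are placeholders.
scene = cv2.imread("scene.png", cv2.IMREAD_GRAYSCALE)
template = cv2.imread("target_patch.png", cv2.IMREAD_GRAYSCALE)

response = cv2.matchTemplate(scene, template, cv2.TM_CCOEFF_NORMED)
_, peak, _, loc = cv2.minMaxLoc(response)
print(f"best match {peak:.2f} at {loc}; "
      f"mean background similarity {np.mean(response):.2f}")
```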

  1. Block sparse Cholesky algorithms on advanced uniprocessor computers

    SciTech Connect

    Ng, E.G.; Peyton, B.W.

    1991-12-01

    As with many other linear algebra algorithms, devising a portable implementation of sparse Cholesky factorization that performs well on the broad range of computer architectures currently available is a formidable challenge. Even after limiting our attention to machines with only one processor, as we have done in this report, there are still several interesting issues to consider. For dense matrices, it is well known that block factorization algorithms are the best means of achieving this goal. We take this approach for sparse factorization as well. This paper has two primary goals. First, we examine two sparse Cholesky factorization algorithms, the multifrontal method and a blocked left-looking sparse Cholesky method, in a systematic and consistent fashion, both to illustrate the strengths of the blocking techniques in general and to obtain a fair evaluation of the two approaches. Second, we assess the impact of various implementation techniques on time and storage efficiency, paying particularly close attention to the work-storage requirement of the two methods and their variants.
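
    For context, a dense analogue of the blocked left-looking scheme is easy to state. The sketch below is our own, for a dense SPD matrix, so the sparse bookkeeping the report analyzes is absent; it shows where the block update, block factorization, and triangular panel solve occur:

```python
import numpy as np

def block_cholesky(A, nb=64):
    """Left-looking blocked Cholesky: returns lower-triangular L with A = L @ L.T."""
    A = A.copy()
    n = A.shape[0]
    for k in range(0, n, nb):
        e = min(k + nb, n)
        if k > 0:
            # Left-looking update: apply all previously factored block columns.
            A[k:, k:e] -= A[k:, :k] @ A[k:e, :k].T
        # Factor the diagonal block.
        A[k:e, k:e] = np.linalg.cholesky(A[k:e, k:e])
        # Triangular solve for the panel below the diagonal block.
        A[e:, k:e] = np.linalg.solve(A[k:e, k:e], A[e:, k:e].T).T
    return np.tril(A)

rng = np.random.default_rng(0)
B = rng.standard_normal((500, 500))
A = B @ B.T + 500.0 * np.eye(500)     # symmetric positive definite test matrix
L = block_cholesky(A)
assert np.allclose(L @ L.T, A)
```

    Blocking pays off because the dominant update step is a matrix-matrix product, the operation that best exploits cache and vector hardware on the uniprocessors discussed above.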

  2. Advances in computed radiography systems and their physical imaging characteristics.

    PubMed

    Cowen, A R; Davies, A G; Kengyelics, S M

    2007-12-01

    Radiological imaging is progressing towards an all-digital future, across the spectrum of medical imaging techniques. Computed radiography (CR) has provided a ready pathway from screen film to digital radiography and a convenient entry point to PACS. This review briefly revisits the principles of modern CR systems and their physical imaging characteristics. Wide dynamic range and digital image enhancement are well-established benefits of CR, which lend themselves to improved image presentation and reduced rates of repeat exposures. However, in its original form CR offered limited scope for reducing the radiation dose per radiographic exposure, compared with screen film. Recent innovations in CR, including the use of dual-sided image readout and channelled storage phosphor have eased these concerns. For example, introduction of these technologies has improved detective quantum efficiency (DQE) by approximately 50 and 100%, respectively, compared with standard CR. As a result CR currently affords greater scope for reducing patient dose, and provides a more substantive challenge to the new solid-state, flat-panel, digital radiography detectors.
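
    For reference, DQE is computed per spatial frequency from the modulation transfer function (MTF), the normalized noise power spectrum (NNPS), and the incident photon fluence; a minimal sketch using the standard definition (variable names are ours, not from the review):

```python
import numpy as np

def dqe(mtf, nnps, q):
    """DQE(f) = MTF(f)^2 / (q * NNPS(f)): the effective fraction of incident
    quanta the detector uses at each spatial frequency f."""
    return np.asarray(mtf) ** 2 / (q * np.asarray(nnps))

# A dual-sided readout that raises DQE by ~50% appears here as, for example,
# a lower NNPS at the same MTF and exposure.
```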

  3. Report: EPA’s Radiation and Indoor Environments National Laboratory Should Improve Its Computer Room Security Controls

    EPA Pesticide Factsheets

    Report #12-P-0847, September 21, 2012. Our review of the security posture and in-place environmental controls of EPA’s Radiation and Indoor Environments National Laboratory computer room disclosed an array of security and environmental control deficiencies.

  4. WAATS: A computer program for Weights Analysis of Advanced Transportation Systems

    NASA Technical Reports Server (NTRS)

    Glatt, C. R.

    1974-01-01

    A historical weight estimating technique for advanced transportation systems is presented. The classical approach to weight estimation is discussed and sufficient data are presented to estimate weights for a large spectrum of flight vehicles, including horizontal and vertical takeoff aircraft, boosters, and reentry vehicles. A computer program, WAATS (Weights Analysis for Advanced Transportation Systems), embracing the techniques discussed, has been written and user instructions are presented. The program was developed for use in the ODIN (Optimal Design Integration System) system.
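
    WAATS itself is not listed here, but the classical weight-estimating approach it embodies fits power-law relationships to historical data; a minimal sketch with hypothetical numbers (all values are invented for illustration):

```python
import numpy as np

# Hypothetical historical data: gross takeoff weight and wing weight (lb).
gross = np.array([2.0e4, 4.5e4, 8.0e4, 1.5e5, 3.0e5])
wing = np.array([2.1e3, 4.3e3, 7.0e3, 1.22e4, 2.25e4])

# Fit the classical relationship W_wing = a * W_gross^b in log-log space.
b, log_a = np.polyfit(np.log(gross), np.log(wing), 1)
a = np.exp(log_a)

# Estimate the wing weight of a new 120,000 lb vehicle.
w_wing = a * 1.2e5 ** b
```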

  5. Unified parallel C and the computing needs of Sandia National Laboratories.

    SciTech Connect

    Brown, Jonathan Leighton; Wen, Zhaofang

    2004-09-01

    As Sandia looks toward petaflops computing and other advanced architectures, it is necessary to provide a programming environment that can exploit this additional computing power while supporting reasonable development time for applications. Thus, they evaluate the Partitioned Global Address Space (PGAS) programming model as implemented in Unified Parallel C (UPC) for its applicability. They report on their experiences in implementing sorting and minimum spanning tree algorithms on a test system, a Cray T3e, with UPC support. They describe several macros that could serve as language extensions and several building-block operations that could serve as a foundation for a PGAS programming library. They analyze the limitations of the UPC implementation available on the test system, and suggest improvements necessary before UPC can be used in a production environment.

  6. ADVANCED METHODS FOR THE COMPUTATION OF PARTICLE BEAM TRANSPORT AND THE COMPUTATION OF ELECTROMAGNETIC FIELDS AND MULTIPARTICLE PHENOMENA

    SciTech Connect

    Alex J. Dragt

    2012-08-31

    Since 1980, under the grant DEFG02-96ER40949, the Department of Energy has supported the educational and research work of the University of Maryland Dynamical Systems and Accelerator Theory (DSAT) Group. The primary focus of this educational/research group has been on the computation and analysis of charged-particle beam transport using Lie algebraic methods, and on advanced methods for the computation of electromagnetic fields and multiparticle phenomena. This Final Report summarizes the accomplishments of the DSAT Group from its inception in 1980 through its end in 2011.
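
    For orientation, the Lie-algebraic approach represents a beam line's transfer map as a factored product of Lie transformations; the form below is a textbook result associated with this line of work, shown for context rather than quoted from the report:

```latex
% Factored form of a symplectic transfer map:
%   :f: denotes the Poisson-bracket operator, :f:g = [f, g].
\mathcal{M} = \mathcal{R}\, e^{:f_3:}\, e^{:f_4:} \cdots
% \mathcal{R} is the linear (matrix) part; f_3, f_4, ... are homogeneous
% polynomials of degree 3, 4, ... generating the nonlinear corrections.
```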

  7. Computer aided fast turnaround laboratory for research in VLSI (Very Large Scale Integration)

    NASA Astrophysics Data System (ADS)

    Meindl, James D.; Shott, John

    1987-05-01

    The principal objectives of the Computer Aided/Automated Fast Turn-Around Laboratory (CAFTAL) for VLSI are: application of cutting edge computer science and software systems engineering to fast turn-around fabrication in order to develop more productive and flexible new approaches; fast turn-around fabrication of optimized VLSI systems achieved through synergistic integration of system research and device research in aggressive applications such as superfast computers; and investigation of physical limits on submicron VLSI in order to define and explore the most promising technologies. To make a state-of-the-art integrated circuit process more manufacturable, we must be able to understand both the numerous individual process technologies used to fabricate the complete device as well as the important device, circuit and system limitations in sufficient detail to monitor and control the overall fabrication sequence. Specifically, we must understand the sensitivity of device, circuit and system performance to each important step in the fabrication sequence. Moreover, we should be able to predict the manufacturability of an integrated circuit before we actually manufacture it. The salient objective of this program is to enable accurate simulation and control of computer-integrated manufacturing of ultra large scale integrated (ULSI) systems, including millions of submicron transistors in a single silicon chip.

  8. ADVANCED COMPUTATIONAL MODEL FOR THREE-PHASE SLURRY REACTORS

    SciTech Connect

    Goodarz Ahmadi

    2004-10-01

    In this project, an Eulerian-Lagrangian formulation for analyzing three-phase slurry flows in a bubble column was developed. The approach used an Eulerian analysis of liquid flows in the bubble column and made use of Lagrangian trajectory analysis for the bubble and particle motions. Bubble-bubble and particle-particle collisions are included in the model. The model predictions were compared with the experimental data and good agreement was found. An experimental setup for studying two-dimensional bubble columns was developed. The multiphase flow conditions in the bubble column were measured using optical image processing and Particle Image Velocimetry (PIV) techniques. A simple shear flow device for studying bubble motion in a constant shear flow field was also developed. The flow conditions in the simple shear flow device were studied using the PIV method. Concentrations and velocities of particles of different sizes near a wall in a duct flow were also measured. The technique of phase-Doppler anemometry was used in these studies. An Eulerian volume-of-fluid (VOF) computational model for the flow conditions in the two-dimensional bubble column was also developed. The liquid and bubble motions were analyzed and the results were compared with observed flow patterns in the experimental setup. Solid-fluid mixture flows in ducts and passages at different angles of orientation were also analyzed. The model predictions were compared with the experimental data and good agreement was found. Gravity chute flows of solid-liquid mixtures were also studied. The simulation results were compared with the experimental data and discussed. A thermodynamically consistent model for multiphase slurry flows, with and without chemical reaction, in a state of turbulent motion was developed. The balance laws were obtained and the constitutive laws established.
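
    A minimal sketch of a Lagrangian trajectory step for a single bubble, with Stokes drag plus buoyancy in a prescribed liquid field; the parameters and the shear-flow profile are illustrative assumptions, not the project's model:

```python
import numpy as np

rho_l, rho_b = 1000.0, 1.2          # liquid / gas density (kg/m^3)
mu, d = 1.0e-3, 1.0e-4              # liquid viscosity (Pa*s), bubble diameter (m)
g = np.array([0.0, -9.81])
vol = np.pi * d ** 3 / 6.0
m_b = rho_b * vol
c_drag = 3.0 * np.pi * mu * d       # Stokes drag coefficient

def liquid_velocity(x):
    return np.array([0.1 * x[1], 0.0])   # hypothetical simple shear flow

x, v, dt = np.zeros(2), np.zeros(2), 1.0e-5
for _ in range(2000):
    u = liquid_velocity(x)
    buoy = (rho_b - rho_l) * vol * g     # net buoyancy (weight minus Archimedes)
    # Semi-implicit update keeps the stiff drag term stable.
    v = (v + dt * (c_drag * u + buoy) / m_b) / (1.0 + dt * c_drag / m_b)
    x = x + dt * v
```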

  9. Advances and perspectives in lung cancer imaging using multidetector row computed tomography.

    PubMed

    Coche, Emmanuel

    2012-10-01

    The introduction of multidetector row computed tomography (CT) into clinical practice has revolutionized many aspects of the clinical work-up. Lung cancer imaging has benefited from various breakthroughs in computing technology, with advances in the field of lung cancer detection, tissue characterization, lung cancer staging and response to therapy. Our paper discusses the problems of radiation, image visualization and CT examination comparison. It also reviews the most significant advances in lung cancer imaging and highlights the emerging clinical applications that use state of the art CT technology in the field of lung cancer diagnosis and follow-up.

  10. Advanced entry guidance algorithm with landing footprint computation

    NASA Astrophysics Data System (ADS)

    Leavitt, James Aaron

    -determined angle of attack profile. The method is also capable of producing orbital footprints using an automatically-generated set of angle of attack profiles of varying range, with the lowest profile designed for near-maximum range in the absence of an active heat load constraint. The accuracy of the footprint method is demonstrated by direct comparison with footprints computed independently by an optimization program.

  11. A validated methodology for determination of laboratory instrument computer interface efficacy

    NASA Astrophysics Data System (ADS)

    1984-12-01

    This report is intended to provide a methodology for determining when, and for which instruments, direct interfacing of laboratory instrument and laboratory computers is beneficial. This methodology has been developed to assist the Tri-Service Medical Information Systems Program Office in making future decisions regarding laboratory instrument interfaces. We have calculated the time savings required to reach a break-even point for a range of instrument interface prices and corresponding average annual costs. The break-even analyses used empirical data to estimate the number of data points run per day that are required to meet the break-even point. The results indicate, for example, that at a purchase price of $3,000, an instrument interface will be cost-effective if the instrument is utilized for at least 154 data points per day if operated in the continuous mode, or 216 points per day if operated in the discrete mode. Although this model can help to ensure that instrument interfaces are cost effective, additional information should be considered in making the interface decisions. A reduction in results transcription errors may be a major benefit of instrument interfacing.
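
    The report's underlying cost parameters are not reproduced above, so the sketch below only illustrates the shape of such a break-even calculation; the amortization period, labor rate, and time saved per data point are all assumed values:

```python
def break_even_points_per_day(price, time_saved_s, labor_rate_hr,
                              annual_maint=0.0, days_per_year=250,
                              amortization_years=5):
    """Data points per day at which labor savings offset the interface cost."""
    annual_cost = price / amortization_years + annual_maint
    saving_per_point = (time_saved_s / 3600.0) * labor_rate_hr
    return annual_cost / (saving_per_point * days_per_year)

# Example: a $3,000 interface saving 20 s of technician time per point at $15/hr.
n_points = break_even_points_per_day(3000.0, 20.0, 15.0)   # ~29 points/day
```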

  12. Computational Modeling and High Performance Computing in Advanced Materials Processing, Synthesis, and Design

    DTIC Science & Technology

    2014-12-07

    Research efforts in this project focused on the synergistic coupling of computational materials science and mechanics of hybrid and lightweight polymeric composite structures, including atomistic modeling of polymer nanocomposite systems.

  13. TEMPERATURE MONITORING OPTIONS AVAILABLE AT THE IDAHO NATIONAL LABORATORY ADVANCED TEST REACTOR

    SciTech Connect

    J.E. Daw; J.L. Rempe; D.L. Knudson; T. Unruh; B.M. Chase; K.L. Davis

    2012-03-01

    As part of the Advanced Test Reactor National Scientific User Facility (ATR NSUF) program, the Idaho National Laboratory (INL) has developed in-house capabilities to fabricate, test, and qualify new and enhanced sensors for irradiation testing. To meet recent customer requests, an array of temperature monitoring options is now available to ATR users. The method selected is determined by test requirements and budget. Melt wires are the simplest and least expensive option for monitoring temperature. INL has recently verified the melting temperature of a collection of materials with melt temperatures ranging from 100 to 1000 C with a differential scanning calorimeter installed at INL’s High Temperature Test Laboratory (HTTL). INL encapsulates these melt wires in quartz or metal tubes. In the case of quartz tubes, multiple wires can be encapsulated in a single 1.6 mm diameter tube. The second option available to ATR users is a silicon carbide temperature monitor. The benefit of this option is that a single small monitor (typically 1 mm x 1 mm x 10 mm or 1 mm diameter x 10 mm length) can be used to detect peak irradiation temperatures ranging from 200 to 800 C. Equipment has been installed at INL’s HTTL to complete post-irradiation resistivity measurements on SiC monitors, a technique that has been found to yield the most accurate temperatures from these monitors. For instrumented tests, thermocouples may be used. In addition to Type-K and Type-N thermocouples, a High Temperature Irradiation Resistant ThermoCouple (HTIR-TC) was developed at the HTTL that contains a commercially available doped molybdenum thermoelement paired with a niobium alloy thermoelement. Long duration high temperature tests, in furnaces and in the ATR and other MTRs, demonstrate that the HTIR-TC is accurate up to 1800 C and insensitive to thermal neutron interactions. Thus, degradation observed at temperatures above 1100 C with Type K and N thermocouples and decalibration due to transmutation with tungsten

  14. Advancing the Theory of Nuclear Reactions with Rare Isotopes. From the Laboratory to the Cosmos

    SciTech Connect

    Nunes, Filomena

    2015-06-01

    The mission of the Topical Collaboration on the Theory of Reactions for Unstable iSotopes (TORUS) was to develop new methods to advance nuclear reaction theory for unstable isotopes, particularly the (d,p) reaction in which a deuteron, composed of a proton and a neutron, transfers its neutron to an unstable nucleus. After benchmarking the state-of-the-art theories, the TORUS collaboration found that there were no exact methods to study (d,p) reactions involving heavy targets; the difficulty arises from the long-range nature of the well-known, yet subtle, Coulomb force. To overcome this challenge, the TORUS collaboration developed a new theory in which the complexity of treating the long-range Coulomb interaction is shifted to the calculation of so-called form factors. An efficient implementation for the computation of these form factors was a major achievement of the TORUS collaboration. All of the newly developed machinery provides essential ingredients for analysing (d,p) reactions involving heavy nuclei relevant to astrophysics, energy production, and stockpile stewardship.

  15. Projected role of advanced computational aerodynamic methods at the Lockheed-Georgia company

    NASA Technical Reports Server (NTRS)

    Lores, M. E.

    1978-01-01

    Experience with advanced computational methods being used at the Lockheed-Georgia Company to aid in the evaluation and design of new and modified aircraft indicates that large and specialized computers will be needed to make advanced three-dimensional viscous aerodynamic computations practical. The Numerical Aerodynamic Simulation Facility should be used to provide a tool for designing better aerospace vehicles while at the same time reducing development costs by performing computations using Navier-Stokes equations solution algorithms and permitting less sophisticated but nevertheless complex calculations to be made efficiently. Configuration definition procedures and data output formats can probably best be defined in cooperation with industry, therefore, the computer should handle many remote terminals efficiently. The capability of transferring data to and from other computers needs to be provided. Because of the significant amount of input and output associated with 3-D viscous flow calculations and because of the exceedingly fast computation speed envisioned for the computer, special attention should be paid to providing rapid, diversified, and efficient input and output.

  16. Characterization of Aerodynamic Interactions with the Mars Science Laboratory Reaction Control System Using Computation and Experiment

    NASA Technical Reports Server (NTRS)

    Schoenenberger, Mark; VanNorman, John; Rhode, Matthew; Paulson, John

    2013-01-01

    On August 5, 2012, the Mars Science Laboratory (MSL) entry capsule successfully entered Mars' atmosphere and landed the Curiosity rover in Gale Crater. The capsule used a reaction control system (RCS) consisting of four pairs of hydrazine thrusters to fly a guided entry. The RCS provided bank control to fly along a flight path commanded by an onboard computer and also damped unwanted rates due to atmospheric disturbances and any dynamic instabilities of the capsule. A preliminary assessment of the MSL's flight data from entry showed that the capsule flew much as predicted. This paper will describe how the MSL aerodynamics team used engineering analyses, computational codes and wind tunnel testing in concert to develop the RCS system and certify it for flight. Over the course of MSL's development, the RCS configuration underwent a number of design iterations to accommodate mechanical constraints, aeroheating concerns and excessive aero/RCS interactions. A brief overview of the MSL RCS configuration design evolution is provided. Then, a brief description is presented of how the computational predictions of RCS jet interactions were validated. The primary work to certify that the RCS interactions were acceptable for flight was centered on validating computational predictions at hypersonic speeds. A comparison of computational fluid dynamics (CFD) predictions to wind tunnel force and moment data gathered in the NASA Langley 31-Inch Mach 10 Tunnel was the linchpin in validating the CFD codes used to predict aero/RCS interactions. Using the CFD predictions and experimental data, an interaction model was developed for Monte Carlo analyses using 6-degree-of-freedom trajectory simulation. The interaction model used in the flight simulation is presented.

  17. Phase 1 environmental report for the Advanced Neutron Source at Oak Ridge National Laboratory

    SciTech Connect

    Blasing, T.J.; Brown, R.A.; Cada, G.F.; Easterly, C.; Feldman, D.L.; Hagan, C.W.; Harrington, R.M.; Johnson, R.O.; Ketelle, R.H.; Kroodsma, R.L.; McCold, L.N.; Reich, W.J.; Scofield, P.A.; Socolof, M.L.; Taleyarkhan, R.P.; Van Dyke, J.W.

    1992-02-01

    The US Department of Energy (DOE) has proposed the construction and operation of the Advanced Neutron Source (ANS), a 330-MW(t) reactor, at Oak Ridge National Laboratory (ORNL) to support neutron scattering and nuclear physics experiments. ANS would provide a steady-state source of neutrons that are thermalized to produce sources of hot, cold, and very cold neutrons. The use of these neutrons in ANS experiment facilities would be an essential component of national research efforts in basic materials science. Additionally, ANS capabilities would include production of transplutonium isotopes, irradiation of potential fusion and fission reactor materials, activation analysis, and production of medical and industrial isotopes such as 252Cf. Although ANS would not require licensing by the US Nuclear Regulatory Commission (NRC), DOE regards the design, construction, and operation of ANS as activities that would produce a licensable facility; that is, DOE is following the regulatory guidelines that NRC would apply if NRC were licensing the facility. Those guidelines include instructions for the preparation of an environmental report (ER), a compilation of available data and preliminary analyses regarding the environmental impacts of nuclear facility construction and operation. The ER, described and outlined in NRC Regulatory Guide 4.2, serves as a background document to facilitate the preparation of environmental impact statements (EISs). Using Regulatory Guide 4.2 as a model, this ANS ER provides analyses and information specific to the ANS site and area that can be adopted (and modified, if necessary) for the ANS EIS. The ER is being prepared in two phases. Phase 1 ER includes many of the data and analyses needed to prepare the EIS but does not include data or analyses of alternate sites or alternate technologies. Phase 2 ER will include the additional data and analyses stipulated by Regulatory Guide 4.2.

  18. Saturday Academy of Computing and Mathematics (SACAM) at the Oak Ridge National Laboratory

    SciTech Connect

    Clark, D.N.

    1991-01-01

    To be part of the impending Information Age, our students and teachers must be trained in the use of computers, logic, and mathematics. The Saturday Academy of Computing and Mathematics (SACAM) represents one facet of Oak Ridge National Laboratory's (ORNL) response to meet the challenge. SACAM attempts to provide the area's best high school students with a creative program that illustrates how researchers are using computing and mathematics tools to help solve nationally recognized problems in virtually all scientific fields. Each SACAM program is designed as eight 3-hour sessions. Each session outlines a current scientific question or research area. Sessions are presented on a Saturday morning by a speaker team of two to four ORNL scientists (mentors) working in that particular field. Approximately four students and one teacher from each of ten area high schools attend the eight sessions. Session topics cover diverse problems such as climate modeling, cryptography and cryptology, high-energy physics, human genome sequencing, and even the use of probability in locating people lost in a national forest. Evaluations from students, teachers, and speakers indicate that the program has been well received, and a tracking program is being undertaken to determine long-range benefits. An analysis of the program's successes and lessons learned is presented, as well as the resources required for the program.

  19. Climate systems modeling on massively parallel processing computers at Lawrence Livermore National Laboratory

    SciTech Connect

    Wehner, W.F.; Mirin, A.A.; Bolstad, J.H.

    1996-09-01

    A comprehensive climate system model is under development at Lawrence Livermore National Laboratory. The basis for this model is a consistent coupling of multiple complex subsystem models, each describing a major component of the Earth's climate. Among these are general circulation models of the atmosphere and ocean, a dynamic and thermodynamic sea ice model, and models of the chemical processes occurring in the air, sea water, and near-surface land. The computational resources necessary to carry out simulations at adequate spatial resolutions for durations of climatic time scales exceed those currently available. Distributed memory massively parallel processing (MPP) computers promise to affordably scale to the computational rates required by directing large numbers of relatively inexpensive processors onto a single problem. We have developed a suite of routines designed to exploit current generation MPP architectures via domain and functional decomposition strategies. These message passing techniques have been implemented in each of the component models and in their coupling interfaces. Production runs of the atmospheric and oceanic components performed on the National Environmental Supercomputing Center (NESC) Cray T3D are described.
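
    A minimal sketch of the domain-decomposition message passing described above, written as a 1-D halo exchange with mpi4py; it illustrates the technique only and is not LLNL's production code:

```python
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

# Each rank owns a slab of the global grid plus one ghost cell per side.
local = np.zeros(102)                # 100 interior cells + 2 ghost cells
left = rank - 1 if rank > 0 else MPI.PROC_NULL
right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

# Before each stencil update, trade edge cells with both neighbors.
comm.Sendrecv(sendbuf=local[1:2], dest=left, recvbuf=local[-1:], source=right)
comm.Sendrecv(sendbuf=local[-2:-1], dest=right, recvbuf=local[0:1], source=left)
# ...finite-difference update of local[1:-1] goes here...
```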

  20. Fossil energy: From laboratory to marketplace. Part 2, The role of advanced research

    SciTech Connect

    Not Available

    1992-03-01

    The purpose of this work is to provide a summary description of the role of advanced research in the overall Fossil Energy R&D program successes. It presents the specific Fossil Energy advanced research products that have been adopted commercially or fed into other R&D programs as part of the crosscutting enabling technology base upon which advanced systems are based.

  1. Volumes to learn: advancing therapeutics with innovative computed tomography image data analysis.

    PubMed

    Maitland, Michael L

    2010-09-15

    Semi-automated methods for calculating tumor volumes from computed tomography images are a new tool for advancing the development of cancer therapeutics. Volumetric measurements, relying on already widely available standard clinical imaging techniques, could shorten the observation intervals needed to identify cohorts of patients sensitive or resistant to treatment.
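
    Once a segmentation exists, the core volumetric measurement is simple: count the voxels inside the tumor mask and multiply by the voxel volume. A minimal sketch (the function name and units are ours):

```python
import numpy as np

def tumor_volume_ml(mask, spacing_mm):
    """Volume of a segmented tumor: voxel count times voxel volume.
    mask: boolean 3-D array from a (semi-)automated segmentation.
    spacing_mm: (dz, dy, dx) voxel dimensions from the CT header."""
    voxel_mm3 = float(np.prod(spacing_mm))
    return mask.sum() * voxel_mm3 / 1000.0   # mm^3 -> mL
```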

  2. Next Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational, and Systems Biology (Final Report)

    EPA Science Inventory

    EPA announced the release of the final report, Next Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational, and Systems Biology. This report describes new approaches that are faster, less resource intensive, and more robust, and that can help ...

  3. Response to House Joint Resolution No. 118 [To Advance Computer-Assisted Instruction].

    ERIC Educational Resources Information Center

    Virginia State General Assembly, Richmond.

    This response by the Virginia Department of Education to House Joint Resolution No. 118 of the General Assembly of Virginia, which requested the Department of Education to study initiatives to advance computer-assisted instruction, is based on input from state and national task forces and on a 1986 survey of 80 Virginia school divisions. The…

  4. PARTNERING WITH DOE TO APPLY ADVANCED BIOLOGICAL, ENVIRONMENTAL, AND COMPUTATIONAL SCIENCE TO ENVIRONMENTAL ISSUES

    EPA Science Inventory

    On February 18, 2004, the U.S. Environmental Protection Agency and Department of Energy signed a Memorandum of Understanding to expand the research collaboration of both agencies to advance biological, environmental, and computational sciences for protecting human health and the ...

  5. Computers-for-edu: An Advanced Business Application Programming (ABAP) Teaching Case

    ERIC Educational Resources Information Center

    Boyle, Todd A.

    2007-01-01

    The "Computers-for-edu" case is designed to provide students with hands-on exposure to creating Advanced Business Application Programming (ABAP) reports and dialogue programs, as well as navigating various mySAP Enterprise Resource Planning (ERP) transactions needed by ABAP developers. The case requires students to apply a wide variety…

  6. Scientific Discovery through Advanced Computing (SciDAC-3) Partnership Project Annual Report

    SciTech Connect

    Hoffman, Forest M.; Bochev, Pavel B.; Cameron-Smith, Philip J.; Easter, Richard C.; Elliott, Scott M.; Ghan, Steven J.; Liu, Xiaohong; Lowrie, Robert B.; Lucas, Donald D.; Ma, Po-lun; Sacks, William J.; Shrivastava, Manish; Singh, Balwinder; Tautges, Timothy J.; Taylor, Mark A.; Vertenstein, Mariana; Worley, Patrick H.

    2014-01-15

    The Applying Computationally Efficient Schemes for BioGeochemical Cycles (ACES4BGC) project is advancing the predictive capabilities of Earth System Models (ESMs) by reducing two of the largest sources of uncertainty, aerosols and biospheric feedbacks, with a highly efficient computational approach. In particular, this project is implementing and optimizing new computationally efficient tracer advection algorithms for large numbers of tracer species; adding important biogeochemical interactions between the atmosphere, land, and ocean models; and applying uncertainty quantification (UQ) techniques to constrain process parameters and evaluate uncertainties in feedbacks between biogeochemical cycles and the climate system.

  7. Research in Computational Aeroscience Applications Implemented on Advanced Parallel Computing Systems

    NASA Technical Reports Server (NTRS)

    Wigton, Larry

    1996-01-01

    Improving the numerical linear algebra routines for use in new Navier-Stokes codes, specifically Tim Barth's unstructured grid code, with spin-offs to TRANAIR is reported. A fast distance calculation routine for Navier-Stokes codes using the new one-equation turbulence models is written. The primary focus of this work was devoted to improving matrix-iterative methods. New algorithms have been developed which activate the full potential of classical Cray-class computers as well as distributed-memory parallel computers.

  8. Impact of computer advances on future finite elements computations. [for aircraft and spacecraft design

    NASA Technical Reports Server (NTRS)

    Fulton, Robert E.

    1985-01-01

    Research performed over the past 10 years in engineering data base management and parallel computing is discussed, and certain opportunities for research toward the next generation of structural analysis capability are proposed. Particular attention is given to data base management associated with the IPAD project and parallel processing associated with the Finite Element Machine project, both sponsored by NASA, and a near term strategy for a distributed structural analysis capability based on relational data base management software and parallel computers for a future structural analysis system.

  9. A first attempt to bring computational biology into advanced high school biology classrooms.

    PubMed

    Gallagher, Suzanne Renick; Coon, William; Donley, Kristin; Scott, Abby; Goldberg, Debra S

    2011-10-01

    Computer science has become ubiquitous in many areas of biological research, yet most high school and even college students are unaware of this. As a result, many college biology majors graduate without adequate computational skills for contemporary fields of biology. The absence of a computational element in secondary school biology classrooms is of growing concern to the computational biology community and biology teachers who would like to acquaint their students with updated approaches in the discipline. We present a first attempt to correct this absence by introducing a computational biology element to teach genetic evolution into advanced biology classes in two local high schools. Our primary goal was to show students how computation is used in biology and why a basic understanding of computation is necessary for research in many fields of biology. This curriculum is intended to be taught by a computational biologist who has worked with a high school advanced biology teacher to adapt the unit for his/her classroom, but a motivated high school teacher comfortable with mathematics and computing may be able to teach this alone. In this paper, we present our curriculum, which takes into consideration the constraints of the required curriculum, and discuss our experiences teaching it. We describe the successes and challenges we encountered while bringing this unit to high school students, discuss how we addressed these challenges, and make suggestions for future versions of this curriculum. We believe that our curriculum can be a valuable seed for further development of computational activities aimed at high school biology students. Further, our experiences may be of value to others teaching computational biology at this level. Our curriculum can be obtained at http://ecsite.cs.colorado.edu/?page_id=149#biology or by contacting the authors.

  10. Building an advanced climate model: Program plan for the CHAMMP (Computer Hardware, Advanced Mathematics, and Model Physics) Climate Modeling Program

    SciTech Connect

    Not Available

    1990-12-01

    The issue of global warming and related climatic changes from increasing concentrations of greenhouse gases in the atmosphere has received prominent attention during the past few years. The Computer Hardware, Advanced Mathematics, and Model Physics (CHAMMP) Climate Modeling Program is designed to contribute directly to this rapid improvement. The goal of the CHAMMP Climate Modeling Program is to develop, verify, and apply a new generation of climate models within a coordinated framework that incorporates the best available scientific and numerical approaches to represent physical, biogeochemical, and ecological processes, that fully utilizes the hardware and software capabilities of new computer architectures, that probes the limits of climate predictability, and finally that can be used to address the challenging problem of understanding the greenhouse climate issue through the ability of the models to simulate time-dependent climatic changes over extended times and with regional resolution.

  11. FY05-FY06 Advanced Simulation and Computing Implementation Plan, Volume 2

    SciTech Connect

    Baron, A L

    2004-07-19

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program will require the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapon design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile life extension programs and the resolution of significant finding investigations (SFIs). This requires a balanced system of technical staff, hardware, simulation software, and computer science solutions.

  12. Advanced Optical Diagnostics for Ice Crystal Cloud Measurements in the NASA Glenn Propulsion Systems Laboratory

    NASA Technical Reports Server (NTRS)

    Bencic, Timothy J.; Fagan, Amy; Van Zante, Judith F.; Kirkegaard, Jonathan P.; Rohler, David P.; Maniyedath, Arjun; Izen, Steven H.

    2013-01-01

    A light extinction tomography technique has been developed to monitor ice water clouds upstream of a direct connected engine in the Propulsion Systems Laboratory (PSL) at NASA Glenn Research Center (GRC). The system consists of 60 laser diodes with sheet generating optics and 120 detectors mounted around a 36-inch diameter ring. The sources are pulsed sequentially while the detectors acquire line-of-sight extinction data for each laser pulse. Using computed tomography algorithms, the extinction data are analyzed to produce a plot of the relative water content in the measurement plane. To target the low-spatial-frequency nature of ice water clouds, unique tomography algorithms were developed using filtered back-projection methods and direct inversion methods that use Gaussian basis functions. With the availability of a priori knowledge of the mean droplet size and the total water content at some point in the measurement plane, the tomography system can provide near real-time in-situ quantitative full-field total water content data at a measurement plane approximately 5 feet upstream of the engine inlet. Results from ice crystal clouds in the PSL are presented. In addition to the optical tomography technique, laser sheet imaging has also been applied in the PSL to provide planar ice cloud uniformity and relative water content data during facility calibration before the tomography system was available and also as validation data for the tomography system. A comparison of the resulting data from the laser sheet system and the light extinction tomography is also presented. Very good agreement of imaged intensity and water content is demonstrated for both techniques. Also, comparative studies between the two techniques show excellent agreement in the calculation of bulk total water content averaged over the center of the pipe.
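
    A minimal sketch of the filtered back-projection step named above, using a synthetic low-spatial-frequency "cloud" in place of PSL data (scikit-image supplies the radon/iradon pair used here; the geometry is simplified from the 60-source ring):

```python
import numpy as np
from skimage.transform import radon, iradon

# Synthetic smooth blob standing in for relative water content in the duct.
xx, yy = np.meshgrid(np.linspace(-1, 1, 128), np.linspace(-1, 1, 128))
cloud = np.exp(-((xx - 0.2) ** 2 + yy ** 2) / 0.1)

# Line-of-sight extinction data form a sinogram; 60 angles mimic the
# 60-source ring described above.
angles = np.linspace(0.0, 180.0, 60, endpoint=False)
sinogram = radon(cloud, theta=angles, circle=False)

# Filtered back-projection recovers the low-frequency water-content field.
recon = iradon(sinogram, theta=angles, circle=False)
```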

  13. The use of computer-aided learning in chemistry laboratory instruction

    NASA Astrophysics Data System (ADS)

    Allred, Brian Robert Tracy

    This research involves developing and implementing computer software for chemistry laboratory instruction. The specific goal is to design the software and investigate whether it can be used to introduce concepts and laboratory procedures without a lecture format. This would allow students to conduct an experiment even though they may not have been introduced to the chemical concept in their lecture course. This would also allow for another type of interaction for those students who respond more positively to a visual approach to instruction. The first module developed was devoted to using computer software to help introduce students to the concepts related to thin-layer chromatography and setting up and running an experiment. This was achieved through the use of digitized pictures and digitized video clips along with written information. A review quiz was used to help reinforce the learned information. The second module was devoted to the concept of the "dry lab". This module presented students with relevant information regarding the chemical concepts and then showed them the outcome of mixing solutions. By these observations, they were to determine the composition of unknown solutions based on provided descriptions and comparison with their written observations. The third piece of the software designed was a computer game. This program followed the first two modules in providing information the students were to learn. The difference here, though, was incorporating a game scenario for students to use to help reinforce the learning. Students were then assessed to see how much information they retained after playing the game. In each of the three cases, a control group exposed to the traditional lecture format was used. Their results were compared to the experimental group using the computer modules. Based upon the findings, it can be concluded that using technology to aid in the instructional process is definitely of benefit and students were more successful in

  14. A Computational Method for 3D Anisotropic Travel-time Tomography of Rocks in the Laboratory

    NASA Astrophysics Data System (ADS)

    Ghofranitabari, Mehdi; Young, R. Paul

    2013-04-01

    True triaxial loading in the laboratory applies three principal stresses to a cubic rock specimen. Elliptical anisotropy and distributed heterogeneities are introduced in the rock due to the closure and opening of pre-existing cracks and the creation and growth of new aligned cracks. The rock sample is tested in a Geophysical Imaging Cell equipped with an Acoustic Emission monitoring system, which can perform transducer-to-transducer velocity surveys to image the velocity structure of the sample during the experiment. Ultrasonic travel-time tomography, as a non-destructive method, yields a map of wave propagation velocity in the sample in order to detect uniformly distributed or localised heterogeneities and provide the spatial variation and temporal evolution of induced damage in rocks at various stages of loading. The rock sample is partitioned into cubic grid cells as the model space. A ray-based tomography method measuring body-wave travel times along ray paths between pairs of emitting and receiving transducers is used to calculate isotropic ray-path segment matrix elements (Gij), which contain the segment lengths of the ith ray in the jth cell in three dimensions. Synthetic P-wave travel times are computed between pairs of transducers in a hypothetical isotropic heterogeneous cubic sample as the data space, along with an error term reflecting measurement precision. The 3D strain of the squeezed rock and the consequent geometrical deformation are also included in the computations for further accuracy. The Singular Value Decomposition method is used for the inversion from data space to model space. In the next step, the anisotropic ray-path segment matrix and the corresponding data space are computed for hypothetical anisotropic heterogeneous samples, based on the elliptical anisotropic velocity model obtained from the real laboratory experimental data. The method is examined for several different synthetic heterogeneous models. An "Inaccuracy factor" is utilized to inquire the
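
    A toy version of the inversion described above (two cells, three rays; the segment lengths and travel times are invented for illustration) shows how the SVD maps data space to model space:

```python
import numpy as np

# t = G @ s: travel times from ray-path segment lengths G (m) and
# cell slownesses s (s/m).
G = np.array([[1.0, 0.0],
              [0.0, 1.4],
              [0.7, 0.7]])                   # i-th ray's length in j-th cell
t_obs = np.array([2.5e-4, 3.3e-4, 2.9e-4])   # measured travel times (s)

# SVD-based pseudoinverse solves the least-squares problem; the singular
# values expose any poorly constrained cells.
U, sv, Vt = np.linalg.svd(G, full_matrices=False)
s_est = Vt.T @ np.diag(1.0 / sv) @ U.T @ t_obs
velocity = 1.0 / s_est                       # per-cell P-wave velocity (m/s)
```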

  15. The digital computer as a metaphor for the perfect laboratory experiment: Loophole-free Bell experiments

    NASA Astrophysics Data System (ADS)

    De Raedt, Hans; Michielsen, Kristel; Hess, Karl

    2016-12-01

    Using Einstein-Podolsky-Rosen-Bohm experiments as an example, we demonstrate that the combination of a digital computer and algorithms, as a metaphor for a perfect laboratory experiment, provides solutions to problems of the foundations of physics. Employing discrete-event simulation, we present a counterexample to John Bell's remarkable "proof" that any theory of physics, which is both Einstein-local and "realistic" (counterfactually definite), results in a strong upper bound to the correlations that are being measured in Einstein-Podolsky-Rosen-Bohm experiments. Our counterexample, which is free of the so-called detection-, coincidence-, memory-, and contextuality loophole, violates this upper bound and fully agrees with the predictions of quantum theory for Einstein-Podolsky-Rosen-Bohm experiments.

  16. Savannah River Laboratory DOSTOMAN code: a compartmental pathways computer model of contaminant transport

    SciTech Connect

    King, C M; Wilhite, E L; Root, Jr, R W; Fauth, D J; Routt, K R; Emslie, R H; Beckmeyer, R R; Fjeld, R A; Hutto, G A; Vandeven, J A

    1985-01-01

    The Savannah River Laboratory DOSTOMAN code has been used since 1978 for environmental pathway analysis of potential migration of radionuclides and hazardous chemicals. The DOSTOMAN work is reviewed including a summary of historical use of compartmental models, the mathematical basis for the DOSTOMAN code, examples of exact analytical solutions for simple matrices, methods for numerical solution of complex matrices, and mathematical validation/calibration of the SRL code. The review includes the methodology for application to nuclear and hazardous chemical waste disposal, examples of use of the model in contaminant transport and pathway analysis, a user's guide for computer implementation, peer review of the code, and use of DOSTOMAN at other Department of Energy sites. 22 refs., 3 figs.
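
    A compartmental pathways model of this kind is a linear system dC/dt = K C; for a constant-coefficient matrix it has the exact solution C(t) = exp(Kt) C(0), matching the "exact analytical solutions for simple matrices" mentioned above. A minimal sketch with invented rate constants (not DOSTOMAN values):

```python
import numpy as np
from scipy.linalg import expm

# Three compartments in series (e.g., soil -> groundwater -> sediment),
# with illustrative first-order transfer rates (1/yr).
k12, k23, k3out = 0.05, 0.02, 0.01
K = np.array([[-k12,  0.0,    0.0],
              [ k12, -k23,    0.0],
              [ 0.0,  k23, -k3out]])
C0 = np.array([1.0, 0.0, 0.0])        # initial inventory in compartment 1

C_50yr = expm(K * 50.0) @ C0          # exact solution C(t) = expm(K t) @ C0
```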

  17. ELAS - A geobased information system that is transferable to several computers. [Earth resources Laboratory Applications Software

    NASA Technical Reports Server (NTRS)

    Whitley, S. L.; Pearson, R. W.; Seyfarth, B. R.; Graham, M. H.

    1981-01-01

    In the early years of remote sensing, emphasis was placed on the processing and analysis of data from a single multispectral sensor, such as the Landsat Multispectral Scanner System (MSS). However, in connection with attempts to use the data for resource management, it was realized that many deficiencies existed in single data sets. A need was established to geographically reference the MSS data and to register with it data from disparate sources. Technological transfer activities have required systems concepts that can be easily transferred to computers of different types in other organizations. ELAS (Earth Resources Laboratory Applications Software), a geographically based information system, was developed to meet the considered needs. ELAS accepts data from a variety of sources. It contains programs to geographically reference the data to the Universal Transverse Mercator grid. One of the primary functions of ELAS is to produce a surface cover map.

  18. Computer Network Availability at Sandia National Laboratories, Albuquerque NM: Measurement and Perception

    SciTech Connect

    NELSON, SPENCER D.; TOLENDINO, LAWRENCE F.

    1999-11-01

    The desire to provide a measure of computer network availability at Sandia National Laboratories has existed for a long time. Several attempts were made to build this measure by accurately recording network failures, identifying the type of network element involved, the root cause of the problem, and the time to repair the fault. Recognizing the limitations of available methods, it became obvious that another approach to determining network availability had to be defined. The chosen concept involved the periodic sampling of network services and applications from various network locations. A measure of "network" availability was then calculated based on the ratio of polling successes to failures. The effort required to gather the information and produce a useful metric is not prohibitive, and the information gained has verified long-held impressions regarding network performance with real data.
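
    The sampling idea reduces to a simple ratio: poll a service periodically and divide successful polls by total polls. A minimal sketch (the target host and poll count are placeholders, not Sandia's actual monitoring targets):

```python
import socket

def poll(host, port, timeout=2.0):
    """One availability sample: can a TCP connection to the service be opened?"""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

samples = [poll("www.example.com", 80) for _ in range(10)]  # placeholder target
availability = sum(samples) / len(samples)   # fraction of successful polls
```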

  19. Applied x-ray computed tomography with high resolution in paleontology using laboratory and synchrotron sources

    NASA Astrophysics Data System (ADS)

    Bidola, Pidassa; Pacheco, Mirian L. A. F.; Stockmar, Marco K.; Achterhold, Klaus; Pfeiffer, Franz; Beckmann, Felix; Tafforeau, Paul; Herzen, Julia

    2014-09-01

    X-ray computed tomography (CT) has become an established technique in biomedical imaging and materials science research. Its ability to non-destructively provide high-resolution images of samples makes it attractive for diverse fields of research, especially paleontology. Notably, the Precambrian is a geological interval whose rock deposits contain several fossilized early animals, which still need to be investigated in order to understand the origin and evolution of early life. Corumbella werneri is one of those fossils, skeletonized in Corumbá (Brazil). Here, we present a study on selected specimens of Corumbella werneri using absorption-based contrast imaging at diverse tomographic setups. We investigated the potential of a conventional laboratory-based device and synchrotron radiation sources to visualize internal structures of the fossils. The obtained results are discussed, as well as the limitations encountered with those setups.

  20. Pencil-and-Paper Neural Networks: An Undergraduate Laboratory Exercise in Computational Neuroscience.

    PubMed

    Crisp, Kevin M; Sutter, Ellen N; Westerberg, Jacob A

    2015-01-01

    Although it has been more than 70 years since McCulloch and Pitts published their seminal work on artificial neural networks, such models remain primarily in the domain of computer science departments in undergraduate education. This is unfortunate, as simple network models offer undergraduate students a much-needed bridge between cellular neurobiology and processes governing thought and behavior. Here, we present a very simple laboratory exercise in which students constructed, trained and tested artificial neural networks by hand on paper. They explored a variety of concepts, including pattern recognition, pattern completion, noise elimination and stimulus ambiguity. Learning gains were evident in changes in the use of language when writing about information processing in the brain.
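
    The same exercise translates directly into a few lines of code: store patterns with the Hebbian rule and let the network complete a corrupted cue. A minimal sketch (the 8-bit patterns are our own illustrative choices, not those used in the class):

```python
import numpy as np

patterns = np.array([[1, -1, 1, -1, 1, -1, 1, -1],
                     [1, 1, 1, 1, -1, -1, -1, -1]])
W = sum(np.outer(p, p) for p in patterns).astype(float)  # Hebbian weights
np.fill_diagonal(W, 0.0)                                 # no self-connections

# Pattern completion / noise elimination from a corrupted cue.
state = np.array([1, -1, 1, -1, -1, -1, 1, -1])  # pattern 0 with one bit flipped
for _ in range(5):
    state = np.sign(W @ state)                   # settles on the stored pattern

assert np.array_equal(state, patterns[0])
```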

  1. Communication and computing technology in biocontainment laboratories using the NEIDL as a model.

    PubMed

    McCall, John; Hardcastle, Kath

    2014-07-01

    The National Emerging Infectious Diseases Laboratories (NEIDL), Boston University, is a globally unique biocontainment research facility housing biosafety level 2 (BSL-2), BSL-3, and BSL-4 laboratories. Located in the BioSquare area at the University's Medical Campus, it is part of a national network of secure facilities constructed to study infectious diseases of major public health concern. The NEIDL allows for basic, translational, and clinical phases of research to be carried out in a single facility with the overall goal of accelerating understanding, treatment, and prevention of infectious diseases. The NEIDL will also act as a center of excellence providing training and education in all aspects of biocontainment research. Within every detail of NEIDL operations is a primary emphasis on safety and security. The ultramodern NEIDL has required a new approach to communications technology solutions in order to ensure safety and security and meet the needs of investigators working in this complex building. This article discusses the implementation of secure wireless networks and private cloud computing to promote operational efficiency, biosecurity, and biosafety with additional energy-saving advantages. The utilization of a dedicated data center, virtualized servers, virtualized desktop integration, multichannel secure wireless networks, and a NEIDL-dedicated Voice over Internet Protocol (VoIP) network are all discussed.

  2. 2015 Annual Reuse Report for the Idaho National Laboratory Site’s Advanced Test Reactor Complex Cold Waste Ponds

    SciTech Connect

    Lewis, Michael George

    2016-02-01

    This report describes conditions and information, as required by the state of Idaho, Department of Environmental Quality Reuse Permit I-161-02, for the Advanced Test Reactor Complex Cold Waste Ponds located at Idaho National Laboratory from November 1, 2014–October 31, 2015. The effective date of Reuse Permit I-161-02 is November 20, 2014 with an expiration date of November 19, 2019.

  3. Naval Research Laboratory's programs in advanced indium phosphide solar cell development

    NASA Technical Reports Server (NTRS)

    Summers, Geoffrey P.

    1995-01-01

    The Naval Research Laboratory has been involved in developing InP solar cell technology since 1988. The purpose of these programs was to produce advanced cells for use in very high radiation environments, either as a result of operating satellites in the Van Allen belts or for very long duration missions in other orbits. Richard Statler was technical representative on the first program, with Spire Corporation as the contractor, which eventually produced several hundred, high efficiency 2 x 2 sq cm single crystal InP cells. The shallow homojunction technology which was developed in this program enabled cells to be made with AM0, one sun efficiencies greater than 19%. Many of these cells have been flown on space experiments, including PASP Plus, which have confirmed the high radiation resistance of InP cells. NRL has also published widely on the radiation response of these cells and also on radiation-induced defect levels detected by DLTS, especially the work of Rob Walters and Scott Messenger. In 1990 NRL began another Navy-sponsored program with Tim Coutts and Mark Wanlass at the National Renewable Energy Laboratory (NREL), to develop a one sun, two terminal space version of the InP-InGaAs tandem junction cell being investigated at NREL for terrestrial applications. These cells were grown on InP substrates. Several cells with AM0, one sun efficiencies greater than 22% were produced. Two 2 x 2 sq cm cells were incorporated on the STRV lA/B solar cell experiment. These were the only two junction, tandem cells on the STRV experiment. The high cost and relative brittleness of InP wafers meant that if InP cell technology were to become a viable space power source, the superior radiation resistance of InP would have to be combined with a cheaper and more robust substrate. The main technical challenge was to overcome the effect of the dislocations produced by the lattice mismatch at the interface of the two materials. Over the last few years, NRL and Steve Wojtczuk at

  4. Determining the hydraulic properties of saturated, low-permeability geological materials in the laboratory: Advances in theory and practice

    USGS Publications Warehouse

    Zhang, M.; Takahashi, M.; Morin, R.H.; Endo, H.; Esaki, T.

    2002-01-01

    The accurate hydraulic characterization of low-permeability subsurface environments has important practical significance. In order to examine this issue from the perspective of laboratory-based approaches, we review some recent advancements in the theoretical analyses of three different laboratory techniques specifically applied to low-permeability geologic materials: constant-head, constant flow-rate and transient-pulse permeability tests. Some potential strategies for effectively decreasing the time required to confidently estimate the permeability of these materials are presented. In addition, a new and versatile laboratory system is introduced that can implement any of these three test methods while simultaneously subjecting a specimen to high confining pressures and pore pressures, thereby simulating in situ conditions at great depths. The capabilities and advantages of this innovative system are demonstrated using experimental data derived from Shirahama sandstone and Inada granite, two rock types widely encountered in Japan.
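
    For orientation, the constant-head test reduces to Darcy's law; a minimal sketch with illustrative numbers in the low-permeability range the paper targets:

```python
def permeability_constant_head(Q, mu, L, A, dP):
    """Darcy's law rearranged for a constant-head test: k = Q*mu*L / (A*dP), in m^2."""
    return Q * mu * L / (A * dP)

# Example: 1e-12 m^3/s of water (mu ~ 1e-3 Pa*s) through a 0.05 m long,
# 5e-4 m^2 specimen under a 1 MPa differential -> k = 1e-19 m^2.
k = permeability_constant_head(1e-12, 1e-3, 0.05, 5e-4, 1e6)
```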

  5. Designing Communication Protocols for a Computer-Mediated Laboratory for Problem-Based Learning

    PubMed Central

    Koschmann, T.D.; Feltovich, P.J.; Myers, A.; Barrows, H.S.

    1990-01-01

    This paper describes some of the design criteria for a facility to support problem-based tutorials, known as the Computer-Mediated Tutorial Laboratory (CMTL). In the CMTL, a networked workstation will be provided for the tutor and each of the students. The tutor's workstation will be connected to a projection system, permitting the entire group to view the tutor's screen. The software used in the CMTL will have three components: a Patient Simulation Stack (PSS), the group/student Tutorial Stacks (TSs) and the network communication interface. The PSS represents a clinical problem; it is designed to realistically simulate an encounter with an actual patient. The TSs serve as a personalized record of what transpired in the tutorial session. Each student will maintain a private TS and the tutor will maintain a shared TS, viewable by all members of the group. The network communication interface will permit the participants in the tutorial to direct electronic messages to each other. The communication interface has two components: the client software available on the student's and tutor's workstation and the server software. The CMTL client software consists of two applications: one for sending messages and one for viewing the stream of incoming messages. Research is planned to investigate the effects of computer-mediation on the tutorial process.

  6. The Advancement in Using Remote Laboratories in Electrical Engineering Education: A Review

    ERIC Educational Resources Information Center

    Almarshoud, A. F.

    2011-01-01

    The rapid development in Internet technology and its big popularity has led some universities around the world to incorporate web-based learning in some of their programmes. The present paper introduces a comprehensive survey of the publications about using remote laboratories in electrical engineering education. Remote laboratories are web-based,…

  7. ENHANCING THE ATOMIC-LEVEL UNDERSTANDING OF CO2 MINERAL SEQUESTRATION MECHANISMS VIA ADVANCED COMPUTATIONAL MODELING

    SciTech Connect

    A.V.G. Chizmeshya; M.J. McKelvy; G.H. Wolf; R.W. Carpenter; D.A. Gormley; J.R. Diefenbacher; R. Marzke

    2006-03-01

    significantly improved our understanding of mineral carbonation. Group members at the Albany Research Center have recently shown that carbonation of olivine and serpentine, which naturally occurs over geological time (i.e., 100,000s of years), can be accelerated to near completion in hours. Further process refinement will require a synergetic science/engineering approach that emphasizes simultaneous investigation of both thermodynamic processes and the detailed microscopic, atomic-level mechanisms that govern carbonation kinetics. Our previously funded Phase I Innovative Concepts project demonstrated the value of advanced quantum-mechanical modeling as a complementary tool in bridging important gaps in our understanding of the atomic/molecular structure and reaction mechanisms that govern CO2 mineral sequestration reaction processes for the model Mg-rich lamellar hydroxide feedstock material Mg(OH)2. In the present simulation project, improved techniques and more efficient computational schemes have allowed us to expand and augment these capabilities and explore more complex Mg-rich, lamellar hydroxide-based feedstock materials, including the serpentine-based minerals. These feedstock materials are being actively investigated due to their wide availability, and low-cost CO2 mineral sequestration potential. Cutting-edge first principles quantum chemical, computational solid-state and materials simulation methodology studies proposed herein, have been strategically integrated with our new DOE supported (ASU-Argonne National Laboratory) project to investigate the mechanisms that govern mineral feedstock heat-treatment and aqueous/fluid-phase serpentine mineral carbonation in situ. This unified, synergetic theoretical and experimental approach has provided a deeper understanding of the key reaction mechanisms than either individual approach can alone. We used ab initio techniques to significantly advance our understanding of atomic-level processes at the solid/solution interface by

  8. ENHANCING THE ATOMIC-LEVEL UNDERSTANDING OF CO2 MINERAL SEQUESTRATION MECHANISMS VIA ADVANCED COMPUTATIONAL MODELING

    SciTech Connect

    A.V.G. Chizmeshya

    2003-12-19

    /NETL managed National Mineral Sequestration Working Group we have already significantly improved our understanding of mineral carbonation. Group members at the Albany Research Center have recently shown that carbonation of olivine and serpentine, which naturally occurs over geological time (i.e., 100,000s of years), can be accelerated to near completion in hours. Further process refinement will require a synergetic science/engineering approach that emphasizes simultaneous investigation of both thermodynamic processes and the detailed microscopic, atomic-level mechanisms that govern carbonation kinetics. Our previously funded Phase I Innovative Concepts project demonstrated the value of advanced quantum-mechanical modeling as a complementary tool in bridging important gaps in our understanding of the atomic/molecular structure and reaction mechanisms that govern CO2 mineral sequestration reaction processes for the model Mg-rich lamellar hydroxide feedstock material Mg(OH)2. In the present simulation project, improved techniques and more efficient computational schemes have allowed us to expand and augment these capabilities and explore more complex Mg-rich, lamellar hydroxide-based feedstock materials, including the serpentine-based minerals. These feedstock materials are being actively investigated due to their wide availability and low-cost CO2 mineral sequestration potential. Cutting-edge first principles quantum chemical, computational solid-state and materials simulation methodology studies proposed herein have been strategically integrated with our new DOE-supported (ASU-Argonne National Laboratory) project to investigate the mechanisms that govern mineral feedstock heat-treatment and aqueous/fluid-phase serpentine mineral carbonation in situ. This unified, synergetic theoretical and experimental approach will provide a deeper understanding of the key reaction mechanisms than either individual approach can alone. Ab initio techniques will also

  9. ENHANCING THE ATOMIC-LEVEL UNDERSTANDING OF CO2 MINERAL SEQUESTRATION MECHANISMS VIA ADVANCED COMPUTATIONAL MODELING

    SciTech Connect

    A.V.G. Chizmeshya

    2002-12-19

    /NETL managed National Mineral Sequestration Working Group we have already significantly improved our understanding of mineral carbonation. Group members at the Albany Research Center have recently shown that carbonation of olivine and serpentine, which naturally occurs over geological time (i.e., 100,000s of years), can be accelerated to near completion in hours. Further process refinement will require a synergetic science/engineering approach that emphasizes simultaneous investigation of both thermodynamic processes and the detailed microscopic, atomic-level mechanisms that govern carbonation kinetics. Our previously funded Phase I Innovative Concepts project demonstrated the value of advanced quantum-mechanical modeling as a complementary tool in bridging important gaps in our understanding of the atomic/molecular structure and reaction mechanisms that govern CO2 mineral sequestration reaction processes for the model Mg-rich lamellar hydroxide feedstock material Mg(OH)2. In the present simulation project, improved techniques and more efficient computational schemes have allowed us to expand and augment these capabilities and explore more complex Mg-rich, lamellar hydroxide-based feedstock materials, including the serpentine-based minerals. These feedstock materials are being actively investigated due to their wide availability and low-cost CO2 mineral sequestration potential. Cutting-edge first principles quantum chemical, computational solid-state and materials simulation methodology studies proposed herein have been strategically integrated with our new DOE-supported (ASU-Argonne National Laboratory) project to investigate the mechanisms that govern mineral feedstock heat-treatment and aqueous/fluid-phase serpentine mineral carbonation in situ. This unified, synergetic theoretical and experimental approach will provide a deeper understanding of the key reaction mechanisms than either individual approach can alone. Ab initio techniques will also

  10. Restructuring the CS 1 classroom: Examining the effect of open laboratory-based classes vs. closed laboratory-based classes on Computer Science 1 students' achievement and attitudes toward computers and computer courses

    NASA Astrophysics Data System (ADS)

    Henderson, Jean Foster

    The purpose of this study was to assess the effect of classroom restructuring involving computer laboratories on student achievement and student attitudes toward computers and computer courses. The effects of the targeted student attributes of gender, previous programming experience, math background, and learning style were also examined. The open lab-based class structure consisted of a traditional lecture class with a separate, unscheduled lab component in which lab assignments were completed outside of class; the closed lab-based class structure integrated a lab component within the lecture class so that half the class was reserved for lecture and half the class was reserved for students to complete lab assignments by working cooperatively with each other and under the supervision and guidance of the instructor. The sample consisted of 71 students enrolled in four intact classes of Computer Science I during the fall and spring semesters of the 2006-2007 school year at two southern universities: two classes were held in the fall (one at each university) and two classes were held in the spring (one at each university). A counterbalanced repeated measures design was used in which all students experienced both class structures for half of each semester. The order of control and treatment was rotated among the four classes. All students received the same amount of class and instructor time. A multivariate analysis of variance (MANOVA) via a multiple regression strategy was used to test the study's hypotheses. Although the overall MANOVA model was statistically significant, independent follow-up univariate analyses relative to each dependent measure found that the only significant research factor was math background: Students whose mathematics background was at the level of Calculus I or higher had significantly higher student achievement than students whose mathematics background was less than Calculus I. The results suggest that classroom structures that
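
    For readers unfamiliar with the analysis strategy, the fragment below shows how a MANOVA over multiple dependent measures with several predictors can be specified in Python; the column names and data file are hypothetical, and only the modeling pattern follows the abstract.

```python
# Hedged illustration of the study's analysis pattern: a MANOVA with two
# dependent measures and the research factors as predictors. The data file
# and all column names are invented for this sketch.
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

df = pd.read_csv("cs1_outcomes.csv")  # hypothetical data set

# Dependent variables: achievement and attitude scores; predictors: class
# structure (open vs. closed lab) and the targeted student attributes.
fit = MANOVA.from_formula(
    "achievement + attitude ~ structure + gender + experience + math_background",
    data=df,
)
print(fit.mv_test())  # multivariate tests (e.g., Wilks' lambda) per factor
```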

  11. The Facilitation of Problem-Based Learning in Medical Education Through a Computer-Mediated Tutorial Laboratory

    PubMed Central

    Myers, A.; Barrows, H.S.; Koschmann, T.D.; Feltovich, P.J.

    1990-01-01

    This paper describes the means by which a computer-supported group interaction system known as the Computer-Mediated Tutorial Laboratory (CMTL) is used to support Problem-Based Learning Tutorials. The Problem-Based Learning Tutorial process has traditionally been solely a group process, sharing both the advantages and the disadvantages of any group process. This paper discusses the nature of Problem-Based Learning, the logistics of integrating computer mediation with the tutorial process and how computer mediation can be used to facilitate the eliciting and recording of individual input while enhancing the powerful effects of the group process.

  12. Computation of Loads on the McDonnell Douglas Advanced Bearingless Rotor

    NASA Technical Reports Server (NTRS)

    Nguyen, Khanh; Lauzon, Dan; Anand, Vaidyanathan

    1994-01-01

    Computed results from UMARC and DART analyses are compared with the blade bending moments and vibratory hub loads data obtained from a full-scale wind tunnel test of the McDonnell Douglas five-bladed advanced bearingless rotor. The 5 per-rev vibratory hub loads data are corrected using results from a dynamic calibration of the rotor balance. The agreement between UMARC-computed blade bending moments and the test data at different flight conditions is poor to fair, while DART results are fair to good. Using the free wake module, UMARC adequately computes the 5P vibratory hub loads for this rotor, capturing both magnitude and variations with forward speed. DART employs a uniform inflow wake model and does not adequately compute the 5P vibratory hub loads for this rotor.

  13. Effects of Combined Hands-on Laboratory and Computer Modeling on Student Learning of Gas Laws: A Quasi-Experimental Study

    ERIC Educational Resources Information Center

    Liu, Xiufeng

    2006-01-01

    Based on current theories of chemistry learning, this study intends to test a hypothesis that computer modeling enhanced hands-on chemistry laboratories are more effective than hands-on laboratories or computer modeling laboratories alone in facilitating high school students' understanding of chemistry concepts. Thirty-three high school chemistry…

  14. Advanced Simulation & Computing FY15 Implementation Plan Volume 2, Rev. 0.5

    SciTech Connect

    McCoy, Michel; Archer, Bill; Matzen, M. Keith

    2014-09-16

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. As the program approaches the end of its second decade, ASC is intently focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), quantify critical margins and uncertainties, and resolve increasingly difficult analyses needed for the SSP. Where possible, the program also enables the use of high-performance simulation and computing tools to address broader national security needs, such as foreign nuclear weapon assessments and countering nuclear terrorism.

  15. Advances in computer-aided design and computer-aided manufacture technology.

    PubMed

    Calamia, J R

    1994-01-01

    Although the development of computer-aided design (CAD) and computer-aided manufacture (CAM) technology and the benefits of increased productivity became obvious in the automobile and aerospace industries in the 1970s, investigations of this technology's application in the field of dentistry did not begin until the 1980s. Only now are we beginning to see the fruits of this work with the commercial availability of some systems; the potential for this technology seems boundless. This article reviews the recent literature with emphasis on the period from June 1992 to May 1993. This review should familiarize the reader with some of the latest developments in this technology, including a brief description of some systems currently available and the clinical and economic rationale for their acceptance into the dental mainstream. This article concentrates on a particular system, the Cerec (Siemens/Pelton and Crane, Charlotte, NC) system, for three reasons: First, this system has been available since 1985 and, as a result, has a track record of almost 7 years of data. Most of the data have just recently been released and consequently, much of this year's literature on CAD-CAM is monopolized by studies using this system. Second, this system was developed as a mobile, affordable, direct chairside CAD-CAM restorative method. As such, it is of special interest to the dentist who will offer this new technology directly to the patient, providing a one-visit restoration. Third, the author is currently engaged in research using this particular system and has a working knowledge of this system's capabilities.

  16. DOE Advanced Scientific Computing Advisory Subcommittee (ASCAC) Report: Top Ten Exascale Research Challenges

    SciTech Connect

    Lucas, Robert; Ang, James; Bergman, Keren; Borkar, Shekhar; Carlson, William; Carrington, Laura; Chiu, George; Colwell, Robert; Dally, William; Dongarra, Jack; Geist, Al; Haring, Rud; Hittinger, Jeffrey; Hoisie, Adolfy; Klein, Dean; Kogge, Peter; Lethin, Richard; Sarkar, Vivek; Schreiber, Robert; Shalf, John; Sterling, Thomas; Stevens, Rick; Bashor, Jon; Brightwell, Ron; Coteus, Paul; Debenedictus, Erik; Hiller, Jon; Kim, K. H.; Langston, Harper; Murphy, Richard; Webster, Clayton; Wild, Stefan; Grider, Gary; Ross, Rob; Leyffer, Sven; Laros III, James

    2014-02-10

    Exascale computing systems are essential for the scientific fields that will transform the 21st century global economy, including energy, biotechnology, nanotechnology, and materials science. Progress in these fields is predicated on the ability to perform advanced scientific and engineering simulations, and analyze the deluge of data. On July 29, 2013, ASCAC was charged by Patricia Dehmer, the Acting Director of the Office of Science, to assemble a subcommittee to provide advice on exascale computing. This subcommittee was directed to return a list of no more than ten technical approaches (hardware and software) that will enable the development of a system that achieves the Department's goals for exascale computing. Numerous reports over the past few years have documented the technical challenges and the non-viability of simply scaling existing computer designs to reach exascale. The technical challenges revolve around energy consumption, memory performance, resilience, extreme concurrency, and big data. Drawing from these reports and more recent experience, this ASCAC subcommittee has identified the top ten computing technology advancements that are critical to making a capable, economically viable, exascale system.

  17. Recovery Act: Advanced Interaction, Computation, and Visualization Tools for Sustainable Building Design

    SciTech Connect

    Greenberg, Donald P.; Hencey, Brandon M.

    2013-08-20

    Current building energy simulation technology requires excessive labor, time and expertise to create building energy models, excessive computational time for accurate simulations and difficulties with the interpretation of the results. These deficiencies can be ameliorated using modern graphical user interfaces and algorithms which take advantage of modern computer architectures and display capabilities. To prove this hypothesis, we developed an experimental test bed for building energy simulation. This novel test bed environment offers an easy-to-use interactive graphical interface, provides access to innovative simulation modules that run at accelerated computational speeds, and presents new graphics visualization methods to interpret simulation results. Our system offers the promise of dramatic ease of use in comparison with currently available building energy simulation tools. Its modular structure makes it suitable for early stage building design, as a research platform for the investigation of new simulation methods, and as a tool for teaching concepts of sustainable design. Improvements in the accuracy and execution speed of many of the simulation modules are based on the modification of advanced computer graphics rendering algorithms. Significant performance improvements are demonstrated in several computationally expensive energy simulation modules. The incorporation of these modern graphical techniques should advance the state of the art in the domain of whole building energy analysis and building performance simulation, particularly at the conceptual design stage when decisions have the greatest impact. More importantly, these better simulation tools will enable the transition from prescriptive to performative energy codes, resulting in better, more efficient designs for our future built environment.

  18. High Performance Computing Modeling Advances Accelerator Science for High-Energy Physics

    SciTech Connect

    Amundson, James; Macridin, Alexandru; Spentzouris, Panagiotis

    2014-07-28

    The development and optimization of particle accelerators are essential for advancing our understanding of the properties of matter, energy, space, and time. Particle accelerators are complex devices whose behavior involves many physical effects on multiple scales. Therefore, advanced computational tools utilizing high-performance computing are essential for accurately modeling them. In the past decade, the US Department of Energy's SciDAC program has produced accelerator-modeling tools that have been employed to tackle some of the most difficult accelerator science problems. The authors discuss the Synergia framework and its applications to high-intensity particle accelerator physics. Synergia is an accelerator simulation package capable of handling the entire spectrum of beam dynamics simulations. The authors present Synergia's design principles and its performance on HPC platforms.

  19. High Performance Computing Modeling Advances Accelerator Science for High-Energy Physics

    DOE PAGES

    Amundson, James; Macridin, Alexandru; Spentzouris, Panagiotis

    2014-07-28

    The development and optimization of particle accelerators are essential for advancing our understanding of the properties of matter, energy, space, and time. Particle accelerators are complex devices whose behavior involves many physical effects on multiple scales. Therefore, advanced computational tools utilizing high-performance computing are essential for accurately modeling them. In the past decade, the US Department of Energy's SciDAC program has produced accelerator-modeling tools that have been employed to tackle some of the most difficult accelerator science problems. The authors discuss the Synergia framework and its applications to high-intensity particle accelerator physics. Synergia is an accelerator simulation package capable of handling the entire spectrum of beam dynamics simulations. The authors present Synergia's design principles and its performance on HPC platforms.

  20. Recent Advances in Cardiac Computed Tomography: Dual Energy, Spectral and Molecular CT Imaging

    PubMed Central

    Danad, Ibrahim; Fayad, Zahi A.; Willemink, Martin J.; Min, James K.

    2015-01-01

    Computed tomography (CT) has evolved into a powerful diagnostic tool, and it is impossible to imagine current clinical practice without CT imaging. Due to its widespread availability, ease of clinical application, superb sensitivity for detection of coronary artery disease (CAD), and non-invasive nature, CT has become a valuable tool within the armamentarium of the cardiologist. In the last few years, numerous technological advances in CT have occurred, including dual energy CT (DECT), spectral CT and CT-based molecular imaging. By harnessing these advances in technology, cardiac CT has advanced beyond the mere evaluation of coronary stenosis to an imaging modality that permits accurate plaque characterization, assessment of myocardial perfusion and even probing of molecular processes that are involved in coronary atherosclerosis. Novel innovations in CT contrast agents and pre-clinical spectral CT devices have paved the way for CT-based molecular imaging. PMID:26068288
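
    As a toy illustration of the linear model underlying dual energy CT, the snippet below decomposes one voxel's attenuation measured at two tube energies into two basis materials; the coefficients are invented for illustration, not physical values.

```python
# Two-material decomposition: attenuation at two energies is expressed in a
# basis of two materials and inverted per voxel. All numbers are placeholders.
import numpy as np

# Assumed attenuation-style coefficients (rows: low/high energy; columns:
# basis materials, e.g., water and iodine) -- not physical values.
A = np.array([[0.20, 3.00],    # low-energy response of (water, iodine)
              [0.15, 1.20]])   # high-energy response

mu_low, mu_high = 0.50, 0.30   # measured attenuation in one voxel
t = np.linalg.solve(A, [mu_low, mu_high])
print(f"water-equivalent: {t[0]:.3f}, iodine-equivalent: {t[1]:.3f}")
```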

  1. Advances in Single-Photon Emission Computed Tomography Hardware and Software.

    PubMed

    Piccinelli, Marina; Garcia, Ernest V

    2016-02-01

    Nuclear imaging techniques remain today's most reliable modality for the assessment and quantification of myocardial perfusion. In recent years, the field has experienced tremendous progress both in terms of dedicated cameras for cardiac applications and software techniques for image reconstruction. The most recent advances in single-photon emission computed tomography hardware and software are reviewed, focusing on how these improvements have resulted in an even more powerful diagnostic tool with reduced injected radiation dose and acquisition time.

  2. Computer-Based Testing System. Project STEEL. A Special Project To Develop and Implement a Computer-Based Special Teacher Education and Evaluation Laboratory. Volume III. Final Report.

    ERIC Educational Resources Information Center

    Frick, Theodore W.; And Others

    The document is part of the final report on Project STEEL (Special Teacher Education and Evaluation Laboratory) intended to extend the utilization of technology in the training of preservice special education teachers. This volume focuses on the third of four project objectives, the development and implementation of a computer-based testing…

  3. Computer-assisted virtual planning and surgical template fabrication for frontoorbital advancement.

    PubMed

    Soleman, Jehuda; Thieringer, Florian; Beinemann, Joerg; Kunz, Christoph; Guzman, Raphael

    2015-05-01

    OBJECT The authors describe a novel technique using computer-assisted design (CAD) and computer-assisted manufacturing (CAM) for the fabrication of individualized 3D printed surgical templates for frontoorbital advancement surgery. METHODS Two patients underwent frontoorbital advancement surgery for unilateral coronal synostosis. Virtual surgical planning (SurgiCase-CMF, version 5.0, Materialise) was done by virtual mirroring techniques and superposition of an age-matched normative 3D pediatric skull model. Based on these measurements, surgical templates were fabricated using a 3D printer. Bifrontal craniotomy and the osteotomies for the orbital bandeau were performed based on the sterilized 3D templates. The remodeling was then done by placing the bone plates within the negative 3D templates and fixing them using absorbable poly-dl-lactic acid plates and screws. RESULTS Both patients exhibited a satisfying head shape postoperatively and at follow-up. No surgery-related complications occurred. The cutting and positioning of the 3D surgical templates proved to be very accurate and easy to use as well as reproducible and efficient. CONCLUSIONS Computer-assisted virtual planning and 3D template fabrication for frontoorbital advancement surgery leads to reconstructions based on standardized measurements, precludes subjective remodeling, and seems to be overall safe and feasible. A larger series of patients with long-term follow-up is needed for further evaluation of this novel technique.
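
    The "virtual mirroring" step can be pictured as reflecting surface vertices across an estimated midsagittal plane so the unaffected side serves as a template for the synostotic side; the sketch below shows that geometry in NumPy. Planning software such as SurgiCase-CMF fits the plane to anatomical landmarks, so this is only a schematic stand-in.

```python
# Geometric core of virtual mirroring: reflect an Nx3 array of mesh vertices
# across a plane given by a point and a normal. Inputs here are stand-ins.
import numpy as np

def mirror_across_plane(vertices, point, normal):
    """Reflect each vertex: v' = v - 2((v - p) . n) n, with n a unit normal."""
    n = normal / np.linalg.norm(normal)
    d = (vertices - point) @ n          # signed distance of each vertex
    return vertices - 2.0 * d[:, None] * n

verts = np.random.rand(100, 3)          # stand-in for a skull surface mesh
mirrored = mirror_across_plane(verts,
                               point=np.array([0.0, 0.0, 0.0]),
                               normal=np.array([1.0, 0.0, 0.0]))
```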

  4. Recent advances in 3D computed tomography techniques for simulation and navigation in hepatobiliary pancreatic surgery.

    PubMed

    Uchida, Masafumi

    2014-04-01

    A few years ago it could take several hours to complete a 3D image using a 3D workstation. Thanks to advances in computer science, obtaining results of interest now requires only a few minutes. Many recent 3D workstations or multimedia computers are equipped with onboard 3D virtual patient modeling software, which enables patient-specific preoperative assessment and virtual planning, navigation, and tool positioning. Although medical 3D imaging can now be conducted using various modalities, including computed tomography (CT), magnetic resonance imaging (MRI), positron emission tomography (PET), and ultrasonography (US) among others, the highest quality images are obtained using CT data, and CT images are now the most commonly used source of data for 3D simulation and navigation images. If the 2D source image is bad, no amount of 3D image manipulation in software will provide a quality 3D image. In this exhibition, recent advances in CT imaging techniques and 3D visualization of hepatobiliary and pancreatic abnormalities are featured, including scan and image reconstruction techniques, contrast-enhanced techniques, new applications of advanced CT scan techniques, and new virtual reality simulation and navigation imaging.

  5. Research Institute for Advanced Computer Science: Annual Report October 1998 through September 1999

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.; Gross, Anthony R. (Technical Monitor)

    1999-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center (ARC). It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. ARC has been designated NASA's Center of Excellence in Information Technology. In this capacity, ARC is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA ARC and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities. (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply information technology research to

  6. LABORATORY PROCESS CONTROLLER USING NATURAL LANGUAGE COMMANDS FROM A PERSONAL COMPUTER

    NASA Technical Reports Server (NTRS)

    Will, H.

    1994-01-01

    The complex environment of the typical research laboratory requires flexible process control. This program provides natural language process control from an IBM PC or compatible machine. Sometimes process control schedules require changes frequently, even several times per day. These changes may include adding, deleting, and rearranging steps in a process. This program sets up a process control system that can either run without an operator or be run by workers with limited programming skills. The software system includes three programs. Two of the programs, written in FORTRAN77, record data and control research processes. The third program, written in Pascal, generates the FORTRAN subroutines used by the other two programs to identify the user commands with the user-written device drivers. The software system also includes an input data set which allows the user to define the user commands which are to be executed by the computer. To set the system up the operator writes device driver routines for all of the controlled devices. Once set up, this system requires only an input file containing natural language command lines which tell the system what to do and when to do it. The operator can define custom commands for operating and taking data from external research equipment; these commands can then run at any time of the day or night without anyone in attendance. This process control system requires a personal computer operating under MS-DOS with suitable hardware interfaces to all controlled devices. The program requires a FORTRAN77 compiler and user-written device drivers. This program was developed in 1989 and has a memory requirement of about 62 Kbytes.
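
    The table-driven, natural-language control described above can be pictured as a mapping from user-defined command phrases to device-driver callbacks, executed from a schedule file of command lines. The sketch below illustrates that idea in Python, whereas the original system generated FORTRAN glue code; all device functions and command phrases here are hypothetical.

```python
# Minimal sketch of table-driven "natural language" process control: a
# user-editable command table dispatches to device-driver callbacks, and a
# schedule (list of command lines) is executed in order. All names invented.
def heater_on():  print("heater: ON")
def heater_off(): print("heater: OFF")
def read_probe(): print("probe: 23.4 C")

COMMANDS = {                        # user-defined command phrases
    "turn heater on":  heater_on,
    "turn heater off": heater_off,
    "record probe temperature": read_probe,
}

def run_schedule(lines):
    """Execute each natural-language command line against the table."""
    for line in lines:
        action = COMMANDS.get(line.strip().lower())
        if action:
            action()
        else:
            print(f"unrecognized command: {line!r}")

run_schedule(["Turn heater on", "Record probe temperature", "Turn heater off"])
```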

  7. Computer-Assisted Instruction in the Context of the Advanced Instructional System: Authoring Support Software. Final Report.

    ERIC Educational Resources Information Center

    Montgomery, Ann D.; Judd, Wilson A.

    This report details the design, development, and implementation of computer software to support the cost-effective production of computer assisted instruction (CAI) within the context of the Advanced Instructional System (AIS) located at Lowry Air Force Base. The report supplements the computer managed Air Force technical training that is…

  8. ADVANCING THE FUNDAMENTAL UNDERSTANDING AND SCALE-UP OF TRISO FUEL COATERS VIA ADVANCED MEASUREMENT AND COMPUTATIONAL TECHNIQUES

    SciTech Connect

    Biswas, Pratim; Al-Dahhan, Muthanna

    2012-11-01

    to advance the fundamental understanding of the hydrodynamics by systematically investigating the effect of design and operating variables, to evaluate the reported dimensionless groups as scaling factors, and to establish a reliable scale-up methodology for the TRISO fuel particle spouted bed coaters based on hydrodynamic similarity via advanced measurement and computational techniques. An additional objective is to develop an on-line non-invasive measurement technique based on gamma ray densitometry (i.e. Nuclear Gauge Densitometry) that can be installed and used for coater process monitoring to ensure proper performance and operation and to facilitate the developed scale-up methodology. To achieve the objectives set for the project, the work will use optical probes and gamma ray computed tomography (CT) (for the measurements of solids/voidage holdup cross-sectional distribution and radial profiles along the bed height, spouted diameter, and fountain height) and radioactive particle tracking (RPT) (for the measurements of the 3D solids flow field, velocity, turbulent parameters, circulation time, solids Lagrangian trajectories, and many other spouted-bed-related hydrodynamic parameters). In addition, gas dynamic measurement techniques and pressure transducers will be utilized to complement the obtained information. The measurements obtained by these techniques will be used as benchmark data to evaluate and validate the computational fluid dynamic (CFD) models (two fluid model or discrete particle model) and their closures. The validated CFD models and closures will be used to facilitate the developed methodology for scale-up, design and hydrodynamic similarity. Successful execution of this work and the proposed tasks will advance the fundamental understanding of the coater flow field and quantify it for proper and safe design, scale-up, and performance. Such achievements will overcome the barriers to AGR applications and will help assure that the US maintains

  9. Cost-effective and business-beneficial computer validation for bioanalytical laboratories.

    PubMed

    McDowall, R. D.

    2011-07-01

    Computerized system validation is often viewed as a burden and a waste of time to meet regulatory requirements. This article presents a different approach by looking at validation in a bioanalytical laboratory from the business benefits that computer validation can bring. Ask yourself: have you ever bought a computerized system that did not meet your initial expectations? This article will look at understanding the process to be automated, the paper to be eliminated and the records to be signed to meet the requirements of the GLP or GCP and Part 11 regulations. This paper will only consider commercial nonconfigurable and configurable software such as plate readers and LC-MS/MS data systems rather than LIMS or custom applications. Two streamlined life cycle models are presented. The first one consists of a single document for validation of nonconfigurable software. The second is for configurable software and is a five-stage model that avoids the need to write functional and design specifications. Both models are aimed at managing the risk each type of software poses whilst reducing the amount of documented evidence required for validation.

  10. A new x-ray computed tomography system for laboratory mouse imaging

    SciTech Connect

    Paulus, M.J.; Sari-Sarraf, H.; Gleason, S.S.; Bobrek, M.; Hicks, J.S.; Johnson, D.K.; Behel, J.K.; Thompson, L.H.; Allen, W.C.

    1999-06-01

    Two versions of a new high-resolution x-ray computed tomography system are being developed to screen mutagenized mice in the Oak Ridge National Laboratory Mammalian Genetics Research Facility. The first prototype employs a single-pixel cadmium zinc telluride detector with a pinhole collimator operating in pulse counting mode. The second version employs a phosphor screen/CCD detector operating in current mode. The major system hardware includes a low-energy X-ray tube, two linear translation stages and a rotational stage. For the single-pixel detector, image resolution is determined by the step size of the detector stage; preliminary images have been acquired at 100 µm and 250 µm resolutions. The resolution of the phosphor screen detector is determined by the modulation transfer function of the phosphor screen; images with resolutions approaching 50 µm have been acquired. The system performance with the two detectors is described and recent images are presented.

  11. Computational Fluid Dynamics Simulation of the Hydrogen Reduction of Magnetite Concentrate in a Laboratory Flash Reactor

    NASA Astrophysics Data System (ADS)

    Fan, De-Qiu; Sohn, H. Y.; Mohassab, Yousef; Elzohiery, Mohamed

    2016-12-01

    A three-dimensional computational fluid dynamics (CFD) model was developed to study the hydrogen reduction of magnetite concentrate particles in a laboratory flash reactor representing a novel flash ironmaking process. The model was used to simulate the fluid flow, heat transfer, and chemical reactions involved. The governing equations for the gas phase were solved in the Eulerian frame of reference while the particles were tracked in the Lagrangian framework. The change in the particle mass was related to the chemical reaction and the particle temperature was calculated by taking into consideration the heat of reaction, convection, and radiation. The stochastic trajectory model was used to describe particle dispersion due to turbulence. Partial combustion of H2 by O2 injected through a non-premixed burner was also simulated in this study. The partial combustion mechanism used in this model consisted of seven chemical reactions involving six species. The temperature profiles and reduction degrees obtained from the simulations satisfactorily agreed with the experimental measurements.
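
    The per-particle energy balance the model solves (convection, radiation, and heat of reaction) can be illustrated with a simple explicit integration for a single particle in a hot gas stream; all property values below are placeholders for illustration rather than the paper's inputs.

```python
# Toy Lagrangian energy balance for one particle: dT/dt = q / (m * cp), with
# q from convection, radiation, and a reaction heat term. Values invented.
sigma = 5.67e-8                     # Stefan-Boltzmann constant, W/m^2/K^4
T_gas, T_wall = 1600.0, 1400.0      # assumed gas and wall temperatures, K
h, eps = 400.0, 0.8                 # convective coeff. W/m^2/K, emissivity
m, cp, A = 1e-9, 900.0, 1e-8        # particle mass kg, J/(kg K), area m^2
q_rxn = 0.0                         # reaction heat term, W (placeholder)

T, dt = 300.0, 1e-5                 # initial temperature K, time step s
for _ in range(2000):               # integrate 20 ms with explicit Euler
    q = h * A * (T_gas - T) + eps * sigma * A * (T_wall**4 - T**4) + q_rxn
    T += dt * q / (m * cp)
print(f"particle temperature after 20 ms: {T:.0f} K")
```

    A production CFD code couples this balance to the gas-phase solution and to the reaction kinetics that change the particle mass, as the abstract describes.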

  12. Computer Numerical Control: Instructional Manual. The North Dakota High Technology Mobile Laboratory Project.

    ERIC Educational Resources Information Center

    Sinn, John W.

    This instructional manual contains five learning activity packets for use in a workshop on computer numerical control for computer-aided manufacturing. The lessons cover the following topics: introduction to computer-aided manufacturing, understanding the lathe, using the computer, computer numerically controlled part programming, and executing a…

  13. Advances and trends in the development of computational models for tires

    NASA Technical Reports Server (NTRS)

    Noor, A. K.; Tanner, J. A.

    1985-01-01

    Status and some recent developments of computational models for tires are summarized. Discussion focuses on a number of aspects of tire modeling and analysis including: tire materials and their characterization; evolution of tire models; characteristics of effective finite element models for analyzing tires; analysis needs for tires; and impact of the advances made in finite element technology, computational algorithms, and new computing systems on tire modeling and analysis. An initial set of benchmark problems has been proposed in concert with the U.S. tire industry. Extensive sets of experimental data will be collected for these problems and used for evaluating and validating different tire models. Also, the new Aircraft Landing Dynamics Facility (ALDF) at NASA Langley Research Center is described.

  14. Improved dissection efficiency in the human gross anatomy laboratory by the integration of computers and modern technology.

    PubMed

    Reeves, Rustin E; Aschenbrenner, John E; Wordinger, Robert J; Roque, Rouel S; Sheedlo, Harold J

    2004-05-01

    The need to increase the efficiency of dissection in the gross anatomy laboratory has been the driving force behind the technologic changes we have recently implemented. With the introduction of an integrated systems-based medical curriculum and a reduction in laboratory teaching hours, anatomy faculty at the University of North Texas Health Science Center (UNTHSC) developed a computer-based dissection manual to adjust to these curricular changes and time constraints. At each cadaver workstation, Apple iMac computers were added and a new dissection manual, running in a browser-based format, was installed. Within the text of the manual, anatomical structures required for dissection were linked to digital images from prosected materials; in addition, for each body system, the dissection manual included images from cross sections, radiographs, CT scans, and histology. Although we have placed a high priority on computerization of the anatomy laboratory, we remain strong advocates of the importance of cadaver dissection. It is our belief that the utilization of computers for dissection is a natural evolution of technology and fosters creative teaching strategies adapted for anatomy laboratories in the 21st century. Our strategy has significantly enhanced the independence and proficiency of our students, the efficiency of their dissection time, and the quality of laboratory instruction by the faculty.

  15. Comparison of a Computer Simulation Program and a Traditional Laboratory Practical Class for Teaching the Principles of Intestinal Absorption.

    ERIC Educational Resources Information Center

    Dewhurst, D. G.; And Others

    1994-01-01

    Evaluates the effectiveness of an interactive computer-assisted learning program for undergraduate students that simulates experiments performed using isolated, everted sacs of rat small intestine. The program is designed to offer an alternative student-centered approach to traditional laboratory-based practical classes. Knowledge gain of students…

  16. A Low-Cost Computer-Controlled Arduino-Based Educational Laboratory System for Teaching the Fundamentals of Photovoltaic Cells

    ERIC Educational Resources Information Center

    Zachariadou, K.; Yiasemides, K.; Trougkakos, N.

    2012-01-01

    We present a low-cost, fully computer-controlled, Arduino-based, educational laboratory (SolarInsight) to be used in undergraduate university courses concerned with electrical engineering and physics. The major goal of the system is to provide students with the necessary instrumentation, software tools and methodology in order to learn fundamental…

  17. Reassigning the Structures of Natural Products Using NMR Chemical Shifts Computed with Quantum Mechanics: A Laboratory Exercise

    ERIC Educational Resources Information Center

    Palazzo, Teresa A.; Truong, Tiana T.; Wong, Shirley M. T.; Mack, Emma T.; Lodewyk, Michael W.; Harrison, Jason G.; Gamage, R. Alan; Siegel, Justin B.; Kurth, Mark J.; Tantillo, Dean J.

    2015-01-01

    An applied computational chemistry laboratory exercise is described in which students use modern quantum chemical calculations of chemical shifts to assign the structure of a recently isolated natural product. A pre/post assessment was used to measure student learning gains and verify that students demonstrated proficiency of key learning…
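
    The core comparison step of the exercise, scoring computed shifts for candidate structures against the experimental spectrum, reduces to an error metric per candidate. The sketch below uses invented shift values and a plain mean absolute error as a simplified stand-in for the statistical measures such exercises often employ.

```python
# Compare computed NMR chemical shifts for two candidate structures against
# experimental values; the candidate with the smaller deviation is the
# better assignment. All shift values here are invented for illustration.
import numpy as np

experimental = np.array([7.26, 3.95, 2.10, 1.31])       # ppm, hypothetical

candidates = {
    "structure A": np.array([7.30, 3.90, 2.25, 1.28]),  # computed shifts
    "structure B": np.array([7.05, 4.40, 1.80, 1.60]),
}

for name, calc in candidates.items():
    mae = np.mean(np.abs(calc - experimental))          # mean absolute error
    print(f"{name}: MAE = {mae:.2f} ppm")
```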

  18. Setting up a Low-Cost Lab Management System for a Multi-Purpose Computing Laboratory Using Virtualisation Technology

    ERIC Educational Resources Information Center

    Mok, Heng Ngee; Lee, Yeow Leong; Tan, Wee Kiat

    2012-01-01

    This paper describes how a generic computer laboratory equipped with 52 workstations is set up for teaching IT-related courses and other general purpose usage. The authors have successfully constructed a lab management system based on decentralised, client-side software virtualisation technology using Linux and free software tools from VMware that…

  19. Combining a Laboratory Practical Class with a Computer Simulation: Studies on the Synthesis of Urea in Isolated Hepatocytes.

    ERIC Educational Resources Information Center

    Bender, David A.

    1986-01-01

    Describes how a computer simulation is used with a laboratory experiment on the synthesis of urea in isolated hepatocytes. The simulation calculates the amount of urea formed and the amount of ammonium remaining as the concentrations of ornithine, citrulline, argininosuccinate, arginine, and aspartate are altered. (JN)

  20. The Use and Benefits of Computer Aided Learning in the Assessment of the Laboratory Exercise "Enzyme Induction in Escherichia coli".

    ERIC Educational Resources Information Center

    Pamula, F.; And Others

    1995-01-01

    Describes an interactive computer program written to provide accurate and immediate feedback to students while they are processing experimental data. Discusses the problems inherent in laboratory courses that led to the development of this program. Advantages of the software include allowing students to work at their own pace in a nonthreatening…

  1. To Compare the Effects of Computer Based Learning and the Laboratory Based Learning on Students' Achievement Regarding Electric Circuits

    ERIC Educational Resources Information Center

    Bayrak, Bekir; Kanli, Uygar; Kandil Ingeç, Sebnem

    2007-01-01

    In this study, the research problem was: "Is computer-based physics instruction as effective as laboratory-intensive physics instruction with regard to 9th grade students' academic success on electric circuits?" In this experimental study, a pre-test and post-test design was applied with an experimental and a control…

  2. Computing and information services at the Jet Propulsion Laboratory - A management approach to a diversity of needs

    NASA Technical Reports Server (NTRS)

    Felberg, F. H.

    1984-01-01

    The Jet Propulsion Laboratory, a research and development organization with about 5,000 employees, presents a complicated set of requirements for an institutional system of computing and informational services. The approach taken by JPL in meeting this challenge is one of controlled flexibility. A central communications network is provided, together with selected computing facilities for common use. At the same time, staff members are given considerable discretion in choosing the mini- and microcomputers that they believe will best serve their needs. Consultation services, computer education, and other support functions are also provided.

  3. A Devoted Mini-Computer System for the Management of Clinical and Laboratory Data in an Intensive Care Unit

    PubMed Central

    Shinozaki, Tamotsu; Deane, Robert S.; Mazuzan, John E.

    1982-01-01

    In order to handle a large amount of clinical, laboratory and physiological information in intensive care units, a prototype distributed computer system is used at the Medical Center Hospital of Vermont. The system enables us to do extra tasks without increasing clerical help, e.g., a progress note for respiratory care, statistical data for unit management, computation of cardiac and pulmonary parameters, IV schedule for vasoactive drugs, daily compilation of TISS and APACHE scores, data collection for audits and special products. Special attention is paid to computer/user interaction.
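
    Derived-parameter computation of the kind mentioned (cardiac and pulmonary parameters) amounts to evaluating standard bedside formulas over monitored values; the sketch below shows two textbook examples with invented inputs, not the system's actual routines.

```python
# Two standard derived hemodynamic parameters; formulas are textbook
# definitions, and the example inputs are invented.
def cardiac_index(cardiac_output_l_min: float, bsa_m2: float) -> float:
    """Cardiac index = cardiac output normalized to body surface area."""
    return cardiac_output_l_min / bsa_m2

def systemic_vascular_resistance(map_mmHg: float, cvp_mmHg: float,
                                 co_l_min: float) -> float:
    """SVR in dyn.s/cm^5: 80 * (MAP - CVP) / CO."""
    return 80.0 * (map_mmHg - cvp_mmHg) / co_l_min

print(cardiac_index(5.2, 1.9))                   # ~2.7 L/min/m^2
print(systemic_vascular_resistance(92, 8, 5.2))  # ~1292 dyn.s/cm^5
```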

  4. NMR Studies of Structure-Reactivity Relationships in Carbonyl Reduction: A Collaborative Advanced Laboratory Experiment

    ERIC Educational Resources Information Center

    Marincean, Simona; Smith, Sheila R.; Fritz, Michael; Lee, Byung Joo; Rizk, Zeinab

    2012-01-01

    An upper-division laboratory project has been developed as a collaborative investigation of a reaction routinely taught in organic chemistry courses: the reduction of carbonyl compounds by borohydride reagents. Determination of several trends regarding structure-activity relationship was possible because each student contributed his or her results…

  5. Green, Enzymatic Syntheses of Divanillin and Diapocynin for the Organic, Biochemistry, or Advanced General Chemistry Laboratory

    ERIC Educational Resources Information Center

    Nishimura, Rachel T.; Giammanco, Chiara H.; Vosburg, David A.

    2010-01-01

    Environmentally benign chemistry is an increasingly important topic both in the classroom and the laboratory. In this experiment, students synthesize divanillin from vanillin or diapocynin from apocynin, using horseradish peroxidase and hydrogen peroxide in water. The dimerized products form rapidly at ambient temperature and are isolated by…

  6. Glycobiology, How to Sugar-Coat an Undergraduate Advanced Biochemistry Laboratory

    ERIC Educational Resources Information Center

    McReynolds, Katherine D.

    2006-01-01

    A second semester biochemistry laboratory has been implemented as an independent projects course at California State University, Sacramento since 1999. To incorporate aspects of carbohydrate biochemistry, or glycobiology, into our curriculum, projects in lectin isolation and purification were undertaken over the course of two semesters. Through…

  7. E-Learning in Engineering Education: Design of a Collaborative Advanced Remote Access Laboratory

    ERIC Educational Resources Information Center

    Chandra A. P., Jagadeesh; Samuel, R. D. Sudhaker

    2010-01-01

    Attaining excellence in technical education is a worthy life goal, and distance learning opportunities make such goals easier to reach with added quality. Distance learning in engineering education is possible only through successful implementations of remote laboratories in a learning-by-doing environment. This paper presents one…

  8. Dipeptide Structural Analysis Using Two-Dimensional NMR for the Undergraduate Advanced Laboratory

    ERIC Educational Resources Information Center

    Gonzalez, Elizabeth; Dolino, Drew; Schwartzenburg, Danielle; Steiger, Michelle A.

    2015-01-01

    A laboratory experiment was developed to introduce students in either an organic chemistry or biochemistry lab course to two-dimensional nuclear magnetic resonance (2D NMR) spectroscopy using simple biomolecules. The goal of this experiment is for students to understand and interpret the information provided by a 2D NMR spectrum. Students are…

  9. Advanced Simulation and Computing Fiscal Year 2011-2012 Implementation Plan, Revision 0

    SciTech Connect

    McCoy, M; Phillips, J; Hopson, J; Meisner, R

    2010-04-22

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering (D&E) programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model

  10. Advanced Simulation and Computing FY08-09 Implementation Plan Volume 2 Revision 0

    SciTech Connect

    McCoy, M; Kusnezov, D; Bickel, T; Hopson, J

    2007-04-25

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future nonnuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one

  11. Advanced Simulation and Computing FY07-08 Implementation Plan Volume 2

    SciTech Connect

    Kusnezov, D; Hale, A; McCoy, M; Hopson, J

    2006-06-22

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future nonnuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program will require the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from

  12. Advanced Simulation & Computing FY09-FY10 Implementation Plan Volume 2, Rev. 0

    SciTech Connect

    Meisner, R; Perry, J; McCoy, M; Hopson, J

    2008-04-30

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future nonnuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC)1 is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one

  13. Advanced Simulation and Computing FY09-FY10 Implementation Plan, Volume 2, Revision 0.5

    SciTech Connect

    Meisner, R; Hopson, J; Peery, J; McCoy, M

    2008-10-07

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC)1 is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one

  14. Advanced Simulation and Computing FY08-09 Implementation Plan, Volume 2, Revision 0.5

    SciTech Connect

    Kusnezov, D; Bickel, T; McCoy, M; Hopson, J

    2007-09-13

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from

  15. Advanced Simulation and Computing FY09-FY10 Implementation Plan Volume 2, Rev. 1

    SciTech Connect

    Kissel, L

    2009-04-01

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one that

  16. Advanced Simulation and Computing FY10-FY11 Implementation Plan Volume 2, Rev. 0.5

    SciTech Connect

    Meisner, R; Peery, J; McCoy, M; Hopson, J

    2009-09-08

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering (D&E) programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model

  17. Advanced computational simulation for design and manufacturing of lightweight material components for automotive applications

    SciTech Connect

    Simunovic, S.; Aramayo, G.A.; Zacharia, T.; Toridis, T.G.; Bandak, F.; Ragland, C.L.

    1997-04-01

    Computational vehicle models for the analysis of lightweight material performance in automobiles have been developed through collaboration between Oak Ridge National Laboratory, the National Highway Traffic Safety Administration, and George Washington University. The vehicle models have been verified against experimental data obtained from vehicle collisions. The crashed vehicles were analyzed, and the main impact energy dissipation mechanisms were identified and characterized. Important structural parts were extracted, digitized, and compared directly with simulation results. High-performance computing played a key role in the model development because it allowed for rapid computational simulations and model modifications. The deformation of the computational model shows very good agreement with the experiments. This report documents the modifications made to the computational model and relates them to the observations and findings on the test vehicle. Procedural guidelines are also provided that the authors believe need to be followed to create realistic models of passenger vehicles that could be used to evaluate the performance of lightweight materials in automotive structural components.
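
    Agreement between digitized crush profiles and simulation, as described above, is often summarized by a root-mean-square deviation between matched measurement points. A minimal sketch with made-up numbers; the report's actual comparison metric is not specified here:

        import numpy as np

        # Hypothetical digitized crush depths (mm) at five matched points.
        measured  = np.array([12.0, 48.5, 95.2, 140.8, 151.3])
        simulated = np.array([10.5, 50.1, 92.7, 144.0, 149.0])

        rms = np.sqrt(np.mean((simulated - measured) ** 2))
        rel = rms / np.ptp(measured)  # normalize by the range of measured values
        print(f"RMS deviation: {rms:.1f} mm ({rel:.1%} of measured range)")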

  18. Computational fluid dynamic study on obstructive sleep apnea syndrome treated with maxillomandibular advancement.

    PubMed

    Yu, Chung-Chih; Hsiao, Hung-Da; Lee, Lung-Cheng; Yao, Chih-Min; Chen, Ning-Hung; Wang, Chau-Jan; Chen, Yu-Ray

    2009-03-01

    Maxillomandibular advancement is one of the treatments available for obstructive sleep apnea (OSA). The influence of this surgery on the upper airway and its mechanism are not fully understood. The present research simulates the flow fields of the narrowed upper airways of two patients with OSA treated with maxillomandibular advancement. The geometry of the upper airway was reconstructed from computed tomographic images taken before and after surgery. The resulting three-dimensional surface model was used for measurement and computational fluid dynamics simulation. Patients showed clinical improvement 6 months after surgery. The cross-sectional area of the narrowest part of the upper airway was increased in all dimensions. The simulation results showed a less constricted upper airway, with less velocity change and a decreased pressure gradient across the whole conduit during passage of air. Less breathing effort is therefore expected to achieve equivalent ventilation with the postoperative airway. This study demonstrates the potential of computational fluid dynamics to provide information for understanding the pathogenesis of OSA and the effects of its treatment.
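
    The mechanism described follows, for a fixed volumetric flow Q, from continuity (v = Q/A) and Bernoulli's relation: enlarging the narrowest cross-section lowers the peak velocity and hence the dynamic-pressure drop. A sketch with hypothetical airway areas, not the patients' data:

        RHO_AIR = 1.2   # kg/m^3
        Q = 0.5e-3      # volumetric flow, m^3/s (roughly 30 L/min inspiration)

        def bernoulli_drop(a_ref_cm2, a_min_cm2):
            """Dynamic-pressure drop (Pa) between a reference section and the
            narrowest section, for inviscid, incompressible flow."""
            v_ref = Q / (a_ref_cm2 * 1e-4)   # m/s
            v_min = Q / (a_min_cm2 * 1e-4)
            return 0.5 * RHO_AIR * (v_min**2 - v_ref**2)

        pre  = bernoulli_drop(a_ref_cm2=2.5, a_min_cm2=0.5)  # narrow pre-op airway
        post = bernoulli_drop(a_ref_cm2=2.5, a_min_cm2=1.2)  # widened post-op airway
        print(f"pre-op drop  ~ {pre:.1f} Pa")    # ~57.6 Pa
        print(f"post-op drop ~ {post:.1f} Pa")   # ~8.0 Pa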

  19. Computational fluid dynamics modeling of laboratory flames and an industrial flare.

    PubMed

    Singh, Kanwar Devesh; Gangadharan, Preeti; Chen, Daniel H; Lou, Helen H; Li, Xianchang; Richmond, Peyton

    2014-11-01

    A computational fluid dynamics (CFD) methodology for simulating the combustion process has been validated with experimental results. Three different types of experimental setups were used to validate the CFD model. These setups include an industrial-scale flare and two lab-scale flames. The CFD study also involved three different fuels: C3H6/CH4/Air/N2, C2H4/O2/Ar, and CH4/Air. In the first setup, flare efficiency data from the Texas Commission on Environmental Quality (TCEQ) 2010 field tests were used to validate the CFD model. In the second setup, a McKenna burner with flat flames was simulated. Temperature and mass fractions of important species were compared with the experimental data. Finally, results of an experimental study done at Sandia National Laboratories to generate a lifted jet flame were used for the purpose of validation. The reduced 50-species mechanism LU 1.1, the realizable k-epsilon turbulence model, and the EDC turbulence-chemistry interaction model were used for this work. Flare efficiency, axial profiles of temperature, and mass fractions of various intermediate species obtained in the simulation were compared with experimental data, and good agreement between the profiles was clearly observed. In particular, the simulation's match with the TCEQ 2010 flare tests has improved significantly (to within 5% of the data) compared to the results reported by Singh et al. in 2012. Validation against the speciated flat-flame data supports the view that flares can be a primary source of formaldehyde emission.
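
    Flare efficiency of the kind compared above is commonly computed on a carbon basis: the fraction of fuel carbon fully oxidized to CO2 in the plume. A sketch with hypothetical plume mole fractions, not the TCEQ data:

        def combustion_efficiency(x_co2, x_co, x_hc, hc_carbons=3):
            """Carbon-basis combustion efficiency; hc_carbons is the number of
            C atoms per unburned-hydrocarbon molecule (3 for propene)."""
            carbon_in_co2 = x_co2
            carbon_total = x_co2 + x_co + hc_carbons * x_hc
            return carbon_in_co2 / carbon_total

        # Hypothetical plume mole fractions above background.
        ce = combustion_efficiency(x_co2=2.0e-2, x_co=4.0e-4, x_hc=2.0e-4)
        print(f"combustion efficiency ~ {ce:.1%}")  # ~95%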

  20. Advanced Yellow Fever Virus Genome Detection in Point-of-Care Facilities and Reference Laboratories

    PubMed Central

    Patel, Pranav; Yillah, Jasmin; Weidmann, Manfred; Méndez, Jairo A.; Nakouné, Emmanuel Rivalyn; Niedrig, Matthias

    2012-01-01

    Reported methods for the detection of the yellow fever viral genome are beset by limitations in sensitivity, specificity, strain detection spectra, and suitability to laboratories with simple infrastructure in areas of endemicity. We describe the development of two different approaches affording sensitive and specific detection of the yellow fever genome: a real-time reverse transcription-quantitative PCR (RT-qPCR) and an isothermal protocol employing the same primer-probe set but based on helicase-dependent amplification technology (RT-tHDA). Both assays were evaluated using yellow fever cell culture supernatants as well as spiked and clinical samples. We demonstrate reliable detection by both assays of different strains of yellow fever virus with improved sensitivity and specificity. The RT-qPCR assay is a powerful tool for reference or diagnostic laboratories with real-time PCR capability, while the isothermal RT-tHDA assay represents a useful alternative to earlier amplification techniques for the molecular diagnosis of yellow fever by field or point-of-care laboratories. PMID:23052311

  1. Conditions for building a community of practice in an advanced physics laboratory

    NASA Astrophysics Data System (ADS)

    Irving, Paul W.; Sayre, Eleanor C.

    2014-06-01

    We use the theory of communities of practice and the concept of accountable disciplinary knowledge to describe how a learning community develops in the context of an upper-division physics laboratory course. The change in accountable disciplinary knowledge motivates students' enculturation into a community of practice. The enculturation process is facilitated by four specific structural features of the course and supported by a primary instructional choice. The four structural features are "paucity of instructor time," "all in a room together," "long and difficult experiments," and "same experiments at different times." The instructional choice is the instructor's encouragement of shared development of knowledge and understanding. The combination of the instructional choice and structural features promotes the development of a learning community in which students engage in the authentic practices of physicists. This gives the classroom community the potential to provide students with an accelerated trajectory toward more central participation in the community of practice of physicists. We support our claims with video-based observations of laboratory classroom interactions and individual, semistructured interviews with students about their laboratory experiences and physics identity.

  2. Advanced yellow fever virus genome detection in point-of-care facilities and reference laboratories.

    PubMed

    Domingo, Cristina; Patel, Pranav; Yillah, Jasmin; Weidmann, Manfred; Méndez, Jairo A; Nakouné, Emmanuel Rivalyn; Niedrig, Matthias

    2012-12-01

    Reported methods for the detection of the yellow fever viral genome are beset by limitations in sensitivity, specificity, strain detection spectra, and suitability to laboratories with simple infrastructure in areas of endemicity. We describe the development of two different approaches affording sensitive and specific detection of the yellow fever genome: a real-time reverse transcription-quantitative PCR (RT-qPCR) and an isothermal protocol employing the same primer-probe set but based on helicase-dependent amplification technology (RT-tHDA). Both assays were evaluated using yellow fever cell culture supernatants as well as spiked and clinical samples. We demonstrate reliable detection by both assays of different strains of yellow fever virus with improved sensitivity and specificity. The RT-qPCR assay is a powerful tool for reference or diagnostic laboratories with real-time PCR capability, while the isothermal RT-tHDA assay represents a useful alternative to earlier amplification techniques for the molecular diagnosis of yellow fever by field or point-of-care laboratories.

  3. Emotions beyond the laboratory: theoretical fundaments, study design, and analytic strategies for advanced ambulatory assessment.

    PubMed

    Wilhelm, Frank H; Grossman, Paul

    2010-07-01

    Questionnaire and interview assessment can provide reliable data on attitudes and self-perceptions on emotion, and experimental laboratory assessment can examine functional relations between stimuli and reactions under controlled conditions. On the other hand, ambulatory assessment is less constrained and provides naturalistic data on emotion in daily life, with the potential to (1) assure external validity of laboratory findings, (2) provide normative data on prevalence, quality and intensity of real-life emotion and associated processes, (3) characterize previously unidentified emotional phenomena, and (4) model real-life stimuli for representative laboratory research design. Technological innovations now allow for detailed ambulatory study of emotion across domains of subjective experience, overt behavior and physiology. However, methodological challenges abound that may compromise attempts to characterize biobehavioral aspects of emotion in the real world. For example, emotional effects can be masked by social engagement, mental and physical workloads, as well as by food intake and circadian and quasi-random variation in metabolic activity. The complexity of data streams and multitude of factors that influence them require a high degree of context specification for meaningful data interpretation. We consider possible solutions to typical and often overlooked issues related to ambulatory emotion research, including aspects of study design decisions, recording devices and channels, electronic diary implementation, and data analysis.

  4. Advanced Computational Thermal Fluid Physics (CTFP) and Its Assessment for Light Water Reactors and Supercritical Reactors

    SciTech Connect

    D.M. McEligot; K. G. Condie; G. E. McCreery; H. M. McIlroy; R. J. Pink; L.E. Hochreiter; J.D. Jackson; R.H. Pletcher; B.L. Smith; P. Vukoslavcevic; J.M. Wallace; J.Y. Yoo; J.S. Lee; S.T. Ro; S.O. Park

    2005-10-01

    Background: The ultimate goal of the study is the improvement of predictive methods for safety analyses and design of Generation IV reactor systems such as supercritical water reactors (SCWRs) for higher efficiency, improved performance and operation, design simplification, enhanced safety, and reduced waste and cost. The objective of this Korean/US laboratory/university collaboration, which couples fundamental computational and experimental studies, is to develop the supporting knowledge needed for improved predictive techniques for use in the technology development of Generation IV reactor concepts and their passive safety systems. The present study emphasizes SCWR concepts in the Generation IV program.

  5. Distributed Drug Discovery: Advancing Chemical Education through Contextualized Combinatorial Solid-Phase Organic Laboratories

    ERIC Educational Resources Information Center

    Scott, William L.; Denton, Ryan E.; Marrs, Kathleen A.; Durrant, Jacob D.; Samaritoni, J. Geno; Abraham, Milata M.; Brown, Stephen P.; Carnahan, Jon M.; Fischer, Lindsey G.; Glos, Courtney E.; Sempsrott, Peter J.; O'Donnell, Martin J.

    2015-01-01

    The Distributed Drug Discovery (D3) program trains students in three drug discovery disciplines (synthesis, computational analysis, and biological screening) while addressing the important challenge of discovering drug leads for neglected diseases. This article focuses on implementation of the synthesis component in the second-semester…

  6. Advanced computer techniques for inverse modeling of electric current in cardiac tissue

    SciTech Connect

    Hutchinson, S.A.; Romero, L.A.; Diegert, C.F.

    1996-08-01

    For many years, ECGs and vector cardiograms have been the tools of choice for non-invasive diagnosis of cardiac conduction problems, such as those found in reentrant tachycardia or Wolff-Parkinson-White (WPW) syndrome. Through skillful analysis of these skin-surface measurements of cardiac-generated electric currents, a physician can deduce the general location of heart conduction irregularities. Using a combination of high-fidelity geometry modeling, advanced mathematical algorithms, and massively parallel computing, Sandia's approach would provide much more accurate information and thus allow the physician to pinpoint the source of an arrhythmia or abnormal conduction pathway.
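
    The underlying computation is a classic ill-posed inverse problem: surface potentials b relate to interior source strengths x through a forward model A (b = Ax), and inverting A directly is unstable, so regularized least squares is the textbook approach. A generic Tikhonov sketch, not the specific Sandia algorithm:

        import numpy as np

        # x = argmin ||A x - b||^2 + lam * ||x||^2
        rng = np.random.default_rng(0)
        A = rng.normal(size=(40, 10))            # hypothetical forward "lead-field" matrix
        x_true = np.zeros(10); x_true[3] = 1.0   # one active source region
        b = A @ x_true + 0.05 * rng.normal(size=40)  # noisy surface measurements

        lam = 1e-2
        x_est = np.linalg.solve(A.T @ A + lam * np.eye(10), A.T @ b)
        print("estimated source strengths:", np.round(x_est, 2))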

  7. NDE of advanced turbine engine components and materials by computed tomography

    NASA Technical Reports Server (NTRS)

    Yancey, R. N.; Baaklini, George Y.; Klima, Stanley J.

    1991-01-01

    Computed tomography (CT) is an X-ray technique that provides quantitative 3D density information of materials and components and can accurately detail spatial distributions of cracks, voids, and density variations. CT scans of ceramic materials, composites, and engine components were taken, and the resulting images are discussed. Scans were taken with two CT systems with different spatial resolution capabilities. The scans showed internal damage, density variations, and the geometrical arrangement of various features in the materials and components. It was concluded that CT can play an important role in the characterization of advanced turbine engine materials and components. Future applications of this technology are outlined.
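
    The quantitative density information in CT rests on the Beer-Lambert law, I = I0 exp(-sum(mu_i * d_i)): each ray measurement gives one linear equation in the voxel attenuation coefficients. A two-voxel toy example with hypothetical coefficients:

        import numpy as np

        d = 1.0                          # path length through each voxel, cm
        mu_true = np.array([0.2, 0.5])   # e.g., polymer vs. denser ceramic, 1/cm

        # Two independent rays: one through both voxels, one through voxel 2 only.
        paths = np.array([[d, d],
                          [0.0, d]])
        log_atten = paths @ mu_true      # measured -ln(I/I0) for each ray

        mu_recovered = np.linalg.solve(paths, log_atten)
        print("recovered mu:", mu_recovered)   # -> [0.2, 0.5]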

  8. Advanced Imaging of Athletes: Added Value of Coronary Computed Tomography and Cardiac Magnetic Resonance Imaging.

    PubMed

    Martinez, Matthew W

    2015-07-01

    Cardiac magnetic resonance imaging and cardiac computed tomographic angiography have become important parts of the armamentarium for noninvasive diagnosis of cardiovascular disease. Emerging technologies have produced faster imaging, lower radiation dose, and improved spatial and temporal resolution, as well as a wealth of prognostic data to support their use. Investigating true pathologic disease, and distinguishing normal findings from potentially dangerous ones, is now increasingly routine for the practicing cardiologist. This article investigates how advanced imaging technologies can assist the clinician when evaluating athletes for pathologic disease that may put them at risk.

  9. DOE Advanced Scientific Computing Advisory Committee (ASCAC) Subcommittee Report on Scientific and Technical Information

    SciTech Connect

    Hey, Tony; Agarwal, Deborah; Borgman, Christine; Cartaro, Concetta; Crivelli, Silvia; Van Dam, Kerstin Kleese; Luce, Richard; Shankar, Arjun; Trefethen, Anne; Wade, Alex; Williams, Dean

    2015-09-04

    The Advanced Scientific Computing Advisory Committee (ASCAC) was charged to form a standing subcommittee to review the Department of Energy’s Office of Scientific and Technical Information (OSTI), beginning by assessing the quality and effectiveness of OSTI’s recent and current products and services, and to comment on its mission and future directions in the rapidly changing environment for scientific publication and data. The Committee met with OSTI staff and reviewed available products, services, and other materials. This report summarizes their initial findings and recommendations.

  10. Cardiovascular proteomics in the era of big data: experimental and computational advances.

    PubMed

    Lam, Maggie P Y; Lau, Edward; Ng, Dominic C M; Wang, Ding; Ping, Peipei

    2016-01-01

    Proteomics plays an increasingly important role in our quest to understand cardiovascular biology. Fueled by analytical and computational advances in the past decade, proteomics applications can now go beyond merely inventorying protein species, and address sophisticated questions on cardiac physiology. The advent of massive mass spectrometry datasets has in turn led to increasing intersection between proteomics and big data science. Here we review new frontiers in technological developments and their applications to cardiovascular medicine. The impact of big data science on cardiovascular proteomics investigations and translation to medicine is highlighted.

  11. Computational Models of Exercise on the Advanced Resistance Exercise Device (ARED)

    NASA Technical Reports Server (NTRS)

    Newby, Nate; Caldwell, Erin; Scott-Pandorf, Melissa; Peters, Brian; Fincke, Renita; DeWitt, John; Ploutz-Snyder, Lori

    2011-01-01

    Muscle and bone loss remain a concern for crew returning from space flight. The advanced resistance exercise device (ARED) is used for on-orbit resistance exercise to help mitigate these losses. However, characterization of how the ARED loads the body in microgravity has yet to be determined. Computational models allow us to analyze ARED exercise in both 1G and 0G environments. To this end, biomechanical models of the squat, single-leg squat, and deadlift exercise on the ARED have been developed to further investigate bone and muscle forces resulting from the exercises.
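
    The kind of loading question these models address can be seen in a quasi-static sketch: the extensor moment at a joint scales roughly as external load times moment arm, and in microgravity the body-weight term vanishes, so the device must supply the load. The numbers and the effective moment arm below are hypothetical, not taken from the ARED models:

        G_EARTH = 9.81

        def knee_moment(device_load_N, supported_body_mass_kg, g, moment_arm_m=0.05):
            """Quasi-static joint moment (N*m) during a squat: device load plus
            the weight of supported body mass, times an effective moment arm."""
            external_force = device_load_N + supported_body_mass_kg * g
            return external_force * moment_arm_m

        m_1g = knee_moment(device_load_N=600.0, supported_body_mass_kg=60.0, g=G_EARTH)
        m_0g = knee_moment(device_load_N=600.0, supported_body_mass_kg=60.0, g=0.0)
        print(f"1g: {m_1g:.0f} N*m   0g: {m_0g:.0f} N*m")  # 1g: 59 N*m, 0g: 30 N*m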

  12. Argonne National Laboratory study of the transfer of federal computational technology to manufacturing industry in the State of Michigan

    SciTech Connect

    Mueller, C.J.

    1991-11-01

    This report describes a pilot study to develop, begin implementing, and document a process for conveying the computational technology capabilities resident within Argonne National Laboratory to small and medium-sized businesses in the State of Michigan. It is a derivative of a program entitled "Technology Applications Development Process for the State of Michigan" undertaken by the Industrial Technology Institute and MERRA under funding from the National Institute of Standards and Technology. The overall objective of the latter program is to develop procedures which can facilitate the discovery and commercialization of new technologies for the benefit of small and medium-sized manufacturing firms. Federal laboratories such as Argonne, along with universities, have been identified by the Industrial Technology Institute as key sources of technology that can be profitably commercialized by the target firms. The scope of this study limited the investigation of technology areas for technology transfer to computational science and engineering featuring high-performance computing. This area was chosen as the broad technological capability within Argonne to investigate for technology transfer to Michigan firms for several reasons. First, and most importantly, as a multidisciplinary laboratory, Argonne has the full range of scientific and engineering skills needed to utilize leading-edge computing capabilities in many areas of manufacturing.

  14. Advanced Stirling Convertor Dual Convertor Controller Testing at NASA Glenn Research Center in the Radioisotope Power Systems System Integration Laboratory

    NASA Technical Reports Server (NTRS)

    Dugala, Gina M.; Taylor, Linda M.; Bell, Mark E.; Dolce, James L.; Fraeman, Martin; Frankford, David P.

    2015-01-01

    NASA Glenn Research Center (GRC) developed a non-nuclear representation of a Radioisotope Power System (RPS) consisting of a pair of Advanced Stirling Convertors (ASCs), Dual Convertor Controller (DCC) engineering models (EMs) 2 and 3, and associated support equipment, which were tested in the Radioisotope Power Systems System Integration Laboratory (RSIL). The DCC was designed by the Johns Hopkins University/Applied Physics Laboratory (JHU/APL) to actively control a pair of ASCs. The first phase of testing included a Dual Advanced Stirling Convertor Simulator (DASCS), developed by JHU/APL, which simulates the operation and electrical behavior of a pair of ASCs in real time via a combination of hardware and software. RSIL provides insight into the electrical interactions between a representative radioisotope power generator, its associated control schemes, and realistic electric system loads. The first phase of integration testing included the following spacecraft bus configurations: capacitive, battery, and supercapacitor. A load profile, created from data for several missions, tested the ability of the RPS and RSIL to maintain operation during load demands above and below the power provided by the RPS. The integration testing also confirmed the DCC's ability to disconnect from the spacecraft when the bus voltage dipped below 22 V or exceeded 36 V. Once operation was verified with the DASCS, the tests were repeated with actual operating ASCs. The goal of this integration testing was to verify operation of the DCC when connected to a spacecraft and to verify the functionality of the newly designed RSIL. The results of these tests are presented in this paper.
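
    The disconnect behavior verified above amounts to window-comparator logic on the bus voltage. In the sketch below, the 22 V and 36 V thresholds come from the abstract; everything else is illustrative:

        V_MIN, V_MAX = 22.0, 36.0

        def bus_connected(v_bus: float) -> bool:
            """True while the bus voltage stays inside the allowed window."""
            return V_MIN <= v_bus <= V_MAX

        for v in (28.0, 21.5, 36.4):
            state = "stay connected" if bus_connected(v) else "disconnect"
            print(f"{v:5.1f} V -> {state}")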

  15. Integrated Computational Materials Engineering (ICME) for Third Generation Advanced High-Strength Steel Development

    SciTech Connect

    Savic, Vesna; Hector, Louis G.; Ezzat, Hesham; Sachdev, Anil K.; Quinn, James; Krupitzer, Ronald; Sun, Xin

    2015-06-01

    This paper presents an overview of a four-year project focused on development of an integrated computational materials engineering (ICME) toolset for third generation advanced high-strength steels (3GAHSS). Following a brief look at ICME as an emerging discipline within the Materials Genome Initiative, technical tasks in the ICME project will be discussed. Specific aims of the individual tasks are multi-scale, microstructure-based material model development using state-of-the-art computational and experimental techniques, forming, toolset assembly, design optimization, integration and technical cost modeling. The integrated approach is initially illustrated using a 980 grade transformation induced plasticity (TRIP) steel, subject to a two-step quenching and partitioning (Q&P) heat treatment, as an example.

  16. Advanced Simulation and Computing Fiscal Year 2016 Implementation Plan, Version 0

    SciTech Connect

    McCoy, M.; Archer, B.; Hendrickson, B.

    2015-08-27

    The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The purpose of this Implementation Plan (IP) is to outline key work requirements to be performed and to control individual work activities within the scope of work. Contractors may not deviate from this plan without a revised work authorization (WA) or a subsequent IP.

  17. An Investigative Laboratory Course in Human Physiology Using Computer Technology and Collaborative Writing

    ERIC Educational Resources Information Center

    FitzPatrick, Kathleen A.

    2004-01-01

    Active investigative student-directed experiences in laboratory science are being encouraged by national science organizations. A growing body of evidence from classroom assessment supports their effectiveness. This study describes four years of implementation and assessment of an investigative laboratory course in human physiology for 65…

  18. Microcomputer-Based Laboratories and Computer Networking in High School Science Classrooms.

    ERIC Educational Resources Information Center

    Lehman, James D.; Campbell, John P.

    Microcomputer-based laboratories (MBLs) are believed to have significant potential for improving laboratory experiences in science classrooms. A study funded through a grant project called STEPS to Better Science sought to broaden the knowledge base by examining MBL use by teachers and students in a variety of science classrooms in six high…

  19. A Simple and Resource-efficient Setup for the Computer-aided Drug Design Laboratory.

    PubMed

    Moretti, Loris; Sartori, Luca

    2016-10-01

    Undertaking modelling investigations for Computer-Aided Drug Design (CADD) requires a proper environment. In principle, this could be done on a single computer, but the reality of a drug discovery program requires robustness and high-throughput computing (HTC) to efficiently support the research. A more capable alternative is therefore needed, but its implementation has no widespread solution. Here, the realization of such a computing facility is discussed, covering all aspects from general layout to technical details.
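
    One common HTC pattern for such a facility is farming independent ligand-scoring jobs across cores (or, with a cluster scheduler, across nodes). A minimal sketch using only the Python standard library; score_ligand is a hypothetical stand-in for a real docking or scoring call and is not prescribed by the paper:

        from concurrent.futures import ProcessPoolExecutor

        def score_ligand(smiles: str) -> tuple[str, float]:
            # Placeholder "score": a length-based toy number standing in for a
            # docking score that real CADD software would return.
            return smiles, -0.1 * len(smiles)

        if __name__ == "__main__":
            ligands = ["CCO", "c1ccccc1", "CC(=O)Oc1ccccc1C(=O)O"]
            with ProcessPoolExecutor() as pool:
                for name, score in pool.map(score_ligand, ligands):
                    print(f"{name:>24s}  score {score:6.2f}")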

  20. One-dimensional light localization with classical scatterers: An advanced undergraduate laboratory experiment

    NASA Astrophysics Data System (ADS)

    Kemp, K. J.; Barker, S.; Guthrie, J.; Hagood, B.; Havey, M. D.

    2016-10-01

    The phenomenon of electronic wave localization through disorder remains an important area of fundamental and applied research. In a restricted one-dimensional geometry, localization is thought to occur for all wave phenomena, including light. We present here a series of experiments to illustrate, using a straightforward experimental arrangement and approach, the localization of light in a quasi-one-dimensional physical system. In the experiments, reflected and transmitted light from a stack of glass slides of varying thickness reveals Ohm's-law-type behavior for small thicknesses, evolving to exponential decay of the transmitted power for larger thicknesses. For larger stacks of slides, a weak departure from one-dimensional behavior is also observed. The experiment and analysis of the results, showing many of the essential features of wave localization, are relatively straightforward, economical, and suitable for laboratory experiments at the undergraduate level.
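
    The crossover from Ohm's-law to exponential behavior can be reproduced with a standard 2x2 transfer-matrix model: each slide contributes two Fresnel interfaces plus a propagation phase that is effectively random because the thickness varies by many wavelengths, and the ensemble average of ln T then decreases linearly with slide number, the localization signature. A generic sketch, not the authors' analysis code:

        import numpy as np

        rng = np.random.default_rng(1)
        N_GLASS = 1.5

        def interface(n1, n2):
            """Field-amplitude transfer matrix for a normal-incidence interface."""
            r12 = (n1 - n2) / (n1 + n2)
            t12 = 2 * n1 / (n1 + n2)
            return np.array([[1, r12], [r12, 1]], dtype=complex) / t12

        def prop(phase):
            """Free propagation: opposite phase for forward/backward amplitudes."""
            return np.diag([np.exp(1j * phase), np.exp(-1j * phase)])

        AIR_GLASS = interface(1.0, N_GLASS)
        GLASS_AIR = interface(N_GLASS, 1.0)

        def transmission(n_slides):
            m = np.eye(2, dtype=complex)
            for _ in range(n_slides):
                # Slide thickness varies by many wavelengths -> random phase,
                # followed by a random air gap between slides.
                m = m @ AIR_GLASS @ prop(rng.uniform(0, 2 * np.pi)) @ GLASS_AIR
                m = m @ prop(rng.uniform(0, 2 * np.pi))
            return 1.0 / abs(m[0, 0]) ** 2    # T = |t|^2 with t = 1/M00

        for n in (5, 20, 60):
            ln_t = np.mean([np.log(transmission(n)) for _ in range(400)])
            print(f"N = {n:3d}   <ln T> = {ln_t:7.2f}")   # ~linear in N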