Science.gov

Sample records for advanced computing laboratory

  1. Advanced computing

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Advanced concepts in hardware, software, and algorithms are being pursued for application in next-generation space computers and for ground-based analysis of space data. The research program focuses on massively parallel computation and neural networks, as well as optical processing and optical networking, which are discussed under photonics. Also included are theoretical programs in neural and nonlinear science, and device development for magnetic and ferroelectric memories.

  2. Using an Advanced Computational Laboratory Experiment to Extend and Deepen Physical Chemistry Students' Understanding of Atomic Structure

    ERIC Educational Resources Information Center

    Hoffman, Gary G.

    2015-01-01

    A computational laboratory experiment is described, which involves the advanced study of an atomic system. The students use concepts and techniques typically covered in a physical chemistry course but extend those concepts and techniques to more complex situations. The students get a chance to explore the study of atomic states and perform…

  3. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan : ASC software quality engineering practices Version 3.0.

    SciTech Connect

    Turgeon, Jennifer L.; Minana, Molly A.; Hackney, Patricia; Pilch, Martin M.

    2009-01-01

    The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in the US Department of Energy/National Nuclear Security Administration (DOE/NNSA) Quality Criteria, Revision 10 (QC-1) as 'conformance to customer requirements and expectations'. This quality plan defines the SNL ASC Program software quality engineering (SQE) practices and provides a mapping of these practices to the SNL Corporate Process Requirement (CPR) 001.3.6, 'Corporate Software Engineering Excellence'. This plan also identifies ASC management's and the software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals. This SNL ASC Software Quality Plan establishes the signatories' commitments to improving software products by applying cost-effective SQE practices. This plan enumerates the SQE practices that comprise the development of SNL ASC's software products and explains the project teams' opportunities for tailoring and implementing the practices.

  4. Computer integrated laboratory testing

    NASA Technical Reports Server (NTRS)

    Dahl, Charles C.

    1992-01-01

    The objective is the integration of computers into the Engineering Materials Science Laboratory course, where the existing test equipment is not computerized. The first lab procedure is to demonstrate and produce a material phase-change curve. The second procedure demonstrates the modulus of elasticity and the related stress-strain curve, plastic behavior, and maximum and failure strengths. Data are recorded by sensors connected to a data logger, which adds a time base and is in turn connected to a computer; this places the materials labs into a computer-integrated mode with minimum expense and maximum flexibility. The sensor signals are input into a spreadsheet for tabular records, curve generation, and graph printing.
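
    As a minimal illustration of the post-processing this setup enables (a hedged sketch: the stress-strain pairs below are invented, and NumPy stands in for the spreadsheet step), the modulus of elasticity can be estimated as the slope of a linear fit over the elastic region of the logged data:

        import numpy as np

        # Hypothetical stress-strain samples exported from the data logger
        # (strain is dimensionless, stress in MPa); values are illustrative only.
        strain = np.array([0.0000, 0.0005, 0.0010, 0.0015, 0.0020, 0.0025])
        stress = np.array([0.0, 103.0, 206.5, 309.0, 412.5, 514.0])

        # Young's modulus is the slope of the linear (elastic) portion of the curve.
        slope, intercept = np.polyfit(strain, stress, 1)
        print(f"Modulus of elasticity: {slope / 1000:.1f} GPa")  # MPa -> GPa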

  5. Interfaces for Advanced Computing.

    ERIC Educational Resources Information Center

    Foley, James D.

    1987-01-01

    Discusses the coming generation of supercomputers that will have the power to make elaborate "artificial realities" that facilitate user-computer communication. Illustrates these technological advancements with examples of the use of head-mounted monitors which are connected to position and orientation sensors, and gloves that track finger and…

  6. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan. Part 1: ASC software quality engineering practices, Version 2.0.

    SciTech Connect

    Sturtevant, Judith E.; Heaphy, Robert; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Minana, Molly A.; Hackney, Patricia; Forsythe, Christi A.; Schofield, Joseph Richard, Jr.; Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2006-09-01

    The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR 1.3.2 and 1.3.6 and to a Department of Energy document, 'ASCI Software Quality Engineering: Goals, Principles, and Guidelines'. This document also identifies ASC management and software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals.

  7. Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan. Part 2, Mappings for the ASC software quality engineering practices. Version 1.0.

    SciTech Connect

    Ellis, Molly A.; Heaphy, Robert; Sturtevant, Judith E.; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Forsythe, Christi A.; Schofield, Joseph Richard, Jr.; Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2005-01-01

    The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR 1.3.2 and 1.3.6 and to a Department of Energy document, 'ASCI Software Quality Engineering: Goals, Principles, and Guidelines'. This document also identifies ASC management and software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals.

  8. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan part 2 mappings for the ASC software quality engineering practices, version 2.0.

    SciTech Connect

    Heaphy, Robert; Sturtevant, Judith E.; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Minana, Molly A.; Hackney, Patricia; Forsythe, Christi A.; Schofield, Joseph Richard, Jr.; Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2006-09-01

    The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR 001.3.2 and CPR 001.3.6 and to a Department of Energy document, 'ASCI Software Quality Engineering: Goals, Principles, and Guidelines'. This document also identifies ASC management and software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals.

  9. Oklahoma's Mobile Computer Graphics Laboratory.

    ERIC Educational Resources Information Center

    McClain, Gerald R.

    This Computer Graphics Laboratory houses an IBM 1130 computer, U.C.C. plotter, printer, card reader, two key punch machines, and seminar-type classroom furniture. A "General Drafting Graphics System" (GDGS) is used, based on repetitive use of basic coordinate and plot generating commands. The system is used by 12 institutions of higher education…

  10. Advanced Computing for Medicine.

    ERIC Educational Resources Information Center

    Rennels, Glenn D.; Shortliffe, Edward H.

    1987-01-01

    Discusses contributions that computers and computer networks are making to the field of medicine. Emphasizes the computer's speed in storing and retrieving data. Suggests that doctors may soon be able to use computers to advise on diagnosis and treatment. (TW)

  11. Computer-Assisted Laboratory Stations.

    ERIC Educational Resources Information Center

    Snyder, William J.; Hanyak, Michael E.

    1985-01-01

    Describes the advantages and features of computer-assisted laboratory stations for use in a chemical engineering program. Also describes a typical experiment at such a station: determining the response times of a solid state humidity sensor at various humidity conditions and developing an empirical model for the sensor. (JN)
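
    A response time of the kind measured at such a station is commonly extracted by fitting a first-order step-response model to the logged readings. A hedged sketch (the data and model below are invented for illustration, not the station's actual software):

        import numpy as np
        from scipy.optimize import curve_fit

        def step_response(t, v0, vf, tau):
            """First-order sensor response to a step change in humidity."""
            return vf + (v0 - vf) * np.exp(-t / tau)

        # Hypothetical sensor readings after a step change (t in s, v in volts).
        t = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0, 15.0, 20.0, 30.0])
        v = np.array([0.50, 0.91, 1.20, 1.42, 1.57, 1.68, 1.84, 1.91, 1.97])

        (v0, vf, tau), _ = curve_fit(step_response, t, v, p0=(0.5, 2.0, 5.0))
        print(f"time constant tau = {tau:.1f} s (63.2% response time)")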

  12. Advanced Computing for Science.

    ERIC Educational Resources Information Center

    Hut, Piet; Sussman, Gerald Jay

    1987-01-01

    Discusses some of the contributions that high-speed computing is making to the study of science. Emphasizes the use of computers in exploring complicated systems without the simplification required in traditional methods of observation and experimentation. Provides examples of computer assisted investigations in astronomy and physics. (TW)

  13. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan. Part 1 : ASC software quality engineering practices version 1.0.

    SciTech Connect

    Minana, Molly A.; Sturtevant, Judith E.; Heaphy, Robert; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Forsythe, Christi A.; Schofield, Joseph Richard, Jr.; Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2005-01-01

    The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in DOE/AL Quality Criteria (QC-1) as conformance to customer requirements and expectations. This quality plan defines the ASC program software quality practices and provides mappings of these practices to the SNL Corporate Process Requirements (CPR 1.3.2 and CPR 1.3.6) and the Department of Energy (DOE) document, ASCI Software Quality Engineering: Goals, Principles, and Guidelines (GP&G). This quality plan identifies ASC management and software project teams' responsibilities for cost-effective software engineering quality practices. The SNL ASC Software Quality Plan establishes the signatories' commitment to improving software products by applying cost-effective software engineering quality practices. This document explains the project teams' opportunities for tailoring and implementing the practices; enumerates the practices that compose the development of SNL ASC's software products; and includes a sample assessment checklist that was developed based upon the practices in this document.

  14. The Computation Directorate at Lawrence Livermore National Laboratory

    SciTech Connect

    Cook, L

    2006-09-07

    The Computation Directorate at Lawrence Livermore National Laboratory has four major areas of work: (1) Programmatic Support -- Programs are areas which receive funding to develop solutions to problems or advance basic science in their areas (Stockpile Stewardship, Homeland Security, the Human Genome project). Computer scientists are 'matrixed' to these programs to provide computer science support. (2) Livermore Computer Center (LCC) -- Development, support and advanced planning for the large, massively parallel computers, networks and storage facilities used throughout the laboratory. (3) Research -- Computer scientists research advanced solutions for programmatic work and for external contracts and research new HPC hardware solutions. (4) Infrastructure -- Support for thousands of desktop computers and numerous LANs, labwide unclassified networks, computer security, computer-use policy.

  15. Advanced computer languages

    SciTech Connect

    Bryce, H.

    1984-05-03

    If software is to become an equal partner in the so-called fifth generation of computers (which of course it must), programming languages and the human interface will need to clear some high hurdles. Again, the solutions being sought turn to cerebral emulation: here, the way that human beings understand language. The result would be natural or English-like languages that would allow a person to communicate with a computer much as he or she does with another person. The discussion covers fourth-level and fifth-level languages used in meeting the goals of AI. The higher-level languages aim to be nonprocedural. Applications of LISP and Forth to natural-language interfaces are described, as well as programs such as the natural link technology package, written in C.

  16. Simulation methods for advanced scientific computing

    SciTech Connect

    Booth, T.E.; Carlson, J.A.; Forster, R.A.

    1998-11-01

    This is the final report of a three-year, Laboratory Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). The objective of the project was to create effective new algorithms for solving N-body problems by computer simulation. The authors concentrated on developing advanced classical and quantum Monte Carlo techniques. For simulations of phase transitions in classical systems, they produced a framework generalizing the famous Swendsen-Wang cluster algorithms for Ising and Potts models. For spin-glass-like problems, they demonstrated the effectiveness of an extension of the multicanonical method for the two-dimensional, random bond Ising model. For quantum mechanical systems, they generated a new method to compute the ground-state energy of systems of interacting electrons. They also improved methods to compute excited states when the diffusion quantum Monte Carlo method is used and to compute longer time dynamics when the stationary phase quantum Monte Carlo method is used.
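
    As background for the cluster methods named above, a minimal two-dimensional Ising version of the Swendsen-Wang update is sketched below (a textbook illustration, not the generalized LANL framework): aligned nearest neighbors are bonded with probability p = 1 - exp(-2*beta*J), and each resulting cluster of bonded sites is flipped with probability 1/2.

        import numpy as np

        def swendsen_wang_sweep(spins, beta, J=1.0, rng=None):
            """One Swendsen-Wang cluster update for the 2D Ising model
            with periodic boundaries (illustrative implementation)."""
            if rng is None:
                rng = np.random.default_rng()
            L = spins.shape[0]
            p = 1.0 - np.exp(-2.0 * beta * J)  # bond-activation probability
            parent = np.arange(L * L)

            def find(i):  # union-find root with path halving
                while parent[i] != i:
                    parent[i] = parent[parent[i]]
                    i = parent[i]
                return i

            # Bond aligned nearest neighbors with probability p.
            for x in range(L):
                for y in range(L):
                    for dx, dy in ((1, 0), (0, 1)):
                        nx_, ny_ = (x + dx) % L, (y + dy) % L
                        if spins[x, y] == spins[nx_, ny_] and rng.random() < p:
                            ri, rj = find(x * L + y), find(nx_ * L + ny_)
                            if ri != rj:
                                parent[ri] = rj

            # Flip each cluster as a whole with probability 1/2.
            flip = rng.random(L * L) < 0.5
            for x in range(L):
                for y in range(L):
                    if flip[find(x * L + y)]:
                        spins[x, y] *= -1

        # Usage: a few sweeps near the critical coupling (beta_c ~ 0.4407).
        rng = np.random.default_rng(0)
        spins = rng.choice([-1, 1], size=(32, 32))
        for _ in range(10):
            swendsen_wang_sweep(spins, beta=0.44, rng=rng)
        print("magnetization per spin:", spins.mean())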

  17. Three-dimensional registration of synchrotron radiation-based micro-computed tomography images with advanced laboratory micro-computed tomography data from murine kidney casts

    NASA Astrophysics Data System (ADS)

    Thalmann, Peter; Hieber, Simone E.; Schulz, Georg; Deyhle, Hans; Khimchenko, Anna; Kurtcuoglu, Vartan; Olgac, Ufuk; Marmaras, Anastasios; Kuo, Willy; Meyer, Eric P.; Beckmann, Felix; Herzen, Julia; Ehrbar, Stefanie; Müller, Bert

    2014-09-01

    Malfunction of oxygen regulation in kidney and liver may lead to the pathogenesis of chronic diseases. The underlying mechanisms are poorly understood. In kidney, it is hypothesized that renal gas shunting from arteries to veins eliminates excess oxygen. Such shunting is highly dependent on the structure of the renal vascular network. The vascular tree has so far not been quantified under maintenance of its connectivity, as three-dimensional imaging of the vessel tree down to the smallest capillaries, which in the mouse model are smaller than 5 μm in diameter, is a challenging task. An established protocol uses corrosion casts and applies synchrotron radiation-based micro-computed tomography (SRμCT), which provides the desired spatial resolution with the necessary contrast. However, SRμCT is expensive and beamtime access is limited. We show here that measurements with a phoenix nanotom® m (General Electric, Wunstorf, Germany) can provide results comparable to those obtained with SRμCT, except for regions with small vessel structures, where the signal-to-noise level was significantly reduced. For this purpose, the nanotom® m measurement was compared with its corresponding measurement acquired at the beamline P05 at PETRA III at DESY, Hamburg, Germany.
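
    Full registration of such volumes involves rotation as well as translation; as a simplified, generic illustration of one building block, the translational offset between two volumes can be estimated by 3D phase correlation (a standard technique, shown here with synthetic data; this is not the authors' registration pipeline):

        import numpy as np

        def phase_correlation_shift(vol_a, vol_b):
            """Estimate the integer translation aligning vol_b to vol_a
            via 3D phase correlation (generic method, illustrative only)."""
            cross_power = np.fft.fftn(vol_a) * np.conj(np.fft.fftn(vol_b))
            cross_power /= np.abs(cross_power) + 1e-12  # normalize magnitude
            corr = np.fft.ifftn(cross_power).real
            peak = np.unravel_index(np.argmax(corr), corr.shape)
            # Map peaks past the half-size to negative shifts.
            return tuple(p if p <= s // 2 else p - s
                         for p, s in zip(peak, corr.shape))

        # Check on a synthetic volume: rolling 'b' by the result recovers 'a'.
        rng = np.random.default_rng(1)
        a = rng.random((32, 32, 32))
        b = np.roll(a, shift=(-3, 2, -5), axis=(0, 1, 2))
        print(phase_correlation_shift(a, b))  # expected: (3, -2, 5)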

  18. Redirecting Under-Utilised Computer Laboratories into Cluster Computing Facilities

    ERIC Educational Resources Information Center

    Atkinson, John S.; Spennemann, Dirk H. R.; Cornforth, David

    2005-01-01

    Purpose: To provide administrators at an Australian university with data on the feasibility of redirecting under-utilised computer laboratories facilities into a distributed high performance computing facility. Design/methodology/approach: The individual log-in records for each computer located in the computer laboratories at the university were…

  19. Using the Computer as a Laboratory Instrument.

    ERIC Educational Resources Information Center

    Collings, Peter J.; Greenslade, Thomas B., Jr.

    1989-01-01

    Reports experiences during a two-year period in introducing the computer to the laboratory and students to the computer as a laboratory instrument. Describes a working philosophy, data acquisition system, and experiments. Summarizes the laboratory procedures of nine experiments, covering mechanics, heat, electromagnetism, and optics. (YP)

  20. The Laboratory for Oceans Computing Facility

    NASA Technical Reports Server (NTRS)

    Kao, R.

    1988-01-01

    The first VAX computer in the Laboratory for Oceans Computing Facility (LOCF) was installed and the facility was substantially expanded. The growth is not only in hardware and software, but also in the number of users and in supporting research and development projects. The LOCF serves as a general purpose computing facility for: ocean color research projects, sea ice research projects, processing of the Nimbus-7 Coastal Zone Color Scanner data set, real time ingest and analysis of TIROS-N satellite data, study of the Synthetic Aperture Radar data, study of LANDSAT data, and many others. The physical space and the electrical power layout of the computing room were modified to accommodate all the equipment. The LOCF has several image processing stations which include two International Imaging Systems (IIS) model 75 processors and one Adage processor. The facility has the capability of ingesting the TIROS-N HRPT satellite data on a real time basis. More than 30 software packages were installed on the systems. System software packages, network software, FORTRAN and C compilers, database management software, image processing software, graphics, mathematics and statistics packages, TAE, Catalog Manager, GEMPAK, LAS and many other software packages developed on the LOCF computers such as SEAPAK have greatly advanced the capability of the LOCF.

  21. Center for Advanced Computational Technology

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    2000-01-01

    The Center for Advanced Computational Technology (ACT) was established to serve as a focal point for diverse research activities pertaining to application of advanced computational technology to future aerospace systems. These activities include the use of numerical simulations, artificial intelligence methods, multimedia and synthetic environments, and computational intelligence, in the modeling, analysis, sensitivity studies, optimization, design and operation of future aerospace systems. The Center is located at NASA Langley and is an integral part of the School of Engineering and Applied Science of the University of Virginia. The Center has four specific objectives: 1) conduct innovative research on applications of advanced computational technology to aerospace systems; 2) act as a pathfinder by demonstrating to the research community what can be done (high-potential, high-risk research); 3) help in identifying future directions of research in support of the aeronautical and space missions of the twenty-first century; and 4) help in the rapid transfer of research results to industry and in broadening awareness among researchers and engineers of the state of the art in applications of advanced computational technology to the analysis, design, prototyping, and operation of aerospace and other high-performance engineering systems. In addition to research, Center activities include helping in the planning and coordination of the activities of a multi-center team of NASA and JPL researchers who are developing an intelligent synthesis environment for future aerospace systems; organizing workshops and national symposia; and writing state-of-the-art monographs and NASA special publications on timely topics.

  22. Recent advances in computational aerodynamics

    NASA Astrophysics Data System (ADS)

    Agarwal, Ramesh K.; Desse, Jerry E.

    1991-04-01

    The current state of the art in computational aerodynamics is described. Recent advances in the discretization of surface geometry, grid generation, and flow simulation algorithms have led to flowfield predictions for increasingly complex and realistic configurations. As a result, computational aerodynamics is emerging as a crucial enabling technology for the development and design of flight vehicles. Examples illustrating the current capability for the prediction of aircraft, launch vehicle, and helicopter flowfields are presented. Unfortunately, accurate modeling of turbulence remains a major difficulty in the analysis of viscosity-dominated flows. In the future, inverse design methods, multidisciplinary design optimization methods, artificial intelligence technology, and massively parallel computer technology will be incorporated into computational aerodynamics, opening up greater opportunities for improved product design at substantially reduced costs.

  23. Advanced flight computer. Special study

    NASA Technical Reports Server (NTRS)

    Coo, Dennis

    1995-01-01

    This report documents a special study to define a 32-bit radiation-hardened, SEU-tolerant flight computer architecture, and to investigate current or near-term technologies and development efforts that contribute to the Advanced Flight Computer (AFC) design and development. An AFC processing node architecture is defined. Each node may consist of a multi-chip processor as needed. The modular, building-block approach uses VLSI technology and packaging methods that demonstrate a feasible AFC module in 1998 that meets the AFC goals. The defined architecture and approach demonstrate a clear low-risk, low-cost path to the 1998 production goal, with intermediate prototypes in 1996.

  24. Argonne's Laboratory computing center - 2007 annual report.

    SciTech Connect

    Bair, R.; Pieper, G. W.

    2008-05-28

    Argonne National Laboratory founded the Laboratory Computing Resource Center (LCRC) in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. In September 2002 the LCRC deployed a 350-node computing cluster from Linux NetworX to address Laboratory needs for mid-range supercomputing. This cluster, named 'Jazz', achieved over a teraflop of computing power (10¹² floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the 50 fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2007, there were over 60 active projects representing a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to foster growth in the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure providers to offer more scientific data management capabilities, expanding Argonne staff use of national computing facilities, and improving the scientific reach and

  25. Software Security in the University Computer Laboratories.

    ERIC Educational Resources Information Center

    Kung, Mable T.

    1989-01-01

    Discussion of software security in university computer laboratories focuses on the causes of computer viruses. Possible ways to detect an infected disk are described; strategies for professors, students, and computer personnel to eradicate the spread of a computer virus are proposed; and two resources for further information are given. (LRW)

  26. Computational Biology, Advanced Scientific Computing, and Emerging Computational Architectures

    SciTech Connect

    2007-06-27

    This CRADA was established at the start of FY02 with $200 K from IBM and matching funds from DOE to support post-doctoral fellows in collaborative research between International Business Machines and Oak Ridge National Laboratory to explore effective use of emerging petascale computational architectures for the solution of computational biology problems. 'No cost' extensions of the CRADA were negotiated with IBM for FY03 and FY04.

  27. Systems engineering and integration: Advanced avionics laboratories

    NASA Technical Reports Server (NTRS)

    1990-01-01

    In order to develop the new generation of avionics which will be necessary for upcoming programs such as the Lunar/Mars Initiative, Advanced Launch System, and the National Aerospace Plane, new Advanced Avionics Laboratories are required. To minimize costs and maximize benefits, these laboratories should be capable of supporting multiple avionics development efforts at a single location, and should be of a common design to support and encourage data sharing. Recent technological advances provide the capability of letting the designer or analyst perform simulations and testing in an environment similar to his engineering environment and these features should be incorporated into the new laboratories. Existing and emerging hardware and software standards must be incorporated wherever possible to provide additional cost savings and compatibility. Special care must be taken to design the laboratories such that real-time hardware-in-the-loop performance is not sacrificed in the pursuit of these goals. A special program-independent funding source should be identified for the development of Advanced Avionics Laboratories as resources supporting a wide range of upcoming NASA programs.

  28. Advanced Laboratory NMR Spectrometer with Applications.

    ERIC Educational Resources Information Center

    Biscegli, Clovis; And Others

    1982-01-01

    A description is given of an inexpensive nuclear magnetic resonance (NMR) spectrometer suitable for use in advanced laboratory courses. Applications to the nondestructive analysis of the oil content in corn seeds and in monitoring the crystallization of polymers are presented. (SK)

  29. Teaching Cardiovascular Integrations with Computer Laboratories.

    ERIC Educational Resources Information Center

    Peterson, Nils S.; Campbell, Kenneth B.

    1985-01-01

    Describes a computer-based instructional unit in cardiovascular physiology. The program, which employs simulated laboratory experimental techniques within a problem-solving format, is designed to supplement an animal laboratory and to offer students an integrative approach to physiology through use of microcomputers. Also presents an overview of the…

  30. A Flexible Software Scheme for a Clinical Laboratory Computer Network

    PubMed Central

    Foulis, Philip R.; Megargle, Robert; Shecket, Gordon; Su, Joseph; Dartt, Arthur

    1983-01-01

    A laboratory information management system (LMS) must disseminate pertinent data to other hospital departments and organize laboratory workflow, while remaining flexible enough to conform to the organization and practices of the laboratory. While stand-alone LMS's excel at providing versatility through specialized functions like direct instrument interfaces, total hospital information systems (HIS's) are better at combining and distributing diverse data. In general, neither of these approaches has provided the level of performance desired in a modern hospital environment. A formalized scheme of implementing an LMS with a network of computers has been devised to provide the advantages of both approaches and to incorporate advanced levels of customization.

  31. Advances in computational solvation thermodynamics

    NASA Astrophysics Data System (ADS)

    Wyczalkowski, Matthew A.

    The aim of this thesis is to develop improved methods for calculating the free energy, entropy and enthalpy of solvation from molecular simulations. Solvation thermodynamics of model compounds provides quantitative measurements used to analyze the stability of protein conformations in aqueous milieus. Solvation free energies govern the favorability of the solvation process, while entropy and enthalpy decompositions give insight into the molecular mechanisms by which the process occurs. Computationally, a coupling parameter lambda modulates solute-solvent interactions to simulate an insertion process, and multiple lengthy simulations at a fixed lambda value are typically required for free energy calculations to converge; entropy and enthalpy decompositions generally take 10-100 times longer. This thesis presents three advances which accelerate the convergence of such calculations: (1) Development of entropy and enthalpy estimators which combine data from multiple simulations; (2) Optimization of lambda schedules, or the set of parameter values associated with each simulation; (3) Validation of Hamiltonian replica exchange, a technique which swaps lambda values between two otherwise independent simulations. Taken together, these techniques promise to increase the accuracy and precision of free energy, entropy and enthalpy calculations. Improved estimates, in turn, can be used to investigate the validity and limits of existing solvation models and refine force field parameters, with the goal of understanding better the collapse transition and aggregation behavior of polypeptides.
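
    One standard estimator in this setting is thermodynamic integration, stated here as background (it is not necessarily the improved estimator developed in the thesis): the free energy difference is the integral over lambda of the time-averaged derivative of the potential, evaluated by quadrature over the lambda schedule. A minimal sketch with invented numbers:

        import numpy as np

        # Hypothetical lambda schedule and time-averaged <dU/dlambda> values
        # (kJ/mol), one simulation per lambda; numbers are invented.
        lam  = np.array([0.00, 0.10, 0.25, 0.50, 0.75, 0.90, 1.00])
        dUdl = np.array([-310.0, -255.0, -180.0, -95.0, -38.0, -12.0, -4.0])

        # Thermodynamic integration: Delta F = integral_0^1 <dU/dlambda> dlambda.
        delta_F = np.trapz(dUdl, lam)
        print(f"solvation free energy estimate: {delta_F:.1f} kJ/mol")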

  32. Los Alamos National Laboratory computer benchmarking 1982

    SciTech Connect

    Martin, J.L.

    1983-06-01

    Evaluating the performance of computing machinery is a continual effort of the Computer Research and Applications Group of the Los Alamos National Laboratory. This report summarizes the results of the group's benchmarking activities performed between October 1981 and September 1982, presenting compilation and execution times as well as megaflop rates for a set of benchmark codes. Tests were performed on the following computers: Cray Research, Inc. (CRI) Cray-1S; Control Data Corporation (CDC) 7600, 6600, Cyber 73, Cyber 825, Cyber 835, Cyber 855, and Cyber 205; Digital Equipment Corporation (DEC) VAX 11/780 and VAX 11/782; and Apollo Computer, Inc., Apollo.
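
    A megaflop rate of the kind tabulated in such reports is simply the floating-point operation count divided by wall-clock time. A toy benchmark in that spirit (NumPy and modern hardware standing in for the 1982 codes and machines):

        import time
        import numpy as np

        # A DAXPY-style kernel: y = a*x + y performs 2*n flops per pass.
        n, passes = 1_000_000, 50
        x = np.random.random(n)
        y = np.random.random(n)

        start = time.perf_counter()
        for _ in range(passes):
            y = 2.5 * x + y
        elapsed = time.perf_counter() - start

        mflops = (2.0 * n * passes) / elapsed / 1e6
        print(f"{mflops:.0f} MFLOPS ({elapsed:.3f} s elapsed)")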

  33. JPL Robotics Laboratory computer vision software library

    NASA Technical Reports Server (NTRS)

    Cunningham, R.

    1984-01-01

    The past ten years of research on computer vision have matured into a powerful real-time system composed of standardized commercial hardware, computers, and pipeline-processing laboratory prototypes, supported by an extensive set of image processing algorithms. The software system was constructed to be transportable via the choice of a popular high-level language (PASCAL) and a widely used computer (VAX-11/750); it comprises a whole realm of low-level and high-level processing software that has proven to be versatile for applications ranging from factory automation to space satellite tracking and grappling.

  34. Computer Based Simulation of Laboratory Experiments.

    ERIC Educational Resources Information Center

    Edward, Norrie S.

    1997-01-01

    Examines computer based simulations of practical laboratory experiments in engineering. Discusses the aims and achievements of lab work (cognitive, process, psychomotor, and affective); types of simulations (model building and behavioral); and the strengths and weaknesses of simulations. Describes the development of a centrifugal pump simulation,…

  35. Computers in the General Physics Laboratory.

    ERIC Educational Resources Information Center

    Preston, Daryl W.; Good, R. H.

    1996-01-01

    Provides ideas and outcomes for nine computer laboratory experiments using a commercial eight-bit analog-to-digital converter (ADC) interface. Experiments cover statistics; rotation; harmonic motion; voltage, current, and resistance; ADC conversions; temperature measurement; single-slit diffraction; and radioactive decay. Includes necessary schematics. (MVL)

  36. Advancing manufacturing through computational chemistry

    SciTech Connect

    Noid, D.W.; Sumpter, B.G.; Tuzun, R.E.

    1995-12-31

    The capabilities of nanotechnology and computational chemistry are reaching a point of convergence. New computer hardware and novel computational methods have created opportunities to test proposed nanometer-scale devices, investigate molecular manufacturing, and model and predict properties of new materials. Experimental methods are also beginning to provide new capabilities that make the possibility of manufacturing various devices with atomic precision tangible. In this paper, we will discuss some of the novel computational methods we have used in molecular dynamics simulations of polymer processes, neural network predictions of new materials, and simulations of proposed nano-bearings and fluid dynamics in nano-sized devices.

  37. Advanced Materials Laboratory User Test Planning Guide

    NASA Technical Reports Server (NTRS)

    Orndoff, Evelyne

    2012-01-01

    Test process, milestones and inputs are unknowns to first-time users of the Advanced Materials Laboratory. The User Test Planning Guide aids in establishing expectations for both NASA and non-NASA facility customers. The potential audience for this guide includes both internal and commercial spaceflight hardware/software developers. It is intended to assist their test engineering personnel in test planning and execution. Material covered includes a roadmap of the test process, roles and responsibilities of facility and user, major milestones, facility capabilities, and inputs required by the facility. Samples of deliverables, test article interfaces, and inputs necessary to define test scope, cost, and schedule are included as an appendix to the guide.

  38. Quantum chromodynamics with advanced computing

    SciTech Connect

    Kronfeld, Andreas S. (Fermilab)

    2008-07-01

    We survey results in lattice quantum chromodynamics from groups in the USQCD Collaboration. The main focus is on physics, but many aspects of the discussion are aimed at an audience of computational physicists.

  39. Aerodynamic Analyses Requiring Advanced Computers, Part 1

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Papers are presented which deal with results of theoretical research on aerodynamic flow problems requiring the use of advanced computers. Topics discussed include: viscous flows, boundary layer equations, turbulence modeling and Navier-Stokes equations, and internal flows.

  40. Aerodynamic Analyses Requiring Advanced Computers, Part 2

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Papers given at the conference present the results of theoretical research on aerodynamic flow problems requiring the use of advanced computers. Topics discussed include two-dimensional configurations, three-dimensional configurations, transonic aircraft, and the space shuttle.

  41. Bringing Advanced Computational Techniques to Energy Research

    SciTech Connect

    Mitchell, Julie C

    2012-11-17

    Please find attached our final technical report for the BACTER Institute award. BACTER was created as a graduate and postdoctoral training program for the advancement of computational biology applied to questions of relevance to bioenergy research.

  42. Sandia National Laboratories Advanced Simulation and Computing (ASC) : appraisal method for the implementation of the ASC software quality engineering practices: Version 1.0.

    SciTech Connect

    Turgeon, Jennifer; Minana, Molly A.

    2008-02-01

    This document provides a guide to the process of conducting software appraisals under the Sandia National Laboratories (SNL) ASC Program. The goal of this document is to describe a common methodology for planning, conducting, and reporting the results of software appraisals, thereby enabling: development of an objective baseline on implementation of the software quality engineering (SQE) practices identified in the ASC Software Quality Plan across the ASC Program; feedback from project teams on SQE opportunities for improvement; identification of strengths and opportunities for improvement for individual project teams; and guidance to the ASC Program on the focus of future SQE activities. Document contents include process descriptions, templates to promote consistent conduct of appraisals, and an explanation of the relationship of this procedure to the SNL ASC software program.

  43. Advanced algorithm for orbit computation

    NASA Technical Reports Server (NTRS)

    Szebehely, V.

    1983-01-01

    Computational and analytical techniques which simplify the solution of complex problems in orbit mechanics, astrodynamics, and celestial mechanics were developed. The major tool of the simplification is the substitution of transformations in place of numerical or analytical integrations. In this way the rather complicated equations of orbit mechanics can sometimes be reduced to linear equations representing harmonic oscillators with constant coefficients.
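
    A classic example of such a transformation (a standard result from the regularization literature, shown here for illustration rather than taken from the report) is the Levi-Civita map, which turns the singular planar Kepler problem into a constant-coefficient harmonic oscillator:

        % Planar Kepler problem (z complex) and its energy integral:
        \ddot{z} = -\frac{\mu\, z}{\lvert z\rvert^{3}}, \qquad
        h = \tfrac{1}{2}\,\lvert\dot{z}\rvert^{2} - \frac{\mu}{\lvert z\rvert}
        % The substitution z = u^{2} with fictitious time dt = |z|\, ds
        % (primes denote d/ds) reduces the equation of motion to
        u'' - \frac{h}{2}\, u = 0
        % For bound orbits h < 0, so u oscillates harmonically with constant
        % frequency \sqrt{-h/2}, and the collision singularity |z| = 0 is removed.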

  44. 77 FR 62231 - DOE/Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-12

    .../Advanced Scientific Computing Advisory Committee AGENCY: Office of Science, Department of Energy. ACTION: Notice of Open Meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing...: Melea Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building;...

  45. 76 FR 31945 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-02

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION... the Advanced Scientific Computing Advisory Committee (ASCAC). The Federal Advisory Committee Act (Pub... INFORMATION CONTACT: Melea Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown...

  46. Advanced Biomedical Computing Center (ABCC) | DSITP

    Cancer.gov

    The Advanced Biomedical Computing Center (ABCC), located in Frederick, Maryland, provides HPC resources for both NIH/NCI intramural scientists and the extramural biomedical research community. Its mission is to provide HPC support, to engage in collaborative research, and to conduct in-house research in various areas of computational biology and biomedical research.

  47. Advances and trends in computational structural mechanics

    NASA Technical Reports Server (NTRS)

    Noor, A. K.

    1986-01-01

    Recent developments in computational structural mechanics are reviewed with reference to computational needs for future structures technology, advances in computational models for material behavior, discrete element technology, assessment and control of numerical simulations of structural response, hybrid analysis, and techniques for large-scale optimization. Research areas in computational structural mechanics which have high potential for meeting future technological needs are identified. These include prediction and analysis of the failure of structural components made of new materials, development of computational strategies and solution methodologies for large-scale structural calculations, and assessment of reliability and adaptive improvement of response predictions.

  48. Eagleworks Laboratories: Advanced Propulsion Physics Research

    NASA Technical Reports Server (NTRS)

    White, Harold; March, Paul; Williams, Nehemiah; O'Neill, William

    2011-01-01

    NASA/JSC is implementing an advanced propulsion physics laboratory, informally known as "Eagleworks", to pursue propulsion technologies necessary to enable human exploration of the solar system over the next 50 years, and to enable interstellar spaceflight by the end of the century. This work directly supports the "Breakthrough Propulsion" objectives detailed in the NASA OCT TA02 In-space Propulsion Roadmap, and aligns with the #10 Top Technical Challenge identified in the report. Since the work being pursued by this laboratory is applied scientific research in the areas of the quantum vacuum, gravitation, the nature of space-time, and other fundamental physical phenomena, high-fidelity testing facilities are needed. The lab will first implement a low-thrust torsion pendulum (<1 uN), and commission the facility with an existing Quantum Vacuum Plasma Thruster. To date, the QVPT line of research has produced data suggesting very high specific impulse coupled with high specific force. If the physics and engineering models can be explored and understood in the lab to allow scaling to power levels pertinent for human spaceflight, 400 kW SEP human missions to Mars may become a possibility, and at power levels of 2 MW, a 1-year transit to Neptune may also be possible. Additionally, the lab is implementing a warp field interferometer that will be able to measure spacetime disturbances down to 150 nm. Recent work published by White [1] [2] [3] suggests that it may be possible to engineer spacetime, creating conditions similar to what drives the expansion of the cosmos. Although the expected magnitude of the effect would be tiny, it may be a "Chicago pile" moment for this area of physics.

  49. Advancing crime scene computer forensics techniques

    NASA Astrophysics Data System (ADS)

    Hosmer, Chet; Feldman, John; Giordano, Joe

    1999-02-01

    Computers and network technology have become inexpensive and powerful tools that can be applied to a wide range of criminal activity. Computers have changed the world's view of evidence because computers are used more and more as tools in committing 'traditional crimes' such as embezzlement, theft, extortion, and murder. This paper will focus on reviewing the current state of the art of the data recovery and evidence construction tools used in both the field and laboratory for prosecution purposes.

  50. Opportunities in computational mechanics: Advances in parallel computing

    SciTech Connect

    Lesar, R.A.

    1999-02-01

    In this paper, the authors will discuss recent advances in computing power and the prospects for using these new capabilities for studying plasticity and failure. They will first review the new capabilities made available with parallel computing. They will discuss how these machines perform and how well their architecture might work on materials issues. Finally, they will give some estimates on the size of problems possible using these computers.

  51. Laboratory Diagnosis of Human Rabies: Recent Advances

    PubMed Central

    Mani, Reeta Subramaniam; Madhusudana, Shampur Narayan

    2013-01-01

    Rabies, an acute, progressive, fatal encephalomyelitis, transmitted most commonly through the bite of a rabid animal, is responsible for an estimated 61,000 human deaths worldwide annually. The true disease burden and public health impact due to rabies remain underestimated due to the lack of sensitive laboratory diagnostic methods. Rapid diagnosis of rabies can help initiate prompt infection control and public health measures, obviate the need for unnecessary treatment/medical tests, and assist in timely administration of pre- or postexposure prophylactic vaccination to family members and medical staff. Antemortem diagnosis of human rabies provides an impetus for clinicians to attempt experimental therapeutic approaches in some patients, especially after the reported survival of a few cases of human rabies. Traditional methods for antemortem and postmortem rabies diagnosis have several limitations. Recent advances in technology have led to the improvement or development of several diagnostic assays which include methods for rabies viral antigen and antibody detection and assays for viral nucleic acid detection and identification of specific biomarkers. These assays, which complement traditional methods, have the potential to revolutionize rabies diagnosis in future. PMID:24348170

  52. Computer Simulations Improve University Instructional Laboratories

    PubMed Central

    2004-01-01

    Laboratory classes are commonplace and essential in biology departments but can sometimes be cumbersome, unreliable, and a drain on time and resources. As university intakes increase, pressure on budgets and staff time can often lead to reduction in practical class provision. Frequently, the ability to use laboratory equipment, mix solutions, and manipulate test animals are essential learning outcomes, and “wet” laboratory classes are thus appropriate. In others, however, interpretation and manipulation of the data are the primary learning outcomes, and here, computer-based simulations can provide a cheaper, easier, and less time- and labor-intensive alternative. We report the evaluation of two computer-based simulations of practical exercises: the first in chromosome analysis, the second in bioinformatics. Simulations can provide significant time savings to students (by a factor of four in our first case study) without affecting learning, as measured by performance in assessment. Moreover, under certain circumstances, performance can be improved by the use of simulations (by 7% in our second case study). We concluded that the introduction of these simulations can significantly enhance student learning where consideration of the learning outcomes indicates that it might be appropriate. In addition, they can offer significant benefits to teaching staff. PMID:15592599

  53. Role of HPC in Advancing Computational Aeroelasticity

    NASA Technical Reports Server (NTRS)

    Guruswamy, Guru P.

    2004-01-01

    On behalf of the High Performance Computing and Modernization Program (HPCMP) and the NASA Advanced Supercomputing Division (NAS), a study was conducted to assess the role of supercomputers in the computational aeroelasticity of aerospace vehicles. The study is mostly based on the responses to a web-based questionnaire that was designed to capture the nuances of high performance computational aeroelasticity, particularly on parallel computers. A procedure is presented to assign a fidelity-complexity index to each application. Case studies based on major applications using HPCMP resources are presented.

  54. Advances and trends in computational structures technology

    NASA Technical Reports Server (NTRS)

    Noor, A. K.; Venneri, S. L.

    1990-01-01

    The major goals of computational structures technology (CST) are outlined, and recent advances in CST are examined. These include computational material modeling, stochastic-based modeling, computational methods for articulated structural dynamics, strategies and numerical algorithms for new computing systems, and multidisciplinary analysis and optimization. The role of CST in the future development of structures technology and the multidisciplinary design of future flight vehicles is addressed, and the future directions of CST research in the prediction of failures of structural components, the solution of large-scale structural problems, and quality assessment and control of numerical simulations are discussed.

  55. OPENING REMARKS: Scientific Discovery through Advanced Computing

    NASA Astrophysics Data System (ADS)

    Strayer, Michael

    2006-01-01

    as the national and regional electricity grid, carbon sequestration, virtual engineering, and the nuclear fuel cycle. The successes of the first five years of SciDAC have demonstrated the power of using advanced computing to enable scientific discovery. One measure of this success could be found in the President’s State of the Union address in which President Bush identified ‘supercomputing’ as a major focus area of the American Competitiveness Initiative. Funds were provided in the FY 2007 President’s Budget request to increase the size of the NERSC-5 procurement to between 100-150 teraflops, to upgrade the LCF Cray XT3 at Oak Ridge to 250 teraflops and acquire a 100 teraflop IBM BlueGene/P to establish the Leadership computing facility at Argonne. We believe that we are on a path to establish a petascale computing resource for open science by 2009. We must develop software tools, packages, and libraries as well as the scientific application software that will scale to hundreds of thousands of processors. Computer scientists from universities and the DOE’s national laboratories will be asked to collaborate on the development of the critical system software components such as compilers, light-weight operating systems and file systems. Standing up these large machines will not be business as usual for ASCR. We intend to develop a series of interconnected projects that identify cost, schedule, risks, and scope for the upgrades at the LCF at Oak Ridge, the establishment of the LCF at Argonne, and the development of the software to support these high-end computers. The critical first step in defining the scope of the project is to identify a set of early application codes for each leadership class computing facility. These codes will have access to the resources during the commissioning phase of the facility projects and will be part of the acceptance tests for the machines. Applications will be selected, in part, by breakthrough science, scalability, and

  56. America's most computer-advanced healthcare facilities.

    PubMed

    1993-02-01

    Healthcare Informatics polled industry experts for nominations for this listing of America's Most Computer-Advanced Healthcare Facilities. Nominations were reviewed for extent of departmental automation, leading-edge applications, advanced point-of-care technologies, and networking communications capabilities. Additional consideration was given to smaller facilities automated beyond "normal expectations." Facility representatives who believe their organizations should be included in our next listing, please contact Healthcare Informatics for a nomination form.

  57. National Laboratory for Advanced Scientific Visualization at UNAM - Mexico

    NASA Astrophysics Data System (ADS)

    Manea, Marina; Constantin Manea, Vlad; Varela, Alfredo

    2016-04-01

    In 2015, the National Autonomous University of Mexico (UNAM) joined the family of universities and research centers where advanced visualization and computing play a key role in promoting and advancing missions in research, education, community outreach, and business-oriented consulting. This initiative provides access to a great variety of advanced hardware and software resources and offers a range of consulting services that spans a variety of areas related to scientific visualization, among which are: neuroanatomy, embryonic development, genome-related studies, geosciences, geography, physics, and mathematics-related disciplines. The National Laboratory for Advanced Scientific Visualization delivers services through three main infrastructure environments: the fully immersive 3D display system (the Cave), the high-resolution parallel visualization system (the Powerwall), and the high-resolution spherical display (the Earth Simulator). The entire visualization infrastructure is interconnected to a high-performance computing cluster (HPCC) called ADA, in honor of Ada Lovelace, considered to be the first computer programmer. The Cave is an extra-large room, 3.6 m wide, with images projected on the front, left, and right walls as well as the floor. Specialized crystal-eyes LCD-shutter glasses provide a strong stereo depth perception, and a variety of tracking devices allow software to track the position of a user's hand, head, and wand. The Powerwall is designed to bring large amounts of complex data together through parallel computing for team interaction and collaboration. This system is composed of 24 (6x4) high-resolution, ultra-thin (2 mm) bezel monitors connected to a high-performance GPU cluster. The Earth Simulator is a large (60") high-resolution spherical display used for global-scale visualization of geophysical, meteorological, climate, and ecology data. The HPCC ADA is a 1000+ computing-core system, which offers parallel computing resources to applications that require

  58. The Particle Beam Optics Interactive Computer Laboratory

    SciTech Connect

    Gillespie, George H.; Hill, Barrey W.; Brown, Nathan A.; Babcock, R. Chris; Martono, Hendy; Carey, David C.

    1997-02-01

    The Particle Beam Optics Interactive Computer Laboratory (PBO Lab) is an educational software concept to aid students and professionals in learning about charged particle beams and particle beam optical systems. The PBO Lab is being developed as a cross-platform application and includes four key elements. The first is a graphic user interface shell that provides for a highly interactive learning session. The second is a knowledge database containing information on electric and magnetic optics transport elements. The knowledge database provides interactive tutorials on the fundamental physics of charged particle optics and on the technology used in particle optics hardware. The third element is a graphical construction kit that provides tools for students to interactively and visually construct optical beamlines. The final element is a set of charged particle optics computational engines that compute trajectories, transport beam envelopes, fit parameters to optical constraints and carry out similar calculations for the student designed beamlines. The primary computational engine is provided by the third-order TRANSPORT code. Augmenting TRANSPORT is the multiple ray tracing program TURTLE and a first-order matrix program that includes a space charge model and support for calculating single particle trajectories in the presence of the beam space charge. This paper describes progress on the development of the PBO Lab.
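
    At first order, TRANSPORT-style computational engines model each beamline element as a transfer matrix acting on phase-space coordinates, so a beamline is just a matrix product. A minimal single-plane sketch (drift and thin-lens quadrupole only; illustrative of the general approach, not the PBO Lab code itself):

        import numpy as np

        def drift(L):
            """First-order transfer matrix of a drift of length L (coords (x, x'))."""
            return np.array([[1.0, L],
                             [0.0, 1.0]])

        def thin_quad(f):
            """Thin-lens quadrupole of focal length f (focusing for f > 0)."""
            return np.array([[1.0, 0.0],
                             [-1.0 / f, 1.0]])

        # Trace a ray through drift - quad - drift (matrices compose right to left).
        beamline = drift(1.5) @ thin_quad(0.8) @ drift(1.0)
        ray = np.array([0.002, 0.001])  # x = 2 mm, x' = 1 mrad
        print("final (x, x'):", beamline @ ray)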

  59. [Advanced data analysis and visualization for clinical laboratory].

    PubMed

    Inada, Masanori; Yoneyama, Akiko

    2011-01-01

    This paper describes visualization techniques that help identify hidden structures in clinical laboratory data. The visualization of data is helpful for a rapid and better understanding of the characteristics of data sets. Various charts help the user identify trends in data. Scatter plots help prevent misinterpretations due to invalid data by identifying outliers. The representation of experimental data in figures is always useful for communicating results to others. Currently, flexible methods such as smoothing methods and latent structure analysis are available owing to advances in hardware and software. Principal component analysis, a well-known technique used to reduce multidimensional data sets, can be carried out on a personal computer. These methods could lead to advanced visualization for exploratory data analysis. In this paper, we present three examples in order to introduce advanced data analysis. In the first example, a smoothing spline was fitted to a time series from a control chart that was not in a state of statistical control. The trend line was clearly extracted from the daily measurements of the control samples. In the second example, principal component analysis was used to identify a new diagnostic indicator for Graves' disease. The multidimensional data obtained from patients were reduced to lower dimensions, and the principal components thus obtained summarized the variation in the data set. In the final example, a latent structure analysis for a Gaussian mixture model was used to draw complex density functions suitable for actual laboratory data. As a result, five clusters were extracted. The mixed density function of these clusters represented the data distribution graphically. The methods used in the above examples make the creation of complicated models for clinical laboratories simpler and more flexible.
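
    The second and third examples map directly onto standard tooling. A hedged sketch using synthetic stand-in data (the scikit-learn calls are real, but the data and parameter choices are invented for illustration, not taken from the paper):

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.mixture import GaussianMixture

        # Synthetic stand-in for multi-analyte laboratory results
        # (rows = specimens, columns = tests); values are illustrative only.
        rng = np.random.default_rng(42)
        data = np.vstack([rng.normal(loc=m, scale=1.0, size=(200, 4))
                          for m in (0.0, 3.0, 6.0)])

        # Reduce to two principal components for plotting and inspection.
        scores = PCA(n_components=2).fit_transform(data)

        # Fit a Gaussian mixture and extract cluster assignments.
        gmm = GaussianMixture(n_components=3, random_state=0).fit(scores)
        labels = gmm.predict(scores)
        print("cluster sizes:", np.bincount(labels))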

  1. Incorporation of Process Control Computers in the Undergraduate Laboratory.

    ERIC Educational Resources Information Center

    Conner, Wm. Curtis, Jr.

    1990-01-01

    Describes the conversion of a laboratory and change in course content in a chemical engineering curriculum. Lists laboratory experiments and computer programs used in the course. Discusses difficulties during the laboratory conversion and future plans for the course. (YP)

  2. Advances and Challenges in Computational Plasma Science

    SciTech Connect

    W.M. Tang; V.S. Chan

    2005-01-03

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behavior. This paper reviews recent advances in simulations of magnetically confined plasmas, with illustrative examples drawn from research areas such as microturbulence and magnetohydrodynamics. Progress has been stimulated in particular by the exponential growth of computer speed along with significant improvements in computer technology.

  3. Applied human factors research at the NASA Johnson Space Center Human-Computer Interaction Laboratory

    NASA Technical Reports Server (NTRS)

    Rudisill, Marianne; Mckay, Timothy D.

    1990-01-01

    The applied human factors research program performed at the NASA Johnson Space Center's Human-Computer Interaction Laboratory is discussed. Research is conducted to advance knowledge in human interaction with computer systems during space crew tasks. In addition, the Laboratory is directly involved in the specification of the human-computer interface (HCI) for space systems in development (e.g., Space Station Freedom) and is providing guidelines and support for HCI design to current and future space missions.

  4. Advanced computational tools for 3-D seismic analysis

    SciTech Connect

    Barhen, J.; Glover, C.W.; Protopopescu, V.A.

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis, and test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY 1993-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the computational innovations that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  5. Advanced networks and computing in healthcare

    PubMed Central

    Ackerman, Michael

    2011-01-01

    As computing and network capabilities continue to rise, it becomes increasingly important to understand the varied applications for using them to provide healthcare. The objective of this review is to identify key characteristics and attributes of healthcare applications involving the use of advanced computing and communication technologies, drawing upon 45 research and development projects in telemedicine and other aspects of healthcare funded by the National Library of Medicine over the past 12 years. Only projects publishing in the professional literature were included in the review. Four projects did not publish beyond their final reports. In addition, the authors drew on their first-hand experience as project officers, reviewers and monitors of the work. Major themes in the corpus of work were identified, characterizing key attributes of advanced computing and network applications in healthcare. Advanced computing and network applications are relevant to a range of healthcare settings and specialties, but they are most appropriate for solving a narrower range of problems in each. Healthcare projects undertaken primarily to explore potential have also demonstrated effectiveness and depend on the quality of network service as much as bandwidth. Many applications are enabling, making it possible to provide service or conduct research that previously was not possible or to achieve outcomes in addition to those for which projects were undertaken. Most notable are advances in imaging and visualization, collaboration and sense of presence, and mobility in communication and information-resource use. PMID:21486877

  6. Predictive Dynamic Security Assessment through Advanced Computing

    SciTech Connect

    Huang, Zhenyu; Diao, Ruisheng; Jin, Shuangshuang; Chen, Yousu

    2014-11-30

    Traditional dynamic security assessment is limited by several factors and thus falls short in providing real-time information to be predictive for power system operation. These factors include the steady-state assumption of current operating points, static transfer limits, and low computational speed. This paper addresses these factors and frames predictive dynamic security assessment. The primary objective of predictive dynamic security assessment is to enhance the functionality and computational process of dynamic security assessment through the use of high-speed phasor measurements and the application of advanced computing technologies for faster-than-real-time simulation. This paper presents algorithms, computing platforms, and simulation frameworks that constitute the predictive dynamic security assessment capability. Examples of phasor application and fast computation for dynamic security assessment are included to demonstrate the feasibility and speed enhancement for real-time applications.
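
    To make the simulation workload concrete, the following Python fragment is a minimal sketch of the kind of time-domain dynamics such an engine must integrate faster than real time: the classical single-machine swing equation. The machine parameters and the post-fault perturbation are invented for illustration and are not taken from the paper.

        # Single-machine-infinite-bus swing equation, forward-Euler integration.
        import numpy as np

        H, D, f0 = 3.5, 0.05, 60.0            # inertia (s), damping, frequency (Hz)
        Pm, Pmax = 0.8, 1.6                   # mechanical and peak electrical power (pu)
        M = 2.0 * H / (2.0 * np.pi * f0)      # effective inertia coefficient

        delta = np.arcsin(Pm / Pmax) + 0.5    # rotor angle after a disturbance (rad)
        omega, dt = 0.0, 1.0e-3               # speed deviation, time step (s)

        for _ in range(5000):                 # simulate 5 seconds
            Pe = Pmax * np.sin(delta)         # electrical power output
            omega += (Pm - Pe - D * omega) / M * dt
            delta += omega * dt
        print(f"rotor angle after 5 s: {np.degrees(delta):.1f} deg")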

  7. Electromagnetically induced transparency in rubidium: An advanced undergraduate laboratory

    NASA Astrophysics Data System (ADS)

    Mayer, Shannon; Olson, Abraham

    2008-05-01

    Electromagnetically induced transparency (EIT) is a quantum interference effect used to modify the optical response of an atomic medium to a resonant laser field. In EIT, a non-resonant pump laser beam can result in the reduction of absorption of a weak, resonant probe laser beam, provided the fields are coherently coupled by a common state. EIT provides a unique means of coherently controlling photons and has potential applications in fields ranging from quantum computing to telecommunications. In this advanced laboratory we describe the theory and experiment for investigating ladder-type EIT in rubidium gas. The theoretical absorption profile of a weak probe laser beam tuned across the 5S1/2 to 5P3/2 transition (780.2 nm) is modeled in the presence of a strong coupling laser beam tuned to the 5P3/2 to 5D5/2 transition (776.0 nm), and the absorption transparency window is characterized. Using grating-feedback diode lasers, we observe EIT experimentally in rubidium gas and compare the results to the theoretical model. Applications of EIT to high-resolution two-photon spectroscopy are also discussed. This laboratory uses much of the same equipment as the saturated absorption experiment commonly performed on the D2 line in rubidium, so it is easily implemented in laboratories with the equipment to conduct that experiment.
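
    The transparency window can be illustrated with the standard steady-state susceptibility of a three-level ladder system. The Python sketch below is illustrative rather than the authors' model; the decay rates and coupling Rabi frequency are invented values expressed in units of the probe coherence decay rate.

        # Probe absorption ~ Im(chi) for a three-level ladder EIT system.
        import numpy as np

        gamma21, gamma31 = 1.0, 0.05         # probe and two-photon coherence decay rates
        Wc, dc = 2.0, 0.0                    # coupling Rabi frequency and detuning
        dp = np.linspace(-10.0, 10.0, 2001)  # probe detuning sweep

        chi = 1j / (gamma21 - 1j * dp + (Wc**2 / 4.0) / (gamma31 - 1j * (dp + dc)))
        absorption = chi.imag                # transparency dip appears at dp = -dc
        print(absorption[np.abs(dp).argmin()])  # suppressed absorption on resonance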

  8. Recent advances in computer image generation simulation.

    PubMed

    Geltmacher, H E

    1988-11-01

    An explosion in flight simulator technology over the past 10 years is revolutionizing U.S. Air Force (USAF) operational training. The single, most important development has been in computer image generation. However, other significant advances are being made in simulator handling qualities, real-time computation systems, and electro-optical displays. These developments hold great promise for achieving high fidelity combat mission simulation. This article reviews the progress to date and predicts its impact, along with that of new computer science advances such as very high speed integrated circuits (VHSIC), on future USAF aircrew simulator training. Some exciting possibilities are multiship, full-mission simulators at replacement training units, miniaturized unit level mission rehearsal training simulators, onboard embedded training capability, and national scale simulator networking.

  9. The NASA Advanced Propulsion Concepts at the Jet Propulsion Laboratory

    NASA Technical Reports Server (NTRS)

    Leifer, S. D.; Frisbee, R. H.; Brophy, J. R.

    1997-01-01

    Research activities in advanced propulsion concepts at the Jet Propulsion Laboratory are reviewed. The concepts were selected for study because each offers the potential for either significantly enhancing space transportation capability or enabling bold, ambitious new missions.

  10. Results of Laboratory Testing of Advanced Power Strips: Preprint

    SciTech Connect

    Earle, L.; Sparn, B.

    2012-08-01

    This paper describes the results of a laboratory investigation to evaluate the technical performance of advanced power strip (APS) devices when subjected to a range of home entertainment center and home office usage scenarios.

  11. Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2000-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center. It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. Ames has been designated NASA's Center of Excellence in Information Technology. In this capacity, Ames is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA Ames and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth; (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities; (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply information technology research to a

  12. A Virtual Laboratory on Natural Computing: A Learning Experiment

    ERIC Educational Resources Information Center

    de Castro, Leandro Nunes; Muñoz, Yupanqui Julho; de Freitas, Leandro Rubim; El-Hani, Charbel Niño

    2008-01-01

    Natural computing is a terminology used to describe computational algorithms developed by taking inspiration from information processing mechanisms in nature, methods to synthesize natural phenomena in computers, and novel computational approaches based on natural materials. The virtual laboratory on natural computing (LVCoN) is a Web environment…

  13. Guide to computing and communications at Brookhaven National Laboratory

    SciTech Connect

    Berry, H.; Fuchel, K.; Harris, A.

    1991-04-01

    This report contains information on the following topics of computing and communications at Brookhaven National Laboratory: computing hardware and operating systems; support services and facilities; getting started using the Central Scientific Computing Center (CSCF); CSCF software; data communication services; computer networking; personal computers and workstations; file storage and exchange; graphics; telecommunications services; and radio systems.

  14. Airborne Advanced Reconfigurable Computer System (ARCS)

    NASA Technical Reports Server (NTRS)

    Bjurman, B. E.; Jenkins, G. M.; Masreliez, C. J.; Mcclellan, K. L.; Templeman, J. E.

    1976-01-01

    A digital computer subsystem fault-tolerant concept was defined, and the potential benefits and costs of such a subsystem were assessed when used as the central element of a new transport's flight control system. The derived advanced reconfigurable computer system (ARCS) is a triple-redundant computer subsystem that automatically reconfigures, under multiple fault conditions, from triplex to duplex to simplex operation, with redundancy recovery if the fault condition is transient. The study included criteria development covering factors at the aircraft's operation level that would influence the design of a fault-tolerant system for commercial airline use. A new reliability analysis tool was developed for evaluating redundant, fault-tolerant system availability and survivability; and a stringent digital system software design methodology was used to achieve design/implementation visibility.
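
    The reconfiguration behavior described above can be sketched in a few lines. The following Python fragment is a minimal illustration, not the ARCS design: a majority voter that drops a disagreeing channel, stepping the system from triplex to duplex operation. The channel values and exact-match comparison are simplifications; a real system compares within tolerances and handles transient faults.

        # Majority voting with reconfiguration (illustrative sketch).
        from collections import Counter

        def vote(outputs):
            """Return the majority value and the ids of agreeing channels."""
            value, _ = Counter(outputs.values()).most_common(1)[0]
            agree = [ch for ch, v in outputs.items() if v == value]
            return value, agree

        channels = {"A": 1.00, "B": 1.00, "C": 0.42}   # channel C has faulted
        value, healthy = vote(channels)
        if len(healthy) < len(channels):               # reconfigure triplex -> duplex
            channels = {ch: channels[ch] for ch in healthy}
        print(value, sorted(channels))                 # 1.0 ['A', 'B']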

  15. A Multistep Synthesis for an Advanced Undergraduate Organic Chemistry Laboratory

    ERIC Educational Resources Information Center

    Chang Ji; Peters, Dennis G.

    2006-01-01

    Multistep syntheses are often important components of the undergraduate organic laboratory experience and a three-step synthesis of 5-(2-sulfhydrylethyl) salicylaldehyde was described. The experiment is useful as a special project for an advanced undergraduate organic chemistry laboratory course and offers opportunities for students to master a…

  16. Argonne's Laboratory computing resource center : 2006 annual report.

    SciTech Connect

    Bair, R. B.; Kaushik, D. K.; Riley, K. R.; Valdes, J. V.; Drugan, C. D.; Pieper, G. P.

    2007-05-31

    Argonne National Laboratory founded the Laboratory Computing Resource Center (LCRC) in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. In September 2002 the LCRC deployed a 350-node computing cluster from Linux NetworX to address Laboratory needs for mid-range supercomputing. This cluster, named 'Jazz', achieved over a teraflop of computing power (10^12 floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the 50 fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2006, there were 76 active projects on Jazz involving over 380 scientists and engineers. These projects represent a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to foster growth in the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure providers to offer more scientific data management capabilities, expanding Argonne staff use of national

  17. 78 FR 6087 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-29

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing..., Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U. S. Department of...

  18. 75 FR 9887 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-04

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing... Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U.S. Department...

  19. 76 FR 9765 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-22

    ... Advanced Scientific Computing Advisory Committee AGENCY: Office of Science, Department of Energy. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing..., Office of Advanced Scientific Computing Research, SC-21/Germantown Building, U.S. Department of...

  1. 78 FR 41046 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-09

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION... hereby given that the Advanced Scientific Computing Advisory Committee will be renewed for a two-year... (DOE), on the Advanced Scientific Computing Research Program managed by the Office of...

  2. 75 FR 64720 - DOE/Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-20

    .../Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing... Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U. S. Department...

  3. 76 FR 41234 - Advanced Scientific Computing Advisory Committee Charter Renewal

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-13

    ... Advanced Scientific Computing Advisory Committee Charter Renewal AGENCY: Department of Energy, Office of... Administration, notice is hereby given that the Advanced Scientific Computing Advisory Committee will be renewed... concerning the Advanced Scientific Computing program in response only to charges from the Director of...

  4. 78 FR 56871 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-16

    ... Advanced Scientific Computing Advisory Committee AGENCY: Office of Science, Department of Energy. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing... Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U.S. Department...

  5. 77 FR 45345 - DOE/Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-31

    .../Advanced Scientific Computing Advisory Committee AGENCY: Office of Science, Department of Energy. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing... Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U.S. Department...

  6. 75 FR 43518 - Advanced Scientific Computing Advisory Committee; Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-26

    ... Advanced Scientific Computing Advisory Committee; Meeting AGENCY: Office of Science, DOE. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing Advisory..., Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U. S. Department of...

  7. 77 FR 12823 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-02

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION: Notice of Open Meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing..., Office of Advanced Scientific Computing Research, SC-21/Germantown Building, U.S. Department of...

  8. Argonne Laboratory Computing Resource Center - FY2004 Report.

    SciTech Connect

    Bair, R.

    2005-04-14

    In the spring of 2002, Argonne National Laboratory founded the Laboratory Computing Resource Center, and in April 2003 LCRC began full operations with Argonne's first teraflops computing cluster. The LCRC's driving mission is to enable and promote computational science and engineering across the Laboratory, primarily by operating computing facilities and supporting application use and development. This report describes the scientific activities, computing facilities, and usage in the first eighteen months of LCRC operation. In this short time LCRC has had broad impact on programs across the Laboratory. The LCRC computing facility, Jazz, is available to the entire Laboratory community. In addition, the LCRC staff provides training in high-performance computing and guidance on application usage, code porting, and algorithm development. All Argonne personnel and collaborators are encouraged to take advantage of this computing resource and to provide input into the vision and plans for computing and computational analysis at Argonne. Steering for LCRC comes from the Computational Science Advisory Committee, composed of computing experts from many Laboratory divisions. The CSAC Allocations Committee makes decisions on individual project allocations for Jazz.

  9. Computational Design of Advanced Nuclear Fuels

    SciTech Connect

    Savrasov, Sergey; Kotliar, Gabriel; Haule, Kristjan

    2014-06-03

    The objective of the project was to develop a method for theoretical understanding of nuclear fuel materials whose physical and thermophysical properties can be predicted from first principles using a novel dynamical mean field method for electronic structure calculations. We concentrated our study on uranium, plutonium, their oxides, nitrides, and carbides, as well as some rare earth materials whose 4f electrons provide a simplified framework for understanding the complex behavior of the f electrons. We addressed the issues connected to the electronic structure, lattice instabilities, phonon and magnon dynamics, as well as thermal conductivity. This allowed us to evaluate characteristics of advanced nuclear fuel systems using computer-based simulations and avoid costly experiments.

  10. Advances in computed tomography imaging technology.

    PubMed

    Ginat, Daniel Thomas; Gupta, Rajiv

    2014-07-11

    Computed tomography (CT) is an essential tool in diagnostic imaging for evaluating many clinical conditions. In recent years, there have been several notable advances in CT technology that already have had or are expected to have a significant clinical impact, including extreme multidetector CT, iterative reconstruction algorithms, dual-energy CT, cone-beam CT, portable CT, and phase-contrast CT. These techniques and their clinical applications are reviewed and illustrated in this article. In addition, emerging technologies that address deficiencies in these modalities are discussed.

  11. ATCA for Machines-- Advanced Telecommunications Computing Architecture

    SciTech Connect

    Larsen, R.S.; /SLAC

    2008-04-22

    The Advanced Telecommunications Computing Architecture is a new industry open standard for electronics instrument modules and shelves being evaluated for the International Linear Collider (ILC). It is the first industrial standard designed for High Availability (HA). ILC availability simulations have shown clearly that the capabilities of ATCA are needed in order to achieve acceptable integrated luminosity. The ATCA architecture looks attractive for beam instruments and detector applications as well. This paper provides an overview of ongoing R&D including application of HA principles to power electronics systems.
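
    The case for high availability can be made with back-of-the-envelope arithmetic. The Python sketch below is illustrative and unrelated to the actual ILC availability simulations: it contrasts a long series chain of single-point modules with a duplexed module, using an assumed per-module availability.

        # Series vs. redundant (parallel) availability, assumed numbers.
        a = 0.999                      # availability of one module (assumption)
        n_series = 100                 # single-point modules in series
        chain = a ** n_series          # any one failure takes the chain down
        duplex = 1.0 - (1.0 - a) ** 2  # either of two redundant modules suffices
        print(f"series chain: {chain:.4f}  duplex module: {duplex:.6f}")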

  12. An introduction to NASA's advanced computing program: Integrated computing systems in advanced multichip modules

    NASA Technical Reports Server (NTRS)

    Fang, Wai-Chi; Alkalai, Leon

    1996-01-01

    Recent changes within NASA's space exploration program favor the design, implementation, and operation of low cost, lightweight, small and micro spacecraft with multiple launches per year. In order to meet the future needs of these missions with regard to the use of spacecraft microelectronics, NASA's advanced flight computing (AFC) program is currently considering industrial cooperation and advanced packaging architectures. In relation to this, the AFC program is reviewed, considering the design and implementation of NASA's AFC multichip module.

  13. Making the Computer Laboratory Accessible to Minorities.

    ERIC Educational Resources Information Center

    Martin, Robert

    Research literature shows that current uses of computers in public school education are not reaching minority populations, and that females and ethnic minority students are less likely to have access to a computer at home. In addition, a study of computer use in the Learning Skills Center (LSC) at the University of Alabama supported the research…

  14. Technological advances in the hemostasis laboratory.

    PubMed

    Lippi, Giuseppe; Plebani, Mario; Favaloro, Emmanuel J

    2014-03-01

    Automation is conventionally defined as the use of machines, control systems, and information technologies to optimize productivity. Although automation is now commonplace in several areas of diagnostic testing, especially in clinical chemistry and immunochemistry, the concept of extending this process to hemostasis testing has only recently been advanced. The leading drawbacks are still the almost unique biological matrix, since citrated plasma can be used only for clotting assays and a few other notable exceptions, and the highly specific pretreatment of samples, which differs markedly from that of other test systems. Despite these important limitations, a certain degree of automation is also now embracing hemostasis testing. The more relevant developments include the growing integration of routine hemostasis analyzers with track line systems and workcells, the development of specific instrumentation tools to enhance reliability of testing (i.e., signal detection with different technologies to increase test panels, plasma indices for preanalytical check of interfering substances, failure pattern sensors for identifying insufficient volume, clots or bubbles, cap-piercing for enhancing operator safety, automatic reflex testing, automatic redilution of samples, and laser barcode readers), preanalytical features (e.g., positive identification, automatic systems for tube(s) labeling, transillumination devices), and postphlebotomy tools (pneumatic tube systems for reducing turnaround time, sample transport boxes for ensuring stability of specimens, monitoring systems for identifying unsuitable conditions of transport). Regardless of these important innovations, coagulation/hemostasis testing still requires specific technical and clinical expertise, not only in terms of measurement procedures but also for interpreting and then appropriately utilizing the derived information. Thus, additional and special caution has to be used when designing projects of

  15. A Software Laboratory Environment for Computer-Based Problem Solving.

    ERIC Educational Resources Information Center

    Kurtz, Barry L.; O'Neal, Micheal B.

    This paper describes a National Science Foundation-sponsored project at Louisiana Technological University to develop computer-based laboratories for "hands-on" introductions to major topics of computer science. The underlying strategy is to develop structured laboratory environments that present abstract concepts through the use of computer…

  16. Determination of Absolute Zero Using a Computer-Based Laboratory

    ERIC Educational Resources Information Center

    Amrani, D.

    2007-01-01

    We present a simple computer-based laboratory experiment for evaluating absolute zero in degrees Celsius, which can be performed in college and undergraduate physical sciences laboratory courses. With a computer, absolute zero apparatus can help demonstrators or students to observe the relationship between temperature and pressure and use…
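
    The underlying analysis is a straight-line extrapolation. The Python sketch below uses invented pressure readings for a fixed-volume gas sample; since pressure is linear in temperature, the temperature at which the fitted line reaches zero pressure estimates absolute zero in degrees Celsius.

        # Extrapolating a P-T line to zero pressure (invented data).
        import numpy as np

        temp_c = np.array([0.0, 25.0, 50.0, 75.0, 100.0])      # deg Celsius
        press = np.array([101.3, 110.6, 119.9, 129.1, 138.4])  # kPa

        slope, intercept = np.polyfit(temp_c, press, 1)
        t_zero = -intercept / slope                             # where P = 0
        print(f"estimated absolute zero: {t_zero:.1f} deg C")   # about -273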

  17. Computer systems for laboratory networks and high-performance NMR.

    PubMed

    Levy, G C; Begemann, J H

    1985-08-01

    Modern computer technology is significantly enhancing the associated tasks of spectroscopic data acquisition and data reduction and analysis. Distributed data processing techniques, particularly laboratory computer networking, are rapidly changing the scientist's ability to optimize results from complex experiments. Optimization of nuclear magnetic resonance spectroscopy (NMR) and magnetic resonance imaging (MRI) experimental results requires use of powerful, large-memory (virtual memory preferred) computers with integrated (and supported) high-speed links to magnetic resonance instrumentation. Laboratory architectures with larger computers, in order to extend data reduction capabilities, have facilitated the transition to NMR laboratory computer networking. Examples of a polymer microstructure analysis and in vivo 31P metabolic analysis are given. This paper also discusses laboratory data processing trends anticipated over the next 5-10 years. Full networking of NMR laboratories is just now becoming a reality. PMID:3840171

  18. The Advanced Controls Program at Oak Ridge National Laboratory

    SciTech Connect

    Knee, H.E.; White, J.D.

    1990-01-01

    The Oak Ridge National Laboratory (ORNL), under sponsorship of the US Department of Energy (DOE), is conducting research that will lead to advanced, automated control of new liquid-metal-reactor (LMR) nuclear power plants. Although this program of research (entitled the "Advanced Controls Program") is focused on LMR technology, it will be capable of providing control design, test, and qualification capability for other advanced reactor designs (e.g., the advanced light water reactor (ALWR) and high temperature gas-cooled reactor (HTGR) designs), while also benefiting existing nuclear plants. The Program will also have applicability to complex, non-nuclear process control environments (e.g., petrochemical, aerospace, etc.). The Advanced Controls Program will support capabilities throughout the entire plant design life cycle, i.e., from the initial interactive first-principle dynamic model development for the process, systems, components, and instruments through advanced control room qualification. The current program involves five principal areas of research activities: (1) demonstrations of advanced control system designs, (2) development of an advanced controls design environment, (3) development of advanced control strategies, (4) research and development (R&D) in human-system integration for advanced control system designs, and (5) testing and validation of advanced control system designs. Discussion of the research in these five areas forms the basis of this paper. Also included is a description of the research directions of the program. 8 refs.

  19. National facility for advanced computational science: A sustainable path to scientific discovery

    SciTech Connect

    Simon, Horst; Kramer, William; Saphir, William; Shalf, John; Bailey, David; Oliker, Leonid; Banda, Michael; McCurdy, C. William; Hules, John; Canning, Andrew; Day, Marc; Colella, Philip; Serafini, David; Wehner, Michael; Nugent, Peter

    2004-04-02

    Lawrence Berkeley National Laboratory (Berkeley Lab) proposes to create a National Facility for Advanced Computational Science (NFACS) and to establish a new partnership between the American computer industry and a national consortium of laboratories, universities, and computing facilities. NFACS will provide leadership-class scientific computing capability to scientists and engineers nationwide, independent of their institutional affiliation or source of funding. This partnership will bring into existence a new class of computational capability in the United States that is optimal for science and will create a sustainable path towards petaflops performance.

  1. Mentoring for retention and advancement in the multigenerational clinical laboratory.

    PubMed

    Laudicina, R J

    2001-01-01

    Retention of recent graduates and other laboratory practitioners in the workplace will play a key role in addressing current and projected shortages of clinical laboratory scientists (CLS) and technicians (CLT). In addition, with overrepresentation of the aging Baby Boomer generation in laboratory supervisory and management positions, it is crucial not only to retain younger practitioners, but to prepare them for assuming these important functions in the future. Mentoring, a practice commonly employed in other professions, is widely considered to be useful in employee retention and career advancement. Mentoring has probably been used in the clinical laboratory profession, but has not been well documented. In the clinical laboratory environment, potential mentors are in the Veteran and Baby Boomer generations, and new practitioners who could benefit from mentoring are in Generation X. Generational differences among these groups may present challenges to the use of mentoring. This article will attempt to provide a better understanding of generational differences and show how mentoring can be applied in the setting of the clinical laboratory in order to increase retention and promote career advancement of younger practitioners. A panel of five laboratory managers provided examples of mentoring strategies. Definitions, benefits, and examples of mentoring are addressed in the accompanying article, "Passing the Torch: Mentoring the Next Generation of Laboratory Professionals". PMID:15633495

  2. Advanced Propulsion Concepts at the Jet Propulsion Laboratory

    NASA Technical Reports Server (NTRS)

    Brophy, J. R.

    1997-01-01

    Current interest in advanced propulsion within NASA and research activities in advanced propulsion concepts at the Jet Propulsion Laboratory are reviewed. The concepts, which include high power plasma thrusters such as lithium-fueled Lorentz-Force-Accelerators, MEMS-scale propulsion systems, in-situ propellant utilization techniques, fusion propulsion systems and methods of using antimatter, offer the potential for either significantly enhancing space transportation capability as compared with that of traditional chemical propulsion, or enabling ambitious new missions.

  3. Exploring Electronics Laboratory Experiments Using Computer Software

    ERIC Educational Resources Information Center

    Gandole, Yogendra Babarao

    2011-01-01

    The roles of teachers and students are changing, and there are undoubtedly ways of learning not yet discovered. However, the computer and software technology may provide a significant role to identify the problems, to present solutions and life-long learning. It is clear that the computer based educational technology has reached the point where…

  4. Teaching Advanced Concepts in Computer Networks: VNUML-UM Virtualization Tool

    ERIC Educational Resources Information Center

    Ruiz-Martinez, A.; Pereniguez-Garcia, F.; Marin-Lopez, R.; Ruiz-Martinez, P. M.; Skarmeta-Gomez, A. F.

    2013-01-01

    In the teaching of computer networks the main problem that arises is the high price and limited number of network devices the students can work with in the laboratories. Nowadays, with virtualization we can overcome this limitation. In this paper, we present a methodology that allows students to learn advanced computer network concepts through…

  5. Advanced Computational Thermal Studies and their Assessment for Supercritical-Pressure Reactors (SCRs)

    SciTech Connect

    D. M. McEligot; J. Y. Yoo; J. S. Lee; S. T. Ro; E. Lurien; S. O. Park; R. H. Pletcher; B. L. Smith; P. Vukoslavcevic; J. M. Wallace

    2009-04-01

    The goal of this laboratory / university collaboration of coupled computational and experimental studies is the improvement of predictive methods for supercritical-pressure reactors. The general objective is to develop the supporting knowledge of advanced computational techniques needed for technology development of the concepts and their safety systems.

  6. The advanced computational testing and simulation toolkit (ACTS)

    SciTech Connect

    Drummond, L.A.; Marques, O.

    2002-05-21

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Notably, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

  7. New virtual laboratories presenting advanced motion control concepts

    NASA Astrophysics Data System (ADS)

    Goubej, Martin; Krejčí, Alois; Reitinger, Jan

    2015-11-01

    The paper deals with the development of a software framework for rapid generation of remote virtual laboratories. A client-server architecture is chosen in order to employ a real-time simulation core running on a dedicated server. An ordinary web browser is used as the final renderer to achieve a hardware-independent solution that can be run on different target platforms including laptops, tablets or mobile phones. The provided toolchain allows automatic generation of the virtual laboratory source code from a configuration file created in the open-source Inkscape graphic editor. Three virtual laboratories presenting advanced motion control algorithms have been developed, showing the applicability of the proposed approach.

  8. Laboratories for a Liberal Education Computer Science Course.

    ERIC Educational Resources Information Center

    Kiper, James D.; Bishop-Clark, Cathy

    Computer science and other computer related fields are faced with the high velocity of change in technology. Far more important than the knowledge of a particular software package is the liberal education skills that are learned in the process. This paper reviews the laboratory component of a new computer science course offered at Miami University…

  9. Real-Time, Sensor-Based Computing in the Laboratory.

    ERIC Educational Resources Information Center

    Badmus, O. O.; And Others

    1996-01-01

    Demonstrates the importance of Real-Time, Sensor-Based (RTSB) computing and how it can be easily and effectively integrated into university student laboratories. Describes the experimental processes, the process instrumentation and process-computer interface, the computer and communications systems, and typical software. Provides much technical…

  10. Argonne's Laboratory Computing Resource Center : 2005 annual report.

    SciTech Connect

    Bair, R. B.; Coghlan, S. C; Kaushik, D. K.; Riley, K. R.; Valdes, J. V.; Pieper, G. P.

    2007-06-30

    Argonne National Laboratory founded the Laboratory Computing Resource Center in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. The first goal of the LCRC was to deploy a mid-range supercomputing facility to support the unmet computational needs of the Laboratory. To this end, in September 2002, the Laboratory purchased a 350-node computing cluster from Linux NetworX. This cluster, named 'Jazz', achieved over a teraflop of computing power (10^12 floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the fifty fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2005, there were 62 active projects on Jazz involving over 320 scientists and engineers. These projects represent a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to improve the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure providers to develop

  11. Advanced Scientific Computing Research Network Requirements

    SciTech Connect

    Bacon, Charles; Bell, Greg; Canon, Shane; Dart, Eli; Dattoria, Vince; Goodwin, Dave; Lee, Jason; Hicks, Susan; Holohan, Ed; Klasky, Scott; Lauzon, Carolyn; Rogers, Jim; Shipman, Galen; Skinner, David; Tierney, Brian

    2013-03-08

    The Energy Sciences Network (ESnet) is the primary provider of network connectivity for the U.S. Department of Energy (DOE) Office of Science (SC), the single largest supporter of basic research in the physical sciences in the United States. In support of SC programs, ESnet regularly updates and refreshes its understanding of the networking requirements of the instruments, facilities, scientists, and science programs that it serves. This focus has helped ESnet to be a highly successful enabler of scientific discovery for over 25 years. In October 2012, ESnet and the Office of Advanced Scientific Computing Research (ASCR) of the DOE SC organized a review to characterize the networking requirements of the programs funded by the ASCR program office. The requirements identified at the review are summarized in the Findings section, and are described in more detail in the body of the report.

  12. Analog Computer Laboratory with Biological Examples.

    ERIC Educational Resources Information Center

    Strebel, Donald E.

    1979-01-01

    The use of biological examples in teaching applications of the analog computer is discussed and several examples from mathematical ecology, enzyme kinetics, and tracer dynamics are described. (Author/GA)
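
    Such models map naturally onto integrators, whether analog or digital. As an illustration, the Python sketch below integrates the classic Lotka-Volterra predator-prey equations, a staple of mathematical ecology; the article's specific examples are not given here, and these parameters are invented.

        # Lotka-Volterra predator-prey model, forward-Euler integration.
        a, b = 1.0, 0.1        # prey growth and predation rates (assumed)
        c, d = 1.5, 0.075      # predator death and feeding rates (assumed)
        x, y = 10.0, 5.0       # initial prey and predator populations
        dt = 1.0e-3

        for _ in range(20000):             # 20 time units
            dx = a * x - b * x * y         # prey: growth minus predation
            dy = -c * y + d * x * y        # predator: starvation plus feeding
            x, y = x + dx * dt, y + dy * dt
        print(f"prey = {x:.2f}, predators = {y:.2f}")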

  13. Polybrominated Diphenyl Ethers in Dryer Lint: An Advanced Analysis Laboratory

    ERIC Educational Resources Information Center

    Thompson, Robert Q.

    2008-01-01

    An advanced analytical chemistry laboratory experiment is described that involves environmental analysis and gas chromatography-mass spectrometry. Students analyze lint from clothes dryers for traces of flame retardant chemicals, polybrominated diphenylethers (PBDEs), compounds receiving much attention recently. In a typical experiment, ng/g…

  14. A Reverse Osmosis System for an Advanced Separation Process Laboratory.

    ERIC Educational Resources Information Center

    Slater, C. S.; Paccione, J. D.

    1987-01-01

    Focuses on the development of a pilot unit for use in an advanced separations process laboratory in an effort to develop experiments on such processes as reverse osmosis, ultrafiltration, adsorption, and chromatography. Discusses reverse osmosis principles, the experimental system design, and some experimental studies. (TW)

  15. A Simultaneous Analysis Problem for Advanced General Chemistry Laboratories.

    ERIC Educational Resources Information Center

    Leary, J. J.; Gallaher, T. N.

    1983-01-01

    Oxidation of magnesium metal in air has been used as an introductory experiment for determining the formula of a compound. The experiment described employs essentially the same laboratory procedure but is significantly more advanced in terms of information sought. Procedures and sample calculations/results are provided. (JN)
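
    A sketch of the simultaneous-analysis arithmetic may be helpful. Assuming, as in common versions of this experiment, that burning magnesium in air yields a mixture of MgO and Mg3N2, the product composition follows from two mass balances in two unknowns; the masses below are invented.

        # Solve for moles of MgO (x) and Mg3N2 (y) from two mass balances.
        import numpy as np

        M_MG, M_O, M_N = 24.305, 15.999, 14.007   # molar masses (g/mol)
        m_mg, m_prod = 0.500, 0.816               # g Mg burned, g product (invented)

        # Mg balance:      M_MG*x + 3*M_MG*y               = m_mg
        # Product balance: (M_MG+M_O)*x + (3*M_MG+2*M_N)*y = m_prod
        A = np.array([[M_MG, 3.0 * M_MG],
                      [M_MG + M_O, 3.0 * M_MG + 2.0 * M_N]])
        x, y = np.linalg.solve(A, np.array([m_mg, m_prod]))
        print(f"MgO: {x*1e3:.2f} mmol, Mg3N2: {y*1e3:.2f} mmol")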

  16. Results of Laboratory Testing of Advanced Power Strips

    SciTech Connect

    Earle, L.; Sparn, B.

    2012-08-01

    Presented at the ACEEE Summer Study on Energy Efficiency in Buildings on August 12-17, 2012, this presentation reports on laboratory tests of 20 currently available advanced power strip products, which reduce wasteful electricity use of miscellaneous electric loads in buildings.

  17. Computer-Assisted Management of the Hospital Clinical Laboratory

    PubMed Central

    Steinbach, Glen L.; Miller, Robert E.

    1980-01-01

    Computer systems in hospital clinical laboratories historically have been used largely to manage medically-oriented patient data, particularly laboratory test requests and results. At The Johns Hopkins Hospital, effort has been devoted to the development of computer-assisted laboratory management applications in addition to routine medical data processing. This paper describes these development efforts in four areas: Workload Measurement and Reporting, Measurement of Personnel Productivity, Control of Expenses, and Laboratory Performance Measurement. Sample reports from each management subsystem are included, along with a discussion of the purpose and benefits of each application.

  18. Development and Evaluation of Computer-Based Laboratory Practical Learning Tool

    ERIC Educational Resources Information Center

    Gandole, Y. B.

    2006-01-01

    Effective evaluation of educational software is a key issue for the successful introduction of advanced tools in the curriculum. This paper details the development and evaluation of a tool for computer-assisted learning in science laboratory courses. The process was based on the generic instructional system design model. Various categories of educational…

  19. 76 FR 45786 - Advanced Scientific Computing Advisory Committee; Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-01

    ... Advanced Scientific Computing Advisory Committee; Meeting AGENCY: Office of Science, Department of Energy... Computing Advisory Committee (ASCAC). Federal Advisory Committee Act (Pub. L. 92-463, 86 Stat. 770) requires... INFORMATION CONTACT: Melea Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown...

  20. 75 FR 57742 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-22

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION... Scientific Computing Advisory Committee (ASCAC). Federal Advisory Committee Act (Pub. L. 92-463, 86 Stat. 770...: Melea Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building;...

  1. 78 FR 64931 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-30

    ... Advanced Scientific Computing Advisory Committee AGENCY: Office of Science, Department of Energy. ACTION... Computing Advisory Committee (ASCAC). This meeting replaces the cancelled ASCAC meeting that was to be held... Advanced Scientific Computing Research; SC-21/Germantown Building; U. S. Department of Energy;...

  2. 76 FR 64330 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-18

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION... Computing Advisory Committee (ASCAC). The Federal Advisory Committee Act (Pub. L. 92-463, 86 Stat. 770..., Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U. S. Department of...

  3. 78 FR 50404 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-19

    ... Advanced Scientific Computing Advisory Committee AGENCY: Office of Science, Department of Energy. ] ACTION... Scientific Computing Advisory Committee (ASCAC). The Federal Advisory Committee Act (Pub. L. 92-463, 86 Stat... INFORMATION CONTACT: Melea Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown...

  4. Making Advanced Computer Science Topics More Accessible through Interactive Technologies

    ERIC Educational Resources Information Center

    Shao, Kun; Maher, Peter

    2012-01-01

    Purpose: Teaching advanced technical concepts in a computer science program to students of different technical backgrounds presents many challenges. The purpose of this paper is to present a detailed experimental pedagogy in teaching advanced computer science topics, such as computer networking, telecommunications and data structures using…

  5. Textual Analysis with Computers: Tests of Bell Laboratories' Computer Software.

    ERIC Educational Resources Information Center

    Kiefer, Kathleen E.; Smith, Charles R.

    1983-01-01

    Concludes that textual analysis with computers intrigues college writers and speeds learning of editing skills by offering immediate, reliable, and consistent attention to surface features of their prose. (HOD)

  6. Argonne's Laboratory Computing Resource Center 2009 annual report.

    SciTech Connect

    Bair, R. B.

    2011-05-13

    Now in its seventh year of operation, the Laboratory Computing Resource Center (LCRC) continues to be an integral component of science and engineering research at Argonne, supporting a diverse portfolio of projects for the U.S. Department of Energy and other sponsors. The LCRC's ongoing mission is to enable and promote computational science and engineering across the Laboratory, primarily by operating computing facilities and supporting high-performance computing application use and development. This report describes scientific activities carried out with LCRC resources in 2009 and the broad impact on programs across the Laboratory. The LCRC computing facility, Jazz, is available to the entire Laboratory community. In addition, the LCRC staff provides training in high-performance computing and guidance on application usage, code porting, and algorithm development. All Argonne personnel and collaborators are encouraged to take advantage of this computing resource and to provide input into the vision and plans for computing and computational analysis at Argonne. The LCRC Allocations Committee makes decisions on individual project allocations for Jazz. Committee members are appointed by the Associate Laboratory Directors and span a range of computational disciplines. The 350-node LCRC cluster, Jazz, began production service in April 2003 and has been a research workhorse ever since. Hosting a wealth of software tools and applications and achieving high availability year after year, Jazz has enabled researchers to meet project milestones and achieve breakthroughs. Over the years, many projects have achieved results that would have been unobtainable without such a computing resource. In fiscal year 2009, there were 49 active projects representing a wide cross-section of Laboratory research and almost all research divisions.

  7. Managing personal desktop computing at Argonne National Laboratory

    SciTech Connect

    Coultis, D.W.; Gallo, M.A.

    1985-01-01

    The increasing number of personal desktop computer users, coupled with rapidly changing personal desktop computer technology, has increased the demand for computing services at many computing centers. However, for many computing centers, budgets have not increased as rapidly as the demand for services. These centers can use several methods to help balance increasing demands and fixed resources. Providing personal desktop computing recommendations and encouraging users to follow them are effective means of meeting demands for computing services, even while operating within a constant or slowly growing budget. Providing personal desktop computing recommendations reduces evaluation costs, shifts the burden of evaluating personal desktop computing products, and encourages expertise. Making the acquisition of personal desktop computer hardware and software easier benefits the user as well as the computing center. Users receive evaluated products for which they can obtain assistance from the computing center, while computing centers encourage the use of recommended products. Computing centers can also encourage and nurture personal desktop computing users groups. These users groups identify areas of need, participate in the evaluation process, and help reduce the demand for computing center resources. This paper discusses issues associated with managing personal desktop computing, proposes guidelines for instituting a management program, and discusses our experiences with managing personal desktop computing at Argonne National Laboratory.

  8. Computational Electromagnetics (CEM) Laboratory: Simulation Planning Guide

    NASA Technical Reports Server (NTRS)

    Khayat, Michael A.

    2011-01-01

    The simulation process, milestones and inputs are unknowns to first-time users of the CEM Laboratory. The Simulation Planning Guide aids in establishing expectations for both NASA and non-NASA facility customers. The potential audience for this guide includes both internal and commercial spaceflight hardware/software developers. It is intended to assist their engineering personnel in simulation planning and execution. Material covered includes a roadmap of the simulation process, roles and responsibilities of facility and user, major milestones, facility capabilities, and inputs required by the facility. Samples of deliverables, facility interfaces, and inputs necessary to define scope, cost, and schedule are included as an appendix to the guide.

  9. A Choice of Terminals: Spatial Patterning in Computer Laboratories

    ERIC Educational Resources Information Center

    Spennemann, Dirk; Cornforth, David; Atkinson, John

    2007-01-01

    Purpose: This paper seeks to examine the spatial patterns of student use of machines in each laboratory to determine whether there are underlying commonalities. Design/methodology/approach: The research was carried out by assessing user behaviour in 16 computer laboratories at a regional university in Australia. Findings: The study found that computers…

  10. A Low Cost Microcomputer Laboratory for Investigating Computer Architecture.

    ERIC Educational Resources Information Center

    Mitchell, Eugene E., Ed.

    1980-01-01

    Described is a microcomputer laboratory at the United States Military Academy at West Point, New York, which provides easy access to non-volatile memory and a single input/output file system for 16 microcomputer laboratory positions. A microcomputer network that has a centralized data base is implemented using the concepts of computer network…

  11. Advanced CNC and CAM Series. Educational Resources for the Machine Tool Industry. Course Syllabi, Instructor's Handbook [and] Student Laboratory Manual.

    ERIC Educational Resources Information Center

    Texas State Technical Coll. System, Waco.

    This package consists of course syllabi, an instructor's handbook, and a student laboratory manual for a 1-year vocational training program to prepare students for entry-level positions as advanced computer numerical control (CNC) and computer-assisted manufacturing (CAM) technicians. The program was developed through a modification of the DACUM…

  12. Developing an Advanced Environment for Collaborative Computing

    NASA Technical Reports Server (NTRS)

    Becerra-Fernandez, Irma; Stewart, Helen; DelAlto, Martha; DelAlto, Martha; Knight, Chris

    1999-01-01

    Knowledge management in general tries to organize and make available important know-how, whenever and wherever it is needed. Today, organizations rely on decision-makers to produce "mission critical" decisions that are based on inputs from multiple domains. The ideal decision-maker has a profound understanding of specific domains that influence the decision-making process, coupled with the experience that allows them to act quickly and decisively on the information. In addition, learning companies benefit by not repeating costly mistakes and by reducing time-to-market in Research & Development projects. Group decision-making tools can help companies make better decisions by capturing the knowledge from groups of experts. Furthermore, companies that capture their customers' preferences can improve their customer service, which translates to larger profits. Therefore, collaborative computing provides a common communication space, improves sharing of knowledge, provides a mechanism for real-time feedback on the tasks being performed, helps to optimize processes, and results in a centralized knowledge warehouse. This paper presents the research directions of a project which seeks to augment an advanced collaborative web-based environment called Postdoc with workflow capabilities. Postdoc is a "government-off-the-shelf" document management software package developed at NASA-Ames Research Center (ARC).

  13. The Computer Connection: Four Approaches to Microcomputer Laboratory Interfacing.

    ERIC Educational Resources Information Center

    Graef, Jean L.

    1983-01-01

    Four ways in which microcomputers can be turned into laboratory instruments are discussed. These include adding an analog/digital (A/D) converter on a printed circuit board, adding an external A/D converter using the computer's serial port, attaching transducers to the game paddle ports, or connecting an instrument to the computer. (JN)

  14. Managing Laboratory Data Using Cloud Computing as an Organizational Tool

    ERIC Educational Resources Information Center

    Bennett, Jacqueline; Pence, Harry E.

    2011-01-01

    One of the most significant difficulties encountered when directing undergraduate research and developing new laboratory experiments is how to efficiently manage the data generated by a number of students. Cloud computing, where both software and computer files reside online, offers a solution to this data-management problem and allows researchers…

  15. THE LEARNING RESEARCH AND DEVELOPMENT CENTER'S COMPUTER ASSISTED LABORATORY.

    ERIC Educational Resources Information Center

    RAGSDALE, RONALD G.

    THIS PAPER DESCRIBES THE OPERATION AND PLANNED APPLICATIONS OF A COMPUTER ASSISTED LABORATORY FOR SOCIAL SCIENCE RESEARCH. THE LAB CENTERS AROUND AN 8K PDP-7 COMPUTER AND ITS SPECIAL PERIPHERAL EQUIPMENT. SPECIAL DEVICES INCLUDE RANDOM ACCESS AUDIO AND VIDEO, GRAPHICAL INPUT, AND TOUCH-SENSITIVE AND BLOCK-MANIPULATION INPUTS. THE SYSTEM MAY BE…

  16. Computers in a Teaching Laboratory: Just Another Piece of Apparatus.

    ERIC Educational Resources Information Center

    Harrison, David; Pitre, John M.

    1988-01-01

    Describes computer use in the undergraduate physics laboratory at the University of Toronto. Topics discussed include user interfaces; local area networking; data analysis and acquisition; other computer applications, including a programmable calculator and word processing; and an example of an experiment involving gravity. (LRW)

  17. Performance evaluation of the Oak Ridge National Laboratory's advanced servomanipulator

    SciTech Connect

    Draper, J.V.; Schrock, S.L.; Handel, S.J.

    1988-01-01

    The Consolidated Fuel Reprocessing Program (CFRP) at the Oak Ridge National Laboratory (ORNL) is developing technology for future nuclear fuel reprocessing facilities. This responsibility includes developing advanced telerobotic systems for repair and maintenance of such facilities. In response to a requirement for a highly reliable, remotely maintainable manipulator system, CFRP designed and built the advanced servomanipulator (ASM). This paper reports results of a recent comparison of ASM's performance to that of another highly dexterous manipulator, the Sargeant Industries/Central Research Laboratory's (CRL's) model M-2. Operators using ASM were able to complete tasks in about the same amount of time as operators using the CRL M-2. 13 refs., 5 figs., 2 tabs.

  18. Laboratory Demonstrations for PDE and Metals Combustion at NASA MSFC's Advanced Propulsion Laboratory

    NASA Technical Reports Server (NTRS)

    2000-01-01

    This report provides the status of activities under order no. H-30549 for the period December 1 through December 31, 1999. It details contract activities in coordinating the planned conduct of experiments at the MSFC Advanced Propulsion Laboratory in pulse detonation MHD power production and metals combustion.

  19. Interfacing laboratory instruments to multiuser, virtual memory computers

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R.; Stang, David B.; Roth, Don J.

    1989-01-01

    Incentives, problems and solutions associated with interfacing laboratory equipment with multiuser, virtual memory computers are presented. The major difficulty concerns how to utilize these computers effectively in a medium sized research group. This entails optimization of hardware interconnections and software to facilitate multiple instrument control, data acquisition and processing. The architecture of the system that was devised, and associated programming and subroutines are described. An example program involving computer controlled hardware for ultrasonic scan imaging is provided to illustrate the operational features.
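
    The architecture described predates modern scripting environments, but its core pattern, several instruments feeding a shared acquisition buffer on a multiuser machine, survives unchanged. A minimal illustrative sketch in Python, with simulated readings standing in for real A/D hardware (all names here are hypothetical, not the authors' software):

      import threading
      import queue
      import random
      import time

      def acquire(instrument_id, out_queue, n_samples=5):
          # Poll one simulated instrument; a real system would read an A/D converter here.
          for _ in range(n_samples):
              out_queue.put((instrument_id, time.time(), random.gauss(0.0, 1.0)))
              time.sleep(0.01)

      data = queue.Queue()  # shared buffer decouples acquisition from processing
      threads = [threading.Thread(target=acquire, args=(i, data)) for i in range(3)]
      for t in threads:
          t.start()
      for t in threads:
          t.join()
      while not data.empty():
          inst, stamp, value = data.get()
          print(f"instrument {inst} @ {stamp:.3f}: {value:+.3f}")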

  20. Advanced flight computers for planetary exploration

    NASA Technical Reports Server (NTRS)

    Stephenson, R. Rhoads

    1988-01-01

    Research concerning flight computers for use on interplanetary probes is reviewed. The history of these computers from the Viking mission to the present is outlined. The differences between ground commercial computers and computers for planetary exploration are listed. The development of a computer for the Mariner Mark II comet rendezvous asteroid flyby mission is described. Various aspects of recently developed computer systems are examined, including the Max real-time embedded computer, a hypercube distributed supercomputer, a SAR data processor, a processor for the High Resolution IR Imaging Spectrometer, and a robotic vision multiresolution pyramid machine for processing images obtained by a Mars Rover.

  1. Advanced Manufacturing Processes Laboratory Building 878 hazards assessment document

    SciTech Connect

    Wood, C.; Thornton, W.; Swihart, A.; Gilman, T.

    1994-07-01

    The purpose of the hazards assessment process is to document the impact of the release of hazards at the Advanced Manufacturing Processes Laboratory (AMPL) that are significant enough to warrant consideration in Sandia National Laboratories' operational emergency management program. This hazards assessment is prepared in accordance with the Department of Energy Order 5500.3A requirement that facility-specific hazards assessments be prepared, maintained, and used for emergency planning purposes. This hazards assessment provides an analysis of the potential airborne release of chemicals associated with the operations and processes at the AMPL. This research and development laboratory develops advanced manufacturing technologies, practices, and unique equipment and provides the fabrication of prototype hardware to meet the needs of Sandia National Laboratories, Albuquerque, New Mexico (SNL/NM). The focus of the hazards assessment is the airborne release of materials because this requires the most rapid, coordinated emergency response on the part of the AMPL, SNL/NM, collocated facilities, and surrounding jurisdictions to protect workers, the public, and the environment.

  2. ADVANCED COMPUTATIONAL METHODS IN DOSE MODELING: APPLICATION OF COMPUTATIONAL BIOPHYSICAL TRANSPORT, COMPUTATIONAL CHEMISTRY, AND COMPUTATIONAL BIOLOGY

    EPA Science Inventory

    Computational toxicology (CompTox) leverages the significant gains in computing power and computational techniques (e.g., numerical approaches, structure-activity relationships, bioinformatics) realized over the last few years, thereby reducing costs and increasing efficiency i...

  3. Computing Advances in the Teaching of Chemistry.

    ERIC Educational Resources Information Center

    Baskett, W. P.; Matthews, G. P.

    1984-01-01

    Discusses three trends in computer-oriented chemistry instruction: (1) availability of interfaces to integrate computers with experiments; (2) impact of the development of higher resolution graphics and greater memory capacity; and (3) role of videodisc technology on computer assisted instruction. Includes program listings for auto-titration and…

  4. Two Crystallographic Laboratory and Computational Exercises for Undergraduates.

    ERIC Educational Resources Information Center

    Lessinger, Leslie

    1988-01-01

    Describes two introductory exercises designed to teach the fundamental ideas and methods of crystallography, and to convey some important features of inorganic and organic crystal structures to students in an advanced laboratory course. Exercises include "The Crystal Structure of NiO" and "The Crystal Structure of Beta-Fumaric Acid." (CW)

  5. Description of a digital computer simulation of an Annular Momentum Control Device (AMCD) laboratory test model

    NASA Technical Reports Server (NTRS)

    Woolley, C. T.; Groom, N. J.

    1981-01-01

    A description of a digital computer simulation of an Annular Momentum Control Device (AMCD) laboratory model is presented. The AMCD is a momentum exchange device which is under development as an advanced control effector for spacecraft attitude control systems. The digital computer simulation of this device incorporates the following models: six degree of freedom rigid body dynamics; rim warp; controller dynamics; nonlinear distributed element axial bearings; as well as power driver and power supply current limits. An annotated FORTRAN IV source code listing of the computer program is included.
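
    The rigid-body core of such a simulation can be illustrated compactly. The sketch below integrates the torque-free Euler equations for body-frame angular velocity with a fourth-order Runge-Kutta step; it is a minimal stand-in written for illustration only (the actual AMCD simulation adds rim warp, controller dynamics, nonlinear bearing models, and driver limits, and was written in FORTRAN IV). The inertia and initial-spin values are hypothetical:

      import numpy as np

      I = np.array([1.0, 2.0, 3.0])  # hypothetical principal moments of inertia

      def omega_dot(w):
          # Torque-free Euler equations for the body-frame angular velocity.
          return np.array([
              (I[1] - I[2]) * w[1] * w[2] / I[0],
              (I[2] - I[0]) * w[2] * w[0] / I[1],
              (I[0] - I[1]) * w[0] * w[1] / I[2],
          ])

      def rk4_step(w, dt):
          k1 = omega_dot(w)
          k2 = omega_dot(w + 0.5 * dt * k1)
          k3 = omega_dot(w + 0.5 * dt * k2)
          k4 = omega_dot(w + dt * k3)
          return w + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

      w = np.array([0.1, 2.0, 0.1])  # spin near the unstable intermediate axis
      for _ in range(1000):
          w = rk4_step(w, dt=0.01)
      # kinetic energy should be conserved for torque-free motion
      print("angular velocity:", w, " energy:", 0.5 * np.sum(I * w**2))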

  6. COMPUTATIONAL TOXICOLOGY ADVANCES: EMERGING CAPABILITIES FOR DATA EXPLORATION AND SAR MODEL DEVELOPMENT

    EPA Science Inventory

    Computational Toxicology Advances: Emerging capabilities for data exploration and SAR model development
    Ann M. Richard and ClarLynda R. Williams, National Health & Environmental Effects Research Laboratory, US EPA, Research Triangle Park, NC, USA; email: richard.ann@epa.gov

  7. Some Hail 'Computational Science' as Biggest Advance Since Newton, Galileo.

    ERIC Educational Resources Information Center

    Turner, Judith Axler

    1987-01-01

    Computational science is defined as science done on a computer. A computer can serve as a laboratory for researchers who cannot experiment with their subjects, and as a calculator for those who otherwise might need centuries to solve some problems mathematically. The National Science Foundation's support of supercomputers is discussed. (MLW)

  8. Optical design and characterization of an advanced computational imaging system

    NASA Astrophysics Data System (ADS)

    Shepard, R. Hamilton; Fernandez-Cull, Christy; Raskar, Ramesh; Shi, Boxin; Barsi, Christopher; Zhao, Hang

    2014-09-01

    We describe an advanced computational imaging system with an optical architecture that enables simultaneous and dynamic pupil-plane and image-plane coding accommodating several task-specific applications. We assess the optical requirement trades associated with custom and commercial-off-the-shelf (COTS) optics and converge on the development of two low-cost and robust COTS testbeds. The first is a coded-aperture programmable pixel imager employing a digital micromirror device (DMD) for image plane per-pixel oversampling and spatial super-resolution experiments. The second is a simultaneous pupil-encoded and time-encoded imager employing a DMD for pupil apodization or a deformable mirror for wavefront coding experiments. These two testbeds are built to leverage two MIT Lincoln Laboratory focal plane arrays - an orthogonal transfer CCD with non-uniform pixel sampling and on-chip dithering and a digital readout integrated circuit (DROIC) with advanced on-chip per-pixel processing capabilities. This paper discusses the derivation of optical component requirements, optical design metrics, and performance analyses for the two testbeds built.

  9. Renewable Energy Laboratory Development for Biofuels Advanced Combustion Studies

    SciTech Connect

    Soloiu, Valentin A.

    2012-03-31

    The research advanced fundamental science and applied engineering for increasing the efficiency of internal combustion engines and meeting emissions regulations with biofuels. The project developed a laboratory with new experiments and allowed investigation of new fuels and their combustion and emissions. This project supports a sustainable domestic biofuels and automotive industry, creating economic opportunities across the nation, reducing dependence on foreign oil, and enhancing U.S. energy security. The one-year period of research developed fundamental knowledge and applied technology in advanced combustion, emissions, and biofuels formulation to increase vehicle efficiency. Biofuels combustion was investigated in a compression ignition direct injection (DI) engine, to develop idling strategies with biofuels, and in an indirect diesel injection (IDI) engine intended for an auxiliary power unit.

  10. Incorporation of Advanced Laboratory Equipment into Introductory Physics Labs

    NASA Astrophysics Data System (ADS)

    Gilbert, John; Bellis, Matt; Cummings, John

    2015-04-01

    Siena College recently completed construction of the Stewart's Advanced Instrumentation and Technology Center (SAInt Center) which includes both a scanning electron microscope (SEM) and an atomic force microscope (AFM). The goal of this project is to design laboratory exercises for introductory physics courses that make use of this equipment. Early involvement with the SAInt Center aims to increase undergraduate lab skills and expand research possibilities. These lab exercises are tested on select students and evaluated as to their effectiveness in contributing to the learning goals. The current status of this work is presented here.

  11. Computer Simulation of Laboratory Experiments: An Unrealized Potential.

    ERIC Educational Resources Information Center

    Magin, D. J.; Reizes, J. A.

    1990-01-01

    Discussion of the use of computer simulation for laboratory experiments in undergraduate engineering education focuses on work at the University of New South Wales in the instructional design and software development of a package simulating a heat exchange device. The importance of integrating theory, design, and experimentation is also discussed.…

  12. Computer protection plan for the Superconducting Super Collider Laboratory

    SciTech Connect

    Hunter, S.

    1992-04-15

    The purpose of this document is to describe the current unclassified computer security program practices, policies, and procedures for the Superconducting Super Collider Laboratory (SSCL). This document includes or references all related policies and procedures currently implemented throughout the SSCL. The document also includes security practices planned for when the facility is fully operational.

  13. An Easily Assembled Laboratory Exercise in Computed Tomography

    ERIC Educational Resources Information Center

    Mylott, Elliot; Klepetka, Ryan; Dunlap, Justin C.; Widenhorn, Ralf

    2011-01-01

    In this paper, we present a laboratory activity in computed tomography (CT) primarily composed of a photogate and a rotary motion sensor that can be assembled quickly and partially automates data collection and analysis. We use an enclosure made with a light filter that is largely opaque in the visible spectrum but mostly transparent to the near…

  14. COMPUTATIONAL SCIENCE AT BROOKHAVEN NATIONAL LABORATORY: THREE SELECTED TOPICS.

    SciTech Connect

    Davenport, J.W.; Deng, Y.; Glimm, J.; Samulyak, R.

    2003-09-15

    We present an overview of computational science at Brookhaven National Laboratory (BNL), with selections from three areas: fluids, nanoscience, and biology. The work at BNL in each of these areas is itself very broad, and we select a few topics for presentation within each of them.

  15. Aerodynamic optimization studies on advanced architecture computers

    NASA Technical Reports Server (NTRS)

    Chawla, Kalpana

    1995-01-01

    The approach to carrying out multi-discipline aerospace design studies in the future, especially in massively parallel computing environments, comprises choosing (1) suitable solvers to compute solutions to equations characterizing a discipline, and (2) efficient optimization methods. In addition, for aerodynamic optimization problems, (3) smart methodologies must be selected to modify the surface shape. In this research effort, a 'direct' optimization method is implemented on the Cray C-90 to improve aerodynamic design. It is coupled with an existing implicit Navier-Stokes solver, OVERFLOW, to compute flow solutions. The optimization method is chosen such that it can accommodate multi-discipline optimization in future computations. In this work, however, only single-discipline aerodynamic optimization is included.

  16. Advances in Monte Carlo computer simulation

    NASA Astrophysics Data System (ADS)

    Swendsen, Robert H.

    2011-03-01

    Since the invention of the Metropolis method in 1953, Monte Carlo methods have been shown to provide an efficient, practical approach to the calculation of physical properties in a wide variety of systems. In this talk, I will discuss some of the advances in the MC simulation of thermodynamic systems, with an emphasis on optimization to obtain a maximum of useful information.
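
    The Metropolis method itself fits in a few lines. Below is a minimal 2D Ising model sampler written for illustration; the lattice size, inverse temperature, and sweep count are arbitrary choices, not values from the talk:

      import numpy as np

      rng = np.random.default_rng(0)
      L, beta = 32, 0.44          # lattice size; inverse temperature near critical
      spins = rng.choice([-1, 1], size=(L, L))

      def sweep(s):
          # One Metropolis sweep: propose single-spin flips across the lattice.
          for _ in range(s.size):
              i, j = rng.integers(L, size=2)
              nn = s[(i+1) % L, j] + s[(i-1) % L, j] + s[i, (j+1) % L] + s[i, (j-1) % L]
              dE = 2.0 * s[i, j] * nn   # energy change for flipping s[i, j]
              if dE <= 0 or rng.random() < np.exp(-beta * dE):
                  s[i, j] *= -1

      for _ in range(100):
          sweep(spins)
      print("magnetization per spin:", spins.mean())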

  17. Brookhaven National Laboratory's capabilities for advanced analyses of cyber threats

    SciTech Connect

    DePhillips, M. P.

    2014-01-01

    BNL has several ongoing, mature, and successful programs and areas of core scientific expertise that readily could be modified to address problems facing national security and efforts by the IC related to securing our nation’s computer networks. In supporting these programs, BNL houses an expansive, scalable infrastructure built exclusively for transporting, storing, and analyzing large disparate data-sets. Our ongoing research projects on various infrastructural issues in computer science undoubtedly would be relevant to national security. Furthermore, BNL frequently partners with researchers in academia and industry worldwide to foster unique and innovative ideas for expanding research opportunities and extending our insights. Because the basic science conducted at BNL is unique, such projects have led to advanced techniques, unlike any others, to support our mission of discovery. Many of them are modular techniques, thus making them ideal for abstraction and retrofitting to other uses including those facing national security, specifically the safety of the nation’s cyber space.

  18. An easily assembled laboratory exercise in computed tomography

    NASA Astrophysics Data System (ADS)

    Mylott, Elliot; Klepetka, Ryan; Dunlap, Justin C.; Widenhorn, Ralf

    2011-09-01

    In this paper, we present a laboratory activity in computed tomography (CT) primarily composed of a photogate and a rotary motion sensor that can be assembled quickly and partially automates data collection and analysis. We use an enclosure made with a light filter that is largely opaque in the visible spectrum but mostly transparent to the near IR light of the photogate (880 nm) to scan objects hidden from the human eye. This experiment effectively conveys how an image is formed during a CT scan and highlights the important physical and imaging concepts behind CT such as electromagnetic radiation, the interaction of light and matter, artefacts and windowing. Like our setup, previous undergraduate level laboratory activities which teach the basics of CT have also utilized light sources rather than x-rays; however, they required a more extensive setup and used devices not always easily found in undergraduate laboratories. Our setup is easily implemented with equipment found in many teaching laboratories.
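
    The image-formation idea the activity teaches can be sketched numerically: build a parallel-beam sinogram by rotating a phantom and summing along one axis, then reconstruct by smearing each projection back across the image. The sketch below uses unfiltered back projection (so the result is blurred) and a hypothetical square phantom; it is illustrative only and is not the authors' apparatus or software:

      import numpy as np
      from scipy.ndimage import rotate

      # Simple phantom: a bright square hidden in a dark field
      phantom = np.zeros((64, 64))
      phantom[24:40, 28:36] = 1.0

      angles = np.arange(0, 180, 2)   # projection angles in degrees
      # Forward projection: rotate, then sum along one axis (parallel-beam sinogram)
      sinogram = np.array([rotate(phantom, a, reshape=False).sum(axis=0) for a in angles])

      # Unfiltered back projection: smear each projection back across the image
      recon = np.zeros_like(phantom)
      for a, proj in zip(angles, sinogram):
          smear = np.tile(proj, (phantom.shape[0], 1))
          recon += rotate(smear, -a, reshape=False)
      recon /= len(angles)
      print("peak of the (blurred) reconstruction:",
            np.unravel_index(recon.argmax(), recon.shape))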

  19. Evaluation of Advanced Computing Techniques and Technologies: Reconfigurable Computing

    NASA Technical Reports Server (NTRS)

    Wells, B. Earl

    2003-01-01

    The focus of this project was to survey the technology of reconfigurable computing and determine its level of maturity and suitability for NASA applications; to better understand and assess the effectiveness of the reconfigurable design paradigm utilized within the HAL-15 reconfigurable computer system, which was made available to NASA MSFC for this purpose by Star Bridge Systems, Inc.; and to implement at least one application that would benefit from the performance levels possible with reconfigurable hardware. It was originally proposed that experiments in fault tolerance and dynamic reconfigurability would be performed, but time constraints mandated that these be pursued as future research.

  20. Space data systems: Advanced flight computers

    NASA Technical Reports Server (NTRS)

    Benz, Harry F.

    1991-01-01

    The technical objectives are to develop high-performance, space-qualifiable, onboard computing, storage, and networking technologies. The topics are presented in viewgraph form and include the following: technology challenges; state-of-the-art assessment; program description; relationship to external programs; and cooperation and coordination effort.

  1. Advances in Computer-Supported Learning

    ERIC Educational Resources Information Center

    Neto, Francisco; Brasileiro, Francisco

    2007-01-01

    The Internet and growth of computer networks have eliminated geographic barriers, creating an environment where education can be brought to a student no matter where that student may be. The success of distance learning programs and the availability of many Web-supported applications and multimedia resources have increased the effectiveness of…

  2. Advanced Computing Tools and Models for Accelerator Physics

    SciTech Connect

    Ryne, Robert; Ryne, Robert D.

    2008-06-11

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics.

  3. Virtual Computing Laboratories: A Case Study with Comparisons to Physical Computing Laboratories

    ERIC Educational Resources Information Center

    Burd, Stephen D.; Seazzu, Alessandro F.; Conway, Christopher

    2009-01-01

    Current technology enables schools to provide remote or virtual computing labs that can be implemented in multiple ways ranging from remote access to banks of dedicated workstations to sophisticated access to large-scale servers hosting virtualized workstations. This paper reports on the implementation of a specific lab using remote access to…

  4. Advances in Computationally Modeling Human Oral Bioavailability

    PubMed Central

    Wang, Junmei; Hou, Tingjun

    2015-01-01

    Although significant progress has been made in experimental high throughput screening (HTS) of ADME (absorption, distribution, metabolism, excretion) and pharmacokinetic properties, the ADME and Toxicity (ADME-Tox) in silico modeling is still indispensable in drug discovery as it can guide us to wisely select drug candidates prior to expensive ADME screenings and clinical trials. Compared to other ADME-Tox properties, human oral bioavailability (HOBA) is particularly important but extremely difficult to predict. In this paper, the advances in human oral bioavailability modeling will be reviewed. Moreover, our insights into how to construct more accurate and reliable HOBA QSAR and classification models will also be discussed. PMID:25582307
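
    A QSAR classification workflow of the kind reviewed can be sketched generically: a descriptor matrix, a binary bioavailability label, and a cross-validated classifier. The sketch below uses synthetic random descriptors and a hypothetical labeling rule purely for illustration; it is not the authors' model or data:

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(1)
      # Synthetic stand-ins: 200 compounds x 8 molecular descriptors
      X = rng.normal(size=(200, 8))
      # Hypothetical rule: "bioavailable" when a weighted descriptor sum is high
      y = (X @ rng.normal(size=8) + 0.5 * rng.normal(size=200)) > 0

      model = RandomForestClassifier(n_estimators=200, random_state=0)
      scores = cross_val_score(model, X, y, cv=5)
      print("cross-validated accuracy:", scores.mean().round(3))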

  5. The Advanced Laboratory: beyond the ``black box'' to real physics

    NASA Astrophysics Data System (ADS)

    Zimmerman, George; Sulak, Lawrence

    2008-04-01

    The balance between theory and experiment in present physics curricula is too heavily weighted towards theory. Our physics majors do not realize that ``truth in physics'' depends either on experimental verification of theoretical predictions or on serendipitous discovery. Nor do they appreciate that most theories originate to explain experimental facts. They regard instruments as ``black boxes'' (although usually they are painted a different color). The Advanced Laboratory is essentially the only place in the curriculum where students confront the link between theory and experiment. In this age of disposing of (rather than repairing) equipment, Advanced Lab gives students insight into the inner workings of instruments and essential hands-on skills exploiting them: soldering wires, transferring cryo liquids, achieving high vacuum, acquiring reliable data, evaluating errors, fitting data, and drafting a PRL. Students learn techniques critical to several branches of physics, leading to different experimental approaches in their eventual work. If a student pursues theory, AdLab teaches her how to evaluate experiments, experimentalists, and their data. The basic skills learned, and the resulting understanding of physics, will be illustrated with the experiment on the Quantum Hall Effect from our AdLab.

  6. Optimization of analytical laboratory work using computer networking and databasing

    SciTech Connect

    Upp, D.L.; Metcalf, R.A.

    1996-06-01

    The Health Physics Analysis Laboratory (HPAL) performs around 600,000 analyses for radioactive nuclides each year at Los Alamos National Laboratory (LANL). Analysis matrices vary from nasal swipes, air filters, work area swipes, and liquids to the bottoms of shoes and cat litter. HPAL uses 8 liquid scintillation counters, 8 gas proportional counters, and 9 high purity germanium detectors in 5 laboratories to perform these analyses. HPAL has developed a computer network between the labs and software to produce analysis results. The software and hardware package includes barcode sample tracking, log-in, chain of custody, analysis calculations, analysis result printing, and utility programs. All data are written to a database, mirrored on a central server, and eventually written to CD-ROM to provide online historical results. This system has greatly reduced the work required to produce analysis results as well as improving the quality of the work performed.
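
    The sample-tracking core of such a system reduces to a few relational tables linked by the barcode. A minimal sketch (SQLite via Python; the table names, columns, and sample record are hypothetical, not HPAL's actual schema):

      import sqlite3

      con = sqlite3.connect(":memory:")   # stand-in for the central server database
      con.executescript("""
      CREATE TABLE sample  (barcode TEXT PRIMARY KEY, matrix TEXT, received TEXT);
      CREATE TABLE custody (barcode TEXT REFERENCES sample, handler TEXT, action TEXT,
                            stamp TEXT DEFAULT CURRENT_TIMESTAMP);
      CREATE TABLE result  (barcode TEXT REFERENCES sample, nuclide TEXT,
                            activity REAL, units TEXT);
      """)
      con.execute("INSERT INTO sample VALUES ('HP-000123', 'air filter', '1996-06-01')")
      con.execute("INSERT INTO custody (barcode, handler, action) "
                  "VALUES ('HP-000123', 'jdoe', 'login')")
      con.execute("INSERT INTO result VALUES ('HP-000123', 'Cs-137', 0.42, 'Bq')")

      # Chain of custody and results stay joinable on the barcode
      for row in con.execute("""
          SELECT s.barcode, s.matrix, r.nuclide, r.activity, r.units
          FROM sample s JOIN result r USING (barcode)"""):
          print(row)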

  7. A field emission microscope in an advanced students' laboratory

    NASA Astrophysics Data System (ADS)

    Greczylo, Tomasz; Mazur, Piotr; Debowska, Ewa

    2006-03-01

    This paper describes a university level experiment during which students can observe the surface structure and determine the work function of a clean single tungsten crystal and a crystal covered with barium. The authors used a commercial field emission microscope offered by Leybold Didactic and designed an experiment which can be easily reproduced and performed in a students' laboratory. The use of a digital camera and computer allowed simultaneous observation and imaging of the surface of the body-centred cubic structure of the single tungsten crystal. Some interesting results about the changes in tungsten work function with time and with barium coverage are presented and discussed. The data help to improve knowledge and skills in the calculation of measurement uncertainty.
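
    The work-function determination in such an experiment typically rests on the Fowler-Nordheim relation: ln(I/V^2) is linear in 1/V, with a slope proportional to -phi^(3/2) divided by the tip field factor. A self-contained numerical sketch of the fit, where the prefactor, field factor, and voltage range are hypothetical and the current data are synthesized from the same relation they are fit with:

      import numpy as np

      b = 6.83e9           # V eV^-1.5 m^-1, Fowler-Nordheim exponent constant
      beta = 2.0e6         # assumed tip field factor (1/m); must be calibrated
      phi_true = 4.5       # eV, work function of clean tungsten (synthetic data)

      V = np.linspace(1500.0, 2500.0, 8)                        # tip voltage, volts
      I = 1e-4 * V**2 * np.exp(-b * phi_true**1.5 / (beta * V)) # synthetic current

      # Fowler-Nordheim plot: ln(I/V^2) vs 1/V has slope -b*phi^1.5/beta
      slope, _ = np.polyfit(1.0 / V, np.log(I / V**2), 1)
      phi = (-slope * beta / b) ** (2.0 / 3.0)
      print(f"recovered work function: {phi:.2f} eV")   # ~4.50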

  8. Jonathan F. Reichert and Barbara Wolff-Reichert Award for Excellence in Advanced Laboratory Instruction: Advanced Instructional Labs: Why Bother?

    NASA Astrophysics Data System (ADS)

    Bistrow, Van

    What aren't we teaching about physics in the traditional lecture course? Plenty! By offering the Advanced Laboratory Course, we hope to shed light on the following questions: How do we develop a systematic process of doing experiments? How do we record procedures and results? How should we interpret theoretical concepts in the real world? What experimental and computational techniques are available for producing and analyzing data? With what degree of confidence can we trust our measurements and interpretations? How well does a theory represent physical reality? How do we collaborate with experimental partners? How do we best communicate our findings to others?These questions are of fundamental importance to experimental physics, yet are not generally addressed by reading textbooks, attending lectures or doing homework problems. Thus, to provide a more complete understanding of physics, we offer laboratory exercises as a supplement to the other modes of learning. The speaker will describe some examples of experiments, and outline the history, structure and student impressions of the Advanced Lab course at the University of Chicago Department of Physics.

  9. Advances in computational studies of energy materials.

    PubMed

    Catlow, C R A; Guo, Z X; Miskufova, M; Shevlin, S A; Smith, A G H; Sokol, A A; Walsh, A; Wilson, D J; Woodley, S M

    2010-07-28

    We review recent developments and applications of computational modelling techniques in the field of materials for energy technologies including hydrogen production and storage, energy storage and conversion, and light absorption and emission. In addition, we present new work on an Sn2TiO4 photocatalyst containing an Sn(II) lone pair, new interatomic potential models for SrTiO3 and GaN, an exploration of defects in the kesterite/stannite-structured solar cell absorber Cu2ZnSnS4, and report details of the incorporation of hydrogen into Ag2O and Cu2O. Special attention is paid to the modelling of nanostructured systems, including ceria (CeO2, mixed Ce(x)O(y) and Ce2O3) and group 13 sesquioxides. We consider applications based on both interatomic potential and electronic structure methodologies; and we illustrate the increasingly quantitative and predictive nature of modelling in this field. PMID:20566517
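
    Interatomic potential work of the kind reviewed typically combines a Buckingham repulsion-dispersion term with a Coulomb term. A minimal sketch of evaluating such a pair energy (the O-O parameters shown are commonly quoted shell-model values used here only for illustration, not the new SrTiO3 or GaN models reported in the paper):

      import numpy as np

      def buckingham_coulomb(r, A, rho, C, qi, qj):
          # Pair energy in eV: A*exp(-r/rho) - C/r^6 + Coulomb term.
          # r in angstroms, charges in units of e; 14.3996 eV*A = e^2/(4*pi*eps0).
          return A * np.exp(-r / rho) - C / r**6 + 14.3996 * qi * qj / r

      # Commonly quoted O(2-)-O(2-) style parameters, for illustration only
      r = np.linspace(2.0, 5.0, 4)
      print(buckingham_coulomb(r, A=22764.0, rho=0.149, C=27.88, qi=-2.0, qj=-2.0))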

  10. The Nuclear Energy Advanced Modeling and Simulation Enabling Computational Technologies FY09 Report

    SciTech Connect

    Diachin, L F; Garaizar, F X; Henson, V E; Pope, G

    2009-10-12

    In this document we report on the status of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Enabling Computational Technologies (ECT) effort. In particular, we provide the context for ECT in the broader NEAMS program and describe the three pillars of the ECT effort, namely, (1) tools and libraries, (2) software quality assurance, and (3) computational facility (computers, storage, etc.) needs. We report on our FY09 deliverables to determine the needs of the integrated performance and safety codes (IPSCs) in these three areas and lay out the general plan for software quality assurance to meet the requirements of DOE and the DOE Advanced Fuel Cycle Initiative (AFCI). We conclude with a brief description of our interactions with the Idaho National Laboratory computer center to determine what is needed to expand its role as a NEAMS user facility.

  11. Activities and operations of the Advanced Computing Research Facility, July-October 1986

    SciTech Connect

    Pieper, G.W.

    1986-01-01

    Research activities and operations of the Advanced Computing Research Facility (ACRF) at Argonne National Laboratory are discussed for the period from July 1986 through October 1986. The facility is currently supported by the Department of Energy, and is operated by the Mathematics and Computer Science Division at Argonne. Over the past four-month period, a new commercial multiprocessor, the Intel iPSC-VX/d4 hypercube, was installed. In addition, four other commercial multiprocessors continue to be available for research - an Encore Multimax, a Sequent Balance 21000, an Alliant FX/8, and an Intel iPSC/d5 - as well as a locally designed multiprocessor, the Lemur. These machines are being actively used by scientists at Argonne and throughout the nation in a wide variety of projects concerning computer systems with parallel and vector architectures. A variety of classes, workshops, and seminars have been sponsored to train researchers on computing techniques for the advanced computer systems at the facility. For example, courses were offered on writing programs for parallel computer systems, and the facility hosted the first annual Alliant users group meeting. A Sequent users group meeting and a two-day workshop on performance evaluation of parallel computers and programs are being organized.

  12. Report on Computing and Networking in the Space Science Laboratory by the SSL Computer Committee

    NASA Technical Reports Server (NTRS)

    Gallagher, D. L. (Editor)

    1993-01-01

    The Space Science Laboratory (SSL) at Marshall Space Flight Center is a multiprogram facility. Scientific research is conducted in four discipline areas: earth science and applications, solar-terrestrial physics, astrophysics, and microgravity science and applications. Representatives from each of these discipline areas participate in a Laboratory computer requirements committee, which developed this document. The purpose is to establish and discuss Laboratory objectives for computing and networking in support of science, and to lay the foundation for a collective, multiprogram approach to providing these services. Special recognition is given to the importance of the national and international efforts of our research communities toward the development of interoperable, network-based computer applications.

  13. Final Report - Advanced Ion Trap Mass Spectrometry Program - Oak Ridge National Laboratory - Sandia National Laboratory

    SciTech Connect

    Whitten, W.B.

    2002-12-18

    This report covers the three main projects that collectively comprised the Advanced Ion Trap Mass Spectrometry Program. Chapter 1 describes the direct interrogation of individual particles by laser desorption within the ion trap mass spectrometer analyzer. The goals were (1) to develop an ''intelligent trigger'' capable of distinguishing particles of biological origin from those of nonbiological origin in the background and interferent particles and (2) to explore the capability for individual particle identification. Direct interrogation of particles by laser ablation and ion trap mass spectrometry was shown to have good promise for discriminating between particles of biological origin and those of nonbiological origin, although detailed protocols and operating conditions were not worked out. A library of more than 20,000 spectra of various types of biological particles has been assembled. Methods based on multivariate analysis and on neural networks were used to discriminate between particles of biological origin and those of nonbiological origin. It was possible to discriminate between at least some species of bacteria if mass spectra of several hundred similar particles were obtained. Chapter 2 addresses the development of a new ion trap mass analyzer geometry that offers the potential for a significant increase in ion storage capacity for a given set of analyzer operating conditions. This geometry may lead to the development of smaller, lower-power field-portable ion trap mass spectrometers while retaining laboratory-scale analytical performance. A novel ion trap mass spectrometer based on toroidal ion storage geometry has been developed. The analyzer geometry is based on the edge rotation of a quadrupolar ion trap cross section into the shape of a torus. Initial performance of this device was poor, however, due to the significant contribution of nonlinear fields introduced by the rotation of the symmetric ion-trapping geometry. These nonlinear resonances

  14. Advanced robotic technologies for transfer at Sandia National Laboratories

    SciTech Connect

    Bennett, P.C.

    1994-10-01

    Hazardous operations which have in the past been completed by technicians are under increased scrutiny due to high costs and low productivity associated with providing protective clothing and environments. As a result, remote systems are needed to accomplish many hazardous materials handling tasks such as the clean-up of waste sites in which the exposure of personnel to radiation, chemical, explosive and other hazardous constituents is unacceptable. Computer models augmented by sensing, and structured, modular computing environments are proving effective in automating many unstructured hazardous tasks. Work at Sandia National Laboratories (SNL) has focused on applying flexible automation (robotics) to meet the needs of the U.S. Department of Energy (USDOE). Dismantling facilities, environmental remediation, and materials handling in changing, hazardous environments lead to many technical challenges. Computer planning, monitoring and operator assistance shorten training cycles, reduce errors, and speed execution of operations. Robotic systems that re-use well-understood generic technologies can be much better characterized than robotic systems developed for a particular application, leading to more reliable and safer systems. Further safety in robotic operations results from the use of environmental sensors and knowledge of the task and environment. Collision detection and avoidance is achieved from such sensor integration and model-based control. This paper discusses selected technologies developed at SNL for use within the USDOE complex that have been or are ready for transfer to government and industrial suppliers. These technologies include sensors, sub-systems, and the design philosophy applied to quickly integrate them into a working robotic system. This paper represents the work of many people at the Intelligent Systems and Robotics Center at SNL, to whom the credit belongs.

  15. Laboratory Diagnosis of Lyme Disease - Advances and Challenges

    PubMed Central

    Marques, Adriana R.

    2015-01-01

    Synopsis: Lyme disease is the most common tick-borne illness in the United States and Europe. Culture for B. burgdorferi is not routinely available. PCR can be helpful in synovial fluid of patients with Lyme arthritis. The majority of laboratory tests performed for the diagnosis of Lyme disease are based on detection of the antibody responses against B. burgdorferi in serum. The sensitivity of antibody-based tests increases with the duration of the infection, and patients who present very early in their illness are more likely to have a negative result. Patients with erythema migrans should receive treatment based on the clinical diagnosis. The current Centers for Disease Control and Prevention recommendation for serodiagnosis of Lyme disease is a 2-tiered algorithm: an initial enzyme immunoassay (EIA) followed by separate IgM and IgG Western blots if the first EIA test result is positive or borderline. The IgM result is only relevant for patients with illness duration of less than a month. While the 2-tier algorithm works well for later stages of the infection, it has low sensitivity during early infection. A major advance has been the discovery of VlsE and its C6 peptide as markers of antibody response in Lyme disease. Specificity is extremely important in Lyme disease testing, as the majority of tests are performed in situations with low likelihood of the disease, a situation where a positive result is more likely to be a false positive. Current assays do not distinguish between active and inactive infection, and patients may continue to be seropositive for years. There is a need to simplify the testing algorithm for Lyme disease, improving sensitivity in early disease while still maintaining high specificity and providing information about the stage of infection. The development of a point-of-care assay and biomarkers for active infection would be major advances for the field. PMID:25999225
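
    The two-tiered decision logic described above can be written out explicitly. The following sketch encodes only what the synopsis states and is illustrative, not clinical guidance:

      def two_tier_lyme(eia_result, igm_blot, igg_blot, illness_days):
          # Sketch of the CDC two-tiered serodiagnosis logic described above.
          # eia_result: 'negative', 'positive', or 'borderline' first-tier EIA.
          # igm_blot/igg_blot: True if the corresponding Western blot is positive.
          if eia_result == "negative":
              return "negative (no second tier performed)"
          # Second tier: the IgM blot counts only within the first month of illness
          if igg_blot:
              return "positive (IgG immunoblot)"
          if igm_blot and illness_days < 30:
              return "positive (IgM immunoblot, early illness)"
          return "negative (second tier not confirmatory)"

      print(two_tier_lyme("positive", igm_blot=True, igg_blot=False, illness_days=14))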

  16. Use of advanced computers for aerodynamic flow simulation

    NASA Technical Reports Server (NTRS)

    Bailey, F. R.; Ballhaus, W. F.

    1980-01-01

    The current and projected use of advanced computers for large-scale aerodynamic flow simulation applied to engineering design and research is discussed. The design use of mature codes run on conventional, serial computers is compared with the fluid research use of new codes run on parallel and vector computers. The role of flow simulations in design is illustrated by the application of a three dimensional, inviscid, transonic code to the Sabreliner 60 wing redesign. Research computations that include a more complete description of the fluid physics by use of Reynolds averaged Navier-Stokes and large-eddy simulation formulations are also presented. Results of studies for a numerical aerodynamic simulation facility are used to project the feasibility of design applications employing these more advanced three dimensional viscous flow simulations.

  17. GeoBrain Computational Cyber-laboratory for Earth Science Studies

    NASA Astrophysics Data System (ADS)

    Deng, M.; di, L.

    2009-12-01

    Computational approaches (e.g., computer-based data visualization, analysis and modeling) are critical for conducting increasingly data-intensive Earth science (ES) studies to understand functions and changes of the Earth system. However, Earth scientists, educators, and students currently face two major barriers that prevent them from effectively using computational approaches in their learning, research and application activities. The two barriers are: 1) difficulties in finding, obtaining, and using multi-source ES data; and 2) lack of analytic functions and computing resources (e.g., analysis software, computing models, and high performance computing systems) to analyze the data. Taking advantage of recent advances in cyberinfrastructure, Web service, and geospatial interoperability technologies, GeoBrain, a project funded by NASA, has developed a prototype computational cyber-laboratory to effectively remove the two barriers. The cyber-laboratory makes ES data and computational resources at large organizations in distributed locations available to and easily usable by the Earth science community through 1) enabling seamless discovery, access and retrieval of distributed data, 2) federating and enhancing data discovery with a catalogue federation service and a semantically-augmented catalogue service, 3) customizing data access and retrieval at user request with interoperable, personalized, and on-demand data access and services, 4) automating or semi-automating multi-source geospatial data integration, 5) developing a large number of analytic functions as value-added, interoperable, and dynamically chainable geospatial Web services and deploying them in high-performance computing facilities, 6) enabling online geospatial process modeling and execution, and 7) building a user-friendly extensible web portal for users to access the cyber-laboratory resources. Users can interactively discover the needed data and perform on-demand data analysis and

  18. The Advanced Labs Website: resources for upper-level laboratories

    NASA Astrophysics Data System (ADS)

    Torres-Isea, Ramon

    2012-03-01

    The Advanced Labs web resource collection is an effort to create a central, comprehensive information base for college/university faculty who teach upper-level undergraduate laboratories. The website is produced by the American Association of Physics Teachers (AAPT). It is a part of ComPADRE, the online collection of resources in physics and astronomy education, which itself is a part of the National Science Foundation-funded National Science Digital Library (NSDL). After a brief review of its history, we will discuss the current status of the website while describing the various types of resources available at the site and presenting examples of each. We will detail a step-by-step procedure for submitting resources to the website. The resource collection is designed to be a community effort and thus welcomes input and contributions from its users. We will also present plans, and will seek audience feedback, for additional website services and features. The constraints, roadblocks, and rewards of this project will also be addressed.

  19. Computer-Assisted Foreign Language Teaching and Learning: Technological Advances

    ERIC Educational Resources Information Center

    Zou, Bin; Xing, Minjie; Wang, Yuping; Sun, Mingyu; Xiang, Catherine H.

    2013-01-01

    Computer-Assisted Foreign Language Teaching and Learning: Technological Advances highlights new research and an original framework that brings together foreign language teaching, experiments and testing practices that utilize the most recent and widely used e-learning resources. This comprehensive collection of research will offer linguistic…

  20. Advanced Placement Computer Science with Pascal. Volume 2. Experimental Edition.

    ERIC Educational Resources Information Center

    New York City Board of Education, Brooklyn, NY.

    This curriculum guide presents 100 lessons for an advanced placement course on programming in Pascal. Some of the topics covered include arrays, sorting, strings, sets, records, computers in society, files, stacks, queues, linked lists, binary trees, searching, hashing, and chaining. Performance objectives, vocabulary, motivation, aim,…

  1. [Activities of Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2001-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center, Moffett Field, California. RIACS research focuses on the three cornerstones of IT research necessary to meet the future challenges of NASA missions: 1. Automated Reasoning for Autonomous Systems: Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. 2. Human-Centered Computing: Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities. 3. High Performance Computing and Networking: Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to analysis of large scientific datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply IT research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, visiting scientist programs and student summer programs, designed to encourage and facilitate collaboration between the university and NASA IT research communities.

  2. Laboratory markers in ulcerative colitis: Current insights and future advances.

    PubMed

    Cioffi, Michele; Rosa, Antonella De; Serao, Rosalba; Picone, Ilaria; Vietri, Maria Teresa

    2015-02-15

    Ulcerative colitis (UC) and Crohn's disease (CD) are the major forms of inflammatory bowel disease (IBD) in man. Despite some common features, these forms can be distinguished by different genetic predispositions, risk factors and clinical, endoscopic and histological characteristics. The aetiology of both CD and UC remains unknown, but several lines of evidence suggest that CD and perhaps UC are due to an excessive immune response directed against normal constituents of the intestinal bacterial flora. Tests, sometimes invasive, are routine for the diagnosis and care of patients with IBD. Diagnosis of UC is based on clinical symptoms combined with radiological and endoscopic investigations. The employment of non-invasive biomarkers is needed. These biomarkers have the potential to avoid invasive diagnostic tests that may result in discomfort and potential complications. The ability to determine the type, severity, prognosis and response to therapy of UC using biomarkers has long been a goal of clinical researchers. We describe the biomarkers assessed in UC, with special reference to acute-phase proteins and serologic markers, and thereafter we describe the new biological markers and the biological markers that could be developed in the future: (1) serum markers of acute phase response: The laboratory tests most used to measure the acute-phase proteins in clinical practice are the serum concentration of C-reactive protein and the erythrocyte sedimentation rate. Other biomarkers of inflammation in UC include platelet count, leukocyte count, and serum albumin and serum orosomucoid concentrations; (2) serologic markers/antibodies: In the last decades serological and immunologic biomarkers have been studied extensively in immunology and have been used in clinical practice to detect specific pathologies. In UC, the presence of these antibodies can aid as surrogate markers for the aberrant host immune response; and (3) future biomarkers: The development of biomarkers in UC will be very

  3. Advances in Numerical Boundary Conditions for Computational Aeroacoustics

    NASA Technical Reports Server (NTRS)

    Tam, Christopher K. W.

    1997-01-01

    Advances in Computational Aeroacoustics (CAA) depend critically on the availability of accurate, nondispersive, minimally dissipative computation algorithms, as well as high-quality numerical boundary treatments. This paper focuses on recent developments in numerical boundary conditions. In a typical CAA problem, one often encounters two types of boundaries. Because a finite computation domain is used, there are external boundaries. On the external boundaries, boundary conditions simulating the solution outside the computation domain are to be imposed. Inside the computation domain, there may be internal boundaries. On these internal boundaries, boundary conditions simulating the presence of an object or surface with specific acoustic characteristics are to be applied. Numerical boundary conditions, both external and internal, developed for simple model problems are reviewed and examined. Numerical boundary conditions for real aeroacoustic problems are also discussed through specific examples. The paper concludes with a description of some much-needed research in numerical boundary conditions for CAA.
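
    As a rough illustration of the simplest kind of external boundary treatment such reviews cover (a sketch, not code from the paper), the following advects a pulse through a 1-D domain with a first-order upwind scheme; because the stencil is one-sided, the outflow edge needs no special closure and the pulse exits without numerical reflection. All parameters are arbitrary.

```python
import numpy as np

# 1-D advection u_t + c u_x = 0 with a first-order upwind scheme.
# The right (outflow) boundary needs no special closure: the upwind
# stencil is one-sided, so the pulse exits without reflecting back.
c, L, nx = 1.0, 10.0, 201
dx = L / (nx - 1)
dt = 0.5 * dx / c                      # CFL number 0.5
x = np.linspace(0.0, L, nx)
u = np.exp(-(x - 3.0) ** 2)            # Gaussian pulse starting at x = 3

for _ in range(400):                   # integrate to t = 10
    un = u.copy()
    u[1:] = un[1:] - c * dt / dx * (un[1:] - un[:-1])
    u[0] = 0.0                         # inflow boundary: no incoming signal

print("max |u| after the pulse has left the domain: %.2e" % np.abs(u).max())
```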

  5. Coupled explosive/structure computational techniques at Sandia National Laboratories

    SciTech Connect

    Preece, D.S.; Attaway, S.W.; Swegle, J.W.

    1997-06-01

    Simulation of the effects of explosives on structures is a challenge because the explosive response is best simulated using Eulerian computational techniques, while structural behavior is best modeled using Lagrangian methods. Because of the different methodologies of the two computational techniques and their code architecture requirements, they are usually implemented in different computer programs. Modeling the explosive and the structure in two different codes makes it difficult or next to impossible to do coupled explosive/structure interaction simulations. Sandia National Laboratories has developed two techniques for solving this problem. The first is called Smoothed Particle Hydrodynamics (SPH), a relatively new gridless method, comparable to Eulerian methods, that is especially suited for treating liquids and gases such as those produced by an explosive. The SPH capability has been fully implemented into the transient dynamics finite element (Lagrangian) codes PRONTO-2D and -3D. A PRONTO-3D/SPH simulation of the effect of a blast on a protective-wall barrier is presented in this paper. The second technique employed at Sandia uses a new code called Zapotec that combines the 3-D Eulerian code CTH and the Lagrangian code PRONTO-3D with minimal changes to either code. CTH and PRONTO-3D are currently executing on the Sandia Teraflops machine (9000 Pentium Pro processors). Eulerian simulations with 100 million cells have been completed on the current configuration of the machine (4500 Pentium Pro processors). The CTH and PRONTO-3D combination will soon be executing in a coupled fashion on this machine.
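
    For readers unfamiliar with SPH, the sketch below shows the gridless density estimate at its core: a kernel-weighted sum over neighboring particles. It uses the standard 1-D cubic-spline kernel and is a generic illustration, not code from PRONTO.

```python
import numpy as np

def w_cubic(r, h):
    """Standard 1-D cubic-spline SPH kernel (normalization 2/(3h))."""
    q = r / h
    w = np.zeros_like(q)
    m1 = q < 1.0
    m2 = (q >= 1.0) & (q < 2.0)
    w[m1] = 1.0 - 1.5 * q[m1] ** 2 + 0.75 * q[m1] ** 3
    w[m2] = 0.25 * (2.0 - q[m2]) ** 3
    return 2.0 / (3.0 * h) * w

# equal-mass particles on a unit line; density from kernel summation
x = np.linspace(0.0, 1.0, 101)
m, h = 1.0 / 101, 0.015                 # particle mass, smoothing length
rho = np.array([(m * w_cubic(np.abs(x - xi), h)).sum() for xi in x])
print("density at mid-domain: %.3f (should be near 1)" % rho[50])
```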

  6. MIT Laboratory for Computer Science Progress Report 26. Final technical report, July 1988-June 1989

    SciTech Connect

    Dertouzos, M.L.

    1989-06-01

    Contents: advanced network architecture; clinical decision making; computer architecture group; computation structures; information mechanics; mercury; parallel processing; programming methodology; programming systems research; spoken language systems; systematic program development; theory of computation; theory of distributed systems.

  7. The Design and Implementation of NASA's Advanced Flight Computing Module

    NASA Technical Reports Server (NTRS)

    Alkakaj, Leon; Straedy, Richard; Jarvis, Bruce

    1995-01-01

    This paper describes a working flight computer Multichip Module (MCM) developed jointly by JPL and TRW, in a collaborative fashion under their respective research programs. The MCM is fabricated by nCHIP and is packaged within a 2 by 4 inch Al package from Coors. This flight computer module is one of three modules under development by NASA's Advanced Flight Computer (AFC) program; development of the Mass Memory and the programmable I/O MCM modules will follow. The three building-block modules will then be stacked into a 3D MCM configuration. The mass and volume achieved for the flight computer MCM, 89 grams and 1.5 cubic inches respectively, represent a major enabling technology for future deep-space as well as commercial remote sensing applications.

  8. Computer Security Awareness Guide for Department of Energy Laboratories, Government Agencies, and others for use with Lawrence Livermore National Laboratory's (LLNL): Computer security short subjects videos

    SciTech Connect

    Not Available

    1993-12-31

    Lonnie Moore, the Computer Security Manager, CSSM/CPPM at Lawrence Livermore National Laboratory (LLNL) and Gale Warshawsky, the Coordinator for Computer Security Education & Awareness at LLNL, wanted to share topics such as computer ethics, software piracy, privacy issues, and protecting information in a format that would capture and hold an audience's attention. Four Computer Security Short Subject videos were produced which ranged from 1-3 minutes each. These videos are very effective education and awareness tools that can be used to generate discussions about computer security concerns and good computing practices. Leaders may incorporate the Short Subjects into presentations. After talking about a subject area, one of the Short Subjects may be shown to highlight that subject matter. Another method for sharing them could be to show a Short Subject first and then lead a discussion about its topic. The cast of characters and a bit of information about their personalities in the LLNL Computer Security Short Subjects is included in this report.

  9. STRUCTURED LEARNING AND TRAINING ENVIRONMENTS--A PREPARATION LABORATORY FOR ADVANCED MAMMALIAN PHYSIOLOGY.

    ERIC Educational Resources Information Center

    FIEL, NICHOLAS J.; JOHNSTON, RAYMOND F.

    A preparation laboratory was designed to familiarize students in advanced mammalian physiology with laboratory skills and techniques and thus shorten the time they spend setting up actual experiments. The laboratory lasts 30 minutes, is flexible and simple to operate, and does not require a professor's presence. The basic training unit is the…

  10. Utility programs used for EPA Contract Laboratory Program computations

    SciTech Connect

    Hwang, E.Y.; Wingender, R.J.

    1994-03-01

    Four computer programs were developed to manipulate GC/MS data and expedite both sample turnaround time and data package preparation for presentation of data following the EPA Contract Laboratory Program requirements. One program rapidly determines whether the GC/MS daily standard has met QC criteria, thereby allowing more samples to be analyzed within the 12-hour time limit. Another extracts total ion current for quantitation of tentatively identified compounds. The third obtains the limit of detection on an as-needed basis, and the fourth provides a list of the names of target compounds, internal standards, and surrogate standards from the daily standard, with their retention times in increasing order of elution. This paper describes the format, data entry approach, and procedures to set up and run these programs.

  11. Activities of the Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Oliger, Joseph

    1994-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under contract with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. Research at RIACS is currently being done in the following areas: (1) parallel computing; (2) advanced methods for scientific computing; (3) high performance networks; and (4) learning systems. RIACS technical reports are usually preprints of manuscripts that have been submitted to research journals or conference proceedings. A list of these reports for the period January 1, 1994 through December 31, 1994 is in the Reports and Abstracts section of this report.

  12. Advanced computer architecture specification for automated weld systems

    NASA Technical Reports Server (NTRS)

    Katsinis, Constantine

    1994-01-01

    This report describes the requirements for an advanced automated weld system and the associated computer architecture, and defines the overall system specification from a broad perspective. According to the requirements of welding procedures as they relate to an integrated multiaxis motion control and sensor architecture, the computer system requirements are developed based on a proven multiple-processor design with an expandable, distributed-memory, single-global-bus architecture containing individual processors assigned to specific tasks that support sensor or control processes. The specified architecture is sufficiently flexible to integrate previously developed equipment, be upgradable, and allow on-site modifications.

  13. Advanced computer graphic techniques for laser range finder (LRF) simulation

    NASA Astrophysics Data System (ADS)

    Bedkowski, Janusz; Jankowski, Stanislaw

    2008-11-01

    This paper shows advanced computer graphics techniques for laser range finder (LRF) simulation. The LRF is a common sensor for unmanned ground vehicles, autonomous mobile robots and security applications. Because the cost of the measurement system is extremely high, a simulation tool was designed. The simulation provides an opportunity to execute algorithms such as obstacle avoidance [1], SLAM for robot localization [2], detection of vegetation and water obstacles in the surroundings of the robot chassis [3], and LRF measurement in a crowd of people [1]. The Axis Aligned Bounding Box (AABB) technique and an alternative technique based on CUDA (NVIDIA Compute Unified Device Architecture) are presented.
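
    The ray/axis-aligned-bounding-box test named in the abstract is the classic "slab" method; the sketch below is a generic textbook version (not the paper's CUDA implementation) and assumes a ray direction with no zero components. In an LRF simulator, one such ray is cast per simulated laser beam, and the returned entry distance becomes the range sample.

```python
import numpy as np

def ray_aabb(origin, direction, box_min, box_max):
    """Slab test: return the entry distance along the ray, or None on a miss."""
    inv = 1.0 / direction                  # assumes no zero components
    t1 = (box_min - origin) * inv
    t2 = (box_max - origin) * inv
    t_near = np.minimum(t1, t2).max()      # latest entry across the three slabs
    t_far = np.maximum(t1, t2).min()       # earliest exit across the three slabs
    return t_near if t_near <= t_far and t_far >= 0.0 else None

origin = np.array([0.0, 0.0, 0.0])
beam = np.array([1.0, 0.2, 0.05])
beam /= np.linalg.norm(beam)               # unit direction of one laser beam
print(ray_aabb(origin, beam,
               np.array([2.0, -1.0, -1.0]), np.array([4.0, 1.0, 1.0])))
```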

  14. Computer assisted laboratory diagnosis: a ten-year experience.

    PubMed

    Zatti, M; Guidi, G; Marcolini, F

    1988-10-01

    An automated procedure to help general practitioners in clinical diagnosis and decision making is presented. The computer-based program is designed to process results from laboratory tests performed on outpatients, providing general practitioners with possible causes of abnormal results. When only one or two abnormal tests are observed, a series of suggestions pertinent to each abnormality is printed. When there are more abnormal test results, the program performs a more complex procedure ending with the output of some diagnostic hypotheses. Messages are also printed to focus the physician's attention on particular aspects of the patient's pathology that are sometimes missed or disregarded, and to suggest new investigations the laboratory can perform to improve diagnostic efficiency. Moreover, some advice is supplied to allow a better evaluation of particular risk conditions, such as those associated with the development of coronary heart disease. The program has recently been extended with the calculation of intraindividual reference intervals. The system described has been working since 1976 and appears particularly useful when the general practitioner is faced with a number of pathological results of difficult interpretation.
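
    A toy sketch of this rule-based flow is given below. The reference ranges, messages, and thresholds are invented for illustration; they are not those of the published system.

```python
# Hypothetical rule tables: single-test suggestions fire for one or two
# abnormalities, combined-pattern hypotheses fire when more are abnormal.
REFERENCE = {"ALT": (0, 40), "AST": (0, 40), "glucose": (70, 110)}  # U/L, mg/dL
SINGLE_RULES = {
    "ALT": "Raised ALT: consider hepatocellular injury; review medications.",
    "AST": "Raised AST: consider hepatic or muscle origin; check CK.",
    "glucose": "Raised glucose: consider repeat fasting test, HbA1c.",
}
COMBINED_RULES = {
    frozenset({"ALT", "AST"}): "Both transaminases raised: hepatitis panel suggested.",
}

def advise(results):
    abnormal = {t for t, v in results.items()
                if not REFERENCE[t][0] <= v <= REFERENCE[t][1]}
    if len(abnormal) <= 2:
        # one or two abnormalities: print a suggestion per test
        return [SINGLE_RULES[t] for t in sorted(abnormal)]
    # more abnormalities: fire combined-pattern diagnostic hypotheses
    return [msg for tests, msg in COMBINED_RULES.items() if tests <= abnormal]

print(advise({"ALT": 95, "AST": 80, "glucose": 130}))
```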

  15. Advanced computer modeling techniques expand belt conveyor technology

    SciTech Connect

    Alspaugh, M.

    1998-07-01

    Increased mining production is continuing to challenge engineers and manufacturers to keep up. The pressure to produce larger and more versatile equipment is increasing. This paper will show some recent major projects in the belt conveyor industry that have pushed the limits of design and engineering technology. Also, it will discuss the systems engineering discipline and advanced computer modeling tools that have helped make these achievements possible. Several examples of technologically advanced designs will be reviewed. However, new technology can sometimes produce increased problems with equipment availability and reliability if not carefully developed. Computer modeling techniques that help one design larger equipment can also compound operational headaches if engineering processes and algorithms are not carefully analyzed every step of the way.

  16. Soft computing in design and manufacturing of advanced materials

    NASA Technical Reports Server (NTRS)

    Cios, Krzysztof J.; Baaklini, George Y.; Vary, Alex

    1993-01-01

    The potential of fuzzy sets and neural networks, often referred to as soft computing, for aiding in all aspects of manufacturing of advanced materials like ceramics is addressed. In design and manufacturing of advanced materials, it is desirable to find which of the many processing variables contribute most to the desired properties of the material. There is also interest in real time quality control of parameters that govern material properties during processing stages. The concepts of fuzzy sets and neural networks are briefly introduced and it is shown how they can be used in the design and manufacturing processes. These two computational methods are alternatives to other methods such as the Taguchi method. The two methods are demonstrated by using data collected at NASA Lewis Research Center. Future research directions are also discussed.

  17. Industry and Academic Consortium for Computer Based Subsurface Geology Laboratory

    NASA Astrophysics Data System (ADS)

    Brown, A. L.; Nunn, J. A.; Sears, S. O.

    2008-12-01

    Twenty-two licenses for Petrel software, acquired through a grant from Schlumberger, are being used to redesign the laboratory portion of Subsurface Geology at Louisiana State University. The course redesign is a cooperative effort between LSU's Geology and Geophysics and Petroleum Engineering Departments and Schlumberger's Technical Training Division. In spring 2008, two laboratory sections were taught with 22 students in each section. The class contained geology majors, petroleum engineering majors, and geology graduate students. Limited enrollments and 3-hour labs make it possible to incorporate hands-on visualization, animation, manipulation of data and images, and access to geological data available online. 24/7 access to the laboratory and step-by-step instructions for Petrel exercises strongly promoted peer instruction and individual learning. Goals of the course redesign include: enhancing visualization of earth materials; strengthening students' ability to acquire, manage, and interpret multifaceted geological information; fostering critical thinking and the scientific method; improving student communication skills; providing cross-training between geologists and engineers; and increasing the quantity, quality, and diversity of students pursuing Earth Science and Petroleum Engineering careers. IT resources available in the laboratory provide students with sophisticated visualization tools, allowing them to switch between 2-D and 3-D reconstructions more seamlessly and enabling them to manipulate larger integrated data sets, thus permitting more time for critical thinking and hypothesis testing. IT resources also enable faculty and students to work simultaneously with the software to visually interrogate a 3D data set and immediately test hypotheses formulated in class. Preliminary evaluation of class results indicates that students found the MS-Windows-based Petrel easy to learn. By the end of the semester, students were able to not only map horizons and faults

  18. Advanced Combustion and Fuels; NREL (National Renewable Energy Laboratory)

    SciTech Connect

    Zigler, Brad

    2015-06-08

    Presented at the U.S. Department of Energy Vehicle Technologies Office 2015 Annual Merit Review and Peer Evaluation Meeting, held June 8-12, 2015, in Arlington, Virginia. It addresses the technical barriers of inadequate data and predictive tools for fuel and lubricant effects on advanced combustion engines; the strategy is, through collaboration, to develop techniques, tools, and data to quantify critical fuel physico-chemical effects and thereby enable development of advanced combustion engines that use alternative fuels.

  19. A Computationally Based Approach to Homogenizing Advanced Alloys

    SciTech Connect

    Jablonski, P D; Cowen, C J

    2011-02-27

    We have developed a computationally based approach to optimizing the homogenization heat treatment of complex alloys. The Scheil module within the Thermo-Calc software is used to predict the as-cast segregation present within alloys, and DICTRA (Diffusion Controlled TRAnsformations) is used to model the homogenization kinetics as a function of time, temperature and microstructural scale. We discuss this approach as it is applied to Ni-based superalloys as well as the computationally more complex case of alloys that solidify with more than one matrix phase as a result of segregation, as is typically observed in martensitic steels. With these alloys it is doubly important to homogenize them correctly, especially at the laboratory scale, since they are austenitic at high temperature and the constituent elements therefore diffuse slowly. The computationally designed heat treatment and its subsequent verification on real castings are presented.
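
    The kinetics behind such homogenization design can be approximated analytically: a sinusoidal segregation profile of wavelength lambda decays as exp(-4 pi^2 D t / lambda^2). The sketch below applies this relation with placeholder diffusivity values; the paper's actual calculations rely on Thermo-Calc/DICTRA for the full multicomponent problem.

```python
import numpy as np

# Placeholder Arrhenius diffusivity and microstructural scale; not data
# from the paper. D = D0 * exp(-Q / (R * T)).
R = 8.314                      # J/(mol K)
D0, Q = 1.0e-4, 280e3          # m^2/s, J/mol (illustrative solute values)
lam = 50e-6                    # m, secondary dendrite arm spacing

def time_to_homogenize(T, residual=0.01):
    """Hours at temperature T (K) to cut segregation amplitude to 1%."""
    D = D0 * np.exp(-Q / (R * T))
    t = -np.log(residual) * lam**2 / (4 * np.pi**2 * D)
    return t / 3600.0

for T in (1373, 1423, 1473):   # candidate soak temperatures, K
    print("%d K: %7.1f h" % (T, time_to_homogenize(T)))
```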

  20. High-Performance Computing for Advanced Smart Grid Applications

    SciTech Connect

    Huang, Zhenyu; Chen, Yousu

    2012-07-06

    The power grid is becoming far more complex as a result of the grid evolution meeting an information revolution. Due to the penetration of smart grid technologies, the grid is evolving at an unprecedented speed and the information infrastructure is fundamentally improved by a large number of smart meters and sensors that produce several orders of magnitude more data. How to pull data in, perform analysis, and put information out in real time is a fundamental challenge in smart grid operation and planning. The future power grid requires high performance computing to be one of the foundational technologies for developing the algorithms and tools needed to handle the significantly increased complexity. New techniques and computational capabilities are required to meet the demands for higher reliability and better asset utilization, including advanced algorithms and computing hardware for large-scale modeling, simulation, and analysis. This chapter summarizes the computational challenges in the smart grid and the need for high performance computing, and presents examples of how high performance computing might be used for future smart grid operation and planning.

  1. A Comparison between the Learning Environments of Open and Closed Computer Laboratories.

    ERIC Educational Resources Information Center

    Newby, Michael; Fisher, Darrell

    There are two main ways in which the practical component of computer science and information systems courses, the computer laboratory class, may be organized. They may be closed laboratories, which are scheduled and staffed in the same way as other classes, or open laboratories where the students come and go as they please. In U.S. universities,…

  2. Lawrence Livermore National Laboratory's Computer Security Short Subjects Videos: Hidden Password, The Incident, Dangerous Games and The Mess; Computer Security Awareness Guide

    SciTech Connect

    1993-12-31

    A video on computer security is described. Lonnie Moore, the Computer Security Manager, CSSM/CPPM at Lawrence Livermore National Laboratory (LLNL) and Gale Warshawsky, the Coordinator for Computer Security Education and Awareness at LLNL, wanted to share topics such as computer ethics, software piracy, privacy issues, and protecting information in a format that would capture and hold an audience's attention. Four Computer Security Short Subject videos were produced which ranged from 1-3 minutes each. These videos are very effective education and awareness tools that can be used to generate discussions about computer security concerns and good computing practices.

  3. The Particle Beam Optics Interactive Computer Laboratory for Personal Computers and Workstations

    NASA Astrophysics Data System (ADS)

    Gillespie, G. H.; Hill, B.; Brown, N.; Martono, H.; Moore, J.; Babcock, C.

    1997-05-01

    The Particle Beam Optics Interactive Computer Laboratory (PBO Lab) is a new software concept to aid both students and professionals in modeling charged particle beams and particle beam optical systems. The PBO Lab has been designed to run on several computer platforms and includes four key elements: (1) a graphic user interface shell; (2) a knowledge database on electric and magnetic optics elements, including interactive tutorials on the physics of charged particle optics and on the technology used in particle optics hardware; (3) a graphic construction kit for users to interactively and visually construct optical beam lines; and (4) a set of charged particle optics computational engines that compute transport matrices, beam envelopes and trajectories, fit parameters to optical constraints, and carry out similar calculations for the graphically defined beam lines. The primary computational engines in the first-generation PBO Lab are the third-order TRANSPORT code, the multiple ray tracing program TURTLE, and a new first-order matrix code that includes an envelope space-charge model with support for calculating single trajectories in the presence of the beam space charge. Progress on the PBO Lab development is described and a demonstration will be given.
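
    A minimal example of the first-order matrix optics that engines like TRANSPORT perform is sketched below (one transverse plane, thin-lens quadrupole); the real codes carry full 6-D, higher-order maps. Element parameters and the input beam matrix are illustrative.

```python
import numpy as np

def drift(L):
    """Transfer matrix of a field-free drift of length L (m)."""
    return np.array([[1.0, L], [0.0, 1.0]])

def thin_quad(f):
    """Thin-lens focusing quadrupole of focal length f (m)."""
    return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

# beam line: drift 1 m -> quad f = 0.5 m -> drift 1 m
# (rightmost matrix acts first on the incoming beam)
R = drift(1.0) @ thin_quad(0.5) @ drift(1.0)

# propagate the beam (sigma) matrix: sigma_out = R sigma R^T
sigma_in = np.diag([1e-6, 1e-6])     # m^2, rad^2 (illustrative beam)
sigma_out = R @ sigma_in @ R.T
print("beam size out: %.3g m" % np.sqrt(sigma_out[0, 0]))
```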

  4. The Synthesis and Proton NMR Spectrum of Methyl 7-Cycloheptatrienylacetate: An Advanced Undergraduate Laboratory Experiment.

    ERIC Educational Resources Information Center

    Jurch, G. R., Jr.; And Others

    1980-01-01

    Describes an advanced undergraduate laboratory experiment designed to give the senior chemistry student an opportunity to apply several synthetic and purification techniques as well as possibilities for the application of NMR spectroscopy. (CS)

  5. Frame synchronization in Jet Propulsion Laboratory's Advanced Multi-Mission Operations System (AMMOS)

    NASA Technical Reports Server (NTRS)

    Wilson, E.

    2002-01-01

    The Jet Propulsion Laboratory's Advanced Multi-Mission Operations System (AMMOS) processes data received from deep-space spacecraft, where error rates can be high, bit rates are low, and the data is uniquely precious.

  6. Cavity Ring down Spectroscopy Experiment for an Advanced Undergraduate Laboratory

    ERIC Educational Resources Information Center

    Stacewicz, T.; Wasylczyk, P.; Kowalczyk, P.; Semczuk, M.

    2007-01-01

    A simple experiment is described that permits advanced undergraduates to learn the principles and applications of the cavity ring down spectroscopy technique. The apparatus is used for measurements of low concentrations of NO[subscript 2] produced in air by an electric discharge. We present the setup, experimental procedure, data analysis and some…

  7. Acoustic resonance spectroscopy for the advanced undergraduate laboratory

    NASA Astrophysics Data System (ADS)

    Franco-Villafañe, J. A.; Flores-Olmedo, E.; Báez, G.; Gandarilla-Carrillo, O.; Méndez-Sánchez, R. A.

    2012-11-01

    We present a simple experiment that allows advanced undergraduates to learn the principles and applications of spectroscopy. The technique, known as acoustic resonance spectroscopy, is applied to study a vibrating rod. The setup includes electromagnetic-acoustic transducers, an audio amplifier and a vector network analyzer. Typical results of compressional, torsional and bending waves are analyzed and compared with analytical results.

  8. Whole-genome CNV analysis: advances in computational approaches

    PubMed Central

    Pirooznia, Mehdi; Goes, Fernando S.; Zandi, Peter P.

    2015-01-01

    Accumulating evidence indicates that DNA copy number variation (CNV) is likely to make a significant contribution to human diversity and also play an important role in disease susceptibility. Recent advances in genome sequencing technologies have enabled the characterization of a variety of genomic features, including CNVs. This has led to the development of several bioinformatics approaches to detect CNVs from next-generation sequencing data. Here, we review recent advances in CNV detection from whole genome sequencing. We discuss the informatics approaches and current computational tools that have been developed as well as their strengths and limitations. This review will assist researchers and analysts in choosing the most suitable tools for CNV analysis as well as provide suggestions for new directions in future development. PMID:25918519
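
    As a concrete illustration of the read-depth class of CNV callers surveyed in such reviews, the sketch below bins simulated coverage, normalizes to the diploid median, and flags departures from copy number 2. Real tools add GC correction, mappability filtering, and proper segmentation; everything here is synthetic.

```python
import numpy as np

# simulate per-bin read depth for a diploid genome, then spike in CNVs
rng = np.random.default_rng(0)
depth = rng.poisson(400, size=1000).astype(float)
depth[400:450] *= 1.5                  # heterozygous duplication (CN 3)
depth[700:720] *= 0.5                  # heterozygous deletion (CN 1)

# normalize to the diploid median and call departures from CN 2
cn = 2.0 * depth / np.median(depth)
calls = np.where(cn > 2.5, "gain", np.where(cn < 1.5, "loss", "neutral"))
for state in ("gain", "loss"):
    idx = np.flatnonzero(calls == state)
    print("%s bins %d-%d" % (state, idx.min(), idx.max()))
```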

  9. In Situ Techniques for Monitoring Electrochromism: An Advanced Laboratory Experiment

    ERIC Educational Resources Information Center

    Saricayir, Hakan; Uce, Musa; Koca, Atif

    2010-01-01

    This experiment employs current technology to enhance and extend existing lab content. The basic principles of spectroscopic and electroanalytical techniques and their use in determining material properties are covered in some detail in many undergraduate chemistry programs. However, there are limited examples of laboratory experiments with in…

  10. A Simple Photochemical Experiment for the Advanced Laboratory.

    ERIC Educational Resources Information Center

    Rosenfeld, Stuart M.

    1986-01-01

    Describes an experiment to provide students with: (1) an introduction to photochemical techniques and theory; (2) an experience with semimicro techniques; (3) an application of carbon-14 nuclear magnetic resonance; and (4) a laboratory with some qualities of a genuine experiment. These criteria are met in the photooxidation of 9,…

  11. Advances in adaptive structures at Jet Propulsion Laboratory

    NASA Technical Reports Server (NTRS)

    Wada, Ben K.; Garba, John A.

    1993-01-01

    Future proposed NASA missions with the need for large deployable or erectable precision structures will require solutions to many technical problems. The Jet Propulsion Laboratory (JPL) is developing new technologies in Adaptive Structures to meet these challenges. The technology requirements, approaches to meet the requirements using Adaptive Structures, and the recent JPL research results in Adaptive Structures are described.

  12. Software for the ACP (Advanced Computer Program) multiprocessor system

    SciTech Connect

    Biel, J.; Areti, H.; Atac, R.; Cook, A.; Fischler, M.; Gaines, I.; Kaliher, C.; Hance, R.; Husby, D.; Nash, T.

    1987-02-02

    Software has been developed for use with the Fermilab Advanced Computer Program (ACP) multiprocessor system. The software was designed to make a system of a hundred independent node processors as easy to use as a single, powerful CPU. Subroutines have been developed by which a user's host program can send data to and get results from the program running in each of his ACP node processors. Utility programs make it easy to compile and link host and node programs, to debug a node program on an ACP development system, and to submit a debugged program to an ACP production system.
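
    The host/node pattern described can be mimicked in miniature with ordinary interprocess pipes. The sketch below is a toy Python analogue, not the ACP software; the summation stands in for a node's physics computation.

```python
from multiprocessing import Process, Pipe

def node_main(conn):
    """Each 'node' loops: receive an event record, send back a result."""
    while True:
        event = conn.recv()
        if event is None:            # shutdown signal from the host
            break
        conn.send(sum(event))        # stand-in for the real computation

if __name__ == "__main__":
    links, nodes = [], []
    for _ in range(4):               # four independent node processes
        host_end, node_end = Pipe()
        p = Process(target=node_main, args=(node_end,))
        p.start()
        links.append(host_end)
        nodes.append(p)

    # host farms out "events" round-robin and collects the results
    events = [[i, i + 1, i + 2] for i in range(8)]
    for i, ev in enumerate(events):
        links[i % 4].send(ev)
    results = [links[i % 4].recv() for i in range(len(events))]
    print(results)

    for link in links:
        link.send(None)              # tell each node to exit
    for p in nodes:
        p.join()
```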

  13. Advances in Computational Stability Analysis of Composite Aerospace Structures

    SciTech Connect

    Degenhardt, R.; Araujo, F. C. de

    2010-09-30

    The European aircraft industry demands reduced development and operating costs. Structural weight reduction by exploitation of structural reserves in composite aerospace structures contributes to this aim; however, it requires accurate and experimentally validated stability analysis of real structures under realistic loading conditions. This paper presents several advances in the computational stability analysis of composite aerospace structures that contribute to this field. For stringer-stiffened panels, main results of the completed EU project COCOMAT are given. It investigated the exploitation of reserves in primary fibre composite fuselage structures through an accurate and reliable simulation of postbuckling and collapse. For unstiffened cylindrical composite shells, a proposal for a new design method is presented.

  14. Computer modeling for advanced life support system analysis.

    PubMed

    Drysdale, A

    1997-01-01

    This article discusses the equivalent mass approach to advanced life support system analysis, describes a computer model developed to use this approach, and presents early results from modeling the NASA JSC BioPlex. The model is built using an object-oriented approach and G2, a commercially available modeling package. Cost factor equivalencies are given for the Volosin scenarios. Plant data from NASA KSC and Utah State University (USU) are used, together with configuration data from the BioPlex design effort. Initial results focus on the importance of obtaining high plant productivity with a flight-like configuration. PMID:11540448
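
    A hedged sketch of the equivalent-mass bookkeeping follows. The equivalency factors below are placeholders chosen only to show the arithmetic; they are not the Volosin values used in the article.

```python
# every life-support resource is converted to kilograms through
# mission-specific equivalency factors (all values are placeholders)
FACTORS = {"volume": 9.2,     # kg per m^3 of pressurized volume
           "power": 237.0,    # kg per kW of electrical power
           "cooling": 60.0}   # kg per kW of heat rejection

def equivalent_mass(hardware_kg, volume_m3, power_kw, cooling_kw):
    """Collapse mass, volume, power, and cooling onto a single kg axis."""
    return (hardware_kg
            + volume_m3 * FACTORS["volume"]
            + power_kw * FACTORS["power"]
            + cooling_kw * FACTORS["cooling"])

# compare a compact physico-chemical unit with a large plant chamber
print("P/C unit:  %.0f kg-equivalent" % equivalent_mass(150, 2, 1.5, 1.5))
print("plant bay: %.0f kg-equivalent" % equivalent_mass(900, 20, 10, 10))
```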

  15. Condition monitoring through advanced sensor and computational technology : final report (January 2002 to May 2005).

    SciTech Connect

    Kim, Jung-Taek; Luk, Vincent K.

    2005-05-01

    The overall goal of this joint research project was to develop and demonstrate advanced sensors and computational technology for continuous monitoring of the condition of components, structures, and systems in advanced and next-generation nuclear power plants (NPPs). This project included investigating and adapting several advanced sensor technologies from Korean and US national laboratory research communities, some of which were developed and applied in non-nuclear industries. The project team investigated and developed sophisticated signal processing, noise reduction, and pattern recognition techniques and algorithms. The researchers installed sensors and conducted condition monitoring tests on two test loops, a check valve (an active component) and a piping elbow (a passive component), to demonstrate the feasibility of using advanced sensors and computational technology to achieve the project goal. Acoustic emission (AE) devices, optical fiber sensors, accelerometers, and ultrasonic transducers (UTs) were used to detect mechanical vibratory response of check valve and piping elbow in normal and degraded configurations. Chemical sensors were also installed to monitor the water chemistry in the piping elbow test loop. Analysis results of processed sensor data indicate that it is feasible to differentiate between the normal and degraded (with selected degradation mechanisms) configurations of these two components from the acquired sensor signals, but it is questionable that these methods can reliably identify the level and type of degradation. Additional research and development efforts are needed to refine the differentiation techniques and to reduce the level of uncertainties.
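
    The sketch below illustrates the kind of simple time-domain features (RMS, kurtosis) such signal-processing pipelines can compute to separate normal from degraded vibration records. The signals are synthetic, and the study's actual algorithms are more elaborate.

```python
import numpy as np

# synthetic 1 s vibration records sampled at 10 kHz: a clean 50 Hz tone
# plus noise, and the same signal with sparse impact spikes (degradation)
rng = np.random.default_rng(2)
t = np.arange(0, 1, 1 / 10_000)
normal = np.sin(2 * np.pi * 50 * t) + 0.2 * rng.standard_normal(t.size)
degraded = normal + (rng.random(t.size) < 0.002) * 5.0

def features(x):
    """RMS level and kurtosis; impacts inflate kurtosis sharply."""
    rms = np.sqrt(np.mean(x ** 2))
    kurt = np.mean((x - x.mean()) ** 4) / np.var(x) ** 2
    return rms, kurt

for name, sig in (("normal", normal), ("degraded", degraded)):
    print("%-8s RMS=%.2f kurtosis=%.1f" % ((name,) + features(sig)))
```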

  16. A Comprehensive Microfluidics Device Construction and Characterization Module for the Advanced Undergraduate Analytical Chemistry Laboratory

    ERIC Educational Resources Information Center

    Piunno, Paul A. E.; Zetina, Adrian; Chu, Norman; Tavares, Anthony J.; Noor, M. Omair; Petryayeva, Eleonora; Uddayasankar, Uvaraj; Veglio, Andrew

    2014-01-01

    An advanced analytical chemistry undergraduate laboratory module on microfluidics that spans 4 weeks (4 h per week) is presented. The laboratory module focuses on comprehensive experiential learning of microfluidic device fabrication and the core characteristics of microfluidic devices as they pertain to fluid flow and the manipulation of samples.…

  17. PREFACE: 16th International workshop on Advanced Computing and Analysis Techniques in physics research (ACAT2014)

    NASA Astrophysics Data System (ADS)

    Fiala, L.; Lokajicek, M.; Tumova, N.

    2015-05-01

    This volume of the IOP Conference Series is dedicated to scientific contributions presented at the 16th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2014); this year the motto was "bridging disciplines". The conference took place on September 1-5, 2014, at the Faculty of Civil Engineering, Czech Technical University in Prague, Czech Republic. The 16th edition of ACAT explored the boundaries of computing system architectures, data analysis algorithmics, automatic calculations, and theoretical calculation technologies. It provided a forum for confronting and exchanging ideas among these fields, where new approaches in computing technologies for scientific research were explored and promoted. This year's edition of the workshop brought together over 140 participants from all over the world. The workshop's 16 invited speakers presented key topics on advanced computing and analysis techniques in physics. During the workshop, 60 talks and 40 posters were presented in three tracks: Computing Technology for Physics Research, Data Analysis - Algorithms and Tools, and Computations in Theoretical Physics: Techniques and Methods. The round table enabled discussions on expanding software, knowledge sharing and scientific collaboration in the respective areas. ACAT 2014 was generously sponsored by Western Digital, Brookhaven National Laboratory, Hewlett Packard, DataDirect Networks, M Computers, Bright Computing, Huawei and PDV-Systemhaus. Special appreciation goes to the track liaisons Lorenzo Moneta, Axel Naumann and Grigory Rubtsov for their work on the scientific program and the publication preparation. ACAT's IACC would also like to express its gratitude to all referees for their work on making sure the contributions are published in the proceedings. Our thanks extend to the conference liaisons Andrei Kataev and Jerome Lauret, who worked with the local contacts and made this conference possible, as well as to the program

  18. Advanced coordinate measuring machine at Sandia National Laboratories

    NASA Astrophysics Data System (ADS)

    Pilkey, R. D.; Klevgard, P. A.

    1993-03-01

    Sandia National Laboratories/California has acquired a new Moore M-48V CNC five-axis universal coordinate measuring machine (CMM). Site preparation, acceptance testing, and initial performance results are discussed. Unique features of the machine include a ceramic ram and vacuum evacuated laser pathways (VELPS). The implementation of a VELPS system on the machine imposed certain design requirements and entailed certain start-up problems. The machine's projected capabilities, workload, and research possibilities are outlined.

  19. Advanced coordinate measuring machine at Sandia National Laboratories/California

    SciTech Connect

    Pilkey, R.D.; Klevgard, P.A.

    1993-03-01

    Sandia National Laboratories/California has acquired a new Moore M-48V CNC five-axis universal coordinate measuring machine (CMM). Site preparation, acceptance testing, and initial performance results are discussed. Unique features of the machine include a ceramic ram and vacuum evacuated laser pathways (VELPS). The implementation of a VELPS system on the machine imposed certain design requirements and entailed certain start-up problems. The machine's projected capabilities, workload, and research possibilities are outlined.

  1. Advanced Benchmarking for Complex Building Types: Laboratories as an Exemplar

    SciTech Connect

    Mathew, Paul A.; Clear, Robert; Kircher, Kevin; Webster, Tom; Lee, Kwang Ho; Hoyt, Tyler

    2010-08-01

    Complex buildings such as laboratories, data centers and cleanrooms present particular challenges for energy benchmarking because it is difficult to normalize special requirements, such as health and safety in laboratories and reliability (i.e., system redundancy to maintain uptime) in data centers, which significantly impact energy use. For example, air change requirements vary widely based on the type of work being performed in each laboratory space. We present methods and tools for energy benchmarking in laboratories, as an exemplar of a complex building type. First, we address whole-building energy metrics and normalization parameters. We present empirical methods based on simple data filtering as well as multivariate regression analysis on the Labs21 database. The regression analysis showed lab type, lab-area ratio and occupancy hours to be significant variables. Yet the dataset did not allow analysis of factors such as plug loads and air change rates, both of which are critical to lab energy use. The simulation-based method uses an EnergyPlus model to generate a benchmark energy intensity normalized for a wider range of parameters. We suggest that these two methods have complementary strengths and limitations. Second, we present "action-oriented" benchmarking, which extends whole-building benchmarking by utilizing system-level features and metrics, such as airflow W/cfm, to quickly identify a list of potential efficiency actions which can then be used as the basis for a more detailed audit. While action-oriented benchmarking is not an "audit in a box" and is not intended to provide the same degree of accuracy afforded by an energy audit, we demonstrate how it can be used to focus and prioritize audit activity and track performance at the system level. We conclude with key principles that are more broadly applicable to other complex building types.
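
    The multivariate regression step can be illustrated with ordinary least squares on the variables the abstract reports as significant (lab type, lab-area ratio, occupancy hours). The data rows below are synthetic placeholders, not Labs21 entries.

```python
import numpy as np

# columns: lab-area ratio, weekly occupancy hours, 1 if chemistry lab
X = np.array([[0.45, 60, 1], [0.30, 80, 0], [0.60, 100, 1],
              [0.50, 50, 0], [0.70, 90, 1], [0.25, 40, 0]], float)
eui = np.array([310, 240, 420, 260, 450, 180], float)  # kWh/m^2/yr

A = np.column_stack([np.ones(len(X)), X])              # add an intercept
coef, *_ = np.linalg.lstsq(A, eui, rcond=None)         # fit benchmark model
predicted = A @ coef
print("coefficients:", np.round(coef, 1))
print("building 0 benchmark: %.0f vs actual %.0f" % (predicted[0], eui[0]))
```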

  2. Advances in the laboratory culture of octopuses for biomedical research.

    PubMed

    Hanlon, R T; Forsythe, J W

    1985-02-01

    Five species of Octopus were cultured in pilot, large-scale 2,600 liter circulating seawater systems. Improvements in system design, water management and culture methodology were described. These five species all produced large eggs and correspondingly large hatchlings that had no planktonic or larval stage and thus were easier to culture. Octopuses grew well only when fed live marine crustaceans, fishes and other molluscs. Growth occurred as a 4-7% increase in body weight per day during the early exponential growth phase and 2-4% during the latter 1/2 to 3/4 of the life cycle, which ranged from 6-15 months depending upon species. All species reproduced in captivity. Survival was 70-80% when octopuses were reared in individual containers, but in group culture survival dropped to as low as 40% by the adult stage. Causes of mortality were species-specific and included hatchling abnormalities, escapes, aggression, cannibalism, disease, senescence and laboratory accidents. Octopus bimaculoides showed superior qualities for laboratory culture. The future potential of providing American scientists with laboratory-cultured octopuses was discussed along with their uses in biomedical research. PMID:3981958

  4. Advances in parallel computer technology for desktop atmospheric dispersion models

    SciTech Connect

    Bian, X.; Ionescu-Niscov, S.; Fast, J.D.; Allwine, K.J.

    1996-12-31

    Desktop models are models used by analysts with varied backgrounds to perform, for example, air quality assessment and emergency response activities. These models must be robust and well documented, have minimal and well-controlled user inputs, and have clear outputs. Existing coarse-grained parallel computers can provide significant increases in computation speed in desktop atmospheric dispersion modeling without considerable increases in hardware cost. This increased speed will allow significant improvements to be made in the scientific foundations of these applied models, in the form of more advanced diffusion schemes and better representation of the wind and turbulence fields. This is especially attractive for emergency response applications, where speed and accuracy are of utmost importance. This paper describes one particular application of coarse-grained parallel computer technology to a desktop complex-terrain atmospheric dispersion modeling system. By comparing performance characteristics of the coarse-grained parallel version of the model with the single-processor version, we demonstrate that applying coarse-grained parallel computer technology to desktop atmospheric dispersion modeling systems will allow us to address critical issues facing future requirements of this class of dispersion models.
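
    A minimal sketch of this coarse-grained decomposition follows: the receptor grid is split into chunks and each worker evaluates the dispersion kernel independently. A steady Gaussian plume stands in for the real complex-terrain model, and all source parameters are illustrative.

```python
import numpy as np
from multiprocessing import Pool

Q, U, H = 10.0, 5.0, 50.0                  # g/s, wind m/s, stack height m

def conc(receptors):
    """Ground-level Gaussian plume (with ground reflection) at receptors."""
    x, y = receptors[:, 0], receptors[:, 1]
    sy, sz = 0.08 * x**0.9, 0.06 * x**0.9  # crude dispersion coefficients
    return (Q / (np.pi * U * sy * sz)
            * np.exp(-0.5 * (y / sy) ** 2)
            * np.exp(-0.5 * (H / sz) ** 2))

if __name__ == "__main__":
    xs, ys = np.meshgrid(np.linspace(100, 5000, 500),
                         np.linspace(-500, 500, 200))
    grid = np.column_stack([xs.ravel(), ys.ravel()])
    with Pool(4) as pool:                  # four coarse-grained workers
        parts = pool.map(conc, np.array_split(grid, 4))
    c = np.concatenate(parts)
    print("peak concentration: %.2e g/m^3" % c.max())
```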

  5. DOE Advanced Scientific Computing Advisory Committee (ASCAC) Report: Exascale Computing Initiative Review

    SciTech Connect

    Reed, Daniel; Berzins, Martin; Pennington, Robert; Sarkar, Vivek; Taylor, Valerie

    2015-08-01

    On November 19, 2014, the Advanced Scientific Computing Advisory Committee (ASCAC) was charged with reviewing the Department of Energy’s conceptual design for the Exascale Computing Initiative (ECI). In particular, this included assessing whether there are significant gaps in the ECI plan or areas that need to be given priority or extra management attention. Given the breadth and depth of previous reviews of the technical challenges inherent in exascale system design and deployment, the subcommittee focused its assessment on organizational and management issues, considering technical issues only as they informed organizational or management priorities and structures. This report presents the observations and recommendations of the subcommittee.

  6. Advancing Materials Science using Neutrons at Oak Ridge National Laboratory

    SciTech Connect

    Carpenter, John

    2014-04-24

    Jack Carpenter, pioneer of accelerator-based pulsed spallation neutron sources, talks about neutron science at Oak Ridge National Laboratory (ORNL) and the need for a second target station at the Spallation Neutron Source (SNS). ORNL is the Department of Energy's largest multiprogram science and energy laboratory, and is home to two scientific user facilities serving the neutron science research community: the High Flux Isotope Reactor (HFIR) and SNS. HFIR and SNS provide researchers with unmatched capabilities for understanding the structure and properties of materials, macromolecular and biological systems, and the fundamental physics of the neutron. Neutrons provide a window through which to view materials at a microscopic level, allowing researchers to develop better materials and better products. Neutrons enable us to understand the materials we use in everyday life. Carpenter explains the need for another station to produce long-wavelength (cold) neutrons, to answer questions that can be addressed only with cold neutrons. The second target station is optimized for that purpose. Modern technology depends more and more upon intimate atomic knowledge of materials, and neutrons are an ideal probe.

  7. NMR of a Phospholipid: Modules for Advanced Laboratory Courses

    NASA Astrophysics Data System (ADS)

    Gaede, Holly C.; Stark, Ruth E.

    2001-09-01

    A laboratory project is described that builds upon the NMR experience undergraduates receive in organic chemistry with a battery of NMR experiments that investigate egg phosphatidylcholine (egg PC). This material, often labeled in health food stores as lecithin, is a major constituent of mammalian cell membranes. The NMR experiments may be used to make resonance assignments, to study molecular organization in model membranes, to test the effects of instrumental parameters, and to investigate the physics of nuclear spin systems. A suite of modular NMR exercises is described, so that the instructor may tailor the laboratory sessions to biochemistry, instrumental analysis, or physical chemistry. The experiments include solution-state one-dimensional (1D) 1H, 13C, and 31P experiments; two-dimensional (2D) TOtal Correlated SpectroscopY (TOCSY); and the spectral editing technique of Distortionless Enhancement by Polarization Transfer (DEPT). To demonstrate the differences between solution and solid-state NMR spectroscopy and instrumentation, a second set of experiments generates 1H, 13C, and 31P spectra of egg PC dispersed in aqueous solution, under both static and magic-angle spinning conditions.

  8. Advancing Materials Science using Neutrons at Oak Ridge National Laboratory

    ScienceCinema

    Carpenter, John

    2016-07-12

    Jack Carpenter, pioneer of accelerator-based pulsed spallation neutron sources, talks about neutron science at Oak Ridge National Laboratory (ORNL) and the need for a second target station at the Spallation Neutron Source (SNS). ORNL is the Department of Energy's largest multiprogram science and energy laboratory, and is home to two scientific user facilities serving the neutron science research community: the High Flux Isotope Reactor (HFIR) and SNS. HFIR and SNS provide researchers with unmatched capabilities for understanding the structure and properties of materials, macromolecular and biological systems, and the fundamental physics of the neutron. Neutrons provide a window through which to view materials at a microscopic level, allowing researchers to develop better materials and better products. Neutrons enable us to understand the materials we use in everyday life. Carpenter explains the need for another station to produce long-wavelength (cold) neutrons, to answer questions that can be addressed only with cold neutrons. The second target station is optimized for that purpose. Modern technology depends more and more upon intimate atomic knowledge of materials, and neutrons are an ideal probe.

  9. Recent Advances in Computational Mechanics of the Human Knee Joint

    PubMed Central

    Kazemi, M.; Dabiri, Y.; Li, L. P.

    2013-01-01

    Computational mechanics has been advanced in every area of orthopedic biomechanics. The objective of this paper is to provide a general review of the computational models used in the analysis of the mechanical function of the knee joint in different loading and pathological conditions. Major review articles published in related areas are summarized first. The constitutive models for soft tissues of the knee are briefly discussed to facilitate understanding the joint modeling. A detailed review of the tibiofemoral joint models is presented thereafter. The geometry reconstruction procedures as well as some critical issues in finite element modeling are also discussed. Computational modeling can be a reliable and effective method for the study of mechanical behavior of the knee joint, if the model is constructed correctly. Single-phase material models have been used to predict the instantaneous load response for the healthy knees and repaired joints, such as total and partial meniscectomies, ACL and PCL reconstructions, and joint replacements. Recently, poromechanical models accounting for fluid pressurization in soft tissues have been proposed to study the viscoelastic response of the healthy and impaired knee joints. While the constitutive modeling has been considerably advanced at the tissue level, many challenges still exist in applying a good material model to three-dimensional joint simulations. A complete model validation at the joint level seems impossible presently, because only simple data can be obtained experimentally. Therefore, model validation may be concentrated on the constitutive laws using multiple mechanical tests of the tissues. Extensive model verifications at the joint level are still crucial for the accuracy of the modeling. PMID:23509602

  10. A Timesharing Computer Program for a General Chemistry Laboratory

    ERIC Educational Resources Information Center

    Cutler, Gary L.; Drum, Donald A.

    1975-01-01

    Describes an experiment in which general and physical chemistry students can determine the heat of vaporization of a volatile substance from experimental laboratory data using timesharing techniques. (MLH)

  11. SEED: A Suite of Instructional Laboratories for Computer Security Education

    ERIC Educational Resources Information Center

    Du, Wenliang; Wang, Ronghua

    2008-01-01

    The security and assurance of our computing infrastructure has become a national priority. To address this priority, higher education has gradually incorporated the principles of computer and information security into the mainstream undergraduate and graduate computer science curricula. To achieve effective education, learning security principles…

  12. NASA Trapezoidal Wing Computations Including Transition and Advanced Turbulence Modeling

    NASA Technical Reports Server (NTRS)

    Rumsey, C. L.; Lee-Rausch, E. M.

    2012-01-01

    Flow about the NASA Trapezoidal Wing is computed with several turbulence models by using grids from the first High Lift Prediction Workshop in an effort to advance understanding of computational fluid dynamics modeling for this type of flowfield. Transition is accounted for in many of the computations. In particular, a recently-developed 4-equation transition model is utilized and works well overall. Accounting for transition tends to increase lift and decrease moment, which improves the agreement with experiment. Upper surface flap separation is reduced, and agreement with experimental surface pressures and velocity profiles is improved. The predicted shape of wakes from upstream elements is strongly influenced by grid resolution in regions above the main and flap elements. Turbulence model enhancements to account for rotation and curvature have the general effect of increasing lift and improving the resolution of the wing tip vortex as it convects downstream. However, none of the models improve the prediction of surface pressures near the wing tip, where more grid resolution is needed.

  13. Advanced Simulation and Computing: A Summary Report to the Director's Review

    SciTech Connect

    McCoy, M G; Peck, T

    2003-06-01

    It has now been three years since the Advanced Simulation and Computing Program (ASCI), as managed by the Defense and Nuclear Technologies (DNT) Directorate, has been reviewed by this Director's Review Committee (DRC). Since that time, there has been considerable progress in all components of the ASCI Program, and these developments will be highlighted in this document and in the presentations planned for June 9 and 10, 2003. There have also been some name changes. Today, the Program is called "Advanced Simulation and Computing." Although it retains the familiar acronym ASCI, the initiative nature of the effort has given way to sustained services as an integral part of the Stockpile Stewardship Program (SSP). All computing efforts at LLNL and the other two Defense Program (DP) laboratories are funded and managed under ASCI. This includes the so-called legacy codes, which remain essential tools in stockpile stewardship. The contract between the Department of Energy (DOE) and the University of California (UC) specifies an independent appraisal of Directorate technical work and programmatic management; such is the work of this DNT Review Committee. Beginning this year, the Laboratory is implementing a new review system. This process was negotiated between UC, the National Nuclear Security Administration (NNSA), and the Laboratory Directors. Central to this approach are eight performance objectives that focus on key programmatic and administrative goals. Associated with each of these objectives are a number of performance measures to more clearly characterize the attainment of the objectives. Each performance measure has a lead directorate and one or more contributing directorates. Each measure has an evaluation plan and has identified expected documentation to be included in the "Assessment File".

  14. Modeling emergency department operations using advanced computer simulation systems.

    PubMed

    Saunders, C E; Makens, P K; Leblanc, L J

    1989-02-01

    We developed a computer simulation model of emergency department operations using simulation software. This model uses multiple levels of preemptive patient priority; assigns each patient to an individual nurse and physician; incorporates all standard tests, procedures, and consultations; and allows patient service processes to proceed simultaneously, sequentially, repetitively, or a combination of these. Selected input data, including the number of physicians, nurses, and treatment beds, and the blood test turnaround time, then were varied systematically to determine their simulated effect on patient throughput time, selected queue sizes, and rates of resource utilization. Patient throughput time varied directly with laboratory service times and inversely with the number of physician or nurse servers. Resource utilization rates varied inversely with resource availability, and patient waiting time and patient throughput time varied indirectly with the level of patient acuity. The simulation can be animated on a computer monitor, showing simulated patients, specimens, and staff members moving throughout the ED. Computer simulation is a potentially useful tool that can help predict the results of changes in the ED system without actually altering it and may have implications for planning, optimizing resources, and improving the efficiency and quality of care.
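
    The following is a compact discrete-event sketch of this class of ED model, with Poisson arrivals, three triage levels, and a fixed physician pool. All rates are invented, not the study's inputs.

```python
import heapq, random

random.seed(1)
PHYSICIANS, LAM, MU = 3, 0.20, 0.08        # arrivals/min, services/min

def simulate(n=500):
    events, waiting, throughput = [], [], []
    free, t = PHYSICIANS, 0.0
    for i in range(n):                      # pre-schedule Poisson arrivals
        t += random.expovariate(LAM)
        heapq.heappush(events, (t, i, "arrive", random.randint(1, 3)))
    arrived = {}
    while events:
        t, i, kind, prio = heapq.heappop(events)
        if kind == "arrive":
            arrived[i] = t
            heapq.heappush(waiting, (prio, t, i))   # 1 = most acute
        else:                               # departure frees a physician
            free += 1
            throughput.append(t - arrived[i])
        while free and waiting:             # dispatch most acute first
            _, _, j = heapq.heappop(waiting)
            free -= 1
            heapq.heappush(events,
                           (t + random.expovariate(MU), j, "depart", 0))
    return sum(throughput) / len(throughput)

print("mean throughput time: %.1f min" % simulate())
```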

  15. Computer Interfacing to Laboratory Instruments: How to Minimize Noise Interferences.

    ERIC Educational Resources Information Center

    Karpinski, Mary

    1987-01-01

    Discusses the problems of increased noise levels when using microcomputers as interfaces to chemistry laboratory instruments. Describes how to properly connect a laboratory instrument to a microcomputer's A/D converter board. Suggests how to obtain an analog signal free of interference noise. (TW)

  16. Examining Student Outcomes in University Computer Laboratory Environments: Issues for Educational Management

    ERIC Educational Resources Information Center

    Newby, Michael; Marcoulides, Laura D.

    2008-01-01

    Purpose: The purpose of this paper is to model the relationship between student performance, student attitudes, and computer laboratory environments. Design/methodology/approach: Data were collected from 234 college students enrolled in courses that involved the use of a computer to solve problems and provided the laboratory experience by means of…

  17. Beyond a Battery Hen Model?: A Computer Laboratory, Micropolitics and Educational Change

    ERIC Educational Resources Information Center

    Grieshaber, Susan

    2010-01-01

    This paper investigates what happened in one Australian primary school as part of the establishment, use and development of a computer laboratory over a period of two years. As part of a school renewal project, the computer laboratory was introduced as an "innovative" way to improve the skills of teachers and children in information and…

  18. Voting with Their Seats: Computer Laboratory Design and the Casual User

    ERIC Educational Resources Information Center

    Spennemann, Dirk H. R.; Atkinson, John; Cornforth, David

    2007-01-01

    Student computer laboratories are provided by most teaching institutions around the world; however, what is the most effective layout for such facilities? The log-in data files from computer laboratories at a regional university in Australia were analysed to determine whether there was a pattern in student seating. In particular, it was…

  19. Physical and Chemical Properties of the Copper-Alanine System: An Advanced Laboratory Project

    ERIC Educational Resources Information Center

    Farrell, John J.

    1977-01-01

    An integrated physical-analytical-inorganic chemistry laboratory procedure for use with undergraduate biology majors is described. The procedure requires five to six laboratory periods and includes acid-base standardizations, potentiometric determinations, computer usage, spectrophotometric determinations of crystal-field splitting…

  20. PREFACE: 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT2013)

    NASA Astrophysics Data System (ADS)

    Wang, Jianxiong

    2014-06-01

    This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2013), which took place on 16-21 May 2013 at the Institute of High Energy Physics, Chinese Academy of Sciences, Beijing, China. The workshop series brings together computer science researchers and practitioners, and researchers from particle physics and related fields, to explore and confront the boundaries of computing, automatic data analysis, and theoretical calculation techniques. This year's edition of the workshop brought together over 120 participants from all over the world. Eighteen invited speakers presented key topics on the universe in the computer, computing in the Earth sciences, multivariate data analysis, and automated computation in quantum field theory, as well as computing and data-analysis challenges in many fields. Over 70 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. The round-table discussions on open source, knowledge sharing, and scientific collaboration prompted reflection on these issues in the respective areas. ACAT 2013 was generously sponsored by the Chinese Academy of Sciences (CAS), the National Natural Science Foundation of China (NSFC), Brookhaven National Laboratory in the USA (BNL), Peking University (PKU), the Theoretical Physics Center for Science Facilities of CAS (TPCSF-CAS), and Sugon. We would like to thank all the participants for their scientific contributions and for their enthusiastic participation in all the activities of the workshop. Further information on ACAT 2013 can be found at http://acat2013.ihep.ac.cn. Professor Jianxiong Wang Institute of High Energy Physics Chinese Academy of Science Details of committees and sponsors are available in the PDF

  1. SciDAC Advances and Applications in Computational Beam Dynamics

    SciTech Connect

    Ryne, R.; Abell, D.; Adelmann, A.; Amundson, J.; Bohn, C.; Cary, J.; Colella, P.; Dechow, D.; Decyk, V.; Dragt, A.; Gerber, R.; Habib, S.; Higdon, D.; Katsouleas, T.; Ma, K.-L.; McCorquodale, P.; Mihalcea, D.; Mitchell, C.; Mori, W.; Mottershead, C.T.; Neri, F.; Pogorelov, I.; Qiang, J.; Samulyak, R.; Serafini, D.; Shalf, J.; Siegerist, C.; Spentzouris, P.; Stoltz, P.; Terzic, B.; Venturini, M.; Walstrom, P.

    2005-06-26

    SciDAC has had a major impact on computational beam dynamics and the design of particle accelerators. Particle accelerators--which account for half of the facilities in the DOE Office of Science Facilities for the Future of Science 20 Year Outlook--are crucial for US scientific, industrial, and economic competitiveness. Thanks to SciDAC, accelerator design calculations that were once thought impossible are now carried out routinely, and new challenging and important calculations are within reach. SciDAC accelerator modeling codes are being used to get the most science out of existing facilities, to produce optimal designs for future facilities, and to explore advanced accelerator concepts that may hold the key to qualitatively new ways of accelerating charged particle beams. In this poster we present highlights from the SciDAC Accelerator Science and Technology (AST) project Beam Dynamics focus area in regard to algorithm development, software development, and applications.
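
    For a flavor of the single-particle tracking that underlies beam dynamics codes, the sketch below pushes a particle bunch through a simple FODO-style lattice using linear drift and thin-lens quadrupole transfer matrices; the lattice and beam parameters are illustrative and are not taken from any SciDAC code.

        import numpy as np

        def drift(L):                 # transfer matrix of a field-free drift of length L
            return np.array([[1.0, L], [0.0, 1.0]])

        def thin_quad(f):             # thin-lens quadrupole of focal length f
            return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

        # One FODO cell in the horizontal plane: focus, drift, defocus, drift
        cell = drift(1.0) @ thin_quad(-2.0) @ drift(1.0) @ thin_quad(2.0)

        rng = np.random.default_rng(1)
        beam = rng.normal(0.0, [1e-3, 1e-4], size=(10_000, 2)).T   # rows: x [m], x' [rad]
        for _ in range(100):                                       # track through 100 cells
            beam = cell @ beam

        print("rms x after tracking: %.3e m" % beam[0].std())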

  2. Advanced information processing system: Inter-computer communication services

    NASA Technical Reports Server (NTRS)

    Burkhardt, Laura; Masotto, Tom; Sims, J. Terry; Whittredge, Roy; Alger, Linda S.

    1991-01-01

    The purpose is to document the functional requirements and detailed specifications for the Inter-Computer Communications Services (ICCS) of the Advanced Information Processing System (AIPS). An introductory section outlines the overall architecture and functional requirements of the AIPS and presents an overview of the ICCS. An overview of the AIPS architecture as well as a brief description of the AIPS software is given. The guarantees provided by the ICCS are stated, and the ICCS is described as a seven-layer International Standards Organization (ISO) model. The ICCS functional requirements, functional design, and detailed specifications, as well as each layer of the ICCS, are also described. A summary of results and suggestions for future work are presented.

  3. The use of computers in a materials science laboratory

    NASA Technical Reports Server (NTRS)

    Neville, J. P.

    1990-01-01

    The objective is to make available a method of easily recording the microstructure of a sample by means of a computer. The method requires a minimum investment and little or no instruction on the operation of a computer. An outline of the setup involving a black and white TV camera, a digitizer control box, a metallurgical microscope and a computer screen, printer, and keyboard is shown.

  4. Representing Numbers in the Computer: A Laboratory Exercise

    NASA Astrophysics Data System (ADS)

    Glasser, Leslie

    1998-06-01

    The finite memory of a computer creates problems in the representation and manipulation of non-integer ("floating point") quantities; it is explained how such problems can be magnified by inappropriate numerical procedures or ameliorated by careful data treatment. In order to clarify the issues involved, the methods by which characters, integers, and floating point quantities are stored and handled in the computer are outlined. Then, a number of exercises are presented, for operation on both computers and calculators, by which the accuracy of number representation may be judged and which illustrate some of the problems that might arise in naive computational procedures.
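
    Exercises of this kind can be reproduced today in a few lines of Python; the snippet below is an illustrative companion rather than the article's own material, showing a finite binary representation and a catastrophic-cancellation pitfall.

        import math
        import struct

        # 0.1 has no finite binary expansion, so the stored value is only close to 0.1
        print(0.1 + 0.2 == 0.3)                   # False
        print("%.20f" % 0.1)                      # 0.10000000000000000555...
        print(struct.pack(">d", 0.1).hex())       # the eight bytes actually stored

        # Catastrophic cancellation: subtracting nearly equal numbers loses digits.
        x = 1e-8
        naive = 1.0 - math.cos(x)                 # rounds to 0.0: every digit cancels
        stable = 2.0 * math.sin(x / 2.0) ** 2     # algebraically identical, keeps precision
        print(naive, stable)                      # 0.0 vs ~5e-17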

  5. 5 CFR 550.404 - Computation of advance payments and evacuation payments; time periods.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 5 Administrative Personnel 1 2014-01-01 2014-01-01 false Computation of advance payments and... Computation of advance payments and evacuation payments; time periods. (a) Payments shall be based on the rate... others, when applicable, shall be made before advance payments or evacuation payments are made....

  6. 5 CFR 550.404 - Computation of advance payments and evacuation payments; time periods.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 5 Administrative Personnel 1 2013-01-01 2013-01-01 false Computation of advance payments and... Computation of advance payments and evacuation payments; time periods. (a) Payments shall be based on the rate... others, when applicable, shall be made before advance payments or evacuation payments are made....

  7. 'Dry Laboratories' in Science Education; Computer-Based Practical Work.

    ERIC Educational Resources Information Center

    Kirschner, Paul; Huisman, Willibrord

    1998-01-01

    Identifies the problems associated with the use of dry laboratories in science education, presents design considerations for the use of such practicals in science education, and presents examples of innovative nontraditional practicals. Contains 23 references. (DDR)

  8. Reliability of an interactive computer program for advance care planning.

    PubMed

    Schubart, Jane R; Levi, Benjamin H; Camacho, Fabian; Whitehead, Megan; Farace, Elana; Green, Michael J

    2012-06-01

    Despite widespread efforts to promote advance directives (ADs), completion rates remain low. Making Your Wishes Known: Planning Your Medical Future (MYWK) is an interactive computer program that guides individuals through the process of advance care planning, explains health conditions and interventions that commonly involve life-or-death decisions, helps them articulate their values/goals, and translates users' preferences into a detailed AD document. The purpose of this study was to demonstrate that (in the absence of major life changes) the AD generated by MYWK reliably reflects an individual's values/preferences. English speakers ≥30 years old completed MYWK twice, 4 to 6 weeks apart. Reliability indices were assessed for three AD components: General Wishes; Specific Wishes for treatment; and Quality-of-Life values (QoL). Twenty-four participants completed the study. Both the Specific Wishes and QoL scales had high internal consistency in both time periods (Kuder-Richardson formula 20 [KR-20]=0.83-0.95, and 0.86-0.89). Test-retest reliability was perfect for General Wishes (κ=1), high for QoL (Pearson's correlation coefficient=0.83), but lower for Specific Wishes (Pearson's correlation coefficient=0.57). MYWK generates an AD where General Wishes and QoL (but not Specific Wishes) statements remain consistent over time. PMID:22512830
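
    For readers who want to reproduce this style of analysis, the sketch below computes the same three reliability indices (KR-20, Cohen's kappa, and Pearson's r) on made-up binary response data; the arrays are invented, not the study's data.

        import numpy as np
        from scipy.stats import pearsonr

        def kr20(items):                  # items: respondents x binary items
            k = items.shape[1]
            p = items.mean(axis=0)        # proportion endorsing each item
            total_var = items.sum(axis=1).var(ddof=1)
            return (k / (k - 1)) * (1 - (p * (1 - p)).sum() / total_var)

        def cohens_kappa(a, b):           # agreement between two categorical ratings
            cats = np.union1d(a, b)
            po = np.mean(a == b)          # observed agreement
            pe = sum(np.mean(a == c) * np.mean(b == c) for c in cats)  # chance agreement
            return (po - pe) / (1 - pe)

        rng = np.random.default_rng(7)
        time1 = rng.integers(0, 2, size=(24, 10))    # 24 participants, 10 binary items
        time2 = np.where(rng.random((24, 10)) < 0.9, time1, 1 - time1)  # mostly stable answers

        print("KR-20 at time 1: ", round(kr20(time1), 2))
        print("kappa (item 1):  ", round(cohens_kappa(time1[:, 0], time2[:, 0]), 2))
        print("Pearson r (sums):", round(pearsonr(time1.sum(axis=1), time2.sum(axis=1))[0], 2))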

  9. Reliability of an Interactive Computer Program for Advance Care Planning

    PubMed Central

    Levi, Benjamin H.; Camacho, Fabian; Whitehead, Megan; Farace, Elana; Green, Michael J

    2012-01-01

    Despite widespread efforts to promote advance directives (ADs), completion rates remain low. Making Your Wishes Known: Planning Your Medical Future (MYWK) is an interactive computer program that guides individuals through the process of advance care planning, explains health conditions and interventions that commonly involve life-or-death decisions, helps them articulate their values/goals, and translates users' preferences into a detailed AD document. The purpose of this study was to demonstrate that (in the absence of major life changes) the AD generated by MYWK reliably reflects an individual's values/preferences. English speakers ≥30 years old completed MYWK twice, 4 to 6 weeks apart. Reliability indices were assessed for three AD components: General Wishes; Specific Wishes for treatment; and Quality-of-Life values (QoL). Twenty-four participants completed the study. Both the Specific Wishes and QoL scales had high internal consistency in both time periods (Kuder-Richardson formula 20 [KR-20]=0.83–0.95, and 0.86–0.89). Test-retest reliability was perfect for General Wishes (κ=1), high for QoL (Pearson's correlation coefficient=0.83), but lower for Specific Wishes (Pearson's correlation coefficient=0.57). MYWK generates an AD where General Wishes and QoL (but not Specific Wishes) statements remain consistent over time. PMID:22512830

  10. Recent advances in computational structural reliability analysis methods

    NASA Technical Reports Server (NTRS)

    Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.

    1993-01-01

    The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given the fact that many different modes of failure are usually possible, achievement of this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by the application of safety factors and conservative assumptions. However, the safety factor approach lacks a quantitative basis in that the level of reliability is never known and usually results in overly conservative designs because of compounding conservatisms. Furthermore, problem parameters that control the reliability are not identified, nor their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. A significant level of activity in the research and development community was seen recently, much of which was directed towards the prediction of failure probabilities for single mode failures. The focus is to present some early results and demonstrations of advanced reliability methods applied to structural system problems. This includes structures that can fail as a result of multiple component failures (e.g., a redundant truss), or structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonate vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.
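
    The simplest of the computational reliability methods surveyed here is direct Monte Carlo sampling of a limit-state function g(X), with failure defined as g < 0; the cantilever limit state and parameter distributions below are illustrative assumptions, not values from the article.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 1_000_000

        # Illustrative limit state for a cantilever beam: g = yield stress - bending stress
        L = 2.0                                          # beam length [m], deterministic
        P = rng.normal(10_000.0, 2_000.0, n)             # tip load [N], random
        S = rng.lognormal(np.log(1.0e-4), 0.10, n)       # section modulus [m^3], random
        sigma_y = rng.normal(250e6, 20e6, n)             # yield stress [Pa], random

        g = sigma_y - P * L / S                          # g < 0 means failure
        pf = np.mean(g < 0.0)
        se = np.sqrt(pf * (1.0 - pf) / n)                # standard error of the estimate
        print(f"estimated failure probability: {pf:.2e} +/- {se:.1e}")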

  11. Realistic modeling of clinical laboratory operation by computer simulation.

    PubMed

    Vogt, W; Braun, S L; Hanssmann, F; Liebl, F; Berchtold, G; Blaschke, H; Eckert, M; Hoffmann, G E; Klose, S

    1994-06-01

    An important objective of laboratory management is to adjust the laboratory's capability to the needs of patients' care as well as economy. The consequences of management may be changes in laboratory organization, equipment, or personnel planning. At present only one's individual experience can be used for making such decisions. We have investigated whether the techniques of operations research could be transferred to a clinical laboratory and whether an adequate simulation model of the laboratory could be realized. First we listed and documented the system design and the process flow for each single laboratory request. These input data were linked by the simulation model (programming language SIMSCRIPT II.5). The output data (turnaround times, utilization rates, and analysis of queue length) were validated by comparison with the current performance data obtained by tracking specimen flow. Congruence of the data was excellent (within +/- 4%). In planning experiments we could study the consequences of changes in order entry, staffing, and equipment on turnaround times, utilization, and queue lengths. We conclude that simulation can be a valuable tool for better management decisions.
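
    Before committing to a full simulation of this kind, rough turnaround and utilization estimates for a single workstation pool can come from an analytic M/M/c queue; the sketch below implements the standard Erlang C formula with invented arrival and service rates, not the study's data.

        from math import factorial

        def erlang_c(lam, mu, c):
            """Mean queueing delay in an M/M/c system (lam: arrivals/h, mu: services/h per server)."""
            a = lam / mu                                  # offered load in Erlangs
            rho = a / c                                   # utilization, must be < 1 for stability
            assert rho < 1.0, "queue is unstable"
            top = a**c / (factorial(c) * (1.0 - rho))
            p_wait = top / (sum(a**k / factorial(k) for k in range(c)) + top)
            wq = p_wait / (c * mu - lam)                  # mean wait in queue [h]
            return rho, p_wait, wq

        # Illustrative numbers: 30 specimens/h arriving, 8/h per analyzer, 5 analyzers
        rho, p_wait, wq = erlang_c(30.0, 8.0, 5)
        print(f"utilization {rho:.0%}, P(wait) {p_wait:.2f}, mean queue time {60*wq:.1f} min")
        print(f"mean turnaround {60*(wq + 1/8.0):.1f} min")   # queueing plus service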

  12. An expanded framework for the advanced computational testing and simulation toolkit

    SciTech Connect

    Marques, Osni A.; Drummond, Leroy A.

    2003-11-09

    The Advanced Computational Testing and Simulation (ACTS) Toolkit is a set of computational tools developed primarily at DOE laboratories and is aimed at simplifying the solution of common and important computational problems. The use of the tools reduces the development time for new codes and the tools provide functionality that might not otherwise be available. This document outlines an agenda for expanding the scope of the ACTS Project based on lessons learned from current activities. Highlights of this agenda include peer-reviewed certification of new tools; finding tools to solve problems that are not currently addressed by the Toolkit; working in collaboration with other software initiatives and DOE computer facilities; expanding outreach efforts; promoting interoperability, further development of the tools; and improving functionality of the ACTS Information Center, among other tasks. The ultimate goal is to make the ACTS tools more widely used and more effective in solving DOE's and the nation's scientific problems through the creation of a reliable software infrastructure for scientific computing.

  13. Accomplishment Summary 1968-1969. Biological Computer Laboratory.

    ERIC Educational Resources Information Center

    Von Foerster, Heinz; And Others

    This report summarizes theoretical, applied, and experimental studies in the areas of computational principles in complex intelligent systems, cybernetics, multivalued logic, and the mechanization of cognitive processes. This work is summarized under the following topic headings: properties of complex dynamic systems; computers and the language…

  14. TerraFERMA: Harnessing Advanced Computational Libraries in Earth Science

    NASA Astrophysics Data System (ADS)

    Wilson, C. R.; Spiegelman, M.; van Keken, P.

    2012-12-01

    Many important problems in Earth sciences can be described by non-linear coupled systems of partial differential equations. These "multi-physics" problems include thermo-chemical convection in Earth and planetary interiors, interactions of fluids and magmas with the Earth's mantle and crust and coupled flow of water and ice. These problems are of interest to a large community of researchers but are complicated to model and understand. Much of this complexity stems from the nature of multi-physics where small changes in the coupling between variables or constitutive relations can lead to radical changes in behavior, which in turn affect critical computational choices such as discretizations, solvers and preconditioners. To make progress in understanding such coupled systems requires a computational framework where multi-physics problems can be described at a high-level while maintaining the flexibility to easily modify the solution algorithm. Fortunately, recent advances in computational science provide a basis for implementing such a framework. Here we present the Transparent Finite Element Rapid Model Assembler (TerraFERMA), which leverages several advanced open-source libraries for core functionality. FEniCS (fenicsproject.org) provides a high level language for describing the weak forms of coupled systems of equations, and an automatic code generator that produces finite element assembly code. PETSc (www.mcs.anl.gov/petsc) provides a wide range of scalable linear and non-linear solvers that can be composed into effective multi-physics preconditioners. SPuD (amcg.ese.ic.ac.uk/Spud) is an application neutral options system that provides both human and machine-readable interfaces based on a single xml schema. Our software integrates these libraries and provides the user with a framework for exploring multi-physics problems. A single options file fully describes the problem, including all equations, coefficients and solver options. Custom compiled applications are
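
    As a taste of the high-level weak-form style that TerraFERMA builds on, the snippet below solves a Poisson problem with the legacy FEniCS (dolfin) Python interface. This is a generic FEniCS illustration, not TerraFERMA code, and assumes the legacy dolfin package is installed.

        from dolfin import (UnitSquareMesh, FunctionSpace, TrialFunction, TestFunction,
                            Function, Constant, DirichletBC, dot, grad, dx, solve)

        mesh = UnitSquareMesh(32, 32)             # discretize the unit square
        V = FunctionSpace(mesh, "P", 1)           # piecewise-linear Lagrange elements

        u, v = TrialFunction(V), TestFunction(V)
        f = Constant(1.0)                         # source term
        a = dot(grad(u), grad(v)) * dx            # bilinear form of -div(grad u) = f
        L = f * v * dx                            # linear form
        bc = DirichletBC(V, Constant(0.0), "on_boundary")

        uh = Function(V)
        solve(a == L, uh, bc)                     # assembly and linear solve (PETSc underneath)
        print("max u:", uh.vector().max())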

  15. Advances in computer technology: impact on the practice of medicine.

    PubMed

    Groth-Vasselli, B; Singh, K; Farnsworth, P N

    1995-01-01

    Advances in computer technology provide a wide range of applications which are revolutionizing the practice of medicine. The development of new software for the office creates a web of communication among physicians, staff members, health care facilities and associated agencies. This provides the physician with the prospect of a paperless office. At the other end of the spectrum, the development of 3D work stations and software based on computational chemistry permits visualization of protein molecules involved in disease. Computer assisted molecular modeling has been used to construct working 3D models of lens alpha-crystallin. The 3D structure of alpha-crystallin is basic to our understanding of the molecular mechanisms involved in lens fiber cell maturation, stabilization of the inner nuclear region, the maintenance of lens transparency and cataractogenesis. The major component of the high molecular weight aggregates that occur during cataractogenesis is alpha-crystallin subunits. Subunits of alpha-crystallin occur in other tissues of the body. In the central nervous system accumulation of these subunits in the form of dense inclusion bodies occurs in pathological conditions such as Alzheimer's disease, Huntington's disease, multiple sclerosis and toxoplasmosis (Iwaki, Wisniewski et al., 1992), as well as neoplasms of astrocyte origin (Iwaki, Iwaki, et al., 1991). Also cardiac ischemia is associated with an increased alpha B synthesis (Chiesi, Longoni et al., 1990). On a more global level, the molecular structure of alpha-crystallin may provide information pertaining to the function of small heat shock proteins, hsp, in maintaining cell stability under the stress of disease.

  16. Computer Aided Design: Instructional Manual. The North Dakota High Technology Mobile Laboratory Project.

    ERIC Educational Resources Information Center

    Cheng, Wan-Lee

    This instructional manual contains 12 learning activity packets for use in a workshop in computer-aided design and drafting (CADD). The lessons cover the following topics: introduction to computer graphics and computer-aided design/drafting; coordinate systems; advanced space graphics hardware configuration and basic features of the IBM PC…

  17. The Scanning Electron Microscope As An Accelerator For The Undergraduate Advanced Physics Laboratory

    SciTech Connect

    Peterson, Randolph S.; Berggren, Karl K.; Mondol, Mark

    2011-06-01

    Few universities or colleges have an accelerator for use with advanced physics laboratories, but many of these institutions have a scanning electron microscope (SEM) on site, often in the biology department. As an accelerator for the undergraduate, advanced physics laboratory, the SEM is an excellent substitute for an ion accelerator. Although there are no nuclear physics experiments that can be performed with a typical 30 kV SEM, there is an opportunity for experimental work on accelerator physics, atomic physics, electron-solid interactions, and the basics of modern e-beam lithography.
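
    A worked example of the accelerator-physics content available at SEM energies: the relativistic kinematics of a 30 keV electron can be computed in a few lines using standard constants. This is a generic calculation, not taken from the article.

        import math

        m_e_c2 = 510.99895e3        # electron rest energy [eV]
        c = 2.99792458e8            # speed of light [m/s]
        h = 4.135667696e-15         # Planck constant [eV s]

        K = 30e3                    # 30 kV accelerating potential -> 30 keV kinetic energy
        gamma = 1.0 + K / m_e_c2
        beta = math.sqrt(1.0 - 1.0 / gamma**2)
        p = gamma * beta * m_e_c2 / c              # momentum [eV s/m]
        lam = h / p                                # de Broglie wavelength [m]

        print(f"beta = {beta:.4f} (v = {beta*c:.3e} m/s)")
        print(f"relativistic wavelength = {lam*1e12:.2f} pm")   # ~7 pm at 30 kV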

  18. A Computer-Assisted Laboratory Sequence for Petroleum Geology.

    ERIC Educational Resources Information Center

    Lumsden, David N.

    1979-01-01

    Describes a competitive oil-play game for petroleum geology students. It is accompanied by a computer program written in interactive Fortran. The program, however, is not essential, but useful for adding more interest. (SA)

  19. Modification of Simple Computer Models as a Laboratory Exercise.

    ERIC Educational Resources Information Center

    Stratton, Lewis P.

    1983-01-01

    Describes an exercise (using Apple microcomputer) which provides an introduction to computer simulation as a biological research tool. Includes examples of students' modeling programs and partial program listings for programs developed by students from an original model. (JN)

  20. Experimental and computing strategies in advanced material characterization problems

    SciTech Connect

    Bolzon, G.

    2015-10-28

    The mechanical characterization of materials relies more and more often on sophisticated experimental methods that permit the acquisition of large amounts of data while, at the same time, reducing the invasiveness of the tests. This evolution accompanies the growing demand for non-destructive diagnostic tools that assess the safety level of components in use in structures and infrastructures, for instance in the strategic energy sector. Advanced material systems and properties that are not amenable to traditional techniques, for instance thin layered structures and their adhesion to the relevant substrates, can also be characterized by means of combined experimental-numerical tools elaborating data acquired by full-field measurement techniques. In this context, parameter identification procedures involve the repeated simulation of the laboratory or in situ tests by sophisticated and usually expensive non-linear analyses while, in some situations, reliable and accurate results would be required in real time. The effectiveness and the filtering capabilities of reduced models based on decomposition and interpolation techniques can be profitably used to meet these conflicting requirements. This communication summarizes some results recently achieved in this field by the author and her co-workers. The aim is to foster further interaction between the engineering and mathematical communities.

  1. Experimental and computing strategies in advanced material characterization problems

    NASA Astrophysics Data System (ADS)

    Bolzon, G.

    2015-10-01

    The mechanical characterization of materials relies more and more often on sophisticated experimental methods that permit the acquisition of large amounts of data while, at the same time, reducing the invasiveness of the tests. This evolution accompanies the growing demand for non-destructive diagnostic tools that assess the safety level of components in use in structures and infrastructures, for instance in the strategic energy sector. Advanced material systems and properties that are not amenable to traditional techniques, for instance thin layered structures and their adhesion to the relevant substrates, can also be characterized by means of combined experimental-numerical tools elaborating data acquired by full-field measurement techniques. In this context, parameter identification procedures involve the repeated simulation of the laboratory or in situ tests by sophisticated and usually expensive non-linear analyses while, in some situations, reliable and accurate results would be required in real time. The effectiveness and the filtering capabilities of reduced models based on decomposition and interpolation techniques can be profitably used to meet these conflicting requirements. This communication summarizes some results recently achieved in this field by the author and her co-workers. The aim is to foster further interaction between the engineering and mathematical communities.
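
    A toy version of the parameter-identification loop described above, replacing the expensive nonlinear simulation with a cheap exponential stand-in model fit to synthetic measurements via scipy; the model, parameters, and data are entirely illustrative.

        import numpy as np
        from scipy.optimize import least_squares

        def model(theta, t):                  # stand-in for an expensive forward simulation
            E, eta = theta                    # e.g. a stiffness and a relaxation constant
            return E * np.exp(-t / eta)

        rng = np.random.default_rng(3)
        t = np.linspace(0.0, 5.0, 50)
        theta_true = (210.0, 1.5)
        data = model(theta_true, t) + rng.normal(0.0, 2.0, t.size)   # noisy "measurements"

        res = least_squares(lambda th: model(th, t) - data, x0=[100.0, 1.0],
                            bounds=([0.0, 0.1], [1e3, 10.0]))
        print("identified parameters:", res.x)    # should be close to (210, 1.5)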

  2. Recovery Act: Advanced Direct Methanol Fuel Cell for Mobile Computing

    SciTech Connect

    Fletcher, James H.; Cox, Philip; Harrington, William J; Campbell, Joseph L

    2013-09-03

    ABSTRACT Project Title: Recovery Act: Advanced Direct Methanol Fuel Cell for Mobile Computing PROJECT OBJECTIVE The objective of the project was to advance portable fuel cell system technology towards the commercial targets of power density, energy density and lifetime. These targets were laid out in the DOE’s R&D roadmap to develop an advanced direct methanol fuel cell power supply that meets commercial entry requirements. Such a power supply will enable mobile computers to operate non-stop, unplugged from the wall power outlet, by using the high energy density of methanol fuel contained in a replaceable fuel cartridge. Specifically this project focused on balance-of-plant component integration and miniaturization, as well as extensive component, subassembly and integrated system durability and validation testing. This design has resulted in a pre-production power supply design and a prototype that meet the rigorous demands of consumer electronic applications. PROJECT TASKS The proposed work plan was designed to meet the project objectives, which corresponded directly with the objectives outlined in the Funding Opportunity Announcement: To engineer the fuel cell balance-of-plant and packaging to meet the needs of consumer electronic systems, specifically at power levels required for mobile computing. UNF used existing balance-of-plant component technologies developed under its current US Army CERDEC project, as well as a previous DOE project completed by PolyFuel, to further refine them to both miniaturize and integrate their functionality to increase the system power density and energy density. Benefits of UNF’s novel passive water recycling MEA (membrane electrode assembly) and the simplified system architecture it enabled formed the foundation of the design approach. The package design was hardened to address orientation independence, shock, vibration, and environmental requirements. Fuel cartridge and fuel subsystems were improved to ensure effective fuel

  3. A fission matrix based validation protocol for computed power distributions in the advanced test reactor

    SciTech Connect

    Nielsen, J. W.; Nigg, D. W.; LaPorta, A. W.

    2013-07-01

    The Idaho National Laboratory (INL) has been engaged in a significant multiyear effort to modernize the computational reactor physics tools and validation procedures used to support operations of the Advanced Test Reactor (ATR) and its companion critical facility (ATRC). Several new protocols for validation of computed neutron flux distributions and spectra, as well as for validation of computed fission power distributions, based on new experiments and well-recognized least-squares statistical analysis techniques, have been under development. In the case of power distributions, estimates of the a priori ATR-specific fuel element-to-element fission power correlation and covariance matrices are required for validation analysis. A practical method for generating these matrices using the element-to-element fission matrix is presented, along with a high-order scheme for estimating the underlying fission matrix itself. The proposed methodology is illustrated using the MCNP5 neutron transport code for the required neutronics calculations. The general approach is readily adaptable for implementation using any multidimensional stochastic or deterministic transport code that offers the required level of spatial, angular, and energy resolution in the computed solution for the neutron flux and fission source. (authors)
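
    The fundamental mode of a fission matrix, the element-to-element power shape at the heart of such a validation protocol, is its dominant eigenvector, which plain power iteration recovers. The small random matrix below is purely illustrative, not an ATR fission matrix.

        import numpy as np

        rng = np.random.default_rng(5)
        n = 40                                        # number of fuel elements (illustrative)
        F = rng.random((n, n)) * np.exp(-np.abs(np.subtract.outer(range(n), range(n))) / 5.0)
        # F[i, j] ~ fissions produced in element i per fission in element j, decaying with distance

        psi = np.ones(n) / n                          # initial flat fission-source guess
        for _ in range(200):                          # power iteration
            phi = F @ psi
            k_eff = phi.sum() / psi.sum()             # dominant-eigenvalue estimate
            psi = phi / phi.sum()                     # renormalized fission source

        print("dominant eigenvalue estimate:", round(k_eff, 4))
        print("peak-to-average power:", round(psi.max() / psi.mean(), 3))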

  4. Performance of computational tools in evaluating the functional impact of laboratory-induced amino acid mutations.

    PubMed

    Gray, Vanessa E; Kukurba, Kimberly R; Kumar, Sudhir

    2012-08-15

    Site-directed mutagenesis is frequently used by scientists to investigate the functional impact of amino acid mutations in the laboratory. Over 10,000 such laboratory-induced mutations have been reported in the UniProt database along with the outcomes of functional assays. Here, we explore the performance of state-of-the-art computational tools (Condel, PolyPhen-2 and SIFT) in correctly annotating the function-altering potential of 10,913 laboratory-induced mutations from 2372 proteins. We find that computational tools are very successful in diagnosing laboratory-induced mutations that elicit significant functional change in the laboratory (up to 92% accuracy). But, these tools consistently fail in correctly annotating laboratory-induced mutations that show no functional impact in the laboratory assays. Therefore, the overall accuracy of computational tools for laboratory-induced mutations is much lower than that observed for the naturally occurring human variants. We tested and rejected the possibilities that the preponderance of changes to alanine and the presence of multiple base-pair mutations in the laboratory were the reasons for the observed discordance between the performance of computational tools for natural and laboratory mutations. Instead, we discover that the laboratory-induced mutations occur predominantly at the highly conserved positions in proteins, where the computational tools have the lowest accuracy of correct prediction for variants that do not impact function (neutral). Therefore, the comparisons of experimental-profiling results with those from computational predictions need to be sensitive to the evolutionary conservation of the positions harboring the amino acid change. PMID:22685075
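
    The per-class accuracy effect reported above, strong performance on function-altering mutations but weak performance on neutral ones, can be made concrete with a small synthetic calculation; the labels and flagging rates below are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(11)
        truth = rng.random(10_000) < 0.7              # True = function-altering in the assay
        p_flag = np.where(truth, 0.92, 0.80)          # a biased predictor flags "damaging" at these rates
        pred = rng.random(10_000) < p_flag

        overall = np.mean(pred == truth)
        on_altering = np.mean(pred[truth])            # accuracy on function-altering mutations
        on_neutral = np.mean(~pred[~truth])           # accuracy on neutral mutations
        print(f"overall {overall:.2f}, altering {on_altering:.2f}, neutral {on_neutral:.2f}")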

  5. The development of an engineering computer graphics laboratory

    NASA Technical Reports Server (NTRS)

    Anderson, D. C.; Garrett, R. E.

    1975-01-01

    Hardware and software systems developed to further research and education in interactive computer graphics were described, as well as several of the ongoing application-oriented projects, educational graphics programs, and graduate research projects. The software system consists of a FORTRAN 4 subroutine package, in conjunction with a PDP 11/40 minicomputer as the primary computation processor and the Imlac PDS-1 as an intelligent display processor. The package comprises a comprehensive set of graphics routines for dynamic, structured two-dimensional display manipulation, and numerous routines to handle a variety of input devices at the Imlac.

  6. Development, Evaluation and Use of a Student Experience Survey in Undergraduate Science Laboratories: The Advancing Science by Enhancing Learning in the Laboratory Student Laboratory Learning Experience Survey

    NASA Astrophysics Data System (ADS)

    Barrie, Simon C.; Bucat, Robert B.; Buntine, Mark A.; Burke da Silva, Karen; Crisp, Geoffrey T.; George, Adrian V.; Jamie, Ian M.; Kable, Scott H.; Lim, Kieran F.; Pyke, Simon M.; Read, Justin R.; Sharma, Manjula D.; Yeung, Alexandra

    2015-07-01

    Student experience surveys have become increasingly popular to probe various aspects of processes and outcomes in higher education, such as measuring student perceptions of the learning environment and identifying aspects that could be improved. This paper reports on a particular survey for evaluating individual experiments that has been developed over some 15 years as part of a large national Australian study pertaining to the area of undergraduate laboratories: Advancing Science by Enhancing Learning in the Laboratory. It reports on the development of the survey instrument and the evaluation of the survey using student responses to experiments from different institutions in Australia, New Zealand and the USA. A total of 3153 student responses have been analysed using factor analysis. Three factors, motivation, assessment and resources, have been identified as contributing to improved student attitudes to laboratory activities. A central focus of the survey is to provide feedback to practitioners to iteratively improve experiments. Implications for practitioners and researchers are also discussed.
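
    A minimal sketch of the exploratory factor analysis workflow used on such survey data, run on synthetic Likert-style responses with scikit-learn; the three-factor structure mirrors the factors named above, but the data and loadings are invented.

        import numpy as np
        from sklearn.decomposition import FactorAnalysis

        rng = np.random.default_rng(2)
        n = 3153                                        # responses, as in the study
        latent = rng.normal(size=(n, 3))                # motivation, assessment, resources
        loadings = rng.uniform(0.5, 1.0, size=(3, 12))  # 12 survey items, 4 per factor
        mask = np.kron(np.eye(3), np.ones((1, 4)))      # each item loads on one factor only
        X = latent @ (loadings * mask) + rng.normal(0.0, 0.4, size=(n, 12))

        fa = FactorAnalysis(n_components=3, rotation="varimax").fit(X)
        print(np.round(fa.components_, 2))              # recovered loading pattern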

  7. An Advanced Undergraduate Chemistry Laboratory Experiment Exploring NIR Spectroscopy and Chemometrics

    ERIC Educational Resources Information Center

    Wanke, Randall; Stauffer, Jennifer

    2007-01-01

    An advanced undergraduate chemistry laboratory experiment to study the advantages and hazards of the coupling of NIR spectroscopy and chemometrics is described. The combination is commonly used for analysis and process control of various ingredients used in agriculture, petroleum and food products.

  8. Understanding Fluorescence Measurements through a Guided-Inquiry and Discovery Experiment in Advanced Analytical Laboratory

    ERIC Educational Resources Information Center

    Wilczek-Vera, Grazyna; Salin, Eric Dunbar

    2011-01-01

    An experiment on fluorescence spectroscopy suitable for an advanced analytical laboratory is presented. Its conceptual development used a combination of the expository and discovery styles. The "learn-as-you-go" and direct "hands-on" methodology applied ensures an active role for a student in the process of visualization and discovery of concepts.…

  9. Ring-Closing Metathesis: An Advanced Guided-Inquiry Experiment for the Organic Laboratory

    ERIC Educational Resources Information Center

    Schepmann, Hala G.; Mynderse, Michelle

    2010-01-01

    The design and implementation of an advanced guided-inquiry experiment for the organic laboratory is described. Grubbs's second-generation catalyst is used to effect the ring-closing metathesis of diethyl diallylmalonate. The reaction is carried out under an inert atmosphere at room temperature and monitored by argentic TLC. The crude reaction is…

  10. Adapting Advanced Inorganic Chemistry Lecture and Laboratory Instruction for a Legally Blind Student

    ERIC Educational Resources Information Center

    Miecznikowski, John R.; Guberman-Pfeffer, Matthew J.; Butrick, Elizabeth E.; Colangelo, Julie A.; Donaruma, Cristine E.

    2015-01-01

    In this article, the strategies and techniques used to successfully teach advanced inorganic chemistry, in the lecture and laboratory, to a legally blind student are described. At Fairfield University, these separate courses, which have a physical chemistry corequisite or a prerequisite, are taught for junior and senior chemistry and biochemistry…

  11. Advanced Undergraduate-Laboratory Experiment on Electron Spin Resonance in Single-Crystal Ruby

    ERIC Educational Resources Information Center

    Collins, Lee A.; And Others

    1974-01-01

    An electron-spin-resonance experiment which has been successfully performed in an advanced undergraduate physics laboratory is described. A discussion of that part of the theory of magnetic resonance necessary for the understanding of the experiment is also provided in this article. (DT)

  12. Precious bits: frame synchronization in Jet Propulsion Laboratory's Advanced Multi-Mission Operations System (AMMOS)

    NASA Technical Reports Server (NTRS)

    Wilson, E.

    2001-01-01

    The Jet Propulsion Laboratory (JPL) Advanced Multi-Mission Operations System (AMMOS) processes data received from deep-space spacecraft, where error rates are high, bit rates are low, and every bit is precious. Frame synchronization and data extraction as performed by AMMOS enhance data acquisition and reliability for maximum data return and validity.
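
    Frame synchronization at its core is a search for a known attached sync marker (ASM) in a noisy bit stream. The sketch below does a brute-force correlation search with a tolerance for bit errors; the 32-bit CCSDS marker 0x1ACFFC1D is used for concreteness, while the frame length and error rate are invented and nothing here reflects the AMMOS implementation.

        import numpy as np

        ASM = 0x1ACFFC1D                                 # CCSDS 32-bit attached sync marker
        marker = np.array([(ASM >> i) & 1 for i in range(31, -1, -1)], dtype=np.uint8)

        rng = np.random.default_rng(4)
        frame_len = 1024                                 # bits per frame (illustrative)
        stream = rng.integers(0, 2, 5 * frame_len, dtype=np.uint8)
        for start in range(100, stream.size - frame_len, frame_len):
            stream[start:start + 32] = marker            # plant markers at a fixed cadence
        stream ^= (rng.random(stream.size) < 0.01)       # 1% channel bit errors

        # Slide the marker along the stream and accept positions within a Hamming threshold
        windows = np.lib.stride_tricks.sliding_window_view(stream, 32)
        mismatches = (windows != marker).sum(axis=1)
        hits = np.flatnonzero(mismatches <= 3)           # tolerate up to 3 flipped bits
        print("marker found at bit offsets:", hits, "spacing:", np.diff(hits))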

  13. Three Computer Programs for Use in Introductory Level Physics Laboratories.

    ERIC Educational Resources Information Center

    Kagan, David T.

    1984-01-01

    Describes three computer programs which operate on Apple II+ microcomputers: (1) a menu-driven graph drawing program; (2) a simulation of the Millikan oil drop experiment; and (3) a program used to study the half-life of silver. (Instructions for obtaining the programs from the author are included.) (JN)

  14. Computer Assisted Laboratory Problems for Teaching Business and Economic Statistics.

    ERIC Educational Resources Information Center

    Moore, Charles N.

    A computer-based Statistical Program to Assist in Teaching Statistics (SPATS) has been successfully developed to aid the teaching of statistics to undergraduates with business and economics majors. SPATS simplifies the problem of experimentally creating and analyzing a variety of populations and of selecting and analyzing different kinds of random…

  15. Quantitative Computed Tomography and Image Analysis for Advanced Muscle Assessment

    PubMed Central

    Edmunds, Kyle Joseph; Gíslason, Magnus K.; Arnadottir, Iris D.; Marcante, Andrea; Piccione, Francesco; Gargiulo, Paolo

    2016-01-01

    Medical imaging is of particular interest in the field of translational myology, as the extant literature describes the utilization of a wide variety of techniques to non-invasively recapitulate and quantify various internal and external tissue morphologies. In the clinical context, medical imaging remains a vital tool for diagnostics and investigative assessment. This review outlines the results from several investigations on the use of computed tomography (CT) and image analysis techniques to assess muscle conditions and degenerative processes due to aging or pathological conditions. Herein, we detail the acquisition of spiral CT images and the use of advanced image analysis tools to characterize muscles in 2D and 3D. Results from these studies recapitulate changes in tissue composition within muscles, as visualized by the association of tissue types to specified Hounsfield Unit (HU) values for fat, loose connective tissue or atrophic muscle, and normal muscle, including fascia and tendon. We show how results from these analyses can be presented as both average HU values and compositions with respect to total muscle volumes, demonstrating the reliability of these tools to monitor, assess, and characterize muscle degeneration. PMID:27478562
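
    The HU-based tissue classification described here reduces to thresholding the CT volume into labeled bins and reporting volume fractions. In the sketch below both the volume and the HU windows are representative stand-ins for the kind of values reported in this literature, not the exact ones from these studies.

        import numpy as np

        rng = np.random.default_rng(8)
        volume = rng.normal(30.0, 60.0, size=(64, 64, 40))      # synthetic HU values

        bins = {                     # illustrative HU windows for a muscle cross-section
            "fat":              (-200, -10),
            "loose connective": (-9, 40),
            "normal muscle":    (41, 150),
        }
        inside = (volume >= -200) & (volume <= 150)             # voxels counted as tissue
        total = inside.sum()
        for tissue, (lo, hi) in bins.items():
            frac = ((volume >= lo) & (volume <= hi)).sum() / total
            print(f"{tissue:18s} {100 * frac:5.1f}% of segmented volume")
        print(f"mean HU over tissue: {volume[inside].mean():6.1f}")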

  16. Advanced computational sensors technology: testing and evaluation in visible, SWIR, and LWIR imaging

    NASA Astrophysics Data System (ADS)

    Rizk, Charbel G.; Wilson, John P.; Pouliquen, Philippe

    2015-05-01

    The Advanced Computational Sensors Team at the Johns Hopkins University Applied Physics Laboratory and the Johns Hopkins University Department of Electrical and Computer Engineering has been developing advanced readout integrated circuit (ROIC) technology for more than 10 years with a particular focus on the key challenges of dynamic range, sampling rate, system interface and bandwidth, and detector materials or band dependencies. Because the pixel array offers parallel sampling by default, the team successfully demonstrated that adding smarts in the pixel and the chip can increase performance significantly. Each pixel becomes a smart sensor and can operate independently in collecting, processing, and sharing data. In addition, building on the digital circuit revolution, the effective well size can be increased by orders of magnitude within the same pixel pitch over analog designs. This research has yielded an innovative class of a system-on-chip concept: the Flexible Readout and Integration Sensor (FRIS) architecture. All key parameters are programmable and/or can be adjusted dynamically, and this architecture can potentially be sensor and application agnostic. This paper reports on the testing and evaluation of one prototype that can support either detector polarity and includes sample results with visible, short-wavelength infrared (SWIR), and long-wavelength infrared (LWIR) imaging.

  17. Innovation of laboratory exercises in course Distributed systems and computer networks

    NASA Astrophysics Data System (ADS)

    Souček, Pavel; Slavata, Oldřich; Holub, Jan

    2013-09-01

    This paper is focused on the innovation of laboratory exercises in the course Distributed Systems and Computer Networks. These exercises were introduced in November 2012 and replaced older exercises in order to better reflect real-life applications.

  18. Status report on the Advanced Photon Source Project at Argonne National Laboratory

    SciTech Connect

    Huebner, R.H. Sr.

    1989-01-01

    The Advanced Photon Source at Argonne National Laboratory is designed as a national synchrotron radiation user facility which will provide extremely bright, highly energetic x-rays for multidisciplinary research. When operational, the Advanced Photon Source will accelerate positrons to a nominal energy of 7 GeV. The positrons will be manipulated by insertion devices to produce x-rays 10,000 times brighter than any currently available for research. Accelerator components, insertion devices, optical elements, and optical-element cooling schemes have been and continue to be the subjects of intensive research and development. A call for Letters of Intent from prospective users of the Advanced Photon Source has resulted in a substantial response from industrial, university, and national laboratory researchers.

  19. Status report on the Advanced Photon Source Project at Argonne National Laboratory

    SciTech Connect

    Huebner, R.H. Sr.

    1989-12-31

    The Advanced Photon Source at Argonne National Laboratory is designed as a national synchrotron radiation user facility which will provide extremely bright, highly energetic x-rays for multidisciplinary research. When operational, the Advanced Photon Source will accelerate positrons to a nominal energy of 7 GeV. The positrons will be manipulated by insertion devices to produce x-rays 10,000 times brighter than any currently available for research. Accelerator components, insertion devices, optical elements, and optical-element cooling schemes have been and continue to be the subjects of intensive research and development. A call for Letters of Intent from prospective users of the Advanced Photon Source has resulted in a substantial response from industrial, university, and national laboratory researchers.

  20. Advanced computational tools for optimization and uncertainty quantification of carbon capture processes

    SciTech Connect

    Miller, David C.; Ng, Brenda; Eslick, John

    2014-01-01

    Advanced multi-scale modeling and simulation has the potential to dramatically reduce development time, resulting in considerable cost savings. The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and universities that is developing, demonstrating, and deploying a suite of multi-scale modeling and simulation tools. One significant computational tool is FOQUS, a Framework for Optimization and Quantification of Uncertainty and Sensitivity, which enables basic data submodels, including thermodynamics and kinetics, to be used within detailed process models to rapidly synthesize and optimize a process and determine the level of uncertainty associated with the resulting process. The overall approach of CCSI is described with a more detailed discussion of FOQUS and its application to carbon capture systems.

  1. Design and experimental analysis of an advanced static VAR compensator with computer aided control.

    PubMed

    Irmak, Erdal; Bayındır, Ramazan; Köse, Ali

    2016-09-01

    This study presents the integration of a real-time energy monitoring and control system with an advanced reactive power compensation unit based on a fixed capacitor-thyristor controlled reactor (FC-TCR). Firing angles of the thyristors located in the FC-TCR are controlled by a microcontroller in order to keep the power factor within the prescribed limits. Electrical parameters of the system are measured by specially designed circuits and simultaneously transferred to the computer via a data acquisition board, so real-time data from the system can be observed through a visual user interface. The data obtained are not only analyzed for the control process but also regularly saved into a database. The system has been tested in laboratory conditions under different load characteristics, and the experimental results verified that the system achieves the compensation process successfully and accurately under all operating conditions.
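
    The control problem in an FC-TCR is to pick the thyristor firing angle α whose fundamental-frequency susceptance cancels the measured reactive power. A textbook sketch (not the authors' controller) solves the standard TCR relation B(α) = (2(π − α) + sin 2α)/(πX_L) numerically; the reactance value is illustrative.

        import math
        from scipy.optimize import brentq

        X_L = 10.0                                   # reactor reactance [ohm] (illustrative)

        def b_tcr(alpha):
            """Fundamental susceptance of a TCR for firing angle alpha in [pi/2, pi] rad."""
            return (2.0 * (math.pi - alpha) + math.sin(2.0 * alpha)) / (math.pi * X_L)

        def firing_angle(b_required):
            """Invert B(alpha) = b_required; B falls monotonically from 1/X_L to 0."""
            return brentq(lambda a: b_tcr(a) - b_required, math.pi / 2.0, math.pi)

        # Example: absorb half of the reactor's full-conduction reactive power
        alpha = firing_angle(0.5 / X_L)
        print(f"firing angle: {math.degrees(alpha):.1f} deg")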

  2. A Matched-Pairs Study of Interactive Computer Laboratory Activities in a Liberal Arts Math Course

    ERIC Educational Resources Information Center

    Butler, Frederick; Butler, Melanie

    2011-01-01

    This paper details the culmination of a large, multi-year study on the effects of an interactive computer laboratory component in a large liberal arts math course at a state university. After several semesters of piloting these laboratory activities in the course, one of two sections, taught by the same senior instructor, was randomly selected to…

  3. Students' Cognitive Focus during a Chemistry Laboratory Exercise: Effects of a Computer-Simulated Prelab

    ERIC Educational Resources Information Center

    Winberg, T. Mikael; Berg, C. Anders R.

    2007-01-01

    To enhance the learning outcomes achieved by students, learners undertook a computer-simulated activity based on an acid-base titration prior to a university-level chemistry laboratory activity. Students were categorized with respect to their attitudes toward learning. During the laboratory exercise, questions that students asked their assistant…

  4. Computer-Guided Experimentation-A New System for Laboratory Instruction.

    ERIC Educational Resources Information Center

    Neal, J. P.; Meller, D. V.

    This paper reports the development, operation, and initial performance evaluations of an electrical engineering laboratory station equipped for computer-guided experimentation (CGE). A practical evaluation of the actual instructional value of two programed lessons utilizing this new system for laboratory instruction and experimentation is also…

  5. Recent advances in data assimilation in computational geodynamic models

    NASA Astrophysics Data System (ADS)

    Ismail-Zadeh, Alik

    2010-05-01

    The QRV method was most recently introduced in geodynamic modelling (Ismail-Zadeh et al., 2007, 2008; Tantsyrev, 2008; Glisovic et al., 2009). The advances in computational geodynamics and in data assimilation attract the interest of the community dealing with lithosphere, mantle and core dynamics.

  6. Process for selecting NEAMS applications for access to Idaho National Laboratory high performance computing resources

    SciTech Connect

    Michael Pernice

    2010-09-01

    INL has agreed to provide participants in the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program with access to its high performance computing (HPC) resources under sponsorship of the Enabling Computational Technologies (ECT) program element. This report documents the process used to select applications and the software stack in place at INL.

  7. A User Assessment of Workspaces in Selected Music Education Computer Laboratories.

    ERIC Educational Resources Information Center

    Badolato, Michael Jeremy

    A study of 120 students selected from the user populations of four music education computer laboratories was conducted to determine the applicability of current ergonomic and environmental design guidelines in satisfying the needs of users of educational computing workspaces. Eleven categories of workspace factors were organized into a…

  8. Computation of Chemical Shifts for Paramagnetic Molecules: A Laboratory Experiment for the Undergraduate Curriculum

    ERIC Educational Resources Information Center

    Pritchard, Benjamin P.; Simpson, Scott; Zurek, Eva; Autschbach, Jochen

    2014-01-01

    A computational experiment investigating the ¹H and ¹³C nuclear magnetic resonance (NMR) chemical shifts of molecules with unpaired electrons has been developed and implemented. This experiment is appropriate for an upper-level undergraduate laboratory course in computational, physical, or inorganic chemistry. The…

  9. Planning for Academic Computing Laboratories: Haves, Have Nots, and Student Uses of Information Technology.

    ERIC Educational Resources Information Center

    Kleen, Betty A.; Shell, L. Wayne; Wells-Roger, Craig A.

    The researchers investigated the intensity of computer lab use by a broad representation of students at their university. The purpose was to ascertain adequacy of computer laboratory hardware, software, and hours of access. Additionally, the researchers needed to answer a social policy question, and wished to determine who was making most use of…

  10. Knowledge Retention for Computer Simulations: A study comparing virtual and hands-on laboratories

    NASA Astrophysics Data System (ADS)

    Croom, John R., III

    The use of virtual laboratories has the potential to change physics education. These low-cost, interactive computer activities interest students, allow for easy setup, and give educators a way to teach laboratory-based online classes. This study investigated whether virtual laboratories could replace traditional hands-on laboratories and whether students could retain the same long-term knowledge from virtual laboratories as from hands-on laboratories. The study is a quantitative quasi-experiment that used a multiple posttest design to determine whether students using virtual laboratories would retain the same knowledge as students who performed hands-on laboratories after 9 weeks. The study comprised 336 students from 14 school districts. Students' performance on the laboratories and their retention were compared against a series of factors that might have affected retention, using a pretest and two posttests whose scores were compared with a t test. The results showed no significant difference in short-term learning between the hands-on laboratory groups and virtual laboratory groups. There was, however, a significant difference (p = .005) between the groups in long-term retention; students in the hands-on laboratory groups retained more information than those in the virtual laboratory groups. These results suggest that long-term learning is enhanced when a laboratory contains a hands-on component. Finally, the results showed that both groups of students felt their particular laboratory style was superior to the alternative method. The findings of this study can be used to improve the integration of virtual laboratories into the science curriculum.

  11. Important advances in technology and unique applications to cardiovascular computed tomography.

    PubMed

    Chaikriangkrai, Kongkiat; Choi, Su Yeon; Nabi, Faisal; Chang, Su Min

    2014-01-01

    For the past decade, multidetector cardiac computed tomography and its main application, coronary computed tomography angiography, have been established as a noninvasive technique for anatomical assessment of coronary arteries. This new era of coronary artery evaluation by coronary computed tomography angiography has arisen from the rapid advancement in computed tomography technology, which has led to massive diagnostic and prognostic clinical studies in various patient populations. This article gives a brief overview of current multidetector cardiac computed tomography systems, developing cardiac computed tomography technologies in both hardware and software fields, innovative radiation exposure reduction measures, multidetector cardiac computed tomography functional studies, and their newer clinical applications beyond coronary computed tomography angiography. PMID:25574342

  12. First 3 years of operation of RIACS (Research Institute for Advanced Computer Science) (1983-1985)

    NASA Technical Reports Server (NTRS)

    Denning, P. J.

    1986-01-01

    The focus of the Research Institute for Advanced Computer Science (RIACS) is to explore matches between advanced computing architectures and the processes of scientific research. An architecture evaluation of the MIT static dataflow machine, specification of a graphical language for expressing distributed computations, and specification of an expert system for aiding in grid generation for two-dimensional flow problems were initiated. Research projects for 1984 and 1985 are summarized.

  13. Computer simulation and laboratory work in the teaching of mechanics

    NASA Astrophysics Data System (ADS)

    Borghi, L.; DeAmbrosis, A.; Mascheretti, P.; Massara, C. I.

    1987-03-01

    By analysing measures of student success in learning the fundamentals of physics in conjunction with the research reported in the literature, one can conclude that it is difficult for undergraduates as well as high-school students to gain a reasonable understanding of elementary mechanics. Considerable effort has been devoted to identifying those factors which might prevent mechanics being successfully learnt and also to developing instructional methods which could improve its teaching (Champagne et al. 1984, Hewson 1985, McDermott 1983, Saltiel and Malgrange 1980, Whitaker 1983, White 1983). Starting from these research results and drawing on their own experience (Borghi et al. 1984, 1985), the authors arrived at the following conclusion: a strategy based on experimental activity, performed by the students themselves, together with a proper use of computer simulations, could well improve the learning of mechanics and enhance interest in, and understanding of, topics which are difficult to treat in a traditional way. The authors describe the strategy they have designed to help high-school students learn mechanics and report how they have applied it to the particular topic of projectile motion.
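
    A simulation of the article's example topic, projectile motion, comparing a simple Euler integration with the closed-form solution; this is a generic classroom sketch, not the authors' software.

        import math

        g = 9.81                                     # gravitational acceleration [m/s^2]
        v0, angle = 20.0, math.radians(40.0)
        vx, vy = v0 * math.cos(angle), v0 * math.sin(angle)

        # Euler integration until the projectile returns to launch height
        dt, x, y, t = 0.001, 0.0, 0.0, 0.0
        while y >= 0.0:
            x += vx * dt
            y += vy * dt
            vy -= g * dt
            t += dt

        range_exact = v0**2 * math.sin(2 * angle) / g   # closed-form range on flat ground
        print(f"simulated range {x:.2f} m vs analytic {range_exact:.2f} m")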

  14. Grand Challenges of Advanced Computing for Energy Innovation Report from the Workshop Held July 31-August 2, 2012

    SciTech Connect

    Larzelere, Alex R.; Ashby, Steven F.; Christensen, Dana C.; Crawford, Dona L.; Khaleel, Mohammad A.; John, Grosh; Stults, B. Ray; Lee, Steven L.; Hammond, Steven W.; Grover, Benjamin T.; Neely, Rob; Dudney, Lee Ann; Goldstein, Noah C.; Wells, Jack; Peltz, Jim

    2013-03-06

    On July 31-August 2 of 2012, the U.S. Department of Energy (DOE) held a workshop entitled Grand Challenges of Advanced Computing for Energy Innovation. This workshop built on three earlier workshops that clearly identified the potential for the Department and its national laboratories to enable energy innovation. The specific goal of the workshop was to identify the key challenges that the nation must overcome to apply the full benefit of taxpayer-funded advanced computing technologies to U.S. energy innovation in the ways that the country produces, moves, stores, and uses energy. Perhaps more importantly, the workshop also developed a set of recommendations to help the Department overcome those challenges. These recommendations provide an action plan for what the Department can do in the coming years to improve the nation’s energy future.

  15. Laboratory evaluation and analysis of advanced lead-acid load-leveling batteries

    NASA Astrophysics Data System (ADS)

    Miller, J. F.; Mulcahey, T. P.; Christianson, C. C.; Marr, J. J.; Smaga, J. A.

    Argonne National Laboratory has conducted an extensive evaluation of advanced lead-acid batteries developed by the Exide Corporation for load-leveling applications. This paper presents the results of performance and accelerated life tests conducted on these batteries over a five-year period. This paper describes the operational reliability and maintenance requirements for this technology, and also includes analyses of the batteries' thermal characteristics, arsine/stibine emission rates, and cell degradation modes as determined from post-test examinations.

  16. Integrated safeguards testing laboratories in support of the advanced fuel cycle initiative

    SciTech Connect

    Santi, Peter A; Demuth, Scott F; Klasky, Kristen L; Lee, Haeok; Miller, Michael C; Sprinkle, James K; Tobin, Stephen J; Williams, Bradley

    2009-01-01

    A key enabler for advanced fuel cycle safeguards research and technology development for programs such as the Advanced Fuel Cycle Initiative (AFCI) is access to facilities and nuclear materials. This access is necessary in many cases in order to ensure that advanced safeguards techniques and technologies meet the measurement needs for which they were designed. One such crucial facility is a hot cell based laboratory which would allow developers from universities, national laboratories, and commercial companies to perform iterative research and development of advanced safeguards instrumentation under realistic operating conditions but not be subject to production schedule limitations. The need for such a facility arises from the requirement to accurately measure minor actinide and/or fission product bearing nuclear materials that cannot be adequately shielded in glove boxes. With the contraction of the DOE nuclear complex following the end of the Cold War, many suitable facilities at DOE sites are increasingly costly to operate and are being evaluated for closure. A hot cell based laboratory that allowed developers to install and remove instrumentation from the hot cell would allow for both risk mitigation and performance optimization of the instrumentation prior to fielding equipment in facilities where maintenance and repair of the instrumentation is difficult or impossible. These benefits are accomplished by providing developers the opportunity to iterate between testing the performance of the instrumentation by measuring realistic types and amounts of nuclear material, and adjusting and refining the instrumentation based on the results of these measurements. In this paper, we review the requirements for such a facility, using the Wing 9 hot cells in the Los Alamos National Laboratory's Chemistry and Metallurgy Research facility as a model, and describe recent use of these hot cells in support of AFCI.

  17. Advanced Control Design and Field Testing for Wind Turbines at the National Renewable Energy Laboratory: Preprint

    SciTech Connect

    Hand, M. M.; Johnson, K. E.; Fingersh, L. J.; Wright, A. D.

    2004-05-01

    Utility-scale wind turbines require active control systems to operate at variable rotational speeds. As turbines become larger and more flexible, advanced control algorithms become necessary to meet multiple objectives such as speed regulation, blade load mitigation, and mode stabilization. At the same time, they must maximize energy capture. The National Renewable Energy Laboratory has developed control design and testing capabilities to meet these growing challenges.

  18. Precision laser range finder system design for Advanced Technology Laboratory applications

    NASA Technical Reports Server (NTRS)

    Golden, K. E.; Kohn, R. L.; Seib, D. H.

    1974-01-01

    Preliminary system design of a pulsed precision ruby laser rangefinder system is presented which has a potential range resolution of 0.4 cm when atmospheric effects are negligible. The system being proposed for flight testing on the Advanced Technology Laboratory (ATL) consists of a mode-locked ruby laser transmitter, coarse and vernier rangefinder receivers, an optical beacon retroreflector tracking system, and a network of ATL tracking retroreflectors. Performance calculations indicate that spacecraft-to-ground ranging accuracies of 1 to 2 cm are possible.

  19. Generalized Advanced Propeller Analysis System (GAPAS). Volume 2: Computer program user manual

    NASA Technical Reports Server (NTRS)

    Glatt, L.; Crawford, D. R.; Kosmatka, J. B.; Swigart, R. J.; Wong, E. W.

    1986-01-01

    The Generalized Advanced Propeller Analysis System (GAPAS) computer code is described. GAPAS was developed to analyze advanced technology multi-bladed propellers which operate on aircraft with speeds up to Mach 0.8 and altitudes up to 40,000 feet. GAPAS includes technology for analyzing aerodynamic, structural, and acoustic performance of propellers. The computer code was developed for the CDC 7600 computer and is currently available for industrial use on the NASA Langley computer. A description of all the analytical models incorporated in GAPAS is included. Sample calculations are described, as are user requirements for modifying the analysis system. Computer system core requirements and running times are also discussed.

  20. Computer architectures for computational physics work done by Computational Research and Technology Branch and Advanced Computational Concepts Group

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Slides are reproduced that describe the importance of having high-performance number-crunching and graphics capability. They also indicate the types of research and development underway at Ames Research Center to ensure that, in the near term, Ames is a smart buyer and user and that, in the long term, Ames knows the best possible solutions for number-crunching and graphics needs. The drivers for this research are real computational physics applications of interest to Ames and NASA. They are concerned with how to map the applications, and how to maximize the physics learned from the results of the calculations. The computer graphics activities are aimed at getting maximum information from the three-dimensional calculations by using real-time manipulation of three-dimensional data on the Silicon Graphics workstation. Work is underway on new algorithms that will permit the display of experimental results that are sparse and random, the same way that the dense and regular computed results are displayed.

  1. New directions in scientific computing: impact of advances in microprocessor architecture and system design.

    PubMed

    Malyj, W; Smith, R E; Horowitz, J M

    1984-01-01

    The new generation of microcomputers has brought computing power previously restricted to mainframe and supermini computers within the reach of individual scientific laboratories. Microcomputers can now provide computing speeds rivaling mainframes and computational accuracies exceeding those available in most computer centers. Inexpensive memory makes possible the transfer to microcomputers of software packages developed for mainframes and tested by years of experience. Combinations of high level languages and assembler subroutines permit the efficient design of specialized applications programs. Microprocessor architecture is approaching that of superminis, with coprocessors providing major contributions to computing power. The combined result of these developments is a major and perhaps revolutionary increase in the computing power now available to scientists.

  2. Advanced Scientific Computing Environment Team new scientific database management task

    SciTech Connect

    Church, J.P.; Roberts, J.C.; Sims, R.N.; Smetana, A.O.; Westmoreland, B.W.

    1991-06-01

    The mission of the ASCENT Team is to continually keep pace with, evaluate, and select emerging computing technologies in order to define and implement prototypic scientific environments that maximize the ability of scientists and engineers to manage scientific data. These environments are to be implemented in a manner consistent with the site computing architecture and standards and with NRTSC/SCS strategic plans for scientific computing. The major trends in computing hardware and software technology clearly indicate that the future "computer" will be a network environment comprising supercomputers, graphics boxes, mainframes, clusters, workstations, terminals, and microcomputers. This "network computer" will have an architecturally transparent operating system that allows application code to run on any box supplying the required computing resources. The environment will include a distributed database and database management system(s) permitting the use of relational, hierarchical, object-oriented, GIS, and other databases. Reaching this goal requires a stepwise progression from the present assemblage of monolithic application codes running on disparate hardware platforms and operating systems. The first steps include conversion from the existing JOSHUA system to a new J80 system that complies with modern language standards; development of a new J90 prototype to provide JOSHUA capabilities on Unix platforms; development of portable graphics tools to greatly facilitate preparation of input and interpretation of output; and extension of "Jvv" concepts and capabilities to distributed and/or parallel computing environments.

  3. The Advance of Computing from the Ground to the Cloud

    ERIC Educational Resources Information Center

    Breeding, Marshall

    2009-01-01

    A trend toward the abstraction of computing platforms that has been developing in the broader IT arena over the last few years is just beginning to make inroads into the library technology scene. Cloud computing offers for libraries many interesting possibilities that may help reduce technology costs and increase capacity, reliability, and…

  4. Computational Enzyme Design: Advances, hurdles and possible ways forward

    PubMed Central

    Linder, Mats

    2012-01-01

    This mini review addresses recent developments in computational enzyme design. Successful protocols as well as known issues and limitations are discussed from an energetic perspective. It will be argued that improved results can be obtained by including a dynamic treatment in the design protocol. Finally, a molecular dynamics-based approach for evaluating and refining computational designs is presented. PMID:24688650

  5. Data handling and reporting for microbiology specimens with a small laboratory computer system.

    PubMed Central

    Landowne, R A

    1981-01-01

    A small laboratory computer system designed for general application in chemistry, haematology, and urinalysis has been adapted for the bacteriology section of the laboratory using the same available programming routines. Specimens are requisitioned according to predetermined common site codes, with both preliminary and final reporting allowed for where desired. Sensitivity data are also appended and entered where required, even for different organisms in the same culture. PMID:7251905

  6. A Computational Methodology for Simulating Thermal Loss Testing of the Advanced Stirling Convertor

    NASA Technical Reports Server (NTRS)

    Reid, Terry V.; Wilson, Scott D.; Schifer, Nicholas A.; Briggs, Maxwell H.

    2012-01-01

    The U.S. Department of Energy (DOE) and Lockheed Martin Space Systems Company (LMSSC) have been developing the Advanced Stirling Radioisotope Generator (ASRG) for use as a power system for space science missions. This generator would use two high-efficiency Advanced Stirling Convertors (ASCs), developed by Sunpower Inc. and NASA Glenn Research Center (GRC). The ASCs convert thermal energy from a radioisotope heat source into electricity. As part of ground testing of these ASCs, different operating conditions are used to simulate expected mission conditions. These conditions require achieving a particular operating frequency, hot end and cold end temperatures, and specified electrical power output for a given net heat input. In an effort to improve net heat input predictions, numerous tasks have been performed which provided a more accurate value for net heat input into the ASCs, including the use of multidimensional numerical models. Validation test hardware has also been used to provide a direct comparison of numerical results and to validate the multidimensional numerical models used to predict convertor net heat input and efficiency. These validation tests were designed to simulate the temperature profile of an operating Stirling convertor and resulted in a measured net heat input of 244.4 W. The methodology was applied to the multidimensional numerical model, which resulted in a net heat input of 240.3 W. The computational methodology resulted in a value of net heat input that was 1.7 percent less than that measured during laboratory testing. The resulting computational methodology and results are discussed.
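
    As a quick arithmetic check of the 1.7 percent figure above, the discrepancy can be recomputed directly from the two reported values (a minimal Python sketch; the variable names are illustrative):

      measured_w = 244.4  # net heat input measured on the validation hardware, W
      modeled_w = 240.3   # net heat input predicted by the multidimensional model, W

      # Percent difference relative to the measured value
      pct_diff = (measured_w - modeled_w) / measured_w * 100.0
      print(f"model is {pct_diff:.1f} percent below measurement")  # -> 1.7 percent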

  7. Certain irregularities in the use of computer facilities at Sandia Laboratory

    SciTech Connect

    Not Available

    1980-10-22

    This report concerns irregularities in the use of computer systems at Sandia Laboratories (Sandia) in Albuquerque, New Mexico. Our interest in this subject was triggered when we learned late last year that the Federal Bureau of Investigation (FBI) was planning to undertake an investigation into possible misuse of the computer systems at Sandia. That investigation, which was carried out with the assistance of our staff, disclosed that an employee of Sandia was apparently using the Sandia computer system to assist in running a bookmaking operation for local gamblers. As a result of that investigation, we decided to conduct a separate review of Sandia's computer systems to determine the extent of computer misuse at Sandia. We found that over 200 employees of Sandia had stored games, personal items, classified material, and otherwise sensitive material in their computer files.

  8. Student teaching and research laboratory focusing on brain-computer interface paradigms--A creative environment for computer science students.

    PubMed

    Rutkowski, Tomasz M

    2015-08-01

    This paper presents an applied concept of a brain-computer interface (BCI) student research laboratory (BCI-LAB) at the Life Science Center of TARA, University of Tsukuba, Japan. Several successful case studies of student projects are reviewed, together with the BCI Research Award 2014 winner case. The BCI-LAB design and project-based teaching philosophy are also explained. The review concludes with a summary of future teaching and research directions.

  9. Advances in computational design and analysis of airbreathing propulsion systems

    NASA Technical Reports Server (NTRS)

    Klineberg, John M.

    1989-01-01

    The development of commercial and military aircraft depends, to a large extent, on engine manufacturers being able to achieve significant increases in propulsion capability through improved component aerodynamics, materials, and structures. The recent history of propulsion has been marked by efforts to develop computational techniques that can speed up the propulsion design process and produce superior designs. The availability of powerful supercomputers, such as the NASA Numerical Aerodynamic Simulator, and the potential for even higher performance offered by parallel computer architectures, have opened the door to the use of multi-dimensional simulations to study complex physical phenomena in propulsion systems that have previously defied analysis or experimental observation. An overview of several NASA Lewis research efforts is provided that are contributing toward the long-range goal of a numerical test-cell for the integrated, multidisciplinary design, analysis, and optimization of propulsion systems. Specific examples in Internal Computational Fluid Mechanics, Computational Structural Mechanics, Computational Materials Science, and High Performance Computing are cited and described in terms of current capabilities, technical challenges, and future research directions.

  10. OPENING REMARKS: SciDAC: Scientific Discovery through Advanced Computing

    NASA Astrophysics Data System (ADS)

    Strayer, Michael

    2005-01-01

    Good morning. Welcome to SciDAC 2005 and San Francisco. SciDAC is all about computational science and scientific discovery. In a large sense, computational science characterizes SciDAC and its intent is change. It transforms both our approach and our understanding of science. It opens new doors and crosses traditional boundaries while seeking discovery. In terms of twentieth century methodologies, computational science may be said to be transformational. There are a number of examples to this point. First are the sciences that encompass climate modeling. The application of computational science has in essence created the field of climate modeling. This community is now international in scope and has provided precision results that are challenging our understanding of our environment. A second example is that of lattice quantum chromodynamics. Lattice QCD, while adding precision and insight to our fundamental understanding of strong interaction dynamics, has transformed our approach to particle and nuclear science. The individual investigator approach has evolved to teams of scientists from different disciplines working side-by-side towards a common goal. SciDAC is also undergoing a transformation. This meeting is a prime example. Last year it was a small programmatic meeting tracking progress in SciDAC. This year, we have a major computational science meeting with a variety of disciplines and enabling technologies represented. SciDAC 2005 should position itself as a new cornerstone for computational science and its impact on science. As we look to the immediate future, FY2006 will bring a new cycle to SciDAC. Most of the program elements of SciDAC will be re-competed in FY2006. The re-competition will involve new instruments for computational science, new approaches for collaboration, as well as new disciplines. There will be new opportunities for virtual experiments in carbon sequestration, fusion, and nuclear power and nuclear waste, as well as collaborations

  11. The Impact of Advance Organizers upon Students' Achievement in Computer-Assisted Video Instruction.

    ERIC Educational Resources Information Center

    Saidi, Houshmand

    1994-01-01

    Describes a study of undergraduates that was conducted to determine the impact of advance organizers on students' achievement in computer-assisted video instruction (CAVI). Treatments of the experimental and control groups are explained, and results indicate that advance organizers do not facilitate near-transfer of rule-learning in CAVI.…

  12. Integration of Computational Chemistry into the Undergraduate Organic Chemistry Laboratory Curriculum

    ERIC Educational Resources Information Center

    Esselman, Brian J.; Hill, Nicholas J.

    2016-01-01

    Advances in software and hardware have promoted the use of computational chemistry in all branches of chemical research to probe important chemical concepts and to support experimentation. Consequently, it has become imperative that students in the modern undergraduate curriculum become adept at performing simple calculations using computational…

  13. Some recent advances in computational aerodynamics for helicopter applications

    NASA Technical Reports Server (NTRS)

    Mccroskey, W. J.; Baeder, J. D.

    1985-01-01

    The growing application of computational aerodynamics to nonlinear helicopter problems is outlined, with particular emphasis on several recent quasi-two-dimensional examples that used the thin-layer Navier-Stokes equations and an eddy-viscosity model to approximate turbulence. Rotor blade section characteristics can now be calculated accurately over a wide range of transonic flow conditions. However, a finite-difference simulation of the complete flow field about a helicopter in forward flight is not currently feasible, despite the impressive progress that is being made in both two and three dimensions. The principal limitations are today's computer speeds and memories, algorithm and solution methods, grid generation, vortex modeling, structural and aerodynamic coupling, and a shortage of engineers who are skilled in both computational fluid dynamics and helicopter aerodynamics and dynamics.

  14. Cane Toad or Computer Mouse? Real and Computer-Simulated Laboratory Exercises in Physiology Classes

    ERIC Educational Resources Information Center

    West, Jan; Veenstra, Anneke

    2012-01-01

    Traditional practical classes in many countries are being rationalised to reduce costs. The challenge for university educators is to provide students with the opportunity to reinforce theoretical concepts by running something other than a traditional practical program. One alternative is to replace wet labs with comparable computer simulations.…

  15. Configuration and Management of a Cluster Computing Facility in Undergraduate Student Computer Laboratories

    ERIC Educational Resources Information Center

    Cornforth, David; Atkinson, John; Spennemann, Dirk H. R.

    2006-01-01

    Purpose: Many researchers require access to computer facilities beyond those offered by desktop workstations. Traditionally, these are offered either through partnerships, to share the cost of supercomputing facilities, or through purpose-built cluster facilities. However, funds are not always available to satisfy either of these options, and…

  16. Current research activities at the NASA-sponsored Illinois Computing Laboratory of Aerospace Systems and Software

    NASA Technical Reports Server (NTRS)

    Smith, Kathryn A.

    1994-01-01

    The Illinois Computing Laboratory of Aerospace Systems and Software (ICLASS) was established to: (1) pursue research in the areas of aerospace computing systems, software, and applications of critical importance to NASA; and (2) develop and maintain close contacts between researchers at ICLASS and at various NASA centers to stimulate interaction and cooperation and facilitate technology transfer. Current ICLASS activities are in the areas of parallel architectures and algorithms, reliable and fault tolerant computing, real time systems, distributed systems, software engineering and artificial intelligence.

  17. BASIC and the Density of Glass. A First-Year Laboratory/Computer Experiment.

    ERIC Educational Resources Information Center

    Harris, Arlo D.

    1986-01-01

    Describes a first-year chemistry laboratory experiment which uses a simple computer program written in BASIC, to analyze data collected by students about the density of a set of marbles. A listing of the program is provided, along with a sample printout of the experiment's results. (TW)
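
    The article's program is written in BASIC; as a hedged modern analogue of the same analysis, the sketch below fits density as the least-squares slope of mass against volume in Python (the sample measurements are invented placeholders, not data from the article):

      # Placeholder student measurements for a set of marbles
      masses = [5.2, 10.1, 15.3, 20.4]   # grams
      volumes = [2.1, 4.0, 6.2, 8.1]     # millilitres of water displaced

      # Least-squares slope of mass vs. volume; the slope is the density,
      # which averages out per-marble measurement error.
      n = len(masses)
      sv, sm = sum(volumes), sum(masses)
      svv = sum(v * v for v in volumes)
      svm = sum(v * m for v, m in zip(volumes, masses))
      density = (n * svm - sv * sm) / (n * svv - sv * sv)
      print(f"density of glass ~ {density:.2f} g/mL")  # ~2.5 g/mL for common glass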

  18. First Year Students Understanding of Elementary Concepts in Differential Calculus in a Computer Laboratory Teaching Environment

    ERIC Educational Resources Information Center

    Naidoo, K.; Naidoo, R.

    2007-01-01

    This paper focuses on blended learning in a mathematics I module in elementary calculus at a University of Technology. A computer laboratory was used to create a learning environment that promoted interactive learning together with traditional teaching. The interactive learning was performed using projects to optimize the discovery and error…

  19. Problem-Solving Inquiry-Oriented Biology Tasks Integrating Practical Laboratory and Computer.

    ERIC Educational Resources Information Center

    Friedler, Yael; And Others

    1992-01-01

    Presents results of a study that examines the development and use of computer simulations for high school science instruction and for integrated laboratory and computerized tests that are part of the biology matriculation examination in Israel. Eleven implications for teaching are presented. (MDH)

  20. Journal of Chemical Education: Software: Abstract of "The Computer-Based Laboratory."

    ERIC Educational Resources Information Center

    Krause, Daniel C.; Divis, Lynne M., Ed.

    1988-01-01

    Describes a chemistry laboratory software package for interfacing the Apple IIe for high school and introductory college courses. Topics include: thermistor calibration, phase change, heat of reaction, freezing point depression, Beer's law, and color decay in crystal violet. Explains the computer interface and the tools needed. (MVL)

  1. Usnic Acid and the Intramolecular Hydrogen Bond: A Computational Experiment for the Organic Laboratory

    ERIC Educational Resources Information Center

    Green, Thomas K.; Lane, Charles A.

    2006-01-01

    A computational experiment is described for the organic chemistry laboratory that allows students to estimate the relative strengths of the intramolecular hydrogen bonds of usnic and isousnic acids, two related lichen secondary metabolites. Students first extract and purify usnic acid from common lichens and obtain [superscript 1]H NMR and IR…

  2. Annotated List of Chemistry Laboratory Experiments with Computer Access. Final Report.

    ERIC Educational Resources Information Center

    Bunce, S. C.; And Others

    Project Chemlab was designed to prepare an "Annotated List of Laboratory Experiments in Chemistry from the Journal of Chemical Education (1957-1979)" and to develop a computer file and program to search for specific types of experiments. Provided in this document are listings (photoreduced copies of printouts) of over 1500 entries classified into…

  3. Motion of Electrons in Electric and Magnetic Fields: Introductory Laboratory and Computer Studies.

    ERIC Educational Resources Information Center

    Huggins, Elisha R.; Lelek, Jeffrey J.

    1979-01-01

    Describes a series of laboratory experiments and computer simulations of the motion of electrons in electric and magnetic fields. These experiments, which involve an inexpensive student-built electron gun, study the electron mean free path, magnetic focusing, and other aspects. (Author/HM)

  4. The combination of specimen tracking with an advanced AutoLog in a laboratory information system.

    PubMed

    Emmerich, K A; Quam, E F; Bowers, K L; Eggert, A A

    1998-06-01

    The ability to provide timely laboratory results is an important aspect of quality which must be continually monitored. In order to complete all testing before the maximum turnaround time requirements are exceeded, laboratorians need to have immediate and automatic access to the location of specimens and the status of tests ordered on each specimen. Any such automated approach must be able to monitor continually the status of work in progress, while simultaneously linking it to a specimen tracking (history) system that allows real-time tracing of the path of specimens through all laboratory operations. The authors have greatly advanced the capabilities of the AutoLog technology and have added to it a tracking system that captures specimen movement with minimum user assistance. This has been accomplished without the need to implement total process automation.
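
    The paper does not publish implementation details; as an illustration only, the core idea (continuous status monitoring linked to a specimen history) can be sketched as an append-only event log keyed by specimen ID. All names below are hypothetical, not the authors' code:

      from dataclasses import dataclass, field
      from datetime import datetime

      @dataclass
      class Specimen:
          """Tracks a specimen's path through laboratory workstations."""
          specimen_id: str
          events: list = field(default_factory=list)  # (timestamp, location) history

          def log_move(self, location: str) -> None:
              # Record each scan with a timestamp so the full path can be replayed.
              self.events.append((datetime.now(), location))

          def current_location(self) -> str:
              return self.events[-1][1] if self.events else "not yet received"

      s = Specimen("CHEM-0042")
      s.log_move("accessioning")
      s.log_move("chemistry analyzer 2")
      print(s.current_location())  # -> chemistry analyzer 2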

  5. High Energy Laboratory Astrophysics Experiments using electron beam ion traps and advanced light sources

    NASA Astrophysics Data System (ADS)

    Brown, Gregory V.; Beiersdorfer, Peter; Bernitt, Sven; Eberle, Sita; Hell, Natalie; Kilbourne, Caroline; Kelley, Rich; Leutenegger, Maurice; Porter, F. Scott; Rudolph, Jan; Steinbrugge, Rene; Traebert, Elmar; Crespo-Lopez-Urritia, Jose R.

    2015-08-01

    We have used the Lawrence Livermore National Laboratory's EBIT-I electron beam ion trap coupled with a NASA/GSFC microcalorimeter spectrometer instrument to systematically address problems found in the analysis of high-resolution X-ray spectra from celestial sources, and to benchmark atomic physics codes employed by high-resolution spectral modeling packages. Our results include laboratory measurements of transition energies, absolute and relative electron impact excitation cross sections, charge exchange cross sections, and dielectronic recombination resonance strengths. More recently, we have coupled the Max-Planck Institute for Nuclear Physics-Heidelberg's FLASH-EBIT electron beam ion trap to third- and fourth-generation advanced light sources to measure photoexcitation and photoionization cross sections, as well as natural line widths of X-ray transitions in highly charged iron ions. Selected results will be presented.

  6. The impact of recent advances in laboratory astrophysics on our understanding of the cosmos.

    PubMed

    Savin, D W; Brickhouse, N S; Cowan, J J; Drake, R P; Federman, S R; Ferland, G J; Frank, A; Gudipati, M S; Haxton, W C; Herbst, E; Profumo, S; Salama, F; Ziurys, L M; Zweibel, E G

    2012-03-01

    An emerging theme in modern astrophysics is the connection between astronomical observations and the underlying physical phenomena that drive our cosmos. Both the mechanisms responsible for the observed astrophysical phenomena and the tools used to probe such phenomena (the radiation and particle spectra we observe) have their roots in atomic, molecular, condensed matter, plasma, nuclear and particle physics. Chemistry is implicitly included in both molecular and condensed matter physics. This connection is the theme of the present report, which provides a broad, though non-exhaustive, overview of progress in our understanding of the cosmos resulting from recent theoretical and experimental advances in what is commonly called laboratory astrophysics. This work, carried out by a diverse community of laboratory astrophysicists, is increasingly important as astrophysics transitions into an era of precise measurement and high fidelity modeling.

  7. A landmark recognition and tracking experiment for flight on the Shuttle/Advanced Technology Laboratory (ATL)

    NASA Technical Reports Server (NTRS)

    Welch, J. D.

    1975-01-01

    The preliminary design of an experiment for landmark recognition and tracking from the Shuttle/Advanced Technology Laboratory is described. It makes use of parallel coherent optical processing to perform correlation tests between landmarks observed passively with a telescope and previously made holographic matched filters. The experimental equipment, including the optics, the low-power laser, the random-access file of matched filters, and the electro-optical readout device, is described. A real-time optically excited liquid crystal device is recommended for performing the input non-coherent optical to coherent optical interface function. A development program leading to a flight experiment in 1981 is outlined.

  8. Cattle Uterus: A Novel Animal Laboratory Model for Advanced Hysteroscopic Surgery Training

    PubMed Central

    Ewies, Ayman A. A.; Khan, Zahid R.

    2015-01-01

    In recent years, due to reduced training opportunities, the major shift in surgical training is towards the use of simulation and animal laboratories. Despite the merits of Virtual Reality Simulators, they are far from representing the real challenges encountered in theatres. We introduce the “Cattle Uterus Model” in the hope that it will be adopted in training courses as a low cost and easy-to-set-up tool. It adds new dimensions to the advanced hysteroscopic surgery training experience by providing tactile sensation and simulating intraoperative difficulties. It complements conventional surgical training, aiming to maximise clinical exposure and minimise patients' harm. PMID:26265918

  9. System reliability of randomly vibrating structures: Computational modeling and laboratory testing

    NASA Astrophysics Data System (ADS)

    Sundar, V. S.; Ammanagi, S.; Manohar, C. S.

    2015-09-01

    The problem of determination of system reliability of randomly vibrating structures arises in many application areas of engineering. We discuss in this paper approaches based on Monte Carlo simulations and laboratory testing to tackle problems of time variant system reliability estimation. The strategy we adopt is based on the application of Girsanov's transformation to the governing stochastic differential equations which enables estimation of probability of failure with significantly reduced number of samples than what is needed in a direct simulation study. Notably, we show that the ideas from Girsanov's transformation based Monte Carlo simulations can be extended to conduct laboratory testing to assess system reliability of engineering structures with reduced number of samples and hence with reduced testing times. Illustrative examples include computational studies on a 10-degree of freedom nonlinear system model and laboratory/computational investigations on road load response of an automotive system tested on a four-post test rig.
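
    The essence of the Girsanov-based estimator is to sample from a shifted probability law under which failure is common, then correct each sample with a likelihood ratio. The Python sketch below illustrates that reweighting idea on a deliberately simplified static problem (estimating a small normal tail probability); it is not the authors' time-variant implementation:

      import math
      import random

      # Estimate P(X > b) for X ~ N(0,1) by sampling from the shifted law
      # N(mu,1) and weighting each sample by the likelihood ratio dP/dQ.
      b, mu, n = 4.0, 4.0, 100_000
      acc = 0.0
      for _ in range(n):
          x = random.gauss(mu, 1.0)  # sample where failures are common
          if x > b:
              acc += math.exp(-mu * x + 0.5 * mu * mu)  # dP/dQ for a mean shift
      print(f"P(X > b) ~ {acc / n:.2e}")  # exact value is about 3.17e-05

    Far fewer samples are needed than in direct simulation, since almost every shifted sample lands in the failure region and contributes to the estimate.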

  10. Connecting Performance Analysis and Visualization to Advance Extreme Scale Computing

    SciTech Connect

    Bremer, Peer-Timo; Mohr, Bernd; Schulz, Martin; Pasccci, Valerio; Gamblin, Todd; Brunst, Holger

    2015-07-29

    The characterization, modeling, analysis, and tuning of software performance has been a central topic in High Performance Computing (HPC) since its early beginnings. The overall goal is to make HPC software run faster on particular hardware, either through better scheduling, on-node resource utilization, or more efficient distributed communication.

  11. Advanced Simulation and Computing Co-Design Strategy

    SciTech Connect

    Ang, James A.; Hoang, Thuc T.; Kelly, Suzanne M.; McPherson, Allen; Neely, Rob

    2015-11-01

    This ASC Co-design Strategy lays out the full continuum and components of the co-design process, based on what we have experienced thus far and what we wish to do more in the future to meet the program’s mission of providing high performance computing (HPC) and simulation capabilities for NNSA to carry out its stockpile stewardship responsibility.

  12. Advanced Computer Image Generation Techniques Exploiting Perceptual Characteristics. Final Report.

    ERIC Educational Resources Information Center

    Stenger, Anthony J.; And Others

    This study suggests and identifies computer image generation (CIG) algorithms for visual simulation that improve the training effectiveness of CIG simulators and identifies areas of basic research in visual perception that are significant for improving CIG technology. The first phase of the project entailed observing three existing CIG simulators.…

  13. MAX - An advanced parallel computer for space applications

    NASA Technical Reports Server (NTRS)

    Lewis, Blair F.; Bunker, Robert L.

    1991-01-01

    MAX is a fault-tolerant multicomputer hardware and software architecture designed to meet the needs of NASA spacecraft systems. It consists of conventional computing modules (computers) connected via a dual network topology. One network is used to transfer data among the computers and between computers and I/O devices. This network's topology is arbitrary. The second network operates as a broadcast medium for operating system synchronization messages and supports the operating system's Byzantine resilience. A fully distributed operating system supports multitasking in an asynchronous event- and data-driven environment. A large-grain dataflow paradigm is used to coordinate the multitasking and provide easy control of concurrency. It is the basis of the system's fault tolerance and allows both static and dynamic allocation of tasks. Redundant execution of tasks with software voting of results may be specified for critical tasks. The dataflow paradigm also supports simplified software design, test, and maintenance. A unique feature is a method for reliably patching code in an executing dataflow application.
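
    The abstract does not specify the voting scheme beyond "software voting of results"; one plausible majority-vote pattern over redundant executions of a critical task might look like the following Python sketch (all names are hypothetical):

      from collections import Counter

      def vote(results):
          """Majority vote over redundant executions of a critical task."""
          value, count = Counter(results).most_common(1)[0]
          if count <= len(results) // 2:
              raise RuntimeError("no majority: redundant copies disagree")
          return value

      # Three redundant executions; the single faulty replica is outvoted.
      print(vote([42, 42, 41]))  # -> 42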

  14. The effect of introducing computers into an introductory physics problem-solving laboratory

    NASA Astrophysics Data System (ADS)

    McCullough, Laura Ellen

    2000-10-01

    Computers are appearing in every type of classroom across the country. Yet they often appear without benefit of studying their effects. The research that is available on computer use in classrooms has found mixed results, and often ignores the theoretical and instructional contexts of the computer in the classroom. The University of Minnesota's physics department employs a cooperative-group problem solving pedagogy, based on a cognitive apprenticeship instructional model, in its calculus-based introductory physics course. This study was designed to determine possible negative effects of introducing a computerized data-acquisition and analysis tool into this pedagogy as a problem-solving tool for students to use in laboratory. To determine the effects of the computer tool, two quasi-experimental treatment groups were selected. The computer-tool group (N = 170) used a tool, designed for this study (VideoTool), to collect and analyze motion data in the laboratory. The control group (N = 170) used traditional non-computer equipment (spark tapes and Polaroid(TM) film). The curriculum was kept as similar as possible for the two groups. During the ten week academic quarter, groups were examined for effects on performance on conceptual tests and grades, attitudes towards the laboratory and the laboratory tools, and behaviors within cooperative groups. Possible interactions with gender were also examined. Few differences were found between the control and computer-tool groups. The control group received slightly higher scores on one conceptual test, but this difference was not educationally significant. The computer-tool group had slightly more positive attitudes towards using the computer tool than their counterparts had towards the traditional tools. The computer-tool group also perceived that they spoke more frequently about physics misunderstandings, while the control group felt that they discussed equipment difficulties more often. This perceptual difference interacted

  15. UNEDF: Advanced Scientific Computing Transforms the Low-Energy Nuclear Many-Body Problem

    SciTech Connect

    Stoitsov, Mario; Nam, Hai Ah; Nazarewicz, Witold; Bulgac, Aurel; Hagen, Gaute; Kortelainen, E. M.; Pei, Junchen; Roche, K. J.; Schunck, N.; Thompson, I.; Vary, J. P.; Wild, S.

    2011-01-01

    The UNEDF SciDAC collaboration of nuclear theorists, applied mathematicians, and computer scientists is developing a comprehensive description of nuclei and their reactions that delivers maximum predictive power with quantified uncertainties. This paper illustrates significant milestones accomplished by UNEDF through integration of the theoretical approaches, advanced numerical algorithms, and leadership class computational resources.

  16. Project T.E.A.M. (Technical Education Advancement Modules). Introduction to Computers.

    ERIC Educational Resources Information Center

    Ellis, Brenda

    This instructional guide, one of a series developed by the Technical Education Advancement Modules (TEAM) project, is a 3-hour introduction to computers. The purpose is to develop the following competencies: (1) orientation to data processing; (2) use of data entry devices; (3) use of computer menus; and (4) entry of data with accuracy and…

  17. Computer Assisted Fluid Power Instruction: A Comparison of Hands-On and Computer-Simulated Laboratory Experiences for Post-Secondary Students

    ERIC Educational Resources Information Center

    Wilson, Scott B.

    2005-01-01

    The primary purpose of this study was to examine the effectiveness of utilizing a combination of lecture and computer resources to train personnel to assume roles as hydraulic system technicians and specialists in the fluid power industry. This study compared computer simulated laboratory instruction to traditional hands-on laboratory instruction,…

  18. UNEDF: Advanced Scientific Computing Collaboration Transforms the Low-Energy Nuclear Many-Body Problem

    NASA Astrophysics Data System (ADS)

    Nam, H.; Stoitsov, M.; Nazarewicz, W.; Bulgac, A.; Hagen, G.; Kortelainen, M.; Maris, P.; Pei, J. C.; Roche, K. J.; Schunck, N.; Thompson, I.; Vary, J. P.; Wild, S. M.

    2012-12-01

    The demands of cutting-edge science are driving the need for larger and faster computing resources. With the rapidly growing scale of computing systems and the prospect of technologically disruptive architectures to meet these needs, scientists face the challenge of effectively using complex computational resources to advance scientific discovery. Multi-disciplinary collaborating networks of researchers with diverse scientific backgrounds are needed to address these complex challenges. The UNEDF SciDAC collaboration of nuclear theorists, applied mathematicians, and computer scientists is developing a comprehensive description of nuclei and their reactions that delivers maximum predictive power with quantified uncertainties. This paper describes UNEDF and identifies attributes that classify it as a successful computational collaboration. We illustrate significant milestones accomplished by UNEDF through integrative solutions using the most reliable theoretical approaches, most advanced algorithms, and leadership-class computational resources.

  19. UNEDF: Advanced Scientific Computing Collaboration Transforms the Low-Energy Nuclear Many-Body Problem

    SciTech Connect

    Nam, H.; Stoitsov, M.; Nazarewicz, W.; Bulgac, A.; Hagen, G.; Kortelainen, M.; Maris, P.; Pei, J. C.; Roche, K. J.; Schunck, N.; Thompson, I.; Vary, J. P.; Wild, S. M.

    2012-12-20

    The demands of cutting-edge science are driving the need for larger and faster computing resources. With the rapidly growing scale of computing systems and the prospect of technologically disruptive architectures to meet these needs, scientists face the challenge of effectively using complex computational resources to advance scientific discovery. Multi-disciplinary collaborating networks of researchers with diverse scientific backgrounds are needed to address these complex challenges. The UNEDF SciDAC collaboration of nuclear theorists, applied mathematicians, and computer scientists is developing a comprehensive description of nuclei and their reactions that delivers maximum predictive power with quantified uncertainties. This paper describes UNEDF and identifies attributes that classify it as a successful computational collaboration. Finally, we illustrate significant milestones accomplished by UNEDF through integrative solutions using the most reliable theoretical approaches, most advanced algorithms, and leadership-class computational resources.

  20. The ergonomics of computer aided design within advanced manufacturing technology.

    PubMed

    John, P A

    1988-03-01

    Many manufacturing companies have now awakened to the significance of computer aided design (CAD), although the majority of them have only been able to purchase computerised draughting systems, of which only a subset produce direct manufacturing data. Such companies are moving steadily towards the concept of computer integrated manufacture (CIM), and this demands that CAD address more than draughting. CAD architects are thus having to rethink the basic specification of such systems, although they typically suffer from an insufficient understanding of the design task and have consequently been working with inadequate specifications. It is at this fundamental level that ergonomics has much to offer, making its contribution by encouraging user-centred design. The discussion considers the relationships between CAD and the design task, the organisation and people, creativity, and artificial intelligence. It finishes with a summary of the contribution of ergonomics.

  1. Cogeneration computer model assessment: Advanced cogeneration research study

    NASA Technical Reports Server (NTRS)

    Rosenberg, L.

    1983-01-01

    Cogeneration computer simulation models were assessed in order to recommend the most desirable models or their components for use by the Southern California Edison Company (SCE) in evaluating potential cogeneration projects. Existing cogeneration modeling capabilities are described, preferred models are identified, and an approach to the development of a code which will best satisfy SCE requirements is recommended. Five models (CELCAP, COGEN 2, CPA, DEUS, and OASIS) are recommended for further consideration.

  2. Parallel computing in genomic research: advances and applications.

    PubMed

    Ocaña, Kary; de Oliveira, Daniel

    2015-01-01

    Today's genomic experiments have to process the so-called "biological big data" that is now reaching the size of Terabytes and Petabytes. To process this huge amount of data, scientists may require weeks or months if they use their own workstations. Parallelism techniques and high-performance computing (HPC) environments can be applied to reduce the total processing time and to ease the management, treatment, and analyses of this data. However, running bioinformatics experiments in HPC environments such as clouds, grids, clusters, and graphics processing units requires expertise from scientists to integrate computational, biological, and mathematical techniques and technologies. Several solutions have already been proposed to allow scientists to process their genomic experiments using HPC capabilities and parallelism techniques. This article presents a systematic review of the literature surveying the most recently published research involving genomics and parallel computing. Our objective is to gather the main characteristics, benefits, and challenges that scientists can consider when running their genomic experiments to benefit from parallelism techniques and HPC capabilities.

  3. Parallel computing in genomic research: advances and applications.

    PubMed

    Ocaña, Kary; de Oliveira, Daniel

    2015-01-01

    Today's genomic experiments have to process the so-called "biological big data" that is now reaching the size of Terabytes and Petabytes. To process this huge amount of data, scientists may require weeks or months if they use their own workstations. Parallelism techniques and high-performance computing (HPC) environments can be applied to reduce the total processing time and to ease the management, treatment, and analyses of this data. However, running bioinformatics experiments in HPC environments such as clouds, grids, clusters, and graphics processing units requires expertise from scientists to integrate computational, biological, and mathematical techniques and technologies. Several solutions have already been proposed to allow scientists to process their genomic experiments using HPC capabilities and parallelism techniques. This article presents a systematic review of the literature surveying the most recently published research involving genomics and parallel computing. Our objective is to gather the main characteristics, benefits, and challenges that scientists can consider when running their genomic experiments to benefit from parallelism techniques and HPC capabilities. PMID:26604801

  4. Parallel computing in genomic research: advances and applications

    PubMed Central

    Ocaña, Kary; de Oliveira, Daniel

    2015-01-01

    Today’s genomic experiments have to process the so-called “biological big data” that is now reaching the size of Terabytes and Petabytes. To process this huge amount of data, scientists may require weeks or months if they use their own workstations. Parallelism techniques and high-performance computing (HPC) environments can be applied to reduce the total processing time and to ease the management, treatment, and analyses of this data. However, running bioinformatics experiments in HPC environments such as clouds, grids, clusters, and graphics processing units requires expertise from scientists to integrate computational, biological, and mathematical techniques and technologies. Several solutions have already been proposed to allow scientists to process their genomic experiments using HPC capabilities and parallelism techniques. This article presents a systematic review of the literature surveying the most recently published research involving genomics and parallel computing. Our objective is to gather the main characteristics, benefits, and challenges that scientists can consider when running their genomic experiments to benefit from parallelism techniques and HPC capabilities. PMID:26604801
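
    As a generic toy illustration of the parallelism techniques the review surveys (not a method taken from the article), a per-sequence computation such as GC content can be spread across worker processes with Python's standard library:

      from multiprocessing import Pool

      def gc_content(seq: str) -> float:
          """Fraction of G and C bases in a DNA sequence."""
          return (seq.count("G") + seq.count("C")) / len(seq)

      if __name__ == "__main__":
          # Placeholder reads; a real experiment would stream them from FASTQ files.
          reads = ["ACGTGGCC", "ATATATAT", "GGGCCCAA", "TTGACGTA"]
          with Pool(processes=4) as pool:
              print(pool.map(gc_content, reads))  # reads are split among workers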

  5. Advanced Computational Methods for Security Constrained Financial Transmission Rights

    SciTech Connect

    Kalsi, Karanjit; Elbert, Stephen T.; Vlachopoulou, Maria; Zhou, Ning; Huang, Zhenyu

    2012-07-26

    Financial Transmission Rights (FTRs) are financial insurance tools that help power market participants reduce price risks associated with transmission congestion. FTRs are issued through a process of solving a constrained optimization problem whose objective is to maximize the FTR social welfare under power flow security constraints. Security constraints for different FTR categories (monthly, seasonal, or annual) are usually coupled, and the number of constraints increases exponentially with the number of categories. Commercial software for FTR calculation can provide only limited categories of FTRs due to the inherent computational challenges mentioned above. In this paper, first an innovative mathematical reformulation of the FTR problem is presented which dramatically improves the computational efficiency of the optimization problem. After reformulating the problem, a novel non-linear dynamic system (NDS) approach is proposed to solve it. The new formulation and the performance of the NDS solver are benchmarked against widely used linear programming (LP) solvers such as CPLEX™ and tested on both standard IEEE test systems and large-scale systems using data from the Western Electricity Coordinating Council (WECC). The performance of the NDS is shown to be comparable to, and in some cases better than, that of the widely used CPLEX algorithms. The proposed formulation and NDS-based solver are also easily parallelizable, enabling further computational improvement.
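
    To make the optimization concrete, a drastically reduced stand-in for the FTR auction can be posed as a linear program. The sketch below uses SciPy's linprog rather than the authors' NDS solver, and the bids, shift factors, and flow limit are invented for illustration:

      import numpy as np
      from scipy.optimize import linprog

      bids = np.array([30.0, 22.0])   # $/MW offered for two candidate FTRs
      shift = np.array([[0.6, 0.8]])  # MW of line flow per MW of each FTR
      limit = np.array([100.0])       # flow limit of the single modeled line

      # Maximize bid-weighted welfare subject to the flow limit;
      # linprog minimizes, so the objective is negated.
      res = linprog(c=-bids, A_ub=shift, b_ub=limit,
                    bounds=[(0, 120), (0, 120)], method="highs")
      print(res.x, -res.fun)  # awarded MW per FTR and total welfare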

  6. A driver linac for the Advanced Exotic Beam Laboratory : physics design and beam dynamics simulations.

    SciTech Connect

    Ostroumov, P. N.; Mustapha, B.; Nolen, J.; Physics

    2007-01-01

    The Advanced Exotic Beam Laboratory (AEBL) being developed at ANL consists of an 833 MV heavy-ion driver linac capable of producing uranium ions up to 200 MeV/u and protons to 580 MeV with 400 kW beam power. We have designed all accelerator components, including a two-charge-state LEBT, an RFQ, a MEBT, a superconducting linac, a stripper station, and a chicane. We present the results of an optimized linac design and end-to-end simulations including machine errors and detailed beam loss analysis. The Advanced Exotic Beam Laboratory (AEBL) has been proposed at ANL as a reduced-scale version of the original Rare Isotope Accelerator (RIA) project with about half the cost but the same beam power. AEBL will address 90% or more of RIA physics but with reduced multi-user capabilities. The focus of this paper is the physics design and beam dynamics simulations of the AEBL driver linac. The reported results are for a multiple-charge-state ²³⁸U beam.

  7. An ultrafast optics undergraduate advanced laboratory with a mode-locked fiber laser

    NASA Astrophysics Data System (ADS)

    Schaffer, Andrew; Fredrick, Connor; Hoyt, Chad; Jones, Jason

    2015-05-01

    We describe an ultrafast optics undergraduate advanced laboratory comprising a mode-locked erbium fiber laser, autocorrelation measurements, and an external, free-space parallel-grating dispersion compensation apparatus. The simple design of the stretched-pulse laser uses nonlinear polarization rotation mode-locking to produce pulses at a repetition rate of 55 MHz and an average power of 5.5 mW. Interferometric and intensity autocorrelation measurements are made using a Michelson interferometer that takes advantage of the two-photon nonlinear response of a common silicon photodiode for the second-order correlation between 1550 nm laser pulses. After a pre-amplifier and compression, pulse widths as narrow as 108 fs are measured at 17 mW average power. A detailed parts list includes previously owned and common components used by the telecommunications industry, which may bring the cost of the lab within reach of many undergraduate and graduate departments. We also describe progress toward a relatively low-cost optical frequency comb advanced laboratory. NSF EIR #1208930.
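
    For readers reproducing the measurement: an intensity autocorrelation trace is wider than the pulse it measures, so a pulse shape must be assumed to deconvolve it. For the sech² pulses typical of such fiber lasers, the standard relation is

      \tau_{\mathrm{pulse}} = \tau_{\mathrm{AC}} / 1.543

    where \tau_{\mathrm{AC}} is the FWHM of the intensity autocorrelation (the factor is about 1.414 for a Gaussian pulse). The abstract does not state whether the quoted 108 fs is a raw or deconvolved width.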

  8. Vision 20/20: Automation and advanced computing in clinical radiation oncology

    SciTech Connect

    Moore, Kevin L. Moiseenko, Vitali; Kagadis, George C.; McNutt, Todd R.; Mutic, Sasa

    2014-01-15

    This Vision 20/20 paper considers what computational advances are likely to be implemented in clinical radiation oncology in the coming years and how the adoption of these changes might alter the practice of radiotherapy. Four main areas of likely advancement are explored: cloud computing, aggregate data analyses, parallel computation, and automation. As these developments promise both new opportunities and new risks to clinicians and patients alike, the potential benefits are weighed against the hazards associated with each advance, with special considerations regarding patient safety under new computational platforms and methodologies. While the concerns of patient safety are legitimate, the authors contend that progress toward next-generation clinical informatics systems will bring about extremely valuable developments in quality improvement initiatives, clinical efficiency, outcomes analyses, data sharing, and adaptive radiotherapy.

  9. Advanced detector development, laboratory simulations, diagnostic development, and data analysis on wake physics

    NASA Astrophysics Data System (ADS)

    Talbet, John M.; Chan, Chung

    1994-06-01

    We have performed a number of experiments and have developed a theoretical analysis of the current scaling issues of a negatively biased probe in a plasma wake. The end product of the laboratory research performed in the large vacuum chamber JUMBO at Phillips Laboratory can be found in the associated publication. Theoretical assumptions have been analyzed qualitatively and quantitatively by the use of the POLAR computer code. For a plasma with a constant beam energy, the analysis shows that the I-V collection characteristics are unaffected by Mach variations for Mach numbers above the value of two. For the same case, the current was found to scale with the 3/7 power of the density.
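
    Written out, the reported density scaling of the collected current is

      I \propto n^{3/7}

    so, for example, doubling the plasma density raises the collected current by a factor of 2^{3/7} ≈ 1.35 (here n is taken to mean the ambient plasma density, an assumption the abstract does not spell out).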

  10. Advances in the design, development, and deployment of the U.S. Army Research Laboratory (ARL) multimodal signatures database

    NASA Astrophysics Data System (ADS)

    Bennett, Kelly; Robertson, James

    2011-06-01

    Recent advances in the design, development, and deployment of the U.S. Army Research Laboratory's (ARL) Multimodal Signature Database (MMSDB) create a state-of-the-art database system with Web-based access through a Web interface designed specifically for research and development. Tens of thousands of signatures are currently available to researchers to support their algorithm development and refinement for sensors and other security systems. Each dataset is stored in Hierarchical Data Format 5 (HDF5) for easy modeling and storing of signatures and archived sensor data, ground truth, calibration information, algorithms, and other documentation. Archived HDF5-formatted data provides the basis for computational interoperability across a variety of tools including MATLAB, Octave, and Python. The database has a Web-based front end with public and restricted access interfaces, along with 24/7 availability and support. This paper describes the overall design of the system and the recent enhancements and future vision, including the ability for researchers to share algorithms, data, and documentation in the cloud, and the ability to run algorithms and software for testing and evaluation purposes remotely across multiple domains and computational tools. The paper also describes in detail the HDF5 format for several multimodal sensor types.
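
    The abstract does not publish the MMSDB schema; the Python sketch below shows, in generic h5py terms, how a signature with ground truth and calibration metadata might be archived in HDF5. Every group, dataset, and attribute name here is hypothetical:

      import numpy as np
      import h5py

      # Archive one acoustic signature together with its metadata.
      with h5py.File("signature_0001.h5", "w") as f:
          sig = f.create_dataset("acoustic/waveform", data=np.random.randn(48000))
          sig.attrs["sample_rate_hz"] = 48000
          sig.attrs["sensor"] = "microphone array, channel 3"
          f["acoustic"].attrs["calibration_db"] = -42.0
          f.create_dataset("ground_truth/target_class", data="wheeled vehicle")

      # MATLAB, Octave, and Python readers can all open the same file, which is
      # the cross-tool interoperability point made in the abstract.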

  11. Virtual earthquake engineering laboratory with physics-based degrading materials on parallel computers

    NASA Astrophysics Data System (ADS)

    Cho, In Ho

    For the last few decades, we have obtained tremendous insight into the underlying microscopic mechanisms of degrading quasi-brittle materials from persistent and near-saintly efforts in laboratories, and at the same time we have seen unprecedented evolution in computational technology such as massively parallel computers. Thus, the time is ripe to embark on a novel approach to settle unanswered questions, especially for the earthquake engineering community, by harmoniously combining the microphysics mechanisms with advanced parallel computing technology. To begin with, it should be stressed that we placed a great deal of emphasis on preserving clear meaning and physical counterparts for all the microscopic material models proposed herein, reflecting the belief that the more physical mechanisms we incorporate, the better the prediction we can obtain. We began by reviewing representative microscopic analysis methodologies, selecting the "fixed-type" multidirectional smeared crack model as the base framework for nonlinear quasi-brittle materials, since it is widely believed to best retain the physical nature of actual cracks. Microscopic stress functions are proposed by integrating well-received existing models to update normal stresses on the crack surfaces (three orthogonal surfaces are allowed to initiate herein) under cyclic loading. Unlike the normal stress update, special attention had to be paid to the shear stress update on the crack surfaces, due primarily to the well-known pathological nature of the fixed-type smeared crack model: spurious large stress transfer over the open crack under nonproportional loading. In hopes of exploiting a physical mechanism to resolve this deleterious nature of the fixed crack model, a tribology-inspired three-dimensional (3D) interlocking mechanism has been proposed. Following the main trend of tribology (i.e., the science and engineering of interacting surfaces), we introduced the base fabric of solid

  12. Improved computational neutronics methods and validation protocols for the advanced test reactor

    SciTech Connect

    Nigg, D. W.; Nielsen, J. W.; Chase, B. M.; Murray, R. K.; Steuhm, K. A.; Unruh, T.

    2012-07-01

    The Idaho National Laboratory (INL) is in the process of updating the various reactor physics modeling and simulation tools used to support operation and safety assurance of the Advanced Test Reactor (ATR). Key accomplishments so far have encompassed both computational as well as experimental work. A new suite of stochastic and deterministic transport theory based reactor physics codes and their supporting nuclear data libraries (HELIOS, KENO6/SCALE, NEWT/SCALE, ATTILA, and an extended implementation of MCNP5) has been installed at the INL. Corresponding models of the ATR and ATRC are now operational with all five codes, demonstrating the basic feasibility of the new code packages for their intended purposes. On the experimental side of the project, new hardware was fabricated, measurement protocols were finalized, and the first four of six planned physics code validation experiments based on neutron activation spectrometry have been conducted at the ATRC facility. Data analysis for the first three experiments, focused on characterization of the neutron spectrum in one of the ATR flux traps, has been completed. The six experiments will ultimately form the basis for flexible and repeatable ATR physics code validation protocols that are consistent with applicable national standards. (authors)

  13. IMPROVED COMPUTATIONAL NEUTRONICS METHODS AND VALIDATION PROTOCOLS FOR THE ADVANCED TEST REACTOR

    SciTech Connect

    David W. Nigg; Joseph W. Nielsen; Benjamin M. Chase; Ronnie K. Murray; Kevin A. Steuhm

    2012-04-01

    The Idaho National Laboratory (INL) is in the process of modernizing the various reactor physics modeling and simulation tools used to support operation and safety assurance of the Advanced Test Reactor (ATR). Key accomplishments so far have encompassed both computational as well as experimental work. A new suite of stochastic and deterministic transport theory based reactor physics codes and their supporting nuclear data libraries (HELIOS, KENO6/SCALE, NEWT/SCALE, ATTILA, and an extended implementation of MCNP5) has been installed at the INL. Corresponding models of the ATR and ATRC are now operational with all five codes, demonstrating the basic feasibility of the new code packages for their intended purpose. Of particular importance, a set of as-run core depletion HELIOS calculations for all ATR cycles since August 2009 was successfully completed during 2011. This demonstration supported a decision late in the year to proceed with the phased incorporation of the HELIOS methodology into the ATR fuel cycle management process beginning in 2012. On the experimental side of the project, new hardware was fabricated, measurement protocols were finalized, and the first four of six planned physics code validation experiments based on neutron activation spectrometry were conducted at the ATRC facility. Data analysis for the first three experiments, focused on characterization of the neutron spectrum in one of the ATR flux traps, has been completed. The six experiments will ultimately form the basis for a flexible, easily-repeatable ATR physics code validation protocol that is consistent with applicable ASTM standards.
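
    For readers unfamiliar with neutron activation spectrometry, the underlying computation is a spectrum unfolding from measured reaction rates through a response matrix. The sketch below is the generic textbook formulation with made-up numbers, not the INL/ATR analysis chain.

    ```python
    # Generic activation-spectrometry unfolding sketch: measured foil
    # activities satisfy a ~= R @ phi, where R[i, j] is the response of
    # reaction i to flux group j. All values are hypothetical placeholders.
    import numpy as np
    from scipy.optimize import nnls

    R = np.array([[0.90, 0.30, 0.05],
                  [0.20, 0.80, 0.10],
                  [0.00, 0.30, 0.90]])
    a = np.array([1.2, 0.9, 0.5])

    phi, resid = nnls(R, a)   # nonnegative least squares keeps fluxes physical
    print(phi, resid)
    ```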

  14. Computer-Assisted Photo Interpretation Research At United States Army Engineer Topographic Laboratories (USAETL)

    NASA Astrophysics Data System (ADS)

    Lukes, George E.

    1981-11-01

    A program in computer-assisted photo interpretation research (CAPIR) has been initiated at the U.S. Army Engineer Topographic Laboratories. In a new laboratory, a photo interpreter (PI) analyzing high-resolution, aerial photography interfaces directly to a digital computer and geographic information system (GIS). A modified analytical plotter enables the PI to transmit encoded three-dimensional spatial data from the stereomodel to the computer. Computer-generated graphics are displayed in the stereomodel for direct feedback of digital spatial data to the PI. Initial CAPIR capabilities include point positioning, mensuration, stereoscopic area search, GIS creation and playback, and elevation data extraction. New capabilities under development include stereo graphic superposition, a digital image workstation, and integration of panoramic Optical Bar Camera photography as a primary GIS data source. This project has been conceived as an evolutionary approach to the digital cartographic feature extraction problem. As a working feature extraction system, the CAPIR laboratory can serve as a testbed for new concepts emerging from image understanding and knowledge-based systems research.

  15. Computational Approaches to Enhance Nanosafety and Advance Nanomedicine

    NASA Astrophysics Data System (ADS)

    Mendoza, Eduardo R.

    With the increasing use of nanoparticles in food processing, filtration/purification, and consumer products, as well as the huge potential of their use in nanomedicine, a quantitative understanding of the effects of nanoparticle uptake and transport is needed. We provide examples of novel methods for modeling complex bio-nano interactions which are based on stochastic process algebras. Since model construction presumes sufficient availability of experimental data, recent developments in "nanoinformatics", an emerging discipline analogous to bioinformatics that is building an accessible information infrastructure, are subsequently discussed. Both computational areas offer opportunities for Filipinos to engage in collaborative, cutting-edge research in this impactful field.

  16. An integrated computer system for preliminary design of advanced aircraft.

    NASA Technical Reports Server (NTRS)

    Fulton, R. E.; Sobieszczanski, J.; Landrum, E. J.

    1972-01-01

    A progress report is given on the first phase of a research project to develop a system of Integrated Programs for Aerospace-Vehicle Design (IPAD), which is intended to automate to the largest extent possible the preliminary and detailed design of advanced aircraft. The approach used is to build a pilot system and simultaneously to carry out two major contractual studies to define a practical IPAD system preparatory to programming. The paper summarizes the specifications and goals of the IPAD system, the progress to date, and the conclusions reached regarding its feasibility and scope. Sample calculations obtained with the pilot system are given for aircraft preliminary designs optimized with respect to discipline parameters, such as weight or L/D, and these results are compared with designs optimized with respect to overall performance parameters, such as range or payload.

  17. Lawrence Livermore National Laboratories Perspective on Code Development and High Performance Computing Resources in Support of the National HED/ICF Effort

    SciTech Connect

    Clouse, C. J.; Edwards, M. J.; McCoy, M. G.; Marinak, M. M.; Verdon, C. P.

    2015-07-07

    Through its Advanced Scientific Computing (ASC) and Inertial Confinement Fusion (ICF) code development efforts, Lawrence Livermore National Laboratory (LLNL) provides a world-leading numerical simulation capability for the national HED/ICF program in support of the Stockpile Stewardship Program (SSP). In addition, the ASC effort provides the high-performance computing platforms on which these codes are run. LLNL remains committed to, and will work with, the national HED/ICF program community to help ensure that numerical simulation needs are met and to make those capabilities available, consistent with programmatic priorities and available resources.

  18. Design and Laboratory Evaluation of Future Elongation and Diameter Measurements at the Advanced Test Reactor

    SciTech Connect

    K. L. Davis; D. L. Knudson; J. L. Rempe; J. C. Crepeau; S. Solstad

    2015-07-01

    New materials are being considered for fuel, cladding, and structures in next-generation and existing nuclear reactors. Such materials can undergo significant dimensional and physical changes during high-temperature irradiation. In order to accurately predict these changes, real-time data must be obtained under prototypic irradiation conditions for model development and validation. To provide such data, researchers at the Idaho National Laboratory (INL) High Temperature Test Laboratory (HTTL) are developing several instrumented test rigs to obtain real-time data from specimens irradiated in well-controlled pressurized water reactor (PWR) coolant conditions in the Advanced Test Reactor (ATR). This paper reports the status of INL efforts to develop and evaluate prototype test rigs that rely on Linear Variable Differential Transformers (LVDTs) in laboratory settings. Although similar LVDT-based test rigs have been deployed in lower-flux Materials Testing Reactors (MTRs), this effort is unique because it relies on robust LVDTs that can withstand higher temperatures and higher fluxes than often found in other MTR irradiations. Specifically, the test rigs are designed for detecting changes in length and diameter of specimens irradiated in ATR PWR loops. Once implemented, these test rigs will provide ATR users with unique capabilities that are sorely needed to obtain measurements such as elongation caused by thermal expansion and/or creep loading and diameter changes associated with fuel and cladding swelling, pellet-clad interaction, and crud buildup.
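
    As background on the sensing principle (the INL signal chain itself is not described in this record), an LVDT's two secondary-coil voltages can be converted ratiometrically to core displacement; the sensitivity constant below is a hypothetical placeholder.

    ```python
    # Ratiometric LVDT demodulation sketch: the difference of the two
    # secondary voltages, normalized by their sum, is proportional to
    # core displacement. The sensitivity value is hypothetical.
    def lvdt_displacement_mm(v_sec_a, v_sec_b, sensitivity_mm=2.5):
        ratio = (v_sec_a - v_sec_b) / (v_sec_a + v_sec_b)
        return sensitivity_mm * ratio

    print(lvdt_displacement_mm(2.6, 2.4))   # small positive elongation
    ```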

  19. 2015 Final Reports from the Los Alamos National Laboratory Computational Physics Student Summer Workshop

    SciTech Connect

    Runnels, Scott Robert; Caldwell, Wendy; Brown, Barton Jed; Pederson, Clark; Brown, Justin; Burrill, Daniel; Feinblum, David; Hyde, David; Levick, Nathan; Lyngaas, Isaac; Maeng, Brad; Reed, Richard LeRoy; Sarno-Smith, Lois; Shohet, Gil; Skarda, Jinhie; Stevens, Josey; Zeppetello, Lucas; Grossman-Ponemon, Benjamin; Bottini, Joseph Larkin; Loudon, Tyson Shane; VanGessel, Francis Gilbert; Nagaraj, Sriram; Price, Jacob

    2015-10-15

    The two primary purposes of LANL's Computational Physics Student Summer Workshop are (1) to educate graduate and exceptional undergraduate students in the challenges and applications of computational physics of interest to LANL, and (2) to entice their interest toward those challenges. Computational physics is emerging as a discipline in its own right, combining expertise in mathematics, physics, and computer science. The mathematical aspects focus on numerical methods for solving equations on the computer as well as developing test problems with analytical solutions. The physics aspects are very broad, ranging from low-temperature material modeling to extremely high-temperature plasma physics, radiation transport, and neutron transport. The computer science issues are concerned with matching numerical algorithms to emerging architectures and maintaining the quality of extremely large codes built to perform multi-physics calculations. Although graduate programs associated with computational physics are emerging, it is apparent that the pool of U.S. citizens in this multi-disciplinary field is relatively small and is typically not focused on the aspects that are of primary interest to LANL. Furthermore, a more structured foundation for LANL interaction with universities in computational physics is needed; historically, interactions have relied heavily on individuals' personalities and personal contacts. Thus a tertiary purpose of the Summer Workshop is to build an educational network of LANL researchers, university professors, and emerging students to advance the field and LANL's involvement in it. This report includes both the background for the program and the reports from the students.

  1. Restructuring the introductory physics lab with the addition of computer-based laboratories.

    PubMed

    Pierri-Galvao, Monica

    2011-07-01

    Nowadays, data acquisition software and sensors are widely used in introductory physics laboratories. They allow students to spend more time exploring the data collected by the computer and hence to focus more on the physical concepts. Very often, a faculty member is faced with the challenge of updating or introducing a microcomputer-based laboratory (MBL) at his or her institution. This article provides a list of experiments and the equipment needed to convert about half of the traditional labs in a 1-year introductory physics course into MBLs.

  2. Comparison of nonmesonic hypernuclear decay rates computed in laboratory and center-of-mass coordinates

    SciTech Connect

    De Conti, C.; Barbero, C.; Galeão, A. P.; Krmpotić, F.

    2014-11-11

    In this work we compute the one-nucleon-induced nonmesonic hypernuclear decay rates of ⁵ΛHe, ¹²ΛC and ¹³ΛC using a formalism based on the independent particle shell model in terms of laboratory coordinates. To ascertain the correctness and precision of the method, these results are compared with those obtained using a formalism in terms of center-of-mass coordinates, which has been previously reported in the literature. The formalism in terms of laboratory coordinates will be useful in the shell-model approach to two-nucleon-induced transitions.

  3. Inference on arthropod demographic parameters: computational advances using R.

    PubMed

    Maia, Aline De Holanda Nunes; Pazianotto, Ricardo Antonio De Almeida; Luiz, Alfredo José Barreto; Marinho-Prado, Jeanne Scardini; Pervez, Ahmad

    2014-02-01

    We developed a computer program for life table analysis using the open-source, free software programming environment R. It is useful for quantifying chronic nonlethal effects of treatments on arthropod populations by summarizing information on their survival and fertility in key population parameters referred to as fertility life table parameters. Statistical inference on fertility life table parameters is not trivial because it requires computationally intensive methods for variance estimation. Our code has several advantages with respect to a previous program developed in the Statistical Analysis System: additional multiple comparison tests were incorporated for the analysis of qualitative factors; a module for regression analysis was implemented, allowing analysis of quantitative factors such as temperature or agrochemical doses; and availability is guaranteed for users, since it was developed in an open-source, free software programming environment. To illustrate the descriptive and inferential analysis implemented in lifetable.R, we present and discuss two examples: 1) a study quantifying the influence of the proteinase inhibitor berenil on the eucalyptus defoliator Thyrinteina arnobia (Stoll) and 2) a study investigating the influence of temperature on demographic parameters of a predaceous ladybird, Hippodamia variegata (Goeze). PMID:24665730
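
    lifetable.R itself is written in R; as a language-neutral illustration of the core computation, the sketch below solves the Euler-Lotka equation for the intrinsic rate of increase from a survival/fertility schedule (the schedules shown are hypothetical).

    ```python
    # Sketch: the intrinsic rate of increase r solves the Euler-Lotka
    # equation  sum_x exp(-r*x) * l_x * m_x = 1, where l_x is survivorship
    # to age x and m_x is age-specific fertility. Data are hypothetical.
    import numpy as np
    from scipy.optimize import brentq

    x = np.array([10.0, 12.0, 14.0, 16.0, 18.0])   # age (days)
    lx = np.array([1.0, 0.9, 0.8, 0.6, 0.3])       # survivorship
    mx = np.array([0.0, 2.0, 5.0, 4.0, 1.0])       # fertility

    def euler_lotka(r):
        return np.sum(np.exp(-r * x) * lx * mx) - 1.0

    r = brentq(euler_lotka, -1.0, 1.0)             # bracketing root search
    print(f"intrinsic rate of increase r = {r:.4f}")
    ```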

  4. Recent advances in computer camera methods for machine vision

    NASA Astrophysics Data System (ADS)

    Olson, Gaylord G.; Walker, Jo N.

    1998-10-01

    During the past year, several new computer camera methods (hardware and software) have been developed which have applications in machine vision. These are described below, along with some test results. The improvements are generally in the direction of higher speed and greater parallelism. A PCI interface card has been designed which is adaptable to multiple CCD types, both color and monochrome. A newly designed A/D converter allows a choice of 8- or 10-bit conversion resolution and a choice of two different analog inputs. Thus, by using four of these converters feeding the 32-bit PCI data bus, up to 8 camera heads can be used with a single PCI card, and four camera heads can be operated in parallel. The card has been designed so that any of 8 different CCD types can be used with it (6 monochrome and 2 color CCDs), ranging in resolution from 192 by 165 pixels up to 1134 by 972 pixels. In the area of software, a method has been developed to better utilize the decision-making capability of the computer along with the sub-array scan capabilities of many CCDs. Specifically, it is shown how to achieve a dual-scan-mode camera system wherein one scan mode is a low-density, high-speed scan of the complete image area, and a higher-density sub-array scan is used in those areas where changes have been observed. The name given to this technique is adaptive sub-array scanning.
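
    The dual-scan idea reduces to a short loop: a sparse full-frame pass flags changed regions, and only those regions are rescanned densely. The camera API and thresholds in this sketch are hypothetical, not the hardware interface described in the paper.

    ```python
    # Adaptive sub-array scanning sketch. `camera` and its methods are
    # hypothetical stand-ins for a CCD that supports sub-array readout.
    import numpy as np

    STEP, BLOCK, THRESH = 4, 32, 12.0

    def adaptive_scan(camera, previous_lowres):
        lowres = camera.scan_full(step=STEP)            # fast, low-density pass
        diff = np.abs(lowres.astype(float) - previous_lowres)
        ys, xs = np.where(diff > THRESH)                # cells that changed
        for y, x in zip(ys, xs):                        # dense pass where needed
            camera.scan_subarray(x0=x * STEP, y0=y * STEP,
                                 w=BLOCK, h=BLOCK, step=1)
        return lowres
    ```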

  5. Computing environment for the ASSIST data warehouse at Lawrence Livermore National Laboratory

    SciTech Connect

    Shuk, K.

    1995-11-01

    The current computing environment for the ASSIST data warehouse at Lawrence Livermore National Laboratory is that of a central server accessed by a terminal or terminal emulator. The initiative to move to a client/server environment is strong, backed by desktop machines that are becoming more and more powerful. The desktop machines can now take on parts of tasks once run entirely on the central server, making the whole environment computationally more efficient. Services are tasks that are repeated throughout the environment such that it makes sense to share them; tasks such as email, user authentication, and file transfer are services. The new client/server environment needs to determine which services must be included for basic functionality. These services then unify the computing environment, not only for the forthcoming ASSIST+ but for Administrative Information Systems as a whole, joining various server platforms with heterogeneous desktop computing platforms.

  6. ADVANCED COMPUTATIONAL MODEL FOR THREE-PHASE SLURRY REACTORS

    SciTech Connect

    Goodarz Ahmadi

    2001-10-01

    In the second year of the project, the Eulerian-Lagrangian formulation for analyzing three-phase slurry flows in a bubble column was further developed. The approach uses an Eulerian analysis of liquid flows in the bubble column and makes use of the Lagrangian trajectory analysis for the bubble and particle motions. An experimental setup for studying a two-dimensional bubble column was also developed. The operation of the bubble column is being tested and a diagnostic methodology for quantitative measurements is being developed. An Eulerian computational model for the flow condition in the two-dimensional bubble column is also being developed. The liquid and bubble motions are being analyzed and the results are being compared with measurements from the experimental setup. Solid-fluid mixture flows in ducts and passages at different angles of orientation were analyzed. The model predictions were compared with the experimental data and good agreement was found. Gravity chute flows of solid-liquid mixtures are also being studied. Further progress was made in developing a thermodynamically consistent model for multiphase slurry flows with and without chemical reaction in a state of turbulent motion. The balance laws are obtained and the constitutive laws are being developed. Progress was also made in measuring the concentration and velocity of particles of different sizes near a wall in a duct flow; the technique of Phase-Doppler anemometry was used in these studies. The general objective of this project is to provide the needed fundamental understanding of three-phase slurry reactors in Fischer-Tropsch (F-T) liquid fuel synthesis. The other main goal is to develop a computational capability for predicting the transport and processing of three-phase coal slurries. The specific objectives are: (1) to develop a thermodynamically consistent rate-dependent anisotropic model for multiphase slurry flows with and without chemical reaction for application to coal liquefaction. Also establish the

  7. When Learning about the Real World is Better Done Virtually: A Study of Substituting Computer Simulations for Laboratory Equipment

    ERIC Educational Resources Information Center

    Finkelstein, N. D.; Adams, W. K.; Keller, C. J.; Kohl, P. B.; Perkins, K. K.; Podolefsky, N. S.; Reid, S.; LeMaster, R.

    2005-01-01

    This paper examines the effects of substituting a computer simulation for real laboratory equipment in the second semester of a large-scale introductory physics course. The direct current circuit laboratory was modified to compare the effects of using computer simulations with the effects of using real light bulbs, meters, and wires. Two groups of…

  8. National Energy Research Scientific Computing Center (NERSC): Advancing the frontiers of computational science and technology

    SciTech Connect

    Hules, J.

    1996-11-01

    National Energy Research Scientific Computing Center (NERSC) provides researchers with high-performance computing tools to tackle science's biggest and most challenging problems. Founded in 1974 by DOE/ER, the Controlled Thermonuclear Research Computer Center was the first unclassified supercomputer center and was the model for those that followed. Over the years the center's name was changed to the National Magnetic Fusion Energy Computer Center and then to NERSC; it was relocated to LBNL. NERSC, one of the largest unclassified scientific computing resources in the world, is the principal provider of general-purpose computing services to DOE/ER programs: Magnetic Fusion Energy, High Energy and Nuclear Physics, Basic Energy Sciences, Health and Environmental Research, and the Office of Computational and Technology Research. NERSC users are a diverse community located throughout the US and in several foreign countries. This brochure describes: the NERSC advantage, its computational resources and services, future technologies, scientific resources, and computational science of scale (interdisciplinary research over a decade or longer; examples: combustion in engines, waste management chemistry, global climate change modeling).

  9. 3D chemical imaging in the laboratory by hyperspectral X-ray computed tomography

    PubMed Central

    Egan, C. K.; Jacques, S. D. M.; Wilson, M. D.; Veale, M. C.; Seller, P.; Beale, A. M.; Pattrick, R. A. D.; Withers, P. J.; Cernik, R. J.

    2015-01-01

    We report the development of laboratory based hyperspectral X-ray computed tomography which allows the internal elemental chemistry of an object to be reconstructed and visualised in three dimensions. The method employs a spectroscopic X-ray imaging detector with sufficient energy resolution to distinguish individual elemental absorption edges. Elemental distributions can then be made by K-edge subtraction, or alternatively by voxel-wise spectral fitting to give relative atomic concentrations. We demonstrate its application to two material systems: studying the distribution of catalyst material on porous substrates for industrial scale chemical processing; and mapping of minerals and inclusion phases inside a mineralised ore sample. The method makes use of a standard laboratory X-ray source with measurement times similar to that required for conventional computed tomography. PMID:26514938
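
    Schematically, K-edge subtraction is a per-voxel difference between attenuation volumes reconstructed just above and just below an element's absorption edge; the sketch below illustrates only that arithmetic (array names are hypothetical).

    ```python
    # K-edge subtraction sketch: voxels whose attenuation jumps across an
    # element's K-edge energy contain that element. mu_below / mu_above
    # are reconstructed attenuation volumes from narrow energy bins.
    import numpy as np

    def k_edge_map(mu_below, mu_above, noise_floor=0.0):
        delta = mu_above - mu_below        # positive jump marks the element
        return np.clip(delta, noise_floor, None)
    ```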

  10. 3D chemical imaging in the laboratory by hyperspectral X-ray computed tomography.

    PubMed

    Egan, C K; Jacques, S D M; Wilson, M D; Veale, M C; Seller, P; Beale, A M; Pattrick, R A D; Withers, P J; Cernik, R J

    2015-01-01

    We report the development of laboratory based hyperspectral X-ray computed tomography which allows the internal elemental chemistry of an object to be reconstructed and visualised in three dimensions. The method employs a spectroscopic X-ray imaging detector with sufficient energy resolution to distinguish individual elemental absorption edges. Elemental distributions can then be made by K-edge subtraction, or alternatively by voxel-wise spectral fitting to give relative atomic concentrations. We demonstrate its application to two material systems: studying the distribution of catalyst material on porous substrates for industrial scale chemical processing; and mapping of minerals and inclusion phases inside a mineralised ore sample. The method makes use of a standard laboratory X-ray source with measurement times similar to that required for conventional computed tomography. PMID:26514938

  11. Strategies for casualty mitigation programs by using advanced tsunami computation

    NASA Astrophysics Data System (ADS)

    IMAI, K.; Imamura, F.

    2012-12-01

    1. Purpose of the study: In this study, based on scenarios of great earthquakes along the Nankai trough, we aim to estimate the run-up and the inundation process of tsunami with high accuracy in coastal areas, including rivers. Using a practical tsunami analytical model that accounts for detailed topography, land use, and climate change in a realistic present and expected future environment, we examined the run-up and tsunami inundation process. From these results we estimated the damage due to tsunami and obtained information for the mitigation of human casualties. Considering the time series from the occurrence of the earthquake and the risk of tsunami damage, we propose contents of disaster risk information to be displayed in a tsunami hazard and risk map. 2. Creating a tsunami hazard and risk map: From the practical analytical tsunami model (a long-wave approximated model) and high-resolution topography (5 m), including detailed data on shorelines, rivers, buildings, and houses, we present an advanced analysis of tsunami inundation that considers land use. Based on the inundation results and their analysis, it is possible to draw a tsunami hazard and risk map with information on human casualties, building damage estimates, drifting vehicles, etc. 3. Contents of disaster prevention information: To improve the distribution of hazard, risk, and evacuation information, three steps are necessary. (1) Provide basic information such as tsunami arrival details, areas and routes for evacuation, and locations of tsunami evacuation facilities. (2) Provide, as additional information, the time when inundation starts, observed inundation results, locations of facilities with hazardous materials, and the presence or absence of public facilities and underground areas that require evacuation. (3) Provide information to support disaster response, such as predictions of infrastructure and traffic network damage.
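
    The "long-wave approximated model" referred to here is conventionally the nonlinear shallow-water system; a standard one-dimensional form with Manning bed friction (not necessarily the authors' exact discretization) is:

    ```latex
    % 1-D nonlinear shallow-water (long-wave) equations with bed friction.
    %   eta: free-surface elevation, h: still-water depth, u: depth-averaged
    %   velocity, g: gravity, n: Manning roughness coefficient.
    \frac{\partial \eta}{\partial t}
      + \frac{\partial}{\partial x}\bigl[(h+\eta)\,u\bigr] = 0,
    \qquad
    \frac{\partial u}{\partial t} + u\,\frac{\partial u}{\partial x}
      + g\,\frac{\partial \eta}{\partial x}
      + \frac{g\,n^{2}\,u\,\lvert u\rvert}{(h+\eta)^{4/3}} = 0
    ```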

  12. Assessment of physical activity with the Computer Science and Applications, Inc., accelerometer: laboratory versus field validation.

    PubMed

    Nichols, J F; Morgan, C G; Chabot, L E; Sallis, J F; Calfas, K J

    2000-03-01

    Our purpose was to compare the validity of the Computer Science and Applications (CSA), Inc., accelerometer in laboratory and field settings and to establish CSA count ranges for light, moderate, and vigorous physical activity. Validity was determined in 60 adults during treadmill exercise, using oxygen consumption (VO2) as the criterion measure, while 30 adults walked and jogged outdoors on a 400-m track. The relationship between CSA counts and VO2 was linear (R² = .89, SEE = 3.72 ml·kg⁻¹·min⁻¹), as was the relationship between velocity and counts in the field (R² = .89, SEE = 0.89 mi·hr⁻¹). However, significant differences were found (p < .05) between laboratory and field measures of CSA counts for light and vigorous intensity. We conclude that the CSA can be used to quantify walking and jogging outdoors on level ground; however, laboratory equations may not be appropriate for use in field settings, particularly for light and vigorous activity.
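
    In practice, studies of this kind fit a linear counts-to-VO2 regression and then classify intensity by count cut-points. The sketch below shows that workflow with placeholder numbers; the study's own cut-points and data are in the paper.

    ```python
    # Sketch: linear regression of VO2 on accelerometer counts, then
    # intensity classification by count cut-points. All numbers here are
    # hypothetical placeholders, not the study's values.
    import numpy as np

    counts = np.array([500, 1800, 4200, 7600, 9900])
    vo2 = np.array([6.1, 12.3, 21.8, 33.0, 41.2])      # ml/kg/min

    slope, intercept = np.polyfit(counts, vo2, 1)      # least-squares line

    def intensity(c, light_max=1900, moderate_max=5700):
        if c <= light_max:
            return "light"
        return "moderate" if c <= moderate_max else "vigorous"

    print(intensity(3000), slope, intercept)
    ```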

  13. CART V: recent advancements in computer-aided camouflage assessment

    NASA Astrophysics Data System (ADS)

    Müller, Thomas; Müller, Markus

    2011-05-01

    In order to facilitate systematic, computer-aided improvements of camouflage and concealment assessment methods, the software system CART (Camouflage Assessment in Real-Time) was built for the camouflage assessment of objects in multispectral image sequences (see contributions to SPIE 2007-2010 [1], [2], [3], [4]). It comprises semi-automatic marking of target objects (ground-truth generation), including their propagation over the image sequence, evaluation via user-defined feature extractors, and methods to assess an object's movement conspicuity. In this fifth part of an annual series at the SPIE conference in Orlando, this paper presents the enhancements of the past year and addresses the camouflage assessment of static and moving objects in multispectral image data that may show noise or image artefacts. The presented methods fathom the correlations between image processing and camouflage assessment. A novel algorithm based on template matching is presented to assess the structural inconspicuity of an object objectively and quantitatively. The results can easily be combined with an MTI (moving target indication) based movement-conspicuity assessment function in order to explore the influence of object movement on a camouflage effect in different environments. As the results show, the presented methods contribute a significant benefit to the field of camouflage assessment.
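
    A template-matching score of the kind described can be sketched with OpenCV's normalized cross-correlation; CART's actual feature extractors are user-defined, so this is only an illustration of the principle.

    ```python
    # Sketch: structural inconspicuity via template matching. A target
    # patch that correlates strongly with the surrounding background
    # blends in; a low best-match score indicates a conspicuous object.
    import cv2
    import numpy as np

    def structural_inconspicuity(image_gray, target_box):
        x, y, w, h = target_box
        patch = image_gray[y:y + h, x:x + w]
        background = image_gray.copy()
        background[y:y + h, x:x + w] = 0      # crude mask of the target itself
        scores = cv2.matchTemplate(background, patch, cv2.TM_CCOEFF_NORMED)
        return float(scores.max())            # ~1.0 means it blends in well
    ```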

  14. The CT Scanner Facility at Stellenbosch University: An open access X-ray computed tomography laboratory

    NASA Astrophysics Data System (ADS)

    du Plessis, Anton; le Roux, Stephan Gerhard; Guelpa, Anina

    2016-10-01

    The Stellenbosch University CT Scanner Facility is an open access laboratory providing non-destructive X-ray computed tomography (CT) and high-performance image analysis services as part of the Central Analytical Facilities (CAF) of the university. Based in Stellenbosch, South Africa, this facility offers open access to the general user community, including local researchers, companies, and remote users (both local and international, via sample shipment and data transfer). The laboratory hosts two CT instruments: a micro-CT system and a nano-CT system. A workstation-based Image Analysis Centre is equipped with numerous computers running data analysis software packages, which are at the disposal of facility users, along with expert supervision if required. All research disciplines are accommodated at the X-ray CT laboratory, provided that non-destructive analysis will be beneficial. During its first four years, the facility accommodated more than 400 unique users (33 in 2012; 86 in 2013; 154 in 2014; 140 in 2015; 75 in the first half of 2016), with diverse industrial and research applications using X-ray CT. This paper summarises the laboratory's first four years by way of selected examples, from both published and unpublished projects, and presents a detailed description of the capabilities and facilities available to users.

  15. Computational Protein Engineering: Bridging the Gap between Rational Design and Laboratory Evolution

    PubMed Central

    Barrozo, Alexandre; Borstnar, Rok; Marloie, Gaël; Kamerlin, Shina Caroline Lynn

    2012-01-01

    Enzymes are tremendously proficient catalysts, which can be used as extracellular catalysts for a whole host of processes, from chemical synthesis to the generation of novel biofuels. For them to be more amenable to the needs of biotechnology, however, it is often necessary to be able to manipulate their physico-chemical properties in an efficient and streamlined manner, and, ideally, to be able to train them to catalyze completely new reactions. Recent years have seen an explosion of interest in different approaches to achieve this, both in the laboratory, and in silico. There remains, however, a gap between current approaches to computational enzyme design, which have primarily focused on the early stages of the design process, and laboratory evolution, which is an extremely powerful tool for enzyme redesign, but will always be limited by the vastness of sequence space combined with the low frequency for desirable mutations. This review discusses different approaches towards computational enzyme design and demonstrates how combining newly developed screening approaches that can rapidly predict potential mutation “hotspots” with approaches that can quantitatively and reliably dissect the catalytic step can bridge the gap that currently exists between computational enzyme design and laboratory evolution studies. PMID:23202907

  16. Computer-simulated laboratory explorations for middle school life, earth, and physical Science

    NASA Astrophysics Data System (ADS)

    von Blum, Ruth

    1992-06-01

    Explorations in Middle School Science is a set of 72 computer-simulated laboratory lessons in life, earth, and physical science for grades 6-9, developed by Jostens Learning Corporation with grants from the California State Department of Education and the National Science Foundation. At the heart of each lesson is a computer-simulated laboratory that actively involves students in doing science, improving their (1) understanding of science concepts by applying critical thinking to solve real problems; (2) skills in scientific processes and communications; and (3) attitudes about science. Students use on-line tools (notebook, calculator, word processor) to undertake in-depth investigations of phenomena (like motion in outer space, disease transmission, volcanic eruptions, or the structure of the atom) that would be too difficult, dangerous, or outright impossible to do in a "live" laboratory. Suggested extension activities lead students to hands-on investigations, away from the computer. This article presents the underlying rationale, instructional model, and process by which Explorations was designed and developed. It also describes the general courseware structure and three lessons in detail, and presents preliminary data from the evaluation. Finally, it suggests a model for incorporating technology into the science classroom.

  17. ADVANCED METHODS FOR THE COMPUTATION OF PARTICLE BEAM TRANSPORT AND THE COMPUTATION OF ELECTROMAGNETIC FIELDS AND MULTIPARTICLE PHENOMENA

    SciTech Connect

    Alex J. Dragt

    2012-08-31

    Since 1980, under the grant DEFG02-96ER40949, the Department of Energy has supported the educational and research work of the University of Maryland Dynamical Systems and Accelerator Theory (DSAT) Group. The primary focus of this educational/research group has been on the computation and analysis of charged-particle beam transport using Lie algebraic methods, and on advanced methods for the computation of electromagnetic fields and multiparticle phenomena. This Final Report summarizes the accomplishments of the DSAT Group from its inception in 1980 through its end in 2011.

  18. Lipid membranes and single ion channel recording for the advanced physics laboratory

    NASA Astrophysics Data System (ADS)

    Klapper, Yvonne; Nienhaus, Karin; Röcker, Carlheinz; Ulrich Nienhaus, G.

    2014-05-01

    We present an easy-to-handle, low-cost, and reliable setup to study various physical phenomena on a nanometer-thin lipid bilayer using the so-called black lipid membrane technique. The apparatus allows us to precisely measure optical and electrical properties of free-standing lipid membranes, to study the formation of single ion channels, and to gain detailed information on the ion conduction properties of these channels using statistical physics and autocorrelation analysis. The experiments are well suited as part of an advanced physics or biophysics laboratory course; they interconnect physics, chemistry, and biology and will be appealing to students of the natural sciences who are interested in quantitative experimentation.
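
    A minimal version of the autocorrelation analysis used in the course might look like the following; the two-state trace is synthetic, standing in for current recorded from the bilayer amplifier, and the switching rates are hypothetical.

    ```python
    # Sketch: normalized autocorrelation of a single-channel current
    # trace; the decay time of C(tau) reflects open/close kinetics.
    import numpy as np

    def autocorrelation(current):
        x = current - current.mean()
        c = np.correlate(x, x, mode="full")[x.size - 1:]   # keep tau >= 0
        return c / c[0]                                     # C(0) = 1

    rng = np.random.default_rng(0)
    state, trace = 0, np.empty(20000)
    for i in range(trace.size):
        if rng.random() < (0.02 if state == 0 else 0.05):  # hypothetical rates
            state = 1 - state                               # open <-> closed
        trace[i] = state
    C = autocorrelation(trace + 0.05 * rng.standard_normal(trace.size))
    ```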

  19. Advances in Engine Test Capabilities at the NASA Glenn Research Center's Propulsion Systems Laboratory

    NASA Technical Reports Server (NTRS)

    Pachlhofer, Peter M.; Panek, Joseph W.; Dicki, Dennis J.; Piendl, Barry R.; Lizanich, Paul J.; Klann, Gary A.

    2006-01-01

    The Propulsion Systems Laboratory at the National Aeronautics and Space Administration (NASA) Glenn Research Center is one of the premier U.S. facilities for research on advanced aeropropulsion systems. The facility can simulate a wide range of altitude and Mach number conditions while supplying the aeropropulsion system with all the support services necessary to operate at those conditions. Test data are recorded on a combination of steady-state and high-speed data-acquisition systems. Recently, a number of upgrades were made to the facility to meet demanding new requirements for the latest aeropropulsion concepts and to improve operational efficiency. Improvements were made to data-acquisition systems, facility and engine-control systems, test-condition simulation systems, video capture and display capabilities, and personnel training procedures. This paper discusses the facility's capabilities, recent upgrades, and planned future improvements.

  20. ADVANCED COMPUTATIONAL MODEL FOR THREE-PHASE SLURRY REACTORS

    SciTech Connect

    Goodarz Ahmadi

    2004-10-01

    In this project, an Eulerian-Lagrangian formulation for analyzing three-phase slurry flows in a bubble column was developed. The approach used an Eulerian analysis of liquid flows in the bubble column and made use of the Lagrangian trajectory analysis for the bubble and particle motions. Bubble-bubble and particle-particle collisions are included in the model. The model predictions were compared with the experimental data and good agreement was found. An experimental setup for studying two-dimensional bubble columns was developed. The multiphase flow conditions in the bubble column were measured using optical image processing and Particle Image Velocimetry (PIV) techniques. A simple shear flow device for bubble motion in a constant shear flow field was also developed, and the flow conditions in it were studied using the PIV method. The concentration and velocity of particles of different sizes near a wall in a duct flow were also measured, using the technique of Phase-Doppler anemometry. An Eulerian volume of fluid (VOF) computational model for the flow condition in the two-dimensional bubble column was also developed. The liquid and bubble motions were analyzed and the results were compared with observed flow patterns in the experimental setup. Solid-fluid mixture flows in ducts and passages at different angles of orientation were also analyzed; the model predictions were compared with the experimental data and good agreement was found. Gravity chute flows of solid-liquid mixtures were also studied, and the simulation results were compared with the experimental data and discussed. A thermodynamically consistent model for multiphase slurry flows with and without chemical reaction in a state of turbulent motion was developed; the balance laws were obtained and the constitutive laws established.
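
    The Lagrangian half of such a formulation integrates each bubble's equation of motion through the resolved liquid field. The stripped-down sketch below keeps only buoyancy and a linear drag relaxation; the real model adds lift, added mass, collisions, and turbulence, and all constants here are hypothetical.

    ```python
    # Lagrangian bubble-trajectory skeleton: linear drag toward the local
    # liquid velocity plus buoyancy. Constants are hypothetical.
    import numpy as np

    G = 9.81        # gravity, m/s^2
    TAU_B = 0.01    # bubble response time, s (hypothetical)

    def liquid_velocity(pos, t):
        return np.array([0.0, 0.1])            # placeholder uniform upflow, m/s

    def step(pos, vel, t, dt=1e-4):
        drag = (liquid_velocity(pos, t) - vel) / TAU_B
        buoyancy = np.array([0.0, G])          # bubble lighter than liquid
        vel = vel + dt * (drag + buoyancy)
        return pos + dt * vel, vel
    ```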

  1. WAATS: A computer program for Weights Analysis of Advanced Transportation Systems

    NASA Technical Reports Server (NTRS)

    Glatt, C. R.

    1974-01-01

    A historical weight estimating technique for advanced transportation systems is presented. The classical approach to weight estimation is discussed and sufficient data is presented to estimate weights for a large spectrum of flight vehicles including horizontal and vertical takeoff aircraft, boosters and reentry vehicles. A computer program, WAATS (Weights Analysis for Advanced Transportation Systems) embracing the techniques discussed has been written and user instructions are presented. The program was developed for use in the ODIN (Optimal Design Integration System) system.
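
    Classical weight estimating of this kind rests on power-law regressions of component weight against vehicle parameters fitted to historical data; a schematic sketch follows (all numbers are hypothetical, not WAATS coefficients).

    ```python
    # Sketch of a classical weight-estimating relationship (WER):
    #   W_component = a * X**b, with a and b fit to historical vehicles.
    # The data points below are hypothetical placeholders.
    import numpy as np

    hist_X = np.array([1e4, 3e4, 9e4, 2e5])        # e.g., gross weight, lb
    hist_W = np.array([900, 2100, 5200, 9800])     # component weight, lb

    b, log_a = np.polyfit(np.log(hist_X), np.log(hist_W), 1)
    estimate = lambda X: np.exp(log_a) * X**b
    print(estimate(5e4))                            # interpolated estimate
    ```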

  2. Advanced entry guidance algorithm with landing footprint computation

    NASA Astrophysics Data System (ADS)

    Leavitt, James Aaron

    -determined angle of attack profile. The method is also capable of producing orbital footprints using an automatically-generated set of angle of attack profiles of varying range, with the lowest profile designed for near-maximum range in the absence of an active heat load constraint. The accuracy of the footprint method is demonstrated by direct comparison with footprints computed independently by an optimization program.

  3. Computer and laboratory simulation in the teaching of neonatal nursing: innovation and impact on learning 1

    PubMed Central

    Fonseca, Luciana Mara Monti; Aredes, Natália Del' Angelo; Fernandes, Ananda Maria; Batalha, Luís Manuel da Cunha; Apóstolo, Jorge Manuel Amado; Martins, José Carlos Amado; Rodrigues, Manuel Alves

    2016-01-01

    ABSTRACT Objectives: to evaluate the cognitive learning of nursing students in neonatal clinical evaluation through a blended course using computer and laboratory simulation; to compare the cognitive learning of students in a control and an experimental group testing the laboratory simulation; and to assess the extracurricular blended course offered on the clinical assessment of preterm infants, according to the students. Method: a quasi-experimental study with 14 Portuguese students, comprising a pretest, a midterm test, and a post-test. The technologies offered in the course were the serious game e-Baby, instructional software on semiology and semiotechnique, and laboratory simulation. Data collection tools developed for this study were used for the course evaluation and the characterization of the students. Nonparametric statistics were used: Mann-Whitney and Wilcoxon. Results: the use of validated digital technologies and laboratory simulation produced a statistically significant difference (p = 0.001) in the learning of the participants. The students evaluated the course as very satisfactory. The laboratory simulation alone did not produce a significant difference in learning. Conclusions: the cognitive learning of participants increased significantly. The use of technology may be partly responsible for the course's success, showing it to be an important teaching tool for innovation and motivation of learning in healthcare. PMID:27737376

  4. Recent Advances in Laboratory Infrared Spectroscopy of Polycyclic Aromatic Hydrocarbons: PAHs in the Far Infrared

    NASA Technical Reports Server (NTRS)

    Mattioda, Andrew L.; Ricca, Alessandra; Tucker, Jonathan; Boersma, Christiaan; Bauschlicher, Charles, Jr.; Allamandola, Louis J.

    2010-01-01

    Over 25 years of observations and laboratory work have shown that the mid-IR spectra of a majority of astronomical sources are dominated by emission features near 3.3, 6.2, 7.7, and 11.2 microns, which originate in free polycyclic aromatic hydrocarbon (PAH) molecules. PAHs dominate the mid-IR emission from many galactic and extragalactic objects. As such, this material tracks a wide variety of astronomical processes, making this spectrum a powerful probe of the cosmos. Apart from bands in the mid-IR, PAHs have bands spanning the far-IR (FIR), and emission from these FIR features should be present in astronomical sources showing the mid-IR PAH bands. However, with one exception, the FIR spectral characteristics are known only for a few small neutral PAHs trapped in salt pellets or oils at room temperature, data which are not relevant to astrophysics. Furthermore, since most emitting PAHs responsible for the mid-IR astronomical features are ionized, the absence of any experimental or theoretical PAH ion FIR spectra will make it impossible to correctly interpret the FIR data from these objects. In view of the upcoming Herschel space telescope mission and SOFIA's FIR airborne instrumentation, which will pioneer the FIR region, it is now urgent to obtain PAH FIR spectra. This talk will present an overview of recent advances in the laboratory spectroscopy of PAHs, highlighting the FIR spectroscopy along with some quantum calculations.

  5. Projected role of advanced computational aerodynamic methods at the Lockheed-Georgia company

    NASA Technical Reports Server (NTRS)

    Lores, M. E.

    1978-01-01

    Experience with the advanced computational methods used at the Lockheed-Georgia Company to aid in the evaluation and design of new and modified aircraft indicates that large, specialized computers will be needed to make advanced three-dimensional viscous aerodynamic computations practical. The Numerical Aerodynamic Simulation Facility should provide a tool for designing better aerospace vehicles while reducing development costs, both by performing computations using Navier-Stokes solution algorithms and by permitting less sophisticated but nevertheless complex calculations to be made efficiently. Configuration definition procedures and data output formats can probably best be defined in cooperation with industry; therefore, the computer should handle many remote terminals efficiently. The capability of transferring data to and from other computers needs to be provided. Because of the significant amount of input and output associated with 3-D viscous flow calculations, and because of the exceedingly fast computation speed envisioned for the computer, special attention should be paid to providing rapid, diversified, and efficient input and output.

  6. Research in Computational Aeroscience Applications Implemented on Advanced Parallel Computing Systems

    NASA Technical Reports Server (NTRS)

    Wigton, Larry

    1996-01-01

    Improvements to the numerical linear algebra routines for use in new Navier-Stokes codes, specifically Tim Barth's unstructured grid code, with spin-offs to TRANAIR, are reported. A fast distance calculation routine for Navier-Stokes codes using the new one-equation turbulence models was written. The primary focus of this work was improving matrix-iterative methods. New algorithms have been developed which activate the full potential of classical Cray-class computers as well as distributed-memory parallel computers.

  7. Impact of computer advances on future finite elements computations. [for aircraft and spacecraft design

    NASA Technical Reports Server (NTRS)

    Fulton, Robert E.

    1985-01-01

    Research performed over the past 10 years in engineering data base management and parallel computing is discussed, and certain opportunities for research toward the next generation of structural analysis capability are proposed. Particular attention is given to data base management associated with the IPAD project and parallel processing associated with the Finite Element Machine project, both sponsored by NASA, and a near term strategy for a distributed structural analysis capability based on relational data base management software and parallel computers for a future structural analysis system.

  8. 5 CFR 550.404 - Computation of advance payments and evacuation payments; time periods.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... evacuation payments; time periods. 550.404 Section 550.404 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS PAY ADMINISTRATION (GENERAL) Payments During Evacuation § 550.404 Computation of advance payments and evacuation payments; time periods. (a) Payments shall be based on the...

  9. 5 CFR 550.404 - Computation of advance payments and evacuation payments; time periods.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... evacuation payments; time periods. 550.404 Section 550.404 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS PAY ADMINISTRATION (GENERAL) Payments During Evacuation § 550.404 Computation of advance payments and evacuation payments; time periods. (a) Payments shall be based on the...

  10. Computers-for-edu: An Advanced Business Application Programming (ABAP) Teaching Case

    ERIC Educational Resources Information Center

    Boyle, Todd A.

    2007-01-01

    The "Computers-for-edu" case is designed to provide students with hands-on exposure to creating Advanced Business Application Programming (ABAP) reports and dialogue programs, as well as navigating various mySAP Enterprise Resource Planning (ERP) transactions needed by ABAP developers. The case requires students to apply a wide variety of ABAP…

  11. Advanced Telecommunications and Computer Technologies in Georgia Public Elementary School Library Media Centers.

    ERIC Educational Resources Information Center

    Rogers, Jackie L.

    The purpose of this study was to determine what recent progress had been made in Georgia public elementary school library media centers regarding access to advanced telecommunications and computer technologies as a result of special funding. A questionnaire addressed the following areas: automation and networking of the school library media center…

  12. PARTNERING WITH DOE TO APPLY ADVANCED BIOLOGICAL, ENVIRONMENTAL, AND COMPUTATIONAL SCIENCE TO ENVIRONMENTAL ISSUES

    EPA Science Inventory

    On February 18, 2004, the U.S. Environmental Protection Agency and Department of Energy signed a Memorandum of Understanding to expand the research collaboration of both agencies to advance biological, environmental, and computational sciences for protecting human health and the ...

  13. Scientific Discovery through Advanced Computing (SciDAC-3) Partnership Project Annual Report

    SciTech Connect

    Hoffman, Forest M.; Bochev, Pavel B.; Cameron-Smith, Philip J..; Easter, Richard C; Elliott, Scott M.; Ghan, Steven J.; Liu, Xiaohong; Lowrie, Robert B.; Lucas, Donald D.; Ma, Po-lun; Sacks, William J.; Shrivastava, Manish; Singh, Balwinder; Tautges, Timothy J.; Taylor, Mark A.; Vertenstein, Mariana; Worley, Patrick H.

    2014-01-15

    The Applying Computationally Efficient Schemes for BioGeochemical Cycles (ACES4BGC) Project is advancing the predictive capabilities of Earth System Models (ESMs) by reducing two of the largest sources of uncertainty, aerosols and biospheric feedbacks, with a highly efficient computational approach. In particular, this project is implementing and optimizing new computationally efficient tracer advection algorithms for large numbers of tracer species; adding important biogeochemical interactions between the atmosphere, land, and ocean models; and applying uncertainty quantification (UQ) techniques to constrain process parameters and evaluate uncertainties in feedbacks between biogeochemical cycles and the climate system.

  14. A first attempt to bring computational biology into advanced high school biology classrooms.

    PubMed

    Gallagher, Suzanne Renick; Coon, William; Donley, Kristin; Scott, Abby; Goldberg, Debra S

    2011-10-01

    Computer science has become ubiquitous in many areas of biological research, yet most high school and even college students are unaware of this. As a result, many college biology majors graduate without adequate computational skills for contemporary fields of biology. The absence of a computational element in secondary school biology classrooms is of growing concern to the computational biology community and to biology teachers who would like to acquaint their students with updated approaches in the discipline. We present a first attempt to correct this absence by introducing a computational biology unit on genetic evolution into advanced biology classes at two local high schools. Our primary goal was to show students how computation is used in biology and why a basic understanding of computation is necessary for research in many fields of biology. The curriculum is intended to be taught by a computational biologist who has worked with a high school advanced biology teacher to adapt the unit for his or her classroom, but a motivated high school teacher comfortable with mathematics and computing may be able to teach it alone. In this paper, we present our curriculum, which takes into consideration the constraints of the required curriculum, and discuss our experiences teaching it. We describe the successes and challenges we encountered while bringing this unit to high school students, discuss how we addressed these challenges, and make suggestions for future versions of the curriculum. We believe that our curriculum can be a valuable seed for further development of computational activities aimed at high school biology students. Further, our experiences may be of value to others teaching computational biology at this level. Our curriculum can be obtained at http://ecsite.cs.colorado.edu/?page_id=149#biology or by contacting the authors.
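
    The record does not reproduce the unit's exercises; as an example of the kind of small simulation a genetic-evolution unit can be built around, here is a hypothetical Wright-Fisher drift model.

    ```python
    # Sketch: Wright-Fisher simulation of allele-frequency drift, a typical
    # entry point for teaching evolution computationally. Parameters are
    # illustrative, not taken from the curriculum.
    import numpy as np

    def wright_fisher(pop_size=100, p0=0.5, generations=200, seed=1):
        rng = np.random.default_rng(seed)
        freqs = [p0]
        for _ in range(generations):
            # Each generation resamples 2N allele copies from the gene pool.
            count = rng.binomial(2 * pop_size, freqs[-1])
            freqs.append(count / (2 * pop_size))
        return freqs

    print(wright_fisher()[-1])   # final allele frequency after drift
    ```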

  15. Some research advances in computer graphics that will enhance applications to engineering design

    NASA Technical Reports Server (NTRS)

    Allan, J. J., III

    1975-01-01

    Research in man/machine interactions and graphics hardware/software that will enhance applications to engineering design is described. Research aspects of executive systems, command languages, and networking used in the computer applications laboratory are mentioned. Finally, a few areas where little or no research is being done are identified.

  16. Building an advanced climate model: Program plan for the CHAMMP (Computer Hardware, Advanced Mathematics, and Model Physics) Climate Modeling Program

    SciTech Connect

    Not Available

    1990-12-01

    The issue of global warming and related climatic changes from increasing concentrations of greenhouse gases in the atmosphere has received prominent attention during the past few years. The Computer Hardware, Advanced Mathematics, and Model Physics (CHAMMP) Climate Modeling Program is designed to contribute directly to the rapid improvement of climate models that this issue demands. The goal of the CHAMMP Climate Modeling Program is to develop, verify, and apply a new generation of climate models within a coordinated framework that incorporates the best available scientific and numerical approaches to represent physical, biogeochemical, and ecological processes; that fully utilizes the hardware and software capabilities of new computer architectures; that probes the limits of climate predictability; and finally that can be used to address the challenging problem of understanding the greenhouse climate issue through the ability of the models to simulate time-dependent climatic changes over extended times and with regional resolution.

  17. FY05-FY06 Advanced Simulation and Computing Implementation Plan, Volume 2

    SciTech Connect

    Baron, A L

    2004-07-19

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program will require the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapon design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile life extension programs and the resolution of significant finding investigations (SFIs). This requires a balanced system of technical staff, hardware, simulation software, and computer science solutions.

  18. Comparison of Mars Science Laboratory Reaction Control System Jet Computations With Flow Visualization and Velocimetry

    NASA Technical Reports Server (NTRS)

    Bathel, Brett F.; Danehy, Paul M.; Johansen, Craig T.; Ashcraft, Scott W.; Novak, Luke A.

    2013-01-01

    Numerical predictions of the Mars Science Laboratory reaction control system jets interacting with a Mach 10 hypersonic flow are compared to experimental nitric oxide planar laser-induced fluorescence data. The steady Reynolds Averaged Navier Stokes equations using the Baldwin-Barth one-equation turbulence model were solved using the OVERFLOW code. The experimental fluorescence data used for comparison consists of qualitative two-dimensional visualization images, qualitative reconstructed three-dimensional flow structures, and quantitative two-dimensional distributions of streamwise velocity. Through modeling of the fluorescence signal equation, computational flow images were produced and directly compared to the qualitative fluorescence data.

  19. Studying the Earth's Environment from Space: Computer Laboratory Exercises and Instructor Resources

    NASA Technical Reports Server (NTRS)

    Smith, Elizabeth A.; Alfultis, Michael

    1998-01-01

    Studying the Earth's Environment From Space is a two-year project to develop a suite of CD-ROMs containing Earth System Science curriculum modules for introductory undergraduate science classes. Lecture notes, slides, and computer laboratory exercises, including actual satellite data and software, are being developed in close collaboration with Carla Evans of NASA GSFC Earth Sciences Directorate Scientific and Educational Endeavors (SEE) project. Smith and Alfultis are responsible for the Oceanography and Sea Ice Processes Modules. The GSFC SEE project is responsible for Ozone and Land Vegetation Modules. This document constitutes a report on the first year of activities of Smith and Alfultis' project.

  20. Unified parallel C and the computing needs of Sandia National Laboratories.

    SciTech Connect

    Brown, Jonathan Leighton; Wen, Zhaofang

    2004-09-01

    As Sandia looks toward petaflops computing and other advanced architectures, it is necessary to provide a programming environment that can exploit this additional computing power while supporting reasonable development time for applications. Thus, the authors evaluate the Partitioned Global Address Space (PGAS) programming model as implemented in Unified Parallel C (UPC) for its applicability. They report on their experiences in implementing sorting and minimum spanning tree algorithms on a test system, a Cray T3E, with UPC support. They describe several macros that could serve as language extensions and several building-block operations that could serve as a foundation for a PGAS programming library. They analyze the limitations of the UPC implementation available on the test system, and suggest improvements necessary before UPC can be used in a production environment.
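
    As a conceptual illustration of the PGAS model evaluated here, the Python sketch below mimics UPC's central idea: one globally addressable array, block-partitioned so that each worker has affinity to its own block but may still touch remote elements. This is only an analogy (Python standing in for UPC, with invented names), not the report's code.

      # PGAS-style sketch: a shared array with block affinity per worker.
      from multiprocessing import Array, Process

      NWORKERS, BLOCK = 4, 5

      def worker(rank, shared):
          lo, hi = rank * BLOCK, (rank + 1) * BLOCK   # block this worker "owns"
          local_sum = sum(shared[lo:hi])              # cheap: local affinity
          remote = shared[hi % (NWORKERS * BLOCK)]    # one remote read, like UPC's shared[i]
          shared[lo] = local_sum + remote             # write through the global view
          # (read/write ordering across workers is timing-dependent; fine for a sketch)

      if __name__ == '__main__':
          shared = Array('d', [float(i) for i in range(NWORKERS * BLOCK)])
          procs = [Process(target=worker, args=(r, shared)) for r in range(NWORKERS)]
          for p in procs: p.start()
          for p in procs: p.join()
          print(list(shared))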

  1. Computer and laboratory simulation of interactions between spacecraft surfaces and charged-particle environments

    NASA Technical Reports Server (NTRS)

    Stevens, N. J.

    1979-01-01

    There are two categories of interactions considered in this report. The first, spacecraft passive, refers to cases where the charged-particle environment acts on the spacecraft (e.g., spacecraft charging phenomena). The second, spacecraft active, refers to cases where a system on the spacecraft causes the interaction (e.g., high voltage space power systems). Both categories are studied in ground simulation facilities to understand the processes involved and to measure the pertinent parameters. Computer simulations are based on the NASA Charging Analyzer Program (NASCAP) code. Analytical models are developed in this code and verified against the experimental data. Extrapolation from the small test samples to space conditions are made with this code. Typical results from laboratory and computer simulations are presented for both types of interactions. Extrapolations from these simulations to performance in space environments are discussed.

  2. Computer and laboratory simulation of interactions between spacecraft surfaces and charged-particle environments

    NASA Technical Reports Server (NTRS)

    Stevens, N. J.

    1979-01-01

    Cases where the charged-particle environment acts on the spacecraft (e.g., spacecraft charging phenomena) and cases where a system on the spacecraft causes the interaction (e.g., high voltage space power systems) are considered. Both categories were studied in ground simulation facilities to understand the processes involved and to measure the pertinent parameters. Computer simulations are based on the NASA Charging Analyzer Program (NASCAP) code. Analytical models are developed in this code and verified against the experimental data. Extrapolation from the small test samples to space conditions are made with this code. Typical results from laboratory and computer simulations are presented for both types of interactions. Extrapolations from these simulations to performance in space environments are discussed.

  3. The Development, Implementation, and Evaluation of an Electronics Curriculum Using Computer-Assisted Instruction/Computer-Controlled Laboratory at Motlow State Community College, Tullahoma, Tennessee.

    ERIC Educational Resources Information Center

    Hasty, Doyle E.

    Motlow State Community College (MSCC) in Tullahoma, Tennessee, received a federal grant to develop and implement an electronics computer-assisted instruction (CAI) classroom and an electronics computer-controlled laboratory (CCL). A portion of a complete CAI/CCL electronics curriculum developed by NIDA Corporation was developed, implemented, and…

  4. Completion summary for borehole USGS 136 near the Advanced Test Reactor Complex, Idaho National Laboratory, Idaho

    USGS Publications Warehouse

    Twining, Brian V.; Bartholomay, Roy C.; Hodges, Mary K.V.

    2012-01-01

    In 2011, the U.S. Geological Survey, in cooperation with the U.S. Department of Energy, cored and completed borehole USGS 136 for stratigraphic framework analyses and long-term groundwater monitoring of the eastern Snake River Plain aquifer at the Idaho National Laboratory. The borehole was initially cored to a depth of 1,048 feet (ft) below land surface (BLS) to collect core, open-borehole water samples, and geophysical data. After these data were collected, borehole USGS 136 was cemented and backfilled between 560 and 1,048 ft BLS. The final construction of borehole USGS 136 required that the borehole be reamed to allow for installation of 6-inch (in.) diameter carbon-steel casing and 5-in. diameter stainless-steel screen; the screened monitoring interval was completed between 500 and 551 ft BLS. A dedicated pump and water-level access line were placed to allow for aquifer testing, for collecting periodic water samples, and for measuring water levels. Geophysical and borehole video logs were collected after coring and after the completion of the monitor well. Geophysical logs were examined in conjunction with the borehole core to describe borehole lithology and to identify primary flow paths for groundwater, which occur in intervals of fractured and vesicular basalt. A single-well aquifer test was used to define hydraulic characteristics for borehole USGS 136 in the eastern Snake River Plain aquifer. Specific-capacity, transmissivity, and hydraulic conductivity from the aquifer test were at least 975 gallons per minute per foot, 1.4 × 10⁵ feet squared per day (ft²/d), and 254 feet per day, respectively. The amount of measurable drawdown during the aquifer test was about 0.02 ft. The transmissivity for borehole USGS 136 was in the range of values determined from previous aquifer tests conducted in other wells near the Advanced Test Reactor Complex: 9.5 × 10³ to 1.9 × 10⁵ ft²/d. Water samples were analyzed for cations, anions, metals, nutrients, total organic
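
    As a quick consistency check on the reported numbers, the standard relation T = Kb (transmissivity equals hydraulic conductivity times saturated thickness) implies the effective thickness the test sampled. The relation is textbook hydrogeology; the inferred thickness below is our illustration, not a value stated in the record.

      # Back-of-envelope check using T = K * b
      T = 1.4e5           # transmissivity, ft^2/d (reported)
      K = 254.0           # hydraulic conductivity, ft/d (reported)
      b = T / K           # implied effective aquifer thickness, ft
      print(f"implied thickness: {b:.0f} ft")   # ~551 ft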

  5. The Advanced Interdisciplinary Research Laboratory: A Student Team Approach to the Fourth-Year Research Thesis Project Experience

    ERIC Educational Resources Information Center

    Piunno, Paul A. E.; Boyd, Cleo; Barzda, Virginijus; Gradinaru, Claudiu C.; Krull, Ulrich J.; Stefanovic, Sasa; Stewart, Bryan

    2014-01-01

    The advanced interdisciplinary research laboratory (AIRLab) represents a novel, effective, and motivational course designed from the interdisciplinary research interests of chemistry, physics, biology, and education development faculty members as an alternative to the independent thesis project experience. Student teams are assembled to work…

  6. Complementary Spectroscopic Assays for Investigating Protein-Ligand Binding Activity: A Project for the Advanced Chemistry Laboratory

    ERIC Educational Resources Information Center

    Mascotti, David P.; Waner, Mark J.

    2010-01-01

    A protein-ligand binding, guided-inquiry laboratory project with potential application across the advanced undergraduate curriculum is described. At the heart of the project are fluorescence and spectrophotometric assays utilizing biotin-4-fluorescein and streptavidin. The use of the same stock solutions for an assay that may be examined by two…

  7. Cold Crucible Induction Melter Testing at The Idaho National Laboratory for the Advanced Remediation Technologies Program

    SciTech Connect

    Jay Roach; Nick Soelberg; Mike Ancho; Eric Tchemitcheff; John Richardson

    2009-03-01

    AREVA Federal Services (AFS) is performing a multi-year, multi-phase Advanced Remediation Technologies (ART) project, sponsored by the U.S. Department of Energy (DOE), to evaluate the feasibility and benefits of replacing the existing joule-heated melter (JHM) used to treat high level waste (HLW) in the Defense Waste Processing Facility (DWPF) at the Savannah River Site with a cold crucible induction melter (CCIM). The AFS ART CCIM project includes several collaborators from AREVA subsidiaries, French companies, and DOE national laboratories. The Savannah River National Laboratory and the Commissariat a l’Energie Atomique (CEA) have performed laboratory-scale studies and testing to determine a suitable, high-waste-loading glass matrix. The Idaho National Laboratory (INL) and CEA are performing CCIM demonstrations at two different pilot scales to assess CCIM design and operation for treating SRS sludge wastes that are currently being treated in the DWPF. SGN is performing engineering studies to validate the feasibility of retrofitting CCIM technology into the DWPF Melter Cell. The long-term project plan includes more lab-testing, pilot- and large-scale demonstrations, and engineering activities to be performed during subsequent project phases. This paper provides preliminary results of tests using the engineering-scale CCIM test system located at the INL. The CCIM test system was operated continuously over a time period of about 58 hours. As the DWPF simulant feed was continuously fed to the melter, the glass level gradually increased until a portion of the molten glass was drained from the melter. The glass drain was operated semi-continuously because the glass drain rate was higher than the glass feedrate. A cold cap of unmelted feed was controlled by adjusting the feedrate and melter power levels to obtain the target molten glass temperatures with varying cold cap levels. Three test conditions were performed per the test plan, during which the melter was

  8. Student Estimates of Probability and Uncertainty in Advanced Laboratory and Statistical Physics Courses

    NASA Astrophysics Data System (ADS)

    Mountcastle, Donald B.; Bucy, Brandon R.; Thompson, John R.

    2007-11-01

    Equilibrium properties of macroscopic systems are highly predictable as n, the number of particles, approaches and exceeds Avogadro's number; theories of statistical physics depend on these results. Typical pedagogical devices used in statistical physics textbooks to introduce entropy (S) and multiplicity (ω) (where S = k ln(ω)) include flipping coins and/or other equivalent binary events, repeated n times. Prior to instruction, our statistical mechanics students usually gave reasonable answers about the probabilities, but not the relative uncertainties, of the predicted outcomes of such events. However, they reliably predicted that the uncertainty in a measured continuous quantity (e.g., the amount of rainfall) does decrease as the number of measurements increases. Typical textbook presentations assume that students understand that the relative uncertainty of binary outcomes will similarly decrease as the number of events increases. This is at odds with our findings, even though most of our students had previously completed mathematics courses in statistics, as well as an advanced electronics laboratory course that included statistical analysis of distributions of dart scores as n increased.
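
    The statistical point at issue is easy to reproduce numerically. For n fair coin flips the mean number of heads is n/2 and the standard deviation is sqrt(n)/2, so the relative uncertainty falls exactly as 1/sqrt(n). The short simulation below (our illustration, not the study's materials) makes the scaling visible.

      import numpy as np

      rng = np.random.default_rng(0)
      for n in (10, 100, 1000, 10000):
          heads = rng.binomial(n, 0.5, size=100_000)   # 100,000 repeated experiments
          rel = heads.std() / heads.mean()             # empirical relative uncertainty
          print(f"n={n:6d}  simulated {rel:.4f}  vs  1/sqrt(n) = {n**-0.5:.4f}")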

  9. The entrance system laboratory prototype for an advanced mass and ionic charge composition experiment

    SciTech Connect

    Allegrini, F.; Desai, M. I.; Livi, R.; Livi, S.; McComas, D. J.; Randol, B.

    2009-10-15

    Electrostatic analyzers (ESA) have been used extensively for the characterization of plasmas in a variety of space environments. They vary in shape, geometry, and size and are adapted to the specific particle population to be measured and the configuration of the spacecraft. Their main function is to select the energy per charge of the particles within a passband. An energy-per-charge range larger than that of the passband can be sampled by varying the voltage difference between the ESA electrodes. The voltage sweep takes time and reduces the duty cycle for a particular energy-per-charge passband. Our design approach for an advanced mass and ionic charge composition experiment (AMICCE) has a novel electrostatic analyzer that essentially serves as a spectrograph and selects ions simultaneously over a broad range of energy-per-charge (E/q). Only three voltage settings are required to cover the entire range from ≈10 to 270 keV/q, thus dramatically increasing the product of the geometric factor times the duty cycle when compared with other instruments. In this paper, we describe the AMICCE concept with particular emphasis on the prototype of the entrance system (ESA and collimator), which we designed, developed, and tested. We also present comparisons of the laboratory results with electrostatic simulations.
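
    The selection principle at work is that an ideal ESA passes ions whose E/q is proportional to the electrode voltage, E/q = kV. The sketch below shows how three voltage settings can tile a ~10-270 keV/q range when each setting's passband is wide; the analyzer constant and passband ratio here are placeholders, not the AMICCE design values.

      k = 9.0           # analyzer constant, keV/q per kV (assumed)
      ratio = 3.0       # passband width, E_max/E_min per setting (assumed)
      v = 1.1           # first voltage setting, kV (assumed)
      for s in range(3):
          print(f"setting {s + 1}: {k * v:5.1f} - {k * v * ratio:5.1f} keV/q")
          v *= ratio    # step the voltage so adjacent passbands abut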

  10. Vibration considerations in the design of the Advanced Photon Source at Argonne National Laboratory

    SciTech Connect

    Jendrzejczyk, J.A.; Wambsganss, M.W.

    1991-01-01

    The Advanced Photon Source (APS), a new synchrotron radiation facility being built at Argonne National Laboratory, will provide the world's most brilliant X-ray beams for research in a wide range of technical fields. Successful operation of the APS requires an extremely stable positron closed orbit. Vibration of the storage ring quadrupole magnets, even in the submicron range, can lead to distortion of the positron closed orbit and to potentially unacceptable beam emittance growth, which results in degraded performance. This paper presents an overview of the technical approach used to minimize vibration response, beginning at the conceptual stage, through design and construction, and on to successful operation. Acceptance criteria relating to maximum allowable quadrupole magnet vibration are discussed. Soil properties are used to determine resonant frequencies of foundations and to predict attenuation characteristics. Two sources are considered to have the potential to excite the foundation: far-field sources, which are produced external to the facility, and near-field sources, which are produced within the facility. Measurements of ambient ground motion, monitored to determine far-field excitation, are presented. Ambient vibration was measured at several operating facilities within Argonne to gain insight on typical near-field excitation sources. Discussion covers the dynamic response characteristics of a prototypic magnet support structure to various excitations, including ambient floor motion, coolant flow, and magnet power. 19 refs., 10 figs., 5 tabs.

  11. Advances in computer-aided design and computer-aided manufacture technology.

    PubMed

    Calamia, J R

    1996-01-01

    Although the development of computer-aided design (CAD) and computer-aided manufacture (CAM) technology and the benefits of increased productivity became obvious in the automobile and aerospace industries in the 1970s, investigations of this technology's application in the field of dentistry did not begin until the 1980s. Only now are we beginning to see the fruits of this work with the commercial availability of some systems; the potential for this technology seems boundless. This article reviews the recent literature with emphasis on the period from June 1992 to May 1993. This review should familiarize the reader with some of the latest developments in this technology, including a brief description of some systems currently available and the clinical and economic rationale for their acceptance into the dental mainstream. This article concentrates on a particular system, the Cerec (Siemens/Pelton and Crane, Charlotte, NC) system, for three reasons: First, this system has been available since 1985 and, as a result, has a track record of almost 7 years of data. Most of the data have just recently been released and consequently, much of this year's literature on CAD-CAM is monopolized by studies using this system. Second, this system was developed as a mobile, affordable, direct chairside CAD-CAM restorative method. As such, it is of special interest to the patient, providing a one-visit restoration. Third, the author is currently engaged in research using this particular system and has a working knowledge of this system's capabilities.

  12. Low-cost computer-controlled current stimulator for the student laboratory.

    PubMed

    Güçlü, Burak

    2007-06-01

    Electrical stimulation of nerve and muscle tissues is frequently used for teaching core concepts in physiology. It is usually expensive to provide every student group in the laboratory with an individual stimulator. This article presents the design and application of a low-cost [about $100 (U.S.)] isolated stimulator that can be controlled by two analog-output channels (e.g., output channels of a data-acquisition card or onboard audio channels) of a computer. The device is based on a voltage-to-current converter circuit and can produce accurate monopolar and bipolar current pulses, pulse trains, arbitrary current waveforms, and a trigger output. The compliance of the current source is ±15 V, and the maximum available current is ±1.5 mA. The device was electrically tested by using the audio output of a personal computer. In this condition, the device had a dynamic range of 46 dB and the available pulse-width range was 0.1-10 ms. The device is easily programmable, and a freeware MATLAB script is posted on the World Wide Web. The practical use of the device was demonstrated by electrically stimulating the sciatic nerve of a frog and recording compound action potentials. The newly designed current stimulator is a flexible and effective tool for teaching in the physiology laboratory, and it can increase the efficiency of learning by maximizing performance-to-cost ratio.
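
    A device of this kind is driven by synthesizing the command voltage in software; the authors distribute their own MATLAB script, but a minimal sketch of the idea (here in Python, with an assumed 1 mA/V converter gain and invented pulse parameters) looks like this.

      import numpy as np

      fs = 44100                           # sample rate, Hz (audio-output scenario)
      pw, gap, npulses = 0.001, 0.010, 5   # 1 ms phases, 10 ms spacing, 5 pulses
      gain_ma_per_v = 1.0                  # assumed voltage-to-current gain, mA/V
      v_amp = 0.5 / gain_ma_per_v          # volts needed for a 0.5 mA target current

      phase = int(pw * fs)
      unit = np.concatenate([np.full(phase, v_amp),     # cathodic phase
                             np.full(phase, -v_amp),    # anodic phase
                             np.zeros(int(gap * fs))])  # inter-pulse interval
      waveform = np.tile(unit, npulses)    # samples to write to the analog output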

  13. A validated methodology for determination of laboratory instrument computer interface efficacy

    NASA Astrophysics Data System (ADS)

    1984-12-01

    This report is intended to provide a methodology for determining when, and for which instruments, direct interfacing of laboratory instrument and laboratory computers is beneficial. This methodology has been developed to assist the Tri-Service Medical Information Systems Program Office in making future decisions regarding laboratory instrument interfaces. We have calculated the time savings required to reach a break-even point for a range of instrument interface prices and corresponding average annual costs. The break-even analyses used empirical data to estimate the number of data points run per day that are required to meet the break-even point. The results indicate, for example, that at a purchase price of $3,000, an instrument interface will be cost-effective if the instrument is utilized for at least 154 data points per day if operated in the continuous mode, or 216 points per day if operated in the discrete mode. Although this model can help to ensure that instrument interfaces are cost effective, additional information should be considered in making the interface decisions. A reduction in results transcription errors may be a major benefit of instrument interfacing.
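
    The break-even logic reduces to comparing annualized interface cost against annual labor savings. The toy function below shows only the shape of the calculation; the dollar figures, time saved per result, and equipment life are placeholders, not the report's empirical inputs (which yielded the 154 and 216 points-per-day thresholds).

      def breakeven_points_per_day(price, annual_upkeep, sec_saved_per_point,
                                   labor_rate_per_hr, days_per_year=260, life_yr=5):
          # the interface pays off when yearly labor savings match yearly ownership cost
          annualized_cost = price / life_yr + annual_upkeep
          saving_per_point = sec_saved_per_point / 3600 * labor_rate_per_hr
          return annualized_cost / (saving_per_point * days_per_year)

      # e.g. $3,000 interface, $200/yr upkeep, 30 s saved per result, $15/hr technician
      print(round(breakeven_points_per_day(3000, 200, 30, 15)))   # ~25 points/day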

  14. ENHANCING THE ATOMIC-LEVEL UNDERSTANDING OF CO2 MINERAL SEQUESTRATION MECHANISMS VIA ADVANCED COMPUTATIONAL MODELING

    SciTech Connect

    A.V.G. Chizmeshya

    2003-12-19

    /NETL managed National Mineral Sequestration Working Group we have already significantly improved our understanding of mineral carbonation. Group members at the Albany Research Center have recently shown that carbonation of olivine and serpentine, which naturally occurs over geological time (i.e., 100,000s of years), can be accelerated to near completion in hours. Further process refinement will require a synergetic science/engineering approach that emphasizes simultaneous investigation of both thermodynamic processes and the detailed microscopic, atomic-level mechanisms that govern carbonation kinetics. Our previously funded Phase I Innovative Concepts project demonstrated the value of advanced quantum-mechanical modeling as a complementary tool in bridging important gaps in our understanding of the atomic/molecular structure and reaction mechanisms that govern CO2 mineral sequestration reaction processes for the model Mg-rich lamellar hydroxide feedstock material Mg(OH)2. In the present simulation project, improved techniques and more efficient computational schemes have allowed us to expand and augment these capabilities and explore more complex Mg-rich, lamellar hydroxide-based feedstock materials, including the serpentine-based minerals. These feedstock materials are being actively investigated due to their wide availability, and low-cost CO2 mineral sequestration potential. Cutting-edge first principles quantum chemical, computational solid-state and materials simulation methodology studies proposed herein, have been strategically integrated with our new DOE supported (ASU-Argonne National Laboratory) project to investigate the mechanisms that govern mineral feedstock heat-treatment and aqueous/fluid-phase serpentine mineral carbonation in situ. This unified, synergetic theoretical and experimental approach will provide a deeper understanding of the key reaction mechanisms than either individual approach can alone. Ab initio techniques will also
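
    For scale, the net carbonation reaction for the model feedstock named here is Mg(OH)2 + CO2 -> MgCO3 + H2O, which fixes a stoichiometric ceiling on uptake. The molar masses below are standard values; this is a capacity bound we computed for illustration, not a kinetic result from the project.

      M_MgOH2, M_CO2 = 58.32, 44.01        # molar masses, g/mol
      print(f"{M_CO2 / M_MgOH2:.2f} kg CO2 per kg Mg(OH)2")   # ~0.75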

  15. ENHANCING THE ATOMIC-LEVEL UNDERSTANDING OF CO2 MINERAL SEQUESTRATION MECHANISMS VIA ADVANCED COMPUTATIONAL MODELING

    SciTech Connect

    A.V.G. Chizmeshya; M.J. McKelvy; G.H. Wolf; R.W. Carpenter; D.A. Gormley; J.R. Diefenbacher; R. Marzke

    2006-03-01

    significantly improved our understanding of mineral carbonation. Group members at the Albany Research Center have recently shown that carbonation of olivine and serpentine, which naturally occurs over geological time (i.e., 100,000s of years), can be accelerated to near completion in hours. Further process refinement will require a synergetic science/engineering approach that emphasizes simultaneous investigation of both thermodynamic processes and the detailed microscopic, atomic-level mechanisms that govern carbonation kinetics. Our previously funded Phase I Innovative Concepts project demonstrated the value of advanced quantum-mechanical modeling as a complementary tool in bridging important gaps in our understanding of the atomic/molecular structure and reaction mechanisms that govern CO2 mineral sequestration reaction processes for the model Mg-rich lamellar hydroxide feedstock material Mg(OH)2. In the present simulation project, improved techniques and more efficient computational schemes have allowed us to expand and augment these capabilities and explore more complex Mg-rich, lamellar hydroxide-based feedstock materials, including the serpentine-based minerals. These feedstock materials are being actively investigated due to their wide availability, and low-cost CO2 mineral sequestration potential. Cutting-edge first principles quantum chemical, computational solid-state and materials simulation methodology studies proposed herein, have been strategically integrated with our new DOE supported (ASU-Argonne National Laboratory) project to investigate the mechanisms that govern mineral feedstock heat-treatment and aqueous/fluid-phase serpentine mineral carbonation in situ. This unified, synergetic theoretical and experimental approach has provided a deeper understanding of the key reaction mechanisms than either individual approach can alone. We used ab initio techniques to significantly advance our understanding of atomic-level processes at the solid/solution interface by

  16. ENHANCING THE ATOMIC-LEVEL UNDERSTANDING OF CO2 MINERAL SEQUESTRATION MECHANISMS VIA ADVANCED COMPUTATIONAL MODELING

    SciTech Connect

    A.V.G. Chizmeshya

    2002-12-19

    /NETL managed National Mineral Sequestration Working Group we have already significantly improved our understanding of mineral carbonation. Group members at the Albany Research Center have recently shown that carbonation of olivine and serpentine, which naturally occurs over geological time (i.e., 100,000s of years), can be accelerated to near completion in hours. Further process refinement will require a synergetic science/engineering approach that emphasizes simultaneous investigation of both thermodynamic processes and the detailed microscopic, atomic-level mechanisms that govern carbonation kinetics. Our previously funded Phase I Innovative Concepts project demonstrated the value of advanced quantum-mechanical modeling as a complementary tool in bridging important gaps in our understanding of the atomic/molecular structure and reaction mechanisms that govern CO2 mineral sequestration reaction processes for the model Mg-rich lamellar hydroxide feedstock material Mg(OH)2. In the present simulation project, improved techniques and more efficient computational schemes have allowed us to expand and augment these capabilities and explore more complex Mg-rich, lamellar hydroxide-based feedstock materials, including the serpentine-based minerals. These feedstock materials are being actively investigated due to their wide availability, and low-cost CO2 mineral sequestration potential. Cutting-edge first principles quantum chemical, computational solid-state and materials simulation methodology studies proposed herein, have been strategically integrated with our new DOE supported (ASU-Argonne National Laboratory) project to investigate the mechanisms that govern mineral feedstock heat-treatment and aqueous/fluid-phase serpentine mineral carbonation in situ. This unified, synergetic theoretical and experimental approach will provide a deeper understanding of the key reaction mechanisms than either individual approach can alone. Ab initio techniques will also

  17. Advances in computer-aided design and computer-aided manufacture technology.

    PubMed

    Calamia, J R

    1994-01-01

    Although the development of computer-aided design (CAD) and computer-aided manufacture (CAM) technology and the benefits of increased productivity became obvious in the automobile and aerospace industries in the 1970s, investigations of this technology's application in the field of dentistry did not begin until the 1980s. Only now are we beginning to see the fruits of this work with the commercial availability of some systems; the potential for this technology seems boundless. This article reviews the recent literature with emphasis on the period from June 1992 to May 1993. This review should familiarize the reader with some of the latest developments in this technology, including a brief description of some systems currently available and the clinical and economic rationale for their acceptance into the dental mainstream. This article concentrates on a particular system, the Cerec (Siemens/Pelton and Crane, Charlotte, NC) system, for three reasons: first, this system has been available since 1985 and, as a result, has a track record of almost 7 years of data. Most of the data have just recently been released and consequently, much of this year's literature on CAD-CAM is monopolized by studies using this system. Second, this system was developed as a mobile, affordable, direct chairside CAD-CAM restorative method. As such, it is of special interest to the dentist who will offer this new technology directly to the patient, providing a one-visit restoration. Third, the author is currently engaged in research using this particular system and has a working knowledge of this system's capabilities.

  18. Computation of Loads on the McDonnell Douglas Advanced Bearingless Rotor

    NASA Technical Reports Server (NTRS)

    Nguyen, Khanh; Lauzon, Dan; Anand, Vaidyanathan

    1994-01-01

    Computed results from UMARC and DART analyses are compared with the blade bending moments and vibratory hub loads data obtained from a full-scale wind tunnel test of the McDonnell Douglas five-bladed advanced bearingless rotor. The 5 per-rev vibratory hub loads data are corrected using results from a dynamic calibration of the rotor balance. The comparisons between UMARC-computed blade bending moments and the measured data at different flight conditions are poor to fair, while DART results are fair to good. Using the free wake module, UMARC adequately computes the 5P vibratory hub loads for this rotor, capturing both magnitude and variations with forward speed. DART employs a uniform inflow wake model and does not adequately compute the 5P vibratory hub loads for this rotor.

  19. Characterization of Aerodynamic Interactions with the Mars Science Laboratory Reaction Control System Using Computation and Experiment

    NASA Technical Reports Server (NTRS)

    Schoenenberger, Mark; VanNorman, John; Rhode, Matthew; Paulson, John

    2013-01-01

    On August 5, 2012, the Mars Science Laboratory (MSL) entry capsule successfully entered Mars' atmosphere and landed the Curiosity rover in Gale Crater. The capsule used a reaction control system (RCS) consisting of four pairs of hydrazine thrusters to fly a guided entry. The RCS provided bank control to fly along a flight path commanded by an onboard computer and also damped unwanted rates due to atmospheric disturbances and any dynamic instabilities of the capsule. A preliminary assessment of the MSL's flight data from entry showed that the capsule flew much as predicted. This paper will describe how the MSL aerodynamics team used engineering analyses, computational codes and wind tunnel testing in concert to develop the RCS system and certify it for flight. Over the course of MSL's development, the RCS configuration underwent a number of design iterations to accommodate mechanical constraints, aeroheating concerns and excessive aero/RCS interactions. A brief overview of the MSL RCS configuration design evolution is provided. Then, a brief description is presented of how the computational predictions of RCS jet interactions were validated. The primary work to certify that the RCS interactions were acceptable for flight was centered on validating computational predictions at hypersonic speeds. A comparison of computational fluid dynamics (CFD) predictions to wind tunnel force and moment data gathered in the NASA Langley 31-Inch Mach 10 Tunnel was the linchpin to validating the CFD codes used to predict aero/RCS interactions. Using the CFD predictions and experimental data, an interaction model was developed for Monte Carlo analyses using 6-degree-of-freedom trajectory simulation. The interaction model used in the flight simulation is presented.

  20. Advanced Simulation & Computing FY15 Implementation Plan Volume 2, Rev. 0.5

    SciTech Connect

    McCoy, Michel; Archer, Bill; Matzen, M. Keith

    2014-09-16

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. As the program approaches the end of its second decade, ASC is intently focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), quantify critical margins and uncertainties, and resolve increasingly difficult analyses needed for the SSP. Where possible, the program also enables the use of high-performance simulation and computing tools to address broader national security needs, such as foreign nuclear weapon assessments and counternuclear terrorism.

  1. Advancing the Theory of Nuclear Reactions with Rare Isotopes. From the Laboratory to the Cosmos

    SciTech Connect

    Nunes, Filomena

    2015-06-01

    The mission of the Topical Collaboration on the Theory of Reactions for Unstable iSotopes (TORUS) was to develop new methods to advance nuclear reaction theory for unstable isotopes, particularly the (d,p) reaction in which a deuteron, composed of a proton and a neutron, transfers its neutron to an unstable nucleus. After benchmarking the state-of-the-art theories, the TORUS collaboration found that there were no exact methods to study (d,p) reactions involving heavy targets, the difficulty arising from the long-range nature of the well known, yet subtle, Coulomb force. To overcome this challenge, the TORUS collaboration developed a new theory in which the complexity of treating the long-range Coulomb interaction is shifted to the calculation of so-called form factors. An efficient implementation for the computation of these form factors was a major achievement of the TORUS collaboration. All of the new machinery developed provides essential ingredients for analyzing (d,p) reactions involving heavy nuclei relevant to astrophysics, energy production, and stockpile stewardship.

  2. Recovery Act: Advanced Interaction, Computation, and Visualization Tools for Sustainable Building Design

    SciTech Connect

    Greenberg, Donald P.; Hencey, Brandon M.

    2013-08-20

    Current building energy simulation technology requires excessive labor, time and expertise to create building energy models, excessive computational time for accurate simulations and difficulties with the interpretation of the results. These deficiencies can be ameliorated using modern graphical user interfaces and algorithms which take advantage of modern computer architectures and display capabilities. To prove this hypothesis, we developed an experimental test bed for building energy simulation. This novel test bed environment offers an easy-to-use interactive graphical interface, provides access to innovative simulation modules that run at accelerated computational speeds, and presents new graphics visualization methods to interpret simulation results. Our system offers the promise of dramatic ease of use in comparison with currently available building energy simulation tools. Its modular structure makes it suitable for early stage building design, as a research platform for the investigation of new simulation methods, and as a tool for teaching concepts of sustainable design. Improvements in the accuracy and execution speed of many of the simulation modules are based on the modification of advanced computer graphics rendering algorithms. Significant performance improvements are demonstrated in several computationally expensive energy simulation modules. The incorporation of these modern graphical techniques should advance the state of the art in the domain of whole building energy analysis and building performance simulation, particularly at the conceptual design stage when decisions have the greatest impact. More importantly, these better simulation tools will enable the transition from prescriptive to performative energy codes, resulting in better, more efficient designs for our future built environment.

  3. DOE Advanced Scientific Computing Advisory Subcommittee (ASCAC) Report: Top Ten Exascale Research Challenges

    SciTech Connect

    Lucas, Robert; Ang, James; Bergman, Keren; Borkar, Shekhar; Carlson, William; Carrington, Laura; Chiu, George; Colwell, Robert; Dally, William; Dongarra, Jack; Geist, Al; Haring, Rud; Hittinger, Jeffrey; Hoisie, Adolfy; Klein, Dean (Micron); Kogge, Peter; Lethin, Richard; Sarkar, Vivek; Schreiber, Robert; Shalf, John; Sterling, Thomas; Stevens, Rick; Bashor, Jon; Brightwell, Ron; Coteus, Paul; Debenedictus, Erik; Hiller, Jon; Kim, K. H.; Langston, Harper; Murphy, Richard (Micron); Webster, Clayton; Wild, Stefan; Grider, Gary; Ross, Rob; Leyffer, Sven; Laros III, James

    2014-02-10

    Exascale computing systems are essential for the scientific fields that will transform the 21st century global economy, including energy, biotechnology, nanotechnology, and materials science. Progress in these fields is predicated on the ability to perform advanced scientific and engineering simulations, and analyze the deluge of data. On July 29, 2013, ASCAC was charged by Patricia Dehmer, the Acting Director of the Office of Science, to assemble a subcommittee to provide advice on exascale computing. This subcommittee was directed to return a list of no more than ten technical approaches (hardware and software) that will enable the development of a system that achieves the Department's goals for exascale computing. Numerous reports over the past few years have documented the technical challenges and the non-viability of simply scaling existing computer designs to reach exascale. The technical challenges revolve around energy consumption, memory performance, resilience, extreme concurrency, and big data. Drawing from these reports and more recent experience, this ASCAC subcommittee has identified the top ten computing technology advancements that are critical to making a capable, economically viable, exascale system.

  4. A comparison of computer architectures for the NASA demonstration advanced avionics system

    NASA Technical Reports Server (NTRS)

    Seacord, C. L.; Bailey, D. G.; Larson, J. C.

    1979-01-01

    The paper compares computer architectures for the NASA demonstration advanced avionics system. Two computer architectures with an unusual approach to fault tolerance are described: a single spare processor can correct for faults in any of the distributed processors by taking on the role of a failed module. It was shown that the system must be viewed from a functional point of view to properly apply redundancy and achieve fault tolerance and ultrareliability. Data are presented on complexity and mission failure probability which show that the revised version offers equivalent mission reliability at lower cost, as measured by hardware and software complexity.

  5. TEMPERATURE MONITORING OPTIONS AVAILABLE AT THE IDAHO NATIONAL LABORATORY ADVANCED TEST REACTOR

    SciTech Connect

    J.E. Daw; J.L. Rempe; D.L. Knudson; T. Unruh; B.M. Chase; K.L. Davis

    2012-03-01

    As part of the Advanced Test Reactor National Scientific User Facility (ATR NSUF) program, the Idaho National Laboratory (INL) has developed in-house capabilities to fabricate, test, and qualify new and enhanced sensors for irradiation testing. To meet recent customer requests, an array of temperature monitoring options is now available to ATR users. The method selected is determined by test requirements and budget. Melt wires are the simplest and least expensive option for monitoring temperature. INL has recently verified the melting temperature of a collection of materials with melt temperatures ranging from 100 to 1000 C with a differential scanning calorimeter installed at INL’s High Temperature Test Laboratory (HTTL). INL encapsulates these melt wires in quartz or metal tubes. In the case of quartz tubes, multiple wires can be encapsulated in a single 1.6 mm diameter tube. The second option available to ATR users is a silicon carbide temperature monitor. The benefit of this option is that a single small monitor (typically 1 mm x 1 mm x 10 mm or 1 mm diameter x 10 mm length) can be used to detect peak irradiation temperatures ranging from 200 to 800 C. Equipment has been installed at INL’s HTTL to complete post-irradiation resistivity measurements on SiC monitors, a technique that has been found to yield the most accurate temperatures from these monitors. For instrumented tests, thermocouples may be used. In addition to Type-K and Type-N thermocouples, a High Temperature Irradiation Resistant ThermoCouple (HTIR-TC) was developed at the HTTL that pairs commercially available doped molybdenum with niobium alloy thermoelements. Long duration high temperature tests, in furnaces and in the ATR and other MTRs, demonstrate that the HTIR-TC is accurate up to 1800 C and insensitive to thermal neutron interactions. Thus, degradation observed at temperatures above 1100 C with Type K and N thermocouples and decalibration due to transmutation with tungsten
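
    The melt-wire technique reduces to a simple bracketing rule after the fact: the highest-melting wire that melted and the lowest-melting wire that survived bound the peak temperature. The set of melting points below is illustrative only, not INL's qualified inventory.

      wires = {100: True, 300: True, 500: True, 700: False, 1000: False}  # deg C: melted?
      melted = [t for t, m in wires.items() if m]
      intact = [t for t, m in wires.items() if not m]
      print(f"peak temperature between {max(melted)} and {min(intact)} C")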

  6. Saturday Academy of Computing and Mathematics (SACAM) at the Oak Ridge National Laboratory

    SciTech Connect

    Clark, D.N.

    1991-01-01

    To be part of the impending Information Age, our students and teachers must be trained in the use of computers, logic, and mathematics. The Saturday Academy of Computing and Mathematics (SACAM) represents one facet of Oak Ridge National Laboratory's (ORNL) response to meet the challenge. SACAM attempts to provide the area's best high school students with a creative program that illustrates how researchers are using computing and mathematics tools to help solve nationally recognized problems in virtually all scientific fields. Each SACAM program is designed as eight 3-hour sessions. Each session outlines a current scientific question or research area. Sessions are presented on a Saturday morning by a speaker team of two to four ORNL scientists (mentors) working in that particular field. Approximately four students and one teacher from each of ten area high schools attend the eight sessions. Session topics cover diverse problems such as climate modeling, cryptography and cryptology, high-energy physics, human genome sequencing, and even the use of probability in locating people lost in a national forest. Evaluations from students, teachers, and speakers indicate that the program has been well received, and a tracking program is being undertaken to determine long-range benefits. An analysis of the program's successes and lessons learned is presented as well as resources required for the program.

  7. Robotics, Stem Cells and Brain Computer Interfaces in Rehabilitation and Recovery from Stroke; Updates and Advances

    PubMed Central

    Boninger, Michael L; Wechsler, Lawrence R.; Stein, Joel

    2014-01-01

    Objective To describe the current state and latest advances in robotics, stem cells, and brain computer interfaces in rehabilitation and recovery for stroke. Design The authors of this summary recently reviewed this work as part of a national presentation; this paper summarizes the information presented in each area. Results Each area has seen great advances and challenges as products move to market and experiments are ongoing. Conclusion Robotics, stem cells, and brain computer interfaces all have tremendous potential to reduce disability and lead to better outcomes for patients with stroke. Continued research and investment will be needed as the field moves forward. With this investment, the potential for recovery of function is likely substantial. PMID:25313662

  8. High Performance Computing Modeling Advances Accelerator Science for High-Energy Physics

    SciTech Connect

    Amundson, James; Macridin, Alexandru; Spentzouris, Panagiotis

    2014-07-28

    The development and optimization of particle accelerators are essential for advancing our understanding of the properties of matter, energy, space, and time. Particle accelerators are complex devices whose behavior involves many physical effects on multiple scales. Therefore, advanced computational tools utilizing high-performance computing are essential for accurately modeling them. In the past decade, the US Department of Energy's SciDAC program has produced accelerator-modeling tools that have been employed to tackle some of the most difficult accelerator science problems. The authors discuss the Synergia framework and its applications to high-intensity particle accelerator physics. Synergia is an accelerator simulation package capable of handling the entire spectrum of beam dynamics simulations. Our authors present Synergia's design principles and its performance on HPC platforms.

  9. High Performance Computing Modeling Advances Accelerator Science for High-Energy Physics

    DOE PAGES

    Amundson, James; Macridin, Alexandru; Spentzouris, Panagiotis

    2014-07-28

    The development and optimization of particle accelerators are essential for advancing our understanding of the properties of matter, energy, space, and time. Particle accelerators are complex devices whose behavior involves many physical effects on multiple scales. Therefore, advanced computational tools utilizing high-performance computing are essential for accurately modeling them. In the past decade, the US Department of Energy's SciDAC program has produced accelerator-modeling tools that have been employed to tackle some of the most difficult accelerator science problems. The authors discuss the Synergia framework and its applications to high-intensity particle accelerator physics. Synergia is an accelerator simulation package capable of handling the entire spectrum of beam dynamics simulations. Our authors present Synergia's design principles and its performance on HPC platforms.

  10. Computational methods in the prediction of advanced subsonic and supersonic propeller induced noise: ASSPIN users' manual

    NASA Technical Reports Server (NTRS)

    Dunn, M. H.; Tarkenton, G. M.

    1992-01-01

    This document describes the computational aspects of propeller noise prediction in the time domain and the use of the high-speed propeller noise prediction program ASSPIN (Advanced Subsonic and Supersonic Propeller Induced Noise). Two formulations, valid in both the near and far fields, are utilized by ASSPIN: (1) one is used for subsonic portions of the propeller blade, and (2) the second is used for transonic and supersonic regions on the blade. Switching between the two formulations is done automatically. ASSPIN incorporates advanced blade geometry and surface pressure modeling, adaptive observer time grid strategies, and contains enhanced numerical algorithms that result in reduced computational time. In addition, the ability to treat the nonaxial inflow case has been included.

  11. Advances in Single-Photon Emission Computed Tomography Hardware and Software.

    PubMed

    Piccinelli, Marina; Garcia, Ernest V

    2016-02-01

    Nuclear imaging techniques remain today's most reliable modality for the assessment and quantification of myocardial perfusion. In recent years, the field has experienced tremendous progress both in terms of dedicated cameras for cardiac applications and software techniques for image reconstruction. The most recent advances in single-photon emission computed tomography hardware and software are reviewed, focusing on how these improvements have resulted in an even more powerful diagnostic tool with reduced injected radiation dose and acquisition time.

  12. Recent Advances in Cardiac Computed Tomography: Dual Energy, Spectral and Molecular CT Imaging

    PubMed Central

    Danad, Ibrahim; Fayad, Zahi A.; Willemink, Martin J.; Min, James K.

    2015-01-01

    Computed tomography (CT) has evolved into a powerful diagnostic tool, and it is impossible to imagine current clinical practice without CT imaging. Due to its widespread availability, ease of clinical application, superb sensitivity for detection of coronary artery disease (CAD), and non-invasive nature, CT has become a valuable tool within the armamentarium of the cardiologist. In the last few years, numerous technological advances in CT have occurred, including dual energy CT (DECT), spectral CT, and CT-based molecular imaging. By harnessing the advances in technology, cardiac CT has advanced beyond the mere evaluation of coronary stenosis to an imaging modality tool that permits accurate plaque characterization, assessment of myocardial perfusion, and even probing of molecular processes that are involved in coronary atherosclerosis. Novel innovations in CT contrast agents and pre-clinical spectral CT devices have paved the way for CT-based molecular imaging. PMID:26068288

  13. Phase 1 environmental report for the Advanced Neutron Source at Oak Ridge National Laboratory

    SciTech Connect

    Blasing, T.J.; Brown, R.A.; Cada, G.F.; Easterly, C.; Feldman, D.L.; Hagan, C.W.; Harrington, R.M.; Johnson, R.O.; Ketelle, R.H.; Kroodsma, R.L.; McCold, L.N.; Reich, W.J.; Scofield, P.A.; Socolof, M.L.; Taleyarkhan, R.P.; Van Dyke, J.W.

    1992-02-01

    The US Department of Energy (DOE) has proposed the construction and operation of the Advanced Neutron Source (ANS), a 330-MW(t) reactor, at Oak Ridge National Laboratory (ORNL) to support neutron scattering and nuclear physics experiments. ANS would provide a steady-state source of neutrons that are thermalized to produce sources of hot, cold, and very cold neutrons. The use of these neutrons in ANS experiment facilities would be an essential component of national research efforts in basic materials science. Additionally, ANS capabilities would include production of transplutonium isotopes, irradiation of potential fusion and fission reactor materials, activation analysis, and production of medical and industrial isotopes such as 252Cf. Although ANS would not require licensing by the US Nuclear Regulatory Commission (NRC), DOE regards the design, construction, and operation of ANS as activities that would produce a licensable facility; that is, DOE is following the regulatory guidelines that NRC would apply if NRC were licensing the facility. Those guidelines include instructions for the preparation of an environmental report (ER), a compilation of available data and preliminary analyses regarding the environmental impacts of nuclear facility construction and operation. The ER, described and outlined in NRC Regulatory Guide 4.2, serves as a background document to facilitate the preparation of environmental impact statements (EISs). Using Regulatory Guide 4.2 as a model, this ANS ER provides analyses and information specific to the ANS site and area that can be adopted (and modified, if necessary) for the ANS EIS. The ER is being prepared in two phases. Phase 1 ER includes many of the data and analyses needed to prepare the EIS but does not include data or analyses of alternate sites or alternate technologies. Phase 2 ER will include the additional data and analyses stipulated by Regulatory Guide 4.2.

  14. Decomposition and plant-available nitrogen in biosolids: laboratory studies, field studies, and computer simulation.

    PubMed

    Gilmour, John T; Cogger, Craig G; Jacobs, Lee W; Evanylo, Gregory K; Sullivan, Dan M

    2003-01-01

    This research combines laboratory and field studies with computer simulation to characterize the amount of plant-available nitrogen (PAN) released when municipal biosolids are land-applied to agronomic crops. In the laboratory studies, biosolids were incubated in or on soil from the land application sites. Mean biosolids total C, organic N, and C to N ratio were 292 g kg⁻¹, 41.7 g kg⁻¹, and 7.5, respectively. Based on CO2 evolution at 25 degrees C and optimum soil moisture, 27 of the 37 biosolids-soil combinations had two decomposition phases. The mean rapid and slow fraction rate constants were 0.021 and 0.0015 d⁻¹, respectively, and the rapid fraction contained 23% of the total C assuming sequential decomposition. Where only one decomposition phase existed, the mean first order rate constant was 0.0046 d⁻¹. The mean rate constant for biosolids stored in lagoons for an extended time was 0.00097 d⁻¹. The only treatment process that was related to biosolids decomposition was stabilization by storage in a lagoon. Biosolids addition rates (dry basis) ranged from 1.3 to 33.8 Mg ha⁻¹ with a mean value of 10.6 Mg ha⁻¹. A relationship between fertilizer N rate and crop response was used to estimate observed PAN at each site. Mean observed PAN during the growing season was 18.9 kg N Mg⁻¹ or 37% of the biosolids total N. Observed PAN was linearly related to biosolids total N. Predicted PAN using the computer model Decomposition, actual growing-season weather, actual analytical data, and laboratory decomposition kinetics compared well with observed PAN. The mean computer model prediction of growing-season PAN was 19.2 kg N Mg⁻¹ and the slope of the regression between predicted and observed PAN was not significantly different from unity. Predicted PAN obtained using mean decomposition kinetics was related to predicted PAN using actual decomposition kinetics suggesting that mean rate constants, actual weather, and actual analytical data could be used in
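
    Written out, the reported kinetics amount to first-order decay of carbon pools. The sketch below uses the study's mean constants (rapid k1 = 0.021 d⁻¹ on 23% of the carbon, slow k2 = 0.0015 d⁻¹ on the rest) but treats the pools as parallel for simplicity, whereas the paper's rapid-fraction estimate assumed sequential decomposition.

      import numpy as np

      k1, k2, f_rapid = 0.021, 0.0015, 0.23
      t = np.arange(0, 366)                    # days
      c_frac = f_rapid * np.exp(-k1 * t) + (1 - f_rapid) * np.exp(-k2 * t)
      print(f"C remaining after 1 year: {c_frac[-1] * 100:.0f}%")   # ~45%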

  15. Fossil energy: From laboratory to marketplace. Part 2, The role of advanced research

    SciTech Connect

    Not Available

    1992-03-01

    The purpose of this work is to provide a summary description of the role of advanced research in the successes of the overall Fossil Energy R&D program. It presents the specific Fossil Energy advanced research products that have been adopted commercially or fed into other R&D programs as part of the crosscutting enabling technology base upon which advanced systems are built.

  16. Recent advances in 3D computed tomography techniques for simulation and navigation in hepatobiliary pancreatic surgery.

    PubMed

    Uchida, Masafumi

    2014-04-01

    A few years ago it could take several hours to complete a 3D image using a 3D workstation. Thanks to advances in computer science, obtaining results of interest now requires only a few minutes. Many recent 3D workstations or multimedia computers are equipped with onboard 3D virtual patient modeling software, which enables patient-specific preoperative assessment and virtual planning, navigation, and tool positioning. Although medical 3D imaging can now be conducted using various modalities, including computed tomography (CT), magnetic resonance imaging (MRI), positron emission tomography (PET), and ultrasonography (US) among others, the highest quality images are obtained using CT data, and CT images are now the most commonly used source of data for 3D simulation and navigation images. If the 2D source image is bad, no amount of 3D image manipulation in software will provide a quality 3D image. In this exhibition, recent advances in CT imaging techniques and 3D visualization of hepatobiliary and pancreatic abnormalities are featured, including scan and image reconstruction techniques, contrast-enhanced techniques, new applications of advanced CT scan techniques, and new virtual reality simulation and navigation imaging.

  17. Research Institute for Advanced Computer Science: Annual Report October 1998 through September 1999

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.; Gross, Anthony R. (Technical Monitor)

    1999-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center (ARC). It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. ARC has been designated NASA's Center of Excellence in Information Technology. In this capacity, ARC is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA ARC and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities. (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply information technology research to

  18. Advanced methods for the computation of particle beam transport and the computation of electromagnetic fields and beam-cavity interactions

    SciTech Connect

    Dragt, A.J.; Gluckstern, R.L.

    1990-11-01

    The University of Maryland Dynamical Systems and Accelerator Theory Group carries out research in two broad areas: the computation of charged particle beam transport using Lie algebraic methods and advanced methods for the computation of electromagnetic fields and beam-cavity interactions. Important improvements in the state of the art are believed to be possible in both of these areas. In addition, applications of these methods are made to problems of current interest in accelerator physics including the theoretical performance of present and proposed high energy machines. The Lie algebraic method of computing and analyzing beam transport handles both linear and nonlinear beam elements. Tests show this method to be superior to the earlier matrix or numerical integration methods. It has wide application to many areas including accelerator physics, intense particle beams, ion microprobes, high resolution electron microscopy, and light optics. With regard to the area of electromagnetic fields and beam-cavity interactions, work is carried out on the theory of beam breakup in single pulses. Work is also done on the analysis of the high-frequency behavior of longitudinal and transverse coupling impedances, including the examination of methods which may be used to measure these impedances. Finally, work is performed on the electromagnetic analysis of coupled cavities and on the coupling of cavities to waveguides.
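    For context on the matrix methods the Lie algebraic approach improves upon, here is a minimal sketch of linear transfer-matrix beam transport through a drift/thin-quadrupole cell. The element lengths and focal lengths are illustrative assumptions, not parameters of any machine discussed in the report.

```python
# Minimal sketch of the linear transfer-matrix formalism that the abstract
# contrasts with Lie algebraic methods. A drift of length L and a thin
# quadrupole of focal length f act on the phase-space vector (x, x').
import numpy as np

def drift(L):
    return np.array([[1.0, L], [0.0, 1.0]])

def thin_quad(f):
    return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

# One FODO-like cell (rightmost element acts first):
# focusing quad, drift, defocusing quad, drift.
cell = drift(1.0) @ thin_quad(-2.0) @ drift(1.0) @ thin_quad(2.0)

x0 = np.array([1e-3, 0.0])        # 1 mm initial offset, zero slope
print("after one cell:", cell @ x0)
# Lie algebraic maps generalize this: the matrix corresponds to the
# second-order polynomial f2 in exp(:f2:), and higher-order polynomials
# f3, f4, ... capture the nonlinear behavior of the elements.
```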

  19. The use of computer-aided learning in chemistry laboratory instruction

    NASA Astrophysics Data System (ADS)

    Allred, Brian Robert Tracy

    This research involves developing and implementing computer software for chemistry laboratory instruction. The specific goal is to design the software and investigate whether it can be used to introduce concepts and laboratory procedures without a lecture format. This would allow students to conduct an experiment even though they may not have been introduced to the chemical concept in their lecture course. This would also allow for another type of interaction for those students who respond more positively to a visual approach to instruction. The first module developed was devoted to using computer software to help introduce students to the concepts related to thin-layer chromatography and setting up and running an experiment. This was achieved through the use of digitized pictures and digitized video clips along with written information. A review quiz was used to help reinforce the learned information. The second module was devoted to the concept of the "dry lab". This module presented students with relevant information regarding the chemical concepts and then showed them the outcome of mixing solutions. From these observations, students were to determine the composition of unknown solutions based on provided descriptions and comparison with their written observations. The third piece of the software designed was a computer game. This program followed the first two modules in providing information the students were to learn. The difference here, though, was incorporating a game scenario for students to use to help reinforce the learning. Students were then assessed to see how much information they retained after playing the game. In each of the three cases, a control group exposed to the traditional lecture format was used. Their results were compared to the experimental group using the computer modules. Based upon the findings, it can be concluded that using technology to aid in the instructional process is definitely of benefit and students were more successful in

  20. Computer Network Availability at Sandia National Laboratories, Albuquerque NM: Measurement and Perception

    SciTech Connect

    NELSON,SPENCER D.; TOLENDINO,LAWRENCE F.

    1999-11-01

    The desire to provide a measure of computer network availability at Sandia National Laboratories has existed for a long time. Several attempts were made to build this measure by accurately recording network failures, identifying the type of network element involved, the root cause of the problem, and the time to repair the fault. Recognizing the limitations of available methods, it became obvious that another approach to determining network availability had to be defined. The chosen concept involved the periodic sampling of network services and applications from various network locations. A measure of "network" availability was then calculated based on the ratio of polling success to failure. The effort required to gather the information and produce a useful metric is not prohibitive, and the information gained has verified long-held feelings regarding network performance with real data.
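    The polling concept is simple to illustrate. Below is a minimal sketch, assuming hypothetical TCP targets, of sampling service reachability and computing availability as the ratio of polling successes to attempts; it is not the instrumentation actually used at Sandia.

```python
# Minimal sketch of the polling approach described above: periodically probe a
# set of network services and report availability as successes / attempts.
# Hosts, ports, and the single-cycle scope are placeholders, not Sandia's
# actual targets or schedule.
import socket

SERVICES = [("example.com", 80), ("example.org", 443)]  # hypothetical targets

def poll_once(host, port, timeout=2.0):
    """Return True if a TCP connection to (host, port) succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def availability(samples):
    """Availability metric: fraction of successful polls."""
    return sum(samples) / len(samples) if samples else 0.0

results = [poll_once(h, p) for h, p in SERVICES]
print(f"network availability this cycle: {availability(results):.1%}")
```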

  1. ELAS - A geobased information system that is transferable to several computers. [Earth resources Laboratory Applications Software

    NASA Technical Reports Server (NTRS)

    Whitley, S. L.; Pearson, R. W.; Seyfarth, B. R.; Graham, M. H.

    1981-01-01

    In the early years of remote sensing, emphasis was placed on the processing and analysis of data from a single multispectral sensor, such as the Landsat Multispectral Scanner System (MSS). However, in connection with attempts to use the data for resource management, it was realized that many deficiencies existed in single data sets. A need was established to geographically reference the MSS data and to register with it data from disparate sources. Technology transfer activities have required systems concepts that can be easily transferred to computers of different types in other organizations. ELAS (Earth Resources Laboratory Applications Software), a geographically based information system, was developed to meet these needs. ELAS accepts data from a variety of sources. It contains programs to geographically reference the data to the Universal Transverse Mercator grid. One of the primary functions of ELAS is to produce a surface cover map.

  2. Pencil-and-Paper Neural Networks: An Undergraduate Laboratory Exercise in Computational Neuroscience

    PubMed Central

    Crisp, Kevin M.; Sutter, Ellen N.; Westerberg, Jacob A.

    2015-01-01

    Although it has been more than 70 years since McCulloch and Pitts published their seminal work on artificial neural networks, such models remain primarily in the domain of computer science departments in undergraduate education. This is unfortunate, as simple network models offer undergraduate students a much-needed bridge between cellular neurobiology and processes governing thought and behavior. Here, we present a very simple laboratory exercise in which students constructed, trained and tested artificial neural networks by hand on paper. They explored a variety of concepts, including pattern recognition, pattern completion, noise elimination and stimulus ambiguity. Learning gains were evident in changes in the use of language when writing about information processing in the brain. PMID:26557791
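    To give a sense of the scale involved, here is a minimal sketch of a network small enough to evaluate by hand: a Hebbian autoassociator over six ±1 units that completes a corrupted pattern. The stored pattern and single synchronous update are illustrative choices, not taken from the published exercise.

```python
# Minimal sketch of the kind of network students could compute on paper: a
# Hebbian autoassociator over +/-1 units that completes a noisy pattern.
import numpy as np

stored = np.array([1, -1, 1, -1, 1, -1])          # pattern to memorize
W = np.outer(stored, stored).astype(float)        # Hebbian weight matrix
np.fill_diagonal(W, 0.0)                          # no self-connections

noisy = np.array([1, -1, -1, -1, 1, -1])          # one unit flipped
recalled = np.sign(W @ noisy)                     # one synchronous update
print("recalled:", recalled)                      # matches the stored pattern
```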

  3. A computational study of advanced exhaust system transition ducts with experimental validation

    NASA Technical Reports Server (NTRS)

    Wu, C.; Farokhi, S.; Taghavi, R.

    1992-01-01

    The current study is an application of CFD to a 'real' design and analysis environment. A subsonic, three-dimensional parabolized Navier-Stokes (PNS) code is used to construct stall margin design charts for optimum-length circular-to-rectangular transition ducts in advanced exhaust systems. Computer code validation has been conducted to examine the capability of wall static pressure predictions. The comparison of measured and computed wall static pressures indicates a reasonable accuracy of the PNS computer code results. Computations have also been conducted on 15 transition ducts, three area ratios, and five aspect ratios. The three area ratios investigated are a constant area ratio of unity, a moderately contracting area ratio of 0.8, and a highly contracting area ratio of 0.5. The degree of mean flow acceleration is identified as a dominant parameter in establishing the minimum duct length requirement. The effect of increasing aspect ratio in the minimum-length transition duct is to increase the length requirement, as well as to increase the mass-averaged total pressure losses. The design guidelines constructed from this investigation may aid in the design and manufacture of advanced exhaust systems for modern fighter aircraft.

  4. ADVANCING THE FUNDAMENTAL UNDERSTANDING AND SCALE-UP OF TRISO FUEL COATERS VIA ADVANCED MEASUREMENT AND COMPUTATIONAL TECHNIQUES

    SciTech Connect

    Biswas, Pratim; Al-Dahhan, Muthanna

    2012-11-01

    The objectives of this work are to advance the fundamental understanding of the hydrodynamics by systematically investigating the effect of design and operating variables, to evaluate the reported dimensionless groups as scaling factors, and to establish a reliable scale-up methodology for the TRISO fuel particle spouted bed coaters based on hydrodynamic similarity via advanced measurement and computational techniques. An additional objective is to develop an on-line non-invasive measurement technique based on gamma ray densitometry (i.e. Nuclear Gauge Densitometry) that can be installed and used for coater process monitoring to ensure proper performance and operation and to facilitate the developed scale-up methodology. To achieve the objectives set for the project, the work will use optical probes and gamma ray computed tomography (CT) (for the measurements of solids/voidage holdup cross-sectional distribution and radial profiles along the bed height, spout diameter, and fountain height) and radioactive particle tracking (RPT) (for the measurements of the 3D solids flow field, velocity, turbulent parameters, circulation time, solids Lagrangian trajectories, and many other spouted-bed-related hydrodynamic parameters). In addition, gas dynamic measurement techniques and pressure transducers will be utilized to complement the obtained information. The measurements obtained by these techniques will be used as benchmark data to evaluate and validate the computational fluid dynamic (CFD) models (two fluid model or discrete particle model) and their closures. The validated CFD models and closures will be used to facilitate the developed methodology for scale-up, design and hydrodynamic similarity. Successful execution of this work and the proposed tasks will advance the fundamental understanding of the coater flow field and quantify it for proper and safe design, scale-up, and performance. Such achievements will overcome the barriers to AGR applications and will help assure that the US maintains

  5. CRADA ORNL 91-0046B final report: Assessment of IBM advanced computing architectures

    SciTech Connect

    Geist, G.A.

    1996-02-01

    This was a Cooperative Research and Development Agreement (CRADA) with IBM to assess their advanced computer architectures. Over the course of this project three different architectures were evaluated: the POWER/4 RIOS1-based shared memory multiprocessor, the POWER/2 RIOS2-based high performance workstation, and the J30 PowerPC-based shared memory multiprocessor. In addition to this hardware, several software packages were beta tested for IBM, including the ESSL scientific computing library, the nv video-conferencing package, the Ultimedia multimedia display environment, FORTRAN 90 and C++ compilers, and the AIX 4.1 operating system. Both IBM and ORNL benefited from the research performed in this project, and even though access to the POWER/4 computer was delayed several months, all milestones were met.

  6. Advanced Optical Diagnostics for Ice Crystal Cloud Measurements in the NASA Glenn Propulsion Systems Laboratory

    NASA Technical Reports Server (NTRS)

    Bencic, Timothy J.; Fagan, Amy; Van Zante, Judith F.; Kirkegaard, Jonathan P.; Rohler, David P.; Maniyedath, Arjun; Izen, Steven H.

    2013-01-01

    A light extinction tomography technique has been developed to monitor ice water clouds upstream of a direct-connected engine in the Propulsion Systems Laboratory (PSL) at NASA Glenn Research Center (GRC). The system consists of 60 laser diodes with sheet-generating optics and 120 detectors mounted around a 36-inch diameter ring. The sources are pulsed sequentially while the detectors acquire line-of-sight extinction data for each laser pulse. Using computed tomography algorithms, the extinction data are analyzed to produce a plot of the relative water content in the measurement plane. To target the low-spatial-frequency nature of ice water clouds, unique tomography algorithms were developed using filtered back-projection methods and direct inversion methods that use Gaussian basis functions. With the availability of a priori knowledge of the mean droplet size and the total water content at some point in the measurement plane, the tomography system can provide near real-time in-situ quantitative full-field total water content data at a measurement plane approximately 5 feet upstream of the engine inlet. Results from ice crystal clouds in the PSL are presented. In addition to the optical tomography technique, laser sheet imaging has also been applied in the PSL to provide planar ice cloud uniformity and relative water content data during facility calibration before the tomography system was available, and also as validation data for the tomography system. A comparison between data from the laser sheet system and the light extinction tomography system is also presented. Very good agreement of imaged intensity and water content is demonstrated for both techniques. Also, comparative studies between the two techniques show excellent agreement in calculation of bulk total water content averaged over the center of the pipe.
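    The measurement behind each source/detector pair can be sketched compactly: Beer-Lambert attenuation along a chord gives a path-integrated extinction, -ln(I/I0), and the collection of such line integrals is what the tomography algorithms invert. The Gaussian cloud and chord geometry below are illustrative assumptions, not PSL data.

```python
# Minimal sketch of the light-extinction measurement behind the tomography:
# each source/detector pair yields one line integral ("projection") of the
# cloud's extinction field via Beer-Lambert, -ln(I/I0).
import numpy as np

def line_integral(field, p0, p1, n_steps=200):
    """Approximate the integral of `field` along the segment p0 -> p1."""
    ts = np.linspace(0.0, 1.0, n_steps)
    pts = p0[None, :] + ts[:, None] * (p1 - p0)[None, :]
    ds = np.linalg.norm(p1 - p0) / (n_steps - 1)
    return sum(field(x, y) for x, y in pts) * ds

def cloud(x, y):
    """Hypothetical water-content field: a Gaussian blob in the duct."""
    return np.exp(-(x**2 + y**2) / 0.1)

ext = line_integral(cloud, np.array([-1.0, 0.2]), np.array([1.0, 0.2]))
print(f"path extinction: {ext:.3f}, transmitted I/I0 = {np.exp(-ext):.3f}")
# Inverting many such chords across the 60-source / 120-detector ring
# recovers the relative water content map (e.g., by filtered back-projection).
```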

  7. Communication and computing technology in biocontainment laboratories using the NEIDL as a model.

    PubMed

    McCall, John; Hardcastle, Kath

    2014-07-01

    The National Emerging Infectious Diseases Laboratories (NEIDL), Boston University, is a globally unique biocontainment research facility housing biosafety level 2 (BSL-2), BSL-3, and BSL-4 laboratories. Located in the BioSquare area at the University's Medical Campus, it is part of a national network of secure facilities constructed to study infectious diseases of major public health concern. The NEIDL allows for basic, translational, and clinical phases of research to be carried out in a single facility with the overall goal of accelerating understanding, treatment, and prevention of infectious diseases. The NEIDL will also act as a center of excellence providing training and education in all aspects of biocontainment research. Within every detail of NEIDL operations is a primary emphasis on safety and security. The ultramodern NEIDL has required a new approach to communications technology solutions in order to ensure safety and security and meet the needs of investigators working in this complex building. This article discusses the implementation of secure wireless networks and private cloud computing to promote operational efficiency, biosecurity, and biosafety with additional energy-saving advantages. The utilization of a dedicated data center, virtualized servers, virtualized desktop integration, multichannel secure wireless networks, and a NEIDL-dedicated Voice over Internet Protocol (VoIP) network are all discussed.

  9. Advanced Simulation and Computing FY07-08 Implementation Plan Volume 2

    SciTech Connect

    Kusnezov, D; Hale, A; McCoy, M; Hopson, J

    2006-06-22

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future nonnuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program will require the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from

  10. Advanced Simulation & Computing FY09-FY10 Implementation Plan Volume 2, Rev. 0

    SciTech Connect

    Meisner, R; Perry, J; McCoy, M; Hopson, J

    2008-04-30

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future nonnuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC)1 is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one

  11. Advanced Simulation and Computing FY10-11 Implementation Plan Volume 2, Rev. 0

    SciTech Connect

    Carnes, B

    2009-06-08

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one that

  12. Advanced Simulation and Computing FY08-09 Implementation Plan Volume 2 Revision 0

    SciTech Connect

    McCoy, M; Kusnezov, D; Bickel, T; Hopson, J

    2007-04-25

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future nonnuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one

  13. Advanced Simulation and Computing FY09-FY10 Implementation Plan Volume 2, Rev. 1

    SciTech Connect

    Kissel, L

    2009-04-01

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one that

  14. Advanced Simulation and Computing FY09-FY10 Implementation Plan, Volume 2, Revision 0.5

    SciTech Connect

    Meisner, R; Hopson, J; Peery, J; McCoy, M

    2008-10-07

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC)1 is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one

  15. Advanced Simulation and Computing FY10-FY11 Implementation Plan Volume 2, Rev. 0.5

    SciTech Connect

    Meisner, R; Peery, J; McCoy, M; Hopson, J

    2009-09-08

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering (D&E) programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model

  16. Advanced Simulation and Computing Fiscal Year 2011-2012 Implementation Plan, Revision 0

    SciTech Connect

    McCoy, M; Phillips, J; Hopson, J; Meisner, R

    2010-04-22

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering (D&E) programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model

  17. Advanced Simulation and Computing FY08-09 Implementation Plan, Volume 2, Revision 0.5

    SciTech Connect

    Kusnezov, D; Bickel, T; McCoy, M; Hopson, J

    2007-09-13

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC)1 is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from

  18. Restructuring the CS 1 classroom: Examining the effect of open laboratory-based classes vs. closed laboratory-based classes on Computer Science 1 students' achievement and attitudes toward computers and computer courses

    NASA Astrophysics Data System (ADS)

    Henderson, Jean Foster

    The purpose of this study was to assess the effect of classroom restructuring involving computer laboratories on student achievement and student attitudes toward computers and computer courses. The effects of the targeted student attributes of gender, previous programming experience, math background, and learning style were also examined. The open lab-based class structure consisted of a traditional lecture class with a separate, unscheduled lab component in which lab assignments were completed outside of class; the closed lab-based class structure integrated a lab component within the lecture class so that half the class was reserved for lecture and half the class was reserved for students to complete lab assignments by working cooperatively with each other and under the supervision and guidance of the instructor. The sample consisted of 71 students enrolled in four intact classes of Computer Science I during the fall and spring semesters of the 2006-2007 school year at two southern universities: two classes were held in the fall (one at each university) and two classes were held in the spring (one at each university). A counterbalanced repeated measures design was used in which all students experienced both class structures for half of each semester. The order of control and treatment was rotated among the four classes. All students received the same amount of class and instructor time. A multivariate analysis of variance (MANOVA) via a multiple regression strategy was used to test the study's hypotheses. Although the overall MANOVA model was statistically significant, independent follow-up univariate analyses relative to each dependent measure found that the only significant research factor was math background: Students whose mathematics background was at the level of Calculus I or higher had significantly higher student achievement than students whose mathematics background was less than Calculus I. The results suggest that classroom structures that

  19. Naval Research Laboratory's programs in advanced indium phosphide solar cell development

    NASA Astrophysics Data System (ADS)

    Summers, Geoffrey P.

    1995-10-01

    The Naval Research Laboratory has been involved in developing InP solar cell technology since 1988. The purpose of these programs was to produce advanced cells for use in very high radiation environments, either as a result of operating satellites in the Van Allen belts or for very long duration missions in other orbits. Richard Statler was technical representative on the first program, with Spire Corporation as the contractor, which eventually produced several hundred high efficiency 2 x 2 sq cm single crystal InP cells. The shallow homojunction technology which was developed in this program enabled cells to be made with AM0, one-sun efficiencies greater than 19%. Many of these cells have been flown on space experiments, including PASP Plus, which have confirmed the high radiation resistance of InP cells. NRL has also published widely on the radiation response of these cells and also on radiation-induced defect levels detected by DLTS, especially the work of Rob Walters and Scott Messenger. In 1990 NRL began another Navy-sponsored program with Tim Coutts and Mark Wanlass at the National Renewable Energy Laboratory (NREL), to develop a one sun, two terminal space version of the InP-InGaAs tandem junction cell being investigated at NREL for terrestrial applications. These cells were grown on InP substrates. Several cells with AM0, one-sun efficiencies greater than 22% were produced. Two 2 x 2 sq cm cells were incorporated on the STRV 1A/B solar cell experiment. These were the only two-junction, tandem cells on the STRV experiment. The high cost and relative brittleness of InP wafers meant that if InP cell technology were to become a viable space power source, the superior radiation resistance of InP would have to be combined with a cheaper and more robust substrate. The main technical challenge was to overcome the effect of the dislocations produced by the lattice mismatch at the interface of the two materials. Over the last few years, NRL and Steve Wojtczuk at

  1. Advances with the Chinese anthelminthic drug tribendimidine in clinical trials and laboratory investigations.

    PubMed

    Xiao, Shu-Hua; Utzinger, Jürg; Tanner, Marcel; Keiser, Jennifer; Xue, Jian

    2013-05-01

    The anthelminthic drug tribendimidine was approved by Chinese authorities for human use in 2004, and a first comprehensive review was published in Acta Tropica in 2005. Here, we summarise further advances made through additional clinical trials and laboratory investigations. Two phase IV trials have been conducted in the People's Republic of China, the first one enrolling 1292 adolescents and adults aged 15-70 years and the second one conducted with 899 children aged 4-14 years who were infected with one or multiple species of soil-transmitted helminths. Oral tribendimidine (a single 400 mg enteric-coated tablet given to adolescents/adults and 200 mg to children) showed high cure rates against Ascaris lumbricoides (90.1-95.0%) and moderate-to-high cure rates against hookworm (82.0-88.4%). Another trial done in school-aged children using a rigorous diagnostic approach found a cure rate against hookworm of 76.5%. A single oral dose of tribendimidine showed only low cure rates against Trichuris trichiura (23.9-36.8%), confirming previous results. Tribendimidine administered to children infected with Enterobius vermicularis (two doses of 200 mg each on consecutive days) resulted in a high cure rate (97.1%). Importantly, a series of randomised, exploratory trials revealed that tribendimidine shows interesting activity against the liver flukes Opisthorchis viverrini and Clonorchis sinensis, the tapeworm Taenia spp. and the threadworm Strongyloides stercoralis, with respective cure rates of 70.0%, 40.0%, 53.3% and 36.4%. Pharmacokinetic studies in healthy Chinese volunteers indicated that after oral administration of tribendimidine, no parent drug was detected in plasma, but its primary metabolite, p-(1-dimethylamino ethylimino) aniline (aminoamidine, deacylated amidantel) (dADT), was found in plasma. dADT is then further metabolised to acetylated dADT (AdADT). dADT exhibits activity against several species of hookworm and C. sinensis in experimental studies, similar to

  2. Behavior of vortices generated by an advancing ejecta curtain in theory, in the laboratory, and on Mars

    NASA Technical Reports Server (NTRS)

    Barnouin, O. S.; Schultz, P. H.

    1993-01-01

    Several papers have examined the interaction between an atmosphere and advancing ejecta in order to assess possible atmospheric processes affecting ejecta emplacement. Ejecta travel through an atmosphere in two modes: larger ejecta blocks follow ballistic trajectories unhindered by the atmosphere; finer ejecta are entrained in a turbulent basal cloud, which develops as the advancing ejecta curtain generates strong atmospheric winds. Laboratory experiments reveal that this cloud of fine ejecta produces ramparts, flow lobes, or radial scouring that superposes larger ballistic ejecta emplaced earlier. Martian, Venusian, and terrestrial ejecta facies can be interpreted in terms of processes observed in the laboratory with appropriate first-order corrections for scaling. A continuum model of the atmospheric flow around an advancing inclined plate simulated and reproduced some of the complex flow patterns observed in front of and at the top of the curtain. Here we consider improvements to the model in order to compare quantitatively the approximate position of ejecta deposition (i.e., run-out distance) with laboratory experiments and Martian ejecta facies.

  3. Further advancements for large area-detector based computed tomography system

    SciTech Connect

    Davis, A. W.; Keating, S. C.; Claytor, T. N.

    2001-01-01

    We present advancements made to a large area-detector based system for industrial x-ray computed tomography. Past performance improvements in data acquisition speeds were made by use of high-resolution large area, flat-panel amorphous-silicon (a-Si) detectors. The detectors have proven, over several years, to be a robust alternative to CCD-optics and image intensifier CT systems. These detectors also provide the advantage of area detection as compared with the single slice geometry of linear array systems. New advancements in this system include parallel processing of sinogram reconstructions, improved visualization software and migration to frame-rate a-Si detectors. Parallel processing provides significant speed improvements for data reconstruction, and is implemented for parallel-beam, fan-beam and Feldkamp cone-beam reconstruction algorithms. Reconstruction times are reduced by an order of magnitude by use of a cluster of ten or more equal-speed computers. Advancements in data visualization are made through interactive software, which allows interrogation of the full three-dimensional dataset. Inspection examples presented in this paper include an electromechanical device, a nonliving biological specimen and a press-cast plastic specimen. We also present a commonplace item for the benefit of the layperson.
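    Because each axial slice's sinogram reconstructs independently, slice-level parallelism is natural. Below is a minimal sketch using a local process pool as a stand-in for the equal-speed computer cluster described above; the unfiltered back-projection is a deliberate simplification of the parallel-beam FBP and Feldkamp algorithms cited.

```python
# Minimal sketch of slice-parallel CT reconstruction: each slice's sinogram
# is independent, so slices can be farmed out to workers. A local process
# pool stands in for the ten-machine cluster; toy random sinograms stand in
# for real detector data.
import numpy as np
from multiprocessing import Pool

def backproject(args):
    """Unfiltered parallel-beam back-projection of one slice."""
    sinogram, thetas, n = args
    xs = np.linspace(-1.0, 1.0, n)
    X, Y = np.meshgrid(xs, xs)
    det = np.linspace(-1.0, 1.0, sinogram.shape[1])
    recon = np.zeros((n, n))
    for proj, th in zip(sinogram, thetas):
        s = X * np.cos(th) + Y * np.sin(th)   # detector coord of each pixel
        recon += np.interp(s, det, proj)      # smear projection across image
    return recon / len(thetas)

if __name__ == "__main__":
    thetas = np.linspace(0.0, np.pi, 90, endpoint=False)
    slices = [np.random.rand(90, 64) for _ in range(8)]   # toy sinograms
    with Pool() as pool:
        volume = pool.map(backproject, [(s, thetas, 64) for s in slices])
    print(f"reconstructed {len(volume)} slices of shape {volume[0].shape}")
```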

  4. Determining the hydraulic properties of saturated, low-permeability geological materials in the laboratory: Advances in theory and practice

    USGS Publications Warehouse

    Zhang, M.; Takahashi, M.; Morin, R.H.; Endo, H.; Esaki, T.; ,

    2002-01-01

    The accurate hydraulic characterization of low-permeability subsurface environments has important practical significance. In order to examine this issue from the perspective of laboratory-based approaches, we review some recent advancements in the theoretical analyses of three different laboratory techniques specifically applied to low-permeability geologic materials: constant-head, constant flow-rate and transient-pulse permeability tests. Some potential strategies for effectively decreasing the time required to confidently estimate the permeability of these materials are presented. In addition, a new and versatile laboratory system is introduced that can implement any of these three test methods while simultaneously subjecting a specimen to high confining pressures and pore pressures, thereby simulating in situ conditions at great depths. The capabilities and advantages of this innovative system are demonstrated using experimental data derived from Shirahama sandstone and Inada granite, two rock types widely encountered in Japan.
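    For the constant-head case, the permeability estimate reduces to Darcy's law once flow reaches steady state. The sketch below shows that calculation with illustrative specimen dimensions and readings; the values are not the Shirahama sandstone or Inada granite data.

```python
# Minimal sketch of the constant-head permeability calculation via Darcy's
# law, q = (k * A / mu) * (dp / L). All input values are illustrative.
import math

def darcy_permeability(q, mu, length, area, dp):
    """Permeability k [m^2] from steady flow through a specimen."""
    return q * mu * length / (area * dp)

q = 1.0e-12                 # m^3/s, steady volumetric flow (illustrative)
mu = 1.0e-3                 # Pa*s, water viscosity near 20 degC
length = 0.05               # m, specimen length
area = math.pi * 0.025**2   # m^2, 50-mm-diameter core
dp = 1.0e6                  # Pa, applied pressure difference

k = darcy_permeability(q, mu, length, area, dp)
print(f"permeability: {k:.2e} m^2 (~{k / 9.87e-13:.2e} darcy)")
```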

  5. Recent advances in computer-aided drug design as applied to anti-influenza drug discovery.

    PubMed

    Mallipeddi, Prema L; Kumar, Gyanendra; White, Stephen W; Webb, Thomas R

    2014-01-01

    Influenza is a seasonal and serious health threat, and the recent outbreak of H7N9 following the pandemic spread of H1N1 in 2009 has served to emphasize the importance of anti-influenza drug discovery. Zanamivir (Relenza™) and oseltamivir (Tamiflu®) are two antiviral drugs currently recommended by the CDC for treating influenza. Both are examples of the successful application of structure-based drug design strategies. These strategies have combined computer-based approaches, such as docking- and pharmacophore-based virtual screening, with X-ray crystallographic structural analyses. Docking is a routinely used computational method to identify potential hits from large compound libraries. This method has evolved from simple rigid docking approaches to flexible docking methods to handle receptor flexibility and to enhance hit rates in virtual screening. Virtual screening approaches can employ both ligand-based and structure-based pharmacophore models depending on the available information. The exponential growth in computing power has increasingly facilitated the application of computer-aided methods in drug discovery, and they now play significant roles in the search for novel therapeutics. An overview of these computational tools is presented in this review, and recent advances and challenges will be discussed. The focus of the review will be anti-influenza drug discovery and how advances in our understanding of viral biology have led to the discovery of novel influenza protein targets. Also discussed will be strategies to circumvent the problem of resistance emerging from rapid mutations that has seriously compromised the efficacy of current anti-influenza therapies.

  6. Advanced computational simulation for design and manufacturing of lightweight material components for automotive applications

    SciTech Connect

    Simunovic, S.; Aramayo, G.A.; Zacharia, T.; Toridis, T.G.; Bandak, F.; Ragland, C.L.

    1997-04-01

    Computational vehicle models for the analysis of lightweight material performance in automobiles have been developed through collaboration between Oak Ridge National Laboratory, the National Highway Transportation Safety Administration, and George Washington University. The vehicle models have been verified against experimental data obtained from vehicle collisions. The crashed vehicles were analyzed, and the main impact energy dissipation mechanisms were identified and characterized. Important structural parts were extracted and digitized and directly compared with simulation results. High-performance computing played a key role in the model development because it allowed for rapid computational simulations and model modifications. The deformation of the computational model shows a very good agreement with the experiments. This report documents the modifications made to the computational model and relates them to the observations and findings on the test vehicle. Procedural guidelines are also provided that the authors believe need to be followed to create realistic models of passenger vehicles that could be used to evaluate the performance of lightweight materials in automotive structural components.

  7. High Resolution Traction Force Microscopy Based on Experimental and Computational Advances

    PubMed Central

    Sabass, Benedikt; Gardel, Margaret L.; Waterman, Clare M.; Schwarz, Ulrich S.

    2008-01-01

    Cell adhesion and migration crucially depend on the transmission of actomyosin-generated forces through sites of focal adhesion to the extracellular matrix. Here we report experimental and computational advances in improving the resolution and reliability of traction force microscopy. First, we introduce the use of two differently colored nanobeads as fiducial markers in polyacrylamide gels and explain how the displacement field can be computationally extracted from the fluorescence data. Second, we present different improvements regarding standard methods for force reconstruction from the displacement field, which are the boundary element method, Fourier-transform traction cytometry, and traction reconstruction with point forces. Using extensive data simulation, we show that the spatial resolution of the boundary element method can be improved considerably by splitting the elastic field into near, intermediate, and far field. Fourier-transform traction cytometry requires considerably less computer time, but can achieve a comparable resolution only when combined with Wiener filtering or appropriate regularization schemes. Both methods tend to underestimate forces, especially at small adhesion sites. Traction reconstruction with point forces does not suffer from this limitation, but is only applicable with stationary and well-developed adhesion sites. Third, we combine these advances and for the first time reconstruct fibroblast traction with a spatial resolution of ∼1 μm. PMID:17827246
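
    Regularized Fourier-transform traction cytometry is compact enough to sketch. The following is a generic implementation of the idea (the Boussinesq Green's function inverted per wavevector with Tikhonov regularization), not the authors' code; the default regularization parameter is illustrative.

        import numpy as np

        def fttc(ux, uy, pixel, E, nu=0.5, lam=1e-9):
            """Regularized Fourier-transform traction cytometry (generic sketch).
            ux, uy: 2D displacement fields (m); pixel: grid spacing (m);
            E: substrate Young's modulus (Pa); lam: Tikhonov parameter."""
            n, m = ux.shape
            kx = 2 * np.pi * np.fft.fftfreq(m, d=pixel)
            ky = 2 * np.pi * np.fft.fftfreq(n, d=pixel)
            KX, KY = np.meshgrid(kx, ky)
            k = np.hypot(KX, KY)
            k[0, 0] = 1.0  # avoid division by zero at the zero mode
            # Boussinesq Green's function in Fourier space (symmetric 2x2 per mode)
            pref = 2 * (1 + nu) / (E * k**3)
            a = pref * ((1 - nu) * k**2 + nu * KY**2)   # Gxx
            c = pref * ((1 - nu) * k**2 + nu * KX**2)   # Gyy
            b = -pref * nu * KX * KY                    # Gxy = Gyx
            Ux, Uy = np.fft.fft2(ux), np.fft.fft2(uy)
            # Tikhonov-regularized inversion: t = (G^T G + lam*I)^-1 G^T u
            A11, A12, A22 = a*a + b*b + lam, b*(a + c), b*b + c*c + lam
            det = A11 * A22 - A12**2
            rx, ry = a*Ux + b*Uy, b*Ux + c*Uy
            Tx = (A22 * rx - A12 * ry) / det
            Ty = (A11 * ry - A12 * rx) / det
            Tx[0, 0] = Ty[0, 0] = 0.0  # enforce zero net traction
            return np.real(np.fft.ifft2(Tx)), np.real(np.fft.ifft2(Ty))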

  9. Effects of Combined Hands-on Laboratory and Computer Modeling on Student Learning of Gas Laws: A Quasi-Experimental Study

    ERIC Educational Resources Information Center

    Liu, Xiufeng

    2006-01-01

    Based on current theories of chemistry learning, this study intends to test a hypothesis that computer modeling enhanced hands-on chemistry laboratories are more effective than hands-on laboratories or computer modeling laboratories alone in facilitating high school students' understanding of chemistry concepts. Thirty-three high school chemistry…

  10. The advancement in using remote laboratories in electrical engineering education: a review

    NASA Astrophysics Data System (ADS)

    Almarshoud, A. F.

    2011-10-01

    The rapid development of Internet technology and its great popularity have led some universities around the world to incorporate web-based learning into some of their programmes. This paper presents a comprehensive survey of publications on the use of remote laboratories in electrical engineering education. Remote laboratories are web-based, real-time laboratories that enable students to take and control measurements remotely in their own time. The survey highlights the features of many recent remote laboratories and demonstrates the software and networking technologies used. The paper provides a comprehensive overview of several aspects of remote laboratory development, concentrating on publications from the last decade. The review is arranged by area of specialisation, then chronologically.

  11. Computational mechanics - Advances and trends; Proceedings of the Session - Future directions of Computational Mechanics of the ASME Winter Annual Meeting, Anaheim, CA, Dec. 7-12, 1986

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Editor)

    1986-01-01

    The papers contained in this volume provide an overview of the advances made in a number of aspects of computational mechanics, identify some of the anticipated industry needs in this area, discuss the opportunities provided by new hardware and parallel algorithms, and outline some of the current government programs in computational mechanics. Papers are included on advances and trends in parallel algorithms, supercomputers for engineering analysis, material modeling in nonlinear finite-element analysis, the Navier-Stokes computer, and future finite-element software systems.

  12. Body surface area measurement in laboratory miniature pigs using a computed tomography scanner.

    PubMed

    Itoh, Tadashi; Kawabe, Mifumi; Nagase, Takahiko; Endo, Katsumi; Miyoshi, Masafumi; Miyahara, Kazuro

    2016-01-01

    The body surface area (BSA) of an organism is an important parameter for evaluating physiological functions. In drug development, normalization by BSA is an appropriate method for extrapolating doses between species. The BSA of animals has generally been estimated by multiplying a constant by a power of the body weight (BW). Recently, the use of miniature pigs in non-clinical studies for medical drugs or devices has gradually been increasing; however, verification of their BSA is not yet sufficient. In this study, we measured the BSAs of 40 laboratory miniature pigs (11 males and 9 females of the Göttingen minipig and 14 males and 6 females of the Nippon Institute for Biological Science [NIBS] miniature pig) by analyzing computed tomography (CT) images, since measurements using a CT scanner were expected to determine BSA more precisely than classical measuring techniques. The measurement results showed the BSAs of the 20 Göttingen minipigs to range from 0.4358 to 0.8356 m² (working BW range: 12.7-37.0 kg) and the 20 NIBS miniature pigs to range from 0.2906 to 0.8675 m² (working BW range: 7.9-41.5 kg). Since accuracy and reproducibility were confirmed by measuring the surface area of an acrylic cuboid, we concluded the measurement method employed in this study to be very reliable. We propose the following formula for estimating the BSA of laboratory miniature pigs: 100 × BSA [m²] = 7.98 × BW [kg]^(2/3). PMID:27665773
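
    The proposed formula is straightforward to apply; a minimal sketch using the coefficients from the abstract:

        def bsa_miniature_pig(bw_kg):
            """BSA from the proposed formula: 100*BSA[m^2] = 7.98*BW[kg]^(2/3)."""
            return 7.98 * bw_kg ** (2.0 / 3.0) / 100.0

        print(f"BSA at 20 kg: {bsa_miniature_pig(20.0):.3f} m^2")  # ~0.588 m^2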

  13. Advanced computer techniques for inverse modeling of electric current in cardiac tissue

    SciTech Connect

    Hutchinson, S.A.; Romero, L.A.; Diegert, C.F.

    1996-08-01

    For many years, ECGs and vector cardiograms have been the tools of choice for non-invasive diagnosis of cardiac conduction problems, such as those found in reentrant tachycardia or Wolff-Parkinson-White (WPW) syndrome. Through skillful analysis of these skin-surface measurements of cardiac-generated electric currents, a physician can deduce the general location of heart conduction irregularities. Using a combination of high-fidelity geometry modeling, advanced mathematical algorithms and massively parallel computing, Sandia's approach would provide much more accurate information and thus allow the physician to pinpoint the source of an arrhythmia or abnormal conduction pathway.

  14. Unenhanced CT in the evaluation of urinary calculi: application of advanced computer methods.

    PubMed

    Olcott, E W; Sommer, F G

    1999-04-01

    Recent advances in computer hardware and software technology enable radiologists to examine tissues and structures using three-dimensional figures constructed from the multiple planar images acquired during a spiral CT examination. Three-dimensional CT techniques permit the linear dimensions of renal calculi to be determined along all three coordinate axes with a high degree of accuracy and enable direct volumetric analysis of calculi, yielding information that is not available from any other diagnostic modality. Additionally, three-dimensional techniques can help to identify and localize calculi in patients with suspected urinary colic.
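
    A toy sketch of the kind of volumetric and linear measurements described, assuming a 3D CT array segmented by a Hounsfield-unit threshold; the threshold and voxel sizes are illustrative, not values from the article.

        import numpy as np

        def calculus_metrics(ct_hu, voxel_mm, threshold_hu=200.0):
            """Volume by voxel counting and linear extents along the three axes.
            ct_hu: 3D array of Hounsfield units; voxel_mm: (dz, dy, dx) in mm;
            threshold_hu: segmentation threshold (illustrative value)."""
            mask = ct_hu >= threshold_hu
            volume = mask.sum() * float(np.prod(voxel_mm))
            idx = np.nonzero(mask)
            extents = tuple(float((ax.max() - ax.min() + 1) * d)
                            for ax, d in zip(idx, voxel_mm))
            return volume, extents

        # Synthetic example: a 6x4x5-voxel "stone" in a 64^3 volume
        ct = np.zeros((64, 64, 64))
        ct[30:36, 20:24, 40:45] = 600.0
        vol, ext = calculus_metrics(ct, (1.0, 0.5, 0.5))
        print(vol, "mm^3;", ext, "mm")   # 30.0 mm^3; (6.0, 2.0, 2.5) mm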

  15. DOE Advanced Scientific Computing Advisory Committee (ASCAC) Subcommittee Report on Scientific and Technical Information

    SciTech Connect

    Hey, Tony; Agarwal, Deborah; Borgman, Christine; Cartaro, Concetta; Crivelli, Silvia; Van Dam, Kerstin Kleese; Luce, Richard; Shankar, Arjun; Trefethen, Anne; Wade, Alex; Williams, Dean

    2015-09-04

    The Advanced Scientific Computing Advisory Committee (ASCAC) was charged to form a standing subcommittee to review the Department of Energy’s Office of Scientific and Technical Information (OSTI) and to begin by assessing the quality and effectiveness of OSTI’s recent and current products and services and to comment on its mission and future directions in the rapidly changing environment for scientific publication and data. The Committee met with OSTI staff and reviewed available products, services and other materials. This report summarizes their initial findings and recommendations.

  16. Advanced Computing Technologies for Rocket Engine Propulsion Systems: Object-Oriented Design with C++

    NASA Technical Reports Server (NTRS)

    Bekele, Gete

    2002-01-01

    This document explores the use of advanced computer technologies, with an emphasis on object-oriented design, to be applied in the development of software for a rocket engine to improve vehicle safety and reliability. The primary focus is on phase one of this project, the smart start sequence module. The objectives are: 1) to use current, sound software engineering practices, namely object orientation; 2) to improve software development time, maintenance, execution, and management; 3) to provide an alternate design choice for control, implementation, and performance.
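
    A minimal sketch of the kind of object-oriented start-sequence design the abstract describes; the class and step names are hypothetical (the project itself used C++).

        from abc import ABC, abstractmethod

        class StartStep(ABC):
            """One step in a hypothetical smart start sequence."""
            @abstractmethod
            def execute(self): ...
            @abstractmethod
            def verify(self) -> bool: ...

        class PurgeLines(StartStep):
            def execute(self): print("purging propellant lines")
            def verify(self): return True   # would check sensor readings here

        class EnergizeIgniter(StartStep):
            def execute(self): print("energizing igniter")
            def verify(self): return True

        class StartSequence:
            """Runs steps in order and aborts on the first failed verification."""
            def __init__(self, steps):
                self.steps = steps
            def run(self):
                for step in self.steps:
                    step.execute()
                    if not step.verify():
                        print("abort at", type(step).__name__)
                        return False
                return True

        StartSequence([PurgeLines(), EnergizeIgniter()]).run()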

  17. Computational Models of Exercise on the Advanced Resistance Exercise Device (ARED)

    NASA Technical Reports Server (NTRS)

    Newby, Nate; Caldwell, Erin; Scott-Pandorf, Melissa; Peters, Brian; Fincke, Renita; DeWitt, John; Ploutz-Snyder, Lori

    2011-01-01

    Muscle and bone loss remain a concern for crews returning from space flight. The advanced resistance exercise device (ARED) is used for on-orbit resistance exercise to help mitigate these losses. However, how the ARED loads the body in microgravity has yet to be characterized. Computational models allow us to analyze ARED exercise in both 1G and 0G environments. To this end, biomechanical models of the squat, single-leg squat, and deadlift exercises on the ARED have been developed to further investigate the bone and muscle forces resulting from these exercises.

  18. Advanced Computational Thermal Fluid Physics (CTFP) and Its Assessment for Light Water Reactors and Supercritical Reactors

    SciTech Connect

    D.M. McEligot; K. G. Condie; G. E. McCreery; H. M. McIlroy; R. J. Pink; L.E. Hochreiter; J.D. Jackson; R.H. Pletcher; B.L. Smith; P. Vukoslavcevic; J.M. Wallace; J.Y. Yoo; J.S. Lee; S.T. Ro; S.O. Park

    2005-10-01

    Background: The ultimate goal of the study is the improvement of predictive methods for safety analyses and design of Generation IV reactor systems such as supercritical water reactors (SCWR) for higher efficiency, improved performance and operation, design simplification, enhanced safety and reduced waste and cost. The objective of this Korean/US laboratory/university collaboration of coupled fundamental computational and experimental studies is to develop the supporting knowledge needed for improved predictive techniques for use in the technology development of Generation IV reactor concepts and their passive safety systems. The present study emphasizes SCWR concepts in the Generation IV program.

  19. Advanced Simulation and Computing Fiscal Year 2016 Implementation Plan, Version 0

    SciTech Connect

    McCoy, M.; Archer, B.; Hendrickson, B.

    2015-08-27

    The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The purpose of this Implementation Plan (IP) is to outline key work requirements to be performed and to control individual work activities within the scope of work. Contractors may not deviate from this plan without a revised work authorization (WA) or subsequent IP.

  1. Computational methods to extract meaning from text and advance theories of human cognition.

    PubMed

    McNamara, Danielle S

    2011-01-01

    Over the past two decades, researchers have made great advances in the area of computational methods for extracting meaning from text. This research has to a large extent been spurred by the development of latent semantic analysis (LSA), a method for extracting and representing the meaning of words using statistical computations applied to large corpora of text. Since the advent of LSA, researchers have developed and tested alternative statistical methods designed to detect and analyze meaning in text corpora. This research exemplifies how statistical models of semantics play an important role in our understanding of cognition and contribute to the field of cognitive science. Importantly, these models afford large-scale representations of human knowledge and allow researchers to explore various questions regarding knowledge, discourse processing, text comprehension, and language. This topic includes the latest progress by the leading researchers in the endeavor to go beyond LSA. PMID:25164173
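
    The core LSA computation is compact enough to sketch: a term-document count matrix is factored by singular value decomposition, and similarity is measured in the reduced latent space. This toy three-document corpus is invented for illustration; real applications use large corpora and term weighting.

        import numpy as np

        docs = ["the cat sat on the mat",
                "the dog sat on the log",
                "stocks fell sharply today"]
        vocab = sorted({w for d in docs for w in d.split()})
        # Term-document count matrix (rows: words, columns: documents)
        X = np.array([[d.split().count(w) for d in docs] for w in vocab], float)

        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        k = 2                                     # number of latent dimensions
        doc_vecs = (np.diag(s[:k]) @ Vt[:k]).T    # documents in latent space

        def cos(a, b):
            return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

        print(cos(doc_vecs[0], doc_vecs[1]))  # animal sentences: high similarity
        print(cos(doc_vecs[0], doc_vecs[2]))  # unrelated topic: much lower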

  2. Integrated Computational Materials Engineering (ICME) for Third Generation Advanced High-Strength Steel Development

    SciTech Connect

    Savic, Vesna; Hector, Louis G.; Ezzat, Hesham; Sachdev, Anil K.; Quinn, James; Krupitzer, Ronald; Sun, Xin

    2015-06-01

    This paper presents an overview of a four-year project focused on development of an integrated computational materials engineering (ICME) toolset for third generation advanced high-strength steels (3GAHSS). Following a brief look at ICME as an emerging discipline within the Materials Genome Initiative, technical tasks in the ICME project will be discussed. Specific aims of the individual tasks are multi-scale, microstructure-based material model development using state-of-the-art computational and experimental techniques, forming, toolset assembly, design optimization, integration and technical cost modeling. The integrated approach is initially illustrated using a 980 grade transformation induced plasticity (TRIP) steel, subject to a two-step quenching and partitioning (Q&P) heat treatment, as an example.

  3. LABORATORY PROCESS CONTROLLER USING NATURAL LANGUAGE COMMANDS FROM A PERSONAL COMPUTER

    NASA Technical Reports Server (NTRS)

    Will, H.

    1994-01-01

    The complex environment of the typical research laboratory requires flexible process control. This program provides natural language process control from an IBM PC or compatible machine. Process control schedules sometimes require frequent changes, even several times per day. These changes may include adding, deleting, and rearranging steps in a process. This program sets up a process control system that can either run without an operator or be run by workers with limited programming skills. The software system includes three programs. Two of the programs, written in FORTRAN77, record data and control research processes. The third program, written in Pascal, generates the FORTRAN subroutines used by the other two programs to identify the user commands with the user-written device drivers. The software system also includes an input data set which allows the user to define the user commands which are to be executed by the computer. To set the system up, the operator writes device driver routines for all of the controlled devices. Once set up, the system requires only an input file containing natural language command lines which tell it what to do and when to do it. The operator can define custom commands for operating and taking data from external research equipment that execute at any time of the day or night with no one in attendance. This process control system requires a personal computer operating under MS-DOS with suitable hardware interfaces to all controlled devices. The program requires a FORTRAN77 compiler and user-written device drivers. This program was developed in 1989 and has a memory requirement of about 62 Kbytes.
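
    A minimal sketch of the dispatch idea described above: command lines are matched against user-registered device-driver functions. The command names and drivers are hypothetical, and the original system used FORTRAN77 and Pascal rather than the language shown.

        import shlex

        drivers = {}

        def driver(name):
            """Register a device-driver function under a command name."""
            def register(fn):
                drivers[name] = fn
                return fn
            return register

        @driver("set heater")
        def set_heater(celsius):
            print(f"heater setpoint -> {celsius} C")

        @driver("read pressure")
        def read_pressure():
            print("pressure logged")

        def run_script(lines):
            for line in lines:
                tokens = shlex.split(line.lower())
                # Longest-prefix match against registered command names
                for n in range(len(tokens), 0, -1):
                    name = " ".join(tokens[:n])
                    if name in drivers:
                        drivers[name](*map(float, tokens[n:]))
                        break

        run_script(["SET HEATER 80", "READ PRESSURE"])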

  4. Computational Fluid Dynamics Simulation of the Hydrogen Reduction of Magnetite Concentrate in a Laboratory Flash Reactor

    NASA Astrophysics Data System (ADS)

    Fan, De-Qiu; Sohn, H. Y.; Mohassab, Yousef; Elzohiery, Mohamed

    2016-08-01

    A three-dimensional computational fluid dynamics (CFD) model was developed to study the hydrogen reduction of magnetite concentrate particles in a laboratory flash reactor representing a novel flash ironmaking process. The model was used to simulate the fluid flow, heat transfer, and chemical reactions involved. The governing equations for the gas phase were solved in the Eulerian frame of reference while the particles were tracked in the Lagrangian framework. The change in the particle mass was related to the chemical reaction and the particle temperature was calculated by taking into consideration the heat of reaction, convection, and radiation. The stochastic trajectory model was used to describe particle dispersion due to turbulence. Partial combustion of H2 by O2 injected through a non-premixed burner was also simulated in this study. The partial combustion mechanism used in this model consisted of seven chemical reactions involving six species. The temperature profiles and reduction degrees obtained from the simulations satisfactorily agreed with the experimental measurements.
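
    A toy version of the Lagrangian particle energy balance described, with convection, radiation, and a reaction heat sink integrated by forward Euler; all property values and the crude rate switch are illustrative, not the paper's inputs, and the particle mass change is neglected except as a reaction-rate proxy.

        import numpy as np

        # Toy single-particle energy balance: convection, radiation, and an
        # endothermic reaction heat sink, integrated with forward Euler.
        sigma = 5.67e-8            # Stefan-Boltzmann constant, W/m^2K^4
        d = 30e-6                  # particle diameter, m
        rho, cp = 5000.0, 800.0    # particle density kg/m^3, heat capacity J/kgK
        eps, h = 0.8, 2000.0       # emissivity; convective coefficient W/m^2K
        Tg, Tw = 1600.0, 1500.0    # gas and wall temperatures, K
        dh_rxn = 5.0e5             # heat absorbed per kg reduced, J/kg

        A = np.pi * d**2                  # particle surface area, m^2
        m = rho * np.pi * d**3 / 6.0      # particle mass (held constant here), kg
        Tp, t, dt = 300.0, 0.0, 1.0e-5
        while t < 0.05:
            rate = 1.0e-9 if Tp > 800.0 else 0.0   # crude reduction rate, kg/s
            q = (h * A * (Tg - Tp)
                 + eps * sigma * A * (Tw**4 - Tp**4)
                 - dh_rxn * rate)
            Tp += q / (m * cp) * dt
            t += dt
        print(f"particle temperature after {t*1e3:.0f} ms: {Tp:.0f} K")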

  5. High Performance Computing: Advanced Research Projects Agency Should Do More To Foster Program Goals. Report to the Chairman, Committee on Armed Services, House of Representatives.

    ERIC Educational Resources Information Center

    General Accounting Office, Washington, DC. Information Management and Technology Div.

    High-performance computing refers to the use of advanced computing technologies to solve highly complex problems in the shortest possible time. The federal High Performance Computing and Communications Initiative of the Advanced Research Projects Agency (ARPA) attempts to accelerate availability and use of high performance computers and networks.…

  6. Proceedings of the workshop on advanced computer technologies and biological sequencing

    SciTech Connect

    Not Available

    1988-11-01

    The participants in the workshop agree that advanced computer technologies will play a significant role in biological sequencing. They suggest a strategy based on the following four recommendations: define a set of model projects, and develop a complete set of data management and analysis tools for these model projects; seek to consolidate appropriate databases, while allowing for the flexible development and design of tools that will permit further consolidation and, in the longer term, develop a coordinated effort that will allow networking of all relevant databases; encourage the development, collection, and distribution of analysis tools; and address user interface issues and encourage the development of graphics and visualization tools. Section 3 of this report elaborates on each of these recommendations. Section 2 contains the tutorials presented at the workshop and a summary of the comments made in the discussion period following the tutorials. These tutorials were an integral part of the workshop: they provided a forum for the discussion of the needs of biologists in managing and analyzing biological sequencing data, and the capabilities of advanced computer technologies in meeting those needs. Also included in Section 2 is an informal paper on fifth generation technologies, prepared by two of the participants. Appendix A contains the documents (edited for grammar) prepared by the participants and groups at the workshop. Appendix B contains the workshop program.

  7. Computing and information services at the Jet Propulsion Laboratory - A management approach to a diversity of needs

    NASA Technical Reports Server (NTRS)

    Felberg, F. H.

    1984-01-01

    The Jet Propulsion Laboratory, a research and development organization with about 5,000 employees, presents a complicated set of requirements for an institutional system of computing and informational services. The approach taken by JPL in meeting this challenge is one of controlled flexibility. A central communications network is provided, together with selected computing facilities for common use. At the same time, staff members are given considerable discretion in choosing the mini- and microcomputers that they believe will best serve their needs. Consultation services, computer education, and other support functions are also provided.

  8. To Compare the Effects of Computer Based Learning and the Laboratory Based Learning on Students' Achievement Regarding Electric Circuits

    ERIC Educational Resources Information Center

    Bayrak, Bekir; Kanli, Uygar; Ingec, Sebnem Kandil

    2007-01-01

    In this study, the research problem was: "Is the computer based physics instruction as effective as laboratory intensive physics instruction with regards to academic success on electric circuits 9th grade students?" For this research of experimental quality the design of pre-test and post-test are applied with an experiment and a control group.…

  9. Comparison of a Computer Simulation Program and a Traditional Laboratory Practical Class for Teaching the Principles of Intestinal Absorption.

    ERIC Educational Resources Information Center

    Dewhurst, D. G.; And Others

    1994-01-01

    Evaluates the effectiveness of an interactive computer-assisted learning program for undergraduate students that simulates experiments performed using isolated, everted sacs of rat small intestine. The program is designed to offer an alternative student-centered approach to traditional laboratory-based practical classes. Knowledge gain of students…

  10. Combining a Laboratory Practical Class with a Computer Simulation: Studies on the Synthesis of Urea in Isolated Hepatocytes.

    ERIC Educational Resources Information Center

    Bender, David A.

    1986-01-01

    Describes how a computer simulation is used with a laboratory experiment on the synthesis of urea in isolated hepatocytes. The simulation calculates the amount of urea formed and the amount of ammonium remaining as the concentrations of ornithine, citrulline, argininosuccinate, arginine, and aspartate are altered. (JN)

  11. The Use and Benefits of Computer Aided Learning in the Assessment of the Laboratory Exercise "Enzyme Induction in Escherichia coli".

    ERIC Educational Resources Information Center

    Pamula, F.; And Others

    1995-01-01

    Describes an interactive computer program written to provide accurate and immediate feedback to students while they are processing experimental data. Discusses the problems inherent in laboratory courses that led to the development of this program. Advantages of the software include allowing students to work at their own pace in a nonthreatening…

  12. A Low-Cost Computer-Controlled Arduino-Based Educational Laboratory System for Teaching the Fundamentals of Photovoltaic Cells

    ERIC Educational Resources Information Center

    Zachariadou, K.; Yiasemides, K.; Trougkakos, N.

    2012-01-01

    We present a low-cost, fully computer-controlled, Arduino-based, educational laboratory (SolarInsight) to be used in undergraduate university courses concerned with electrical engineering and physics. The major goal of the system is to provide students with the necessary instrumentation, software tools and methodology in order to learn fundamental…
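
    The fundamental relation such a photovoltaic teaching system typically demonstrates is the single-diode cell model, I = Iph − I0·(exp(V/(n·Vt)) − 1); a minimal sketch with illustrative parameters (not values from the SolarInsight system):

        import numpy as np

        q, kB, T = 1.602e-19, 1.381e-23, 298.0
        Vt = kB * T / q                  # thermal voltage, ~25.7 mV
        Iph, I0, n = 0.5, 1e-8, 1.3      # photocurrent A, saturation current A, ideality

        V = np.linspace(0.0, 0.6, 500)
        I = Iph - I0 * np.expm1(V / (n * Vt))   # single-diode I-V curve
        P = V * I
        i = np.argmax(P)
        print(f"open-circuit voltage ~ {V[np.argmin(np.abs(I))]:.2f} V")
        print(f"max power point: V={V[i]:.2f} V, I={I[i]:.2f} A, P={P[i]:.2f} W")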

  13. Expanding Distance Education in the Spatial Sciences through Virtual Learning Entities and a Virtual GIS Computer Laboratory

    ERIC Educational Resources Information Center

    Grunwald, S.; Ramasundaram, V.; Bruland, G. L.; Jesseman, D. K.

    2007-01-01

    In this article we describe the implementation of an emerging virtual learning environment to teach GIS and spatial sciences to distance education graduate students. We discuss the benefits and constraints of our mixed architecture with the main focus on the innovative hybrid architecture of the virtual GIS computer laboratory. Criteria that were…

  14. Setting up a Low-Cost Lab Management System for a Multi-Purpose Computing Laboratory Using Virtualisation Technology

    ERIC Educational Resources Information Center

    Mok, Heng Ngee; Lee, Yeow Leong; Tan, Wee Kiat

    2012-01-01

    This paper describes how a generic computer laboratory equipped with 52 workstations is set up for teaching IT-related courses and other general purpose usage. The authors have successfully constructed a lab management system based on decentralised, client-side software virtualisation technology using Linux and free software tools from VMware that…

  15. Reassigning the Structures of Natural Products Using NMR Chemical Shifts Computed with Quantum Mechanics: A Laboratory Exercise

    ERIC Educational Resources Information Center

    Palazzo, Teresa A.; Truong, Tiana T.; Wong, Shirley M. T.; Mack, Emma T.; Lodewyk, Michael W.; Harrison, Jason G.; Gamage, R. Alan; Siegel, Justin B.; Kurth, Mark J.; Tantillo, Dean J.

    2015-01-01

    An applied computational chemistry laboratory exercise is described in which students use modern quantum chemical calculations of chemical shifts to assign the structure of a recently isolated natural product. A pre/post assessment was used to measure student learning gains and verify that students demonstrated proficiency of key learning…
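
    A sketch of the comparison step in such an exercise: computed isotropic shieldings are scaled to chemical shifts by linear regression against the experimental values, and candidate structures are ranked by mean absolute error. All numbers below are invented for illustration, not data from the exercise.

        import numpy as np

        def mae_after_scaling(shieldings, experimental):
            """Fit delta = slope*sigma + intercept, return mean |error| in ppm."""
            slope, intercept = np.polyfit(shieldings, experimental, 1)
            predicted = slope * np.array(shieldings) + intercept
            return np.mean(np.abs(predicted - np.array(experimental)))

        exp_shifts = [172.1, 128.4, 77.0, 41.2, 22.5]       # ppm, 13C (made up)
        candidate_a = [18.0, 62.5, 113.9, 149.7, 168.2]     # computed shieldings
        candidate_b = [15.1, 70.3, 111.2, 140.0, 170.5]
        for name, sig in [("A", candidate_a), ("B", candidate_b)]:
            print(name, f"MAE = {mae_after_scaling(sig, exp_shifts):.2f} ppm")
        # The candidate with the lower MAE is the better structural assignment.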

  17. TRAC-PF1: an advanced best-estimate computer program for pressurized water reactor analysis

    SciTech Connect

    Liles, D.R.; Mahaffy, J.H.

    1984-02-01

    The Transient Reactor Analysis Code (TRAC) is being developed at the Los Alamos National Laboratory to provide advanced best-estimate predictions of postulated accidents in light water reactors. The TRAC-PF1 program provides this capability for pressurized water reactors and for many thermal-hydraulic experimental facilities. The code features either a one-dimensional or a three-dimensional treatment of the pressure vessel and its associated internals; a two-phase, two-fluid nonequilibrium hydrodynamics model with a noncondensable gas field; flow-regime-dependent constitutive equation treatment; optional reflood tracking capability for both bottom flood and falling-film quench fronts; and consistent treatment of entire accident sequences including the generation of consistent initial conditions. This report describes the thermal-hydraulic models and the numerical solution methods used in the code. Detailed programming and user information also are provided.

  18. Utilization of advanced metal-ceramic technology: clinical and laboratory procedures for a lower-fusing porcelain.

    PubMed

    McLaren, E A

    1998-09-01

    Metal-ceramic restorations remain the most widely accepted type of indirect restorative modality, and have been applied successfully for years. Recent advances in material science have resulted in the development of a new class of metal-ceramic materials that have been termed lower-fusing ceramics. Following proper procedures for preparation and metal framework design, these metal-ceramic porcelains achieve the aesthetics normally demonstrated by conventional all-ceramic restorations. This article provides an overview of the clinical and laboratory processes utilizing these materials and is illustrated by two case presentations.

  19. Recent advances in computational methodology for simulation of mechanical circulatory assist devices

    PubMed Central

    Marsden, Alison L.; Bazilevs, Yuri; Long, Christopher C.; Behr, Marek

    2014-01-01

    Ventricular assist devices (VADs) provide mechanical circulatory support to offload the work of one or both ventricles during heart failure. They are used in the clinical setting as destination therapy, as bridge to transplant, or more recently as bridge to recovery to allow for myocardial remodeling. Recent developments in computational simulation allow for detailed assessment of VAD hemodynamics for device design and optimization for both children and adults. Here, we provide a focused review of the recent literature on finite element methods and optimization for VAD simulations. As VAD designs typically fall into two categories, pulsatile and continuous flow devices, we separately address computational challenges of both types of designs, and the interaction with the circulatory system with three representative case studies. In particular, we focus on recent advancements in finite element methodology that has increased the fidelity of VAD simulations. We outline key challenges, which extend to the incorporation of biological response such as thrombosis and hemolysis, as well as shape optimization methods and challenges in computational methodology. PMID:24449607

  20. E-Learning in Engineering Education: Design of a Collaborative Advanced Remote Access Laboratory

    ERIC Educational Resources Information Center

    Chandra A. P., Jagadeesh; Samuel, R. D. Sudhaker

    2010-01-01

    Attaining excellence in technical education is a worthy challenge to any life goal. Distance learning opportunities make these goals easier to reach with added quality. Distance learning in engineering education is possible only through successful implementations of remote laboratories in a learning-by-doing environment. This paper presents one…