Sample records for laboratory computing center

  1. Laboratory Computing Resource Center

    Science.gov Websites

    Portal page for Argonne's Laboratory Computing Resource Center (LCRC), covering systems, computing and data resources, getting started, software, best practices and policies, and user support, with announcements (latest: April 27, 2018, John Low).

  2. Comparison of nonmesonic hypernuclear decay rates computed in laboratory and center-of-mass coordinates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    De Conti, C.; Barbero, C.; Galeão, A. P.

    In this work we compute the one-nucleon-induced nonmesonic hypernuclear decay rates of the hypernuclei ⁵ΛHe, ¹²ΛC, and ¹³ΛC using a formalism based on the independent particle shell model in terms of laboratory coordinates. To ascertain the correctness and precision of the method, these results are compared with those obtained using a formalism in terms of center-of-mass coordinates, which has been previously reported in the literature. The formalism in terms of laboratory coordinates will be useful in the shell-model approach to two-nucleon-induced transitions.

  3. Center for Computing Research Summer Research Proceedings 2015.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bradley, Andrew Michael; Parks, Michael L.

    2015-12-18

    The Center for Computing Research (CCR) at Sandia National Laboratories organizes a summer student program each year, in coordination with the Computer Science Research Institute (CSRI) and Cyber Engineering Research Institute (CERI).

  4. Join the Center for Applied Scientific Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gamblin, Todd; Bremer, Timo; Van Essen, Brian

    The Center for Applied Scientific Computing serves as Livermore Lab’s window to the broader computer science, computational physics, applied mathematics, and data science research communities. In collaboration with academic, industrial, and other government laboratory partners, we conduct world-class scientific research and development on problems critical to national security. CASC applies the power of high-performance computing and the efficiency of modern computational methods to the realms of stockpile stewardship, cyber and energy security, and knowledge discovery for intelligence applications.

  5. The NASA Lewis Research Center High Temperature Fatigue and Structures Laboratory

    NASA Technical Reports Server (NTRS)

    Mcgaw, M. A.; Bartolotta, P. A.

    1987-01-01

    The physical organization of the NASA Lewis Research Center High Temperature Fatigue and Structures Laboratory is described. Particular attention is given to uniaxial test systems, high cycle/low cycle testing systems, axial torsional test systems, computer system capabilities, and a laboratory addition. The proposed addition will double the floor area of the present laboratory and will be equipped with its own control room.

  6. Report on Computing and Networking in the Space Science Laboratory by the SSL Computer Committee

    NASA Technical Reports Server (NTRS)

    Gallagher, D. L. (Editor)

    1993-01-01

    The Space Science Laboratory (SSL) at Marshall Space Flight Center is a multiprogram facility. Scientific research is conducted in four discipline areas: earth science and applications, solar-terrestrial physics, astrophysics, and microgravity science and applications. Representatives from each of these discipline areas participate in a Laboratory computer requirements committee, which developed this document. Its purpose is to establish and discuss Laboratory objectives for computing and networking in support of science, and to lay the foundation for a collective, multiprogram approach to providing these services. Special recognition is given to the importance of the national and international efforts of our research communities toward the development of interoperable, network-based computer applications.

  7. The role of dedicated data computing centers in the age of cloud computing

    NASA Astrophysics Data System (ADS)

    Caramarcu, Costin; Hollowell, Christopher; Strecker-Kellogg, William; Wong, Antonio; Zaytsev, Alexandr

    2017-10-01

    Brookhaven National Laboratory (BNL) anticipates significant growth in scientific programs with large computing and data storage needs in the near future and has recently reorganized support for scientific computing to meet these needs. A key component is the enhanced role of the RHIC-ATLAS Computing Facility (RACF) in support of high-throughput and high-performance computing (HTC and HPC) at BNL. This presentation discusses the evolving role of the RACF at BNL, in light of its growing portfolio of responsibilities and its increasing integration with cloud (academic and for-profit) computing activities. We also discuss BNL’s plan to build a new computing center to support the new responsibilities of the RACF and present a summary of the cost benefit analysis done, including the types of computing activities that benefit most from a local data center vs. cloud computing. This analysis is partly based on an updated cost comparison of Amazon EC2 computing services and the RACF, which was originally conducted in 2012.

  8. Center for Computational Structures Technology

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Perry, Ferman W.

    1995-01-01

    The Center for Computational Structures Technology (CST) is intended to serve as a focal point for the diverse CST research activities. The CST activities include the use of numerical simulation and artificial intelligence methods in modeling, analysis, sensitivity studies, and optimization of flight-vehicle structures. The Center is located at NASA Langley and is an integral part of the School of Engineering and Applied Science of the University of Virginia. The key elements of the Center are: (1) conducting innovative research on advanced topics of CST; (2) acting as a pathfinder by demonstrating to the research community what can be done (high-potential, high-risk research); (3) collaborating closely with NASA scientists and researchers from universities and other government laboratories; and (4) rapidly disseminating CST to industry through integration of industrial personnel into the ongoing research efforts.

  9. Reinventing patient-centered computing for the twenty-first century.

    PubMed

    Goldberg, H S; Morales, A; Gottlieb, L; Meador, L; Safran, C

    2001-01-01

    Despite evidence over the past decade that patients like and will use patient-centered computing systems in managing their health, patients have remained forgotten stakeholders in advances in clinical computing systems. We present a framework for patient empowerment and the technical realization of that framework in an architecture called CareLink. In an evaluation of the initial deployment of CareLink in the support of neonatal intensive care, we have demonstrated a reduction in the length of stay for very-low birthweight infants, and an improvement in family satisfaction with care delivery. With the ubiquitous adoption of the Internet into the general culture, patient-centered computing provides the opportunity to mend broken health care relationships and reconnect patients to the care delivery process. CareLink itself provides functionality to support both clinical care and research, and provides a living laboratory for the further study of patient-centered computing.

  10. Laboratory Sequence in Computational Methods for Introductory Chemistry

    NASA Astrophysics Data System (ADS)

    Cody, Jason A.; Wiser, Dawn C.

    2003-07-01

    A four-exercise laboratory sequence for introductory chemistry integrating hands-on, student-centered experience with computer modeling has been designed and implemented. The progression builds from exploration of molecular shapes to intermolecular forces and the impact of those forces on chemical separations made with gas chromatography and distillation. The sequence ends with an exploration of molecular orbitals. The students use the computers as a tool; they build the molecules, submit the calculations, and interpret the results. Because of the construction of the sequence and its placement spanning the semester break, good laboratory notebook practices are reinforced and the continuity of course content and methods between semesters is emphasized. The inclusion of these techniques in the first year of chemistry has had a positive impact on student perceptions and student learning.

  11. Accuracy of a laboratory-based computer implant guiding system.

    PubMed

    Barnea, Eitan; Alt, Ido; Kolerman, Roni; Nissan, Joseph

    2010-05-01

    Computer-guided implant placement is a growing treatment modality in partially and totally edentulous patients, though data about the accuracy of some systems for computer-guided surgery are limited. The purpose of this study was to evaluate the accuracy of a laboratory computer-guided system. A laboratory-based computer guiding system (M Guide; MIS Technologies, Shlomi, Israel) was used to place implants in a fresh sheep mandible. A second computerized tomography (CT) scan was taken after placing the implants. The drill plan figures of the planned implants were positioned using assigned software (Med3D, Heidelberg, Germany) on the second CT scan to compare the implant positions with the initial planning. Values representing the implant locations of the original drill plan were compared with those of the placed implants using SPSS software. Six measurements (3 vertical, 3 horizontal) were made on each implant to assess the deviation from the initial implant planning. A repeated-measures analysis of variance was performed comparing the location of measurement (center, abutment, apex) and type of deviation (vertical vs. horizontal). The vertical deviation (mean -0.168) was significantly smaller than the horizontal deviation (mean 1.148). The laboratory computer-based guiding system may be a viable treatment concept for placing implants. Copyright (c) 2010 Mosby, Inc. All rights reserved.
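
    A minimal sketch of how such a deviation comparison could be organized, assuming hypothetical measurements (three locations by two deviation types per implant) and Python's statsmodels AnovaRM routine in place of the SPSS analysis used in the study:

        # Hypothetical sketch of a repeated-measures comparison of implant deviations,
        # loosely following the design described above (center/abutment/apex locations,
        # vertical/horizontal deviation types). Values are illustrative, not study data.
        import pandas as pd
        from statsmodels.stats.anova import AnovaRM

        example = {
            ("center", "vertical"):     [0.1, -0.2,  0.0, -0.3, -0.1],
            ("center", "horizontal"):   [1.0,  1.2,  0.9,  1.3,  1.1],
            ("abutment", "vertical"):   [-0.2, -0.1, -0.3,  0.0, -0.2],
            ("abutment", "horizontal"): [1.1,  1.0,  1.4,  1.2,  1.0],
            ("apex", "vertical"):       [-0.3, -0.2, -0.1, -0.2,  0.0],
            ("apex", "horizontal"):     [1.3,  1.1,  1.2,  1.4,  1.2],
        }
        rows = [{"implant": i, "location": loc, "direction": d, "deviation": dev}
                for (loc, d), values in example.items()
                for i, dev in enumerate(values)]
        df = pd.DataFrame(rows)

        # Two within-subject factors: measurement location and deviation type.
        result = AnovaRM(df, depvar="deviation", subject="implant",
                         within=["location", "direction"]).fit()
        print(result.anova_table)
        print(df.groupby("direction")["deviation"].mean())  # vertical vs. horizontal means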

  12. Redirecting Under-Utilised Computer Laboratories into Cluster Computing Facilities

    ERIC Educational Resources Information Center

    Atkinson, John S.; Spenneman, Dirk H. R.; Cornforth, David

    2005-01-01

    Purpose: To provide administrators at an Australian university with data on the feasibility of redirecting under-utilised computer laboratory facilities into a distributed high-performance computing facility. Design/methodology/approach: The individual log-in records for each computer located in the computer laboratories at the university were…

  13. Systems integration for the Kennedy Space Center (KSC) Robotics Applications Development Laboratory (RADL)

    NASA Technical Reports Server (NTRS)

    Davis, V. Leon; Nordeen, Ross

    1988-01-01

    A laboratory for developing robotics technology for hazardous and repetitive Shuttle and payload processing activities is discussed. An overview of the computer hardware and software responsible for integrating the laboratory systems is given. The center's anthropomorphic robot is placed on a track allowing it to be moved to different stations. Various aspects of the laboratory equipment are described, including industrial robot arm control, smart systems integration, the supervisory computer, programmable process controller, real-time tracking controller, image processing hardware, and control display graphics. Topics of research include: automated loading and unloading of hypergolics for space vehicles and payloads; the use of mobile robotics for security, fire fighting, and hazardous spill operations; nondestructive testing for SRB joint and seal verification; Shuttle Orbiter radiator damage inspection; and Orbiter contour measurements. The possibility of expanding the laboratory in the future is examined.

  14. Automating the Analytical Laboratories Section, Lewis Research Center, National Aeronautics and Space Administration: A feasibility study

    NASA Technical Reports Server (NTRS)

    Boyle, W. G.; Barton, G. W.

    1979-01-01

    The feasibility of computerized automation of the Analytical Laboratories Section at NASA's Lewis Research Center was considered. Since that laboratory's duties are not routine, the automation goals were set with that in mind. Four instruments were selected as the most likely automation candidates: an atomic absorption spectrophotometer, an emission spectrometer, an X-ray fluorescence spectrometer, and an X-ray diffraction unit. Two options for computer automation were described: a time-shared central computer and a system with microcomputers for each instrument connected to a central computer. A third option, presented for future planning, expands the microcomputer version. Costs and benefits for each option were considered. It was concluded that the microcomputer version best fits the goals and duties of the laboratory and that such an automated system is needed to meet the laboratory's future requirements.

  15. Scientific Computing Strategic Plan for the Idaho National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whiting, Eric Todd

    Scientific computing is a critical foundation of modern science. Without innovations in the field of computational science, the essential missions of the Department of Energy (DOE) would go unrealized. Taking a leadership role in such innovations is Idaho National Laboratory's (INL's) challenge and charge, and is central to INL's ongoing success. Computing is an essential part of INL's future. DOE science and technology missions rely firmly on computing capabilities in various forms. Modeling and simulation, fueled by innovations in computational science and validated through experiment, are a critical foundation of science and engineering. Big data analytics from an increasing number of widely varied sources is opening new windows of insight and discovery. Computing is a critical tool in education, science, engineering, and experiments. Advanced computing capabilities in the form of people, tools, computers, and facilities will position INL competitively to deliver results and solutions on important national science and engineering challenges. A computing strategy must include much more than simply computers. The foundational enabling component of computing at many DOE national laboratories is the combination of a showcase-like data center facility coupled with a very capable supercomputer. In addition, network connectivity, disk storage systems, and visualization hardware are critical and generally tightly coupled to the computer system and co-located in the same facility. The existence of these resources in a single data center facility opens the doors to many opportunities that would not otherwise be possible.

  16. Removing the center from computing: biology's new mode of digital knowledge production.

    PubMed

    November, Joseph

    2011-06-01

    This article shows how the USA's National Institutes of Health (NIH) helped to bring about a major shift in the way computers are used to produce knowledge and in the design of computers themselves as a consequence of its early 1960s efforts to introduce information technology to biologists. Starting in 1960 the NIH sought to reform the life sciences by encouraging researchers to make use of digital electronic computers, but despite generous federal support biologists generally did not embrace the new technology. Initially the blame fell on biologists' lack of appropriate (i.e. digital) data for computers to process. However, when the NIH consulted MIT computer architect Wesley Clark about this problem, he argued that the computer's centralized character posed an even greater challenge to potential biologist users than did its need for digital data. Clark convinced the NIH that if the agency hoped to effectively computerize biology, it would need to satisfy biologists' experimental and institutional needs by providing them the means to use a computer without going to a computing center. With NIH support, Clark developed the 1963 Laboratory Instrument Computer (LINC), a small, real-time interactive computer intended to be used inside the laboratory and controlled entirely by its biologist users. Once built, the LINC provided a viable alternative to the 1960s norm of large computers housed in computing centers. As such, the LINC not only became popular among biologists, but also served in later decades as an important precursor of today's computing norm in the sciences and far beyond, the personal computer.

  17. Transportation Research and Analysis Computing Center (TRACC) Year 6 Quarter 4 Progress Report

    DOT National Transportation Integrated Search

    2013-03-01

    Argonne National Laboratory initiated a FY2006-FY2009 multi-year program with the US Department of Transportation (USDOT) on October 1, 2006, to establish the Transportation Research and Analysis Computing Center (TRACC). As part of the TRACC project...

  18. Using the Computer as a Laboratory Instrument.

    ERIC Educational Resources Information Center

    Collings, Peter J.; Greenslade, Thomas B., Jr.

    1989-01-01

    Reports experiences during a two-year period in introducing the computer to the laboratory and students to the computer as a laboratory instrument. Describes a working philosophy, data acquisition system, and experiments. Summarizes the laboratory procedures of nine experiments, covering mechanics, heat, electromagnetism, and optics. (YP)

  19. Computer laboratory in medical education for medical students.

    PubMed

    Hercigonja-Szekeres, Mira; Marinović, Darko; Kern, Josipa

    2009-01-01

    Five generations of second year students at the Zagreb University School of Medicine were interviewed through an anonymous questionnaire on their use of personal computers, Internet, computer laboratories and computer-assisted education in general. Results show an advance in students' usage of information and communication technology during the period from 1998/99 to 2002/03. However, their positive opinion about computer laboratory depends on installed capacities: the better the computer laboratory technology, the better the students' acceptance and use of it.

  20. Research Laboratories and Centers Fact Sheet

    EPA Pesticide Factsheets

    The Office of Research and Development is the research arm of the U.S. Environmental Protection Agency. It has three national laboratories and four national centers located in 14 facilities across the country.

  1. United States Air Force Summer Research Program -- 1993. Volume 6. Arnold Engineering Development Center, Frank J. Seiler Research Laboratory, Wilford Hall Medical Center

    DTIC Science & Technology

    1993-12-01


  2. The Workstation Approach to Laboratory Computing

    PubMed Central

    Crosby, P.A.; Malachowski, G.C.; Hall, B.R.; Stevens, V.; Gunn, B.J.; Hudson, S.; Schlosser, D.

    1985-01-01

    There is a need for a Laboratory Workstation which specifically addresses the problems associated with computing in the scientific laboratory. A workstation based on the IBM PC architecture and including a front end data acquisition system which communicates with a host computer via a high speed communications link; a new graphics display controller with hardware window management and window scrolling; and an integrated software package is described.

  3. Student teaching and research laboratory focusing on brain-computer interface paradigms--A creative environment for computer science students.

    PubMed

    Rutkowski, Tomasz M

    2015-08-01

    This paper presents an applied concept of a brain-computer interface (BCI) student research laboratory (BCI-LAB) at the Life Science Center of TARA, University of Tsukuba, Japan. Several successful case studies of the student projects are reviewed, together with the BCI Research Award 2014 winner case. The BCI-LAB design and project-based teaching philosophy is also explained. The review concludes with future teaching and research directions.

  4. Center for Advanced Computational Technology

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    2000-01-01

    The Center for Advanced Computational Technology (ACT) was established to serve as a focal point for diverse research activities pertaining to application of advanced computational technology to future aerospace systems. These activities include the use of numerical simulations, artificial intelligence methods, multimedia and synthetic environments, and computational intelligence, in the modeling, analysis, sensitivity studies, optimization, design and operation of future aerospace systems. The Center is located at NASA Langley and is an integral part of the School of Engineering and Applied Science of the University of Virginia. The Center has four specific objectives: 1) conduct innovative research on applications of advanced computational technology to aerospace systems; 2) act as pathfinder by demonstrating to the research community what can be done (high-potential, high-risk research); 3) help in identifying future directions of research in support of the aeronautical and space missions of the twenty-first century; and 4) help in the rapid transfer of research results to industry and in broadening awareness among researchers and engineers of the state-of-the-art in applications of advanced computational technology to the analysis, design prototyping and operations of aerospace and other high-performance engineering systems. In addition to research, Center activities include helping in the planning and coordination of the activities of a multi-center team of NASA and JPL researchers who are developing an intelligent synthesis environment for future aerospace systems; organizing workshops and national symposia; as well as writing state-of-the-art monographs and NASA special publications on timely topics.

  5. High-Performance Computing Data Center | Energy Systems Integration

    Science.gov Websites

    The Energy Systems Integration Facility's High-Performance Computing Data Center is home to Peregrine, the largest high-performance computing system in the world exclusively dedicated to advancing…

  6. Temporary Laboratory Office in Huntsville Industrial Center Building

    NASA Technical Reports Server (NTRS)

    1964-01-01

    As Marshall Space Flight Center (MSFC) grew, it occupied temporary quarters in the Huntsville Industrial Center (HIC) building in downtown Huntsville, Alabama. This image shows drafting specialists from the Propulsion and Vehicle Engineering Laboratory at work in the HIC building.

  7. Sandia National Laboratories: Cooperative Monitoring Center

    Science.gov Websites


  8. User-Centered Computer Aided Language Learning

    ERIC Educational Resources Information Center

    Zaphiris, Panayiotis, Ed.; Zacharia, Giorgos, Ed.

    2006-01-01

    In the field of computer aided language learning (CALL), there is a need for emphasizing the importance of the user. "User-Centered Computer Aided Language Learning" presents methodologies, strategies, and design approaches for building interfaces for a user-centered CALL environment, creating a deeper understanding of the opportunities and…

  9. Marin Computer Center.

    ERIC Educational Resources Information Center

    Fox, Annie

    1978-01-01

    Relates some experiences at this nonprofit center, which was designed so that interested members of the general public can walk in and learn about computers in a safe, nonintimidating environment. STARWARS HODGE, a game written in PILOT, is also described. (CMV)

  10. Knowledge management: Role of the Radiation Safety Information Computational Center (RSICC)

    NASA Astrophysics Data System (ADS)

    Valentine, Timothy

    2017-09-01

    The Radiation Safety Information Computational Center (RSICC) at Oak Ridge National Laboratory (ORNL) is an information analysis center that collects, archives, evaluates, synthesizes and distributes information, data and codes that are used in various nuclear technology applications. RSICC retains more than 2,000 software packages that have been provided by code developers from various federal and international agencies. RSICC's customers (scientists, engineers, and students from around the world) obtain access to such computing codes (source and/or executable versions) and processed nuclear data files to promote on-going research, to ensure nuclear and radiological safety, and to advance nuclear technology. The role of such information analysis centers is critical for supporting and sustaining nuclear education and training programs both domestically and internationally, as the majority of RSICC's customers are students attending U.S. universities. Additionally, RSICC operates a secure CLOUD computing system to provide access to sensitive export-controlled modeling and simulation (M&S) tools that support both domestic and international activities. This presentation will provide a general review of RSICC's activities, services, and systems that support knowledge management and education and training in the nuclear field.

  11. Biomedical Computing Technology Information Center: introduction and report of early progress

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maskewitz, B.F.; Henne, R.L.; McClain, W.J.

    1976-01-01

    In July 1975, the Biomedical Computing Technology Information Center (BCTIC) was established by the Division of Biomedical and Environmental Research of the U. S. Energy Research and Development Administration (ERDA) at the Oak Ridge National Laboratory. BCTIC collects, organizes, evaluates, and disseminates information on computing technology pertinent to biomedicine, providing needed routes of communication between installations and serving as a clearinghouse for the exchange of biomedical computing software, data, and interface designs. This paper presents BCTIC's functions and early progress to the MUMPS Users' Group in order to stimulate further discussion and cooperation between the two organizations. (BCTIC services are available to its sponsors and their contractors and to any individual/group willing to participate in mutual exchange.)

  12. Oklahoma's Mobile Computer Graphics Laboratory.

    ERIC Educational Resources Information Center

    McClain, Gerald R.

    This Computer Graphics Laboratory houses an IBM 1130 computer, U.C.C. plotter, printer, card reader, two key punch machines, and seminar-type classroom furniture. A "General Drafting Graphics System" (GDGS) is used, based on repetitive use of basic coordinate and plot generating commands. The system is used by 12 institutions of higher education…

  13. UC Merced Center for Computational Biology Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Colvin, Michael; Watanabe, Masakatsu

    …programs made possible by the CCB from its inception until August 2010, at the end of the final extension. Although DOE support for the center ended in August 2010, the CCB will continue to exist and support its original objectives. The research and academic programs fostered by the CCB have led to additional extramural funding from other agencies, and we anticipate that the CCB will continue to support the quantitative and computational biology program at UC Merced for many years to come. Since its inception in fall 2004, CCB research projects have maintained multi-institutional collaborations with Lawrence Livermore National Laboratory (LLNL) and the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign, as well as with individual collaborators at other sites. CCB-affiliated faculty cover a broad range of computational and mathematical research, including molecular modeling, cell biology, applied math, evolutionary biology, and bioinformatics. The CCB sponsored the first distinguished speaker series at UC Merced, which played an important role in spreading the word about the computational biology emphasis at this new campus. One of the CCB's original goals was to help train a new generation of biologists who bridge the gap between the computational and life sciences. To achieve this goal, a summer undergraduate internship program was established under the CCB by summer 2006 to train researchers in highly mathematical and computationally intensive biological science. By the end of summer 2010, 44 undergraduate students had gone through this program. Of those participants, 11 students have been admitted to graduate schools and 10 more are interested in pursuing graduate studies in the sciences. The center is also continuing to facilitate the development and dissemination of undergraduate and graduate course materials based on the latest research in computational biology.

  14. The Operation of a Specialized Scientific Information and Data Analysis Center With Computer Base and Associated Communications Network.

    ERIC Educational Resources Information Center

    Cottrell, William B.; And Others

    The Nuclear Safety Information Center (NSIC) is a highly sophisticated scientific information center operated at Oak Ridge National Laboratory (ORNL) for the U.S. Atomic Energy Commission. Its information file, which consists of both data and bibliographic information, is computer stored and numerous programs have been developed to facilitate the…

  15. Mathematics and Computer Science | Argonne National Laboratory

    Science.gov Websites

    Related centers and programs: Genomics and Systems Biology; LCRC (Laboratory Computing Resource Center); MCSG (Midwest Center for Structural Genomics); NAISE (Northwestern-Argonne Institute of Science & Engineering); SBC (Structural Biology Center).

  16. Putting the Laboratory at the Center of Teaching Chemistry

    ERIC Educational Resources Information Center

    Bopegedera, A. M. R. P.

    2011-01-01

    This article describes an effective approach to teaching chemistry by bringing the laboratory to the center of teaching, to bring the excitement of discovery to the learning process. The lectures and laboratories are closely integrated to provide a holistic learning experience. The laboratories progress from verification to open-inquiry and…

  17. The Astromaterials X-Ray Computed Tomography Laboratory at Johnson Space Center

    NASA Technical Reports Server (NTRS)

    Zeigler, R. A.; Coleff, D. M.; McCubbin, F. M.

    2017-01-01

    The Astromaterials Acquisition and Curation Office at NASA's Johnson Space Center (hereafter JSC curation) is the past, present, and future home of all of NASA's astromaterials sample collections. JSC curation currently houses all or part of nine different sample collections: (1) Apollo samples (1969), (2) Luna samples (1972), (3) Antarctic meteorites (1976), (4) Cosmic Dust particles (1981), (5) Microparticle Impact Collection (1985), (6) Genesis solar wind atoms (2004), (7) Stardust comet Wild-2 particles (2006), (8) Stardust interstellar particles (2006), and (9) Hayabusa asteroid Itokawa particles (2010). Each sample collection is housed in a dedicated clean room, or suite of clean rooms, that is tailored to the requirements of that sample collection. Our primary goals are to maintain the long-term integrity of the samples and ensure that the samples are distributed for scientific study in a fair, timely, and responsible manner, thus maximizing the return on each sample. Part of the curation process is planning for the future, and we also perform fundamental research in advanced curation initiatives. Advanced Curation is tasked with developing procedures, technology, and data sets necessary for curating new types of sample collections, or getting new results from existing sample collections [2]. We are (and have been) planning for future curation, including cold curation, extended curation of ices and volatiles, curation of samples with special chemical considerations such as perchlorate-rich samples, and curation of organically- and biologically-sensitive samples. As part of these advanced curation efforts we are augmenting our analytical facilities as well. A micro X-ray computed tomography (micro-XCT) laboratory dedicated to the study of astromaterials will be coming online this spring within the JSC Curation office, and we plan to add additional facilities that will enable nondestructive (or minimally destructive) analyses of astromaterials in the near future.

  18. Digital optical computers at the optoelectronic computing systems center

    NASA Technical Reports Server (NTRS)

    Jordan, Harry F.

    1991-01-01

    The Digital Optical Computing Program within the National Science Foundation Engineering Research Center for Opto-electronic Computing Systems has as its specific goal research on optical computing architectures suitable for use at the highest possible speeds. The program can be targeted toward exploiting the time domain because other programs in the Center are pursuing research on parallel optical systems, exploiting optical interconnection and optical devices and materials. Using a general purpose computing architecture as the focus, we are developing design techniques, tools and architecture for operation at the speed of light limit. Experimental work is being done with the somewhat low speed components currently available but with architectures which will scale up in speed as faster devices are developed. The design algorithms and tools developed for a general purpose, stored program computer are being applied to other systems such as optimally controlled optical communication networks.

  19. Computer-Assisted Laboratory Stations.

    ERIC Educational Resources Information Center

    Snyder, William J., Hanyak, Michael E.

    1985-01-01

    Describes the advantages and features of computer-assisted laboratory stations for use in a chemical engineering program. Also describes a typical experiment at such a station: determining the response times of a solid state humidity sensor at various humidity conditions and developing an empirical model for the sensor. (JN)

  20. Teaching Cardiovascular Integrations with Computer Laboratories.

    ERIC Educational Resources Information Center

    Peterson, Nils S.; Campbell, Kenneth B.

    1985-01-01

    Describes a computer-based instructional unit in cardiovascular physiology. The program (which employs simulated laboratory experimental techniques with a problem-solving format) is designed to supplement an animal laboratory and to offer students an integrative approach to physiology through use of microcomputers. Also presents an overview of the…

  1. CAROLINA CENTER FOR COMPUTATIONAL TOXICOLOGY

    EPA Science Inventory

    The Center will advance the field of computational toxicology through the development of new methods and tools, as well as through collaborative efforts. In each Project, new computer-based models will be developed and published that represent the state-of-the-art. The tools p...

  2. Computer Maintenance Operations Center (CMOC), additional computer support equipment ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Computer Maintenance Operations Center (CMOC), additional computer support equipment - Beale Air Force Base, Perimeter Acquisition Vehicle Entry Phased-Array Warning System, Technical Equipment Building, End of Spencer Paul Road, north of Warren Shingle Road (14th Street), Marysville, Yuba County, CA

  3. Sandia National Laboratories: Microsystems Science & Technology Center

    Science.gov Websites


  4. High-Performance Computing Data Center Warm-Water Liquid Cooling |

    Science.gov Websites

    NREL's High-Performance Computing Data Center (HPC Data Center) is cooled with warm water. Liquid cooling technologies offer a more energy-efficient solution that also allows for effective…

  5. Assessment of Tutoring Laboratories in a Learning Assistance Center

    ERIC Educational Resources Information Center

    Fullmer, Patricia

    2012-01-01

    The Learning Resource Center at Lincoln University, Pennsylvania, provides tutoring laboratories that are required for developmental reading, writing, and math courses. This article reviews the processes used to plan and determine the effectiveness of the tutoring laboratories, including logic models, student learning outcomes, and the results of…

  6. Michigan/Air Force Research Laboratory (AFRL) Collaborative Center in Control Science (MACCCS)

    DTIC Science & Technology

    2016-09-01

    Report AFRL-RQ-WP-TR-2016-0139, Michigan/Air Force Research Laboratory (AFRL) Collaborative Center in Control Science (MACCCS); Anouck Girard; final report, 18 April 2007 – 30 September 2016. …and amplify an internationally recognized center of excellence in control science research and education, through interaction between the faculty and…

  7. Implementing Computer Based Laboratories

    NASA Astrophysics Data System (ADS)

    Peterson, David

    2001-11-01

    Physics students at Francis Marion University will complete several required laboratory exercises utilizing computer-based Vernier probes. The simple pendulum, the acceleration due to gravity, simple harmonic motion, radioactive half lives, and radiation inverse square law experiments will be incorporated into calculus-based and algebra-based physics courses. Assessment of student learning and faculty satisfaction will be carried out by surveys and test results. Cost effectiveness and time effectiveness assessments will be presented. Majors in Computational Physics, Health Physics, Engineering, Chemistry, Mathematics and Biology take these courses, and assessments will be categorized by major. To enhance the computer skills of students enrolled in the courses, MAPLE will be used for further analysis of the data acquired during the experiments. Assessment of these enhancement exercises will also be presented.
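
    A minimal sketch of the analysis behind the pendulum exercise, using assumed example values for the length and photogate period readings; from T = 2*pi*sqrt(L/g), the acceleration due to gravity follows as g = 4*pi^2*L/T^2:

        # Minimal sketch (hypothetical data): estimating g from simple-pendulum timing,
        # the kind of reduction students might perform on photogate period readings.
        import math
        import statistics

        length_m = 0.75                                    # measured pendulum length (example)
        periods_s = [1.738, 1.741, 1.736, 1.740, 1.739]    # example period readings

        T = statistics.mean(periods_s)
        g = 4 * math.pi**2 * length_m / T**2               # from T = 2*pi*sqrt(L/g)
        print(f"mean period = {T:.4f} s, g = {g:.3f} m/s^2")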

  8. The Data Acquisition and Control Systems of the Jet Noise Laboratory at the NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Jansen, B. J., Jr.

    1998-01-01

    The features of the data acquisition and control systems of the NASA Langley Research Center's Jet Noise Laboratory are presented. The Jet Noise Laboratory is a facility that simulates realistic mixed flow turbofan jet engine nozzle exhaust systems in simulated flight. The system is capable of acquiring data for a complete take-off assessment of noise and nozzle performance. This paper describes the development of an integrated system to control and measure the behavior of model jet nozzles featuring dual independent high pressure combusting air streams with wind tunnel flow. The acquisition and control system is capable of simultaneous measurement of forces, moments, static and dynamic model pressures and temperatures, and jet noise. The design concepts for the coordination of the control computers and multiple data acquisition computers and instruments are discussed. The control system design and implementation are explained, describing the features, equipment, and the experiences of using a primarily Personal Computer based system. Areas for future development are examined.

  9. Conversion and improvement of the Rutherford Laboratory's magnetostatic computer code GFUN3D to the NMFECC CDC 7600

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tucker, T.C.

    1980-06-01

    The implementation of a version of the Rutherford Laboratory's magnetostatic computer code GFUN3D on the CDC 7600 at the National Magnetic Fusion Energy Computer Center is reported. A new iteration technique that greatly increases the probability of convergence and reduces computation time by about 30% for calculations with nonlinear, ferromagnetic materials is included. The use of GFUN3D on the NMFE network is discussed, and suggestions for future work are presented. Appendix A consists of revisions to the GFUN3D User Guide (published by Rutherford Laboratory) that are necessary to use this version. Appendix B contains input and output for some sample calculations. Appendix C is a detailed discussion of the old and new iteration techniques.

  10. A Software Laboratory Environment for Computer-Based Problem Solving.

    ERIC Educational Resources Information Center

    Kurtz, Barry L.; O'Neal, Micheal B.

    This paper describes a National Science Foundation-sponsored project at Louisiana Technological University to develop computer-based laboratories for "hands-on" introductions to major topics of computer science. The underlying strategy is to develop structured laboratory environments that present abstract concepts through the use of…

  11. Computer Center Harris 1600 Operator’s Guide.

    DTIC Science & Technology

    1982-06-01

    Operator's guide for the Harris 1600 computer operated by the Computer Center of the David W. Taylor Naval Ship Research and Development Center (report CMLD-82-15, by David Sommer and Sharon E. Good). Approved for public release; distribution unlimited.

  12. The Naval Health Research Center Respiratory Disease Laboratory.

    PubMed

    Ryan, M; Gray, G; Hawksworth, A; Malasig, M; Hudspeth, M; Poddar, S

    2000-07-01

    Concern about emerging and reemerging respiratory pathogens prompted the development of a respiratory disease reference laboratory at the Naval Health Research Center. Professionals working in this laboratory have instituted population-based surveillance for pathogens that affect military trainees and responded to threats of increased respiratory disease among high-risk military groups. Capabilities of this laboratory that are unique within the Department of Defense include adenovirus testing by viral shell culture and microneutralization serotyping, influenza culture and hemagglutination inhibition serotyping, and other special testing for Streptococcus pneumoniae, Streptococcus pyogenes, Mycoplasma pneumoniae, and Chlamydia pneumoniae. Projected capabilities of this laboratory include more advanced testing for these pathogens and testing for other emerging pathogens, including Bordetella pertussis, Legionella pneumophila, and Haemophilus influenzae type B. Such capabilities make the laboratory a valuable resource for military public health.

  13. Determination of Absolute Zero Using a Computer-Based Laboratory

    ERIC Educational Resources Information Center

    Amrani, D.

    2007-01-01

    We present a simple computer-based laboratory experiment for evaluating absolute zero in degrees Celsius, which can be performed in college and undergraduate physical sciences laboratory courses. With a computer, absolute zero apparatus can help demonstrators or students to observe the relationship between temperature and pressure and use…
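
    A worked sketch of the extrapolation behind this experiment, with hypothetical pressure-temperature readings for a constant-volume gas sample: fit a line to pressure versus Celsius temperature and solve for the temperature at which the fitted pressure reaches zero:

        # Hypothetical sketch: estimate absolute zero (in degrees Celsius) by linear
        # extrapolation of constant-volume gas pressure readings to zero pressure.
        import numpy as np

        temps_C = np.array([0.0, 20.0, 40.0, 60.0, 80.0, 100.0])             # example bath temperatures
        pressures_kPa = np.array([99.4, 106.7, 114.0, 121.3, 128.6, 135.9])  # example gauge readings

        slope, intercept = np.polyfit(temps_C, pressures_kPa, 1)
        absolute_zero_C = -intercept / slope    # temperature where the fitted pressure vanishes
        print(f"estimated absolute zero = {absolute_zero_C:.1f} deg C")      # ideally near -273.15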

  14. Air Flow Modeling in the Wind Tunnel of the FHWA Aerodynamics Laboratory at Turner-Fairbank Highway Research Center

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sitek, M. A.; Lottes, S. A.; Bojanowski, C.

    Computational fluid dynamics (CFD) modeling is widely used in industry for design and in the research community to support, complement, and extend the scope of experimental studies. Analysis of transportation infrastructure using high performance cluster computing with CFD and structural mechanics software is done at the Transportation Research and Analysis Computing Center (TRACC) at Argonne National Laboratory. These resources, available at TRACC, were used to perform advanced three-dimensional computational simulations of the wind tunnel laboratory at the Turner-Fairbank Highway Research Center (TFHRC). The goals were to verify the CFD model of the laboratory wind tunnel and then to use versions of the model to provide the capability to (1) perform larger parametric series of tests than can be easily done in the laboratory with available budget and time, (2) extend testing to wind speeds that cannot be achieved in the laboratory, and (3) run types of tests that are very difficult or impossible to run in the laboratory. Modern CFD software has many physics models and domain meshing options. Models, including the choice of turbulence and other physics models and settings, the computational mesh, and the solver settings, need to be validated against measurements to verify that the results are sufficiently accurate for use in engineering applications. The wind tunnel model was built and tested, by comparing to experimental measurements, to provide a valuable tool to perform these types of studies in the future as a complement and extension to TFHRC's experimental capabilities. Wind tunnel testing at TFHRC is conducted in a subsonic open-jet wind tunnel with a 1.83 m (6 foot) by 1.83 m (6 foot) cross section. A three component dual force-balance system is used to measure forces acting on tested models, and a three degree of freedom suspension system is used for dynamic response tests. Pictures of the room are shown in Figure 1-1 to Figure 1-4. …

  15. 77 FR 14805 - Clinical Laboratory Improvement Advisory Committee, Centers for Disease Control and Prevention...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-13

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES Centers for Disease Control and Prevention Clinical Laboratory Improvement Advisory Committee, Centers for Disease Control and Prevention: Notice of Charter..., that the Clinical Laboratory Improvement Advisory Committee, Centers for Disease Control and Prevention...

  16. Computer Maintenance Operations Center (CMOC), showing duplexed cyber 170174 computers ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Computer Maintenance Operations Center (CMOC), showing duplexed cyber 170-174 computers - Beale Air Force Base, Perimeter Acquisition Vehicle Entry Phased-Array Warning System, Technical Equipment Building, End of Spencer Paul Road, north of Warren Shingle Road (14th Street), Marysville, Yuba County, CA

  17. Computational Science News | Computational Science | NREL

    Science.gov Websites

    February 28, 2018: NREL Launches New Website for High-Performance Computing System Users. The National Renewable Energy Laboratory (NREL) Computational Science Center has launched a revamped website for users of the lab's high-performance computing (HPC) systems.

  18. NACA Computer at the Lewis Flight Propulsion Laboratory

    NASA Image and Video Library

    1951-02-21

    A female computer at the National Advisory Committee for Aeronautics (NACA) Lewis Flight Propulsion Laboratory with a slide rule and Friden adding machine to make computations. The computer staff was introduced during World War II to relieve short-handed research engineers of some of the tedious computational work. The Computing Section was staffed by “computers,” young female employees, who often worked overnight when most of the tests were run. The computers obtained test data from the manometers and other instruments, made the initial computations, and plotted the data graphically. Researchers then analyzed the data and summarized the findings in a report or made modifications and ran the test again. There were over 400 female employees at the laboratory in 1944, including 100 computers. The use of computers was originally planned only for the duration of the war. The system was so successful that it was extended into the 1960s. The computers and analysts were located in the Altitude Wind Tunnel Shop and Office Building office wing during the 1940s and transferred to the new 8- by 6-Foot Supersonic Wind Tunnel in 1948.

  19. Real-Time, Sensor-Based Computing in the Laboratory.

    ERIC Educational Resources Information Center

    Badmus, O. O.; And Others

    1996-01-01

    Demonstrates the importance of Real-Time, Sensor-Based (RTSB) computing and how it can be easily and effectively integrated into university student laboratories. Describes the experimental processes, the process instrumentation and process-computer interface, the computer and communications systems, and typical software. Provides much technical…

  20. VIEW TO EAST OF CRYSTALLIZATION LABORATORY (CENTER LEFT FOREGROUND), PAINT ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    VIEW TO EAST OF CRYSTALLIZATION LABORATORY (CENTER LEFT FOREGROUND), PAINT APPLICATION BUILDING (CENTER BACKGROUND), AND c1944-1950 POST-U.S. RADIUM ADDITIONS TO EACH BUILDING (RIGHT FOREGROUND AND BACKGROUND) - United States Radium Corporation, 422-432 Alden Street, Orange, Essex County, NJ

  1. Improved dissection efficiency in the human gross anatomy laboratory by the integration of computers and modern technology.

    PubMed

    Reeves, Rustin E; Aschenbrenner, John E; Wordinger, Robert J; Roque, Rouel S; Sheedlo, Harold J

    2004-05-01

    The need to increase the efficiency of dissection in the gross anatomy laboratory has been the driving force behind the technologic changes we have recently implemented. With the introduction of an integrated systems-based medical curriculum and a reduction in laboratory teaching hours, anatomy faculty at the University of North Texas Health Science Center (UNTHSC) developed a computer-based dissection manual to adjust to these curricular changes and time constraints. At each cadaver workstation, Apple iMac computers were added and a new dissection manual, running in a browser-based format, was installed. Within the text of the manual, anatomical structures required for dissection were linked to digital images from prosected materials; in addition, for each body system, the dissection manual included images from cross sections, radiographs, CT scans, and histology. Although we have placed a high priority on computerization of the anatomy laboratory, we remain strong advocates of the importance of cadaver dissection. It is our belief that the utilization of computers for dissection is a natural evolution of technology and fosters creative teaching strategies adapted for anatomy laboratories in the 21st century. Our strategy has significantly enhanced the independence and proficiency of our students, the efficiency of their dissection time, and the quality of laboratory instruction by the faculty. Copyright 2004 Wiley-Liss, Inc.

  2. Advanced Biomedical Computing Center (ABCC) | DSITP

    Cancer.gov

    The Advanced Biomedical Computing Center (ABCC), located in Frederick Maryland (MD), provides HPC resources for both NIH/NCI intramural scientists and the extramural biomedical research community. Its mission is to provide HPC support, to provide collaborative research, and to conduct in-house research in various areas of computational biology and biomedical research.

  3. Community Information Centers and the Computer.

    ERIC Educational Resources Information Center

    Carroll, John M.; Tague, Jean M.

    Two computer data bases have been developed by the Computer Science Department at the University of Western Ontario for "Information London," the local community information center. One system, called LONDON, permits Boolean searches of a file of 5,000 records describing human service agencies in the London area. The second system,…

  4. A computer-based physics laboratory apparatus: Signal generator software

    NASA Astrophysics Data System (ADS)

    Thanakittiviroon, Tharest; Liangrocapart, Sompong

    2005-09-01

    This paper describes a computer-based physics laboratory apparatus to replace expensive instruments such as high-precision signal generators. This apparatus uses a sound card in a common personal computer to give sinusoidal signals with an accurate frequency that can be programmed to give different frequency signals repeatedly. An experiment on standing waves on an oscillating string uses this apparatus. In conjunction with interactive lab manuals, which have been developed using personal computers in our university, we achieve a complete set of low-cost, accurate, and easy-to-use equipment for teaching a physics laboratory.
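
    A minimal sketch of the core idea, synthesizing a fixed-frequency sine wave and sending it to the sound card; the numpy and sounddevice packages and the play_tone helper are assumptions for illustration, not the authors' implementation:

        # Minimal sketch of a sound-card sine generator (illustrative, not the paper's code):
        # synthesize a tone of a programmed frequency and play it on the default audio output.
        import numpy as np
        import sounddevice as sd

        def play_tone(freq_hz=440.0, duration_s=2.0, samplerate=44100, amplitude=0.5):
            """Play a sine tone of the given frequency through the sound card."""
            t = np.arange(int(duration_s * samplerate)) / samplerate
            samples = amplitude * np.sin(2 * np.pi * freq_hz * t)
            sd.play(samples.astype(np.float32), samplerate)
            sd.wait()   # block until playback finishes

        # e.g. step through a series of drive frequencies for a standing-wave experiment
        for f in (60.0, 80.0, 100.0):
            play_tone(f, duration_s=3.0)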

  5. U.S. Ebola Treatment Center Clinical Laboratory Support.

    PubMed

    Jelden, Katelyn C; Iwen, Peter C; Herstein, Jocelyn J; Biddinger, Paul D; Kraft, Colleen S; Saiman, Lisa; Smith, Philip W; Hewlett, Angela L; Gibbs, Shawn G; Lowe, John J

    2016-04-01

    Fifty-five hospitals in the United States have been designated Ebola treatment centers (ETCs) by their state and local health authorities. Designated ETCs must have appropriate plans to manage a patient with confirmed Ebola virus disease (EVD) for the full duration of illness and must have these plans assessed through a CDC site visit conducted by an interdisciplinary team of subject matter experts. This study determined the clinical laboratory capabilities of these ETCs. ETCs were electronically surveyed on clinical laboratory characteristics. Survey responses were returned from 47 ETCs (85%). Forty-one (87%) of the ETCs planned to provide some laboratory support (e.g., point-of-care [POC] testing) within the room of the isolated patient. Forty-four (94%) ETCs indicated that their hospital would also provide clinical laboratory support for patient care. Twenty-two (50%) of these ETC clinical laboratories had biosafety level 3 (BSL-3) containment. Of all respondents, 34 (72%) were supported by their jurisdictional public health laboratory (PHL), all of which had available BSL-3 laboratories. Overall, 40 of 44 (91%) ETCs reported BSL-3 laboratory support via their clinical laboratory and/or PHL. This survey provided a snapshot of the laboratory support for designated U.S. ETCs. ETCs have approached high-level isolation critical care with laboratory support in close proximity to the patient room and by distributing laboratory support among laboratory resources. Experts might review safety considerations for these laboratory testing/diagnostic activities that are novel in the context of biocontainment care. Copyright © 2016, American Society for Microbiology. All Rights Reserved.

  6. U.S. Ebola Treatment Center Clinical Laboratory Support

    PubMed Central

    Jelden, Katelyn C.; Iwen, Peter C.; Herstein, Jocelyn J.; Biddinger, Paul D.; Kraft, Colleen S.; Saiman, Lisa; Smith, Philip W.; Hewlett, Angela L.; Gibbs, Shawn G.

    2016-01-01

    Fifty-five hospitals in the United States have been designated Ebola treatment centers (ETCs) by their state and local health authorities. Designated ETCs must have appropriate plans to manage a patient with confirmed Ebola virus disease (EVD) for the full duration of illness and must have these plans assessed through a CDC site visit conducted by an interdisciplinary team of subject matter experts. This study determined the clinical laboratory capabilities of these ETCs. ETCs were electronically surveyed on clinical laboratory characteristics. Survey responses were returned from 47 ETCs (85%). Forty-one (87%) of the ETCs planned to provide some laboratory support (e.g., point-of-care [POC] testing) within the room of the isolated patient. Forty-four (94%) ETCs indicated that their hospital would also provide clinical laboratory support for patient care. Twenty-two (50%) of these ETC clinical laboratories had biosafety level 3 (BSL-3) containment. Of all respondents, 34 (72%) were supported by their jurisdictional public health laboratory (PHL), all of which had available BSL-3 laboratories. Overall, 40 of 44 (91%) ETCs reported BSL-3 laboratory support via their clinical laboratory and/or PHL. This survey provided a snapshot of the laboratory support for designated U.S. ETCs. ETCs have approached high-level isolation critical care with laboratory support in close proximity to the patient room and by distributing laboratory support among laboratory resources. Experts might review safety considerations for these laboratory testing/diagnostic activities that are novel in the context of biocontainment care. PMID:26842705

  7. Computational structures technology and UVA Center for CST

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    1992-01-01

    Rapid advances in computer hardware have had a profound effect on various engineering and mechanics disciplines, including the materials, structures, and dynamics disciplines. A new technology, computational structures technology (CST), has recently emerged as an insightful blend between material modeling, structural and dynamic analysis and synthesis on the one hand, and other disciplines such as computer science, numerical analysis, and approximation theory, on the other hand. CST is an outgrowth of finite element methods developed over the last three decades. The focus of this presentation is on some aspects of CST which can impact future airframes and propulsion systems, as well as on the newly established University of Virginia (UVA) Center for CST. The background and goals for CST are described along with the motivations for developing CST, and a brief discussion is made on computational material modeling. We look at the future in terms of technical needs, computing environment, and research directions. The newly established UVA Center for CST is described. One of the research projects of the Center is described, and a brief summary of the presentation is given.

  8. Computers in the General Physics Laboratory.

    ERIC Educational Resources Information Center

    Preston, Daryl W.; Good, R. H.

    1996-01-01

    Provides ideas and outcomes for nine computer laboratory experiments using a commercial eight-bit analog to digital (ADC) interface. Experiments cover statistics; rotation; harmonic motion; voltage, current, and resistance; ADC conversions; temperature measurement; single slit diffraction; and radioactive decay. Includes necessary schematics. (MVL)

  9. Sandia National Laboratories: Advanced Simulation and Computing

    Science.gov Websites


  10. GeoBrain Computational Cyber-laboratory for Earth Science Studies

    NASA Astrophysics Data System (ADS)

    Deng, M.; di, L.

    2009-12-01

    Computational approaches (e.g., computer-based data visualization, analysis and modeling) are critical for conducting increasingly data-intensive Earth science (ES) studies to understand functions and changes of the Earth system. However, Earth scientists, educators, and students currently face two major barriers that prevent them from effectively using computational approaches in their learning, research, and application activities. The two barriers are: 1) difficulties in finding, obtaining, and using multi-source ES data; and 2) lack of analytic functions and computing resources (e.g., analysis software, computing models, and high performance computing systems) to analyze the data. Taking advantage of recent advances in cyberinfrastructure, Web service, and geospatial interoperability technologies, GeoBrain, a project funded by NASA, has developed a prototype computational cyber-laboratory to effectively remove the two barriers. The cyber-laboratory makes ES data and computational resources at large organizations in distributed locations available to and easily usable by the Earth science community through 1) enabling seamless discovery, access and retrieval of distributed data, 2) federating and enhancing data discovery with a catalogue federation service and a semantically-augmented catalogue service, 3) customizing data access and retrieval at user request with interoperable, personalized, and on-demand data access and services, 4) automating or semi-automating multi-source geospatial data integration, 5) developing a large number of analytic functions as value-added, interoperable, and dynamically chainable geospatial Web services and deploying them in high-performance computing facilities, 6) enabling online geospatial process modeling and execution, and 7) building a user-friendly extensible web portal for users to access the cyber-laboratory resources. Users can interactively discover the needed data and perform on-demand data analysis and
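
    The record above describes GeoBrain's interoperable geospatial Web services. As a rough illustration of how a client might talk to such a service, the sketch below builds a standard OGC WMS GetCapabilities request; the endpoint URL is a placeholder (the record does not give the portal address), and the snippet is not taken from the GeoBrain project itself.

        import urllib.parse

        # The GeoBrain portal address is not given in the record, so this endpoint is a
        # placeholder. The query itself follows the standard OGC WMS GetCapabilities
        # pattern used by interoperable geospatial Web services of the kind described.
        endpoint = "https://example.org/geobrain/wms"
        params = {
            "service": "WMS",
            "request": "GetCapabilities",
            "version": "1.3.0",
        }

        request_url = endpoint + "?" + urllib.parse.urlencode(params)
        print(request_url)
        # A client would fetch this URL and parse the returned XML capabilities
        # document to discover which layers (datasets) the service offers.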

  11. Pilot Project in Computer Assisted Instruction for Adult Basic Education Students. Adult Learning Centers, the Adult Program, 1982-83.

    ERIC Educational Resources Information Center

    Buckley, Elizabeth; Johnston, Peter

    In February 1977, computer assisted instruction (CAI) was introduced to the Great Neck Adult Learning Centers (GNALC) to promote greater cognitive and affective growth of educationally disadvantaged adults. The project expanded to include not only adult basic education (ABE) students studying in the learning laboratory, but also ABE students…

  12. Computer Software Management and Information Center

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Computer programs for passive anti-roll tank, earth resources laboratory applications, the NIMBUS-7 coastal zone color scanner derived products, transportable applications executive, plastic and failure analysis of composites, velocity gradient method for calculating velocities in an axisymmetric annular duct, an integrated procurement management system, data I/O PROM for the Motorola EXORciser, aerodynamic shock-layer shape, kinematic modeling, hardware library for a graphics computer, and a file archival system are documented.

  13. Cornell University Center for Advanced Computing

    Science.gov Websites


  14. A computer-managed undergraduate physics laboratory

    NASA Astrophysics Data System (ADS)

    Kalman, C. S.

    1987-01-01

    Seventeen one-semester undergraduate laboratory courses are managed by a microcomputer system at Concordia University. Students may perform experiments at any time during operating hours. The computer administers pre- and post-tests. Considerable savings in manpower costs are achieved. The system also provides many pedagogical advantages.

  15. Computer Based Simulation of Laboratory Experiments.

    ERIC Educational Resources Information Center

    Edward, Norrie S.

    1997-01-01

    Examines computer based simulations of practical laboratory experiments in engineering. Discusses the aims and achievements of lab work (cognitive, process, psychomotor, and affective); types of simulations (model building and behavioral); and the strengths and weaknesses of simulations. Describes the development of a centrifugal pump simulation,…

  16. Clinical Laboratories – Production Factories or Specialized Diagnostic Centers

    PubMed Central

    Tóth, Judit

    2016-01-01

    Since a large proportion of medical decisions are based on laboratory results, clinical laboratories should meet the increasing demand of clinicians and their patients. Huge central laboratories may process over 10 million tests annually; they act as production factories, measuring emergency and routine tests with sufficient speed and accuracy. At the same time, they also serve as specialized diagnostic centers where well-trained experts analyze and interpret special test results. It is essential to improve and constantly monitor this complex laboratory service, by several methods. Sample transport by pneumatic tube system, use of an advanced laboratory information system and point-of-care testing may result in decreased total turnaround time. The optimization of test ordering may result in a faster and more cost-effective laboratory service. Autovalidation can save time for laboratory specialists, when the analysis of more complex results requires their attention. Small teams of experts responsible for special diagnostic work, and their interpretative reporting according to predetermined principles, may help to minimize subjectivity of these special reports. Although laboratory investigations have become so diversely developed in the past decades, it is essential that the laboratory can provide accurate results relatively quickly, and that laboratory specialists can support the diagnosis and monitoring of patients by adequate interpretation of esoteric laboratory methods. PMID:27683528

  17. Computers and Media Centers--A Winning Combination.

    ERIC Educational Resources Information Center

    Graf, Nancy

    1984-01-01

    Profile of the computer program offered by the library/media center at Chief Joseph Junior High School in Richland, Washington, highlights program background, operator's licensing procedure, the trainer license, assistance from high school students, need for more computers, handling of software, and helpful hints. (EJS)

  18. Pulmonary Testing Laboratory Computer Application

    PubMed Central

    Johnson, Martin E.

    1980-01-01

    An interactive computer application reporting patient pulmonary function data has been developed by Washington, D.C. VA Medical Center staff. A permanent on-line data base of patient demographics, lung capacity, flows, diffusion, arterial blood gases and physician interpretation is maintained by a minicomputer at the hospital. A user oriented application program resulted from development in concert with the clinical users. Rapid program development resulted from employing a newly developed time saving technique that has found wide application at other VA Medical Centers. Careful attention to user interaction has resulted in an application program requiring little training and which has been satisfactorily used by a number of clinicians.

  19. NASA Center for Computational Sciences: History and Resources

    NASA Technical Reports Server (NTRS)

    2000-01-01

    The NASA Center for Computational Sciences (NCCS) has been a leading capacity computing facility, providing a production environment and support resources to address the challenges facing the Earth and space sciences research community.

  20. Leading Antibacterial Laboratory Research by Integrating Conventional and Innovative Approaches: The Laboratory Center of the Antibacterial Resistance Leadership Group.

    PubMed

    Manca, Claudia; Hill, Carol; Hujer, Andrea M; Patel, Robin; Evans, Scott R; Bonomo, Robert A; Kreiswirth, Barry N

    2017-03-15

    The Antibacterial Resistance Leadership Group (ARLG) Laboratory Center (LC) leads the evaluation, development, and implementation of laboratory-based research by providing scientific leadership and supporting standard/specialized laboratory services. The LC has developed a physical biorepository and a virtual biorepository. The physical biorepository contains bacterial isolates from ARLG-funded studies, housed in a centralized laboratory and available to ARLG investigators. The Web-based virtual biorepository strain catalogue includes well-characterized gram-positive and gram-negative bacterial strains published by ARLG investigators. The LC, in collaboration with the ARLG Leadership and Operations Center, developed procedures for the review and approval of strain requests, for guidance during the selection process, and for shipping strains from the distributing laboratories to the requesting investigators. ARLG strains and scientific and/or technical guidance have been provided to basic research laboratories and diagnostic companies for research and development, facilitating collaboration between diagnostic companies and the ARLG Master Protocol for Evaluating Multiple Infection Diagnostics (MASTERMIND) initiative for evaluation of multiple diagnostic devices from a single patient sampling event. In addition, the LC has completed several laboratory-based studies designed to help evaluate new rapid molecular diagnostics by developing, testing, and applying a MASTERMIND approach using purified bacterial strains. In collaboration with the ARLG's Statistical and Data Management Center (SDMC), the LC has developed novel analytical strategies that integrate microbiologic and genetic data for improved and accurate identification of antimicrobial resistance. These novel approaches will aid in the design of future ARLG studies and help correlate pathogenic markers with clinical outcomes. The LC's accomplishments are the result of a successful collaboration with the ARLG

  1. Voting with Their Seats: Computer Laboratory Design and the Casual User

    ERIC Educational Resources Information Center

    Spennemann, Dirk H. R.; Atkinson, John; Cornforth, David

    2007-01-01

    Student computer laboratories are provided by most teaching institutions around the world; however, what is the most effective layout for such facilities? The log-in data files from computer laboratories at a regional university in Australia were analysed to determine whether there was a pattern in student seating. In particular, it was…

  2. 2. CATCH BASIN, INFLOW PIPES AT CENTER, COLD FLOW LABORATORY ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. CATCH BASIN, INFLOW PIPES AT CENTER, COLD FLOW LABORATORY AT LEFT, VIEW TOWARDS NORTHWEST. - Glenn L. Martin Company, Titan Missile Test Facilities, Catch Basin, Waterton Canyon Road & Colorado Highway 121, Lakewood, Jefferson County, CO

  3. Utilization of Educationally Oriented Microcomputer Based Laboratories

    ERIC Educational Resources Information Center

    Fitzpatrick, Michael J.; Howard, James A.

    1977-01-01

    Describes one approach to supplying engineering and computer science educators with an economical portable digital systems laboratory centered around microprocessors. Expansion of the microcomputer based laboratory concept to include Learning Resource Aided Instruction (LRAI) systems is explored. (Author)

  4. An Overview of the Computational Physics and Methods Group at Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, Randal Scott

    CCS Division was formed to strengthen the visibility and impact of computer science and computational physics research on strategic directions for the Laboratory. Both computer science and computational science are now central to scientific discovery and innovation. They have become indispensable tools for all other scientific missions at the Laboratory. CCS Division forms a bridge between external partners and Laboratory programs, bringing new ideas and technologies to bear on today’s important problems and attracting high-quality technical staff members to the Laboratory. The Computational Physics and Methods Group CCS-2 conducts methods research and develops scientific software aimed at the latest and emerging HPC systems.

  5. United States Air Force Summer Research Program -- 1993. Volume 16. Arnold Engineering Development Center. Frank J. Seiler Research Laboratory. Wilford Hall Medical Center

    DTIC Science & Technology

    1993-12-01

    [Scanned title-page text only; no abstract is recoverable. The report covers the 1993 United States Air Force Summer Research Program final reports, Volume 16: Arnold Engineering Development Center, Frank J. Seiler Research Laboratory, and Wilford Hall Medical Center; Research & Development Laboratories, Culver City, CA.]

  6. Interfacing laboratory instruments to multiuser, virtual memory computers

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R.; Stang, David B.; Roth, Don J.

    1989-01-01

    Incentives, problems and solutions associated with interfacing laboratory equipment with multiuser, virtual memory computers are presented. The major difficulty concerns how to utilize these computers effectively in a medium sized research group. This entails optimization of hardware interconnections and software to facilitate multiple instrument control, data acquisition and processing. The architecture of the system that was devised, and associated programming and subroutines are described. An example program involving computer controlled hardware for ultrasonic scan imaging is provided to illustrate the operational features.
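
    The record above concerns multiple-instrument control and data acquisition on shared computers. The sketch below is a generic, hypothetical polling loop (not the authors' system or code): a stand-in read_instrument() function takes the place of real GPIB or serial-line I/O so the example stays self-contained.

        import random
        import time

        def read_instrument():
            """Stand-in for a real instrument read (e.g., over GPIB or a serial line).
            It returns a random voltage so the sketch stays self-contained."""
            return random.gauss(0.0, 1.0)

        def acquire(n_samples, interval_s=0.01):
            """Poll the instrument at a fixed interval and buffer timestamped readings."""
            buffer = []
            for _ in range(n_samples):
                buffer.append((time.time(), read_instrument()))
                time.sleep(interval_s)
            return buffer

        data = acquire(100)
        print(f"Acquired {len(data)} samples; last reading = {data[-1][1]:+.3f} V")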

  7. Saving Water at Los Alamos National Laboratory

    ScienceCinema

    Erickson, Andy

    2018-01-16

    Los Alamos National Laboratory decreased its water usage by 26 percent in 2014, with about one-third of the reduction attributable to using reclaimed water to cool a supercomputing center. The Laboratory's goal during 2014 was to use only re-purposed water to support the mission at the Strategic Computing Complex. Using reclaimed water from the Sanitary Effluent Reclamation Facility, or SERF, substantially decreased water usage and supported the overall mission. SERF collects industrial wastewater and treats it for reuse. The reclamation facility contributed more than 27 million gallons of re-purposed water to the Laboratory's computing center, a secured supercomputing facility that supports the Laboratory’s national security mission and is one of the institution’s larger water users. In addition to the strategic water reuse program at SERF, the Laboratory reduced water use in 2014 by focusing conservation efforts on areas that use the most water, upgrading to water-conserving fixtures, and repairing leaks identified in a biennial survey.

  8. Senior Computational Scientist | Center for Cancer Research

    Cancer.gov

    The Basic Science Program (BSP) pursues independent, multidisciplinary research in basic and applied molecular biology, immunology, retrovirology, cancer biology, and human genetics. Research efforts and support are an integral part of the Center for Cancer Research (CCR) at the Frederick National Laboratory for Cancer Research (FNLCR). The Cancer & Inflammation Program (CIP),

  9. A Choice of Terminals: Spatial Patterning in Computer Laboratories

    ERIC Educational Resources Information Center

    Spennemann, Dirk; Cornforth, David; Atkinson, John

    2007-01-01

    Purpose: This paper seeks to examine the spatial patterns of student use of machines in each laboratory to determine whether there are underlying commonalities. Design/methodology/approach: The research was carried out by assessing the user behaviour in 16 computer laboratories at a regional university in Australia. Findings: The study found that computers…

  10. A Low Cost Microcomputer Laboratory for Investigating Computer Architecture.

    ERIC Educational Resources Information Center

    Mitchell, Eugene E., Ed.

    1980-01-01

    Described is a microcomputer laboratory at the United States Military Academy at West Point, New York, which provides easy access to non-volatile memory and a single input/output file system for 16 microcomputer laboratory positions. A microcomputer network that has a centralized data base is implemented using the concepts of computer network…

  11. Lawrence Livermore National Laboratory's Computer Security Short Subjects Videos: Hidden Password, The Incident, Dangerous Games and The Mess; Computer Security Awareness Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    A video on computer security is described. Lonnie Moore, the Computer Security Manager, CSSM/CPPM at Lawrence Livermore National Laboratory (LLNL) and Gale Warshawsky, the Coordinator for Computer Security Education and Awareness at LLNL, wanted to share topics such as computer ethics, software piracy, privacy issues, and protecting information in a format that would capture and hold an audience's attention. Four Computer Security Short Subject videos were produced which ranged from 1-3 minutes each. These videos are very effective education and awareness tools that can be used to generate discussions about computer security concerns and good computing practices.

  12. Computer Simulations Improve University Instructional Laboratories

    PubMed Central

    2004-01-01

    Laboratory classes are commonplace and essential in biology departments but can sometimes be cumbersome, unreliable, and a drain on time and resources. As university intakes increase, pressure on budgets and staff time can often lead to reduction in practical class provision. Frequently, the ability to use laboratory equipment, mix solutions, and manipulate test animals are essential learning outcomes, and “wet” laboratory classes are thus appropriate. In others, however, interpretation and manipulation of the data are the primary learning outcomes, and here, computer-based simulations can provide a cheaper, easier, and less time- and labor-intensive alternative. We report the evaluation of two computer-based simulations of practical exercises: the first in chromosome analysis, the second in bioinformatics. Simulations can provide significant time savings to students (by a factor of four in our first case study) without affecting learning, as measured by performance in assessment. Moreover, under certain circumstances, performance can be improved by the use of simulations (by 7% in our second case study). We concluded that the introduction of these simulations can significantly enhance student learning where consideration of the learning outcomes indicates that it might be appropriate. In addition, they can offer significant benefits to teaching staff. PMID:15592599

  13. Meta-analysis of the effectiveness of computer-based laboratory versus traditional hands-on laboratory in college and pre-college science instructions

    NASA Astrophysics Data System (ADS)

    Onuoha, Cajetan O.

    The purpose of this research study was to determine the overall effectiveness of computer-based laboratory compared with the traditional hands-on laboratory for improving students' science academic achievement and attitudes towards science subjects at the college and pre-college levels of education in the United States. Meta-analysis was used to synthesize the findings from 38 primary research studies conducted and/or reported in the United States between 1996 and 2006 that compared the effectiveness of computer-based laboratory with the traditional hands-on laboratory on measures related to science academic achievement and attitudes towards science subjects. The 38 primary research studies, involving a total of 3,824 subjects, generated 67 weighted individual effect sizes that were used in this meta-analysis. The study found that computer-based laboratory had a small positive effect size over the traditional hands-on laboratory on measures related to students' science academic achievement (ES = +0.26) and attitudes towards science subjects (ES = +0.22). It was also found that computer-based laboratory produced larger effects in the physical sciences than in the biological sciences (ES = +0.34 and +0.17, respectively).
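
    The meta-analysis above reports pooled effect sizes such as ES = +0.26. As a reminder of how weighted effect sizes of this kind are commonly combined, the sketch below computes a fixed-effect, inverse-variance-weighted mean; the effect sizes and variances are hypothetical and are not taken from the studies synthesized above.

        # Fixed-effect combination of effect sizes with inverse-variance weights.
        # The effect sizes and variances below are hypothetical illustrations,
        # not values from the studies synthesized above.
        effect_sizes = [0.31, 0.12, 0.40, 0.05]   # standardized mean differences
        variances    = [0.02, 0.05, 0.03, 0.04]   # sampling variances

        weights = [1.0 / v for v in variances]     # inverse-variance weights
        pooled = sum(w * es for w, es in zip(weights, effect_sizes)) / sum(weights)
        pooled_se = (1.0 / sum(weights)) ** 0.5    # standard error of the pooled estimate

        print(f"Pooled effect size: {pooled:+.2f} (SE = {pooled_se:.2f})")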

  14. Communication and computing technology in biocontainment laboratories using the NEIDL as a model.

    PubMed

    McCall, John; Hardcastle, Kath

    2014-07-01

    The National Emerging Infectious Diseases Laboratories (NEIDL), Boston University, is a globally unique biocontainment research facility housing biosafety level 2 (BSL-2), BSL-3, and BSL-4 laboratories. Located in the BioSquare area at the University's Medical Campus, it is part of a national network of secure facilities constructed to study infectious diseases of major public health concern. The NEIDL allows for basic, translational, and clinical phases of research to be carried out in a single facility with the overall goal of accelerating understanding, treatment, and prevention of infectious diseases. The NEIDL will also act as a center of excellence providing training and education in all aspects of biocontainment research. Within every detail of NEIDL operations is a primary emphasis on safety and security. The ultramodern NEIDL has required a new approach to communications technology solutions in order to ensure safety and security and meet the needs of investigators working in this complex building. This article discusses the implementation of secure wireless networks and private cloud computing to promote operational efficiency, biosecurity, and biosafety with additional energy-saving advantages. The utilization of a dedicated data center, virtualized servers, virtualized desktop integration, multichannel secure wireless networks, and a NEIDL-dedicated Voice over Internet Protocol (VoIP) network are all discussed. © 2014 Federation of European Microbiological Societies. Published by John Wiley & Sons Ltd. All rights reserved.

  15. Examining Student Outcomes in University Computer Laboratory Environments: Issues for Educational Management

    ERIC Educational Resources Information Center

    Newby, Michael; Marcoulides, Laura D.

    2008-01-01

    Purpose: The purpose of this paper is to model the relationship between student performance, student attitudes, and computer laboratory environments. Design/methodology/approach: Data were collected from 234 college students enrolled in courses that involved the use of a computer to solve problems and provided the laboratory experience by means of…

  16. The Laboratory for Terrestrial Physics

    NASA Technical Reports Server (NTRS)

    2003-01-01

    The Laboratory for Terrestrial Physics is dedicated to the advancement of knowledge in Earth and planetary science by conducting innovative research using space technology. The Laboratory's mission and activities support the work and new initiatives at NASA's Goddard Space Flight Center (GSFC). The Laboratory's success contributes to the Earth Science Directorate as a national resource for studies of Earth from space. The Laboratory is part of the Earth Science Directorate based at GSFC in Greenbelt, MD. The Directorate itself comprises the Global Change Data Center (GCDC), the Space Data and Computing Division (SDCD), and four science laboratories, including the Laboratory for Terrestrial Physics, the Laboratory for Atmospheres, and the Laboratory for Hydrospheric Processes, all in Greenbelt, MD. The fourth research organization, the Goddard Institute for Space Studies (GISS), is in New York, NY. Relevant to NASA's Strategic Plan, the Laboratory ensures that all work undertaken and completed is within the vision of GSFC. The philosophy of the Laboratory is to balance the completion of near-term goals while building on the Laboratory's achievements as a foundation for the scientific challenges in the future.

  17. Using 3D infrared imaging to calibrate and refine computational fluid dynamic modeling for large computer and data centers

    NASA Astrophysics Data System (ADS)

    Stockton, Gregory R.

    2011-05-01

    Over the last 10 years, very large government, military, and commercial computer and data center operators have spent millions of dollars trying to cool data centers optimally as each rack has begun to consume as much as 10 times more power than just a few years ago. In fact, the maximum amount of computation in a computer center is becoming limited by the available power, space, and cooling capacity at some data centers. Tens of millions of dollars and megawatts of power are spent annually to keep data centers cool. The cooling and airflows drift away from any 3-D computational fluid dynamic model predicted during construction and as time goes by, so the efficiency and effectiveness of the actual cooling depart ever farther from the predicted models. By using 3-D infrared (IR) thermal mapping and other techniques to calibrate and refine the computational fluid dynamic modeling and to make appropriate corrections and repairs, the required power for data centers can be dramatically reduced, which lowers costs and improves reliability.

  18. Saving Water at Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erickson, Andy

    Los Alamos National Laboratory decreased its water usage by 26 percent in 2014, with about one-third of the reduction attributable to using reclaimed water to cool a supercomputing center. The Laboratory's goal during 2014 was to use only re-purposed water to support the mission at the Strategic Computing Complex. Using reclaimed water from the Sanitary Effluent Reclamation Facility, or SERF, substantially decreased water usage and supported the overall mission. SERF collects industrial wastewater and treats it for reuse. The reclamation facility contributed more than 27 million gallons of re-purposed water to the Laboratory's computing center, a secured supercomputing facility that supports the Laboratory’s national security mission and is one of the institution’s larger water users. In addition to the strategic water reuse program at SERF, the Laboratory reduced water use in 2014 by focusing conservation efforts on areas that use the most water, upgrading to water-conserving fixtures, and repairing leaks identified in a biennial survey.

  19. Hibbing Community College's Community Computer Center.

    ERIC Educational Resources Information Center

    Regional Technology Strategies, Inc., Carrboro, NC.

    This paper reports on the development of the Community Computer Center (CCC) at Hibbing Community College (HCC) in Minnesota. HCC is located in the largest iron-mining area in the United States. Closures of steel-producing plants are affecting the Hibbing area. Outmigration, particularly of younger workers and their families, has been…

  20. FY 72 Computer Utilization at the Transportation Systems Center

    DOT National Transportation Integrated Search

    1972-08-01

    The Transportation Systems Center currently employs a medley of on-site and off-site computer systems to obtain the computational support it requires. Examination of the monthly User Accountability Reports for FY72 indicated that during the fiscal ye...

  1. Center for space microelectronics technology

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The 1992 Technical Report of the Jet Propulsion Laboratory Center for Space Microelectronics Technology summarizes the technical accomplishments, publications, presentations, and patents of the center during the past year. The report lists 187 publications, 253 presentations, and 111 new technology reports and patents in the areas of solid-state devices, photonics, advanced computing, and custom microcircuits.

  2. Computers in aeronautics and space research at the Lewis Research Center

    NASA Technical Reports Server (NTRS)

    1991-01-01

    This brochure presents a general discussion of the role of computers in aerospace research at NASA's Lewis Research Center (LeRC). Four particular areas of computer applications are addressed: computer modeling and simulation, computer assisted engineering, data acquisition and analysis, and computer controlled testing.

  3. The Center for Computational Biology: resources, achievements, and challenges

    PubMed Central

    Dinov, Ivo D; Thompson, Paul M; Woods, Roger P; Van Horn, John D; Shattuck, David W; Parker, D Stott

    2011-01-01

    The Center for Computational Biology (CCB) is a multidisciplinary program where biomedical scientists, engineers, and clinicians work jointly to combine modern mathematical and computational techniques, to perform phenotypic and genotypic studies of biological structure, function, and physiology in health and disease. CCB has developed a computational framework built around the Manifold Atlas, an integrated biomedical computing environment that enables statistical inference on biological manifolds. These manifolds model biological structures, features, shapes, and flows, and support sophisticated morphometric and statistical analyses. The Manifold Atlas includes tools, workflows, and services for multimodal population-based modeling and analysis of biological manifolds. The broad spectrum of biomedical topics explored by CCB investigators includes the study of normal and pathological brain development, maturation and aging, discovery of associations between neuroimaging and genetic biomarkers, and the modeling, analysis, and visualization of biological shape, form, and size. CCB supports a wide range of short-term and long-term collaborations with outside investigators, which drive the center's computational developments and focus the validation and dissemination of CCB resources to new areas and scientific domains. PMID:22081221

  4. The Center for Computational Biology: resources, achievements, and challenges.

    PubMed

    Toga, Arthur W; Dinov, Ivo D; Thompson, Paul M; Woods, Roger P; Van Horn, John D; Shattuck, David W; Parker, D Stott

    2012-01-01

    The Center for Computational Biology (CCB) is a multidisciplinary program where biomedical scientists, engineers, and clinicians work jointly to combine modern mathematical and computational techniques, to perform phenotypic and genotypic studies of biological structure, function, and physiology in health and disease. CCB has developed a computational framework built around the Manifold Atlas, an integrated biomedical computing environment that enables statistical inference on biological manifolds. These manifolds model biological structures, features, shapes, and flows, and support sophisticated morphometric and statistical analyses. The Manifold Atlas includes tools, workflows, and services for multimodal population-based modeling and analysis of biological manifolds. The broad spectrum of biomedical topics explored by CCB investigators includes the study of normal and pathological brain development, maturation and aging, discovery of associations between neuroimaging and genetic biomarkers, and the modeling, analysis, and visualization of biological shape, form, and size. CCB supports a wide range of short-term and long-term collaborations with outside investigators, which drive the center's computational developments and focus the validation and dissemination of CCB resources to new areas and scientific domains.

  5. Managing Laboratory Data Using Cloud Computing as an Organizational Tool

    ERIC Educational Resources Information Center

    Bennett, Jacqueline; Pence, Harry E.

    2011-01-01

    One of the most significant difficulties encountered when directing undergraduate research and developing new laboratory experiments is how to efficiently manage the data generated by a number of students. Cloud computing, where both software and computer files reside online, offers a solution to this data-management problem and allows researchers…

  6. Integration of Computer Technology Into an Introductory-Level Neuroscience Laboratory

    ERIC Educational Resources Information Center

    Evert, Denise L.; Goodwin, Gregory; Stavnezer, Amy Jo

    2005-01-01

    We describe 3 computer-based neuroscience laboratories. In the first 2 labs, we used commercially available interactive software to enhance the study of functional and comparative neuroanatomy and neurophysiology. In the remaining lab, we used customized software and hardware in 2 psychophysiological experiments. With the use of the computer-based…

  7. Capability of the Gas Analysis and Testing Laboratory at the NASA Johnson Space Center

    NASA Technical Reports Server (NTRS)

    Broerman, Craig; Jimenez, Javier; Sweterlitsch, Jeff

    2012-01-01

    The Gas Analysis and Testing Laboratory is an integral part of the testing performed at the NASA Johnson Space Center. The Gas Analysis and Testing Laboratory is a high performance laboratory providing real time analytical instruments to support manned and unmanned testing. The lab utilizes precision gas chromatographs, gas analyzers and spectrophotometers to support the technology development programs within the NASA community. The Gas Analysis and Testing Laboratory works with a wide variety of customers and provides engineering support for user-specified applications in compressed gas, chemical analysis, general and research laboratory.

  8. Capability of the Gas Analysis and Testing Laboratory at the NASA Johnson Space Center

    NASA Technical Reports Server (NTRS)

    Broerman, Craig; Jimenez, Javier; Sweterlitsch, Jeff

    2011-01-01

    The Gas Analysis and Testing Laboratory is an integral part of the testing performed at the NASA Johnson Space Center. The Gas Analysis and Testing Laboratory is a high performance laboratory providing real time analytical instruments to support manned and unmanned testing. The lab utilizes precision gas chromatographs, gas analyzers and spectrophotometers to support the technology development programs within the NASA community. The Gas Analysis and Testing Laboratory works with a wide variety of customers and provides engineering support for user-specified applications in compressed gas, chemical analysis, general and research laboratory

  9. Computer Security Awareness Guide for Department of Energy Laboratories, Government Agencies, and others for use with Lawrence Livermore National Laboratory's (LLNL): Computer security short subjects videos

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    Lonnie Moore, the Computer Security Manager, CSSM/CPPM at Lawrence Livermore National Laboratory (LLNL) and Gale Warshawsky, the Coordinator for Computer Security Education & Awareness at LLNL, wanted to share topics such as computer ethics, software piracy, privacy issues, and protecting information in a format that would capture and hold an audience's attention. Four Computer Security Short Subject videos were produced which ranged from 1-3 minutes each. These videos are very effective education and awareness tools that can be used to generate discussions about computer security concerns and good computing practices. Leaders may incorporate the Short Subjects into presentations. After talking about a subject area, one of the Short Subjects may be shown to highlight that subject matter. Another method for sharing them could be to show a Short Subject first and then lead a discussion about its topic. The cast of characters and a bit of information about their personalities in the LLNL Computer Security Short Subjects is included in this report.

  10. Computer Simulation of Laboratory Experiments: An Unrealized Potential.

    ERIC Educational Resources Information Center

    Magin, D. J.; Reizes, J. A.

    1990-01-01

    Discussion of the use of computer simulation for laboratory experiments in undergraduate engineering education focuses on work at the University of New South Wales in the instructional design and software development of a package simulating a heat exchange device. The importance of integrating theory, design, and experimentation is also discussed.…

  11. National Wind Technology Center Provides Dual Axis Resonant Blade Testing

    ScienceCinema

    Felker, Fort

    2018-01-16

    NREL's Structural Testing Laboratory at the National Wind Technology Center (NWTC) provides experimental laboratories, computer facilities for analytical work, space for assembling components and turbines for atmospheric testing as well as office space for industry researchers. Fort Felker, center director at the NWTC, discusses NREL's state-of-the-art structural testing capabilities and shows a flapwise and edgewise blade test in progress.

  12. National Wind Technology Center Provides Dual Axis Resonant Blade Testing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Felker, Fort

    2013-11-13

    NREL's Structural Testing Laboratory at the National Wind Technology Center (NWTC) provides experimental laboratories, computer facilities for analytical work, space for assembling components and turbines for atmospheric testing as well as office space for industry researchers. Fort Felker, center director at the NWTC, discusses NREL's state-of-the-art structural testing capabilities and shows a flapwise and edgewise blade test in progress.

  13. Elementary and Advanced Computer Projects for the Physics Classroom and Laboratory

    DTIC Science & Technology

    1992-12-01

    The software packages used are SPF/PC, MS Word, Symphony, Mathematica, and FORTRAN. The authors' programs assist data analysis in particular laboratory experiments and make use of Monte Carlo and other numerical techniques in computer simulation. FORTRAN remains the language of science and engineering in industry and government laboratories (although C is becoming a powerful competitor). RM/FORTRAN (cost $400

  14. End-to-end remote sensing at the Science and Technology Laboratory of John C. Stennis Space Center

    NASA Technical Reports Server (NTRS)

    Kelly, Patrick; Rickman, Douglas; Smith, Eric

    1991-01-01

    The Science and Technology Laboratory (STL) of Stennis Space Center (SSC) has been developing expertise in remote sensing for more than a decade. Capabilities at SSC/STL include all major areas of the field. STL includes the Sensor Development Laboratory (SDL), Image Processing Center, a Learjet 23 flight platform, and on-staff scientific investigators.

  15. Computer validation in toxicology: historical review for FDA and EPA good laboratory practice.

    PubMed

    Brodish, D L

    1998-01-01

    The application of computer validation principles to Good Laboratory Practice is a fairly recent phenomenon. As automated data collection systems have become more common in toxicology facilities, the U.S. Food and Drug Administration and the U.S. Environmental Protection Agency have begun to focus inspections in this area. This historical review documents the development of regulatory guidance on computer validation in toxicology over the past several decades. An overview of the components of a computer life cycle is presented, including the development of systems descriptions, validation plans, validation testing, system maintenance, SOPs, change control, security considerations, and system retirement. Examples are provided for implementation of computer validation principles on laboratory computer systems in a toxicology facility.

  16. The effect of introducing computers into an introductory physics problem-solving laboratory

    NASA Astrophysics Data System (ADS)

    McCullough, Laura Ellen

    2000-10-01

    Computers are appearing in every type of classroom across the country. Yet they often appear without benefit of studying their effects. The research that is available on computer use in classrooms has found mixed results, and often ignores the theoretical and instructional contexts of the computer in the classroom. The University of Minnesota's physics department employs a cooperative-group problem solving pedagogy, based on a cognitive apprenticeship instructional model, in its calculus-based introductory physics course. This study was designed to determine possible negative effects of introducing a computerized data-acquisition and analysis tool into this pedagogy as a problem-solving tool for students to use in laboratory. To determine the effects of the computer tool, two quasi-experimental treatment groups were selected. The computer-tool group (N = 170) used a tool, designed for this study (VideoTool), to collect and analyze motion data in the laboratory. The control group (N = 170) used traditional non-computer equipment (spark tapes and Polaroid(TM) film). The curriculum was kept as similar as possible for the two groups. During the ten week academic quarter, groups were examined for effects on performance on conceptual tests and grades, attitudes towards the laboratory and the laboratory tools, and behaviors within cooperative groups. Possible interactions with gender were also examined. Few differences were found between the control and computer-tool groups. The control group received slightly higher scores on one conceptual test, but this difference was not educationally significant. The computer-tool group had slightly more positive attitudes towards using the computer tool than their counterparts had towards the traditional tools. The computer-tool group also perceived that they spoke more frequently about physics misunderstandings, while the control group felt that they discussed equipment difficulties more often. This perceptual difference interacted

  17. Computational Nanotechnology at NASA Ames Research Center, 1996

    NASA Technical Reports Server (NTRS)

    Globus, Al; Bailey, David; Langhoff, Steve; Pohorille, Andrew; Levit, Creon; Chancellor, Marisa K. (Technical Monitor)

    1996-01-01

    Some forms of nanotechnology appear to have enormous potential to improve aerospace and computer systems; computational nanotechnology, the design and simulation of programmable molecular machines, is crucial to progress. NASA Ames Research Center has begun a computational nanotechnology program including in-house work, external research grants, and grants of supercomputer time. Four goals have been established: (1) Simulate a hypothetical programmable molecular machine replicating itself and building other products. (2) Develop molecular manufacturing CAD (computer aided design) software and use it to design molecular manufacturing systems and products of aerospace interest, including computer components. (3) Characterize nanotechnologically accessible materials of aerospace interest. Such materials may have excellent strength and thermal properties. (4) Collaborate with experimentalists. Current in-house activities include: (1) Development of NanoDesign, software to design and simulate a nanotechnology based on functionalized fullerenes. Early work focuses on gears. (2) A design for high density atomically precise memory. (3) Design of nanotechnology systems based on biology. (4) Characterization of diamondoid mechanosynthetic pathways. (5) Studies of the Laplacian of the electronic charge density to understand molecular structure and reactivity. (6) Studies of entropic effects during self-assembly. (7) Characterization of properties of matter for clusters up to sizes exhibiting bulk properties. In addition, the NAS (NASA Advanced Supercomputing) supercomputer division sponsored a workshop on computational molecular nanotechnology on March 4-5, 1996, held at NASA Ames Research Center. Finally, collaborations with Bill Goddard at CalTech, Ralph Merkle at Xerox Parc, Don Brenner at NCSU (North Carolina State University), Tom McKendree at Hughes, and Todd Wipke at UCSC are underway.

  18. The Development of University Computing in Sweden 1965-1985

    NASA Astrophysics Data System (ADS)

    Dahlstrand, Ingemar

    In 1965-70 the government agency, Statskontoret, set up five university computing centers, as service bureaux financed by grants earmarked for computer use. The centers were well equipped and staffed and caused a surge in computer use. When the yearly flow of grant money stagnated at 25 million Swedish crowns, the centers had to find external income to survive and acquire time-sharing. But the charging system led to the computers not being fully used. The computer scientists lacked equipment for laboratory use. The centers were decentralized and the earmarking abolished. Eventually they got new tasks like running computers owned by the departments, and serving the university administration.

  19. Emergency preparedness for genetics centers, laboratories, and patients: the Southeast Region Genetics Collaborative strategic plan.

    PubMed

    Andersson, Hans C; Perry, William; Bowdish, Bruce; Floyd-Browning, Phaidra

    2011-10-01

    Emergencies occur unpredictably and interrupt routine genetic care. The events after hurricanes Katrina and Rita have led to the recognition that a coherent plan is necessary to ensure continuity of operations for genetic centers and laboratories, including newborn screening. No geographic region is protected from the effects of a variety of potential emergencies. Regional and national efforts have begun to address the need for such preparedness, but a plan for ensuring continuity of operations by creating an emergency preparedness plan must be developed for each genetic center and laboratory, with attention to the interests of patients. This article describes the first steps in development of an emergency preparedness plan for individual centers.

  20. Computational mechanics and physics at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    South, Jerry C., Jr.

    1987-01-01

    An overview is given of computational mechanics and physics at NASA Langley Research Center. Computational analysis is a major component and tool in many of Langley's diverse research disciplines, as well as in the interdisciplinary research. Examples are given for algorithm development and advanced applications in aerodynamics, transition to turbulence and turbulence simulation, hypersonics, structures, and interdisciplinary optimization.

  1. About Region 3's Laboratory and Field Services at EPA's Environmental Science Center

    EPA Pesticide Factsheets

    Mission & contact information for EPA Region 3's Laboratory and Field Services located at EPA's Environmental Science Center: the Office of Analytical Services and Quality Assurance & Field Inspection Program

  2. Center for Nonlinear Studies

    Science.gov Websites


  3. An Easily Assembled Laboratory Exercise in Computed Tomography

    ERIC Educational Resources Information Center

    Mylott, Elliot; Klepetka, Ryan; Dunlap, Justin C.; Widenhorn, Ralf

    2011-01-01

    In this paper, we present a laboratory activity in computed tomography (CT) primarily composed of a photogate and a rotary motion sensor that can be assembled quickly and partially automates data collection and analysis. We use an enclosure made with a light filter that is largely opaque in the visible spectrum but mostly transparent to the near…
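
    The record above describes a tabletop computed tomography exercise. To illustrate the reconstruction step such an exercise leads up to, the sketch below performs a simple unfiltered back-projection from a sinogram; it assumes the sinogram has already been measured and is not the authors' apparatus or code.

        import numpy as np

        def back_project(sinogram, angles_deg, size):
            """Very simple unfiltered back-projection.

            sinogram   : 2-D array, shape (n_angles, n_detectors)
            angles_deg : projection angle (degrees) for each sinogram row
            size       : side length in pixels of the reconstructed square image
            """
            recon = np.zeros((size, size))
            coords = np.arange(size) - (size - 1) / 2.0   # pixel centers, origin at middle
            x, y = np.meshgrid(coords, coords)
            n_det = sinogram.shape[1]

            for row, theta in zip(sinogram, np.deg2rad(angles_deg)):
                # Signed distance of each pixel from the central ray of this projection
                t = x * np.cos(theta) + y * np.sin(theta)
                # Map that distance to a detector bin and accumulate the projection value
                det = np.clip(np.round(t + (n_det - 1) / 2.0).astype(int), 0, n_det - 1)
                recon += row[det]

            return recon / len(angles_deg)

        # Example: reconstruct a 64x64 image from a hypothetical 60-angle sinogram.
        angles = np.linspace(0.0, 180.0, 60, endpoint=False)
        sinogram = np.random.rand(60, 64)   # stand-in data; a real exercise measures this
        image = back_project(sinogram, angles, 64)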

  4. Laboratory Directed Research & Development (LDRD)

    Science.gov Websites


  5. A User Assessment of Workspaces in Selected Music Education Computer Laboratories.

    ERIC Educational Resources Information Center

    Badolato, Michael Jeremy

    A study of 120 students selected from the user populations of four music education computer laboratories was conducted to determine the applicability of current ergonomic and environmental design guidelines in satisfying the needs of users of educational computing workspaces. Eleven categories of workspace factors were organized into a…

  6. Senior Laboratory Animal Technician | Center for Cancer Research

    Cancer.gov

    PROGRAM DESCRIPTION The Laboratory Animal Sciences Program (LASP) provides exceptional quality animal care and technical support services for animal research performed at the National Cancer Institute at the Frederick National Laboratory for Cancer Research. LASP executes this mission by providing a broad spectrum of state-of-the-art technologies and services that are focused on the design, generation, characterization and application of genetically engineered and biological animal models of human disease, which are aimed at the development of targeted diagnostics and therapies. LASP contributes to advancing human health, developing new treatments, and improving existing treatments for cancer and other diseases while ensuring safe and humane treatment of animals. KEY ROLES/RESPONSIBILITIES The Senior Laboratory Animal Technician will be responsible for: daily tasks associated with the care, breeding, and treatment of research animals for experimental purposes; management of rodent breeding colonies consisting of multiple, genetically complex strains, with associated record keeping and database management; colony management procedures including tail clipping, animal identification, and weaning; data entry consistent with complex colony management; collection of routine diagnostic samples; coordinating shipment of live animals and specimens; performing rodent experimental procedures including basic necropsy and blood collection; and observation and recording of physical signs of animal health. Knowledge of safe working practices using chemical carcinogens and biological hazards is required. The work schedule may include weekend and holiday hours. This position is in support of the Center for Cancer Research (CCR).

  7. Human Centered Computing for Mars Exploration

    NASA Technical Reports Server (NTRS)

    Trimble, Jay

    2005-01-01

    The science objectives are to determine the aqueous, climatic, and geologic history of a site on Mars where conditions may have been favorable to the preservation of evidence of prebiotic or biotic processes. Human Centered Computing is a development process that starts with users and their needs, rather than with technology. The goal is a system design that serves the user, where the technology fits the task and the complexity is that of the task not of the tool.

  8. FY04 Engineering Technology Reports Laboratory Directed Research and Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharpe, R M

    2005-01-27

    This report summarizes the science and technology research and development efforts in Lawrence Livermore National Laboratory's Engineering Directorate for FY2004, and exemplifies Engineering's more than 50-year history of developing the technologies needed to support the Laboratory's missions. Engineering has been a partner in every major program and project at the Laboratory throughout its existence and has prepared for this role with a skilled workforce and the technical resources developed through venues like the Laboratory Directed Research and Development Program (LDRD). This accomplishment is well summarized by Engineering's mission: "Enable program success today and ensure the Laboratory's vitality tomorrow". Engineering's investment in technologies is carried out through two programs, the "Tech Base" program and the LDRD program. LDRD is the vehicle for creating those technologies and competencies that are cutting edge. These require a significant level of research or contain some unknown that needs to be fully understood. Tech Base is used to apply technologies to a Laboratory need. The term commonly used for Tech Base projects is "reduction to practice". Therefore, the LDRD report covered here has a strong research emphasis. Areas that are presented all fall into those needed to accomplish our mission. For FY2004, Engineering's LDRD projects were focused on mesoscale target fabrication and characterization, development of engineering computational capability, material studies and modeling, remote sensing and communications, and microtechnology and nanotechnology for national security applications. Engineering's five Centers, in partnership with the Division Leaders and Department Heads, are responsible for guiding the long-term science and technology investments for the Directorate. The Centers represent technologies that have been identified as critical for the present and future work of the Laboratory, and are chartered to develop their

  9. Senior Computational Scientist | Center for Cancer Research

    Cancer.gov

    The Basic Science Program (BSP) pursues independent, multidisciplinary research in basic and applied molecular biology, immunology, retrovirology, cancer biology, and human genetics. Research efforts and support are an integral part of the Center for Cancer Research (CCR) at the Frederick National Laboratory for Cancer Research (FNLCR). The Cancer & Inflammation Program (CIP), Basic Science Program, HLA Immunogenetics Section, under the leadership of Dr. Mary Carrington, studies the influence of human leukocyte antigens (HLA) and specific KIR/HLA genotypes on risk of and outcomes to infection, cancer, autoimmune disease, and maternal-fetal disease. Recent studies have focused on the impact of HLA gene expression in disease, the molecular mechanism regulating expression levels, and the functional basis for the effect of differential expression on disease outcome. The lab’s further focus is on the genetic basis for resistance/susceptibility to disease conferred by immunogenetic variation. KEY ROLES/RESPONSIBILITIES The Senior Computational Scientist will provide research support to the CIP-BSP-HLA Immunogenetics Section performing bio-statistical design, analysis and reporting of research projects conducted in the lab. This individual will be involved in the implementation of statistical models and data preparation. The successful candidate should have: 5 or more years of competent, innovative biostatistics/bioinformatics research experience beyond doctoral training; considerable experience with statistical software such as SAS, R, and S-Plus; and sound knowledge and demonstrated experience of theoretical and applied statistics. The candidate will write program code to analyze data using statistical analysis software and contribute to the interpretation and publication of research results.

  10. Creating and Using a Computer Networking and Systems Administration Laboratory Built under Relaxed Financial Constraints

    ERIC Educational Resources Information Center

    Conlon, Michael P.; Mullins, Paul

    2011-01-01

    The Computer Science Department at Slippery Rock University created a laboratory for its Computer Networks and System Administration and Security courses under relaxed financial constraints. This paper describes the department's experience designing and using this laboratory, including lessons learned and descriptions of some student projects…

  11. A computer-based maintenance reminder and record-keeping system for clinical laboratories.

    PubMed

    Roberts, B I; Mathews, C L; Walton, C J; Frazier, G

    1982-09-01

    "Maintenance" is all the activity an organization devotes to keeping instruments within performance specifications to assure accurate and precise operation. The increasing use of complex analytical instruments as "workhorses" in clinical laboratories requires more maintenance awareness by laboratory personnel. Record-keeping systems that document maintenance completion and that should prompt the continued performance of maintenance tasks have not kept up with instrumentation development. We report here a computer-based record-keeping and reminder system that lists weekly the maintenance items due for each work station in the laboratory, including the time required to complete each item. Written in BASIC, the system uses a DATABOSS data base management system running on a time-shared Digital Equipment Corporation PDP 11/60 computer with a RSTS V 7.0 operating system.

  12. Implementing the Data Center Energy Productivity Metric in a High Performance Computing Data Center

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sego, Landon H.; Marquez, Andres; Rawson, Andrew

    2013-06-30

    As data centers proliferate in size and number, the improvement of their energy efficiency and productivity has become an economic and environmental imperative. Making these improvements requires metrics that are robust, interpretable, and practical. We discuss the properties of a number of the proposed metrics of energy efficiency and productivity. In particular, we focus on the Data Center Energy Productivity (DCeP) metric, which is the ratio of useful work produced by the data center to the energy consumed performing that work. We describe our approach for using DCeP as the principal outcome of a designed experiment using a highly instrumented, high-performance computing data center. We found that DCeP was successful in clearly distinguishing different operational states in the data center, thereby validating its utility as a metric for identifying configurations of hardware and software that would improve energy productivity. We also discuss some of the challenges and benefits associated with implementing the DCeP metric, and we examine the efficacy of the metric in making comparisons within a data center and between data centers.
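
    A minimal sketch of the metric as defined above, DCeP = useful work produced / energy consumed producing it; the job classes, weights, and energy figure are invented placeholders, not values from the instrumented data center in the study.

```python
def dcep(tasks, energy_kwh):
    """Data Center Energy Productivity: useful work per unit of energy consumed.

    `tasks` is a list of (completed_units, weight) pairs, where the weight encodes
    how valuable each unit of work is; `energy_kwh` is the total energy consumed
    by the data center over the same assessment window.
    """
    useful_work = sum(units * weight for units, weight in tasks)
    return useful_work / energy_kwh

# Hypothetical assessment window: two job classes and total facility energy
print(dcep([(1200, 1.0), (300, 2.5)], energy_kwh=850.0))
```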

  13. System reliability of randomly vibrating structures: Computational modeling and laboratory testing

    NASA Astrophysics Data System (ADS)

    Sundar, V. S.; Ammanagi, S.; Manohar, C. S.

    2015-09-01

    The problem of determination of system reliability of randomly vibrating structures arises in many application areas of engineering. We discuss in this paper approaches based on Monte Carlo simulations and laboratory testing to tackle problems of time variant system reliability estimation. The strategy we adopt is based on the application of Girsanov's transformation to the governing stochastic differential equations, which enables estimation of the probability of failure with a significantly reduced number of samples compared to what is needed in a direct simulation study. Notably, we show that the ideas from Girsanov's transformation based Monte Carlo simulations can be extended to conduct laboratory testing to assess system reliability of engineering structures with a reduced number of samples and hence reduced testing times. Illustrative examples include computational studies on a 10-degree of freedom nonlinear system model and laboratory/computational investigations on road load response of an automotive system tested on a four-post test rig.
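
    The variance-reduction idea the authors describe can be illustrated with a toy, time-invariant analogue: importance sampling of a rare event by simulating from a shifted distribution and reweighting with the likelihood ratio. This sketch is not the authors' Girsanov-transformed SDE formulation; the threshold and sample count are arbitrary.

```python
import random, math

def failure_prob_importance_sampling(threshold, n_samples=10_000, seed=0):
    """Estimate P(X > threshold) for X ~ N(0, 1) by sampling from N(threshold, 1)
    and reweighting with the likelihood ratio, so that rare failures are hit often."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        x = rng.gauss(threshold, 1.0)           # biased (shifted) sample
        if x > threshold:                       # failure indicator
            # Likelihood ratio N(0,1)/N(threshold,1) evaluated at x
            total += math.exp(-threshold * x + 0.5 * threshold ** 2)
    return total / n_samples

print(failure_prob_importance_sampling(4.0))   # ~3e-5; direct MC would need millions of samples
```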

  14. For operation of the Computer Software Management and Information Center (COSMIC)

    NASA Technical Reports Server (NTRS)

    Carmon, J. L.

    1983-01-01

    During the month of June, the Survey Research Center (SRC) at the University of Georgia designed new benefits questionnaires for computer software management and information center (COSMIC). As a test of their utility, these questionnaires are now used in the benefits identification process.

  15. Staff | Computational Science | NREL

    Science.gov Websites

    develops and leads laboratory-wide efforts in high-performance computing and energy-efficient data centers. IT Professional IV - High Perf Computing, Jim.Albin@nrel.gov, 303-275-4069; Ananthan, Shreyas, Senior Scientist - High-Performance Algorithms and Modeling, Shreyas.Ananthan@nrel.gov, 303-275-4807; Bendl, Kurt, IT Professional IV - High

  16. Hyperspectral Remote Sensing and Ecological Modeling Research and Education at Mid America Remote Sensing Center (MARC): Field and Laboratory Enhancement

    NASA Technical Reports Server (NTRS)

    Cetin, Haluk

    1999-01-01

    The purpose of this project was to establish a new hyperspectral remote sensing laboratory at the Mid-America Remote Sensing Center (MARC), dedicated to in situ and laboratory measurements of environmental samples and to the manipulation, analysis, and storage of remotely sensed data for environmental monitoring and research in ecological modeling using hyperspectral remote sensing at MARC, one of three research facilities of the Center of Reservoir Research at Murray State University (MSU), a Kentucky Commonwealth Center of Excellence. The equipment purchased, a FieldSpec FR portable spectroradiometer and peripherals, and ENVI hyperspectral data processing software, allowed MARC to provide hands-on experience, education, and training for the students of the Department of Geosciences in quantitative remote sensing using hyperspectral data, Geographic Information System (GIS), digital image processing (DIP), computer, geological and geophysical mapping; to provide field support to the researchers and students collecting in situ and laboratory measurements of environmental data; to create a spectral library of the cover types and to establish a World Wide Web server to provide the spectral library to other academic, state and Federal institutions. Much of the research will soon be published in scientific journals. A World Wide Web page has been created at the web site of MARC. Results of this project are grouped in two categories, education and research accomplishments. The Principal Investigator (PI) modified remote sensing and DIP courses to introduce students to in situ field spectra and laboratory remote sensing studies for environmental monitoring in the region by using the new equipment in the courses. The PI collected in situ measurements using the spectroradiometer for the ER-2 mission to Puerto Rico project for the Moderate Resolution Imaging Spectrometer (MODIS) Airborne Simulator (MAS). Currently MARC is mapping water quality in Kentucky Lake and

  17. Optimizing physician access to surgical intensive care unit laboratory information through mobile computing.

    PubMed

    Strain, J J; Felciano, R M; Seiver, A; Acuff, R; Fagan, L

    1996-01-01

    Approximately 30 minutes of computer access time are required by surgical residents at Stanford University Medical Center (SUMC) to examine the lab values of all patients on a surgical intensive care unit (ICU) service, a task that must be performed several times a day. To reduce the time accessing this information and simultaneously increase the readability and currency of the data, we have created a mobile, pen-based user interface and software system that delivers lab results to surgeons in the ICU. The ScroungeMaster system, loaded on a portable tablet computer, retrieves lab results for a subset of patients from the central laboratory computer and stores them in a local database cache. The cache can be updated on command; this update takes approximately 2.7 minutes for all ICU patients being followed by the surgeon, and can be performed as a background task while the user continues to access selected lab results. The user interface presents lab results according to physiologic system. Which labs are displayed first is governed by a layout selection algorithm based on previous accesses to the patient's lab information, physician preferences, and the nature of the patient's medical condition. Initial evaluation of the system has shown that physicians prefer the ScroungeMaster interface to that of existing systems at SUMC and are satisfied with the system's performance. We discuss the evolution of ScroungeMaster and make observations on changes to physician work flow with the presence of mobile, pen-based computing in the ICU.
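
    A rough sketch of the local-cache pattern described above (a command-driven refresh of lab results for the patients being followed, usable as a background task); the class and the `fetch_labs` callable are hypothetical stand-ins, not ScroungeMaster's actual design.

```python
import time

class LabCache:
    """Toy local cache of lab results, refreshed on command from a central source.
    `fetch_labs` stands in for the hospital laboratory interface (hypothetical)."""

    def __init__(self, fetch_labs):
        self.fetch_labs = fetch_labs
        self.results = {}          # patient_id -> {test_name: value}
        self.last_refresh = None

    def refresh(self, patient_ids):
        """Update the cache for the followed patients; callable as a background task."""
        for pid in patient_ids:
            self.results[pid] = self.fetch_labs(pid)
        self.last_refresh = time.time()

    def get(self, patient_id, test_name):
        """Return a cached value (or None) without contacting the central system."""
        return self.results.get(patient_id, {}).get(test_name)
```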

  18. Computer listing of the effects of drugs on laboratory data

    PubMed Central

    Young, D. S.; Thomas, D. W.; Friedman, R. B.

    1972-01-01

    A listing of approximately 10000 effects of drugs on tests performed in clinical laboratories has been developed in a time-shared computer. The list contains a directory for matching proprietary and generic names of drugs and an explanation for the mode of action of the drug on each test. Each entry is supported by a bibliographical reference that contains the author's names, and the title of the article and journal. It is possible to search for specific `character strings' (word or words, number, etc) to obtain all the effects of a particular drug, or all drugs that affect a particular test, or even to search for a specific explanation for an effect. The system is undergoing trial in the Department's own computer to permit of automatic correlation of the effects of drugs with laboratory data from patients in one hospital ward. PMID:4648544
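
    A minimal sketch of the character-string search described above, assuming each listing entry is a tuple of text fields; the example drugs, effects, and reference identifiers are placeholders, not entries from the actual listing.

```python
# Hypothetical entries modeled on the listing: (proprietary name, generic name,
# laboratory test, direction of effect, explanation, bibliographic reference id)
ENTRIES = [
    ("Tylenol", "acetaminophen", "serum AST", "increase",
     "hepatotoxic at high doses", "ref-001"),
    ("Lasix", "furosemide", "serum potassium", "decrease",
     "renal potassium loss", "ref-002"),
]

def search(term):
    """Return every entry containing the character string `term` in any field,
    so one query lists all effects of a drug or all drugs affecting a test."""
    term = term.lower()
    return [e for e in ENTRIES if any(term in field.lower() for field in e)]

print(search("furosemide"))   # all effects of one drug
print(search("serum AST"))    # all drugs affecting one test
```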

  19. A Queue Simulation Tool for a High Performance Scientific Computing Center

    NASA Technical Reports Server (NTRS)

    Spear, Carrie; McGalliard, James

    2007-01-01

    The NASA Center for Computational Sciences (NCCS) at the Goddard Space Flight Center provides high performance highly parallel processors, mass storage, and supporting infrastructure to a community of computational Earth and space scientists. Long running (days) and highly parallel (hundreds of CPUs) jobs are common in the workload. NCCS management structures batch queues and allocates resources to optimize system use and prioritize workloads. NCCS technical staff use a locally developed discrete event simulation tool to model the impacts of evolving workloads, potential system upgrades, alternative queue structures and resource allocation policies.
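
    A toy discrete-event sketch of the kind of batch-queue question such a tool answers (how long jobs wait under a given scheduling policy and machine size); this is a first-come-first-served illustration only, not the NCCS simulation tool, and the workload is invented.

```python
import heapq

def simulate_fifo_queue(jobs, total_cpus):
    """Simulate a FIFO batch queue: jobs = (arrival, cpus, runtime) on a machine
    with `total_cpus` processors. Returns the average wait time."""
    jobs = sorted(jobs)                      # order by arrival time
    running = []                             # heap of (finish_time, cpus)
    free = total_cpus
    clock = 0.0
    waits = []
    for arrival, cpus, runtime in jobs:
        clock = max(clock, arrival)
        # Wait for enough running jobs to finish before this one can start
        while free < cpus:
            finish, c = heapq.heappop(running)
            clock = max(clock, finish)
            free += c
        waits.append(clock - arrival)
        heapq.heappush(running, (clock + runtime, cpus))
        free -= cpus
    return sum(waits) / len(waits)

# Hypothetical workload: (arrival hour, CPUs requested, runtime in hours)
print(simulate_fifo_queue([(0, 128, 10), (1, 256, 24), (2, 64, 2)], total_cpus=384))
```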

  20. An approach to quality and performance control in a computer-assisted clinical chemistry laboratory.

    PubMed Central

    Undrill, P E; Frazer, S C

    1979-01-01

    A locally developed, computer-based clinical chemistry laboratory system has been in operation since 1970. This utilises a Digital Equipment Co Ltd PDP 12 and an interconnected PDP 8/F computer. Details are presented of the performance and quality control techniques incorporated into the system. Laboratory performance is assessed through analysis of results from fixed-level control sera as well as from cumulative sum methods. At a simple level the presentation may be considered purely indicative, while at a more sophisticated level statistical concepts have been introduced to aid the laboratory controller in decision-making processes. PMID:438340
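
    A minimal sketch of the cumulative-sum idea mentioned above, applied to daily control-serum results: deviations beyond a slack band are accumulated and flagged when they exceed a decision threshold. The analyte, target, standard deviation, and tuning constants are hypothetical.

```python
def cusum(values, target, sd, k=0.5, h=4.0):
    """Tabular CUSUM for control-serum results: accumulate deviations beyond a
    slack of k*sd and flag when either running sum exceeds h*sd."""
    hi = lo = 0.0
    flags = []
    for x in values:
        hi = max(0.0, hi + (x - target) - k * sd)
        lo = max(0.0, lo + (target - x) - k * sd)
        flags.append(hi > h * sd or lo > h * sd)
    return flags

# Hypothetical daily glucose control results (mmol/L) drifting upward from a 5.5 target
print(cusum([5.5, 5.6, 5.4, 5.8, 5.9, 6.0, 6.1, 6.2], target=5.5, sd=0.15))
```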

  1. Final Report. Center for Scalable Application Development Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mellor-Crummey, John

    2014-10-26

    The Center for Scalable Application Development Software (CScADS) was established as a partnership between Rice University, Argonne National Laboratory, University of California Berkeley, University of Tennessee – Knoxville, and University of Wisconsin – Madison. CScADS pursued an integrated set of activities with the aim of increasing the productivity of DOE computational scientists by catalyzing the development of systems software, libraries, compilers, and tools for leadership computing platforms. Principal Center activities were workshops to engage the research community in the challenges of leadership computing, research and development of open-source software, and work with computational scientists to help them develop codes for leadership computing platforms. This final report summarizes CScADS activities at Rice University in these areas.

  2. The Mathematics and Computer Science Learning Center (MLC).

    ERIC Educational Resources Information Center

    Abraham, Solomon T.

    The Mathematics and Computer Science Learning Center (MLC) was established in the Department of Mathematics at North Carolina Central University during the fall semester of the 1982-83 academic year. The initial operations of the MLC were supported by grants to the University from the Burroughs-Wellcome Company and the Kenan Charitable Trust Fund.…

  3. Computer-simulated laboratory explorations for middle school life, earth, and physical Science

    NASA Astrophysics Data System (ADS)

    von Blum, Ruth

    1992-06-01

    Explorations in Middle School Science is a set of 72 computer-simulated laboratory lessons in life, earth, and physical science for grades 6-9 developed by Jostens Learning Corporation with grants from the California State Department of Education and the National Science Foundation. At the heart of each lesson is a computer-simulated laboratory that actively involves students in doing science, improving their: (1) understanding of science concepts by applying critical thinking to solve real problems; (2) skills in scientific processes and communications; and (3) attitudes about science. Students use on-line tools (notebook, calculator, word processor) to undertake in-depth investigations of phenomena (like motion in outer space, disease transmission, volcanic eruptions, or the structure of the atom) that would be too difficult, dangerous, or outright impossible to do in a “live” laboratory. Suggested extension activities lead students to hands-on investigations, away from the computer. This article presents the underlying rationale, instructional model, and process by which Explorations was designed and developed. It also describes the general courseware structure and three lessons in detail, as well as presenting preliminary data from the evaluation. Finally, it suggests a model for incorporating technology into the science classroom.

  4. Computer based human-centered display system

    NASA Technical Reports Server (NTRS)

    Temme, Leonard A. (Inventor); Still, David L. (Inventor)

    2002-01-01

    A human centered informational display is disclosed that can be used with vehicles (e.g. aircraft) and in other operational environments where rapid human centered comprehension of an operational environment is required. The informational display integrates all cockpit information into a single display in such a way that the pilot can clearly understand, at a glance, his or her spatial orientation, flight performance, engine status and power management issues, radio aids, and the location of other air traffic, runways, weather, and terrain features. With OZ, the information is presented as an integrated whole; the pilot instantaneously recognizes flight path deviations and is instinctively drawn to the corrective maneuvers. Our laboratory studies indicate that OZ transfers to the pilot all of the integrated display information in less than 200 milliseconds. The reacquisition of scan can be accomplished just as quickly. Thus, the time constants for forming a mental model are near instantaneous. The pilot's ability to keep up with rapidly changing and threatening environments is tremendously enhanced. OZ is most easily compatible with aircraft that have flight path information coded electronically. With the correct sensors (which are currently available) OZ can be installed in essentially all current aircraft.

  5. JobCenter: an open source, cross-platform, and distributed job queue management system optimized for scalability and versatility.

    PubMed

    Jaschob, Daniel; Riffle, Michael

    2012-07-30

    Laboratories engaged in computational biology or bioinformatics frequently need to run lengthy, multistep, and user-driven computational jobs. Each job can tie up a computer for a few minutes to several days, and many laboratories lack the expertise or resources to build and maintain a dedicated computer cluster. JobCenter is a client-server application and framework for job management and distributed job execution. The client and server components are both written in Java and are cross-platform and relatively easy to install. All communication with the server is client-driven, which allows worker nodes to run anywhere (even behind external firewalls or "in the cloud") and provides inherent load balancing. Adding a worker node to the worker pool is as simple as dropping the JobCenter client files onto any computer and performing basic configuration, which provides tremendous ease-of-use, flexibility, and limitless horizontal scalability. Each worker installation may be independently configured, including the types of jobs it is able to run. Executed jobs may be written in any language and may include multistep workflows. JobCenter is a versatile and scalable distributed job management system that allows laboratories to very efficiently distribute all computational work among available resources. JobCenter is freely available at http://code.google.com/p/jobcenter/.
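
    A rough sketch of the client-driven polling pattern described above, in Python rather than Java; the server URL, job fields, and reporting format are hypothetical, not JobCenter's actual protocol.

```python
import json, subprocess, time, urllib.request

SERVER = "http://jobserver.example.org/next-job"   # hypothetical endpoint

def worker_loop(poll_seconds=30):
    """Client-driven worker: poll the server for work, run it, report back.
    Because the worker initiates every connection, it can sit behind a firewall
    or in the cloud, and load balancing falls out naturally."""
    while True:
        try:
            with urllib.request.urlopen(SERVER) as resp:
                job = json.load(resp)              # e.g. {"id": 7, "command": ["blastp", "..."]}
        except OSError:
            job = None                             # server unreachable; try again later
        if job:
            result = subprocess.run(job["command"], capture_output=True)
            report = {"id": job["id"], "returncode": result.returncode}
            req = urllib.request.Request(
                SERVER, data=json.dumps(report).encode(), method="POST")
            urllib.request.urlopen(req)
        else:
            time.sleep(poll_seconds)
```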

  6. Williams uses computer in the U.S. Laboratory during Expedition 13

    NASA Image and Video Library

    2006-04-11

    ISS013-E-05853 (11 April 2006) --- Astronaut Jeffrey N. Williams, Expedition 13 NASA space station science officer and flight engineer, uses a computer in the Destiny laboratory of the International Space Station.

  7. Patient-centered computing: can it curb malpractice risk?

    PubMed

    Bartlett, E E

    1993-01-01

    The threat of a medical malpractice suit represents a major cause of career dissatisfaction for American physicians. Patient-centered computing may improve physician-patient communications, thereby reducing liability risk. This review describes programs that have sought to enhance patient education and involvement pertaining to 5 major categories of malpractice lawsuits: Diagnosis, medications, obstetrics, surgery, and treatment errors.

  8. Patient-centered computing: can it curb malpractice risk?

    PubMed Central

    Bartlett, E. E.

    1993-01-01

    The threat of a medical malpractice suit represents a major cause of career dissatisfaction for American physicians. Patient-centered computing may improve physician-patient communications, thereby reducing liability risk. This review describes programs that have sought to enhance patient education and involvement pertaining to 5 major categories of malpractice lawsuits: Diagnosis, medications, obstetrics, surgery, and treatment errors. PMID:8130563

  9. Books, Bytes, and Bridges: Libraries and Computer Centers in Academic Institutions.

    ERIC Educational Resources Information Center

    Hardesty, Larry, Ed.

    This book about the relationship between computer centers and libraries at academic institutions contains the following chapters: (1) "A History of the Rhetoric and Reality of Library and Computing Relationships" (Peggy Seiden and Michael D. Kathman); (2) "An Issue in Search of a Metaphor: Readings on the Marriageability of…

  10. An Evaluation of Student Perceptions of Screen Presentations in Computer-based Laboratory Simulations.

    ERIC Educational Resources Information Center

    Edward, Norrie S.

    1997-01-01

    Evaluates the importance of realism in the screen presentation of the plant in computer-based laboratory simulations for part-time engineering students. Concludes that simulations are less effective than actual laboratories but that realism minimizes the disadvantages. The schematic approach was preferred for ease of use. (AIM)

  11. The Center for Nanophase Materials Sciences

    NASA Astrophysics Data System (ADS)

    Lowndes, Douglas

    2005-03-01

    The Center for Nanophase Materials Sciences (CNMS) located at Oak Ridge National Laboratory (ORNL) will be the first DOE Nanoscale Science Research Center to begin operation, with construction to be completed in April 2005 and initial operations in October 2005. The CNMS' scientific program has been developed through workshops with the national community, with the goal of creating a highly collaborative research environment to accelerate discovery and drive technological advances. Research at the CNMS is organized under seven Scientific Themes selected to address challenges to understanding and to exploit particular ORNL strengths (see http://cnms.ornl.gov). These include extensive synthesis and characterization capabilities for soft, hard, nanostructured, magnetic and catalytic materials and their composites; neutron scattering at the Spallation Neutron Source and High Flux Isotope Reactor; computational nanoscience in the CNMS' Nanomaterials Theory Institute and utilizing facilities and expertise of the Center for Computational Sciences and the new Leadership Scientific Computing Facility at ORNL; a new CNMS Nanofabrication Research Laboratory; and a suite of unique and state-of-the-art instruments to be made reliably available to the national community for imaging, manipulation, and properties measurements on nanoscale materials in controlled environments. The new research facilities will be described together with the planned operation of the user research program, the latter illustrated by the current ``jump start'' user program that utilizes existing ORNL/CNMS facilities.

  12. A multipurpose computing center with distributed resources

    NASA Astrophysics Data System (ADS)

    Chudoba, J.; Adam, M.; Adamová, D.; Kouba, T.; Mikula, A.; Říkal, V.; Švec, J.; Uhlířová, J.; Vokáč, P.; Svatoš, M.

    2017-10-01

    The Computing Center of the Institute of Physics (CC IoP) of the Czech Academy of Sciences serves a broad spectrum of users with various computing needs. It runs the WLCG Tier-2 center for the ALICE and the ATLAS experiments; the same group of services is used by the astroparticle physics projects the Pierre Auger Observatory (PAO) and the Cherenkov Telescope Array (CTA). The OSG stack is installed for the NOvA experiment. Other user groups directly use the local batch system. Storage capacity is distributed to several locations. DPM servers used by the ATLAS and the PAO are all in the same server room, but several xrootd servers for the ALICE experiment are operated in the Nuclear Physics Institute in Řež, about 10 km away. The storage capacity for the ATLAS and the PAO is extended by resources of CESNET - the Czech National Grid Initiative representative. Those resources are in Plzen and Jihlava, more than 100 km away from the CC IoP. Both distant sites use a hierarchical storage solution based on disks and tapes. They installed one common dCache instance, which is published in the CC IoP BDII. ATLAS users can use these resources using the standard ATLAS tools in the same way as the local storage without noticing this geographical distribution. Computing clusters LUNA and EXMAG dedicated to users mostly from the Solid State Physics departments offer resources for parallel computing. They are part of the Czech NGI infrastructure MetaCentrum with a distributed batch system based on torque with a custom scheduler. Clusters are installed remotely by the MetaCentrum team and a local contact helps only when needed. Users from IoP have exclusive access only to a part of these two clusters and take advantage of higher priorities on the rest (1500 cores in total), which can also be used by any user of the MetaCentrum. IoP researchers can also use distant resources located in several towns of the Czech Republic with a capacity of more than 12000 cores in total.

  13. Institute for scientific computing research;fiscal year 1999 annual report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keyes, D

    2000-03-28

    Large-scale scientific computation, and all of the disciplines that support it and help to validate it, have been placed at the focus of Lawrence Livermore National Laboratory by the Accelerated Strategic Computing Initiative (ASCI). The Laboratory operates the computer with the highest peak performance in the world and has undertaken some of the largest and most compute-intensive simulations ever performed. Computers at the architectural extremes, however, are notoriously difficult to use efficiently. Even such successes as the Laboratory's two Bell Prizes awarded in November 1999 only emphasize the need for much better ways of interacting with the results of large-scale simulations. Advances in scientific computing research have, therefore, never been more vital to the core missions of the Laboratory than at present. Computational science is evolving so rapidly along every one of its research fronts that to remain on the leading edge, the Laboratory must engage researchers at many academic centers of excellence. In FY 1999, the Institute for Scientific Computing Research (ISCR) has expanded the Laboratory's bridge to the academic community in the form of collaborative subcontracts, visiting faculty, student internships, a workshop, and a very active seminar series. ISCR research participants are integrated almost seamlessly with the Laboratory's Center for Applied Scientific Computing (CASC), which, in turn, addresses computational challenges arising throughout the Laboratory. Administratively, the ISCR flourishes under the Laboratory's University Relations Program (URP). Together with the other four Institutes of the URP, it must navigate a course that allows the Laboratory to benefit from academic exchanges while preserving national security. Although FY 1999 brought more than its share of challenges to the operation of an academic-like research enterprise within the context of a national security laboratory, the results declare the challenges well met

  14. [AERA. Dream machines and computing practices at the Mathematical Center].

    PubMed

    Alberts, Gerard; De Beer, Huub T

    2008-01-01

    Dream machines may be just as effective as the ones materialised. Their symbolic thrust can be quite powerful. The Amsterdam 'Mathematisch Centrum' (Mathematical Center), founded February 11, 1946, created a Computing Department in an effort to realise its goal of serving society. When Aad van Wijngaarden was appointed as head of the Computing Department, however, he claimed space for scientific research and computer construction, next to computing as a service. Still, the computing service following the five stage style of Hartree's numerical analysis remained a dominant characteristic of the work of the Computing Department. The high level of ambition held by Aad van Wijngaarden led to ever renewed projections of big automatic computers, symbolised by the never-built AERA. Even a machine that was actually constructed, the ARRA, which followed A.D. Booth's design of the ARC, never made it into real operation. It did serve Van Wijngaarden to bluff his way into the computer age by midsummer 1952. Not until January 1954 did the computing department have a working stored program computer, which for reasons of policy went under the same name: ARRA. After just one other machine, the ARMAC, had been produced, a separate company, Electrologica, was set up for the manufacture of computers, which produced the rather successful X1 computer. The combination of ambition and absence of a working machine led to a high level of work on programming, way beyond the usual ideas of libraries of subroutines. Edsger W. Dijkstra in particular led the way to an emphasis on the duties of the programmer within the pattern of numerical analysis. Programs generating programs, known elsewhere as autocoding systems, were at the 'Mathematisch Centrum' called 'superprograms'. Practical examples were usually called a 'complex', in Dutch, where in English one might say 'system'. Historically, this is where software begins. Dekker's matrix complex, Dijkstra's interrupt system, Dijkstra and

  15. A Study of the Relative Effectiveness of Content and Process Centered Biology Laboratories for College Freshmen.

    ERIC Educational Resources Information Center

    Murphy, Glenn Wayne

    The relative effectiveness of "content-centered" and "process-centered" biology laboratory courses in a freshman general biology course was investigated by administering the Nelson Biology Test, Science Attitude Scale, EPS II (a problem solving test), and an Interest Inventory at the beginning and end of the one quarter course. Course examination…

  16. Applied Computational Fluid Dynamics at NASA Ames Research Center

    NASA Technical Reports Server (NTRS)

    Holst, Terry L.; Kwak, Dochan (Technical Monitor)

    1994-01-01

    The field of Computational Fluid Dynamics (CFD) has advanced to the point where it can now be used for many applications in fluid mechanics research and aerospace vehicle design. A few applications being explored at NASA Ames Research Center will be presented and discussed. The examples presented will range in speed from hypersonic to low speed incompressible flow applications. Most of the results will be from numerical solutions of the Navier-Stokes or Euler equations in three space dimensions for general geometry applications. Computational results will be used to highlight the presentation as appropriate. Advances in computational facilities including those associated with NASA's CAS (Computational Aerosciences) Project of the Federal HPCC (High Performance Computing and Communications) Program will be discussed. Finally, opportunities for future research will be presented and discussed. All material will be taken from non-sensitive, previously-published and widely-disseminated work.

  17. Serological Diagnosis of Paracoccidioidomycosis: High Rate of Inter-laboratorial Variability among Medical Mycology Reference Centers

    PubMed Central

    Vidal, Monica Scarpelli Martinelli; Del Negro, Gilda Maria Barbaro; Vicentini, Adriana Pardini; Svidzinski, Teresinha Inez Estivalet; Mendes-Giannini, Maria Jose; Almeida, Ana Marisa Fusco; Martinez, Roberto; de Camargo, Zoilo Pires; Taborda, Carlos Pelleschi; Benard, Gil

    2014-01-01

    Background: Serological tests have long been established as rapid, simple and inexpensive tools for the diagnosis and follow-up of PCM. However, different protocols and antigen preparations are used and the few attempts to standardize the routine serological methods have not succeeded. Methodology/Principal findings: We compared the performance of six Brazilian reference centers for serological diagnosis of PCM. Each center provided 30 sera of PCM patients, with positive high, intermediate and low titers, which were defined as the “reference” titers. Each center then applied its own antigen preparation and serological routine test, either semiquantitative double immunodiffusion or counterimmunoelectrophoresis, to the 150 sera from the other five centers, blindly with regard to the “reference” titers. Titers were transformed into scores: 0 (negative), 1 (healing titers), 2 (active disease, low titers) and 3 (active disease, high titers) according to each center's criteria. Major discordances were considered between scores indicating active disease and scores indicating negative or healing titers; such discordance, when associated with proper clinical and other laboratorial data, may correspond to different approaches to the patient's treatment. Surprisingly, all centers exhibited a high rate of “major” discordances, with a mean of 31 (20%) discordant scores. Alternatively, when the scores given by one center to their own sera were compared with the scores given to their sera by the remaining five other centers, a high rate of major discordances was also found, with a mean number of 14.8 sera in 30 presenting a discordance with at least one other center. The data also suggest that centers that used CIE and a pool of isolates for antigen preparation performed better. Conclusion: There are inconsistencies among the laboratories that are strong enough to result in conflicting information regarding the patients' treatment. Renewed efforts should be promoted to improve

  18. A Computer Learning Center for Environmental Sciences

    NASA Technical Reports Server (NTRS)

    Mustard, John F.

    2000-01-01

    In the fall of 1998, MacMillan Hall opened at Brown University to students. In MacMillan Hall was the new Computer Learning Center, since named the EarthLab, which was outfitted with high-end workstations and peripherals primarily focused on the use of remotely sensed and other spatial data in the environmental sciences. The NASA grant we received as part of the "Centers of Excellence in Applications of Remote Sensing to Regional and Global Integrated Environmental Assessments" was the primary source of funds to outfit this learning and research center. Since opening, we have expanded the range of learning and research opportunities and integrated a cross-campus network of disciplines who have come together to learn and use spatial data of all kinds. The EarthLab also forms a core of undergraduate, graduate, and faculty research on environmental problems that draw upon the unique perspective of remotely sensed data. Over the last two years, the EarthLab has been a center for research on the environmental impact of water resource use in arid regions, the impact of the green revolution on forest cover in India, the design of forest preserves in Vietnam, and detailed assessments of the utility of thermal and hyperspectral data for water quality analysis. It has also been used extensively for local environmental activities, in particular studies on the impact of lead on the health of urban children in Rhode Island. Finally, the EarthLab has also served as a key educational and analysis center for activities related to the Brown University Affiliated Research Center that is devoted to transferring university research to the private sector.

  19. Computational Protein Engineering: Bridging the Gap between Rational Design and Laboratory Evolution

    PubMed Central

    Barrozo, Alexandre; Borstnar, Rok; Marloie, Gaël; Kamerlin, Shina Caroline Lynn

    2012-01-01

    Enzymes are tremendously proficient catalysts, which can be used as extracellular catalysts for a whole host of processes, from chemical synthesis to the generation of novel biofuels. For them to be more amenable to the needs of biotechnology, however, it is often necessary to be able to manipulate their physico-chemical properties in an efficient and streamlined manner, and, ideally, to be able to train them to catalyze completely new reactions. Recent years have seen an explosion of interest in different approaches to achieve this, both in the laboratory, and in silico. There remains, however, a gap between current approaches to computational enzyme design, which have primarily focused on the early stages of the design process, and laboratory evolution, which is an extremely powerful tool for enzyme redesign, but will always be limited by the vastness of sequence space combined with the low frequency for desirable mutations. This review discusses different approaches towards computational enzyme design and demonstrates how combining newly developed screening approaches that can rapidly predict potential mutation “hotspots” with approaches that can quantitatively and reliably dissect the catalytic step can bridge the gap that currently exists between computational enzyme design and laboratory evolution studies. PMID:23202907

  20. Postdoctoral Fellow | Center for Cancer Research

    Cancer.gov

    The Neuro-Oncology Branch (NOB), Center for Cancer Research (CCR), National Cancer Institute (NCI) of the National Institutes of Health (NIH) is seeking outstanding postdoctoral candidates interested in studying metabolic and cell signaling pathways in the context of brain cancers through construction of computational models amenable to formal computational analysis and simulation. The ability to closely collaborate with the modern metabolomics center developed at CCR provides a unique opportunity for a postdoctoral candidate with a strong theoretical background and interest in demonstrating the incredible potential of computational approaches to solve problems from scientific disciplines and improve lives. The candidate will be given the opportunity to both construct data-driven models, as well as biologically validate the models by demonstrating the ability to predict the effects of altering tumor metabolism in laboratory and clinical settings.

  1. Research in mobile robotics at ORNL/CESAR (Oak Ridge National Laboratory/Center for Engineering Systems Advanced Research)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mann, R.C.; Weisbin, C.R.; Pin, F.G.

    1989-01-01

    This paper reviews ongoing and planned research with mobile autonomous robots at the Oak Ridge National Laboratory (ORNL), Center for Engineering Systems Advanced Research (CESAR). Specifically we report on results obtained with the robot HERMIES-IIB in navigation, intelligent sensing, learning, and on-board parallel computing in support of these functions. We briefly summarize an experiment with HERMIES-IIB that demonstrates the capability of smooth transitions between robot autonomy and tele-operation. This experiment results from collaboration among teams at the Universities of Florida, Michigan, Tennessee, and Texas; and ORNL in a program targeted at robotics for advanced nuclear power stations. We conclude by summarizing ongoing R&D with our new mobile robot HERMIES-III, which is equipped with a seven degree-of-freedom research manipulator arm. 12 refs., 4 figs.

  2. Internship at NASA Kennedy Space Center's Cryogenic Test laboratory

    NASA Technical Reports Server (NTRS)

    Holland, Katherine

    2013-01-01

    NASA's Kennedy Space Center (KSC) is known for hosting all of the United States manned rocket launches as well as many unmanned launches at low inclinations. Even though the Space Shuttle recently retired, they are continuing to support unmanned launches and modifying manned launch facilities. Before a rocket can be launched, it has to go through months of preparation, called processing. Pieces of a rocket and its payload may come in from anywhere in the nation or even the world. The facilities all around the center help integrate the rocket and prepare it for launch. As NASA prepares for the Space Launch System, a rocket designed to take astronauts beyond Low Earth Orbit throughout the solar system, technology development is crucial for enhancing launch capabilities at the KSC. The Cryogenics Test Laboratory at Kennedy Space Center greatly contributes to cryogenic research and technology development. The engineers and technicians that work there come up with new ways to efficiently store and transfer liquid cryogens. NASA has a great need for this research and technology development as it deals with cryogenic liquid hydrogen and liquid oxygen for rocket fuel, as well as long term space flight applications. Additionally, in this new era of space exploration, the Cryogenics Test Laboratory works with the commercial sector. One technology development project is the Liquid Hydrogen (LH2) Ground Operations Demonstration Unit (GODU). LH2 GODU intends to demonstrate increased efficiency in storing and transferring liquid hydrogen during processing, loading, launch and spaceflight of a spacecraft. During the Shuttle Program, only 55% of hydrogen purchased was used by the Space Shuttle Main Engines. GODU's goal is to demonstrate that this percentage can be increased to 75%. Figure 2 shows the GODU layout when I concluded my internship. The site will include a 33,000 gallon hydrogen tank (shown in cyan) with a heat exchanger inside the hydrogen tank attached to a

  3. Development and Evaluation of Computer-Based Laboratory Practical Learning Tool

    ERIC Educational Resources Information Center

    Gandole, Y. B.

    2006-01-01

    Effective evaluation of educational software is a key issue for successful introduction of advanced tools in the curriculum. This paper details the development and evaluation of a tool for computer-assisted learning in science laboratory courses. The process was based on the generic instructional system design model. Various categories of educational…

  4. Restructuring the CS 1 classroom: Examining the effect of open laboratory-based classes vs. closed laboratory-based classes on Computer Science 1 students' achievement and attitudes toward computers and computer courses

    NASA Astrophysics Data System (ADS)

    Henderson, Jean Foster

    The purpose of this study was to assess the effect of classroom restructuring involving computer laboratories on student achievement and student attitudes toward computers and computer courses. The effects of the targeted student attributes of gender, previous programming experience, math background, and learning style were also examined. The open lab-based class structure consisted of a traditional lecture class with a separate, unscheduled lab component in which lab assignments were completed outside of class; the closed lab-based class structure integrated a lab component within the lecture class so that half the class was reserved for lecture and half the class was reserved for students to complete lab assignments by working cooperatively with each other and under the supervision and guidance of the instructor. The sample consisted of 71 students enrolled in four intact classes of Computer Science I during the fall and spring semesters of the 2006--2007 school year at two southern universities: two classes were held in the fall (one at each university) and two classes were held in the spring (one at each university). A counterbalanced repeated measures design was used in which all students experienced both class structures for half of each semester. The order of control and treatment was rotated among the four classes. All students received the same amount of class and instructor time. A multivariate analysis of variance (MANOVA) via a multiple regression strategy was used to test the study's hypotheses. Although the overall MANOVA model was statistically significant, independent follow-up univariate analyses relative to each dependent measure found that the only significant research factor was math background: Students whose mathematics background was at the level of Calculus I or higher had significantly higher student achievement than students whose mathematics background was less than Calculus I. The results suggest that classroom structures that

  5. Hospital laboratories as profit centers.

    PubMed

    Gray, S P; Steiner, J

    1988-11-01

    An aggressive business venture offers one solution to the growing competition and financial pressures hospital laboratories must overcome. For such a venture to be a success, a number of issues must be carefully considered. Properly met, today's challenges in the laboratory can become tomorrow's opportunities.

  6. High performance computing for advanced modeling and simulation of materials

    NASA Astrophysics Data System (ADS)

    Wang, Jue; Gao, Fei; Vazquez-Poletti, Jose Luis; Li, Jianjiang

    2017-02-01

    The First International Workshop on High Performance Computing for Advanced Modeling and Simulation of Materials (HPCMS2015) was held in Austin, Texas, USA, Nov. 18, 2015. HPCMS 2015 was organized by Computer Network Information Center (Chinese Academy of Sciences), University of Michigan, Universidad Complutense de Madrid, University of Science and Technology Beijing, Pittsburgh Supercomputing Center, China Institute of Atomic Energy, and Ames Laboratory.

  7. JobCenter: an open source, cross-platform, and distributed job queue management system optimized for scalability and versatility

    PubMed Central

    2012-01-01

    Background Laboratories engaged in computational biology or bioinformatics frequently need to run lengthy, multistep, and user-driven computational jobs. Each job can tie up a computer for a few minutes to several days, and many laboratories lack the expertise or resources to build and maintain a dedicated computer cluster. Results JobCenter is a client–server application and framework for job management and distributed job execution. The client and server components are both written in Java and are cross-platform and relatively easy to install. All communication with the server is client-driven, which allows worker nodes to run anywhere (even behind external firewalls or “in the cloud”) and provides inherent load balancing. Adding a worker node to the worker pool is as simple as dropping the JobCenter client files onto any computer and performing basic configuration, which provides tremendous ease-of-use, flexibility, and limitless horizontal scalability. Each worker installation may be independently configured, including the types of jobs it is able to run. Executed jobs may be written in any language and may include multistep workflows. Conclusions JobCenter is a versatile and scalable distributed job management system that allows laboratories to very efficiently distribute all computational work among available resources. JobCenter is freely available at http://code.google.com/p/jobcenter/. PMID:22846423

  8. High Performance Computing Meets Energy Efficiency - Continuum Magazine |

    Science.gov Websites

    Simulation by Patrick J. Moriarty and Matthew J. Churchfield, NREL. The new High Performance Computing Data Center at the National Renewable Energy Laboratory (NREL) hosts high-speed, high-volume data

  9. Computational simulation of laboratory-scale volcanic jets

    NASA Astrophysics Data System (ADS)

    Solovitz, S.; Van Eaton, A. R.; Mastin, L. G.; Herzog, M.

    2017-12-01

    Volcanic eruptions produce ash clouds that may travel great distances, significantly impacting aviation and communities downwind. Atmospheric hazard forecasting relies partly on numerical models of the flow physics, which incorporate data from eruption observations and analogue laboratory tests. As numerical tools continue to increase in complexity, they must be validated to fine-tune their effectiveness. Since eruptions are relatively infrequent and challenging to observe in great detail, analogue experiments can provide important insights into expected behavior over a wide range of input conditions. Unfortunately, laboratory-scale jets cannot easily attain the high Reynolds numbers (~10⁹) of natural volcanic eruption columns. Comparisons between the computational models and analogue experiments can help bridge this gap. In this study, we investigate a 3-D volcanic plume model, the Active Tracer High-resolution Atmospheric Model (ATHAM), which has been used to simulate a variety of eruptions. However, it has not been previously validated using laboratory-scale data. We conducted numerical simulations of three flows that we have studied in the laboratory: a vertical jet in a quiescent environment, a vertical jet in horizontal cross flow, and a particle-laden jet. We considered Reynolds numbers from 10,000 to 50,000, jet-to-cross flow velocity ratios of 2 to 10, and particle mass loadings of up to 25% of the exit mass flow rate. Vertical jet simulations produce Gaussian velocity profiles in the near exit region by 3 diameters downstream, matching the mean experimental profiles. Simulations of air entrainment are of the correct order of magnitude, but they show decreasing entrainment with vertical distance from the vent. Cross flow simulations reproduce experimental trajectories for the jet centerline initially, although confinement appears to impact the response later. Particle-laden simulations display minimal variation in concentration profiles between cases with

  10. Origin of Marshall Space Flight Center (MSFC)

    NASA Image and Video Library

    2004-04-15

    Twelve scientific specialists of the Peenemuende team at the front of Building 4488, Redstone Arsenal, Huntsville, Alabama. They led the Army's space efforts at ABMA before transfer of the team to the National Aeronautics and Space Administration (NASA), George C. Marshall Space Flight Center (MSFC). (Left to right) Dr. Ernst Stuhlinger, Director, Research Projects Office; Dr. Helmut Hoelzer, Director, Computation Laboratory; Karl L. Heimburg, Director, Test Laboratory; Dr. Ernst Geissler, Director, Aeroballistics Laboratory; Erich W. Neubert, Director, Systems Analysis Reliability Laboratory; Dr. Walter Haeussermann, Director, Guidance and Control Laboratory; Dr. Wernher von Braun, Director, Development Operations Division; William A. Mrazek, Director, Structures and Mechanics Laboratory; Hans Hueter, Director, System Support Equipment Laboratory; Eberhard Rees, Deputy Director, Development Operations Division; Dr. Kurt Debus, Director, Missile Firing Laboratory; Hans H. Maus, Director, Fabrication and Assembly Engineering Laboratory

  11. Ice Crystal Icing Engine Testing in the NASA Glenn Research Center's Propulsion Systems Laboratory: Altitude Investigation

    NASA Technical Reports Server (NTRS)

    Oliver, Michael J.

    2014-01-01

    The National Aeronautics and Space Administration (NASA) conducted a full scale ice crystal icing turbofan engine test using an obsolete Allied Signal ALF502-R5 engine in the Propulsion Systems Laboratory (PSL) at NASA Glenn Research Center. The test article used was the exact engine that experienced a loss of power event after the ingestion of ice crystals while operating at high altitude during a 1997 Honeywell flight test campaign investigating the turbofan engine ice crystal icing phenomena. The test plan included test points conducted at the known flight test campaign field event pressure altitude and at various pressure altitudes ranging from low to high throughout the engine operating envelope. The test article experienced a loss of power event at each of the altitudes tested. For each pressure altitude test point conducted the ambient static temperature was predicted using a NASA engine icing risk computer model for the given ambient static pressure while maintaining the engine speed.

  12. MIT Laboratory for Computer Science Progress Report, July 1984-June 1985

    DTIC Science & Technology

    1985-06-01

    larger (up to several thousand machines) multiprocessor systems. This facility, funded by the newly formed Strategic Computing Program of the Defense… Szolovits, Group Leader; R. Patil. Collaborating Investigators: M. Criscitiello, M.D., Tufts-New England Medical Center Hospital; J. Dzierzanowski, Ph.D., Dept.… COMPUTATION STRUCTURES: Academic Staff: J. B. Dennis, Group Leader. Research Staff: W. B. Ackerman, G. A. Boughton, W. Y-P. Lim. Graduate Students: T-A. Chu, S

  13. Assessment of readiness for clinical decision support to aid laboratory monitoring of immunosuppressive care at U.S. liver transplant centers.

    PubMed

    Jacobs, J; Weir, C; Evans, R S; Staes, C

    2014-01-01

    Following liver transplantation, patients require lifelong immunosuppressive care and monitoring. Computerized clinical decision support (CDS) has been shown to improve post-transplant immunosuppressive care processes and outcomes. The readiness of transplant information systems to implement computerized CDS to support post-transplant care is unknown. a) Describe the current clinical information system functionality and manual and automated processes for laboratory monitoring of immunosuppressive care, b) describe the use of guidelines that may be used to produce computable logic and the use of computerized alerts to support guideline adherence, and c) explore barriers to implementation of CDS in U.S. liver transplant centers. We developed a web-based survey using cognitive interviewing techniques. We surveyed 119 U.S. transplant programs that performed at least five liver transplantations per year during 2010-2012. Responses were summarized using descriptive analyses; barriers were identified using qualitative methods. Respondents from 80 programs (67% response rate) completed the survey. While 98% of programs reported having an electronic health record (EHR), all programs used paper-based manual processes to receive or track immunosuppressive laboratory results. Most programs (85%) reported that 30% or more of their patients used external laboratories for routine testing. Few programs (19%) received most external laboratory results as discrete data via electronic interfaces while most (80%) manually entered laboratory results into the EHR; less than half (42%) could integrate internal and external laboratory results. Nearly all programs had guidelines regarding pre-specified target ranges (92%) or testing schedules (97%) for managing immunosuppressive care. Few programs used computerized alerting to notify transplant coordinators of out-of-range (27%) or overdue laboratory results (20%). Use of EHRs is common, yet all liver transplant programs were largely
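
    A minimal sketch of the two alert types the survey asked about, out-of-range and overdue laboratory results; the drug, target trough range, and testing interval are illustrative placeholders, not clinical guidance or the surveyed programs' rules.

```python
from datetime import date

def check_trough_levels(results, target=(5.0, 10.0), max_gap_days=14, today=None):
    """Return simple alerts for out-of-range or overdue immunosuppressant trough levels.
    `results` is a list of (collection_date, ng_per_ml); the range and schedule are
    illustrative only."""
    today = today or date.today()
    alerts = []
    if not results or (today - max(d for d, _ in results)).days > max_gap_days:
        alerts.append(f"OVERDUE: no trough level in the last {max_gap_days} days")
    for d, level in results:
        if not (target[0] <= level <= target[1]):
            alerts.append(f"OUT OF RANGE on {d}: {level} ng/mL (target {target[0]}-{target[1]})")
    return alerts

# Hypothetical patient: one high result a month ago triggers both alert types
print(check_trough_levels([(date(2024, 4, 1), 12.3)], today=date(2024, 5, 1)))
```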

  14. Effects of Combined Hands-on Laboratory and Computer Modeling on Student Learning of Gas Laws: A Quasi-Experimental Study

    ERIC Educational Resources Information Center

    Liu, Xiufeng

    2006-01-01

    Based on current theories of chemistry learning, this study intends to test a hypothesis that computer modeling enhanced hands-on chemistry laboratories are more effective than hands-on laboratories or computer modeling laboratories alone in facilitating high school students' understanding of chemistry concepts. Thirty-three high school chemistry…

  15. Intention and Usage of Computer Based Information Systems in Primary Health Centers

    ERIC Educational Resources Information Center

    Hosizah; Kuntoro; Basuki N., Hari

    2016-01-01

    The computer-based information system (CBIS) has been adopted in almost all health care settings, including primary health centers in East Java Province, Indonesia. Some of the software packages available were SIMPUS, SIMPUSTRONIK, SIKDA Generik, and e-puskesmas. Unfortunately, most of the primary health centers did not implement them successfully. This…

  16. A set of devices for Mechanics Laboratory assisted by a Computer

    NASA Astrophysics Data System (ADS)

    Rusu, Alexandru; Pirtac, Constantin

    2015-12-01

    The booklet describes a set of devices designed to unify a number of laboratory exercises in Mechanics for students at technical universities. It consists of a clock interfaced to a computer, which allows times to be measured with an error no greater than 0.0001 s. It also supports calculation of the physical quantities measured in the experiment and compilation of the final report. The least-squares method is used throughout the workshop.
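
    Since the abstract notes that the least-squares method is used throughout the workshop, here is a minimal sketch of an ordinary least-squares line fit such as a student might apply to timing data from the clock; the measurements are invented.

```python
def least_squares_line(ts, ys):
    """Ordinary least-squares fit of y = a + b*t from paired measurements."""
    n = len(ts)
    mean_t = sum(ts) / n
    mean_y = sum(ys) / n
    b = sum((t - mean_t) * (y - mean_y) for t, y in zip(ts, ys)) / \
        sum((t - mean_t) ** 2 for t in ts)
    a = mean_y - b * mean_t
    return a, b

# Hypothetical cart velocities at times recorded by the 0.0001 s clock
times = [0.1000, 0.2001, 0.3002, 0.4000]
velocities = [0.49, 0.99, 1.47, 1.96]           # m/s
print(least_squares_line(times, velocities))    # slope estimates the acceleration (m/s^2)
```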

  17. Williams works on computer in the U.S. Laboratory during Expedition 13

    NASA Image and Video Library

    2006-04-15

    ISS013-E-07975 (15 April 2006) --- Astronaut Jeffrey N. Williams, Expedition 13 NASA space station science officer and flight engineer, uses a computer in the Destiny laboratory of the International Space Station.

  18. Strengthening LLNL Missions through Laboratory Directed Research and Development in High Performance Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Willis, D. K.

    2016-12-01

    High performance computing (HPC) has been a defining strength of Lawrence Livermore National Laboratory (LLNL) since its founding. Livermore scientists have designed and used some of the world’s most powerful computers to drive breakthroughs in nearly every mission area. Today, the Laboratory is recognized as a world leader in the application of HPC to complex science, technology, and engineering challenges. Most importantly, HPC has been integral to the National Nuclear Security Administration’s (NNSA’s) Stockpile Stewardship Program—designed to ensure the safety, security, and reliability of our nuclear deterrent without nuclear testing. A critical factor behind Lawrence Livermore’s preeminence in HPC is the ongoing investments made by the Laboratory Directed Research and Development (LDRD) Program in cutting-edge concepts to enable efficient utilization of these powerful machines. Congress established the LDRD Program in 1991 to maintain the technical vitality of the Department of Energy (DOE) national laboratories. Since then, LDRD has been, and continues to be, an essential tool for exploring anticipated needs that lie beyond the planning horizon of our programs and for attracting the next generation of talented visionaries. Through LDRD, Livermore researchers can examine future challenges, propose and explore innovative solutions, and deliver creative approaches to support our missions. The present scientific and technical strengths of the Laboratory are, in large part, a product of past LDRD investments in HPC. Here, we provide seven examples of LDRD projects from the past decade that have played a critical role in building LLNL’s HPC, computer science, mathematics, and data science research capabilities, and describe how they have impacted LLNL’s mission.

  19. Learning Center | Argonne National Laboratory

    Science.gov Websites


  20. For operation of the Computer Software Management and Information Center (COSMIC)

    NASA Technical Reports Server (NTRS)

    Carmon, J. L.

    1983-01-01

    Progress report on current status of computer software management and information center (COSMIC) includes the following areas: inventory, evaluation and publication, marketing, customer service, maintenance and support, and budget summary.

  1. CNC Turning Center Advanced Operations. Computer Numerical Control Operator/Programmer. 444-332.

    ERIC Educational Resources Information Center

    Skowronski, Steven D.; Tatum, Kenneth

    This student guide provides materials for a course designed to introduce the student to the operations and functions of a two-axis computer numerical control (CNC) turning center. The course consists of seven units. Unit 1 presents course expectations and syllabus, covers safety precautions, and describes the CNC turning center components, CNC…

  2. Designing a Hands-On Brain Computer Interface Laboratory Course

    PubMed Central

    Khalighinejad, Bahar; Long, Laura Kathleen; Mesgarani, Nima

    2017-01-01

    Devices and systems that interact with the brain have become a growing field of research and development in recent years. Engineering students are well positioned to contribute to both hardware development and signal analysis techniques in this field. However, this area has been left out of most engineering curricula. We developed an electroencephalography (EEG) based brain computer interface (BCI) laboratory course to educate students through hands-on experiments. The course is offered jointly by the Biomedical Engineering, Electrical Engineering, and Computer Science Departments of Columbia University in the City of New York and is open to senior undergraduate and graduate students. The course provides an effective introduction to the experimental design, neuroscience concepts, data analysis techniques, and technical skills required in the field of BCI. PMID:28268946

  3. Designing a hands-on brain computer interface laboratory course.

    PubMed

    Khalighinejad, Bahar; Long, Laura Kathleen; Mesgarani, Nima

    2016-08-01

    Devices and systems that interact with the brain have become a growing field of research and development in recent years. Engineering students are well positioned to contribute to both hardware development and signal analysis techniques in this field. However, this area has been left out of most engineering curricula. We developed an electroencephalography (EEG) based brain computer interface (BCI) laboratory course to educate students through hands-on experiments. The course is offered jointly by the Biomedical Engineering, Electrical Engineering, and Computer Science Departments of Columbia University in the City of New York and is open to senior undergraduate and graduate students. The course provides an effective introduction to the experimental design, neuroscience concepts, data analysis techniques, and technical skills required in the field of BCI.

  4. Students' Cognitive Focus during a Chemistry Laboratory Exercise: Effects of a Computer-Simulated Prelab

    ERIC Educational Resources Information Center

    Winberg, T. Mikael; Berg, C. Anders R.

    2007-01-01

    To enhance the learning outcomes achieved by students, learners undertook a computer-simulated activity based on an acid-base titration prior to a university-level chemistry laboratory activity. Students were categorized with respect to their attitudes toward learning. During the laboratory exercise, questions that students asked their assistant…

  5. Funding Public Computing Centers: Balancing Broadband Availability and Expected Demand

    ERIC Educational Resources Information Center

    Jayakar, Krishna; Park, Eun-A

    2012-01-01

    The National Broadband Plan (NBP) recently announced by the Federal Communication Commission visualizes a significantly enhanced commitment to public computing centers (PCCs) as an element of the Commission's plans for promoting broadband availability. In parallel, the National Telecommunications and Information Administration (NTIA) has…

  6. Clinical trials of boron neutron capture therapy [in humans] [at Beth Israel Deaconess Medical Center] [at Brookhaven National Laboratory]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wallace, Christine

    2001-05-29

    Assessment of research records of Boron Neutron Capture Therapy was conducted at Brookhaven National Laboratory and Beth Israel Deaconess Medical Center using the Code of Federal Regulations, FDA Regulations and Good Clinical Practice Guidelines. Clinical data were collected from subjects' research charts, and differences in conduct of studies at both centers were examined. Records maintained at Brookhaven National Laboratory were not in compliance with regulatory standards. Beth Israel's records followed federal regulations. Deficiencies discovered at both sites are discussed in the reports.

  7. Computation of Chemical Shifts for Paramagnetic Molecules: A Laboratory Experiment for the Undergraduate Curriculum

    ERIC Educational Resources Information Center

    Pritchard, Benjamin P.; Simpson, Scott; Zurek, Eva; Autschbach, Jochen

    2014-01-01

    A computational experiment investigating the ¹H and ¹³C nuclear magnetic resonance (NMR) chemical shifts of molecules with unpaired electrons has been developed and implemented. This experiment is appropriate for an upper-level undergraduate laboratory course in computational, physical, or inorganic chemistry. The…

  8. Center for computation and visualization of geometric structures. Final report, 1992 - 1995

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1995-11-01

    This report describes the overall goals and the accomplishments of the Geometry Center of the University of Minnesota, whose mission is to develop, support, and promote computational tools for visualizing geometric structures, for facilitating communication among mathematical and computer scientists and between these scientists and the public at large, and for stimulating research in geometry.

  9. Public health microbiology in Germany: 20 years of national reference centers and consultant laboratories.

    PubMed

    Beermann, Sandra; Allerberger, Franz; Wirtz, Angela; Burger, Reinhard; Hamouda, Osamah

    2015-10-01

    In 1995, in agreement with the German Federal Ministry of Health, the Robert Koch Institute established a public health microbiology system consisting of national reference centers (NRCs) and consultant laboratories (CLs). The goal was to improve the efficiency of infection protection by advising the authorities on possible measures and to supplement infectious disease surveillance by monitoring selected pathogens that have high public health relevance. Currently, there are 19 NRCs and 40 CLs, each appointed for three years. In 2009, an additional system of national networks of NRCs and CLs was set up in order to enhance effectiveness and cooperation within the national reference laboratory system. The aim of these networks was to advance exchange in diagnostic methods and prevention concepts among reference laboratories and to develop geographic coverage of services. In the last two decades, the German public health laboratory reference system coped with all major infectious disease challenges. The European Union and the European Centre for Disease Prevention and Control (ECDC) are considering implementing a European public health microbiology reference laboratory system. The German reference laboratory system should be well prepared to participate actively in this upcoming endeavor. Copyright © 2015 Elsevier GmbH. All rights reserved.

  10. Bayesian Research at the NASA Ames Research Center,Computational Sciences Division

    NASA Technical Reports Server (NTRS)

    Morris, Robin D.

    2003-01-01

    NASA Ames Research Center is one of NASA's oldest centers, having started out as part of the National Advisory Committee for Aeronautics (NACA). The site, about 40 miles south of San Francisco, still houses many wind tunnels and other aviation-related departments. In recent years, with the growing realization that space exploration is heavily dependent on computing and data analysis, its focus has turned more towards Information Technology. The Computational Sciences Division has expanded rapidly as a result. In this article, I will give a brief overview of some of the past and present projects with a Bayesian content. Much more than is described here goes on within the Division. The web pages at http://ic.arc.nasa.gov give more information on these and the other Division projects.

  11. Optimization of analytical laboratory work using computer networking and databasing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Upp, D.L.; Metcalf, R.A.

    1996-06-01

    The Health Physics Analysis Laboratory (HPAL) performs around 600,000 analyses for radioactive nuclides each year at Los Alamos National Laboratory (LANL). Analysis matrices vary from nasal swipes, air filters, work area swipes, liquids, to the bottoms of shoes and cat litter. HPAL uses 8 liquid scintillation counters, 8 gas proportional counters, and 9 high purity germanium detectors in 5 laboratories to perform these analyses. HPAL has developed a computer network between the labs and software to produce analysis results. The software and hardware package includes barcode sample tracking, log-in, chain of custody, analysis calculations, analysis result printing, and utility programs. All data are written to a database, mirrored on a central server, and eventually written to CD-ROM to provide for online historical results. This system has greatly reduced the work required to provide for analysis results as well as improving the quality of the work performed.
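
    As a hedged illustration of the kind of barcode-keyed sample tracking and result databasing the abstract describes, the sketch below uses a small SQLite table; the schema, field names, and values are hypothetical and are not the HPAL software.

    ```python
    import sqlite3

    # Hypothetical schema illustrating barcode log-in, chain of custody, and
    # result posting; not the actual HPAL database design.
    conn = sqlite3.connect("hpal_demo.db")
    conn.execute("""
        CREATE TABLE IF NOT EXISTS samples (
            barcode     TEXT PRIMARY KEY,
            matrix      TEXT,   -- e.g. 'air filter', 'work area swipe'
            received_at TEXT,
            custodian   TEXT,
            result_bq   REAL,   -- activity result, once analyzed
            instrument  TEXT    -- e.g. 'LSC-3', 'HPGe-7' (placeholder names)
        )
    """)

    # Log a sample in at receipt.
    conn.execute(
        "INSERT OR REPLACE INTO samples VALUES (?, ?, ?, ?, ?, ?)",
        ("LANL-000123", "air filter", "1996-06-01T08:30", "receiving desk", None, None),
    )

    # Later, the counting lab posts a result against the same barcode.
    conn.execute(
        "UPDATE samples SET result_bq = ?, instrument = ? WHERE barcode = ?",
        (0.042, "HPGe-7", "LANL-000123"),
    )
    conn.commit()
    print(conn.execute("SELECT * FROM samples").fetchall())
    ```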

  12. A guide to Laboratory practicum on oscillations assisted by a computer

    NASA Astrophysics Data System (ADS)

    Russu, A. S.; Russu, S. S.; Pitac, C.

    2013-12-01

    The booklet contains descriptions of three laboratory exercises on oscillations (nos. 9, 10, and 11) for students of the Chisinau Technical University. They are computer-assisted, modernized versions of older exercises first introduced in 1964. Each description includes a theoretical outline, work instructions, and control questions.

  13. The flight robotics laboratory

    NASA Technical Reports Server (NTRS)

    Tobbe, Patrick A.; Williamson, Marlin J.; Glaese, John R.

    1988-01-01

    The Flight Robotics Laboratory of the Marshall Space Flight Center is described in detail. This facility, containing an eight-degree-of-freedom manipulator, precision air bearing floor, teleoperated motion base, reconfigurable operator's console, and VAX 11/750 computer system, provides simulation capability to study human/system interactions of remote systems. The facility hardware, software, and subsequent integration of these components into a real-time man-in-the-loop simulation for the evaluation of spacecraft contact proximity and dynamics are described.

  14. Spectroscopy and astronomy: H3+ from the laboratory to the Galactic center.

    PubMed

    Oka, Takeshi

    2011-01-01

    Since the serendipitous discovery of the Fraunhofer spectrum in the Sun in 1814 which initiated spectroscopy and astrophysics, spectroscopy developed hand in hand with astronomy. I discuss my own work on the infrared spectrum of H3+ from its discovery in the laboratory in 1980, in interstellar space in 1996, to recent studies in the Galactic center as an example of astronomical spectroscopy. Its spin-off, the spectroscopy of simple molecular ions, is also briefly discussed.

  15. Development of Advanced Computational Aeroelasticity Tools at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Bartels, R. E.

    2008-01-01

    NASA Langley Research Center has continued to develop its long standing computational tools to address new challenges in aircraft and launch vehicle design. This paper discusses the application and development of those computational aeroelastic tools. Four topic areas will be discussed: 1) Modeling structural and flow field nonlinearities; 2) Integrated and modular approaches to nonlinear multidisciplinary analysis; 3) Simulating flight dynamics of flexible vehicles; and 4) Applications that support both aeronautics and space exploration.

  16. Adiabatic Quantum Computation with Neutral Atoms

    NASA Astrophysics Data System (ADS)

    Biedermann, Grant

    2013-03-01

    We are implementing a new platform for adiabatic quantum computation (AQC) [2] based on trapped neutral atoms whose coupling is mediated by the dipole-dipole interactions of Rydberg states. Ground state cesium atoms are dressed by laser fields in a manner conditional on the Rydberg blockade mechanism [3,4], thereby providing the requisite entangling interactions. As a benchmark we study a Quadratic Unconstrained Binary Optimization (QUBO) problem whose solution is found in the ground state spin configuration of an Ising-like model. In collaboration with Lambert Parazzoli, Sandia National Laboratories; Aaron Hankin, Center for Quantum Information and Control (CQuIC), University of New Mexico; James Chin-Wen Chou, Yuan-Yu Jau, Peter Schwindt, Cort Johnson, and George Burns, Sandia National Laboratories; Tyler Keating, Krittika Goyal, and Ivan Deutsch, Center for Quantum Information and Control (CQuIC), University of New Mexico; and Andrew Landahl, Sandia National Laboratories. This work was supported by the Laboratory Directed Research and Development program at Sandia National Laboratories.
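
    To make the QUBO benchmark idea concrete, the sketch below brute-forces the minimum-energy binary configuration of a toy QUBO matrix; in AQC the same minimum would be encoded in the ground state of an Ising-like spin Hamiltonian (binary variables map to spins via s = 1 - 2x). The matrix is arbitrary and unrelated to the actual benchmark instance.

    ```python
    import itertools
    import numpy as np

    # Arbitrary small upper-triangular QUBO matrix; illustrative only.
    Q = np.array([[-1.0,  2.0,  0.0],
                  [ 0.0, -1.0,  2.0],
                  [ 0.0,  0.0, -1.0]])

    def qubo_energy(x, Q):
        """Energy x^T Q x for a binary vector x in {0,1}^n."""
        x = np.asarray(x, dtype=float)
        return float(x @ Q @ x)

    # Exhaustive search over all binary configurations (feasible only for toy sizes).
    best = min(itertools.product([0, 1], repeat=Q.shape[0]),
               key=lambda x: qubo_energy(x, Q))
    print("minimum-energy configuration:", best, "energy:", qubo_energy(best, Q))
    ```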

  17. Characterization of Aerodynamic Interactions with the Mars Science Laboratory Reaction Control System Using Computation and Experiment

    NASA Technical Reports Server (NTRS)

    Schoenenberger, Mark; VanNorman, John; Rhode, Matthew; Paulson, John

    2013-01-01

    On August 5, 2012, the Mars Science Laboratory (MSL) entry capsule successfully entered Mars' atmosphere and landed the Curiosity rover in Gale Crater. The capsule used a reaction control system (RCS) consisting of four pairs of hydrazine thrusters to fly a guided entry. The RCS provided bank control to fly along a flight path commanded by an onboard computer and also damped unwanted rates due to atmospheric disturbances and any dynamic instabilities of the capsule. A preliminary assessment of the MSL's flight data from entry showed that the capsule flew much as predicted. This paper will describe how the MSL aerodynamics team used engineering analyses, computational codes, and wind tunnel testing in concert to develop the RCS system and certify it for flight. Over the course of MSL's development, the RCS configuration underwent a number of design iterations to accommodate mechanical constraints, aeroheating concerns, and excessive aero/RCS interactions. A brief overview of the MSL RCS configuration design evolution is provided. Then, a brief description is presented of how the computational predictions of RCS jet interactions were validated. The primary work to certify that the RCS interactions were acceptable for flight was centered on validating computational predictions at hypersonic speeds. A comparison of computational fluid dynamics (CFD) predictions to wind tunnel force and moment data gathered in the NASA Langley 31-Inch Mach 10 Tunnel was the linchpin to validating the CFD codes used to predict aero/RCS interactions. Using the CFD predictions and experimental data, an interaction model was developed for Monte Carlo analyses using 6-degree-of-freedom trajectory simulation. The interaction model used in the flight simulation is presented.

  18. Fred Hutchinson Cancer Research Center, Seattle, Washington: Laboratories for the 21st Century Case Studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2001-12-01

    This case study was prepared by participants in the Laboratories for the 21st Century program, a joint endeavor of the U.S. Environmental Protection Agency and the U.S. Department of Energy's Federal Energy Management Program. The goal of this program is to foster greater energy efficiency in new laboratory buildings for both the public and the private sectors. Retrofits of existing laboratories are also encouraged. The energy-efficient features of the laboratories in the Fred Hutchinson Cancer Research Center complex in Seattle, Washington, include extensive use of efficient lighting, variable-air-volume controls, variable-speed drives, motion sensors, and high-efficiency chillers and motors. With about 532,000 gross square feet, the complex is estimated to use 33% less electrical energy than most traditional research facilities consume because of its energy-efficient design and features.

  19. Williams uses laptop computer in the U.S. Laboratory taken during Expedition 13

    NASA Image and Video Library

    2006-06-22

    ISS013-E-40000 (22 June 2006) --- Astronaut Jeffrey N. Williams, Expedition 13 NASA space station science officer and flight engineer, uses a computer in the Destiny laboratory of the International Space Station.

  20. Computational Toxicology as Implemented by the US EPA ...

    EPA Pesticide Factsheets

    Computational toxicology is the application of mathematical and computer models to help assess chemical hazards and risks to human health and the environment. Supported by advances in informatics, high-throughput screening (HTS) technologies, and systems biology, the U.S. Environmental Protection Agency (EPA) is developing robust and flexible computational tools that can be applied to the thousands of chemicals in commerce, and contaminant mixtures found in air, water, and hazardous-waste sites. The Office of Research and Development (ORD) Computational Toxicology Research Program (CTRP) is composed of three main elements. The largest component is the National Center for Computational Toxicology (NCCT), which was established in 2005 to coordinate research on chemical screening and prioritization, informatics, and systems modeling. The second element consists of related activities in the National Health and Environmental Effects Research Laboratory (NHEERL) and the National Exposure Research Laboratory (NERL). The third and final component consists of academic centers working on various aspects of computational toxicology and funded by the U.S. EPA Science to Achieve Results (STAR) program. Together these elements form the key components in the implementation of both the initial strategy, A Framework for a Computational Toxicology Research Program (U.S. EPA, 2003), and the newly released The U.S. Environmental Protection Agency's Strategic Plan for Evaluating the T

  1. FJ44 Turbofan Engine Test at NASA Glenn Research Center's Aero-Acoustic Propulsion Laboratory

    NASA Technical Reports Server (NTRS)

    Lauer, Joel T.; McAllister, Joseph; Loew, Raymond A.; Sutliff, Daniel L.; Harley, Thomas C.

    2009-01-01

    A Williams International FJ44-3A 3000-lb thrust class turbofan engine was tested in the NASA Glenn Research Center's Aero-Acoustic Propulsion Laboratory. This report presents the test set-up and documents the test conditions. Farfield directivity, in-duct unsteady pressures, duct mode data, and phased-array data were taken and are reported separately.

  2. 3D chemical imaging in the laboratory by hyperspectral X-ray computed tomography

    PubMed Central

    Egan, C. K.; Jacques, S. D. M.; Wilson, M. D.; Veale, M. C.; Seller, P.; Beale, A. M.; Pattrick, R. A. D.; Withers, P. J.; Cernik, R. J.

    2015-01-01

    We report the development of laboratory-based hyperspectral X-ray computed tomography, which allows the internal elemental chemistry of an object to be reconstructed and visualised in three dimensions. The method employs a spectroscopic X-ray imaging detector with sufficient energy resolution to distinguish individual elemental absorption edges. Elemental distributions can then be made by K-edge subtraction, or alternatively by voxel-wise spectral fitting to give relative atomic concentrations. We demonstrate its application to two material systems: studying the distribution of catalyst material on porous substrates for industrial scale chemical processing; and mapping of minerals and inclusion phases inside a mineralised ore sample. The method makes use of a standard laboratory X-ray source with measurement times similar to that required for conventional computed tomography. PMID:26514938
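
    A schematic sketch of the voxel-wise K-edge subtraction step described above: mean attenuation just above an element's K-edge minus mean attenuation just below it, per voxel. The energy-resolved volume, energy bins, and edge energy are synthetic placeholders rather than data or code from the study.

    ```python
    import numpy as np

    # Synthetic energy-resolved reconstruction: (z, y, x, energy_bin) attenuation.
    rng = np.random.default_rng(0)
    volume = rng.random((8, 32, 32, 64))      # placeholder, not real CT data
    energies = np.linspace(20.0, 84.0, 64)    # keV, placeholder bin centers

    def k_edge_map(volume, energies, edge_kev, window=2.0):
        """Element map by K-edge subtraction over a small energy window."""
        below = (energies >= edge_kev - window) & (energies < edge_kev)
        above = (energies > edge_kev) & (energies <= edge_kev + window)
        return volume[..., above].mean(axis=-1) - volume[..., below].mean(axis=-1)

    # Example: map a hypothetical element with a K-edge near 34.3 keV.
    element_map = k_edge_map(volume, energies, edge_kev=34.3)
    print(element_map.shape)   # (8, 32, 32): one value per voxel
    ```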

  3. Computer systems and software engineering

    NASA Technical Reports Server (NTRS)

    Mckay, Charles W.

    1988-01-01

    The High Technologies Laboratory (HTL) was established in the fall of 1982 at the University of Houston Clear Lake. Research conducted at the High Tech Lab is focused upon computer systems and software engineering. There is a strong emphasis on the interrelationship of these areas of technology and the United States' space program. In January 1987, NASA Headquarters announced the formation of its first research center dedicated to software engineering. Operated by the High Tech Lab, the Software Engineering Research Center (SERC) was formed at the University of Houston Clear Lake. The High Tech Lab/Software Engineering Research Center promotes cooperative research among government, industry, and academia to advance the edge-of-knowledge and the state-of-the-practice in key topics of computer systems and software engineering which are critical to NASA. The center also recommends appropriate actions, guidelines, standards, and policies to NASA in matters pertinent to the center's research. Results of the research conducted at the High Tech Lab/Software Engineering Research Center have given direction to many decisions made by NASA concerning the Space Station Program.

  4. Use of computers and Internet among people with severe mental illnesses at peer support centers.

    PubMed

    Brunette, Mary F; Aschbrenner, Kelly A; Ferron, Joelle C; Ustinich, Lee; Kelly, Michael; Grinley, Thomas

    2017-12-01

    Peer support centers are an ideal setting where people with severe mental illnesses can access the Internet via computers for online health education, peer support, and behavioral treatments. The purpose of this study was to assess computer use and Internet access in peer support agencies. A peer-assisted survey assessed the frequency with which consumers in all 13 New Hampshire peer support centers (n = 702) used computers to access Internet resources. During the 30-day survey period, 200 of the 702 peer support consumers (28%) responded to the survey. More than 3 quarters (78.5%) of respondents had gone online to seek information in the past year. About half (49%) of respondents were interested in learning about online forums that would provide information and peer support for mental health issues. Peer support centers may be a useful venue for Web-based approaches to education, peer support, and intervention. Future research should assess facilitators and barriers to use of Web-based resources among people with severe mental illness in peer support centers. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  5. AHPCRC (Army High Performance Computing Research Center) Bulletin. Volume 1, Issue 2

    DTIC Science & Technology

    2011-01-01

    area and the researchers working on these projects. Also inside: news from the AHPCRC consortium partners at Morgan State University and the NASA ... Computing Research Center is provided by the supercomputing and research facilities at Stanford University and at the NASA Ames Research Center at ... atomic and molecular level, he said. He noted that “every general would like to have” a Star Trek-like holodeck, where holographic avatars could

  6. Preparation for microgravity - The role of the Microgravity Material Science Laboratory

    NASA Technical Reports Server (NTRS)

    Johnston, J. Christopher; Rosenthal, Bruce N.; Meyer, Maryjo B.; Glasgow, Thomas K.

    1988-01-01

    Experiments at the NASA Lewis Research Center's Microgravity Material Science Laboratory using physical and mathematical models to delineate the effects of gravity on processes of scientific and commercial interest are discussed. Where possible, transparent model systems are used to visually track convection, settling, crystal growth, phase separation, agglomeration, vapor transport, diffusive flow, and polymer reactions. Materials studied include metals, alloys, salts, glasses, ceramics, and polymers. Specific technologies discussed include the General Purpose furnace used in the study of metals and crystal growth, the isothermal dendrite growth apparatus, the electromagnetic levitator/instrumented drop tube, the high temperature directional solidification furnace, the ceramics and polymer laboratories and the center's computing facilities.

  7. Design and analysis of a tendon-based computed tomography-compatible robot with remote center of motion for lung biopsy.

    PubMed

    Yang, Yunpeng; Jiang, Shan; Yang, Zhiyong; Yuan, Wei; Dou, Huaisu; Wang, Wei; Zhang, Daguang; Bian, Yuan

    2017-04-01

    Biopsy is currently a decisive method of lung cancer diagnosis, but lung biopsy is time-consuming, complex, and inaccurate. A computed tomography-compatible robot for rapid and precise lung biopsy is therefore developed in this article. According to the actual operation process, the robot is divided into two modules: a 4-degree-of-freedom position module for locating the puncture point, which accommodates almost all patient positions, and a 3-degree-of-freedom tendon-based orientation module with remote center of motion, which is compact and computed tomography-compatible and orients and inserts the needle automatically inside the computed tomography bore. The workspace of the robot surrounds the patient's thorax, and the needle tip forms a cone under the patient's skin. A new error model of the robot based on screw theory is proposed in view of structural error and actuation error, which are regarded as screw motions. Simulation is carried out to verify the precision of the error model, contrasted with compensation via inverse kinematics. The results of an insertion experiment on a specific phantom prove the feasibility of the robot, with a mean error of 1.373 mm in a laboratory environment, which is accurate enough to replace manual operation.
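
    As a hedged illustration of treating a small error as a screw motion, the sketch below maps a twist to a rigid transform with the matrix exponential and propagates it to a nominal needle tip; the numbers are placeholders and this is not the authors' error model.

    ```python
    import numpy as np
    from scipy.linalg import expm

    def twist_matrix(omega, v):
        """4x4 se(3) matrix of a twist (omega, v): [[skew(omega), v], [0, 0]]."""
        wx, wy, wz = omega
        skew = np.array([[0.0, -wz,  wy],
                         [ wz, 0.0, -wx],
                         [-wy,  wx, 0.0]])
        xi = np.zeros((4, 4))
        xi[:3, :3] = skew
        xi[:3, 3] = v
        return xi

    # Placeholder structural/actuation error expressed as a screw motion:
    # a 0.5 degree rotation about z combined with a 0.3 mm translation.
    omega = np.array([0.0, 0.0, np.deg2rad(0.5)])
    v = np.array([0.0, 0.3e-3, 0.0])

    T_err = expm(twist_matrix(omega, v))           # SE(3) transform of the error screw
    tip = np.array([0.0, 0.0, 0.15, 1.0])          # nominal tip 150 mm along z (homogeneous)
    print("displaced tip (m):", (T_err @ tip)[:3])
    ```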

  8. Knowledge Retention for Computer Simulations: A study comparing virtual and hands-on laboratories

    NASA Astrophysics Data System (ADS)

    Croom, John R., III

    The use of virtual laboratories has the potential to change physics education. These low-cost, interactive computer activities interest students, allow for easy setup, and give educators a way to teach laboratory-based online classes. This study investigated whether virtual laboratories could replace traditional hands-on laboratories and whether students could retain the same long-term knowledge in virtual laboratories as compared to hands-on laboratories. This study is a quantitative quasi-experiment that used a multiple posttest design to determine whether students using virtual laboratories would retain the same knowledge as students who performed hands-on laboratories after 9 weeks. The study comprised 336 students from 14 school districts. Students' performance on the laboratories and their retention of the material were compared against a series of factors that might have affected retention, using a pretest and two posttests whose scores were compared with a t test. The results showed no significant difference in short-term learning between the hands-on laboratory groups and virtual laboratory groups. There was, however, a significant difference (p = .005) between the groups in long-term retention; students in the hands-on laboratory groups retained more information than those in the virtual laboratory groups. These results suggest that long-term learning is enhanced when a laboratory contains a hands-on component. Finally, the results showed that both groups of students felt their particular laboratory style was superior to the alternative method. The findings of this study can be used to improve the integration of virtual laboratories into science curriculum.
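
    For readers unfamiliar with the comparison used, the sketch below runs an independent-samples t test on invented retention scores; the numbers do not reproduce the study's data.

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical delayed post-test scores (percent correct); not the study's data.
    hands_on = np.array([78, 82, 75, 88, 80, 79, 85, 77])
    virtual  = np.array([70, 74, 72, 69, 77, 71, 68, 73])

    # Independent-samples t test comparing the two group means.
    t_stat, p_value = stats.ttest_ind(hands_on, virtual)
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
    ```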

  9. Computational Toxicology at the US EPA | Science Inventory ...

    EPA Pesticide Factsheets

    Computational toxicology is the application of mathematical and computer models to help assess chemical hazards and risks to human health and the environment. Supported by advances in informatics, high-throughput screening (HTS) technologies, and systems biology, EPA is developing robust and flexible computational tools that can be applied to the thousands of chemicals in commerce, and contaminant mixtures found in America’s air, water, and hazardous-waste sites. The ORD Computational Toxicology Research Program (CTRP) is composed of three main elements. The largest component is the National Center for Computational Toxicology (NCCT), which was established in 2005 to coordinate research on chemical screening and prioritization, informatics, and systems modeling. The second element consists of related activities in the National Health and Environmental Effects Research Laboratory (NHEERL) and the National Exposure Research Laboratory (NERL). The third and final component consists of academic centers working on various aspects of computational toxicology and funded by the EPA Science to Achieve Results (STAR) program. Key intramural projects of the CTRP include digitizing legacy toxicity testing information toxicity reference database (ToxRefDB), predicting toxicity (ToxCast™) and exposure (ExpoCast™), and creating virtual liver (v-Liver™) and virtual embryo (v-Embryo™) systems models. The models and underlying data are being made publicly available t

  10. Current evidence and future perspectives on the effective practice of patient-centered laboratory medicine.

    PubMed

    Hallworth, Mike J; Epner, Paul L; Ebert, Christoph; Fantz, Corinne R; Faye, Sherry A; Higgins, Trefor N; Kilpatrick, Eric S; Li, Wenzhe; Rana, S V; Vanstapel, Florent

    2015-04-01

    Systematic evidence of the contribution made by laboratory medicine to patient outcomes and the overall process of healthcare is difficult to find. An understanding of the value of laboratory medicine, how it can be determined, and the various factors that influence it is vital to ensuring that the service is provided and used optimally. This review summarizes existing evidence supporting the impact of laboratory medicine in healthcare and indicates the gaps in our understanding. It also identifies deficiencies in current utilization, suggests potential solutions, and offers a vision of a future in which laboratory medicine is used optimally to support patient care. To maximize the value of laboratory medicine, work is required in 5 areas: (a) improved utilization of existing and new tests; (b) definition of new roles for laboratory professionals that are focused on optimizing patient outcomes by adding value at all points of the diagnostic brain-to-brain cycle; (c) development of standardized protocols for prospective patient-centered studies of biomarker clinical effectiveness or extra-analytical process effectiveness; (d) benchmarking of existing and new tests in specified situations with commonly accepted measures of effectiveness; (e) agreed definition and validation of effectiveness measures and use of checklists for articles submitted for publication. Progress in these areas is essential if we are to demonstrate and enhance the value of laboratory medicine and prevent valuable information being lost in meaningless data. This requires effective collaboration with clinicians, and a determination to accept patient outcome and patient experience as the primary measure of laboratory effectiveness. © 2014 American Association for Clinical Chemistry.

  11. Vanderbilt University Institute of Imaging Science Center for Computational Imaging XNAT: A multimodal data archive and processing environment.

    PubMed

    Harrigan, Robert L; Yvernault, Benjamin C; Boyd, Brian D; Damon, Stephen M; Gibney, Kyla David; Conrad, Benjamin N; Phillips, Nicholas S; Rogers, Baxter P; Gao, Yurui; Landman, Bennett A

    2016-01-01

    The Vanderbilt University Institute for Imaging Science (VUIIS) Center for Computational Imaging (CCI) has developed a database built on XNAT housing over a quarter of a million scans. The database provides a framework for (1) rapid prototyping, (2) large-scale batch processing of images, and (3) scalable project management. The system uses the web-based interfaces of XNAT and REDCap to allow for graphical interaction. A Python middleware layer, the Distributed Automation for XNAT (DAX) package, distributes computation across the Vanderbilt Advanced Computing Center for Research and Education high performance computing center. All software is made available as open source for use in combining Portable Batch System (PBS) grids and XNAT servers. Copyright © 2015 Elsevier Inc. All rights reserved.
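
    In the spirit of what DAX automates, the sketch below distributes per-scan processing as Portable Batch System jobs; the script template, the qsub usage, and the process_scan.py helper are generic assumptions for illustration, not the DAX API.

    ```python
    import subprocess
    from pathlib import Path

    # Generic PBS job template; resource requests and the processing script
    # (process_scan.py) are hypothetical placeholders.
    PBS_TEMPLATE = """#PBS -N proc_{scan}
    #PBS -l nodes=1:ppn=1,walltime=02:00:00
    cd $PBS_O_WORKDIR
    python process_scan.py --scan {scan}
    """

    def submit_scan_job(scan_id: str, workdir: Path) -> str:
        """Write a PBS script for one scan and submit it with qsub."""
        script = workdir / f"proc_{scan_id}.pbs"
        script.write_text(PBS_TEMPLATE.format(scan=scan_id))
        out = subprocess.run(["qsub", str(script)],
                             capture_output=True, text=True, check=True)
        return out.stdout.strip()   # job identifier returned by the scheduler

    # Hypothetical scan identifiers, e.g. pulled from an XNAT project listing.
    for scan in ["subj001_T1", "subj002_T1"]:
        print(submit_scan_job(scan, Path(".")))
    ```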

  12. Venus - Computer Simulated Global View Centered at 0 Degrees East Longitude

    NASA Image and Video Library

    1996-03-14

    This global view of the surface of Venus is centered at 0 degrees east longitude. NASA Magellan synthetic aperture radar mosaics from the first cycle of Magellan mapping were mapped onto a computer-simulated globe to create this image. http://photojournal.jpl.nasa.gov/catalog/PIA00257

  13. Anderson uses laptop computer in the U.S. Laboratory during Joint Operations

    NASA Image and Video Library

    2007-06-13

    S117-E-07134 (12 June 2007) --- Astronaut Clayton Anderson, Expedition 15 flight engineer, uses a computer near the Microgravity Science Glovebox (MSG) in the Destiny laboratory of the International Space Station while Space Shuttle Atlantis (STS-117) was docked with the station. Astronaut Sunita Williams, flight engineer, is at right.

  14. Using a medical simulation center as an electronic health record usability laboratory

    PubMed Central

    Landman, Adam B; Redden, Lisa; Neri, Pamela; Poole, Stephen; Horsky, Jan; Raja, Ali S; Pozner, Charles N; Schiff, Gordon; Poon, Eric G

    2014-01-01

    Usability testing is increasingly being recognized as a way to increase the usability and safety of health information technology (HIT). Medical simulation centers can serve as testing environments for HIT usability studies. We integrated the quality assurance version of our emergency department (ED) electronic health record (EHR) into our medical simulation center and piloted a clinical care scenario in which emergency medicine resident physicians evaluated a simulated ED patient and documented electronically using the ED EHR. Meticulous planning and close collaboration with expert simulation staff was important for designing test scenarios, pilot testing, and running the sessions. Similarly, working with information systems teams was important for integration of the EHR. Electronic tools are needed to facilitate entry of fictitious clinical results while the simulation scenario is unfolding. EHRs can be successfully integrated into existing simulation centers, which may provide realistic environments for usability testing, training, and evaluation of human–computer interactions. PMID:24249778

  15. A Text-Computer Assisted Instruction Program as a Viable Alternative for Continuing Education in Laboratory Medicine.

    ERIC Educational Resources Information Center

    Bruce, A. Wayne

    1986-01-01

    Describes reasons for developing combined text and computer assisted instruction (CAI) teaching programs for delivery of continuing education to laboratory professionals, and mechanisms used for developing a CAI program on method evaluation in the clinical laboratory. Results of an evaluation of the software's cost effectiveness and instructional…

  16. The use of computer-aided learning in chemistry laboratory instruction

    NASA Astrophysics Data System (ADS)

    Allred, Brian Robert Tracy

    This research involves developing and implementing computer software for chemistry laboratory instruction. The specific goal is to design the software and investigate whether it can be used to introduce concepts and laboratory procedures without a lecture format. This would allow students to conduct an experiment even though they may not have been introduced to the chemical concept in their lecture course. It would also allow for another type of interaction for those students who respond more positively to a visual approach to instruction. The first module developed was devoted to using computer software to help introduce students to the concepts related to thin-layer chromatography and to setting up and running an experiment. This was achieved through the use of digitized pictures and digitized video clips along with written information. A review quiz was used to help reinforce the learned information. The second module was devoted to the concept of the "dry lab". This module presented students with relevant information regarding the chemical concepts and then showed them the outcome of mixing solutions. From these observations, students were to determine the composition of unknown solutions by comparing the provided descriptions with their written observations. The third piece of the software designed was a computer game. This program followed the first two modules in providing information the students were to learn. The difference here, though, was the incorporation of a game scenario to help reinforce the learning. Students were then assessed to see how much information they retained after playing the game. In each of the three cases, a control group exposed to the traditional lecture format was used. Their results were compared to those of the experimental group using the computer modules. Based upon the findings, it can be concluded that using technology to aid in the instructional process is definitely of benefit and students were more successful in

  17. Fred Hutchinson Cancer Research Center, Seattle, Washington: Laboratories for the 21st Century Case Studies (Revision)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2002-03-01

    This case study was prepared by participants in the Laboratories for the 21st Century program, a joint endeavor of the U.S. Environmental Protection Agency and the U.S. Department of Energy's Federal Energy Management Program. The goal of this program is to foster greater energy efficiency in new laboratory buildings for both the public and the private sectors. Retrofits of existing laboratories are also encouraged. The energy-efficient features of the laboratories in the Fred Hutchinson Cancer Research Center complex in Seattle, Washington, include extensive use of efficient lighting, variable-air-volume controls, variable-speed drives, motion sensors, and high-efficiency chillers and motors. With about 532,000 gross square feet, the complex is estimated to use 33% less electrical energy than most traditional research facilities consume because of its energy-efficient design and features.

  18. Laboratory Information Management System (LIMS): A case study

    NASA Technical Reports Server (NTRS)

    Crandall, Karen S.; Auping, Judith V.; Megargle, Robert G.

    1987-01-01

    In the late 1970s, a refurbishment of the analytical laboratories serving the Materials Division at NASA Lewis Research Center was undertaken. As part of the modernization efforts, a Laboratory Information Management System (LIMS) was to be included. Preliminary studies indicated a custom-designed system as the best choice in order to satisfy all of the requirements. A scaled-down version of the original design has been in operation since 1984. The LIMS, a combination of computer hardware and software, provides the chemical characterization laboratory with an information database, a report generator, a user interface, and networking capabilities. This paper is an account of the processes involved in designing and implementing that LIMS.

  19. Software For Monitoring A Computer Network

    NASA Technical Reports Server (NTRS)

    Lee, Young H.

    1992-01-01

    SNMAT is a rule-based expert-system computer program designed to assist personnel in monitoring the status of a computer network and identifying defective computers, workstations, and other components of the network. It also assists in training network operators. The network monitored by SNMAT is located at the Space Flight Operations Center (SFOC) at NASA's Jet Propulsion Laboratory. SNMAT is intended to serve as a data-reduction system providing windows, menus, and graphs, enabling users to focus on relevant information. It is expected to be adaptable to other computer networks, for example in management of repair, maintenance, and security, or in administration of planning systems, billing systems, or archives.

  20. The WHO/PEPFAR collaboration to prepare an operations manual for HIV prevention, care, and treatment at primary health centers in high-prevalence, resource-constrained settings: defining laboratory services.

    PubMed

    Spira, Thomas; Lindegren, Mary Lou; Ferris, Robert; Habiyambere, Vincent; Ellerbrock, Tedd

    2009-06-01

    The expansion of HIV/AIDS care and treatment in resource-constrained countries, especially in sub-Saharan Africa, has generally developed in a top-down manner. Further expansion will involve primary health centers where human and other resources are limited. This article describes the World Health Organization/President's Emergency Plan for AIDS Relief collaboration formed to help scale up HIV services in primary health centers in high-prevalence, resource-constrained settings. It reviews the contents of the Operations Manual developed, with emphasis on the Laboratory Services chapter, which discusses essential laboratory services, both at the center and the district hospital level, laboratory safety, laboratory testing, specimen transport, how to set up a laboratory, human resources, equipment maintenance, training materials, and references. The chapter provides specific information on essential tests and generic job aids for them. It also includes annexes containing a list of laboratory supplies for the health center and sample forms.

  1. Oak Ridge National Laboratory's (ORNL) ecological and physical science study center: A hands-on science program for K-12 students

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bradshaw, S.P.

    1994-12-31

    In our tenth year of educational service and outreach, Oak Ridge National Laboratory's Ecological and Physical Science Study Center (EPSSC) provides hands-on, inquiry-based science activities for area students and teachers. Established in 1984, the EPSSC now hosts over 20,000 student visits. Designed to foster a positive attitude towards science, each unit includes activities which reinforce the science concept being explored. Outdoor science units provide field experience at the Department of Energy's Oak Ridge National Environmental Research Park and outreach programs are offered on-site in area schools. Other programs are offered as extensions of the EPSSC core programs, including on-site student science camps, all-girl programs, outreach science camps, student competitions, teacher in-service presentations and teacher workshops.

  2. Changing resident test ordering behavior: a multilevel intervention to decrease laboratory utilization at an academic medical center.

    PubMed

    Vidyarthi, Arpana R; Hamill, Timothy; Green, Adrienne L; Rosenbluth, Glenn; Baron, Robert B

    2015-01-01

    Hospital laboratory test volume is increasing, and overutilization contributes to errors and costs. Efforts to reduce laboratory utilization have targeted aspects of ordering behavior, but few have utilized a multilevel collaborative approach. The study team partnered with residents to reduce unnecessary laboratory tests and associated costs through multilevel interventions across the academic medical center. The study team selected laboratory tests for intervention based on cost, volume, and ordering frequency (complete blood count [CBC] and CBC with differential, common electrolytes, blood enzymes, and liver function tests). Interventions were designed collaboratively with residents and targeted components of ordering behavior, including system changes, teaching, social marketing, academic detailing, financial incentives, and audit/feedback. Laboratory ordering was reduced by 8% cumulatively over 3 years, saving $2,019,000. By involving residents at every stage of the intervention and targeting multiple levels simultaneously, laboratory utilization was reduced and cost savings were sustained over 3 years. © 2014 by the American College of Medical Quality.

  3. SANs and Large Scale Data Migration at the NASA Center for Computational Sciences

    NASA Technical Reports Server (NTRS)

    Salmon, Ellen M.

    2004-01-01

    Evolution and migration are a way of life for provisioners of high-performance mass storage systems that serve high-end computers used by climate and Earth and space science researchers: the compute engines come and go, but the data remains. At the NASA Center for Computational Sciences (NCCS), disk and tape SANs are deployed to provide high-speed I/O for the compute engines and the hierarchical storage management systems. Along with gigabit Ethernet, they also enable the NCCS's latest significant migration: the transparent transfer of 300 TB of legacy HSM data into the new Sun SAM-QFS cluster.

  4. CICART Center For Integrated Computation And Analysis Of Reconnection And Turbulence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhattacharjee, Amitava

    CICART is a partnership between the University of New Hampshire (UNH) and Dartmouth College. CICART addresses two important science needs of the DoE: the basic understanding of magnetic reconnection and turbulence that strongly impacts the performance of fusion plasmas, and the development of new mathematical and computational tools that enable the modeling and control of these phenomena. The principal participants of CICART constitute an interdisciplinary group, drawn from the communities of applied mathematics, astrophysics, computational physics, fluid dynamics, and fusion physics. It is a main premise of CICART that fundamental aspects of magnetic reconnection and turbulence in fusion devices, smaller-scale laboratory experiments, and space and astrophysical plasmas can be viewed from a common perspective, and that progress in understanding in any of these interconnected fields is likely to lead to progress in others. The establishment of CICART has strongly impacted the education and research mission of a new Program in Integrated Applied Mathematics in the College of Engineering and Applied Sciences at UNH by enabling the recruitment of a tenure-track faculty member, supported equally by UNH and CICART, and the establishment of an IBM-UNH Computing Alliance. The proposed areas of research in magnetic reconnection and turbulence in astrophysical, space, and laboratory plasmas include the following topics: (A) Reconnection and secondary instabilities in large high-Lundquist-number plasmas, (B) Particle acceleration in the presence of multiple magnetic islands, (C) Gyrokinetic reconnection: comparison with fluid and particle-in-cell models, (D) Imbalanced turbulence, (E) Ion heating, and (F) Turbulence in laboratory (including fusion-relevant) experiments. These theoretical studies make active use of three high-performance computer simulation codes: (1) The Magnetic Reconnection Code, based on extended two-fluid (or Hall MHD) equations, in an Adaptive

  5. Scientific involvement in Skylab by the Space Sciences Laboratory of the Marshall Space Flight Center

    NASA Technical Reports Server (NTRS)

    Winkler, C. E. (Editor)

    1973-01-01

    The involvement of the Marshall Space Flight Center's Space Sciences Laboratory in the Skylab program from the early feasibility studies through the analysis and publication of flight scientific and technical results is described. This includes mission operations support, the Apollo telescope mount, materials science/manufacturing in space, optical contamination, environmental and thermal criteria, and several corollary measurements and experiments.

  6. On-demand provisioning of HEP compute resources on cloud sites and shared HPC centers

    NASA Astrophysics Data System (ADS)

    Erli, G.; Fischer, F.; Fleig, G.; Giffels, M.; Hauth, T.; Quast, G.; Schnepf, M.; Heese, J.; Leppert, K.; Arnaez de Pedro, J.; Sträter, R.

    2017-10-01

    This contribution reports on solutions, experiences, and recent developments with the dynamic, on-demand provisioning of remote computing resources for analysis and simulation workflows. Local resources of a physics institute are extended by private and commercial cloud sites, ranging from the inclusion of desktop clusters over institute clusters to HPC centers. Rather than relying on dedicated HEP computing centers, it is nowadays more reasonable and flexible to utilize remote computing capacity via virtualization techniques or container concepts. We report on recent experience from incorporating a remote HPC center (NEMO Cluster, Freiburg University) and resources dynamically requested from the commercial provider 1&1 Internet SE into our institute's computing infrastructure. The Freiburg HPC resources are requested via the standard batch system, allowing HPC and HEP applications to be executed simultaneously, such that regular batch jobs run side by side with virtual machines managed via OpenStack [1]. For the inclusion of the 1&1 commercial resources, a Python API and SDK as well as the possibility to upload images were available. Large-scale tests prove the capability to serve the scientific use case in the European 1&1 datacenters. The described environment at the Institute of Experimental Nuclear Physics (IEKP) at KIT serves the needs of researchers participating in the CMS and Belle II experiments. In total, resources exceeding half a million CPU hours have been provided by remote sites.
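
    A minimal sketch of the underlying idea of scaling out through a standard batch system when the local queue grows; the queue threshold, the Slurm commands, and the start_worker_vm.sh start-up script are assumptions for illustration, not the IEKP production setup.

    ```python
    import subprocess

    PENDING_THRESHOLD = 500   # hypothetical queue depth that triggers scaling out

    def pending_jobs() -> int:
        """Count pending jobs in a Slurm-like batch system (assumed available)."""
        out = subprocess.run(["squeue", "--noheader", "--states=PD"],
                             capture_output=True, text=True, check=True)
        return len(out.stdout.splitlines())

    def request_remote_worker() -> None:
        """Submit a job that boots a virtualized worker node on the remote resource.
        'start_worker_vm.sh' is a placeholder for a site-specific start-up script."""
        subprocess.run(["sbatch", "start_worker_vm.sh"], check=True)

    if pending_jobs() > PENDING_THRESHOLD:
        request_remote_worker()
    ```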

  7. Yahoo! Compute Coop (YCC). A Next-Generation Passive Cooling Design for Data Centers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robison, AD; Page, Christina; Lytle, Bob

    The purpose of the Yahoo! Compute Coop (YCC) project is to research, design, build and implement a greenfield "efficient data factory" and to specifically demonstrate that the YCC concept is feasible for large facilities housing tens of thousands of heat-producing computing servers. The project scope for the Yahoo! Compute Coop technology includes: - Analyzing and implementing ways in which to drastically decrease energy consumption and waste output. - Analyzing the laws of thermodynamics and implementing naturally occurring environmental effects in order to maximize the "free-cooling" for large data center facilities. "Free cooling" is the direct usage of outside air to cool the servers vs. traditional "mechanical cooling" which is supplied by chillers or other Dx units. - Redesigning and simplifying building materials and methods. - Shortening and simplifying build-to-operate schedules while at the same time reducing initial build and operating costs. Selected for its favorable climate, the greenfield project site is located in Lockport, NY. Construction on the 9.0 MW critical load data center facility began in May 2009, with the fully operational facility deployed in September 2010. The relatively low initial build cost, compatibility with current server and network models, and the efficient use of power and water are all key features that make it a highly compatible and globally implementable design innovation for the data center industry. Yahoo! Compute Coop technology is designed to achieve 99.98% uptime availability. This integrated building design allows for free cooling 99% of the year via the building's unique shape and orientation, as well as server physical configuration.
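
    As a back-of-the-envelope illustration of how a free-cooling fraction can be estimated from hourly outdoor temperatures, the sketch below counts the hours below a supply-air limit; the temperature series and the threshold are invented, not Lockport climate data.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    # Invented hourly outdoor dry-bulb temperatures for one year (deg C).
    hours = 8760
    outdoor_c = 10 + 12 * np.sin(np.linspace(0, 2 * np.pi, hours)) + rng.normal(0, 3, hours)

    SUPPLY_LIMIT_C = 27.0   # hypothetical maximum acceptable supply-air temperature

    # Fraction of the year during which outside air alone could cool the servers.
    free_cooling_fraction = np.mean(outdoor_c < SUPPLY_LIMIT_C)
    print(f"outside air alone suffices for ~{free_cooling_fraction:.1%} of the year")
    ```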

  8. NASA Glenn Research Center's Fuel Cell Stack, Ancillary and System Test and Development Laboratory

    NASA Technical Reports Server (NTRS)

    Loyselle, Patricia L.; Prokopius, Kevin P.; Becks, Larry A.; Burger, Thomas H.; Dick, Joseph F.; Rodriguez, George; Bremenour, Frank; Long, Zedock

    2011-01-01

    At the NASA Glenn Research Center, a fully operational fuel cell test and evaluation laboratory is available which is capable of evaluating fuel cell components and systems for future NASA missions. Components and subsystems of various types can be operated and monitored under a variety of conditions utilizing different reactants. This fuel cell facility can test the effectiveness of various component and system designs to meet NASA's needs.

  9. Design and implementation of a hospital-based usability laboratory: insights from a Department of Veterans Affairs laboratory for health information technology.

    PubMed

    Russ, Alissa L; Weiner, Michael; Russell, Scott A; Baker, Darrell A; Fahner, W Jeffrey; Saleem, Jason J

    2012-12-01

    Although the potential benefits of more usable health information technologies (HIT) are substantial (reduced HIT support costs, increased work efficiency, and improved patient safety), human factors methods to improve usability are rarely employed. The US Department of Veterans Affairs (VA) has emerged as an early leader in establishing usability laboratories to inform the design of HIT, including its electronic health record. Experience with a usability laboratory at a VA Medical Center provides insights on how to design, implement, and leverage usability laboratories in the health care setting. The VA Health Services Research and Development Service Human-Computer Interaction & Simulation Laboratory emerged as one of the first VA usability laboratories and was intended to provide research-based findings about HIT designs. This laboratory supports rapid prototyping, formal usability testing, and analysis tools to assess existing technologies, alternative designs, and potential future technologies. RESULTS OF IMPLEMENTATION: Although the laboratory has maintained a research focus, it has become increasingly integrated with VA operations, both within the medical center and on a national VA level. With this resource, data-driven recommendations have been provided for the design of HIT applications before and after implementation. The demand for usability testing of HIT is increasing, and information on how to develop usability laboratories for the health care setting is often needed. This article may assist other health care organizations that want to invest in usability resources to improve HIT. The establishment and utilization of usability laboratories in the health care setting may improve HIT designs and promote safe, high-quality care for patients.

  10. Deep Space Network (DSN), Network Operations Control Center (NOCC) computer-human interfaces

    NASA Technical Reports Server (NTRS)

    Ellman, Alvin; Carlton, Magdi

    1993-01-01

    The Network Operations Control Center (NOCC) of the DSN is responsible for scheduling the resources of the DSN and monitoring all multi-mission spacecraft tracking activities in real time. Operations performs this job with computer systems at JPL connected to over 100 computers at Goldstone, Australia, and Spain. The old computer system became obsolete, and the first version of the new system was installed in 1991. Significant improvements to the computer-human interfaces became the dominant theme of the replacement project. Major issues required innovative problem solving. Among these issues were: How to present several thousand data elements on displays without overloading the operator? What is the best graphical representation of DSN end-to-end data flow? How to operate the system without memorizing mnemonics of hundreds of operator directives? Which computing environment will meet the competing performance requirements? This paper presents the technical challenges, engineering solutions, and results of the NOCC computer-human interface design.

  11. Laboratory of the Neuropsychology and Cognitive Neurosciences Research Center of Universidad Católica del Maule, Chile.

    PubMed

    Lucero, Boris; Saracini, Chiara; Muñoz-Quezada, María Teresa; Mendez-Bustos, Pablo; Mora, Marco

    2018-06-14

    The Laboratory of the Neuropsychology and Cognitive Neurosciences Research Center (CINPSI Neurocog), located in the "Technological Park" building of the Catholic University of Maule (Universidad Católica del Maule, UCM) campus in Talca, Chile, was established as a "Psychology Lab" in July 2016. Our lines of work include basic and applied research. In basic research, we study executive functions, decision-making, and spatial cognition. In the applied field, we have studied the neuropsychological and neurobehavioral effects of pesticide exposure, among other interests. One of our aims is to develop collaboration both nationally and internationally. It is important to mention that to date there are only a few psychology laboratories and research centers in Chile involved with the fields of neuropsychology and neurosciences. Thus, this scientific effort could be a groundbreaking initiative to develop specific knowledge in this area locally and interculturally through its international collaborations.

  12. 37. Photograph of plan for repairs to computer room, 1958, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    37. Photograph of plan for repairs to computer room, 1958, prepared by the Public Works Office, Underwater Sound Laboratory. Drawing on file at Caretaker Site Office, Naval Undersea Warfare Center, New London. Copyright-free. - Naval Undersea Warfare Center, Bowditch Hall, 600 feet east of Smith Street & 350 feet south of Columbia Cove, West bank of Thames River, New London, New London County, CT

  13. Brain-computer interaction research at the Computer Vision and Multimedia Laboratory, University of Geneva.

    PubMed

    Pun, Thierry; Alecu, Teodor Iulian; Chanel, Guillaume; Kronegg, Julien; Voloshynovskiy, Sviatoslav

    2006-06-01

    This paper describes the work being conducted in the domain of brain-computer interaction (BCI) at the Multimodal Interaction Group, Computer Vision and Multimedia Laboratory, University of Geneva, Geneva, Switzerland. The application focus of this work is on multimodal interaction rather than on rehabilitation, that is, how to augment classical interaction by means of physiological measurements. Three main research topics are addressed. The first concerns the more general problem of brain source activity recognition from EEGs. In contrast with classical deterministic approaches, we studied iterative, robust, stochastic reconstruction procedures that model source and noise statistics, to overcome known limitations of current techniques. We also developed procedures for optimal electroencephalogram (EEG) sensor system design in terms of placement and number of electrodes. The second topic is the study of BCI protocols and performance from an information-theoretic point of view. Various information rate measurements have been compared for assessing BCI abilities. The third research topic concerns the use of EEG and other physiological signals for assessing a user's emotional status.

  14. Organic Contamination Baseline Study in NASA Johnson Space Center Astromaterials Curation Laboratories

    NASA Technical Reports Server (NTRS)

    Calaway, Michael J.; Allen, Carlton C.; Allton, Judith H.

    2014-01-01

    Future robotic and human spaceflight missions to the Moon, Mars, asteroids, and comets will require curating astromaterial samples with minimal inorganic and organic contamination to preserve the scientific integrity of each sample. 21st century sample return missions will focus on strict protocols for reducing organic contamination that have not been seen since the Apollo manned lunar landing program. To properly curate these materials, the Astromaterials Acquisition and Curation Office under the Astromaterial Research and Exploration Science Directorate at NASA Johnson Space Center houses and protects all extraterrestrial materials brought back to Earth that are controlled by the United States government. During fiscal year 2012, we conducted a year-long project to compile historical documentation and laboratory tests involving organic investigations at these facilities. In addition, we developed a plan to determine the current state of organic cleanliness in curation laboratories housing astromaterials. This was accomplished by focusing on current procedures and protocols for cleaning, sample handling, and storage. While the intention of this report is to give a comprehensive overview of the current state of organic cleanliness in JSC curation laboratories, it also provides a baseline for determining whether our cleaning procedures and sample handling protocols need to be adapted and/or augmented to meet the new requirements for future human spaceflight and robotic sample return missions.

  15. Routine operation of an Elliott 903 computer in a clinical chemistry laboratory

    PubMed Central

    Whitby, L. G.; Simpson, D.

    1973-01-01

    Experience gained in the last four years concerning the capabilities and limitations of an 8K Elliott 903 (18-bit word) computer with a magnetic tape backing store in the routine operation of a clinical chemistry laboratory is described. Although designed as a total system, routine operation has latterly had to be confined to data acquisition and process control functions, owing primarily to limitations imposed by the choice of hardware early in the project. In this final report of a partially successful experiment, the opportunity is taken to review mistakes made, especially at the start of the project, to warn potential computer users of pitfalls to be avoided. PMID:4580240

  16. Vibroacoustic payload environment prediction system (VAPEPS): Data base management center remote access guide

    NASA Technical Reports Server (NTRS)

    Thomas, V. C.

    1986-01-01

    A Vibroacoustic Data Base Management Center has been established at the Jet Propulsion Laboratory (JPL). The center utilizes the Vibroacoustic Payload Environment Prediction System (VAPEPS) software package to manage a data base of shuttle and expendable launch vehicle flight and ground test data. Remote terminal access over telephone lines to a dedicated VAPEPS computer system has been established to provide the payload community a convenient means of querying the global VAPEPS data base. This guide describes the functions of the JPL Data Base Management Center and contains instructions for utilizing the resources of the center.

  17. The Role of Computers in Research and Development at Langley Research Center

    NASA Technical Reports Server (NTRS)

    Wieseman, Carol D. (Compiler)

    1994-01-01

    This document is a compilation of presentations given at a workshop on the role of computers in research and development at the Langley Research Center. The objectives of the workshop were to inform the Langley Research Center community of the current software systems and software practices in use at Langley. The workshop was organized in 10 sessions: Software Engineering; Software Engineering Standards, Methods, and CASE Tools; Solutions of Equations; Automatic Differentiation; Mosaic and the World Wide Web; Graphics and Image Processing; System Design Integration; CAE Tools; Languages; and Advanced Topics.

  18. Introduction of Digital Computer Technology Into the Undergraduate Chemistry Laboratory. Final Technical Report.

    ERIC Educational Resources Information Center

    Perone, Sam P.

    The objective of this project has been the development of a successful approach for the incorporation of on-line computer technology into the undergraduate chemistry laboratory. This approach assumes no prior programming, electronics, or instrumental analysis experience on the part of the student; it does not displace the chemistry content with…

  19. Computational Fluid Dynamics Program at NASA Ames Research Center

    NASA Technical Reports Server (NTRS)

    Holst, Terry L.

    1989-01-01

    The Computational Fluid Dynamics (CFD) Program at NASA Ames Research Center is reviewed and discussed. The technical elements of the CFD Program are listed and briefly discussed. These elements include algorithm research, research and pilot code development, scientific visualization, advanced surface representation, volume grid generation, and numerical optimization. Next, the discipline of CFD is briefly discussed and related to other areas of research at NASA Ames including experimental fluid dynamics, computer science research, computational chemistry, and numerical aerodynamic simulation. These areas combine with CFD to form a larger area of research, which might collectively be called computational technology. The ultimate goal of computational technology research at NASA Ames is to increase the physical understanding of the world in which we live, solve problems of national importance, and increase the technical capabilities of the aerospace community. Next, the major programs at NASA Ames that either use CFD technology or perform research in CFD are listed and discussed. Briefly, this list includes turbulent/transition physics and modeling, high-speed real gas flows, interdisciplinary research, turbomachinery demonstration computations, complete aircraft aerodynamics, rotorcraft applications, powered lift flows, high alpha flows, multiple body aerodynamics, and incompressible flow applications. Some of the individual problems actively being worked on in each of these areas are listed to help define the breadth or extent of CFD involvement in each of these major programs. State-of-the-art examples of various CFD applications are presented to highlight most of these areas. The main emphasis of this portion of the presentation is on examples which will not otherwise be treated at this conference by the individual presentations. Finally, a list of principal current limitations and expected future directions is given.

  20. Computer-aided dispatch--traffic management center field operational test : Washington State final report

    DOT National Transportation Integrated Search

    2006-05-01

    This document provides the final report for the evaluation of the USDOT-sponsored Computer-Aided Dispatch - Traffic Management Center Integration Field Operations Test in the State of Washington. The document discusses evaluation findings in the foll...

  1. An academic medical center's response to widespread computer failure.

    PubMed

    Genes, Nicholas; Chary, Michael; Chason, Kevin W

    2013-01-01

    As hospitals incorporate information technology (IT), their operations become increasingly vulnerable to technological breakdowns and attacks. Proper emergency management and business continuity planning require an approach to identify, mitigate, and work through IT downtime. Hospitals can prepare for these disasters by reviewing case studies. This case study details the disruption of computer operations at Mount Sinai Medical Center (MSMC), an urban academic teaching hospital. The events, and MSMC's response, are narrated and the impact on hospital operations is analyzed. MSMC's disaster management strategy prevented computer failure from compromising patient care, although walkouts and time-to-disposition in the emergency department (ED) notably increased. This incident highlights the importance of disaster preparedness and mitigation. It also demonstrates the value of using operational data to evaluate hospital responses to disasters. Quantifying normal hospital functions, just as with a patient's vital signs, may help quantitatively evaluate and improve disaster management and business continuity planning.

  2. Extreme-Scale Computing Project Aims to Advance Precision Oncology | Frederick National Laboratory for Cancer Research

    Cancer.gov

    Two government agencies and five national laboratories are collaborating to develop extremely high-performance computing capabilities that will analyze mountains of research and clinical data to improve scientific understanding of cancer, predict dru

  3. Activities of the Japanese space weather forecast center at Communications Research Laboratory.

    PubMed

    Watari, Shinichi; Tomita, Fumihiko

    2002-12-01

    The International Space Environment Service (ISES) is an international organization for space weather forecasts and belongs to the International Union of Radio Science (URSI). There are eleven ISES forecast centers in the world, and the Communications Research Laboratory (CRL) runs the Japanese one. We make forecasts on the space environment and deliver them over the phone and through the Internet. Our forecasts can be useful for human activities in space. Currently, solar activity is near the maximum phase of solar cycle 23. We report on several large space-environment disturbances that occurred in 2001, during which low-latitude auroras were observed several times in Japan.

  4. Secure data exchange between intelligent devices and computing centers

    NASA Astrophysics Data System (ADS)

    Naqvi, Syed; Riguidel, Michel

    2005-03-01

    The advent of reliable spontaneous networking technologies (commonly known as wireless ad-hoc networks) has raised the stakes for the conception of computing-intensive environments that use intelligent devices as their interface with the external world. These smart devices are used as data gateways for the computing units. They are employed in highly volatile environments where the secure exchange of data between the devices and their computing centers is of paramount importance. Moreover, their mission-critical applications require dependable measures against attacks such as denial of service (DoS), eavesdropping, and masquerading. In this paper, we propose a mechanism to assure reliable data exchange between an intelligent environment composed of smart devices and distributed computing units collectively called a 'computational grid'. The notion of an infosphere is used to define a digital space made up of persistent and volatile assets in an often indefinite geographical space. We study different infospheres and present general evolutions and issues in the security of such technology-rich and intelligent environments. It is beyond any doubt that these environments will face a proliferation of users, applications, networked devices, and their interactions on a scale never experienced before. It would be better to build in the ability to deal with these systems uniformly. As a solution, we propose a concept of virtualization of security services. We try to solve the difficult problems of implementation and maintenance of trust on the one hand, and those of security management in heterogeneous infrastructure on the other.

  5. Laboratory for Computer Science Progress Report 21, July 1983-June 1984.

    DTIC Science & Technology

    1984-06-01

    Excerpts from this progress report include chapter headings on distributed consensus, election of a leader in a distributed ring of processors, distributed network algorithms, and diagnosis, as well as a reference to multiprocessor systems and a facility funded by the newly formed Strategic Computing Program of the Defense Advanced Research Projects Agency. Academic staff listed include P. Szolovits (Group Leader) and R. Patil; collaborating investigators include M. Criscitiello, M.D., Tufts-New England Medical Center Hospital.

  6. Exploring Effective Decision Making through Human-Centered and Computational Intelligence Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Han, Kyungsik; Cook, Kristin A.; Shih, Patrick C.

    Decision-making has long been studied to understand the psychological, cognitive, and social process of selecting an effective choice from alternative options. These studies have extended from the personal level to the group and collaborative level, and many computer-aided decision-making systems have been developed to help people make the right decisions. There has been significant research growth in the computational aspects of decision-making systems, yet comparatively little effort has gone into identifying and articulating user needs and requirements for assessing system outputs and the extent to which human judgments can be utilized to make accurate and reliable decisions. Our research focus is decision-making through human-centered and computational intelligence methods in a collaborative environment, and the objectives of this position paper are to bring our research ideas to the workshop and to share and discuss ideas.

  7. Information and psychomotor skills knowledge acquisition: A student-customer-centered and computer-supported approach.

    PubMed

    Nicholson, Anita; Tobin, Mary

    2006-01-01

    This presentation will discuss coupling commercial and customized computer-supported teaching aids to provide BSN nursing students with a friendly customer-centered self-study approach to psychomotor skill acquisition.

  8. A validated methodology for determination of laboratory instrument computer interface efficacy

    NASA Astrophysics Data System (ADS)

    1984-12-01

    This report is intended to provide a methodology for determining when, and for which instruments, direct interfacing of laboratory instruments and laboratory computers is beneficial. This methodology has been developed to assist the Tri-Service Medical Information Systems Program Office in making future decisions regarding laboratory instrument interfaces. We have calculated the time savings required to reach a break-even point for a range of instrument interface prices and corresponding average annual costs. The break-even analyses used empirical data to estimate the number of data points run per day that are required to meet the break-even point. The results indicate, for example, that at a purchase price of $3,000, an instrument interface will be cost-effective if the instrument is utilized for at least 154 data points per day when operated in the continuous mode, or 216 points per day when operated in the discrete mode. Although this model can help to ensure that instrument interfaces are cost-effective, additional information should be considered in making interface decisions. A reduction in results transcription errors may be a major benefit of instrument interfacing.
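
    The report's break-even figures (154 or 216 data points per day at a $3,000 purchase price) come from its own empirical cost data, which are not reproduced here. The sketch below only illustrates the general structure of such a break-even calculation; the function name and every default parameter (annualized cost fraction, seconds saved per result, labor rate, working days) are hypothetical assumptions rather than values from the report, so its output will not match the report's figures.

        def break_even_points_per_day(interface_price,
                                      annual_cost_fraction=0.20,
                                      seconds_saved_per_point=30.0,
                                      labor_cost_per_hour=15.0,
                                      working_days_per_year=260):
            """Rough estimate of how many data points per day are needed for an
            instrument-computer interface to pay for itself.  All default values
            are hypothetical and are not taken from the report."""
            # Annualized cost of owning the interface (amortized purchase price,
            # maintenance, support), modeled here as a flat fraction of the price.
            annual_interface_cost = interface_price * annual_cost_fraction
            # Labor cost avoided each time a result no longer has to be
            # transcribed by hand into the laboratory computer.
            saving_per_point = (seconds_saved_per_point / 3600.0) * labor_cost_per_hour
            points_per_year = annual_interface_cost / saving_per_point
            return points_per_year / working_days_per_year

        # Example: a $3,000 interface under the assumed default parameters.
        print(round(break_even_points_per_day(3000.0)))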

  9. NASA Center for Intelligent Robotic Systems for Space Exploration

    NASA Technical Reports Server (NTRS)

    1990-01-01

    NASA's program for the civilian exploration of space is a challenge to scientists and engineers to help maintain and further develop the United States' position of leadership in a focused sphere of space activity. Such an ambitious plan requires the contribution and further development of many scientific and technological fields. One research area essential for the success of these space exploration programs is Intelligent Robotic Systems. These systems represent a class of autonomous and semi-autonomous machines that can perform human-like functions with or without human interaction. They are fundamental for activities too hazardous for humans or too distant or complex for remote telemanipulation. To meet this challenge, Rensselaer Polytechnic Institute (RPI) has established an Engineering Research Center for Intelligent Robotic Systems for Space Exploration (CIRSSE). The Center was created with a five-year, $5.5 million grant from NASA, based on a proposal submitted by a team from the Robotics and Automation Laboratories. The Robotics and Automation Laboratories of RPI are the result of the 1987 merger of the Robotics and Automation Laboratory of the Department of Electrical, Computer, and Systems Engineering (ECSE) and the Research Laboratory for Kinematics and Robotic Mechanisms of the Department of Mechanical Engineering, Aeronautical Engineering, and Mechanics (ME, AE & M). This report is an examination of the activities that are centered at CIRSSE.

  10. Theory, Modeling, Software and Hardware Development for Analytical and Computational Materials Science

    NASA Technical Reports Server (NTRS)

    Young, Gerald W.; Clemons, Curtis B.

    2004-01-01

    The focus of this Cooperative Agreement between the Computational Materials Laboratory (CML) of the Processing Science and Technology Branch of the NASA Glenn Research Center (GRC) and the Department of Theoretical and Applied Mathematics at The University of Akron was in the areas of system development of the CML workstation environment, modeling of microgravity and earth-based material processing systems, and joint activities in laboratory projects. These efforts complement each other, as the majority of the modeling work involves numerical computations to support laboratory investigations. Coordination and interaction among the modelers, system analysts, and laboratory personnel are essential to providing the most effective simulations and communication of the simulation results. To this end, The University of Akron personnel involved in the agreement worked at the Applied Mathematics Research Laboratory (AMRL) in the Department of Theoretical and Applied Mathematics while maintaining a close relationship with the personnel of the Computational Materials Laboratory at GRC. Network communication between both sites has been established. A summary of the projects we undertook during the time period 9/1/03 - 6/30/04 is included.

  11. Laboratory Animal Technician | Center for Cancer Research

    Cancer.gov

    PROGRAM DESCRIPTION The Laboratory Animal Sciences Program (LASP) provides exceptional quality animal care and technical support services for animal research performed at the National Cancer Institute at the Frederick National Laboratory for Cancer Research. LASP executes this mission by providing a broad spectrum of state-of-the-art technologies and services that are focused

  12. Using Frameworks in a Government Contracting Environment: Case Study at the NASA Center for Computational Sciences

    NASA Technical Reports Server (NTRS)

    McGalliard, James

    2008-01-01

    A viewgraph describing the use of multiple frameworks by NASA, GSA, and U.S. Government agencies is presented. The contents include: 1) Federal Systems Integration and Management Center (FEDSIM) and NASA Center for Computational Sciences (NCCS) Environment; 2) Ruling Frameworks; 3) Implications; and 4) Reconciling Multiple Frameworks.

  13. A FRAMEWORK FOR A COMPUTATIONAL TOXICOLOGY RESEARCH PROGRAM IN ORD

    EPA Science Inventory

    "A Framework for a Computational Toxicology Research Program in ORD" was drafted by a Technical Writing Team having representatives from all of ORD's Laboratories and Centers. The document describes a framework for the development of an program within ORD to utilize approaches d...

  14. Current state and future direction of computer systems at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Rogers, James L. (Editor); Tucker, Jerry H. (Editor)

    1992-01-01

    Computer systems have advanced at a rate unmatched by any other area of technology. As performance has dramatically increased, there has been an equally dramatic reduction in cost. This constant cost-performance improvement has precipitated the pervasiveness of computer systems into virtually all areas of technology. This improvement is due primarily to advances in microelectronics. Most people are now convinced that the new generation of supercomputers will be built using a large number (possibly thousands) of high-performance microprocessors. Although the spectacular improvements in computer systems have come about because of these hardware advances, there has also been a steady improvement in software techniques. In an effort to understand how these hardware and software advances will affect research at NASA LaRC, the Computer Systems Technical Committee drafted this white paper to examine the current state and possible future directions of computer systems at the Center. This paper discusses selected important areas of computer systems including real-time systems, embedded systems, high performance computing, distributed computing networks, data acquisition systems, artificial intelligence, and visualization.

  15. Zebrafish Health Conditions in the China Zebrafish Resource Center and 20 Major Chinese Zebrafish Laboratories.

    PubMed

    Liu, Liyue; Pan, Luyuan; Li, Kuoyu; Zhang, Yun; Zhu, Zuoyan; Sun, Yonghua

    2016-07-01

    In China, the use of zebrafish as an experimental animal has expanded widely over the past 15 years. The China Zebrafish Resource Center (CZRC), established in 2012, is becoming one of the major resource centers in the global zebrafish community. Large-scale use and regular exchange of zebrafish resources have placed greater demands on zebrafish health in China. This article reports the current aquatic infrastructure design, animal husbandry, and health-monitoring programs in the CZRC. Meanwhile, through a survey of 20 Chinese zebrafish laboratories, we also describe the current health status of major zebrafish facilities in China. We conclude that it is of great importance to establish a widely accepted health standard and health-monitoring strategy in the Chinese zebrafish research community.

  16. The Lister Hill National Center for Biomedical Communications.

    PubMed

    Smith, K A

    1994-09-01

    On August 3, 1968, a Joint Resolution of Congress established the program and construction of the Lister Hill National Center for Biomedical Communications. The facility, dedicated in 1980, contains the latest in computer and communications technologies. The history, program requirements, construction management, and general planning are discussed, including technical issues regarding cabling, systems functions, heating, ventilation, and air conditioning (HVAC), fire suppression, and research and development laboratories, among others.

  17. System analysis for the Huntsville Operation Support Center distributed computer system

    NASA Technical Reports Server (NTRS)

    Ingels, F. M.

    1986-01-01

    A simulation model of the NASA Huntsville Operational Support Center (HOSC) was developed. This simulation model emulates the HYPERchannel Local Area Network (LAN) that ties together the various computers of HOSC. The HOSC system is a large installation of mainframe computers such as the Perkin-Elmer 3200 series and the DEC VAX series. A series of six simulation exercises of the HOSC model, using data sets provided by NASA, is described. An analysis of the Ethernet LAN and the video terminal (VT) distribution system is presented, along with an interface analysis of the smart terminal network model, which allows the data flow requirements due to VTs on the Ethernet LAN to be estimated.

  18. Computing with Beowulf

    NASA Technical Reports Server (NTRS)

    Cohen, Jarrett

    1999-01-01

    Parallel computers built out of mass-market parts are cost-effectively performing data processing and simulation tasks. The Supercomputing (now known as "SC") series of conferences celebrated its 10th anniversary last November. While vendors have come and gone, the dominant paradigm for tackling big problems is still a shared-resource, commercial supercomputer. Growing numbers of users needing a cheaper or dedicated-access alternative are building their own supercomputers out of mass-market parts. Such machines are generally called Beowulf-class systems, after the 11th-century epic. This modern-day Beowulf story began in 1994 at NASA's Goddard Space Flight Center, a laboratory for the Earth and space sciences, where computing managers threw down a gauntlet to develop a $50,000 gigaFLOPS workstation for processing satellite data sets. Soon, Thomas Sterling and Don Becker were working on the Beowulf concept at the Universities Space Research Association (USRA)-run Center of Excellence in Space Data and Information Sciences (CESDIS). Beowulf clusters mix three primary ingredients: commodity personal computers or workstations, low-cost Ethernet networks, and the open-source Linux operating system. One of the larger Beowulfs is Goddard's Highly-parallel Integrated Virtual Environment, or HIVE for short.

  19. Facilities | Argonne National Laboratory

    Science.gov Websites

    Argonne National Laboratory research facilities include the Advanced Powertrain Research Facility, the Center for Transportation Research, the Distributed Energy Research Center, the Engine Research Facility, the Heat Transfer Laboratory, and the Materials Engineering Research Facility.

  20. Computer-generated formulas for three-center nuclear-attraction integrals (electrostatic potential) for Slater-type orbitals

    NASA Technical Reports Server (NTRS)

    Jones, H. W.

    1984-01-01

    The computer-assisted C-matrix, Loewdin-alpha-function, single-center expansion method in spherical harmonics has been applied to the three-center nuclear-attraction integral (potential due to the product of separated Slater-type orbitals). Exact formulas are produced for 13 terms of an infinite series that permits evaluation to ten decimal digits of an example using 1s orbitals.

  1. Eye center localization and gaze gesture recognition for human-computer interaction.

    PubMed

    Zhang, Wenhao; Smith, Melvyn L; Smith, Lyndon N; Farooq, Abdul

    2016-03-01

    This paper introduces an unsupervised modular approach for accurate and real-time eye center localization in images and videos, thus allowing a coarse-to-fine, global-to-regional scheme. The trajectories of eye centers in consecutive frames, i.e., gaze gestures, are further analyzed, recognized, and employed to boost the human-computer interaction (HCI) experience. This modular approach makes use of isophote and gradient features to estimate the eye center locations. A selective oriented gradient filter has been specifically designed to remove strong gradients from eyebrows, eye corners, and shadows, which sabotage most eye center localization methods. A real-world implementation utilizing these algorithms has been designed in the form of an interactive advertising billboard to demonstrate the effectiveness of our method for HCI. The eye center localization algorithm has been compared with 10 other algorithms on the BioID database and six other algorithms on the GI4E database. It outperforms all the other algorithms in comparison in terms of localization accuracy. Further tests on the Extended Yale Face Database B and self-collected data have proved this algorithm to be robust against moderate head poses and poor illumination conditions. The interactive advertising billboard has manifested outstanding usability and effectiveness in our tests and shows great potential for benefiting a wide range of real-world HCI applications.
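
    The paper's own method combines isophote and gradient features with a selective oriented gradient filter; those details are not reproduced in this record. Purely as an illustration of the general idea of gradient-based eye center localization, the sketch below scores every candidate pixel by how well the strong image gradients point back toward it (a simplified gradient-voting scheme). The function name and thresholds are hypothetical, and this is not the authors' algorithm.

        import numpy as np

        def eye_center_by_gradient_voting(patch, grad_threshold=0.3):
            # Simplified gradient-voting sketch, NOT the isophote-based method
            # described in the paper.  `patch` is a small grayscale eye region.
            patch = patch.astype(np.float64)
            gy, gx = np.gradient(patch)                # gradients along rows, cols
            mag = np.hypot(gx, gy)
            keep = mag > grad_threshold * mag.max()    # keep only strong gradients
            if not np.any(keep):
                return patch.shape[0] // 2, patch.shape[1] // 2
            ys, xs = np.nonzero(keep)
            gxn, gyn = gx[keep] / mag[keep], gy[keep] / mag[keep]

            h, w = patch.shape
            best_score, best_center = -1.0, (h // 2, w // 2)
            for cy in range(h):
                for cx in range(w):
                    dx, dy = xs - cx, ys - cy
                    norm = np.hypot(dx, dy)
                    valid = norm > 0
                    # Agreement between unit displacement vectors and unit gradients:
                    # gradients around a dark pupil point radially away from its center.
                    dots = (dx[valid] * gxn[valid] + dy[valid] * gyn[valid]) / norm[valid]
                    score = np.mean(np.maximum(dots, 0.0) ** 2)
                    if score > best_score:
                        best_score, best_center = score, (cy, cx)
            return best_center                         # (row, col) estimate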

  2. Usnic Acid and the Intramolecular Hydrogen Bond: A Computational Experiment for the Organic Laboratory

    ERIC Educational Resources Information Center

    Green, Thomas K.; Lane, Charles A.

    2006-01-01

    A computational experiment is described for the organic chemistry laboratory that allows students to estimate the relative strengths of the intramolecular hydrogen bonds of usnic and isousnic acids, two related lichen secondary metabolites. Students first extract and purify usnic acid from common lichens and obtain ¹H NMR and IR…

  3. Computer-aided dispatch--traffic management center field operational test : state of Utah final report

    DOT National Transportation Integrated Search

    2006-07-01

    This document provides the final report for the evaluation of the USDOT-sponsored Computer-Aided Dispatch Traffic Management Center Integration Field Operations Test in the State of Utah. The document discusses evaluation findings in the followin...

  4. Computational fluid dynamics research at the United Technologies Research Center requiring supercomputers

    NASA Technical Reports Server (NTRS)

    Landgrebe, Anton J.

    1987-01-01

    An overview of research activities at the United Technologies Research Center (UTRC) in the area of Computational Fluid Dynamics (CFD) is presented. The requirement and use of various levels of computers, including supercomputers, for the CFD activities is described. Examples of CFD directed toward applications to helicopters, turbomachinery, heat exchangers, and the National Aerospace Plane are included. Helicopter rotor codes for the prediction of rotor and fuselage flow fields and airloads were developed with emphasis on rotor wake modeling. Airflow and airload predictions and comparisons with experimental data are presented. Examples are presented of recent parabolized Navier-Stokes and full Navier-Stokes solutions for hypersonic shock-wave/boundary layer interaction, and hydrogen/air supersonic combustion. In addition, other examples of CFD efforts in turbomachinery Navier-Stokes methodology and separated flow modeling are presented. A brief discussion of the 3-tier scientific computing environment is also presented, in which the researcher has access to workstations, mid-size computers, and supercomputers.

  5. Computational fluid dynamics research at the United Technologies Research Center requiring supercomputers

    NASA Astrophysics Data System (ADS)

    Landgrebe, Anton J.

    1987-03-01

    An overview of research activities at the United Technologies Research Center (UTRC) in the area of Computational Fluid Dynamics (CFD) is presented. The requirement and use of various levels of computers, including supercomputers, for the CFD activities is described. Examples of CFD directed toward applications to helicopters, turbomachinery, heat exchangers, and the National Aerospace Plane are included. Helicopter rotor codes for the prediction of rotor and fuselage flow fields and airloads were developed with emphasis on rotor wake modeling. Airflow and airload predictions and comparisons with experimental data are presented. Examples are presented of recent parabolized Navier-Stokes and full Navier-Stokes solutions for hypersonic shock-wave/boundary layer interaction, and hydrogen/air supersonic combustion. In addition, other examples of CFD efforts in turbomachinery Navier-Stokes methodology and separated flow modeling are presented. A brief discussion of the 3-tier scientific computing environment is also presented, in which the researcher has access to workstations, mid-size computers, and supercomputers.

  6. Unique life sciences research facilities at NASA Ames Research Center

    NASA Technical Reports Server (NTRS)

    Mulenburg, G. M.; Vasques, M.; Caldwell, W. F.; Tucker, J.

    1994-01-01

    The Life Science Division at NASA's Ames Research Center has a suite of specialized facilities that enable scientists to study the effects of gravity on living systems. This paper describes some of these facilities and their use in research. Seven centrifuges, each with its own unique abilities, allow testing of a variety of parameters on test subjects ranging from single cells through hardware to humans. The Vestibular Research Facility allows the study of both centrifugation and linear acceleration on animals and humans. The Biocomputation Center uses computers for 3D reconstruction of physiological systems and interactive research tools for virtual reality modeling. Psychophysiological, cardiovascular, exercise physiology, and biomechanical studies are conducted in the 12-bed Human Research Facility, and samples are analyzed in the certified Central Clinical Laboratory and other laboratories at Ames. Human bedrest, water immersion, and lower body negative pressure equipment are also available to study physiological changes associated with weightlessness. These and other weightlessness models are used in specialized laboratories for the study of basic physiological mechanisms, metabolism, and cell biology. Visual-motor performance, perception, and adaptation are studied using ground-based models as well as short-term weightlessness experiments (parabolic flights). The unique combination of Life Science research facilities, laboratories, and equipment at Ames Research Center is described in detail in relation to their research contributions.

  7. System Analysis for the Huntsville Operation Support Center, Distributed Computer System

    NASA Technical Reports Server (NTRS)

    Ingels, F. M.; Massey, D.

    1985-01-01

    HOSC, as a distributed computing system, is responsible for data acquisition and analysis during Space Shuttle operations. HOSC also provides computing services for Marshall Space Flight Center's nonmission activities. As mission and nonmission activities change, so do the support functions of HOSC, demonstrating the need for some method of simulating activity at HOSC in various configurations. The simulation developed in this work primarily models the HYPERchannel network. The model simulates the activity of a steady-state network, reporting statistics such as transmitted bits, collision statistics, frame sequences transmitted, and average message delay. These statistics are used to evaluate such performance indicators as throughput, utilization, and delay. Thus the overall performance of the network is evaluated, and possible overload conditions are predicted.

  8. Clinical and clinical laboratory correlates in sea otters dying unexpectedly in rehabilitation centers following the Exxon Valdez oil spill

    USGS Publications Warehouse

    Rebar, A.H.; Lipscomb, T.P.; Harris, R.K.; Ballachey, Brenda E.

    1995-01-01

    Following the Exxon Valdez oil spill, 347 oiled sea otters (Enhydra lutris) were treated in rehabilitation centers. Of these, 116 died, 94 within 10 days of presentation. Clinical records of 21 otters dying during the first 10 days of rehabilitation were reviewed to define the laboratory abnormalities and clinical syndromes associated with these unexpected deaths. The most common terminal syndrome was shock characterized by hypothermia, lethargy, and often hemorrhagic diarrhea. In heavily and moderately oiled otters, shock developed within 48 hours of initial presentation, whereas in lightly oiled otters shock generally occurred during the second week of captivity. Accompanying laboratory abnormalities included leukopenia with increased numbers of immature neutrophils (degenerative left shift), lymphopenia, anemia, azotemia (primarily prerenal), hyperkalemia, hypoproteinemia/hypoalbuminemia, elevations of serum transaminases, and hypoglycemia. Shock associated with hemorrhagic diarrhea probably occurred either as a direct primary effect of oiling or as an indirect effect secondary to confinement and handling in the rehabilitation centers. Lightly oiled otters were less likely to die from shock than were heavily oiled otters (22% vs. 72%, respectively). Heavily oiled otters developed shock more rapidly and had greater numbers of laboratory abnormalities, suggesting that exposure to oil was an important contributing factor.

  9. Brookhaven National Laboratory technology transfer report, fiscal year 1987

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1987-01-01

    The Brookhaven Office of Research and Technology Applications (ORTA) inaugurated two major initiatives. The effort by our ORTA in collaboration with the National Synchrotron Light Source (NSLS) has succeeded in alerting American industry to the potential of using a synchrotron x-ray source for high-resolution lithography. We are undertaking a preconstruction study for the construction of a prototype commercial synchrotron and development of an advanced commercial cryogenic synchrotron (XLS). ORTA sponsored a technology transfer workshop where industry expressed its views on how to transfer accelerator technology during the construction of the prototype commercial machine. The Northeast Regional Utility Initiative brought 14 utilities to a workshop at the Laboratory in November. One recommendation of this workshop was to create a center at the Laboratory for research support on issues of interest to utilities in the region where BNL has unique capability. ORTA has initiated discussions with the New York State Science and Technology Commission, Cornell University's world-renowned Nanofabrication Center, and the computer-aided design capabilities at SUNY at Stony Brook to create, centered around the NSLS and the XLS, a leading-edge semiconductor process technology development center when the XLS becomes operational in two and a half years. 1 fig.

  10. Marshall Space Flight Center Materials and Processes Laboratory

    NASA Technical Reports Server (NTRS)

    Tramel, Terri L.

    2012-01-01

    Marshall's Materials and Processes Laboratory has been a core capability for NASA for over fifty years. MSFC has a proven heritage and recognized expertise in materials and manufacturing that are essential to enable and sustain space exploration. Marshall provides a "systems-wise" capability for applied research, flight hardware development, and sustaining engineering. Our history of leadership and achievements in materials, manufacturing, and flight experiments includes Apollo, Skylab, Mir, Spacelab, Shuttle (Space Shuttle Main Engine, External Tank, Reusable Solid Rocket Motor, and Solid Rocket Booster), Hubble, Chandra, and the International Space Station. MSFC's National Center for Advanced Manufacturing, NCAM, facilitates major M&P advanced manufacturing partnership activities with academia, industry, and other local, state, and federal government agencies. The Materials and Processes Laboratory's principal competencies in metals, composites, ceramics, additive manufacturing, materials and process modeling and simulation, space environmental effects, non-destructive evaluation, and fracture and failure analysis provide products ranging from materials research in space to fully integrated solutions for large, complex systems challenges. Marshall's materials research, development, and manufacturing capabilities assure that NASA and national missions have access to cutting-edge, cost-effective engineering design and production options that are frugal in using design margins and are verified as safe and reliable. These are all critical factors in both future mission success and affordability.

  11. An analysis of reference laboratory (send out) testing: an 8-year experience in a large academic medical center.

    PubMed

    MacMillan, Donna; Lewandrowski, Elizabeth; Lewandrowski, Kent

    2004-01-01

    Utilization of outside reference laboratories for selected laboratory testing is common in the United States. However, relatively little data exist in the literature describing the scope and impact of these services. In this study, we reviewed the use of reference laboratory testing at the Massachusetts General Hospital, a large urban academic medical center in Boston, Massachusetts, through a retrospective review of hospital and laboratory administrative records over an 8-year period, fiscal years (FY) 1995-2002. Over the 8 years studied, reference laboratory expenses increased 4.2-fold and totaled 12.4% of the total laboratory budget in FY 2002. Total reference laboratory test volume increased 4-fold to 68,328 tests in FY 2002 but represented only 1.06% of the total test volume in the hospital. The menu of reference laboratory tests comprised 946 tests (65.7% of the hospital test menu) compared to 494 (34.3%) performed in house. The average unit cost of reference laboratory tests was essentially unchanged but was approximately 13 times greater than the average unit cost in the hospital laboratory. Much of the growth in reference laboratory cost can be attributed to the addition of new molecular, genetic, and microbiological assays. Four of the top 10 tests with the highest total cost in 2002 were molecular diagnostic tests that were recently added to the test menu. Reference laboratory testing comprises a major component of hospital clinical laboratory services. Although send-out tests represent a small percentage of the total test volume, these services account for the majority of the hospital laboratory test menu and a disproportionate percentage of laboratory costs.

  12. Initial constructs for patient-centered outcome measures to evaluate brain-computer interfaces.

    PubMed

    Andresen, Elena M; Fried-Oken, Melanie; Peters, Betts; Patrick, Donald L

    2016-10-01

    The authors describe preliminary work toward the creation of patient-centered outcome (PCO) measures to evaluate brain-computer interface (BCI) as an assistive technology (AT) for individuals with severe speech and physical impairments (SSPI). In Phase 1, 591 items from 15 existing measures were mapped to the International Classification of Functioning, Disability and Health (ICF). In Phase 2, qualitative interviews were conducted with eight people with SSPI and seven caregivers. The resulting text data were coded in an iterative analysis. Most items (79%) were mapped to the ICF environmental domain; over half (53%) were mapped to more than one domain. The ICF framework was well suited for mapping items related to body functions and structures, but less so for items in other areas, including personal factors. Two constructs emerged from the qualitative data: quality of life (QOL) and AT. Component domains and themes were identified for each. Preliminary constructs, domains, and themes were generated for future PCO measures relevant to BCI. Existing instruments are sufficient for initial items but do not adequately match the values of people with SSPI and their caregivers. Field methods for interviewing people with SSPI were successful and support the inclusion of these individuals in PCO research. Implications for rehabilitation: adapted interview methods allow people with severe speech and physical impairments to participate in patient-centered outcomes research, and patient-centered outcome measures are needed to evaluate the clinical implementation of brain-computer interface as an assistive technology.

  13. 42 CFR 493.1355 - Condition: Laboratories performing PPM procedures; laboratory director.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Condition: Laboratories performing PPM procedures; laboratory director. 493.1355 Section 493.1355 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STANDARDS AND CERTIFICATION LABORATORY REQUIREMENTS...

  14. 42 CFR 493.1355 - Condition: Laboratories performing PPM procedures; laboratory director.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 5 2011-10-01 2011-10-01 false Condition: Laboratories performing PPM procedures; laboratory director. 493.1355 Section 493.1355 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STANDARDS AND CERTIFICATION LABORATORY REQUIREMENTS...

  15. Computer Assisted Multi-Center Creation of Medical Knowledge Bases

    PubMed Central

    Giuse, Nunzia Bettinsoli; Giuse, Dario A.; Miller, Randolph A.

    1988-01-01

    Computer programs which support different aspects of medical care have been developed in recent years. Their capabilities range from diagnosis to medical imaging, and include hospital management systems and therapy prescription. In spite of their diversity these systems have one commonality: their reliance on a large body of medical knowledge in computer-readable form. This knowledge enables such programs to draw inferences, validate hypotheses, and in general to perform their intended task. As has been clear to developers of such systems, however, the creation and maintenance of medical knowledge bases are very expensive. Practical and economical difficulties encountered during this long-term process have discouraged most attempts. This paper discusses knowledge base creation and maintenance, with special emphasis on medical applications. We first describe the methods currently used and their limitations. We then present our recent work on developing tools and methodologies which will assist in the process of creating a medical knowledge base. We focus, in particular, on the possibility of multi-center creation of the knowledge base.

  16. The effective use of virtualization for selection of data centers in a cloud computing environment

    NASA Astrophysics Data System (ADS)

    Kumar, B. Santhosh; Parthiban, Latha

    2018-04-01

    Data centers consist of networks of remote servers that store, access, and process data. Cloud computing is a technology in which users worldwide submit tasks and service providers direct the requests to the data centers responsible for executing them. The servers in the data centers employ virtualization so that multiple tasks can be executed simultaneously. In this paper we propose an algorithm for data center selection based on the energy of the virtual machines created on each server. The virtualization energy of each server is calculated, and the total energy of the data center is obtained by summing the individual server energies. Submitted tasks are routed to the data center with the least energy consumption, which minimizes the operational expenses of a service provider.
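
    The abstract gives only the outline of the selection rule: per-virtual-machine energies are rolled up to servers, server energies are summed per data center, and a task goes to the minimum-energy data center. The sketch below assumes a common linear CPU-utilization power model, which the abstract does not specify, and all class, field, and function names are hypothetical.

        from dataclasses import dataclass
        from typing import Dict, List

        @dataclass
        class VirtualMachine:
            cpu_utilization: float   # fraction of CPU in use, 0.0 to 1.0
            idle_power_w: float      # assumed power draw when idle, in watts
            peak_power_w: float      # assumed power draw at full utilization, in watts

            def power_w(self) -> float:
                # Assumed linear model: idle power plus utilization-scaled dynamic power.
                return self.idle_power_w + self.cpu_utilization * (
                    self.peak_power_w - self.idle_power_w)

        def data_center_power(servers: List[List[VirtualMachine]]) -> float:
            # Server power is the sum over its VMs; data-center power is the sum over servers.
            return sum(vm.power_w() for server in servers for vm in server)

        def select_data_center(centers: Dict[str, List[List[VirtualMachine]]]) -> str:
            # Route the submitted task to the data center with the least total consumption.
            return min(centers, key=lambda name: data_center_power(centers[name]))

        # Example: two hypothetical data centers, each a list of servers,
        # each server holding a list of virtual machines.
        centers = {
            "dc_east": [[VirtualMachine(0.6, 100.0, 250.0)],
                        [VirtualMachine(0.2, 100.0, 250.0)]],
            "dc_west": [[VirtualMachine(0.9, 100.0, 250.0),
                         VirtualMachine(0.8, 100.0, 250.0)]],
        }
        print(select_data_center(centers))   # "dc_east" under these numbers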

  17. 78 FR 69926 - Privacy Act of 1974, as Amended; Computer Matching Program (SSA/Centers for Medicare & Medicaid...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-21

    ... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA 2013-0059] Privacy Act of 1974, as Amended; Computer Matching Program (SSA/ Centers for Medicare & Medicaid Services (CMS))--Match Number 1076 AGENCY: Social Security Administration (SSA). ACTION: Notice of a renewal of an existing computer matching...

  18. 76 FR 21091 - Privacy Act of 1974, as Amended; Computer Matching Program (SSA/Centers for Medicare & Medicaid...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-14

    ... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA 2011-0022] Privacy Act of 1974, as Amended; Computer Matching Program (SSA/ Centers for Medicare & Medicaid Services (CMS))--Match Number 1076 AGENCY: Social Security Administration (SSA). ACTION: Notice of a renewal of an existing computer matching...

  19. [The quality management in clinical diagnostic laboratory in conditions of the Federal Center of traumatology, orthopedics and endoprosthesis replacement of Minzdrav of Russia (Cheboksary)].

    PubMed

    Nikolaev, N S; Nazarova, V V; Dobrovol'skaia, N Iu; Orlova, A V; Pchelova, N N

    2014-10-01

    The article presents the experience of the clinical diagnostic laboratory of the Federal Center of traumatology, orthopedics and endoprosthesis replacement of Minzdrav of Russia (Cheboksary) in managing the quality of medical laboratory services on the basis of evaluating the efficacy and effectiveness of processes. The factors affecting the quality of the clinical diagnostic laboratory's work are indicated. The criteria and indicators of the efficacy of the work of clinical diagnostic laboratory employees are presented.

  20. Laboratory Equipment Criteria.

    ERIC Educational Resources Information Center

    State Univ. Construction Fund, Albany, NY.

    Requirements for planning, designing, constructing and installing laboratory furniture are given in conjunction with establishing facility criteria for housing laboratory equipment. Furniture and equipment described include--(1) center tables, (2) reagent racks, (3) laboratory benches and their mechanical fixtures, (4) sink and work counters, (5)…

  1. Modeling Subsurface Reactive Flows Using Leadership-Class Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mills, Richard T; Hammond, Glenn; Lichtner, Peter

    2009-01-01

    We describe our experiences running PFLOTRAN - a code for simulation of coupled hydro-thermal-chemical processes in variably saturated, non-isothermal, porous media - on leadership-class supercomputers, including initial experiences running on the petaflop incarnation of Jaguar, the Cray XT5 at the National Center for Computational Sciences at Oak Ridge National Laboratory. PFLOTRAN utilizes fully implicit time-stepping and is built on top of the Portable, Extensible Toolkit for Scientific Computation (PETSc). We discuss some of the hurdles to 'at scale' performance with PFLOTRAN and the progress we have made in overcoming them on leadership-class computer architectures.

  2. Computer and laboratory simulation of interactions between spacecraft surfaces and charged-particle environments

    NASA Technical Reports Server (NTRS)

    Stevens, N. J.

    1979-01-01

    Cases where the charged-particle environment acts on the spacecraft (e.g., spacecraft charging phenomena) and cases where a system on the spacecraft causes the interaction (e.g., high-voltage space power systems) are considered. Both categories were studied in ground simulation facilities to understand the processes involved and to measure the pertinent parameters. Computer simulations are based on the NASA Charging Analyzer Program (NASCAP) code. Analytical models are developed in this code and verified against the experimental data. Extrapolations from the small test samples to space conditions are made with this code. Typical results from laboratory and computer simulations are presented for both types of interactions. Extrapolations from these simulations to performance in space environments are discussed.

  3. The psychology of computer displays in the modern mission control center

    NASA Technical Reports Server (NTRS)

    Granaas, Michael M.; Rhea, Donald C.

    1988-01-01

    Work at NASA's Western Aeronautical Test Range (WATR) has demonstrated the need for increased consideration of psychological factors in the design of computer displays for the WATR mission control center. These factors include color perception, memory load, and cognitive processing abilities. A review of relevant work in the human factors psychology area is provided to demonstrate the need for this awareness. The information provided should be relevant in control room settings where computerized displays are being used.

  4. Report: EPA’s Radiation and Indoor Environments National Laboratory Should Improve Its Computer Room Security Controls

    EPA Pesticide Factsheets

    Report #12-P-0847, September 21, 2012. Our review of the security posture and in-place environmental controls of EPA’s Radiation and Indoor Environments National Laboratory computer room disclosed an array of security and environmental control deficiencies.

  5. Computational Methods for Crashworthiness

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Compiler); Carden, Huey D. (Compiler)

    1993-01-01

    Presentations and discussions from the joint UVA/NASA Workshop on Computational Methods for Crashworthiness, held at Langley Research Center on 2-3 Sep. 1992, are included. The presentations addressed activities in the area of impact dynamics. Workshop attendees represented NASA, the Army and Air Force, the Lawrence Livermore and Sandia National Laboratories, the aircraft and automotive industries, and academia. The workshop objectives were to assess the state of the technology in the numerical simulation of crashes and to provide guidelines for future research.

  6. NECC '86: Proceedings of the National Educational Computing Conference (7th, San Diego, California, June 4-6, 1986).

    ERIC Educational Resources Information Center

    Ryan, William C., Ed.

    The 150 papers and 28 panel discussion reports in this collection focus on innovations, trends, and research in the use of computers in a variety of educational settings. Topics discussed include: computer centers and laboratories; use of computers at various levels from K-12 through the university, including inservice teacher training; use of…

  7. Laboratory and software applications for clinical trials: the global laboratory environment.

    PubMed

    Briscoe, Chad

    2011-11-01

    The Applied Pharmaceutical Software Meeting is held annually. It is sponsored by The Boston Society, a not-for-profit organization that coordinates a series of meetings within the global pharmaceutical industry. The meeting generally focuses on laboratory applications, but in recent years has expanded to include some software applications for clinical trials. The 2011 meeting emphasized the global laboratory environment. Global clinical trials generate massive amounts of data in many locations that must be centralized and processed for efficient analysis. Thus, the meeting had a strong focus on establishing networks and systems for dealing with the computer infrastructure to support such environments. In addition to the globally installed laboratory information management system, electronic laboratory notebook and other traditional laboratory applications, cloud computing is quickly becoming the answer to provide efficient, inexpensive options for managing the large volumes of data and computing power, and thus it served as a central theme for the meeting.

  8. Petascale Computing for Ground-Based Solar Physics with the DKIST Data Center

    NASA Astrophysics Data System (ADS)

    Berukoff, Steven J.; Hays, Tony; Reardon, Kevin P.; Spiess, DJ; Watson, Fraser; Wiant, Scott

    2016-05-01

    When construction is complete in 2019, the Daniel K. Inouye Solar Telescope will be the most capable large-aperture, high-resolution, multi-instrument solar physics facility in the world. The telescope is designed as a four-meter off-axis Gregorian, with a rotating Coude laboratory designed to simultaneously house and support five first-light imaging and spectropolarimetric instruments. At current design, the facility and its instruments will generate data volumes of 3 PB per year, and produce 10^7-10^9 metadata elements. The DKIST Data Center is being designed to store, curate, and process this flood of information, while providing association of science data and metadata to its acquisition and processing provenance. The Data Center will produce quality-controlled calibrated data sets, and make them available freely and openly through modern search interfaces and APIs. Documented software and algorithms will also be made available through community repositories like GitHub for further collaboration and improvement. We discuss the current design and approach of the DKIST Data Center, describing the development cycle, early technology analysis and prototyping, and the roadmap ahead. We discuss our iterative development approach, the underappreciated challenges of calibrating ground-based solar data, the crucial integration of the Data Center within the larger Operations lifecycle, and how software and hardware support, intelligently deployed, will enable high-caliber solar physics research and community growth for the DKIST's 40-year lifespan.

  9. The NOAA Scientific Computing System Data Assembly Center

    NASA Astrophysics Data System (ADS)

    Suchdeve, K. L.; Smith, S. R.; Van Waes, M.

    2016-02-01

    The Scientific Computing System (SCS) Data Assembly Center (DAC) was established in 2014 by the Office of Marine and Aviation Operations (OMAO) to evaluate the quality of full-resolution (sampling on the order of once per second) data collected by SCS onboard NOAA-operated research vessels. The SCS data are nominally transferred from the vessel to the National Centers for Environmental Information (NCEI) soon after the completion of each cruise and are complemented with detailed cruise metadata from OMAO. The authors will describe tools developed by the SCS DAC to monitor the timeliness of SCS data delivery to NCEI and the completeness of the SCS packages received by NCEI (ensuring the package contains data for all enabled sensors on a given cruise). Feedback to OMAO and NCEI regarding the timeliness and data completeness will be outlined along with challenges encountered by the DAC as it works to develop automated quality assessment of the SCS data packages. Data collected by SCS on NOAA vessels represent a significant investment by the American taxpayer. The mission of the SCS DAC is to ensure that archived SCS data at NCEI are a complete record of the observations made on NOAA research cruises. Archival of complete SCS datasets at NCEI ensures these data are preserved for future generations of scientists, policy makers, and the public.
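
    As a purely illustrative sketch of the kind of completeness check described above (this is not the DAC's actual tooling; the file-naming convention and sensor names are assumptions made here for illustration), one could compare the sensors enabled for a cruise against the data files present in a delivered package:

        from pathlib import Path

        def missing_sensors(package_dir, enabled_sensors):
            """Return enabled sensors that have no data file in a delivered package.

            Assumes, hypothetically, that each sensor's data arrives as
            <sensor_name>.csv inside the package directory.
            """
            delivered = {p.stem for p in Path(package_dir).glob("*.csv")}
            return set(enabled_sensors) - delivered

        # Example with hypothetical paths and sensor names:
        # missing_sensors("/archive/cruise_XYZ", {"tsg", "gps", "met"})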

  10. Development of the HERMIES III mobile robot research testbed at Oak Ridge National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Manges, W.W.; Hamel, W.R.; Weisbin, C.R.

    1988-01-01

    The latest robot in the Hostile Environment Robotic Machine Intelligence Experiment Series (HERMIES) is now under development at the Center for Engineering Systems Advanced Research (CESAR) in the Oak Ridge National Laboratory. The HERMIES III robot incorporates a larger-than-human-size 7-degree-of-freedom manipulator mounted on a 2-degree-of-freedom mobile platform including a variety of sensors and computers. The deployment of this robot represents a significant increase in research capabilities for the CESAR laboratory. The initial on-board computer capacity of the robot exceeds that of 20 VAX 11/780s. The navigation and vision algorithms under development make extensive use of the on-board NCUBE hypercube computer while the sensors are interfaced through five VME computers running the OS-9 real-time, multitasking operating system. This paper describes the motivation, key issues, and detailed design trade-offs of implementing the first phase (basic functionality) of the HERMIES III robot. 10 refs., 7 figs.

  11. Computer-aided dispatch--traffic management center field operational test final detailed test plan : WSDOT deployment

    DOT National Transportation Integrated Search

    2003-10-01

    The purpose of this document is to expand upon the evaluation components presented in "Computer-aided dispatch--traffic management center field operational test final evaluation plan : WSDOT deployment". This document defines the objective, approach,...

  12. Improving communication among the HLA typing laboratories, transplant centers, and coordinating center.

    PubMed

    Gordon, S; Holdsworth, R; Müller, C; Tiedemann, K

    2007-04-01

    Good communication between the bone marrow registries, the donor centres, tissue typing laboratories and clinical units is paramount to ensure timely identification, testing and selection of donors for unrelated bone marrow transplants. This panel session focussed on how to improve communication so that there was a clear understanding of prioritization of requests from clinicians, typing strategies in the laboratories and requests for donors to the registries. This paper outlined some of the strategies discussed in this session.

  13. Impact of configuration management system of computer center on support of scientific projects throughout their lifecycle

    NASA Astrophysics Data System (ADS)

    Bogdanov, A. V.; Iuzhanin, N. V.; Zolotarev, V. I.; Ezhakova, T. R.

    2017-12-01

    In this article, the problem of supporting scientific projects throughout their lifecycle in a computer center is considered in every aspect of that support. The Configuration Management system plays a connecting role in the processes related to the provision and support of computer center services. Given the strong integration of IT infrastructure components through virtualization, control of the infrastructure becomes even more critical to the support of research projects, which means higher requirements for the Configuration Management system. For every aspect of research project support, the influence of the Configuration Management system is reviewed, and the development of the corresponding elements of the system is described in the present paper.

  14. 42 CFR 493.1441 - Condition: Laboratories performing high complexity testing; laboratory director.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 5 2011-10-01 2011-10-01 false Condition: Laboratories performing high complexity testing; laboratory director. 493.1441 Section 493.1441 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STANDARDS AND CERTIFICATION LABORATORY...

  15. 42 CFR 493.1441 - Condition: Laboratories performing high complexity testing; laboratory director.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Condition: Laboratories performing high complexity testing; laboratory director. 493.1441 Section 493.1441 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STANDARDS AND CERTIFICATION LABORATORY...

  16. Quality-assurance plan for the analysis of fluvial sediment by the U.S. Geological Survey Kentucky Water Science Center Sediment Laboratory

    USGS Publications Warehouse

    Shreve, Elizabeth A.; Downs, Aimee C.

    2005-01-01

    This report describes laboratory procedures used by the U.S. Geological Survey Kentucky Water Science Center Sediment Laboratory for the processing and analysis of fluvial-sediment samples for concentration of sand and finer material. The report details the processing of a sediment sample through the laboratory from receiving the sediment sample, through the analytical process, to compiling results of the requested analysis. Procedures for preserving sample integrity, calibrating and maintaining of laboratory and field instruments and equipment, analyzing samples, internal quality assurance and quality control, and validity of the sediment-analysis results also are described. The report includes a list of references cited and a glossary of sediment and quality-assurance terms.

  17. DCDM1: Lessons Learned from the World's Most Energy Efficient Data Center

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sickinger, David E; Van Geet, Otto D; Carter, Thomas

    This presentation discusses the holistic approach used to design the world's most energy-efficient data center, which is located at the U.S. Department of Energy National Renewable Energy Laboratory (NREL). This high-performance computing (HPC) data center has achieved a trailing twelve-month average power usage effectiveness (PUE) of 1.04 and features a chiller-less design, component-level warm-water liquid cooling, and waste heat capture and reuse. We provide details of the demonstrated PUE and energy reuse effectiveness (ERE) and lessons learned during four years of production operation. Recent efforts to dramatically reduce the water footprint will also be discussed. Johnson Controls partnered with NREL and Sandia National Laboratories to deploy a thermosyphon cooler (TSC) as a test bed at NREL's HPC data center that resulted in a 50% reduction in water usage during the first year of operation. The Thermosyphon Cooler Hybrid System (TCHS) integrates the control of a dry heat rejection device with an open cooling tower.
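
    For reference, power usage effectiveness is defined as total facility energy divided by IT equipment energy, so a trailing-twelve-month PUE of 1.04 means that only about 4% of the facility's energy goes to overhead such as cooling and power distribution. A trivial illustrative calculation (the meter readings below are invented, not NREL's):

        def pue(total_facility_kwh, it_equipment_kwh):
            """Power usage effectiveness: total facility energy / IT equipment energy."""
            return total_facility_kwh / it_equipment_kwh

        # Hypothetical trailing-twelve-month meter readings (kWh):
        print(round(pue(5_200_000, 5_000_000), 2))  # -> 1.04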

  18. Laboratory and exterior decay of wood plastic composite boards: voids analysis and computed tomography

    Treesearch

    Grace Sun; Rebecca E. Ibach; Meghan Faillace; Marek Gnatowski; Jessie A. Glaeser; John Haight

    2016-01-01

    After exposure in the field and laboratory soil block culture testing, the void content of wood–plastic composite (WPC) decking boards was compared to unexposed samples. A void volume analysis was conducted based on calculations of sample density and from micro-computed tomography (microCT) data. It was found that reference WPC contains voids of different sizes from...

  19. Building a Prototype of LHC Analysis Oriented Computing Centers

    NASA Astrophysics Data System (ADS)

    Bagliesi, G.; Boccali, T.; Della Ricca, G.; Donvito, G.; Paganoni, M.

    2012-12-01

    A Consortium among four LHC Computing Centers (Bari, Milano, Pisa and Trieste) was formed in 2010 to prototype analysis-oriented facilities for CMS data analysis, profiting from a grant from the Italian Ministry of Research. The Consortium aims to realize an ad-hoc infrastructure to ease the analysis activities on the huge data set collected at the LHC Collider. While “Tier2” Computing Centres, specialized in organized processing tasks like Monte Carlo simulation, are nowadays a well-established concept with years of running experience, sites specialized towards end-user chaotic analysis activities do not yet have a de facto standard implementation. In our effort, we focus on all the aspects that can make the analysis tasks easier for a physics user who is not an expert in computing. On the storage side, we are experimenting with storage techniques allowing for remote data access and with storage optimization for the typical analysis access patterns. On the networking side, we are studying the differences between flat and tiered LAN architectures, also using virtual partitioning of the same physical network for the different use patterns. Finally, on the user side, we are developing tools and instruments to allow for exhaustive monitoring of their processes at the site, and for an efficient support system in case of problems. We report on the results of the tests executed on the different subsystems and give a description of the layout of the infrastructure in place at the sites participating in the consortium.

  20. Computer-aided dispatch--traffic management center field operational test final test plans : state of Utah

    DOT National Transportation Integrated Search

    2004-01-01

    The purpose of this document is to expand upon the evaluation components presented in "Computer-aided dispatch--traffic management center field operational test final evaluation plan : state of Utah". This document defines the objective, approach, an...

  1. New developments in delivering public access to data from the National Center for Computational Toxicology at the EPA

    EPA Science Inventory

    Researchers at EPA’s National Center for Computational Toxicology integrate advances in biology, chemistry, and computer science to examine the toxicity of chemicals and help prioritize chemicals for further research based on potential human health risks. The goal of this researc...

  2. Process Engineering Technology Center Initiative

    NASA Technical Reports Server (NTRS)

    Centeno, Martha A.

    2001-01-01

    NASA's Kennedy Space Center (KSC) is developing as a world-class Spaceport Technology Center (STC). From a process engineering (PE) perspective, the facilities used for flight hardware processing at KSC are NASA's premier factories. The products of these factories are safe, successful shuttle and expendable vehicle launches carrying state-of-the-art payloads. PE is devoted to process design, process management, and process improvement, rather than product design. PE also emphasizes the relationships of workers with systems and processes. Thus, it is difficult to speak of having a laboratory for PE at KSC because the entire facility is practically a laboratory when observed from a macro level perspective. However, it becomes necessary, at times, to show and display how KSC has benefited from PE and how KSC has contributed to the development of PE; hence, it has been proposed that a Process Engineering Technology Center (PETC) be developed to offer a place with a centralized focus on PE projects, and a place where KSC's PE capabilities can be showcased, and a venue where new Process Engineering technologies can be investigated and tested. Graphics for showcasing PE capabilities have been designed, and two initial test beds for PE technology research have been identified. Specifically, one test bed will look into the use of wearable computers with head mounted displays to deliver work instructions; the other test bed will look into developing simulation models that can be assembled into one to create a hierarchical model.

  3. CENTER CONDITIONS AND CYCLICITY FOR A FAMILY OF CUBIC SYSTEMS: COMPUTER ALGEBRA APPROACH.

    PubMed

    Ferčec, Brigita; Mahdi, Adam

    2013-01-01

    Using methods of computational algebra we obtain an upper bound for the cyclicity of a family of cubic systems. We overcame the problem of nonradicality of the associated Bautin ideal by moving from the ring of polynomials to a coordinate ring. Finally, we determine the number of limit cycles bifurcating from each component of the center variety.

  4. Computing and information services at the Jet Propulsion Laboratory - A management approach to a diversity of needs

    NASA Technical Reports Server (NTRS)

    Felberg, F. H.

    1984-01-01

    The Jet Propulsion Laboratory, a research and development organization with about 5,000 employees, presents a complicated set of requirements for an institutional system of computing and informational services. The approach taken by JPL in meeting this challenge is one of controlled flexibility. A central communications network is provided, together with selected computing facilities for common use. At the same time, staff members are given considerable discretion in choosing the mini- and microcomputers that they believe will best serve their needs. Consultation services, computer education, and other support functions are also provided.

  5. Computational Structures Technology for Airframes and Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Compiler); Housner, Jerrold M. (Compiler); Starnes, James H., Jr. (Compiler); Hopkins, Dale A. (Compiler); Chamis, Christos C. (Compiler)

    1992-01-01

    This conference publication contains the presentations and discussions from the joint University of Virginia (UVA)/NASA Workshops. The presentations included NASA Headquarters perspectives on High Speed Civil Transport (HSCT), goals and objectives of the UVA Center for Computational Structures Technology (CST), NASA and Air Force CST activities, CST activities for airframes and propulsion systems in industry, and CST activities at Sandia National Laboratory.

  6. SAM: The "Search and Match" Computer Program of the Escherichia coli Genetic Stock Center

    ERIC Educational Resources Information Center

    Bachmann, B. J.; And Others

    1973-01-01

    Describes a computer program used at a genetic stock center to locate particular strains of bacteria. The program can match up to 30 strain descriptions requested by a researcher with the records on file. Uses of this particular program can be made in many fields. (PS)

  7. Senior Laboratory Animal Technician | Center for Cancer Research

    Cancer.gov

    PROGRAM DESCRIPTION The Laboratory Animal Sciences Program (LASP) provides exceptional quality animal care and technical support services for animal research performed at the National Cancer Institute at the Frederick National Laboratory for Cancer Research. LASP executes this mission by providing a broad spectrum of state-of-the-art technologies and services that are focused

  8. Low Cost, Scalable Proteomics Data Analysis Using Amazon's Cloud Computing Services and Open Source Search Algorithms

    PubMed Central

    Halligan, Brian D.; Geiger, Joey F.; Vallejos, Andrew K.; Greene, Andrew S.; Twigger, Simon N.

    2009-01-01

    One of the major difficulties for many laboratories setting up proteomics programs has been obtaining and maintaining the computational infrastructure required for the analysis of the large flow of proteomics data. We describe a system that combines distributed cloud computing and open source software to allow laboratories to set up scalable virtual proteomics analysis clusters without the investment in computational hardware or software licensing fees. Additionally, the pricing structure of distributed computing providers, such as Amazon Web Services, allows laboratories or even individuals to have large-scale computational resources at their disposal at a very low cost per run. We provide detailed step by step instructions on how to implement the virtual proteomics analysis clusters as well as a list of current available preconfigured Amazon machine images containing the OMSSA and X!Tandem search algorithms and sequence databases on the Medical College of Wisconsin Proteomics Center website (http://proteomics.mcw.edu/vipdac). PMID:19358578
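
    As a hedged sketch of launching one such preconfigured analysis node programmatically, assuming the boto3 AWS SDK (the AMI ID, region, and instance type below are placeholders, not values published in the cited work):

        import boto3

        ec2 = boto3.client("ec2", region_name="us-east-1")
        # Placeholder AMI ID standing in for a preconfigured proteomics image,
        # e.g. one bundling the OMSSA and X!Tandem search engines and databases.
        response = ec2.run_instances(
            ImageId="ami-0123456789abcdef0",
            InstanceType="c5.xlarge",
            MinCount=1,
            MaxCount=1,
        )
        print(response["Instances"][0]["InstanceId"])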

  9. Low cost, scalable proteomics data analysis using Amazon's cloud computing services and open source search algorithms.

    PubMed

    Halligan, Brian D; Geiger, Joey F; Vallejos, Andrew K; Greene, Andrew S; Twigger, Simon N

    2009-06-01

    One of the major difficulties for many laboratories setting up proteomics programs has been obtaining and maintaining the computational infrastructure required for the analysis of the large flow of proteomics data. We describe a system that combines distributed cloud computing and open source software to allow laboratories to set up scalable virtual proteomics analysis clusters without the investment in computational hardware or software licensing fees. Additionally, the pricing structure of distributed computing providers, such as Amazon Web Services, allows laboratories or even individuals to have large-scale computational resources at their disposal at a very low cost per run. We provide detailed step-by-step instructions on how to implement the virtual proteomics analysis clusters as well as a list of current available preconfigured Amazon machine images containing the OMSSA and X!Tandem search algorithms and sequence databases on the Medical College of Wisconsin Proteomics Center Web site ( http://proteomics.mcw.edu/vipdac ).

  10. 77 FR 33547 - Privacy Act of 1974, as Amended; Computer Matching Program (SSA/Centers for Medicare and Medicaid...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-06

    ... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA 2012-0015] Privacy Act of 1974, as Amended; Computer Matching Program (SSA/ Centers for Medicare and Medicaid Services (CMS))--Match Number 1094 AGENCY: Social Security Administration (SSA). ACTION: Notice of a new computer matching program that will expire...

  11. 42 CFR 414.510 - Laboratory date of service for clinical laboratory and pathology specimens.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 3 2011-10-01 2011-10-01 false Laboratory date of service for clinical laboratory and pathology specimens. 414.510 Section 414.510 Public Health CENTERS FOR MEDICARE & MEDICAID... AND OTHER HEALTH SERVICES Payment for New Clinical Diagnostic Laboratory Tests § 414.510 Laboratory...

  12. Venus - Global View Centered at 180 degrees

    NASA Image and Video Library

    1996-11-26

    This global view of the surface of Venus is centered at 180 degrees east longitude. Magellan synthetic aperture radar mosaics from the first cycle of Magellan mapping, and a 5 degree latitude-longitude grid, are mapped onto a computer-simulated globe to create this image. Data gaps are filled with Pioneer-Venus Orbiter data, or a constant mid-range value. The image was produced by the Solar System Visualization project and the Magellan Science team at the JPL Multimission Image Processing Laboratory. http://photojournal.jpl.nasa.gov/catalog/PIA00478

  13. Crack-Detection Experiments on Simulated Turbine Engine Disks in NASA Glenn Research Center's Rotordynamics Laboratory

    NASA Technical Reports Server (NTRS)

    Woike, Mark R.; Abdul-Aziz, Ali

    2010-01-01

    The development of new health-monitoring techniques requires the use of theoretical and experimental tools to allow new concepts to be demonstrated and validated prior to use on more complicated and expensive engine hardware. In order to meet this need, significant upgrades were made to NASA Glenn Research Center's Rotordynamics Laboratory and a series of tests were conducted on simulated turbine engine disks as a means of demonstrating potential crack-detection techniques. The Rotordynamics Laboratory consists of a high-precision spin rig that can rotate subscale engine disks at speeds up to 12,000 rpm. The crack-detection experiment involved introducing a notch on a subscale engine disk and measuring its vibration response using externally mounted blade-tip-clearance sensors as the disk was operated at speeds up to 12,000 rpm. Testing was accomplished on both a clean baseline disk and a disk with an artificial crack: a 50.8-mm (2-in.) long introduced notch. The disks' vibration responses were compared and evaluated against theoretical models to investigate how successful the technique was in detecting cracks. This paper presents the capabilities of the Rotordynamics Laboratory, the baseline theory and experimental setup for the crack-detection experiments, and the associated results from the latest test campaign.

  14. Creating 21st-Century Laboratories and Classrooms for Improving Population Health: A Call to Action for Academic Medical Centers.

    PubMed

    DeVoe, Jennifer E; Likumahuwa-Ackman, Sonja; Shannon, Jackilen; Steiner Hayward, Elizabeth

    2017-04-01

    Academic medical centers (AMCs) in the United States built world-class infrastructure to successfully combat disease in the 20th century, which is inadequate for the complexity of sustaining and improving population health. AMCs must now build first-rate 21st-century infrastructure to connect combating disease and promoting health. This infrastructure must acknowledge the bio-psycho-social-environmental factors impacting health and will need to reach far beyond the AMC walls to foster community "laboratories" that support the "science of health," complementary to those supporting the "science of medicine"; cultivate community "classrooms" to stimulate learning and discovery in the places where people live, work, and play; and strengthen bridges between academic centers and these community laboratories and classrooms to facilitate bidirectional teaching, learning, innovation, and discovery. Private and public entities made deep financial investments that contributed to the AMC disease-centered approach to clinical care, education, and research in the 20th century. Many of these same funders now recognize the need to transform U.S. health care into a system that is accountable for population health and the need for a medical workforce equipped with the skills to measure and improve health. Innovative ideas about communities as centers of learning, the importance of social factors as major determinants of health, and the need for multidisciplinary perspectives to solve complex problems are not new; many are 20th-century ideas still waiting to be fully implemented. The window of opportunity is now. The authors articulate how AMCs must take bigger and bolder steps to become leaders in population health.

  15. Interior of the U.S. Laboratory / Destiny module

    NASA Image and Video Library

    2001-02-11

    STS98-E-5113 (11 February 2001) --- This wide shot, photographed with a digital still camera, shows the interior of the newly attached Destiny laboratory. The crews of Atlantis and the International Space Station opened the laboratory on Feb. 11 and spent the first full day of what are planned to be years of work ahead inside the orbiting science and command center. Station commander William M. (Bill) Shepherd opened the Destiny hatch, and he and shuttle commander Kenneth D. Cockrell ventured inside at 8:38 a.m. (CST), Feb. 11. As depicted in subsequent digital images in this series, members of both crews went to work quickly inside the new module, activating air systems, fire extinguishers, alarm systems, computers and internal communications. The crew also continued equipment transfers from the shuttle to the station.

  16. Mass Storage System Upgrades at the NASA Center for Computational Sciences

    NASA Technical Reports Server (NTRS)

    Tarshish, Adina; Salmon, Ellen; Macie, Medora; Saletta, Marty

    2000-01-01

    The NASA Center for Computational Sciences (NCCS) provides supercomputing and mass storage services to over 1200 Earth and space scientists. During the past two years, the mass storage system at the NCCS went through a great deal of changes both major and minor. Tape drives, silo control software, and the mass storage software itself were upgraded, and the mass storage platform was upgraded twice. Some of these upgrades were aimed at achieving year-2000 compliance, while others were simply upgrades to newer and better technologies. In this paper we will describe these upgrades.

  17. Bridging the digital divide by increasing computer and cancer literacy: community technology centers for head-start parents and families.

    PubMed

    Salovey, Peter; Williams-Piehota, Pamela; Mowad, Linda; Moret, Marta Elisa; Edlund, Denielle; Andersen, Judith

    2009-01-01

    This article describes the establishment of two community technology centers affiliated with Head Start early childhood education programs focused especially on Latino and African American parents of children enrolled in Head Start. A 6-hour course concerned with computer and cancer literacy was presented to 120 parents and other community residents who earned a free, refurbished, Internet-ready computer after completing the program. Focus groups provided the basis for designing the structure and content of the course and modifying it during the project period. An outcomes-based assessment comparing program participants with 70 nonparticipants at baseline, immediately after the course ended, and 3 months later suggested that the program increased knowledge about computers and their use, knowledge about cancer and its prevention, and computer use including health information-seeking via the Internet. The creation of community computer technology centers requires the availability of secure space, capacity of a community partner to oversee project implementation, and resources of this partner to ensure sustainability beyond core funding.

  18. Examining the Fundamental Obstructs of Adopting Cloud Computing for 9-1-1 Dispatch Centers in the USA

    ERIC Educational Resources Information Center

    Osman, Abdulaziz

    2016-01-01

    The purpose of this research study was to examine the unknown fears of embracing cloud computing in 9-1-1 dispatch centers in the USA, fears that stretch across dimensions such as leaders' fear of change and the complexity of the technology. The problem addressed in the study was that many 9-1-1 dispatch centers in the USA are still using old…

  19. Virtual Computing Laboratories: A Case Study with Comparisons to Physical Computing Laboratories

    ERIC Educational Resources Information Center

    Burd, Stephen D.; Seazzu, Alessandro F.; Conway, Christopher

    2009-01-01

    Current technology enables schools to provide remote or virtual computing labs that can be implemented in multiple ways ranging from remote access to banks of dedicated workstations to sophisticated access to large-scale servers hosting virtualized workstations. This paper reports on the implementation of a specific lab using remote access to…

  20. Large space antenna communications systems: Integrated Langley Research Center/Jet Propulsion Laboratory technology development activities. 1: Introduction

    NASA Technical Reports Server (NTRS)

    Campbell, T. G.

    1983-01-01

    The Jet Propulsion Laboratory and the Langley Research Center have been developing technology related to large space antennas (LSA) during the past several years. The need for a communication system research program became apparent during the recent studies for the Land Mobile Satellite System. This study indicated the need for additional research in (1) electromagnetic analysis methods, (2) design and development of multiple beam feed systems, and (3) the measurement methods for LSA reflectors.

  1. ENVIRONMENTAL BIOINFORMATICS AND COMPUTATIONAL TOXICOLOGY CENTER

    EPA Science Inventory

    The Center activities focused on integrating developmental efforts from the various research projects of the Center, and collaborative applications involving scientists from other institutions and EPA, to enhance research in critical areas. A representative sample of specif...

  2. Protocol standards and implementation within the digital engineering laboratory computer network (DELNET) using the universal network interface device (UNID). Part 2

    NASA Astrophysics Data System (ADS)

    Phister, P. W., Jr.

    1983-12-01

    Development of the Air Force Institute of Technology's Digital Engineering Laboratory Network (DELNET) was continued with the development of an initial draft of a protocol standard for all seven layers as specified by the International Standards Organization's (ISO) Reference Model for Open Systems Interconnection. This effort centered on restructuring the Network Layer to perform datagram routing and conform to the developed protocol standards, and on the software-module development of the upper four protocol layers residing within the DELNET Monitor (Zilog MCZ 1/25 computer system). Within the guidelines of the ISO Reference Model, the Transport Layer was developed utilizing the Internet Header Format (IHF) combined with the Transmission Control Protocol (TCP) to create a 128-byte datagram. A limited Application Layer was also created to pass the Gettysburg Address through the DELNET. This study formulated a first draft of the DELNET Protocol Standard and designed, implemented, and tested the Network, Transport, and Application Layers to conform to these protocol standards.
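
    Purely as an illustration of the fixed-size datagram idea mentioned above (the field layout below is hypothetical and is not the header defined in the DELNET protocol standard), a 128-byte datagram could be packed in Python as follows:

        import struct

        DATAGRAM_SIZE = 128  # fixed datagram size noted in the abstract

        def build_datagram(src, dst, seq, payload):
            """Pack a toy datagram: an 8-byte header followed by zero-padded payload."""
            header = struct.pack("!BBHI", src, dst, len(payload), seq)
            if len(header) + len(payload) > DATAGRAM_SIZE:
                raise ValueError("payload too large for a 128-byte datagram")
            return header + payload.ljust(DATAGRAM_SIZE - len(header), b"\x00")

        # Example: a short test message padded out to exactly 128 bytes.
        # len(build_datagram(1, 2, 0, b"Four score and seven years ago")) == 128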

  3. User participation in the development of the human/computer interface for control centers

    NASA Technical Reports Server (NTRS)

    Broome, Richard; Quick-Campbell, Marlene; Creegan, James; Dutilly, Robert

    1996-01-01

    Technological advances coupled with the requirements to reduce operations staffing costs led to the demand for efficient, technologically-sophisticated mission operations control centers. The control center under development for the earth observing system (EOS) is considered. The users are involved in the development of a control center in order to ensure that it is cost-efficient and flexible. A number of measures were implemented in the EOS program in order to encourage user involvement in the area of human-computer interface development. The following user participation exercises carried out in relation to the system analysis and design are described: the shadow participation of the programmers during a day of operations; the flight operations personnel interviews; and the analysis of the flight operations team tasks. The user participation in the interface prototype development, the prototype evaluation, and the system implementation are reported on. The involvement of the users early in the development process enables the requirements to be better understood and the cost to be reduced.

  4. Energy Department Announces National Bioenergy Center

    Science.gov Websites

    The Department of Energy's National Renewable Energy Laboratory (NREL) in Golden, Colo., and Oak Ridge National Laboratory (ORNL) in Oak Ridge, Tenn., will lead the Bioenergy Center. The center will link DOE-funded biomass

  5. OCIS: 15 years' experience with patient-centered computing.

    PubMed

    Enterline, J P; Lenhard, R E; Blum, B I; Majidi, F M; Stuart, G J

    1994-01-01

    In the mid-1970s, the medical and administrative staff of the Oncology Center at Johns Hopkins Hospital recognized a need for a computer-based clinical decision-support system that organized patients' information according to the care continuum, rather than as a series of event-specific data. This is especially important in cancer patients, because of the long periods in which they receive complex medical treatment and the enormous amounts of data generated by extremely ill patients with multiple interrelated diseases. During development of the Oncology Clinical Information System (OCIS), it became apparent that administrative services, research systems, ancillary functions (such as drug and blood product ordering), and financial processes should be integrated with the basic patient-oriented database. With the structured approach used in applications development, new modules were added as the need for additional functions arose. The system has since been moved to a modern network environment with the capacity for client-server processing.

  6. Interactive radiopharmaceutical facility between Yale Medical Center and Brookhaven National Laboratory. Progress report, October 1976-June 1979

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gottschalk, A.

    1979-01-01

    DOE Contract No. EY-76-S-02-4078 was started in October 1976 to set up an investigative radiochemical facility at the Yale Medical Center which would bridge the gap between current investigation with radionuclides at the Yale School of Medicine and the facilities in the Chemistry Department at the Brookhaven National Laboratory. To facilitate these goals, Dr. Mathew L. Thakur was recruited; he joined the Yale University faculty in March 1977. This report briefly summarizes our research accomplishments through the end of June 1979. These can be broadly classified into three categories: (1) research using indium-111 labelled cellular blood components; (2) development of new radiopharmaceuticals; and (3) interaction with Dr. Alfred Wolf and colleagues in the Chemistry Department of Brookhaven National Laboratory.

  7. CNC Turning Center Operations and Prove Out. Computer Numerical Control Operator/Programmer. 444-334.

    ERIC Educational Resources Information Center

    Skowronski, Steven D.

    This student guide provides materials for a course designed to instruct the student in the recommended procedures used when setting up tooling and verifying part programs for a two-axis computer numerical control (CNC) turning center. The course consists of seven units. Unit 1 discusses course content and reviews and demonstrates set-up procedures…

  8. Performance assessment of KORAT-3D on the ANL IBM-SP computer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alexeyev, A.V.; Zvenigorodskaya, O.A.; Shagaliev, R.M.

    1999-09-01

    The TENAR code is currently being developed at the Russian Federal Nuclear Center (VNIIEF) as a coupled dynamics code for the simulation of transients in VVER and RBMK systems and other nuclear systems. The neutronic module in this code system is KORAT-3D. This module is also one of the most computationally intensive components of the code system. A parallel version of KORAT-3D has been implemented to achieve the goal of obtaining transient solutions in reasonable computational time, particularly for RBMK calculations that involve the application of >100,000 nodes. An evaluation of the KORAT-3D code performance was recently undertaken on the Argonne National Laboratory (ANL) IBM ScalablePower (SP) parallel computer located in the Mathematics and Computer Science Division of ANL. At the time of the study, the ANL IBM-SP computer had 80 processors. This study was conducted under the auspices of a technical staff exchange program sponsored by the International Nuclear Safety Center (INSC).

  9. Process Engineering Technology Center Initiative

    NASA Technical Reports Server (NTRS)

    Centeno, Martha A.

    2002-01-01

    NASA's Kennedy Space Center (KSC) is developing as a world-class Spaceport Technology Center (STC). From a process engineering (PE) perspective, the facilities used for flight hardware processing at KSC are NASA's premier factories. The products of these factories are safe, successful shuttle and expendable vehicle launches carrying state-of-the-art payloads. PE is devoted to process design, process management, and process improvement, rather than product design. PE also emphasizes the relationships of workers with systems and processes. Thus, it is difficult to speak of having a laboratory for PE at KSC because the entire facility is practically a laboratory when observed from a macro level perspective. However, it becomes necessary, at times, to show and display how KSC has benefited from PE and how KSC has contributed to the development of PE; hence, it has been proposed that a Process Engineering Technology Center (PETC) be developed to offer a place with a centralized focus on PE projects, and a place where KSC's PE capabilities can be showcased, and a venue where new Process Engineering technologies can be investigated and tested. Graphics for showcasing PE capabilities have been designed, and two initial test beds for PE technology research have been identified. Specifically, one test bed will look into the use of wearable computers with head mounted displays to deliver work instructions; the other test bed will look into developing simulation models that can be assembled into one to create a hierarchical model.

  10. ACSYNT - A standards-based system for parametric, computer aided conceptual design of aircraft

    NASA Technical Reports Server (NTRS)

    Jayaram, S.; Myklebust, A.; Gelhausen, P.

    1992-01-01

    A group of eight US aerospace companies, together with several NASA and Navy centers led by the NASA Ames Systems Analysis Branch, and Virginia Tech's CAD Laboratory agreed in 1990, through the assistance of the American Technology Initiative, to form the ACSYNT (Aircraft Synthesis) Institute. The Institute is supported by a Joint Sponsored Research Agreement to continue the research and development in computer-aided conceptual design of aircraft initiated by NASA Ames Research Center and Virginia Tech's CAD Laboratory. The result of this collaboration, a feature-based, parametric computer-aided aircraft conceptual design code called ACSYNT, is described. The code is based on analysis routines begun at NASA Ames in the early 1970s. ACSYNT's CAD system is based entirely on the ISO standard Programmer's Hierarchical Interactive Graphics System and is graphics-device independent. The code includes a highly interactive graphical user interface, automatically generated Hermite and B-spline surface models, and shaded image displays. Numerous features to enhance aircraft conceptual design are described.

  11. CSI computer system/remote interface unit acceptance test results

    NASA Technical Reports Server (NTRS)

    Sparks, Dean W., Jr.

    1992-01-01

    The validation tests conducted on the Control/Structures Interaction (CSI) Computer System (CCS)/Remote Interface Unit (RIU) are discussed. The CCS/RIU consists of a commercially available, Langley Research Center (LaRC)-programmed, space-flight-qualified computer and a flight data acquisition and filtering computer developed at LaRC. The tests were performed in the Space Structures Research Laboratory (SSRL) and included open-loop excitation, closed-loop control, safing, RIU digital filtering, and RIU stand-alone testing with the CSI Evolutionary Model (CEM) Phase-0 testbed. The test results indicated that the CCS/RIU system is comparable to ground-based systems in performing real-time control-structure experiments.

  12. Improving communication skill training in patient centered medical practice for enhancing rational use of laboratory tests: The core of bioinformation for leveraging stakeholder engagement in regulatory science.

    PubMed

    Moura, Josemar de Almeida; Costa, Bruna Carvalho; de Faria, Rosa Malena Delbone; Soares, Taciana Figueiredo; Moura, Eliane Perlatto; Chiappelli, Francesco

    2013-01-01

    Requests for laboratory tests are among the most relevant additional tools used by physicians as part of patients' health problem-solving. However, the overestimation of complementary investigation may be linked to less reflective medical practice as a consequence of poor physician-patient communication, and may impair patient-centered care. This scenario is likely to result from reduced consultation time and a clinical model focused on the disease. We propose a new medical intervention program that specifically targets improving the patient-centered communication of laboratory test results, the core of bioinformation in health care. Expectations are that training medical students in communication skills will significantly improve the physician-patient relationship, reduce inappropriate use of laboratory tests, and raise stakeholder engagement.

  13. A comprehensive Laboratory Services Survey of State Public Health Laboratories.

    PubMed

    Inhorn, Stanley L; Wilcke, Burton W; Downes, Frances Pouch; Adjanor, Oluwatosin Omolade; Cada, Ronald; Ford, James R

    2006-01-01

    In November 2004, the Association of Public Health Laboratories (APHL) conducted a Comprehensive Laboratory Services Survey of State Public Health Laboratories (SPHLs) in order to establish the baseline data necessary for Healthy People 2010 Objective 23-13. This objective aims to measure the increase in the proportion of health agencies that provide or assure access to comprehensive laboratory services to support essential public health services. This assessment addressed only SPHLs and served as a baseline to periodically evaluate the level of improvement in the provision of laboratory services over the decade ending 2010. The 2004 survey used selected questions that were identified as key indicators of provision of comprehensive laboratory services. The survey was developed in consultation with the Centers for Disease Control and Prevention National Center for Health Statistics, based on newly developed data sources. Forty-seven states and one territory responded to the survey. The survey was based on the 11 core functions of SPHLs as previously defined by APHL. The range of performance among individual laboratories for the 11 core functions (subobjectives) reflects the challenging issues that have confronted SPHLs in the first half of this decade. APHL is now working on a coordinated effort with other stakeholders to create seamless state and national systems for the provision of laboratory services in support of public health programs. These services are necessary to help face the threats raised by the specter of terrorism, emerging infections, and natural disasters.

  14. Blast Computations over a Hemicylindrical Aircraft Shelter

    DTIC Science & Technology

    1981-07-01


  15. Industry and Academic Consortium for Computer Based Subsurface Geology Laboratory

    NASA Astrophysics Data System (ADS)

    Brown, A. L.; Nunn, J. A.; Sears, S. O.

    2008-12-01

    Twenty-two licenses for Petrel software acquired through a grant from Schlumberger are being used to redesign the laboratory portion of Subsurface Geology at Louisiana State University. The course redesign is a cooperative effort between LSU's Geology and Geophysics and Petroleum Engineering Departments and Schlumberger's Technical Training Division. In spring 2008, two laboratory sections were taught with 22 students in each section. The class contained geology majors, petroleum engineering majors, and geology graduate students. Limited enrollments and 3-hour labs make it possible to incorporate hands-on visualization, animation, manipulation of data and images, and access to geological data available online. 24/7 access to the laboratory and step-by-step instructions for Petrel exercises strongly promoted peer instruction and individual learning. Goals of the course redesign include: enhancing visualization of earth materials; strengthening students' ability to acquire, manage, and interpret multifaceted geological information; fostering critical thinking and the scientific method; improving student communication skills; providing cross-training between geologists and engineers; and increasing the quantity, quality, and diversity of students pursuing Earth Science and Petroleum Engineering careers. IT resources available in the laboratory provide students with sophisticated visualization tools, allowing them to switch between 2-D and 3-D reconstructions more seamlessly and enabling them to manipulate larger integrated data sets, thus permitting more time for critical thinking and hypothesis testing. IT resources also enable faculty and students to simultaneously work with the software to visually interrogate a 3-D data set and immediately test hypotheses formulated in class. Preliminary evaluation of class results indicates that students found the MS Windows-based Petrel easy to learn. By the end of the semester, students were able to not only map horizons and faults

  16. Center for Defect Physics - Energy Frontier Research Center (A "Life at the Frontiers of Energy Research" contest entry from the 2011 Energy Frontier Research Centers (EFRCs) Summit and Forum)

    ScienceCinema

    Stocks, G. Malcolm (Director, Center for Defect Physics in Structural Materials); CDP Staff

    2017-12-09

    'Center for Defect Physics - Energy Frontier Research Center' was submitted by the Center for Defect Physics (CDP) to the 'Life at the Frontiers of Energy Research' video contest at the 2011 Science for Our Nation's Energy Future: Energy Frontier Research Centers (EFRCs) Summit and Forum. Twenty-six EFRCs created short videos to highlight their mission and their work. CDP is directed by G. Malcolm Stocks at Oak Ridge National Laboratory, and is a partnership of scientists from nine institutions: Oak Ridge National Laboratory (lead); Ames Laboratory; Brown University; University of California, Berkeley; Carnegie Mellon University; University of Illinois, Urbana-Champaign; Lawrence Livermore National Laboratory; Ohio State University; and University of Tennessee. The Office of Basic Energy Sciences in the U.S. Department of Energy's Office of Science established the 46 Energy Frontier Research Centers (EFRCs) in 2009. These collaboratively-organized centers conduct fundamental research focused on 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The overall purpose is to accelerate scientific progress toward meeting the nation's critical energy challenges.

  17. 42 CFR 493.1403 - Condition: Laboratories performing moderate complexity testing; laboratory director.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Condition: Laboratories performing moderate complexity testing; laboratory director. 493.1403 Section 493.1403 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STANDARDS AND CERTIFICATION...

  18. 42 CFR 493.1403 - Condition: Laboratories performing moderate complexity testing; laboratory director.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 5 2011-10-01 2011-10-01 false Condition: Laboratories performing moderate complexity testing; laboratory director. 493.1403 Section 493.1403 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STANDARDS AND CERTIFICATION...

  19. Mice examined in Animal Laboratory of Lunar Receiving Laboratory

    NASA Technical Reports Server (NTRS)

    1969-01-01

    Landrum Young (seated), Brown and Root-Northrup, and Russell Stullken, Manned Spacecraft Center, examine mice in the Animal Laboratory of the Lunar Receiving Laboratory which have been inoculated with lunar sample material.

  20. 42 CFR 414.510 - Laboratory date of service for clinical laboratory and pathology specimens.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... and pathology specimens. 414.510 Section 414.510 Public Health CENTERS FOR MEDICARE & MEDICAID... Laboratory date of service for clinical laboratory and pathology specimens. The date of service for either a clinical laboratory test or the technical component of physician pathology service is as follows: (a...

  1. 42 CFR 414.510 - Laboratory date of service for clinical laboratory and pathology specimens.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... and pathology specimens. 414.510 Section 414.510 Public Health CENTERS FOR MEDICARE & MEDICAID... Laboratory date of service for clinical laboratory and pathology specimens. The date of service for either a clinical laboratory test or the technical component of physician pathology service is as follows: (a...

  2. 42 CFR 414.510 - Laboratory date of service for clinical laboratory and pathology specimens.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... and pathology specimens. 414.510 Section 414.510 Public Health CENTERS FOR MEDICARE & MEDICAID... Laboratory date of service for clinical laboratory and pathology specimens. The date of service for either a clinical laboratory test or the technical component of physician pathology service is as follows: (a...

  3. Investigations in Computer-Aided Instruction and Computer-Aided Controls. Final Report.

    ERIC Educational Resources Information Center

    Rosenberg, R.C.; And Others

    These research projects, designed to delve into certain relationships between humans and computers, are focused on computer-assisted instruction and on man-computer interaction. One study demonstrates that within the limits of formal engineering theory, a computer simulated laboratory (Dynamic Systems Laboratory) can be built in which freshmen…

  4. Polymer waveguides for electro-optical integration in data centers and high-performance computers.

    PubMed

    Dangel, Roger; Hofrichter, Jens; Horst, Folkert; Jubin, Daniel; La Porta, Antonio; Meier, Norbert; Soganci, Ibrahim Murat; Weiss, Jonas; Offrein, Bert Jan

    2015-02-23

    To satisfy the intra- and inter-system bandwidth requirements of future data centers and high-performance computers, low-cost low-power high-throughput optical interconnects will become a key enabling technology. To tightly integrate optics with the computing hardware, particularly in the context of CMOS-compatible silicon photonics, optical printed circuit boards using polymer waveguides are considered as a formidable platform. IBM Research has already demonstrated the essential silicon photonics and interconnection building blocks. A remaining challenge is electro-optical packaging, i.e., the connection of the silicon photonics chips with the system. In this paper, we present a new single-mode polymer waveguide technology and a scalable method for building the optical interface between silicon photonics chips and single-mode polymer waveguides.

  5. A Report on the Design and Construction of the University of Massachusetts Computer Science Center.

    ERIC Educational Resources Information Center

    Massachusetts State Office of the Inspector General, Boston.

    This report describes a review conducted by the Massachusetts Office of the Inspector General on the construction of the Computer Science and Development Center at the University of Massachusetts, Amherst. The office initiated the review after hearing concerns about the management of the project, including its delayed completion and substantial…

  6. Ice Crystal Icing Engine Testing in the NASA Glenn Research Center's Propulsion Systems Laboratory (PSL): Altitude Investigation

    NASA Technical Reports Server (NTRS)

    Oliver, Michael J.

    2015-01-01

    The National Aeronautics and Space Administration conducted a full-scale ice crystal icing turbofan engine test in the NASA Glenn Research Center's Propulsion Systems Laboratory (PSL) facility in February 2013. Honeywell Engines supplied the test article, an obsolete, unmodified Lycoming ALF502-R5 turbofan engine, serial number LF01, that experienced an uncommanded loss-of-thrust event while operating at certain high-altitude ice crystal icing conditions. These known conditions were duplicated in the PSL for this testing.

  7. Laboratory diagnosis of Ebola virus disease and corresponding biosafety considerations in the China Ebola Treatment Center.

    PubMed

    Huang, Qing; Fu, Wei-Ling; You, Jian-Ping; Mao, Qing

    2016-10-01

    Ebola virus disease (EVD), caused by Ebola virus (EBOV), is a potent acute infectious disease with a high case-fatality rate. Etiological and serological EBOV detection methods, including techniques that involve the detection of the viral genome, virus-specific antigens and anti-virus antibodies, are standard laboratory diagnostic tests that facilitate confirmation or exclusion of EBOV infection. In addition, routine blood tests, liver and kidney function tests, electrolytes and coagulation tests and other diagnostic examinations are important for the clinical diagnosis and treatment of EVD. Because of the viral load in body fluids and secretions from EVD patients, all body fluids are highly contagious. As a result, biosafety control measures during the collection, transport and testing of clinical specimens obtained from individuals scheduled to undergo EBOV infection testing (including suspected, probable and confirmed cases) are crucial. This report has been generated following extensive work experience in the China Ebola Treatment Center (ETC) in Liberia and incorporates important information pertaining to relevant diagnostic standards, clinical significance, operational procedures, safety controls and other issues related to laboratory testing of EVD. Relevant opinions and suggestions are presented in this report to provide contextual awareness associated with the development of standards and/or guidelines related to EVD laboratory testing.

  8. Procedures of Exercise Physiology Laboratories

    NASA Technical Reports Server (NTRS)

    Bishop, Phillip A.; Fortney, Suzanne; Greenisen, Michael; Siconolfi, Steven F.; Bamman, Marcas M.; Moore, Alan D., Jr.; Squires, William

    1998-01-01

    This manual describes the laboratory methods used to collect flight crew physiological performance data at the Johnson Space Center. The Exercise Countermeasures Project Laboratory is a standard physiology laboratory; only the application to the study of human physiological adaptations to spaceflight is unique. In the absence of any other recently published laboratory manual, this manual should be a useful document for the staff and students of other laboratories.

  9. Development of computational methods for unsteady aerodynamics at the NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Yates, E. Carson, Jr.; Whitlow, Woodrow, Jr.

    1987-01-01

    The current scope, recent progress, and plans for research and development of computational methods for unsteady aerodynamics at the NASA Langley Research Center are reviewed. Both integral equations and finite difference methods for inviscid and viscous flows are discussed. Although the great bulk of the effort has focused on finite difference solution of the transonic small perturbation equation, the integral equation program is given primary emphasis here because it is less well known.

  10. Development of computational methods for unsteady aerodynamics at the NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Yates, E. Carson, Jr.; Whitlow, Woodrow, Jr.

    1987-01-01

    The current scope, recent progress, and plans for research and development of computational methods for unsteady aerodynamics at the NASA Langley Research Center are reviewed. Both integral-equation and finite-difference methods for inviscid and viscous flows are discussed. Although the great bulk of the effort has focused on finite-difference solution of the transonic small-perturbation equation, the integral-equation program is given primary emphasis here because it is less well known.

  11. Studying the Earth's Environment from Space: Computer Laboratory Exercises and Instructor Resources

    NASA Technical Reports Server (NTRS)

    Smith, Elizabeth A.; Alfultis, Michael

    1998-01-01

    Studying the Earth's Environment From Space is a two-year project to develop a suite of CD-ROMs containing Earth System Science curriculum modules for introductory undergraduate science classes. Lecture notes, slides, and computer laboratory exercises, including actual satellite data and software, are being developed in close collaboration with Carla Evans of NASA GSFC Earth Sciences Directorate Scientific and Educational Endeavors (SEE) project. Smith and Alfultis are responsible for the Oceanography and Sea Ice Processes Modules. The GSFC SEE project is responsible for Ozone and Land Vegetation Modules. This document constitutes a report on the first year of activities of Smith and Alfultis' project.

  12. Laboratory for Atmospheres: Philosophy, Organization, Major Activities, and 2001 Highlights

    NASA Technical Reports Server (NTRS)

    Hoegy, Walter R.; Cote, Charles E.

    2002-01-01

    How can we improve our ability to predict the weather? How is the Earth's climate changing? What can the atmospheres of other planets teach us about our own? The Laboratory for Atmospheres is helping to answer these and other scientific questions. The Laboratory conducts a broad theoretical and experimental research program studying all aspects of the atmospheres of the Earth and other planets, including their structural, dynamical, radiative, and chemical properties. Vigorous research is central to NASA's exploration of the frontiers of knowledge. NASA scientists play a key role in conceiving new space missions, providing mission requirements, and carrying out research to explore the behavior of planetary systems, including, notably, the Earth's. Our Laboratory's scientists also supply outside scientists with technical assistance and scientific data to further investigations not immediately addressed by NASA itself. The Laboratory for Atmospheres is a vital participant in NASA's research program. The Laboratory is part of the Earth Sciences Directorate based at NASA's Goddard Space Flight Center in Greenbelt, Maryland. The Directorate itself comprises the Global Change Data Center; the Earth and Space Data Computing Division; three laboratories: the Laboratory for Atmospheres, the Laboratory for Terrestrial Physics, and the Laboratory for Hydrospheric Processes; and the Goddard Institute for Space Studies (GISS) in New York, New York. In this report, you will find a statement of our philosophy and a description of our role in NASA's mission. You'll also find a broad description of our research and a summary of our scientists' major accomplishments in 2001. The report also presents useful information on human resources, scientific interactions, and outreach activities with the outside community. For your convenience, we have published a version of this report on the Internet. Our Web site includes links to additional information about the Laboratory's Offices and

  13. Sandia National Laboratories: Research: Laboratory Directed Research &

    Science.gov Websites

    ; Technology Defense Systems & Assessments About Defense Systems & Assessments Program Areas Robotics R&D 100 Awards Laboratory Directed Research & Development Technology Deployment Centers Audit Sandia's Economic Impact Licensing & Technology Transfer Browse Technology Portfolios

  14. Sandia National Laboratories: Sandia National Laboratories: Missions:

    Science.gov Websites

    ; Technology Defense Systems & Assessments About Defense Systems & Assessments Program Areas Robotics R&D 100 Awards Laboratory Directed Research & Development Technology Deployment Centers Audit Sandia's Economic Impact Licensing & Technology Transfer Browse Technology Portfolios

  15. The Laboratory-Based Economics Curriculum.

    ERIC Educational Resources Information Center

    King, Paul G.; LaRoe, Ross M.

    1991-01-01

    Describes the liberal arts, computer laboratory-based economics program at Denison University (Ohio). Includes as goals helping students to (1) understand deductive arguments, (2) learn to apply theory in real-world situations, and (3) test and modify theory when necessary. Notes that the program combines computer laboratory experiments for…

  16. An inexpensive modification of the laboratory computer display changes emergency physicians' work habits and perceptions.

    PubMed

    Marinakis, Harry A; Zwemer, Frank L

    2003-02-01

    Little is known about how the availability of laboratory data affects emergency physicians' practice habits and satisfaction. We modified our clinical information system to display laboratory test status with continuous updates, similar to an airport arrival display. The objective of this study was to determine whether the laboratory test status display altered emergency physicians' work habits and increased satisfaction compared with the time period before implementation of laboratory test status. A retrospective analysis was performed of emergency physicians' actual use of the clinical information system before and after implementation of the laboratory test status display. Emergency physicians were retrospectively surveyed regarding the effect of laboratory test status display on their practice habits and clinical information system use. Survey responses were matched with actual use of the clinical information system. Data were analyzed by using dependent t tests and Pearson correlation coefficients. The study was conducted at a university hospital. Clinical information system use by 46 emergency physicians was analyzed. Twenty-five surveys were returned (71.4% of available emergency physicians). All emergency physicians perceived fewer clinical information system log ons per day after laboratory test status display. The actual average decrease was 19%. Emergency physicians who reported the greatest decrease in log ons per day tended to have the greatest actual decrease (r =-0.36). There was no significant correlation between actual and perceived total time logged on (r =0.08). In regard to effect on emergency physicians' practice habits, 95% reported increased efficiency, 80% reported improved satisfaction with data access, and 65% reported improved communication with patients. An inexpensive computer modification, laboratory test status display, significantly increased subjective efficiency, changed work habits, and improved satisfaction regarding data access

  17. Computer soundcard as an AC signal generator and oscilloscope for the physics laboratory

    NASA Astrophysics Data System (ADS)

    Sinlapanuntakul, Jinda; Kijamnajsuk, Puchong; Jetjamnong, Chanthawut; Chotikaprakhan, Sutharat

    2018-01-01

    The purpose of this paper is to develop both an AC signal generator and a dual-channel oscilloscope based on a standard personal computer equipped with a sound card, as part of the laboratory work in fundamental physics and introductory electronics classes. The setup turns the computer into a two-channel measurement device that provides the sample rate, simultaneous sampling, frequency range, filters, and other capabilities required to perform amplitude, phase, and frequency measurements of AC signals. The AC signal is also generated simultaneously from the same sound card output, in waveforms such as sine, square, triangle, sawtooth, pulse, swept sine, and white noise. These features convert an inexpensive PC sound card into a powerful instrument that allows students to measure physical phenomena with their own PCs, either at home or at the university. Graphical user interface software was developed for control and analysis, including facilities for data recording, signal processing, and real-time measurement display. The result is expanded self-learning utility for students in electronics, covering both AC and DC circuits as well as sound and vibration experiments.
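
    The abstract does not include source code, but the signal-generation side can be illustrated with a short, self-contained Python sketch that synthesizes a sine test tone as 16-bit PCM samples and writes it to a WAV file for playback through any sound card. The sample rate, tone frequency, amplitude, and output filename below are illustrative assumptions, not values taken from the paper.

        # Sketch of the signal-generation half of such a setup: synthesize a sine test
        # tone as 16-bit PCM samples and write it to a WAV file that any sound card can
        # play. Sample rate, frequency, amplitude, and filename are assumed values.
        import math
        import struct
        import wave

        sample_rate = 44100        # Hz, a typical sound-card rate
        frequency = 440.0          # Hz, test-tone frequency (assumed)
        duration = 2.0             # seconds
        amplitude = 0.8            # fraction of full scale

        frames = bytearray()
        for i in range(int(sample_rate * duration)):
            value = amplitude * math.sin(2.0 * math.pi * frequency * i / sample_rate)
            frames += struct.pack('<h', int(value * 32767))

        with wave.open('test_tone.wav', 'wb') as wav:
            wav.setnchannels(1)    # mono; a dual-channel generator would interleave two signals
            wav.setsampwidth(2)    # 16-bit samples
            wav.setframerate(sample_rate)
            wav.writeframes(bytes(frames))

    The oscilloscope half of such a setup would read frames back from the sound card input at the same sample rate and apply, for example, an FFT for frequency measurements; that side is omitted here.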

  18. Reassigning the Structures of Natural Products Using NMR Chemical Shifts Computed with Quantum Mechanics: A Laboratory Exercise

    ERIC Educational Resources Information Center

    Palazzo, Teresa A.; Truong, Tiana T.; Wong, Shirley M. T.; Mack, Emma T.; Lodewyk, Michael W.; Harrison, Jason G.; Gamage, R. Alan; Siegel, Justin B.; Kurth, Mark J.; Tantillo, Dean J.

    2015-01-01

    An applied computational chemistry laboratory exercise is described in which students use modern quantum chemical calculations of chemical shifts to assign the structure of a recently isolated natural product. A pre/post assessment was used to measure student learning gains and verify that students demonstrated proficiency of key learning…

  19. Computer modeling with randomized-controlled trial data informs the development of person-centered aged care homes.

    PubMed

    Chenoweth, Lynn; Vickland, Victor; Stein-Parbury, Jane; Jeon, Yun-Hee; Kenny, Patricia; Brodaty, Henry

    2015-10-01

    To answer questions on the essential components (services, operations and resources) of a person-centered aged care home (iHome) using computer simulation. iHome was developed with AnyLogic software using extant study data obtained from 60 Australian aged care homes, 900+ clients and 700+ aged care staff. Bayesian analysis of simulated trial data will determine the influence of different iHome characteristics on care service quality and client outcomes. Interim results: A person-centered aged care home (socio-cultural context) and care/lifestyle services (interactional environment) can produce positive outcomes for aged care clients (subjective experiences) in the simulated environment. Further testing will define essential characteristics of a person-centered care home.

  20. National Biocontainment Training Center

    DTIC Science & Technology

    2014-08-01

    and Dr. Christopher Kasanga, Virologist, SACIDS, SUA. Pictured bottom right: Martha Betson, an instructor at Sokoine from the Royal Veterinary ...laboratories in the Pendik Veterinary Control Institute, which is a national research laboratory under the Turkish Ministry of Food, Agriculture and Livestock...Gargili (first row, center) for laboratory staff of the Pendik Veterinary Control Institute, a national research laboratory under the Turkish

  1. Computer Center: Software Review.

    ERIC Educational Resources Information Center

    Duhrkopf, Richard, Ed.; Belshe, John F., Ed.

    1988-01-01

    Reviews a software package, "Mitosis-Meiosis," available for Apple II or IBM computers with colorgraphics capabilities. Describes the documentation, presentation and flexibility of the program. Rates the program based on graphics and usability in a biology classroom. (CW)

  2. NASA Scientists Push the Limits of Computer Technology

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Dr. Donald Frazier, a NASA researcher, uses a blue laser shining through a quartz window into a special mix of chemicals to generate a polymer film on the inside quartz surface. As the chemicals respond to the laser light, they adhere to the glass surface, forming optical films. Dr. Frazier and Dr. Mark S. Paley developed the process in the Space Sciences Laboratory at NASA's Marshall Space Flight Center in Huntsville, AL. Working aboard the Space Shuttle, a science team led by Dr. Frazier formed thin films potentially useful in optical computers with fewer impurities than those formed on Earth. Patterns of these films can be traced onto the quartz surface. In the optical computers of the future, these films could replace electronic circuits and wires, making the systems more efficient and cost-effective, as well as lighter and more compact. Photo credit: NASA/Marshall Space Flight Center.

  3. NASA Scientists Push the Limits of Computer Technology

    NASA Technical Reports Server (NTRS)

    1998-01-01

    NASA researcher Dr. Donald Frazier uses a blue laser shining through a quartz window into a special mix of chemicals to generate a polymer film on the inside quartz surface. As the chemicals respond to the laser light, they adhere to the glass surface, forming optical films. Dr. Frazier and Dr. Mark S. Paley developed the process in the Space Sciences Laboratory at NASA's Marshall Space Flight Center in Huntsville, AL. Working aboard the Space Shuttle, a science team led by Dr. Frazier formed thin films potentially useful in optical computers with fewer impurities than those formed on Earth. Patterns of these films can be traced onto the quartz surface. In the optical computers of the future, these films could replace electronic circuits and wires, making the systems more efficient and cost-effective, as well as lighter and more compact. Photo credit: NASA/Marshall Space Flight Center.

  4. NASA Scientists Push the Limits of Computer Technology

    NASA Technical Reports Server (NTRS)

    1999-01-01

    NASA researcher Dr. Donald Frazier uses a blue laser shining through a quartz window into a special mix of chemicals to generate a polymer film on the inside quartz surface. As the chemicals respond to the laser light, they adhere to the glass surface, forming optical films. Dr. Frazier and Dr. Mark S. Paley developed the process in the Space Sciences Laboratory at NASA's Marshall Space Flight Center in Huntsville, AL. Working aboard the Space Shuttle, a science team led by Dr. Frazier formed thin films potentially useful in optical computers with fewer impurities than those formed on Earth. Patterns of these films can be traced onto the quartz surface. In the optical computers of the future, these films could replace electronic circuits and wires, making the systems more efficient and cost-effective, as well as lighter and more compact. Photo credit: NASA/Marshall Space Flight Center.

  5. NASA Langley Research Center's Simulation-To-Flight Concept Accomplished through the Integration Laboratories of the Transport Research Facility

    NASA Technical Reports Server (NTRS)

    Martinez, Debbie; Davidson, Paul C.; Kenney, P. Sean; Hutchinson, Brian K.

    2004-01-01

    The Flight Simulation and Software Branch (FSSB) at NASA Langley Research Center (LaRC) maintains the unique national asset identified as the Transport Research Facility (TRF). The TRF is a group of facilities and integration laboratories utilized to support LaRC's simulation-to-flight concept. This concept incorporates common software, hardware, and processes for both ground-based flight simulators and LaRC's B-757-200 flying laboratory, identified as the Airborne Research Integrated Experiments System (ARIES). These assets provide Government, industry, and academia with an efficient way to develop and test new technology concepts to enhance the capacity, safety, and operational needs of the ever-changing national airspace system. The integration of the TRF enables a smooth, continuous flow of research from simulation to actual flight test.

  6. Simulating Laboratory Procedures.

    ERIC Educational Resources Information Center

    Baker, J. E.; And Others

    1986-01-01

    Describes the use of computer assisted instruction in a medical microbiology course. Presents examples of how computer assisted instruction can present case histories in which the laboratory procedures are simulated. Discusses an authoring system used to prepare computer simulations and provides one example of a case history dealing with fractured…

  7. Setting up a Low-Cost Lab Management System for a Multi-Purpose Computing Laboratory Using Virtualisation Technology

    ERIC Educational Resources Information Center

    Mok, Heng Ngee; Lee, Yeow Leong; Tan, Wee Kiat

    2012-01-01

    This paper describes how a generic computer laboratory equipped with 52 workstations is set up for teaching IT-related courses and other general purpose usage. The authors have successfully constructed a lab management system based on decentralised, client-side software virtualisation technology using Linux and free software tools from VMware that…

  8. A Low-Cost Computer-Controlled Arduino-Based Educational Laboratory System for Teaching the Fundamentals of Photovoltaic Cells

    ERIC Educational Resources Information Center

    Zachariadou, K.; Yiasemides, K.; Trougkakos, N.

    2012-01-01

    We present a low-cost, fully computer-controlled, Arduino-based, educational laboratory (SolarInsight) to be used in undergraduate university courses concerned with electrical engineering and physics. The major goal of the system is to provide students with the necessary instrumentation, software tools and methodology in order to learn fundamental…

  9. System Finds Horizontal Location of Center of Gravity

    NASA Technical Reports Server (NTRS)

    Johnston, Albert S.; Howard, Richard T.; Brewster, Linda L.

    2006-01-01

    An instrumentation system rapidly and repeatedly determines the horizontal location of the center of gravity of a laboratory vehicle that slides horizontally on three air bearings (see Figure 1). Typically, knowledge of the horizontal center-of-mass location of such a vehicle is needed in order to balance the vehicle properly for an experiment and/or to assess the dynamic behavior of the vehicle. The system includes a load cell above each air bearing, electronic circuits that generate digital readings of the weight on each load cell, and a computer equipped with software that processes the readings. The total weight and, hence, the mass of the vehicle are computed from the sum of the load-cell weight readings. Then the horizontal position of the center of gravity is calculated straightforwardly as the weighted sum of the known position vectors of the air bearings, the contribution of each bearing being proportional to the weight on that bearing. In the initial application for which this system was devised, the center-of-mass calculation is particularly simple because the air bearings are located at corners of an equilateral triangle. However, the system is not restricted to this simple geometry. The system acquires and processes weight readings at a rate of 800 Hz for each load cell. The total weight and the horizontal location of the center of gravity are updated at a rate of 800/3 ≈ 267 Hz. In a typical application, a technician would use the center-of-mass output of this instrumentation system as a guide to the manual placement of small weights on the vehicle to shift the center of gravity to a desired horizontal position. Usually, the desired horizontal position is that of the geometric center. Alternatively, this instrumentation system could be used to provide position feedback for a control system that would cause weights to be shifted automatically (see Figure 2) in an effort to keep the center of gravity at the geometric center.
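
    The weighted-average calculation described above is simple enough to sketch directly. The following Python fragment is a minimal illustration, not the laboratory's instrumentation software; the bearing coordinates (corners of an equilateral triangle) and load-cell readings are made-up placeholder values.

        # Minimal sketch of the weighted-average computation described above; this is
        # not the laboratory's instrumentation software. Bearing coordinates and
        # load-cell readings are illustrative placeholder values.

        def center_of_gravity(positions, weights):
            """Horizontal CG as the weight-weighted average of the bearing positions."""
            total = sum(weights)
            x = sum(w * p[0] for p, w in zip(positions, weights)) / total
            y = sum(w * p[1] for p, w in zip(positions, weights)) / total
            return x, y, total            # CG coordinates and total vehicle weight

        # Air bearings at the corners of an equilateral triangle (meters, assumed).
        bearings = [(0.0, 0.577), (-0.5, -0.289), (0.5, -0.289)]
        readings = [412.0, 398.0, 405.0]  # one load-cell reading per bearing (N, assumed)

        cg_x, cg_y, total_weight = center_of_gravity(bearings, readings)
        print(f"CG at ({cg_x:+.4f}, {cg_y:+.4f}) m, total weight {total_weight:.1f} N")

    In a live system the same computation would simply be repeated on each new triplet of load-cell readings to refresh the center-of-gravity estimate.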

  10. Computer and laboratory simulation in the teaching of neonatal nursing: innovation and impact on learning 1

    PubMed Central

    Fonseca, Luciana Mara Monti; Aredes, Natália Del' Angelo; Fernandes, Ananda Maria; Batalha, Luís Manuel da Cunha; Apóstolo, Jorge Manuel Amado; Martins, José Carlos Amado; Rodrigues, Manuel Alves

    2016-01-01

    ABSTRACT Objectives: to evaluate the cognitive learning of nursing students in neonatal clinical evaluation from a blended course with the use of computer and laboratory simulation; to compare the cognitive learning of students in a control and experimental group testing the laboratory simulation; and to assess the extracurricular blended course offered on the clinical assessment of preterm infants, according to the students. Method: a quasi-experimental study with 14 Portuguese students, containing pretest, midterm test and post-test. The technologies offered in the course were serious game e-Baby, instructional software of semiology and semiotechnique, and laboratory simulation. Data collection tools developed for this study were used for the course evaluation and characterization of the students. Nonparametric statistics were used: Mann-Whitney and Wilcoxon. Results: the use of validated digital technologies and laboratory simulation demonstrated a statistically significant difference (p = 0.001) in the learning of the participants. The course was evaluated as very satisfactory for them. The laboratory simulation alone did not represent a significant difference in the learning. Conclusions: the cognitive learning of participants increased significantly. The use of technology can be partly responsible for the course success, showing it to be an important teaching tool for innovation and motivation of learning in healthcare. PMID:27737376

  11. About the Frederick National Laboratory for Cancer Research | Frederick National Laboratory for Cancer Research

    Cancer.gov

    The Frederick National Laboratory is a Federally Funded Research and Development Center (FFRDC) sponsored by the National Cancer Institute (NCI) and currently operated by Leidos Biomedical Research, Inc. The laboratory addresses some of the most urge

  12. Computer simulation of thermal and fluid systems for MIUS integration and subsystems test /MIST/ laboratory. [Modular Integrated Utility System

    NASA Technical Reports Server (NTRS)

    Rochelle, W. C.; Liu, D. K.; Nunnery, W. J., Jr.; Brandli, A. E.

    1975-01-01

    This paper describes the application of the SINDA (systems improved numerical differencing analyzer) computer program to simulate the operation of the NASA/JSC MIUS integration and subsystems test (MIST) laboratory. The MIST laboratory is designed to test the integration capability of the following subsystems of a modular integrated utility system (MIUS): (1) electric power generation, (2) space heating and cooling, (3) solid waste disposal, (4) potable water supply, and (5) waste water treatment. The SINDA/MIST computer model is designed to simulate the response of these subsystems to externally impressed loads. The computer model determines the amount of recovered waste heat from the prime mover exhaust, water jacket and oil/aftercooler and from the incinerator. This recovered waste heat is used in the model to heat potable water, for space heating, absorption air conditioning, waste water sterilization, and to provide for thermal storage. The details of the thermal and fluid simulation of MIST including the system configuration, modes of operation modeled, SINDA model characteristics and the results of several analyses are described.

  13. Louis Stokes Midwest Center for Excellence | Argonne National Laboratory

    Science.gov Websites

    Transformations IGSBInstitute for Genomics and Systems Biology IMEInstitute for Molecular Engineering JCESRJoint Science Center SBCStructural Biology Center Energy.gov U.S. Department of Energy Office of Science

  14. Richard P. Feynman Center for Innovation

    Science.gov Websites

    Search Site submit About Us Los Alamos National LaboratoryRichard P. Feynman Center for Innovation Innovation protecting tomorrow Los Alamos National Laboratory The Richard P. Feynman Center for Innovation self-healing, self-forming mesh network of long range radios. READ MORE supercomputer Los Alamos

  15. BioEnergy Science Center

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    The BioEnergy Science Center, led by Oak Ridge National Laboratory, has been making advances in biofuels for over a decade. These achievements in plant genomics, microbial engineering, biochemistry, and plant physiology will carry over into the Center for Bioenergy Innovation, a new Department of Energy bioenergy research center.

  16. Using an Advanced Computational Laboratory Experiment to Extend and Deepen Physical Chemistry Students' Understanding of Atomic Structure

    ERIC Educational Resources Information Center

    Hoffman, Gary G.

    2015-01-01

    A computational laboratory experiment is described, which involves the advanced study of an atomic system. The students use concepts and techniques typically covered in a physical chemistry course but extend those concepts and techniques to more complex situations. The students get a chance to explore the study of atomic states and perform…

  17. Center for Adaptive Optics | Center

    Science.gov Websites

    Astronomy, UCSC's CfAO and ISEE, and Maui Community College, runs education and internship programs in postdocs. E-mail: cfao@ucolick.org Institutions: University of California, Berkeley Astronomy Department Retinal Imaging Laboratory Eye Center University of California, Irvine Department of Physics and Astronomy

  18. Hatch leading into U.S. Laboratory / Destiny module

    NASA Image and Video Library

    2001-02-11

    STS98-E-5114 (11 February 2001) --- This medium close-up shot, photographed with a digital still camera, shows Unity's closed hatch to the newly delivered Destiny laboratory. The crews of Atlantis and the International Space Station opened the laboratory, shortly after this photo was made on Feb. 11, and the astronauts and cosmonauts spent the first full day of what are planned to be years of work ahead inside the orbiting science and command center. Station commander William M. (Bill) Shepherd opened the Destiny hatch, and he and shuttle commander Kenneth D. Cockrell ventured inside at 8:38 a.m. (CST), Feb. 11. As depicted in subsequent digital images in this series, members of both crews went to work quickly inside the new module, activating air systems, fire extinguishers, alarm systems, computers and internal communications. The crew also continued equipment transfers from the shuttle to the station.

  19. Does Cloud Computing in the Atmospheric Sciences Make Sense? A case study of hybrid cloud computing at NASA Langley Research Center

    NASA Astrophysics Data System (ADS)

    Nguyen, L.; Chee, T.; Minnis, P.; Spangenberg, D.; Ayers, J. K.; Palikonda, R.; Vakhnin, A.; Dubois, R.; Murphy, P. R.

    2014-12-01

    The processing, storage and dissemination of satellite cloud and radiation products produced at NASA Langley Research Center are key activities for the Climate Science Branch. A constellation of systems operates in sync to accomplish these goals. Because of the complexity involved with operating such intricate systems, there are both high failure rates and high costs for hardware and system maintenance. Cloud computing has the potential to ameliorate cost and complexity issues. Over time, the cloud computing model has evolved and hybrid systems comprising off-site as well as on-site resources are now common. Towards our mission of providing the highest quality research products to the widest audience, we have explored the use of the Amazon Web Services (AWS) Cloud and Storage and present a case study of our results and efforts. This project builds upon NASA Langley Cloud and Radiation Group's experience with operating large and complex computing infrastructures in a reliable and cost effective manner to explore novel ways to leverage cloud computing resources in the atmospheric science environment. Our case study presents the project requirements and then examines the fit of AWS with the LaRC computing model. We also discuss the evaluation metrics, feasibility, and outcomes and close the case study with the lessons we learned that would apply to others interested in exploring the implementation of the AWS system in their own atmospheric science computing environments.

  20. Jackson State University's Center for Spatial Data Research and Applications: New facilities and new paradigms

    NASA Technical Reports Server (NTRS)

    Davis, Bruce E.; Elliot, Gregory

    1989-01-01

    Jackson State University recently established the Center for Spatial Data Research and Applications, a Geographical Information System (GIS) and remote sensing laboratory. Taking advantage of new technologies and new directions in the spatial (geographic) sciences, JSU is building a Center of Excellence in Spatial Data Management. New opportunities for research, applications, and employment are emerging. GIS requires fundamental shifts and new demands in traditional computer science and geographic training. The Center is not merely another computer lab but is one setting the pace in a new applied frontier. GIS and its associated technologies are discussed. The Center's facilities are described. An ARC/INFO GIS runs on a Vax mainframe, with numerous workstations. Image processing packages include ELAS, LIPS, VICAR, and ERDAS. A host of hardware and software peripherals are used in support. Numerous projects are underway, such as the construction of a Gulf of Mexico environmental data base, development of AI in image processing, a land use dynamics study of metropolitan Jackson, and others. A new academic interdisciplinary program in Spatial Data Management is under development, combining courses in Geography and Computer Science. The broad range of JSU's GIS and remote sensing activities is addressed. The impacts on changing paradigms in the university and in the professional world conclude the discussion.

  1. A Review of the Centers for Disease Control and Prevention's Guidelines for the Clinical Laboratory Diagnosis of Lyme Disease.

    PubMed

    Miraglia, Caterina M

    2016-12-01

    The purpose of this paper is to review information regarding the current guidelines for the clinical laboratory diagnosis of Lyme disease as set forth by the Centers for Disease Control and Prevention (CDC) to chiropractic physicians and to discuss the clinical utility of this testing. The CDC's website was reviewed to determine what their current recommendations are for the clinical laboratory testing of Lyme disease. The CDC's established guidelines recommend the use of a 2-tiered serologic testing algorithm for the evaluation of patients with suspected Lyme disease. This review provides doctors of chiropractic with information to remain current with the CDC's recommended guidelines for Lyme disease testing because patients may present to their office with the associated signs and symptoms of Lyme disease.

  2. Pencil-and-Paper Neural Networks: An Undergraduate Laboratory Exercise in Computational Neuroscience

    PubMed Central

    Crisp, Kevin M.; Sutter, Ellen N.; Westerberg, Jacob A.

    2015-01-01

    Although it has been more than 70 years since McCulloch and Pitts published their seminal work on artificial neural networks, such models remain primarily in the domain of computer science departments in undergraduate education. This is unfortunate, as simple network models offer undergraduate students a much-needed bridge between cellular neurobiology and processes governing thought and behavior. Here, we present a very simple laboratory exercise in which students constructed, trained and tested artificial neural networks by hand on paper. They explored a variety of concepts, including pattern recognition, pattern completion, noise elimination and stimulus ambiguity. Learning gains were evident in changes in the use of language when writing about information processing in the brain. PMID:26557791
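
    As a rough computational analogue of the paper-and-pencil exercise, the sketch below implements a single McCulloch-Pitts threshold unit and shows pattern recognition with some noise tolerance. The stored pattern, weights, and threshold are invented for illustration and are not taken from the article.

        # A rough computational analogue of the paper-and-pencil exercise: a single
        # McCulloch-Pitts threshold unit recognizing a stored binary pattern with some
        # noise tolerance. The pattern, weights, and threshold are invented for
        # illustration and are not taken from the article.

        def mcculloch_pitts(inputs, weights, threshold):
            """Fire (1) if the weighted sum of the binary inputs reaches the threshold."""
            return 1 if sum(i * w for i, w in zip(inputs, weights)) >= threshold else 0

        target = [1, 0, 1, 1, 0]                          # the "stored" pattern
        weights = [1 if bit else -1 for bit in target]    # matches rewarded, mismatches punished
        threshold = 2                                     # allows one flipped bit before the unit stops firing

        print(mcculloch_pitts([1, 0, 1, 1, 0], weights, threshold))  # exact pattern   -> 1
        print(mcculloch_pitts([1, 0, 1, 0, 0], weights, threshold))  # one bit flipped -> 1
        print(mcculloch_pitts([0, 1, 0, 0, 1], weights, threshold))  # anti-pattern    -> 0

    Extending this single unit to a small, Hebbian-trained weight matrix would cover the pattern-completion and noise-elimination ideas mentioned in the abstract.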

  3. Laboratories | Energy Systems Integration Facility | NREL

    Science.gov Websites

    laboratories to be safely divided into multiple test stand locations (or "capability hubs") to enable Fabrication Laboratory Energy Systems High-Pressure Test Laboratory Energy Systems Integration Laboratory Energy Systems Sensor Laboratory Fuel Cell Development and Test Laboratory High-Performance Computing

  4. Assessment of physical activity with the Computer Science and Applications, Inc., accelerometer: laboratory versus field validation.

    PubMed

    Nichols, J F; Morgan, C G; Chabot, L E; Sallis, J F; Calfas, K J

    2000-03-01

    Our purpose was to compare the validity of the Computer Science and Applications (CSA), Inc., accelerometer in laboratory and field settings and establish CSA count ranges for light, moderate, and vigorous physical activity. Validity was determined in 60 adults during treadmill exercise, using oxygen consumption (VO2) as the criterion measure, while 30 adults walked and jogged outdoors on a 400-m track. The relationship between CSA counts and VO2 was linear (R2 = .89, SEE = 3.72 ml.kg-1.min-1), as was the relationship between velocity and counts in the field (R2 = .89, SEE = 0.89 mi.hr-1). However, significant differences were found (p < .05) between laboratory and field measures of CSA counts for light and vigorous intensity. We conclude that the CSA can be used to quantify walking and jogging outdoors on level ground; however, laboratory equations may not be appropriate for use in field settings, particularly for light and vigorous activity.
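
    The calibration step implied by the abstract (a linear fit of VO2 on CSA counts, summarized by R2 and SEE) can be sketched in a few lines of Python. The data points below are made-up illustrative values, not the study's measurements; only the method mirrors the abstract.

        # Sketch of the calibration computation implied above: an ordinary least-squares
        # fit of VO2 on accelerometer counts, with R^2 and the standard error of the
        # estimate (SEE). The data points are made-up illustrative values, not the
        # study's measurements.
        import numpy as np

        counts = np.array([500, 1500, 3000, 4500, 6000, 7500], dtype=float)  # counts/min (assumed)
        vo2 = np.array([6.0, 10.5, 16.0, 22.5, 28.0, 34.5])                  # ml.kg-1.min-1 (assumed)

        slope, intercept = np.polyfit(counts, vo2, 1)
        pred = slope * counts + intercept
        resid = vo2 - pred

        r2 = 1.0 - resid.var() / vo2.var()
        see = np.sqrt(np.sum(resid ** 2) / (len(vo2) - 2))  # standard error of the estimate

        print(f"VO2 = {slope:.5f} * counts + {intercept:.2f}  (R^2 = {r2:.3f}, SEE = {see:.2f})")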

  5. A Future State for NASA Laboratories - Working in the 21st Century

    NASA Technical Reports Server (NTRS)

    Kegelman, Jerome T.; Harris, Charles E.; Antcliff, Richard R.; Bushnell, Dennis M.; Dwoyer, Douglas L.

    2009-01-01

    The name "21 st Century Laboratory" is an emerging concept of how NASA (and the world) will conduct research in the very near future. Our approach is to carefully plan for significant technological changes in products, organization, and society. The NASA mission can be the beneficiary of these changes, provided the Agency prepares for the role of 21st Century laboratories in research and technology development and its deployment in this new age. It has been clear for some time now that the technology revolutions, technology "mega-trends" that we are in the midst of now, all have a common element centered around advanced computational modeling of small scale physics. Whether it is nano technology, bio technology or advanced computational technology, all of these megatrends are converging on science at the very small scale where it is profoundly important to consider the quantum effects at play with physics at that scale. Whether it is the bio-technology creation of "nanites" designed to mimic our immune system or the creation of nanoscale infotechnology devices, allowing an order of magnitude increase in computational capability, all involve quantum physics that serves as the heart of these revolutionary changes.

  6. A comparison of traditional physical laboratory and computer-simulated laboratory experiences in relation to engineering undergraduate students' conceptual understandings of a communication systems topic

    NASA Astrophysics Data System (ADS)

    Javidi, Giti

    2005-07-01

    This study was designed to investigate an alternative to the use of traditional physical laboratory activities in a communication systems course. Specifically, this study examined whether as an alternative, computer simulation is as effective as physical laboratory activities in teaching college-level electronics engineering education students about the concepts of signal transmission, modulation and demodulation. Eighty undergraduate engineering students participated in the study, which was conducted at a southeastern four-year university. The students were randomly assigned to two groups. The groups were compared on understanding the concepts, remembering the concepts, completion time of the lab experiments and perception toward the laboratory experiments. The physical group's (n = 40) treatment was to conduct laboratory experiments in a physical laboratory. The students in this group used equipment in a controlled electronics laboratory. The Simulation group's (n = 40) treatment was to conduct similar experiments in a PC laboratory. The students in this group used a simulation program in a controlled PC lab. At the completion of the treatment, scores on a validated conceptual test were collected once after the treatment and again three weeks after the treatment. Attitude surveys and qualitative study were administered at the completion of the treatment. The findings revealed significant differences, in favor of the simulation group, between the two groups on both the conceptual post-test and the follow-up test. The findings also revealed significant correlation between simulation groups' attitude toward the simulation program and their post-test scores. Moreover, there was a significant difference between the two groups on their attitude toward their laboratory experience in favor of the simulation group. In addition, there was significant difference between the two groups on their lab completion time in favor of the simulation group. At the same time, the

  7. [Standardization of terminology in laboratory medicine I].

    PubMed

    Yoon, Soo Young; Yoon, Jong Hyun; Min, Won Ki; Lim, Hwan Sub; Song, Junghan; Chae, Seok Lae; Lee, Chang Kyu; Kwon, Jung Ah; Lee, Kap No

    2007-04-01

    Standardization of medical terminology is essential for data transmission between health-care institutions or clinical laboratories and for maximizing the benefits of information technology. The purpose of our study was to standardize the medical terms used in the clinical laboratory, such as test names, units, and terms used in result descriptions. During the first year of the study, we developed a standard database of concept names for laboratory terms, which covered the terms used in government health care centers, their branch offices, and primary health care units. Laboratory terms were collected from the electronic data interchange (EDI) codes of the National Health Insurance Corporation (NHIC), the Logical Observation Identifier Names and Codes (LOINC) database, community health centers and their branch offices, and the clinical laboratories of representative university medical centers. For standard expression, we referred to the English-Korean/Korean-English medical dictionary of the Korean Medical Association and the rules for foreign language translation. Programs for mapping between the LOINC database and EDI codes and for translating English to Korean were developed. A Korean standard laboratory terminology database containing six axial concept names, such as component, property, time aspect, system (specimen), scale type, and method type, was established for 7,508 test observations. Short names and a mapping table for EDI codes and the Unified Medical Language System (UMLS) were added. Synonym tables for concept names, words used in the database, and the six axial terms were prepared to make it easier to find the standard terminology with common terms used in the field of laboratory medicine. Here we report for the first time a Korean standard laboratory terminology database for test names, result description terms, and result units covering most laboratory tests in primary healthcare centers.

  8. EPA Environmental Chemistry Laboratory

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The Environmental Protection Agency's (EPA) Chemistry Laboratory (ECL) is a national program laboratory specializing in residue chemistry analysis under the jurisdiction of the EPA's Office of Pesticide Programs in Washington, D.C. At Stennis Space Center, the laboratory's work supports many federal anti-pollution laws. The laboratory analyzes environmental and human samples to determine the presence and amount of agricultural chemicals and related substances. Pictured, ECL chemists analyze environmental and human samples for the presence of pesticides and other pollutants.

  9. Combining a Laboratory Practical Class with a Computer Simulation: Studies on the Synthesis of Urea in Isolated Hepatocytes.

    ERIC Educational Resources Information Center

    Bender, David A.

    1986-01-01

    Describes how a computer simulation is used with a laboratory experiment on the synthesis of urea in isolated hepatocytes. The simulation calculates the amount of urea formed and the amount of ammonium remaining as the concentrations of ornithine, citrulline, argininosuccinate, arginine, and aspartate are altered. (JN)

  10. Sandia National Laboratories: Sandia National Laboratories: News: Events

    Science.gov Websites

    Programs Nuclear Weapons About Nuclear Weapons Safety & Security Weapons Science & Technology Robotics R&D 100 Awards Laboratory Directed Research & Development Technology Deployment Centers Audit Sandia's Economic Impact Licensing & Technology Transfer Browse Technology Portfolios

  11. Storage and network bandwidth requirements through the year 2000 for the NASA Center for Computational Sciences

    NASA Technical Reports Server (NTRS)

    Salmon, Ellen

    1996-01-01

    The data storage and retrieval demands of space and Earth sciences researchers have made the NASA Center for Computational Sciences (NCCS) Mass Data Storage and Delivery System (MDSDS) one of the world's most active Convex UniTree systems. Science researchers formed the NCCS's Computer Environments and Research Requirements Committee (CERRC) to relate their projected supercomputing and mass storage requirements through the year 2000. Using the CERRC guidelines and observations of current usage, some detailed projections of requirements for MDSDS network bandwidth and mass storage capacity and performance are presented.

  12. A Urinalysis Result Reporting System for a Clinical Laboratory

    PubMed Central

    Sullivan, James E.; Plexico, Perry S.; Blank, David W.

    1987-01-01

    A menu driven Urinalysis Result Reporting System based on multiple IBM-PC Workstations connected together by a local area network was developed for the Clinical Chemistry Section of the Clinical Pathology Department at the National Institutes of Health's Clinical Center. Two Network File Servers redundantly save the test results of each urine specimen. When all test results for a specimen are entered into the system, the results are transmitted to the Department's Laboratory Computer System where they are made available to the ordering physician. The Urinalysis Data Management System has proven easy to learn and use.

  13. 78 FR 44954 - Clinical Laboratory Improvement Advisory Committee (CLIAC)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-25

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES Centers for Disease Control and Prevention Clinical... and Human Services; the Assistant Secretary for Health; the Director, Centers for Disease Control and... laboratory quality and laboratory [[Page 44955

  14. From the Telescope to the Laboratory and Back Again: The Center for Astrophysical Plasma Properties

    NASA Astrophysics Data System (ADS)

    Houston Montgomery, Michael; Winget, Don; Schaeuble, Marc; Hawkins, Keith; Wheeler, Craig

    2018-01-01

    The Center for Astrophysical Plasma Properties (CAPP) is a new center focusing on the spectroscopic properties of stars and accretion disks using “at-parameter” experiments. Currently, these experiments use the X-ray output of the Z machine at Sandia National Laboratories—the largest X-ray source in the world—to heat plasmas to the same conditions (temperature, density, and radiation environment) as those observed in astronomical objects. Current experiments include measuring (1) density-dependent opacities of iron-peak elements at solar interior conditions, (2) spectral lines of low-Z elements at white dwarf photospheric conditions, (3) atomic population kinetics of neon in a radiation-dominated environment, and (4) resonant Auger destruction (RAD) of silicon at accretion disk conditions around supermassive black holes. We will be moving to new astrophysical environments and additional experimental facilities, such as the National Ignition Facility (NIF) and the OMEGA facility at the Laboratory for Laser Energetics (LLE). We seek students and collaborators to work on these experiments as well as the calculations that complement them. CAPP has funding for 5 years and can support up to six graduate students and three post-docs.

  15. A hypothesis on the formation of the primary ossification centers in the membranous neurocranium: a mathematical and computational model.

    PubMed

    Garzón-Alvarado, Diego A

    2013-01-21

    This article develops a model of the appearance and location of the primary centers of ossification in the calvaria. The model uses a system of reaction-diffusion equations of two molecules (BMP and Noggin) whose behavior is of type activator-substrate and its solution produces Turing patterns, which represents the primary ossification centers. Additionally, the model includes the level of cell maturation as a function of the location of mesenchymal cells. Thus the mature cells can become osteoblasts due to the action of BMP2. Therefore, with this model, we can have two frontal primary centers, two parietal, and one, two or more occipital centers. The location of these centers in the simplified computational model is highly consistent with those centers found at an embryonic level. Copyright © 2012 Elsevier Ltd. All rights reserved.
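
    To make the activator-substrate mechanism concrete, the following Python sketch integrates a minimal one-dimensional reaction-diffusion system of the activator-depleted-substrate type with an explicit Euler scheme. It is a generic illustration of this class of model, not the article's BMP/Noggin system; all parameters, the grid, and the interpretation of activator peaks as candidate ossification centers are assumptions chosen only for demonstration.

        # A minimal 1-D activator-substrate (depletion-type) reaction-diffusion sketch
        # of the Turing mechanism described above. It is a generic toy model, not the
        # article's BMP/Noggin system; every parameter here is assumed for illustration.
        import numpy as np

        n, steps, dt, dx = 200, 30000, 0.01, 1.0
        Da, Ds = 0.05, 1.0                 # activator diffuses slowly, substrate quickly
        rho, mu, sigma = 0.1, 0.2, 0.4     # autocatalysis, activator decay, substrate feed

        rng = np.random.default_rng(0)
        a = 2.0 + 0.01 * rng.standard_normal(n)   # activator, near its uniform steady state
        s = 1.0 + 0.01 * rng.standard_normal(n)   # substrate, near its uniform steady state

        def laplacian(u):
            # nearest-neighbour Laplacian on a periodic 1-D grid
            return (np.roll(u, 1) + np.roll(u, -1) - 2.0 * u) / dx**2

        for _ in range(steps):
            reaction = rho * a * a * s     # autocatalytic production that consumes substrate
            a = a + dt * (Da * laplacian(a) + reaction - mu * a)
            s = s + dt * (Ds * laplacian(s) - reaction + sigma)

        # Local maxima of the activator mark candidate "centers" in this toy setting.
        peaks = np.where((a > np.roll(a, 1)) & (a > np.roll(a, -1)) & (a > a.mean()))[0]
        print("activator peaks at grid cells:", peaks)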

  16. The role of total laboratory automation in a consolidated laboratory network.

    PubMed

    Seaberg, R S; Stallone, R O; Statland, B E

    2000-05-01

    In an effort to reduce overall laboratory costs and improve overall laboratory efficiencies at all of its network hospitals, the North Shore-Long Island Health System recently established a Consolidated Laboratory Network with a Core Laboratory at its center. We established and implemented a centralized Core Laboratory designed around the Roche/Hitachi CLAS Total Laboratory Automation system to perform the general and esoteric laboratory testing throughout the system in a timely and cost-effective fashion. All remaining STAT testing will be performed within the Rapid Response Laboratories (RRLs) at each of the system's hospitals. Results for this laboratory consolidation and implementation effort demonstrated a decrease in labor costs and improved turnaround time (TAT) at the core laboratory. Anticipated system savings are approximately $2.7 million. TATs averaged 1.3 h within the Core Laboratory and less than 30 min in the RRLs. When properly implemented, automation systems can reduce overall laboratory expenses, enhance patient services, and address the overall concerns facing the laboratory today: job satisfaction, decreased length of stay, and safety. The financial savings realized are primarily a result of labor reductions.

  17. Simulated Laboratory in Digital Logic.

    ERIC Educational Resources Information Center

    Cleaver, Thomas G.

    Design of computer circuits used to be a pencil and paper task followed by laboratory tests, but logic circuit design can now be done in half the time as the engineer accesses a program which simulates the behavior of real digital circuits, and does all the wiring and testing on his computer screen. A simulated laboratory in digital logic has been…

  18. Alternative treatment technology information center computer database system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sullivan, D.

    1995-10-01

    The Alternative Treatment Technology Information Center (ATTIC) computer database system was developed pursuant to the 1986 Superfund law amendments. It provides up-to-date information on innovative treatment technologies to clean up hazardous waste sites. ATTIC v2.0 provides access to several independent databases as well as a mechanism for retrieving full-text documents of key literature. It can be accessed with a personal computer and modem 24 hours a day, and there are no user fees. ATTIC provides "one-stop shopping" for information on alternative treatment options by accessing several databases: (1) treatment technology database; this contains abstracts from the literature on all types of treatment technologies, including biological, chemical, physical, and thermal methods. The best literature as viewed by experts is highlighted. (2) treatability study database; this provides performance information on technologies to remove contaminants from wastewaters and soils. It is derived from treatability studies. This database is available through ATTIC or separately as a disk that can be mailed to you. (3) underground storage tank database; this presents information on underground storage tank corrective actions, surface spills, emergency response, and remedial actions. (4) oil/chemical spill database; this provides abstracts on treatment and disposal of spilled oil and chemicals. In addition to these separate databases, ATTIC allows immediate access to other disk-based systems such as the Vendor Information System for Innovative Treatment Technologies (VISITT) and the Bioremediation in the Field Search System (BFSS). The user may download these programs to their own PC via a high-speed modem. Also via modem, users are able to download entire documents through the ATTIC system. Currently, about fifty publications are available, including Superfund Innovative Technology Evaluation (SITE) program documents.

  19. Thermal-Structures and Materials Testing Laboratory

    NASA Technical Reports Server (NTRS)

    Teate, Anthony A.

    1997-01-01

    Since its inception and successful implementation in 1997 at James Madison University, the Thermal Structures and Materials Testing Laboratory (T-SaMTL), funded by the NASA Langley Research Center, is evolving into one of the University's premier and exemplary efforts to increase minority representation in the sciences and mathematics. Serving ten (10) students and faculty directly and almost fifty (50) students indirectly, T-SaMTL, through its recruitment efforts, workshops, mentoring program, tutorial services, and its research and computational laboratories, has marked the completion of the first year with support from NASA totaling $100,000. Beginning as an innovative academic research and mentoring program for underrepresented minority science and mathematics students, the program now boasts a constituency that consists of 50% graduating seniors in the spring of 1998, with 50% planning to go to graduate school. The program's intent is to increase the number of underrepresented minorities who receive doctoral degrees in the sciences by initiating an academically enriched research program aimed at strengthening the academic and self-actualization skills of undergraduate students with the potential to pursue doctoral study in the sciences. The program provides financial assistance, academic enrichment, and professional and personal development support for minority students who demonstrate the potential and strong desire to pursue careers in the sciences and mathematics. James Madison University was awarded the first $100,000, in April 1997, by the NASA Langley Research Center for establishment and support of its Thermal Structures and Materials Testing

  20. DOE - BES Nanoscale Science Research Centers (NSRCs)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beecher, Cathy Jo

    2016-11-14

    These are slides from a powerpoint shown to guests during tours of Center for Integrated Nanotechnologies (CINT) at Los Alamos National Laboratory. It shows the five DOE-BES nanoscale science research centers (NSRCs), which are located at different national laboratories throughout the country. Then it goes into detail specifically about the Center for Integrated Nanotechnologies at LANL, including statistics on its user community and CINT's New Mexico industrial users.

  1. Smart Computer-Assisted Markets

    NASA Astrophysics Data System (ADS)

    McCabe, Kevin A.; Rassenti, Stephen J.; Smith, Vernon L.

    1991-10-01

    The deregulation movement has motivated the experimental study of auction markets designed for interdependent network industries such as natural gas pipelines or electric power systems. Decentralized agents submit bids to buy commodity and offers to sell transportation and commodity to a computerized dispatch center. Computer algorithms determine prices and allocations that maximize the gains from exchange in the system relative to the submitted bids and offers. The problem is important, because traditionally the scale and coordination economies in such industries were thought to require regulation. Laboratory experiments are used to study feasibility, limitations, incentives, and performance of proposed market designs for deregulation, providing motivation for new theory.
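
    As a highly simplified illustration of the dispatch-center idea, the Python sketch below clears a single-commodity double auction by matching the highest bids with the lowest offers, which maximizes the gains from exchange in this stripped-down setting without transportation constraints. The function name and the bid/offer data are illustrative assumptions, not part of the experiments described.

        # A highly simplified sketch of the clearing step described above: a single
        # commodity and no transportation network. Matching the highest bids with the
        # lowest offers maximizes gains from exchange in this stripped-down setting.
        # The function name and all prices/quantities are assumed for illustration.

        def clear_market(bids, offers):
            """bids/offers are (price, quantity) pairs; returns trades and total surplus."""
            bids = sorted(([p, q] for p, q in bids), key=lambda b: -b[0])
            offers = sorted(([p, q] for p, q in offers), key=lambda o: o[0])
            trades, surplus, bi, oi = [], 0.0, 0, 0
            while bi < len(bids) and oi < len(offers) and bids[bi][0] >= offers[oi][0]:
                qty = min(bids[bi][1], offers[oi][1])
                surplus += (bids[bi][0] - offers[oi][0]) * qty
                trades.append((bids[bi][0], offers[oi][0], qty))
                bids[bi][1] -= qty
                offers[oi][1] -= qty
                if bids[bi][1] == 0:
                    bi += 1
                if offers[oi][1] == 0:
                    oi += 1
            return trades, surplus

        buy = [(42.0, 10), (38.0, 5), (30.0, 8)]    # (price, quantity) bids to buy (assumed)
        sell = [(25.0, 6), (33.0, 9), (40.0, 4)]    # (price, quantity) offers to sell (assumed)
        trades, total_gain = clear_market(buy, sell)
        print(trades, "total gains from exchange:", total_gain)

    The experimental environments described above additionally involve offers to sell transportation on a network, so the real allocation problem is an optimization over both commodity and transport rather than this simple sort-and-match.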

  2. The National Program of Educational Laboratories. Final Report.

    ERIC Educational Resources Information Center

    Chase, Francis S.

    This report presents results of a critical analysis of 20 regional educational laboratories and nine university research and development centers established under ESEA Title IV. Observations, supported by specific examples, are made concerning the laboratories and centers and deal with their roles, programs definitions, impact on educational…

  3. Laboratory automation: trajectory, technology, and tactics.

    PubMed

    Markin, R S; Whalen, S A

    2000-05-01

    Laboratory automation is in its infancy, following a path parallel to the development of laboratory information systems in the late 1970s and early 1980s. Changes on the horizon in healthcare and clinical laboratory service that affect the delivery of laboratory results include the increasing age of the population in North America, the implementation of the Balanced Budget Act (1997), and the creation of disease management companies. Major technology drivers include outcomes optimization and phenotypically targeted drugs. Constant cost pressures in the clinical laboratory have forced diagnostic manufacturers into less than optimal profitability states. Laboratory automation can be a tool for the improvement of laboratory services and may decrease costs. The key to improvement of laboratory services is implementation of the correct automation technology. The design of this technology should be driven by required functionality. Automation design issues should be centered on the understanding of the laboratory and its relationship to healthcare delivery and the business and operational processes in the clinical laboratory. Automation design philosophy has evolved from a hardware-based approach to a software-based approach. Process control software to support repeat testing, reflex testing, and transportation management, and overall computer-integrated manufacturing approaches to laboratory automation implementation are rapidly expanding areas. It is clear that hardware and software are functionally interdependent and that the interface between the laboratory automation system and the laboratory information system is a key component. The cost-effectiveness of automation solutions suggested by vendors, however, has been difficult to evaluate because the number of automation installations are few and the precision with which operational data have been collected to determine payback is suboptimal. The trend in automation has moved from total laboratory automation to a

  4. RSF Data Center Tour

    ScienceCinema

    Powers, Chuck

    2017-12-11

    The Data Center in the Research Support Facility on the campus of the U.S. Department of Energy's National Renewable Energy Laboratory (NREL) marks a significant accomplishment in its ultra-efficiency. Data centers by nature are very energy intensive. The RSF Data Center was designed to use 80% less energy than NREL's old data center, which had been in use for the last 30 years. This tour takes you through the data center highlighting its energy saving techniques.

  5. Center for Defect Physics - Energy Frontier Research Center (A "Life at the Frontiers of Energy Research" contest entry from the 2011 Energy Frontier Research Centers (EFRCs) Summit and Forum)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stocks, G. Malcolm; Ice, Gene

    "Center for Defect Physics - Energy Frontier Research Center" was submitted by the Center for Defect Physics (CDP) to the "Life at the Frontiers of Energy Research" video contest at the 2011 Science for Our Nation's Energy Future: Energy Frontier Research Centers (EFRCs) Summit and Forum. Twenty-six EFRCs created short videos to highlight their mission and their work. CDP is directed by G. Malcolm Stocks at Oak Ridge National Laboratory, and is a partnership of scientists from eight institutions: Oak Ridge National Laboratory (lead); Ames Laboratory; University of California, Berkeley; Carnegie Mellon University; University of Illinois, Urbana-Champaign; Ohio State University; University of Georgia; and University of Tennessee. The Office of Basic Energy Sciences in the U.S. Department of Energy's Office of Science established the 46 Energy Frontier Research Centers (EFRCs) in 2009. These collaboratively-organized centers conduct fundamental research focused on 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The overall purpose is to accelerate scientific progress toward meeting the nation's critical energy challenges.

  6. Laboratory x-ray micro-computed tomography: a user guideline for biological samples

    PubMed Central

    2017-01-01

    Laboratory x-ray micro–computed tomography (micro-CT) is a fast-growing method in scientific research applications that allows for non-destructive imaging of morphological structures. This paper provides an easily operated “how to” guide for new potential users and describes the various steps required for successful planning of research projects that involve micro-CT. Background information on micro-CT is provided, followed by relevant setup, scanning, reconstructing, and visualization methods and considerations. Throughout the guide, a Jackson's chameleon specimen, which was scanned at different settings, is used as an interactive example. The ultimate aim of this paper is to make new users familiar with the concepts and applications of micro-CT in an attempt to promote its use in future scientific studies. PMID:28419369

  7. High End Computer Network Testbedding at NASA Goddard Space Flight Center

    NASA Technical Reports Server (NTRS)

    Gary, James Patrick

    1998-01-01

    The Earth & Space Data Computing (ESDC) Division at the Goddard Space Flight Center develops and demonstrates various high-end computer networking capabilities. The ESDC operates several high-end supercomputers, which are used (1) to run computer simulations of the climate system, (2) to support the Earth and Space Sciences (ESS) project, and (3) to support Grand Challenge (GC) science aimed at understanding turbulent convection and dynamos in stars. GC research occurs at many sites throughout the country and is enabled, in part, by multiple high-performance network interconnections. The application drivers for high-end computer networking use distributed supercomputing to support virtual reality applications such as TerraVision (a three-dimensional browser of remotely accessed data) and Cave Automatic Virtual Environments (CAVEs). Workstations can access and display data from multiple CAVEs with video servers, which allows group/project collaborations using a combination of video, data, voice, and shared whiteboarding. The ESDC is also developing and demonstrating a high degree of interoperability between satellite and terrestrial networks; to this end, it is conducting research and evaluations of new computer networking protocols and related technologies that improve that interoperability. The ESDC is also involved in the Security Proof of Concept Keystone (SPOCK) program sponsored by the National Security Agency (NSA), which provides a forum for government users and security technology providers to share information on security requirements, emerging technologies, and new product developments. Also, the ESDC is involved in the Trans-Pacific Digital Library Experiment, which aims to demonstrate and evaluate the use of high-performance satellite communications and advanced data communications protocols to enable interactive digital library data

  8. Advances in Engine Test Capabilities at the NASA Glenn Research Center's Propulsion Systems Laboratory

    NASA Technical Reports Server (NTRS)

    Pachlhofer, Peter M.; Panek, Joseph W.; Dicki, Dennis J.; Piendl, Barry R.; Lizanich, Paul J.; Klann, Gary A.

    2006-01-01

    The Propulsion Systems Laboratory at the National Aeronautics and Space Administration (NASA) Glenn Research Center is one of the premier U.S. facilities for research on advanced aeropropulsion systems. The facility can simulate a wide range of altitude and Mach number conditions while supplying the aeropropulsion system with all the support services necessary to operate at those conditions. Test data are recorded on a combination of steady-state and high-speed data-acquisition systems. Recently a number of upgrades were made to the facility to meet demanding new requirements for the latest aeropropulsion concepts and to improve operational efficiency. Improvements were made to data-acquisition systems, facility and engine-control systems, test-condition simulation systems, video capture and display capabilities, and personnel training procedures. This paper discusses the facility's capabilities, recent upgrades, and planned future improvements.

  9. Simulations of Laboratory Astrophysics Experiments using the CRASH code

    NASA Astrophysics Data System (ADS)

    Trantham, Matthew; Kuranz, Carolyn; Manuel, Mario; Keiter, Paul; Drake, R. P.

    2014-10-01

    Computer simulations can assist in the design and analysis of laboratory astrophysics experiments. The Center for Radiative Shock Hydrodynamics (CRASH) at the University of Michigan developed a code that has been used to design and analyze high-energy-density experiments on OMEGA, NIF, and other large laser facilities. This Eulerian code uses block-adaptive mesh refinement (AMR) with implicit multigroup radiation transport, electron heat conduction and laser ray tracing. This poster/talk will demonstrate some of the experiments the CRASH code has helped design or analyze including: Kelvin-Helmholtz, Rayleigh-Taylor, imploding bubbles, and interacting jet experiments. This work is funded by the Predictive Sciences Academic Alliances Program in NNSA-ASC via Grant DEFC52-08NA28616, by the NNSA-DS and SC-OFES Joint Program in High-Energy-Density Laboratory Plasmas, Grant Number DE-NA0001840, and by the National Laser User Facility Program, Grant Number DE-NA0000850.

  10. Controlling Laboratory Processes From A Personal Computer

    NASA Technical Reports Server (NTRS)

    Will, H.; Mackin, M. A.

    1991-01-01

    Computer program provides natural-language process control from IBM PC or compatible computer. Sets up process-control system that either runs without an operator or is run by workers who have limited programming skills. Includes three smaller programs. Two of them, written in FORTRAN 77, record data and control research processes. Third program, written in Pascal, generates FORTRAN subroutines used by other two programs to identify user commands with device-driving routines written by user. Also includes set of input data allowing user to define user commands to be executed by computer. Requires personal computer operating under MS-DOS with suitable hardware interfaces to all controlled devices. Also requires FORTRAN 77 compiler and device drivers written by user.
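
    A sketch of the command-dispatch idea in modern terms: user-defined command names are bound to user-supplied device-driving routines, so the control loop itself needs no knowledge of the devices. This is not the original FORTRAN 77/Pascal code; all command and device names are hypothetical.

    ```python
    # Hypothetical command table mapping natural-language commands to
    # user-written device-driving routines.
    def open_valve():
        print("valve opened")

    def read_temperature():
        print("temperature read")

    COMMANDS = {
        "OPEN VALVE": open_valve,
        "READ TEMP": read_temperature,
    }

    def execute(command_line: str) -> None:
        handler = COMMANDS.get(command_line.strip().upper())
        if handler is None:
            print(f"unknown command: {command_line!r}")
        else:
            handler()

    execute("open valve")
    execute("read temp")
    ```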

  11. Computer-assisted enzyme immunoassays and simplified immunofluorescence assays: applications for the diagnostic laboratory and the veterinarian's office.

    PubMed

    Jacobson, R H; Downing, D R; Lynch, T J

    1982-11-15

    A computer-assisted enzyme-linked immunosorbent assay (ELISA) system, based on kinetics of the reaction between substrate and enzyme molecules, was developed for testing large numbers of sera in laboratory applications. Systematic and random errors associated with the conventional ELISA technique were identified, leading to results formulated on a statistically validated, objective, and standardized basis. In a parallel development, an inexpensive system for field and veterinary office applications contained many of the qualities of the computer-assisted ELISA. This system uses a fluorogenic indicator (rather than the enzyme-substrate interaction) in a rapid test (15 to 20 minutes' duration) which promises broad application in serodiagnosis.
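
    A sketch of the kinetic principle behind such a computer-assisted ELISA: the reaction rate is estimated as the least-squares slope of absorbance readings taken over time, rather than from a single endpoint reading. The readings below are illustrative only.

    ```python
    # Estimate the initial reaction rate as the slope of absorbance vs. time.
    times = [0, 30, 60, 90, 120]                 # seconds (illustrative)
    absorbance = [0.05, 0.11, 0.18, 0.24, 0.31]  # optical density readings

    n = len(times)
    mean_t = sum(times) / n
    mean_a = sum(absorbance) / n
    slope = sum((t - mean_t) * (a - mean_a) for t, a in zip(times, absorbance)) \
            / sum((t - mean_t) ** 2 for t in times)

    print(f"reaction rate ~ {slope:.5f} absorbance units per second")
    ```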

  12. The Benefits of Making Data from the EPA National Center for Computational Toxicology available for reuse (ACS Fall meeting 3 of 12)

    EPA Science Inventory

    Researchers at EPA’s National Center for Computational Toxicology (NCCT) integrate advances in biology, chemistry, exposure and computer science to help prioritize chemicals for further research based on potential human health risks. The goal of this research is to quickly evalua...

  13. Ethics, Identity, and Political Vision: Toward a Justice-Centered Approach to Equity in Computer Science Education

    ERIC Educational Resources Information Center

    Vakil, Sepehr

    2018-01-01

    In this essay, Sepehr Vakil argues that a more serious engagement with critical traditions in education research is necessary to achieve a justice-centered approach to equity in computer science (CS) education. With CS rapidly emerging as a distinct feature of K-12 public education in the United States, calls to expand CS education are often…

  14. High Tech Programmers in Low-Income Communities: Creating a Computer Culture in a Community Technology Center

    NASA Astrophysics Data System (ADS)

    Kafai, Yasmin B.; Peppler, Kylie A.; Chiu, Grace M.

    For the last twenty years, issues of the digital divide have driven efforts around the world to address the lack of access to computers and the Internet, pertinent and language-appropriate content, and technical skills in low-income communities (Schuler & Day, 2004a, 2004b). The title of our paper makes reference to a milestone publication (Schon, Sanyal, & Mitchell, 1998) that showcased some of the early work and thinking in this area. Schon, Sanyal, and Mitchell's edited volume included an article outlining the Computer Clubhouse (1998), a community technology center model developed to create opportunities for youth in low-income communities to become creators and designers of technologies. The model has been very successful in scaling up, with over 110 Computer Clubhouses now in existence worldwide.

  15. Universal computer test stand (recommended computer test requirements). [for space shuttle computer evaluation

    NASA Technical Reports Server (NTRS)

    1973-01-01

    Techniques are considered which would be used to characterize aerospace computers, with the space shuttle application as the end usage. The system-level digital problems which have been encountered and documented are surveyed. From the large cross section of tests, an optimum set is recommended that has a high probability of discovering documented system-level digital problems within laboratory environments. A baseline hardware/software system is defined that is required as a laboratory tool to test aerospace computers. Hardware and software baselines and additions necessary to interface the UTE to aerospace computers for test purposes are outlined.

  16. Computer Center: Setting Up a Microcomputer Center--1 Person's Perspective.

    ERIC Educational Resources Information Center

    Duhrkopf, Richard, Ed.; Collins, Michael, A. J., Ed.

    1988-01-01

    Discusses eight components to consider in setting up a microcomputer center for use with college classes: hardware, software, physical facility, furniture, technical support, personnel, continuing financial expenditures, and security. (CW)

  17. MIT Lincoln Laboratory Annual Report 2010

    DTIC Science & Technology

    2010-01-01

    Research and Development Center (FFRDC) and a DoD Research and Development Laboratory. The Laboratory conducts research and development pertinent to...year, the Laboratory restructured three divisions to focus research and development in areas that are increasingly important to the nation...the Director...Collaborations with MIT campus continue to grow, leveraging the strengths of researchers at both the Laboratory and campus. The

  18. A practical VEP-based brain-computer interface.

    PubMed

    Wang, Yijun; Wang, Ruiping; Gao, Xiaorong; Hong, Bo; Gao, Shangkai

    2006-06-01

    This paper introduces the development of a practical brain-computer interface at Tsinghua University. The system uses frequency-coded steady-state visual evoked potentials to determine the gaze direction of the user. To ensure more universal applicability of the system, approaches for reducing the effect of user variation on system performance have been proposed. The information transfer rate (ITR) has been evaluated both in the laboratory and at the Rehabilitation Center of China. The system has proved applicable to >90% of people with a high ITR in living environments.
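
    The abstract reports an information transfer rate (ITR); a common way to compute ITR for an N-target BCI is the Wolpaw formula, sketched below. The target count, accuracy, and selection time are placeholders, not the paper's parameters.

    ```python
    import math

    # Wolpaw ITR: bits per selection scaled to bits per minute.
    def itr_bits_per_min(n_targets: int, accuracy: float, seconds_per_selection: float) -> float:
        p = accuracy
        bits = math.log2(n_targets)
        if 0 < p < 1:
            bits += p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n_targets - 1))
        return bits * (60.0 / seconds_per_selection)

    print(f"{itr_bits_per_min(12, 0.90, 4.0):.1f} bits/min")  # illustrative values
    ```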

  19. Energy 101: Energy Efficient Data Centers

    ScienceCinema

    None

    2018-04-16

    Data centers provide mission-critical computing functions vital to the daily operation of top U.S. economic, scientific, and technological organizations. These data centers consume large amounts of energy to run and maintain their computer systems, servers, and associated high-performance components—up to 3% of all U.S. electricity powers data centers. And as more information comes online, data centers will consume even more energy. Data centers can become more energy efficient by incorporating features like power-saving "stand-by" modes, energy monitoring software, and efficient cooling systems instead of energy-intensive air conditioners. These and other efficiency improvements to data centers can produce significant energy savings, reduce the load on the electric grid, and help protect the nation by increasing the reliability of critical computer operations.

  20. The Computer Bulletin Board.

    ERIC Educational Resources Information Center

    Batt, Russell H., Ed.

    1990-01-01

    Four applications of microcomputers in the chemical laboratory are presented. Included are "Mass Spectrometer Interface with an Apple II Computer,""Interfacing the Spectronic 20 to a Computer,""A pH-Monitoring and Control System for Teaching Laboratories," and "A Computer-Aided Optical Melting Point Device." Software, instrumentation, and uses are…

  1. Argonne Research Library | Argonne National Laboratory

    Science.gov Websites

  2. Software process improvement in the NASA software engineering laboratory

    NASA Technical Reports Server (NTRS)

    Mcgarry, Frank; Pajerski, Rose; Page, Gerald; Waligora, Sharon; Basili, Victor; Zelkowitz, Marvin

    1994-01-01

    The Software Engineering Laboratory (SEL) was established in 1976 for the purpose of studying and measuring software processes with the intent of identifying improvements that could be applied to the production of ground support software within the Flight Dynamics Division (FDD) at the National Aeronautics and Space Administration (NASA)/Goddard Space Flight Center (GSFC). The SEL has three member organizations: NASA/GSFC, the University of Maryland, and Computer Sciences Corporation (CSC). The concept of process improvement within the SEL focuses on the continual understanding of both process and product as well as goal-driven experimentation and analysis of process change within a production environment.

  3. Introducing Computational Approaches in Intermediate Mechanics

    NASA Astrophysics Data System (ADS)

    Cook, David M.

    2006-12-01

    In the winter of 2003, we at Lawrence University moved Lagrangian mechanics and rigid body dynamics from a required sophomore course to an elective junior/senior course, freeing 40% of the time for computational approaches to ordinary differential equations (trajectory problems, the large amplitude pendulum, non-linear dynamics); evaluation of integrals (finding centers of mass and moment of inertia tensors, calculating gravitational potentials for various sources); and finding eigenvalues and eigenvectors of matrices (diagonalizing the moment of inertia tensor, finding principal axes), and to generating graphical displays of computed results. Further, students begin to use LaTeX to prepare some of their submitted problem solutions. Placed in the middle of the sophomore year, this course provides the background that permits faculty members as appropriate to assign computer-based exercises in subsequent courses. Further, students are encouraged to use our Computational Physics Laboratory on their own initiative whenever that use seems appropriate. (Curricular development supported in part by the W. M. Keck Foundation, the National Science Foundation, and Lawrence University.)
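
    One of the computational exercises named above, the large-amplitude pendulum, as a short sketch: theta'' = -(g/L) sin(theta) integrated with a hand-rolled fourth-order Runge-Kutta step. Parameters and step size are illustrative.

    ```python
    import math

    g, L = 9.81, 1.0  # gravitational acceleration (m/s^2), pendulum length (m)

    def deriv(state):
        theta, omega = state
        return (omega, -(g / L) * math.sin(theta))

    def rk4_step(state, dt):
        k1 = deriv(state)
        k2 = deriv((state[0] + 0.5 * dt * k1[0], state[1] + 0.5 * dt * k1[1]))
        k3 = deriv((state[0] + 0.5 * dt * k2[0], state[1] + 0.5 * dt * k2[1]))
        k4 = deriv((state[0] + dt * k3[0], state[1] + dt * k3[1]))
        return (state[0] + dt / 6 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]),
                state[1] + dt / 6 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]))

    state, dt = (2.5, 0.0), 0.01  # released from rest at 2.5 rad
    for _ in range(300):
        state = rk4_step(state, dt)
    print(f"theta after {300 * dt:.1f} s: {state[0]:.3f} rad")
    ```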

  4. Using Computational Modeling to Assess the Impact of Clinical Decision Support on Cancer Screening within Community Health Centers

    PubMed Central

    Carney, Timothy Jay; Morgan, Geoffrey P.; Jones, Josette; McDaniel, Anna M.; Weaver, Michael; Weiner, Bryan; Haggstrom, David A.

    2014-01-01

    Our conceptual model demonstrates our goal to investigate the impact of clinical decision support (CDS) utilization on cancer screening improvement strategies in the community health care (CHC) setting. We employed a dual modeling technique using both statistical and computational modeling to evaluate impact. Our statistical model used the Spearman’s Rho test to evaluate the strength of relationship between our proximal outcome measures (CDS utilization) and our distal outcome measure (provider self-reported cancer screening improvement). Our computational model relied on network evolution theory and made use of a tool called Construct-TM to model the use of CDS measured by the rate of organizational learning. We used previously collected survey data from the community health centers' Cancer Health Disparities Collaborative (HDCC). Our intent is to demonstrate the added value gained by using a computational modeling tool in conjunction with a statistical analysis when evaluating the impact of a health information technology, in the form of CDS, on health care quality process outcomes such as facility-level screening improvement. Significant simulated disparities in organizational learning over time were observed between community health centers beginning the simulation with high and low clinical decision support capability. PMID:24953241
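
    The statistical half of the dual-modeling approach is a Spearman rank correlation between the proximal measure (CDS utilization) and the distal measure (self-reported screening improvement). The sketch below uses made-up, already-ranked scores with no ties, so the classic d-squared formula applies.

    ```python
    # Spearman's rho from rank differences (no ties).
    cds_utilization_rank = [1, 3, 2, 5, 4, 6, 7]        # illustrative ranks
    screening_improvement_rank = [2, 3, 1, 5, 6, 4, 7]  # illustrative ranks

    n = len(cds_utilization_rank)
    d_squared = sum((x - y) ** 2
                    for x, y in zip(cds_utilization_rank, screening_improvement_rank))
    rho = 1 - 6 * d_squared / (n * (n ** 2 - 1))
    print(f"Spearman's rho = {rho:.2f}")
    ```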

  5. Final Report for "Implimentation and Evaluation of Multigrid Linear Solvers into Extended Magnetohydrodynamic Codes for Petascale Computing"

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Srinath Vadlamani; Scott Kruger; Travis Austin

    Extended magnetohydrodynamic (MHD) codes are used to model the large, slow-growing instabilities that are projected to limit the performance of the International Thermonuclear Experimental Reactor (ITER). The multiscale nature of the extended MHD equations requires an implicit approach. The current linear solvers needed for the implicit algorithm scale poorly because the resultant matrices are so ill-conditioned. A new solver is needed, especially one that scales to the petascale. The most successful scalable parallel processor solvers to date are multigrid solvers. Applying multigrid techniques to a set of equations whose fundamental modes are dispersive waves is a promising solution to CEMM problems. For Phase 1, we implemented multigrid preconditioners from the HYPRE project of the Center for Applied Scientific Computing at LLNL via PETSc of the DOE SciDAC TOPS for the real matrix systems of the extended MHD code NIMROD, which is one of the primary modeling codes of the OFES-funded Center for Extended Magnetohydrodynamic Modeling (CEMM) SciDAC. We implemented the multigrid solvers on the fusion test problem that allows for real matrix systems with success, and in the process learned about the details of NIMROD data structures and the difficulties of inverting NIMROD operators. The further success of this project will allow for efficient usage of future petascale computers at the National Leadership Facilities: Oak Ridge National Laboratory, Argonne National Laboratory, and National Energy Research Scientific Computing Center. The project will be a collaborative effort between computational plasma physicists and applied mathematicians at Tech-X Corporation, applied mathematicians at Front Range Scientific Computations, Inc. (who are collaborators on the HYPRE project), and other computational plasma physicists involved with the CEMM project.

  6. Special Education Teacher Computer Literacy Training. Project STEEL. A Special Project To Develop and Implement a Computer-Based Special Teacher Education and Evaluation Laboratory. Volume II. Final Report.

    ERIC Educational Resources Information Center

    Frick, Theodore W.; And Others

    The document is part of the final report on Project STEEL (Special Teacher Education and Evaluation Laboratory) intended to extend the utilization of technology in the training of preservice special education teachers. This volume focuses on the second of four project objectives, the development of a special education teacher computer literacy…

  7. Computational Analyses of Offset Stream Nozzles for Noise Reduction

    NASA Technical Reports Server (NTRS)

    Dippold, Vance, III; Foster, Lancert; Wiese, Michael

    2007-01-01

    The Wind computational fluid dynamics code was used to perform a series of simulations on two offset stream nozzle concepts for jet noise reduction. The first concept used an S-duct to direct the secondary stream to the lower side of the nozzle. The second concept used vanes to turn the secondary flow downward. The analyses were completed in preparation for tests conducted in the NASA Glenn Research Center Aeroacoustic Propulsion Laboratory. The offset stream nozzles demonstrated good performance and reduced the amount of turbulence on the lower side of the jet plume. The computer analyses proved instrumental in guiding the development of the final test configurations and giving insight into the flow mechanics of offset stream nozzles. The computational predictions were compared with flowfield results from the jet rig testing and showed excellent agreement.

  8. Manual on characteristics of Landsat computer-compatible tapes produced by the EROS Data Center digital image processing system

    USGS Publications Warehouse

    Holkenbrink, Patrick F.

    1978-01-01

    Landsat data are received by National Aeronautics and Space Administration (NASA) tracking stations and converted into digital form on high-density tapes (HDTs) by the Image Processing Facility (IPF) at the Goddard Space Flight Center (GSFC), Greenbelt, Maryland. The HDTs are shipped to the EROS Data Center (EDC) where they are converted into customer products by the EROS Data Center digital image processing system (EDIPS). This document describes in detail one of these products: the computer-compatible tape (CCT) produced from Landsat-1, -2, and -3 multispectral scanner (MSS) data and Landsat-3 only return-beam vidicon (RBV) data. Landsat-1 and -2 RBV data will not be processed by IPF/EDIPS to CCT format.

  9. FORTRAN plotting subroutines for the space plasma laboratory

    NASA Technical Reports Server (NTRS)

    Williams, R.

    1983-01-01

    The computer program known as PLOTRW was custom-made to satisfy some of the graphics requirements for the data collected in the Space Plasma Laboratory at the Johnson Space Center (JSC). The general requirements for the program were as follows: (1) all subroutines shall be callable through a FORTRAN source program; (2) all graphs shall fill one page and be properly labeled; (3) there shall be options for linear axes and logarithmic axes; (4) each axis shall have tick marks equally spaced with numeric values printed at the beginning tick mark and at the last tick mark; and (5) there shall be three options for plotting. These are: (1) point plot, (2) line plot, and (3) point-line plot. The subroutines were written in FORTRAN IV for the LSI-11 Digital Equipment Corporation (DEC) computer. The program is now operational and can be run on any TEKTRONIX graphics terminal that uses a DEC Real-Time-11 (RT-11) operating system.
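
    A modern analogue of the three plotting options and the linear/logarithmic axis requirement, sketched with matplotlib rather than the original FORTRAN IV on RT-11; the function and argument names are hypothetical.

    ```python
    import matplotlib.pyplot as plt

    def plot_xy(x, y, mode="point-line", logy=False, title="untitled"):
        """Produce a labeled plot in one of three modes: point, line, point-line."""
        styles = {"point": "o", "line": "-", "point-line": "o-"}
        fig, ax = plt.subplots()
        ax.plot(x, y, styles[mode])
        if logy:
            ax.set_yscale("log")
        ax.set_title(title)
        ax.set_xlabel("x")
        ax.set_ylabel("y")
        fig.savefig(f"{title}.png")

    plot_xy([1, 2, 3, 4], [10, 100, 50, 1000], mode="point-line", logy=True, title="demo")
    ```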

  10. Center for Space Microelectronics Technology. 1993 Technical Report

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The 1993 Technical Report of the Jet Propulsion Laboratory Center for Space Microelectronics Technology summarizes the technical accomplishments, publications, presentations, and patents of the Center during the past year. The report lists 170 publications, 193 presentations, and 84 New Technology Reports and patents.

  11. Center of excellence for small robots

    NASA Astrophysics Data System (ADS)

    Nguyen, Hoa G.; Carroll, Daniel M.; Laird, Robin T.; Everett, H. R.

    2005-05-01

    The mission of the Unmanned Systems Branch of SPAWAR Systems Center, San Diego (SSC San Diego) is to provide network-integrated robotic solutions for Command, Control, Communications, Computers, Intelligence, Surveillance, and Reconnaissance (C4ISR) applications, serving and partnering with industry, academia, and other government agencies. We believe the most important criterion for a successful acquisition program is producing a value-added end product that the warfighter needs, uses and appreciates. Through our accomplishments in the laboratory and field, SSC San Diego has been designated the Center of Excellence for Small Robots by the Office of the Secretary of Defense Joint Robotics Program. This paper covers the background, experience, and collaboration efforts by SSC San Diego to serve as the "Impedance-Matching Transformer" between the robotic user and technical communities. Special attention is given to our Unmanned Systems Technology Imperatives for Research, Development, Testing and Evaluation (RDT&E) of Small Robots. Active projects, past efforts, and architectures are provided as success stories for the Unmanned Systems Development Approach.

  12. Image Understanding Research and Its Application to Cartography and Computer-Based Analysis of Aerial Imagery

    DTIC Science & Technology

    1983-09-01

    Report AI-TR-346. Artificial Intelligence Laboratory, Massachusetts Institute of Technology, Cambridge, Massachusetts. June 19...Testbed Coordinator, 415/859-4395, Artificial Intelligence Center, Computer Science and Technology Division. Prepared for: Defense Advanced Research...to support processing of aerial photographs for such military applications as cartography, intelligence, weapon guidance, and targeting. A key

  13. Human Centered Autonomous and Assistant Systems Testbed for Exploration Operations

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Mount, Frances; Carreon, Patricia; Torney, Susan E.

    2001-01-01

    The Engineering and Mission Operations Directorates at NASA Johnson Space Center are combining laboratories and expertise to establish the Human Centered Autonomous and Assistant Systems Testbed for Exploration Operations. This is a testbed for human centered design, development and evaluation of intelligent autonomous and assistant systems that will be needed for human exploration and development of space. This project will improve human-centered analysis, design and evaluation methods for developing intelligent software. This software will support human-machine cognitive and collaborative activities in future interplanetary work environments where distributed computer and human agents cooperate. We are developing and evaluating prototype intelligent systems for distributed multi-agent mixed-initiative operations. The primary target domain is control of life support systems in a planetary base. Technical approaches will be evaluated for use during extended manned tests in the target domain, the Bioregenerative Advanced Life Support Systems Test Complex (BIO-Plex). A spinoff target domain is the International Space Station (ISS) Mission Control Center (MCC). Products of this project include human-centered intelligent software technology, innovative human interface designs, and human-centered software development processes, methods and products. The testbed uses adjustable autonomy software and life support systems simulation models from the Adjustable Autonomy Testbed, to represent operations on the remote planet. Ground operations prototypes and concepts will be evaluated in the Exploration Planning and Operations Center (ExPOC) and Jupiter Facility.

  14. Center for Electrochemical Energy Science | Argonne National Laboratory

    Science.gov Websites

    An Energy Frontier Research Center exploring the electrochemical reactivity of oxide materials and their interfaces under the extreme

  15. Accomplishment Summary 1968-1969. Biological Computer Laboratory.

    ERIC Educational Resources Information Center

    Von Foerster, Heinz; And Others

    This report summarizes theoretical, applied, and experimental studies in the areas of computational principles in complex intelligent systems, cybernetics, multivalued logic, and the mechanization of cognitive processes. This work is summarized under the following topic headings: properties of complex dynamic systems; computers and the language…

  16. Computer laboratory notification system via short message service to reduce health care delays in management of tuberculosis in Taiwan.

    PubMed

    Chen, Tun-Chieh; Lin, Wei-Ru; Lu, Po-Liang; Lin, Chun-Yu; Lin, Shu-Hui; Lin, Chuen-Ju; Feng, Ming-Chu; Chiang, Horn-Che; Chen, Yen-Hsu; Huang, Ming-Shyan

    2011-06-01

    We investigated the impacts of introducing an expedited acid-fast bacilli (AFB) smear laboratory procedure and an automatic, real-time laboratory notification system by short message with mobile phones on delays in prompt isolation of patients with pulmonary tuberculosis (TB). We analyzed the data for all patients with active pulmonary tuberculosis at a hospital in Kaohsiung, Taiwan, a 1,600-bed medical center, during baseline (January 2004 to February 2005) and intervention (July 2005 to August 2006) phases. A total of 96 and 127 patients with AFB-positive TB were reported during the baseline and intervention phases, respectively. There were significant decreases in health care system delays (ie, laboratory delays: reception of sputum to reporting, P < .001; response delays: reporting to patient isolation, P = .045; and interval from admission to patient isolation, P < .001) during the intervention phase. Significantly fewer nurses were exposed to each patient with active pulmonary TB during the intervention phase (P = .039). Implementation of expedited AFB smear laboratory procedures and an automatic, real-time laboratory mobile notification system significantly decreased delays in the diagnosis and isolation of patients with active TB. Copyright © 2011 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Mosby, Inc. All rights reserved.
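
    A minimal sketch of the notification rule described above: when the laboratory system records a positive AFB smear, a short message is pushed to the responsible clinician so isolation can begin promptly. The send_sms function is a stub standing in for whatever SMS gateway the hospital actually uses; all names and numbers are hypothetical.

    ```python
    from datetime import datetime

    def send_sms(phone_number: str, text: str) -> None:
        print(f"[SMS to {phone_number}] {text}")  # stub: replace with a real gateway call

    def on_lab_result(patient_id: str, test: str, result: str, clinician_phone: str) -> None:
        # Fire a real-time alert only for positive acid-fast bacilli smears.
        if test == "AFB smear" and result == "positive":
            stamp = datetime.now().strftime("%Y-%m-%d %H:%M")
            send_sms(clinician_phone,
                     f"{stamp} AFB smear POSITIVE for patient {patient_id}; "
                     f"please isolate and evaluate for TB.")

    on_lab_result("A-0001", "AFB smear", "positive", "+886-000-0000")
    ```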

  17. Deep Space Network (DSN), Network Operations Control Center (NOCC) computer-human interfaces

    NASA Technical Reports Server (NTRS)

    Ellman, Alvin; Carlton, Magdi

    1993-01-01

    The technical challenges, engineering solutions, and results of the NOCC computer-human interface design are presented. The user-centered design process was as follows: determine the design criteria for user concerns; assess the impact of design decisions on the users; and determine the technical aspects of the implementation (tools, platforms, etc.). The NOCC hardware architecture is illustrated. A graphical model of the DSN that represented the hierarchical structure of the data was constructed. The DSN spacecraft summary display is shown. Navigation from top to bottom is accomplished by clicking the appropriate button for the element about which the user desires more detail. The telemetry summary display and the antenna color decision table are also shown.

  18. Laboratory security and emergency response guidance for laboratories working with select agents. Centers for Disease Control and Prevention.

    PubMed

    Richmond, Jonathan Y; Nesby-O'Dell, Shanna L

    2002-12-06

    In recent years, concern has increased regarding use of biologic materials as agents of terrorism, but these same agents are often necessary tools in clinical and research microbiology laboratories. Traditional biosafety guidelines for laboratories have emphasized use of optimal work practices, appropriate containment equipment, well-designed facilities, and administrative controls to minimize risk of worker injury and to ensure safeguards against laboratory contamination. The guidelines discussed in this report were first published in 1999 (U.S. Department of Health and Human Services/CDC and National Institutes of Health. Biosafety in microbiological and biomedical laboratories [BMBL]. Richmond JY, McKinney RW, eds. 4th ed. Washington, DC: US Department of Health and Human Services, 1999 [Appendix F]). In that report, physical security concerns were addressed, and efforts were focused on preventing unauthorized entry to laboratory areas and preventing unauthorized removal of dangerous biologic agents from the laboratory. Appendix F of BMBL is now being revised to include additional information regarding personnel risk assessments, and inventory controls. The guidelines contained in this report are intended for laboratories working with select agents under biosafety-level 2, 3, or 4 conditions as described in Sections II and III of BMBL. These recommendations include conducting facility risk assessments and developing comprehensive security plans to minimize the probability of misuse of select agents. Risk assessments should include systematic, site-specific reviews of 1) physical security; 2) security of data and electronic technology systems; 3) employee security; 4) access controls to laboratory and animal areas; 5) procedures for agent inventory and accountability; 6) shipping/transfer and receiving of select agents; 7) unintentional incident and injury policies; 8) emergency response plans; and 9) policies that address breaches in security. The security plan

  19. A laboratory animal science pioneer.

    PubMed

    Kostomitsopoulos, Nikolaos

    2014-11-01

    Nikolaos Kostomitsopoulos, DVM, PhD, is Head of Laboratory Animal Facilities and Designated Veterinarian, Center of Clinical, Experimental Surgery and Translational Research, Biomedical Research Foundation of the Academy of Athens, Athens, Greece. Dr. Kostomitsopoulos discusses his successes in implementing laboratory animal science legislation and fostering collaboration among scientists in Greece.

  20. 76 FR 82299 - Clinical Laboratory Improvement Advisory Committee (CLIAC)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-30

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES Centers for Disease Control and Prevention Clinical... under which clinical laboratories are regulated; the impact on medical and laboratory practice of... the Clinical Laboratory Workforce; laboratory communication and electronic health records, integration...

  1. Center for Space Microelectronics Technology

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The 1990 technical report of the Jet Propulsion Laboratory Center for Space Microelectronics Technology summarizes the technical accomplishments, publications, presentations, and patents of the center during 1990. The report lists 130 publications, 226 presentations, and 87 new technology reports and patents.

  2. Computational mechanics research and support for aerodynamics and hydraulics at TFHRC, year 2 quarter 1 progress report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lottes, S.A.; Bojanowski, C.; Shen, J.

    2012-04-09

    The computational fluid dynamics (CFD) and computational structural mechanics (CSM) focus areas at Argonne's Transportation Research and Analysis Computing Center (TRACC) initiated a project to support and complement the experimental programs at the Turner-Fairbank Highway Research Center (TFHRC) with high-performance computing based analysis capabilities in August 2010. The project was established with a new interagency agreement between the Department of Energy and the Department of Transportation to provide collaborative research, development, and benchmarking of advanced three-dimensional computational mechanics analysis methods to the aerodynamics and hydraulics laboratories at TFHRC for a period of five years, beginning in October 2010. The analysis methods employ well-benchmarked and supported commercial computational mechanics software. Computational mechanics encompasses the areas of Computational Fluid Dynamics (CFD), Computational Wind Engineering (CWE), Computational Structural Mechanics (CSM), and Computational Multiphysics Mechanics (CMM) applied in Fluid-Structure Interaction (FSI) problems. The major areas of focus of the project are wind and water effects on bridges - superstructure, deck, cables, and substructure (including soil), primarily during storms and flood events - and the risks that these loads pose to structural failure. For flood events at bridges, another major focus of the work is assessment of the risk to bridges caused by scour of stream and riverbed material away from the foundations of a bridge. Other areas of current research include modeling of flow through culverts to improve design allowing for fish passage, modeling of the salt spray transport into bridge girders to address suitability of using weathering steel in bridges, CFD analysis of the operation of the wind tunnel in the TFHRC wind engineering laboratory. This quarterly report documents technical progress on the project tasks for the period of October through

  3. Computational mechanics research and support for aerodynamics and hydraulics at TFHRC year 1 quarter 4 progress report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lottes, S.A.; Kulak, R.F.; Bojanowski, C.

    2011-12-09

    The computational fluid dynamics (CFD) and computational structural mechanics (CSM) focus areas at Argonne's Transportation Research and Analysis Computing Center (TRACC) initiated a project to support and complement the experimental programs at the Turner-Fairbank Highway Research Center (TFHRC) with high-performance computing based analysis capabilities in August 2010. The project was established with a new interagency agreement between the Department of Energy and the Department of Transportation to provide collaborative research, development, and benchmarking of advanced three-dimensional computational mechanics analysis methods to the aerodynamics and hydraulics laboratories at TFHRC for a period of five years, beginning in October 2010. The analysis methods employ well-benchmarked and supported commercial computational mechanics software. Computational mechanics encompasses the areas of Computational Fluid Dynamics (CFD), Computational Wind Engineering (CWE), Computational Structural Mechanics (CSM), and Computational Multiphysics Mechanics (CMM) applied in Fluid-Structure Interaction (FSI) problems. The major areas of focus of the project are wind and water effects on bridges - superstructure, deck, cables, and substructure (including soil), primarily during storms and flood events - and the risks that these loads pose to structural failure. For flood events at bridges, another major focus of the work is assessment of the risk to bridges caused by scour of stream and riverbed material away from the foundations of a bridge. Other areas of current research include modeling of flow through culverts to assess them for fish passage, modeling of the salt spray transport into bridge girders to address suitability of using weathering steel in bridges, CFD analysis of the operation of the wind tunnel in the TFHRC wind engineering laboratory, vehicle stability under high wind loading, and the use of electromagnetic shock absorbers to improve vehicle stability

  4. Computational mechanics research and support for aerodynamics and hydraulics at TFHRC, year 2 quarter 2 progress report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lottes, S.A.; Bojanowski, C.; Shen, J.

    2012-06-28

    The computational fluid dynamics (CFD) and computational structural mechanics (CSM) focus areas at Argonne's Transportation Research and Analysis Computing Center (TRACC) initiated a project to support and complement the experimental programs at the Turner-Fairbank Highway Research Center (TFHRC) with high-performance computing based analysis capabilities in August 2010. The project was established with a new interagency agreement between the Department of Energy and the Department of Transportation to provide collaborative research, development, and benchmarking of advanced three-dimensional computational mechanics analysis methods to the aerodynamics and hydraulics laboratories at TFHRC for a period of five years, beginning in October 2010. The analysis methods employ well-benchmarked and supported commercial computational mechanics software. Computational mechanics encompasses the areas of Computational Fluid Dynamics (CFD), Computational Wind Engineering (CWE), Computational Structural Mechanics (CSM), and Computational Multiphysics Mechanics (CMM) applied in Fluid-Structure Interaction (FSI) problems. The major areas of focus of the project are wind and water effects on bridges - superstructure, deck, cables, and substructure (including soil), primarily during storms and flood events - and the risks that these loads pose to structural failure. For flood events at bridges, another major focus of the work is assessment of the risk to bridges caused by scour of stream and riverbed material away from the foundations of a bridge. Other areas of current research include modeling of flow through culverts to improve design allowing for fish passage, modeling of the salt spray transport into bridge girders to address suitability of using weathering steel in bridges, CFD analysis of the operation of the wind tunnel in the TFHRC wind engineering laboratory. This quarterly report documents technical progress on the project tasks for the period of January through

  5. 5. AERIAL PHOTO OF THE COMPONENTS TEST LABORATORY DURING THE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    5. AERIAL PHOTO OF THE COMPONENTS TEST LABORATORY DURING THE CONSTRUCTION OF THE EAST TEST AREA. 1955, FRED ORDWAY COLLECTION, U.S. SPACE AND ROCKET CENTER, HUNTSVILLE, AL. - Marshall Space Flight Center, East Test Area, Components Test Laboratory, Huntsville, Madison County, AL

  6. International External Quality Assurance for Laboratory Diagnosis of Diphtheria ▿

    PubMed Central

    Neal, S. E.; Efstratiou, A.

    2009-01-01

    The diphtheria surveillance network (DIPNET) encompassing National Diphtheria Reference Centers from 25 European countries is a Dedicated Surveillance Network recognized by the European Commission. A key DIPNET objective is the quality assessment of microbiological procedures for diphtheria across the European Union and beyond. A detailed questionnaire on the level of reference laboratory services and an external quality assessment (EQA) panel comprising six simulated throat specimens were sent to 34 centers. Twenty-three centers are designated National Diphtheria Reference Centers, with the laboratory in the United Kingdom being the only WHO Collaborating Centre. A variety of screening and identification tests were used, including the cysteinase test (20/34 centers), pyrazinamidase test (17/34 centers), and commercial kits (25/34 centers). The classic Elek test for toxigenicity testing is mostly used (28/34 centers), with variations in serum sources and antitoxin concentrations. Many laboratories reported problems obtaining Elek reagents or media. Only six centers produced acceptable results for all six specimens. Overall, 21% of identification and 13% of toxigenicity reports were unacceptable. Many centers could not isolate the target organism, and most found difficulties with the specimens that contained Corynebacterium striatum as a commensal contaminant. Nineteen centers generated either false-positive or negative toxigenic results, which may have caused inappropriate medical management. The discrepancies in this diphtheria diagnostics EQA alarmingly reflect the urgent need to improve laboratory performance in diphtheria diagnostics in Europe, standardize feasible and robust microbiological methods, and build awareness among public health authorities. Therefore, DIPNET recommends that regular workshops and EQA distributions for diphtheria diagnostics should be supported and maintained. PMID:19828749

  7. Astronomical Data Center Bulletin, volume 1, number 2

    NASA Technical Reports Server (NTRS)

    Nagy, T. A.; Warren, W. H., Jr.; Mead, J. M.

    1981-01-01

    Work in progress on astronomical catalogs is presented in 16 papers. Topics cover astronomical data center operations; automatic astronomical data retrieval at GSFC; interactive computer reference search of astronomical literature 1950-1976; formatting, checking, and documenting machine-readable catalogs; interactive catalog of UV, optical, and HI data for 201 Virgo cluster galaxies; machine-readable version of the general catalog of variable stars, third edition; galactic latitude and magnitude distribution of two astronomical catalogs; the catalog of open star clusters; infrared astronomical data base and catalog of infrared observations; the Air Force geophysics laboratory; revised magnetic tape of the N30 catalog of 5,268 standard stars; positional correlation of the two-micron sky survey and Smithsonian Astrophysical Observatory catalog sources; search capabilities for the catalog of stellar identifications (CSI) 1979 version; CSI statistics: blue magnitude versus spectral type; catalogs available from the Astronomical Data Center; and status report on machine-readable astronomical catalogs.

  8. Radio Wavelength Studies of the Galactic Center Source N3, Spectroscopic Instrumentation For Robotic Telescope Systems, and Developing Active Learning Activities for Astronomy Laboratory Courses

    NASA Astrophysics Data System (ADS)

    Ludovici, Dominic Alesio

    2017-08-01

    The mysterious radio source N3 appears to be located within the vicinity of the Radio Arc region of the Galactic Center. To investigate the nature of this source, we have conducted radio observations with the VLA and the VLBA. Continuum observations between 2 and 50 GHz reveal that N3 is an extremely compact and bright source with a non-thermal spectrum. Molecular line observations with the VLA reveal a compact molecular cloud adjacent to N3 in projection. The properties of this cloud are consistent with other galactic center clouds. We are able to rule out several hypotheses for the nature of N3, though a micro-blazar origin cannot be ruled out. Robotic Telescope systems are now seeing widespread deployment as both teaching and research instruments. While these systems have traditionally been able to produce high quality images, these systems have lacked the capability to conduct spectroscopic observations. To enable spectroscopic observations on the Iowa Robotic Observatory, we have developed a low cost (~500), low resolution (R ~ 300) spectrometer which mounts inside a modified filter wheel and a moderate cost (~5000), medium resolution (R ~ 8000) fiber-fed spectrometer. Software has been developed to operate both instruments robotically and calibration pipelines are being developed to automate calibration of the data. The University of Iowa offers several introductory astronomy laboratory courses taken by many hundreds of students each semester. To improve student learning in these laboratory courses, we have worked to integrate active learning into laboratory activities. We present the pedagogical approaches used to develop and update the laboratory activities and present an inventory of the current laboratory exercises. Using the inventory, we make observations of the strengths and weaknesses of the current exercises and provide suggestions for future refinement of the astronomy laboratory curriculum.

  9. High-Performance Computing Data Center | Computational Science | NREL

    Science.gov Websites

    The facility uses warm-water liquid cooling to achieve its very low PUE, then captures and reuses waste heat from computing components as the primary heating source; a dry cooler that uses refrigerant in a passive cycle to dissipate heat reduces onsite water use.
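
    Data-center efficiency in records like this one is usually expressed as power usage effectiveness (PUE): total facility energy divided by the energy delivered to the IT equipment, with values approaching 1.0 indicating little overhead. The numbers below are illustrative only.

    ```python
    it_energy_kwh = 1_000_000        # annual energy used by computing equipment (illustrative)
    facility_energy_kwh = 1_060_000  # annual energy used by the whole facility (illustrative)

    pue = facility_energy_kwh / it_energy_kwh
    print(f"PUE = {pue:.2f}")
    ```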

  10. Deedee Montzka of the National Center for Atmospheric Research checks out the NOxyO3 instrument on NASA's DC-8 flying laboratory before the ARCTAS mission

    NASA Image and Video Library

    2008-03-07

    Climate researchers from the National Center for Atmospheric Research (NCAR) and several universities install and perform functional checkouts of a variety of sensitive atmospheric instruments on NASA's DC-8 airborne laboratory prior to beginning the ARCTAS mission.

  11. Closed-Loop HIRF Experiments Performed on a Fault Tolerant Flight Control Computer

    NASA Technical Reports Server (NTRS)

    Belcastro, Celeste M.

    1997-01-01

    Closed-loop HIRF experiments were performed on a fault tolerant flight control computer (FCC) at the NASA Langley Research Center. The FCC used in the experiments was a quad-redundant flight control computer executing B737 Autoland control laws. The FCC was placed in one of the mode-stirred reverberation chambers in the HIRF Laboratory and interfaced to a computer simulation of the B737 flight dynamics, engines, sensors, actuators, and atmosphere in the Closed-Loop Systems Laboratory. Disturbances to the aircraft associated with wind gusts and turbulence were simulated during tests. Electrical isolation between the FCC under test and the simulation computer was achieved via a fiber optic interface for the analog and discrete signals. Closed-loop operation of the FCC enabled flight dynamics and atmospheric disturbances affecting the aircraft to be represented during tests. Upset was induced in the FCC as a result of exposure to HIRF, and the effect of upset on the simulated flight of the aircraft was observed and recorded. This paper presents a description of these closed-loop HIRF experiments, upset data obtained from the FCC during these experiments, and closed-loop effects on the simulated flight of the aircraft.

  12. Spectrum of tablet computer use by medical students and residents at an academic medical center.

    PubMed

    Robinson, Robert

    2015-01-01

    Introduction. The value of tablet computer use in medical education is an area of considerable interest, with preliminary investigations showing that the majority of medical trainees feel that tablet computers added value to the curriculum. This study investigated potential differences in tablet computer use between medical students and resident physicians. Materials & Methods. Data collection for this survey was accomplished with an anonymous online questionnaire shared with the medical students and residents at Southern Illinois University School of Medicine (SIU-SOM) in July and August of 2012. Results. There were 76 medical student responses (26% response rate) and 66 resident/fellow responses to this survey (21% response rate). Residents/fellows were more likely to use tablet computers several times daily than medical students (32% vs. 20%, p = 0.035). The most common reported uses were for accessing medical reference applications (46%), e-Books (45%), and board study (32%). Residents were more likely than students to use a tablet computer to access an electronic medical record (41% vs. 21%, p = 0.010), review radiology images (27% vs. 12%, p = 0.019), and enter patient care orders (26% vs. 3%, p < 0.001). Discussion. This study shows a high prevalence and frequency of tablet computer use among physicians in training at this academic medical center. Most residents and students use tablet computers to access medical references, e-Books, and to study for board exams. Residents were more likely to use tablet computers to complete clinical tasks. Conclusions. Tablet computer use among medical students and resident physicians was common in this survey. All learners used tablet computers for point of care references and board study. Resident physicians were more likely to use tablet computers to access the EMR, enter patient care orders, and review radiology studies. This difference is likely due to the differing educational and professional demands placed on

  13. Spectrum of tablet computer use by medical students and residents at an academic medical center

    PubMed Central

    2015-01-01

    Introduction. The value of tablet computer use in medical education is an area of considerable interest, with preliminary investigations showing that the majority of medical trainees feel that tablet computers added value to the curriculum. This study investigated potential differences in tablet computer use between medical students and resident physicians. Materials & Methods. Data collection for this survey was accomplished with an anonymous online questionnaire shared with the medical students and residents at Southern Illinois University School of Medicine (SIU-SOM) in July and August of 2012. Results. There were 76 medical student responses (26% response rate) and 66 resident/fellow responses to this survey (21% response rate). Residents/fellows were more likely to use tablet computers several times daily than medical students (32% vs. 20%, p = 0.035). The most common reported uses were for accessing medical reference applications (46%), e-Books (45%), and board study (32%). Residents were more likely than students to use a tablet computer to access an electronic medical record (41% vs. 21%, p = 0.010), review radiology images (27% vs. 12%, p = 0.019), and enter patient care orders (26% vs. 3%, p < 0.001). Discussion. This study shows a high prevalence and frequency of tablet computer use among physicians in training at this academic medical center. Most residents and students use tablet computers to access medical references, e-Books, and to study for board exams. Residents were more likely to use tablet computers to complete clinical tasks. Conclusions. Tablet computer use among medical students and resident physicians was common in this survey. All learners used tablet computers for point of care references and board study. Resident physicians were more likely to use tablet computers to access the EMR, enter patient care orders, and review radiology studies. This difference is likely due to the differing educational and professional demands placed on
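
    The group comparisons quoted above (for example, EMR access by 41% of 66 residents versus 21% of 76 students) are the kind that a chi-square test on a 2x2 table checks. The sketch below rounds counts from the reported percentages, so it only approximates the published p-value; chi_square_2x2 is a hypothetical helper, not from the paper.

    ```python
    import math

    def chi_square_2x2(a, b, c, d):
        """Pearson chi-square (no continuity correction) for the 2x2 table
        [[a, b], [c, d]]; returns (statistic, two-sided p-value) with 1 dof."""
        n = a + b + c + d
        stat = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
        p = math.erfc(math.sqrt(stat / 2))  # chi-square survival function, 1 dof
        return stat, p

    residents_yes, residents_no = 27, 39  # ~41% of 66 residents
    students_yes, students_no = 16, 60    # ~21% of 76 students
    stat, p = chi_square_2x2(residents_yes, residents_no, students_yes, students_no)
    print(f"chi-square = {stat:.2f}, p = {p:.3f}")
    ```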

  14. Exploring Electronics Laboratory Experiments Using Computer Software

    ERIC Educational Resources Information Center

    Gandole, Yogendra Babarao

    2011-01-01

    The roles of teachers and students are changing, and there are undoubtedly ways of learning not yet discovered. However, computer and software technology may play a significant role in identifying problems, presenting solutions, and supporting life-long learning. It is clear that computer-based educational technology has reached the point where…

  15. To Compare the Effects of Computer Based Learning and the Laboratory Based Learning on Students' Achievement Regarding Electric Circuits

    ERIC Educational Resources Information Center

    Bayrak, Bekir; Kanli, Uygar; Kandil Ingeç, Sebnem

    2007-01-01

    In this study, the research problem was: "Is computer-based physics instruction as effective as laboratory-intensive physics instruction with regard to the academic success of 9th grade students on electric circuits?" In this experimental study, a pre-test and post-test design was applied with an experimental and a control…

  16. The growth of the UniTree mass storage system at the NASA Center for Computational Sciences

    NASA Technical Reports Server (NTRS)

    Tarshish, Adina; Salmon, Ellen

    1993-01-01

    In October 1992, the NASA Center for Computational Sciences made its Convex-based UniTree system generally available to users. The ensuing months saw the growth of near-online data from nil to nearly three terabytes, a doubling of the number of CPU's on the facility's Cray YMP (the primary data source for UniTree), and the necessity for an aggressive regimen for repacking sparse tapes and hierarchical 'vaulting' of old files to freestanding tape. Connectivity was enhanced as well with the addition of UltraNet HiPPI. This paper describes the increasing demands placed on the storage system's performance and throughput that resulted from the significant augmentation of compute-server processor power and network speed.

  17. Center for Space Microelectronics Technology

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The 1991 Technical Report of the Jet Propulsion Laboratory Center for Space Microelectronics Technology summarizes the technical accomplishments, publications, presentations, and patents of the Center during the past year. The report lists 193 publications, 211 presentations, and 125 new technology reports and patents.

  18. Pain, Work-related Characteristics, and Psychosocial Factors among Computer Workers at a University Center.

    PubMed

    Mainenti, Míriam Raquel Meira; Felicio, Lilian Ramiro; Rodrigues, Erika de Carvalho; Ribeiro da Silva, Dalila Terrinha; Vigário Dos Santos, Patrícia

    2014-04-01

    [Purpose] Complaint of pain is common in computer workers, encouraging the investigation of pain-related workplace factors. This study investigated the relationship among work-related characteristics, psychosocial factors, and pain among computer workers from a university center. [Subjects and Methods] Fifteen subjects (median age, 32.0 years; interquartile range, 26.8-34.5 years) were subjected to measurement of bioelectrical impedance; photogrammetry; workplace measurements; and pain complaint, quality of life, and motivation questionnaires. [Results] The low back was the most prevalent region of complaint (76.9%). The number of body regions for which subjects complained of pain was greater in the no rest breaks group, which also presented higher prevalences of neck (62.5%) and low back (100%) pain. There were also observed associations between neck complaint and quality of life; neck complaint and head protrusion; wrist complaint and shoulder angle; and use of a chair back and thoracic pain. [Conclusion] Complaint of pain was associated with no short rest breaks, no use of a chair back, poor quality of life, high head protrusion, and shoulder angle while using the mouse of a computer.

  19. Radiological Control Center (RADCC) Renaming Ceremony

    NASA Image and Video Library

    2017-03-31

    A Mars Science Laboratory cap is displayed in the Randall E. Scott Radiological Control Center at NASA's Kennedy Space Center. The facility was recently named in honor of Randy Scott, a professional health physicist of more than 40 years. He served as the Florida spaceport's Radiation Protection Officer for 14 years until his death June 17, 2016. Launched Nov. 26, 2011, the Mars Science Laboratory with the Curiosity rover was powered by a radioisotope thermoelectric generator. Located in the Neil Armstrong Operations and Checkout building, the Randall E. Scott Radiological Control Center is staffed by technical and radiological experts from NASA, the U.S. Department of Energy, the U.S. Air Force 45th Space Wing and the state of Florida. The group performs data collection and assessment functions supporting launch site and field data collection activities during launches involving plutonium-powered spacecraft such as the Mars Science Laboratory.

  20. Calibration Laboratory Capabilities Listing as of April 2009

    NASA Technical Reports Server (NTRS)

    Kennedy, Gary W.

    2009-01-01

    This document reviews the Calibration Laboratory capabilities for various NASA centers (i.e., Glenn Research Center and Plum Brook Test Facility, Kennedy Space Center, Marshall Space Flight Center, Stennis Space Center, and White Sands Test Facility). Some of the parameters reported are: alternating current, direct current, dimensional, mass, force, torque, pressure and vacuum, safety, and thermodynamics parameters. Some centers reported other parameters.

  1. Promoting CLT within a Computer Assisted Learning Environment: A Survey of the Communicative English Course of FLTC

    ERIC Educational Resources Information Center

    Haider, Md. Zulfeqar; Chowdhury, Takad Ahmed

    2012-01-01

    This study is based on a survey of the Communicative English Language Certificate (CELC) course run by the Foreign Language Training Center (FLTC), a Project under the Ministry of Education, Bangladesh. FLTC is working to promote the teaching and learning of English through its eleven computer-based and state of the art language laboratories. As…

  2. Cost-effective and business-beneficial computer validation for bioanalytical laboratories.

    PubMed

    McDowall, Rd

    2011-07-01

    Computerized system validation is often viewed as a burden and a waste of time to meet regulatory requirements. This article presents a different approach by looking at validation in a bioanalytical laboratory from the perspective of the business benefits that computer validation can bring. Ask yourself the question: have you ever bought a computerized system that did not meet your initial expectations? This article will look at understanding the process to be automated, the paper to be eliminated and the records to be signed to meet the requirements of the GLP or GCP and Part 11 regulations. This paper will only consider commercial nonconfigurable and configurable software such as plate readers and LC-MS/MS data systems rather than LIMS or custom applications. Two streamlined life cycle models are presented. The first one consists of a single document for validation of nonconfigurable software. The second is for configurable software and is a five-stage model that avoids the need to write functional and design specifications. Both models are aimed at managing the risk each type of software poses whilst reducing the amount of documented evidence required for validation.

  3. Sandia National Laboratories:

    Science.gov Websites

    Programs Nuclear Weapons About Nuclear Weapons Safety & Security Weapons Science & Technology Robotics R&D 100 Awards Laboratory Directed Research & Development Technology Deployment Centers Audit Sandia's Economic Impact Licensing & Technology Transfer Browse Technology Portfolios

  4. Discovery & Interaction in Astro 101 Laboratory Experiments

    NASA Astrophysics Data System (ADS)

    Maloney, Frank Patrick; Maurone, Philip; DeWarf, Laurence E.

    2016-01-01

    The availability of low-cost, high-performance computing hardware and software has transformed the manner by which astronomical concepts can be re-discovered and explored in a laboratory that accompanies an astronomy course for arts students. We report on a strategy, begun in 1992, for allowing each student to understand fundamental scientific principles by interactively confronting astronomical and physical phenomena, through direct observation and by computer simulation. These experiments have evolved as: a) the quality and speed of the hardware has greatly increased; b) the corresponding hardware costs have decreased; c) the students have become computer and Internet literate; d) the importance of computationally and scientifically literate arts graduates in the workplace has increased. We present the current suite of laboratory experiments, and describe the nature, procedures, and goals in this two-semester laboratory for liberal arts majors at the Astro 101 university level.

  5. Structural Test Laboratory | Water Power | NREL

    Science.gov Websites

    Structural Test Laboratory. NREL engineers design and configure structural tests of components that can validate models, demonstrate system reliability, inform design margins, and assess physical properties, including mass and center of gravity, to ensure compliance with design goals, as well as perform dynamic characterization.

  6. Personal computer versus personal computer/mobile device combination users' preclinical laboratory e-learning activity.

    PubMed

    Kon, Haruka; Kobayashi, Hiroshi; Sakurai, Naoki; Watanabe, Kiyoshi; Yamaga, Yoshiro; Ono, Takahiro

    2017-11-01

    The aim of the present study was to clarify differences between personal computer (PC)/mobile device combination and PC-only user patterns. We analyzed access frequency and time spent on a complete denture preclinical website in order to maximize website effectiveness. Fourth-year undergraduate students (N=41) in the preclinical complete denture laboratory course were invited to participate in this survey during the final week of the course to track login data. Students accessed video demonstrations and quizzes via our e-learning site/course program, and were instructed to view online demonstrations before classes. When the course concluded, participating students filled out a questionnaire about the program, their opinions, and devices they had used to access the site. Combination user access was significantly more frequent than PC-only during supplementary learning time, indicating that students with mobile devices studied during lunch breaks and before morning classes. Most students had favorable opinions of the e-learning site, but a few combination users commented that some videos were too long and that descriptive answers were difficult on smartphones. These results imply that mobile devices' increased accessibility encouraged learning by enabling more efficient time use between classes. They also suggest that e-learning system improvements should cater to mobile device users by reducing video length and including more short-answer questions. © 2016 John Wiley & Sons Australia, Ltd.
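
    The central comparison above is between access frequencies of two user groups. Purely as an illustration of that kind of analysis (the counts below are invented, not the study's data), a nonparametric two-sample test could be applied to per-student login counts during supplementary learning time:

    ```python
    # Hypothetical comparison of login counts between PC/mobile combination users
    # and PC-only users; values are invented for illustration only.
    from scipy.stats import mannwhitneyu

    combination_logins = [5, 7, 4, 6, 8, 5, 9, 6]   # hypothetical per-student counts
    pc_only_logins     = [2, 3, 1, 4, 2, 3, 2, 1]   # hypothetical per-student counts

    stat, p = mannwhitneyu(combination_logins, pc_only_logins, alternative="greater")
    print(f"U = {stat:.1f}, one-sided p = {p:.4f}")
    ```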

  7. 42 CFR 493.1443 - Standard; Laboratory director qualifications.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Standard; Laboratory director qualifications. 493.1443 Section 493.1443 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND... Testing Laboratories Performing High Complexity Testing § 493.1443 Standard; Laboratory director...

  8. 42 CFR 493.1443 - Standard; Laboratory director qualifications.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 5 2011-10-01 2011-10-01 false Standard; Laboratory director qualifications. 493.1443 Section 493.1443 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND... Testing Laboratories Performing High Complexity Testing § 493.1443 Standard; Laboratory director...

  9. 42 CFR 493.1445 - Standard; Laboratory director responsibilities.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 5 2011-10-01 2011-10-01 false Standard; Laboratory director responsibilities. 493.1445 Section 493.1445 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND... Testing Laboratories Performing High Complexity Testing § 493.1445 Standard; Laboratory director...

  10. 42 CFR 493.1407 - Standard; Laboratory director responsibilities.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 5 2011-10-01 2011-10-01 false Standard; Laboratory director responsibilities. 493.1407 Section 493.1407 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND... Testing Laboratories Performing Moderate Complexity Testing § 493.1407 Standard; Laboratory director...

  11. Modeling Laboratory Astrophysics Experiments using the CRASH code

    NASA Astrophysics Data System (ADS)

    Trantham, Matthew; Drake, R. P.; Grosskopf, Michael; Bauerle, Matthew; Kuranz, Carolyn; Keiter, Paul; Malamud, Guy; Crash Team

    2013-10-01

    The understanding of high energy density systems can be advanced by laboratory astrophysics experiments. Computer simulations can assist in the design and analysis of these experiments. The Center for Radiative Shock Hydrodynamics (CRASH) at the University of Michigan developed a code that has been used to design and analyze high-energy-density experiments on OMEGA, NIF, and other large laser facilities. This Eulerian code uses block-adaptive mesh refinement (AMR) with implicit multigroup radiation transport and electron heat conduction. This poster/talk will demonstrate some of the experiments the CRASH code has helped design or analyze, including radiative shock experiments, Kelvin-Helmholtz experiments, Rayleigh-Taylor experiments, plasma sheet experiments, and interacting jets experiments. This work is funded by the Predictive Sciences Academic Alliances Program in NNSA-ASC via grant DEFC52-08NA28616, by the NNSA-DS and SC-OFES Joint Program in High-Energy-Density Laboratory Plasmas, grant number DE-FG52-09NA29548, and by the National Laser User Facility Program, grant number DE-NA0000850.

  12. A Cloud Computing Based Patient Centric Medical Information System

    NASA Astrophysics Data System (ADS)

    Agarwal, Ankur; Henehan, Nathan; Somashekarappa, Vivek; Pandya, A. S.; Kalva, Hari; Furht, Borko

    This chapter discusses an emerging concept of a cloud computing based Patient Centric Medical Information System framework that will allow various authorized users to securely access patient records from various Care Delivery Organizations (CDOs) such as hospitals, urgent care centers, doctors, laboratories, and imaging centers, among others, from any location. Such a system must seamlessly integrate all patient records, including images such as CT scans and MRIs, which can easily be accessed from any location and reviewed by any authorized user. In such a scenario the storage and transmission of medical records will have to be conducted in a totally secure and safe environment with a very high standard of data integrity, protecting patient privacy and complying with all Health Insurance Portability and Accountability Act (HIPAA) regulations.
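
    As a purely illustrative sketch of one piece of the secure-storage requirement described above (this is not the chapter's actual framework, and key management, auditing, and transport security are additional HIPAA concerns not shown), a record could be encrypted at rest before being pushed to cloud storage:

    ```python
    # Illustrative only: symmetric encryption of a (hypothetical) patient record
    # before it is transmitted or stored in the cloud.
    from cryptography.fernet import Fernet
    import json

    key = Fernet.generate_key()          # in practice, held by a key-management service
    cipher = Fernet(key)

    record = {"patient_id": "12345", "study": "MRI", "findings": "..."}  # hypothetical
    ciphertext = cipher.encrypt(json.dumps(record).encode("utf-8"))

    # Only holders of the key can recover the plaintext record.
    plaintext = json.loads(cipher.decrypt(ciphertext).decode("utf-8"))
    assert plaintext == record
    ```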

  13. Simulations of Laboratory Astrophysics Experiments using the CRASH code

    NASA Astrophysics Data System (ADS)

    Trantham, Matthew; Kuranz, Carolyn; Fein, Jeff; Wan, Willow; Young, Rachel; Keiter, Paul; Drake, R. Paul

    2015-11-01

    Computer simulations can assist in the design and analysis of laboratory astrophysics experiments. The Center for Radiative Shock Hydrodynamics (CRASH) at the University of Michigan developed a code that has been used to design and analyze high-energy-density experiments on OMEGA, NIF, and other large laser facilities. This Eulerian code uses block-adaptive mesh refinement (AMR) with implicit multigroup radiation transport, electron heat conduction and laser ray tracing. This poster will demonstrate some of the experiments the CRASH code has helped design or analyze including: Kelvin-Helmholtz, Rayleigh-Taylor, magnetized flows, jets, and laser-produced plasmas. This work is funded by the following grants: DEFC52-08NA28616, DE-NA0001840, and DE-NA0002032.

  14. Visual interaction: models, systems, prototypes. The Pictorial Computing Laboratory at the University of Rome La Sapienza.

    PubMed

    Bottoni, Paolo; Cinque, Luigi; De Marsico, Maria; Levialdi, Stefano; Panizzi, Emanuele

    2006-06-01

    This paper reports on the research activities performed by the Pictorial Computing Laboratory at the University of Rome, La Sapienza, during the last 5 years. Such work is essentially based on the study of human-computer interaction and spans from metamodels of interaction down to prototypes of interactive systems for both synchronous multimedia communication and groupwork, as well as annotation systems for web pages, also encompassing theoretical and practical issues of visual languages and environments, including pattern recognition algorithms. Some applications, such as e-learning and collaborative work, are also considered.

  15. Risk factors for computer visual syndrome (CVS) among operators of two call centers in São Paulo, Brazil.

    PubMed

    Sa, Eduardo Costa; Ferreira Junior, Mario; Rocha, Lys Esther

    2012-01-01

    The aims of this study were to investigate work conditions, to estimate the prevalence of, and to describe risk factors associated with Computer Vision Syndrome among operators of two call centers in São Paulo (n = 476). The methods include a quantitative cross-sectional observational study and an ergonomic work analysis, using work observation, interviews and questionnaires. The case definition was the presence of one or more specific ocular symptoms answered as always, often or sometimes. The multiple logistic regression models were created using the stepwise forward likelihood method, retaining variables with significance levels below 5% (p < 0.05). The operators were mainly female and young (from 15 to 24 years old). The call centers operated 24 hours a day, and the operators' work week was 36 hours, with break times of 21 to 35 minutes per day. The symptoms reported were eye fatigue (73.9%), "weight" in the eyes (68.2%), "burning" eyes (54.6%), tearing (43.9%) and weakening of vision (43.5%). The prevalence of Computer Vision Syndrome was 54.6%. Associations verified were: being female (OR 2.6, 95% CI 1.6 to 4.1), lack of recognition at work (OR 1.4, 95% CI 1.1 to 1.8), organization of work in the call center (OR 1.4, 95% CI 1.1 to 1.7) and high demand at work (OR 1.1, 95% CI 1.0 to 1.3). The organization and psychosocial factors at work should be included in prevention programs for visual syndrome among call center operators.
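
    The odds ratios with 95% confidence intervals quoted above are the standard output of a logistic regression. A minimal sketch of that mechanics, on synthetic data rather than the study's, is shown below; variable names are invented placeholders.

    ```python
    # Synthetic-data sketch of how odds ratios and 95% CIs like those above are
    # typically obtained from a logistic regression model.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 476
    df = pd.DataFrame({
        "female":      rng.integers(0, 2, n),
        "high_demand": rng.integers(0, 2, n),
        "cvs":         rng.integers(0, 2, n),   # outcome: computer vision syndrome
    })

    X = sm.add_constant(df[["female", "high_demand"]])
    model = sm.Logit(df["cvs"], X).fit(disp=False)

    odds_ratios = np.exp(model.params)
    conf_int = np.exp(model.conf_int())          # 95% CI on the odds-ratio scale
    print(pd.concat([odds_ratios.rename("OR"), conf_int], axis=1))
    ```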

  16. Laboratories for Teaching of Mathematical Subjects

    ERIC Educational Resources Information Center

    Berežný, Štefan

    2017-01-01

    We have adapted our two laboratories at our department based on our research results, which were presented at the conference CADGME 2014 in Halle and published in the journal. In this article we describe the hardware and software structure of the Laboratory 1: LabIT4KT-1: Laboratory of Computer Modelling and the Laboratory 2: LabIT4KT-2:…

  17. Comparison of Mars Science Laboratory Reaction Control System Jet Computations With Flow Visualization and Velocimetry

    NASA Technical Reports Server (NTRS)

    Bathel, Brett F.; Danehy, Paul M.; Johansen, Craig T.; Ashcraft, Scott W.; Novak, Luke A.

    2013-01-01

    Numerical predictions of the Mars Science Laboratory reaction control system jets interacting with a Mach 10 hypersonic flow are compared to experimental nitric oxide planar laser-induced fluorescence data. The steady Reynolds Averaged Navier Stokes equations using the Baldwin-Barth one-equation turbulence model were solved using the OVERFLOW code. The experimental fluorescence data used for comparison consists of qualitative two-dimensional visualization images, qualitative reconstructed three-dimensional flow structures, and quantitative two-dimensional distributions of streamwise velocity. Through modeling of the fluorescence signal equation, computational flow images were produced and directly compared to the qualitative fluorescence data.

  18. Air Force Weapons Laboratory Computational Requirements for 1976 Through 1980

    DTIC Science & Technology

    1976-01-01

    This final report was prepared by the Air Force Weapons Laboratory (Attn: DYS), Kirtland Air Force Base, New Mexico 87117, under Job Order 06CB (program element 62601F); Dr. Clifford E. Rhoades, Jr. … Controlling office: Air Force Weapons Laboratory, Kirtland Air Force Base, New Mexico; January 1976.

  19. Tu-144LL SST Flying Laboratory on Taxiway at Zhukovsky Air Development Center near Moscow, Russia

    NASA Technical Reports Server (NTRS)

    1998-01-01

    The sleek lines of the Tupolev Tu-144LL are evident as it sits on the taxiway at the Zhukovsky Air Development Center near Moscow, Russia. NASA teamed with American and Russian aerospace industries for an extended period in a joint international research program featuring the Russian-built Tu-144LL supersonic aircraft. The object of the program was to develop technologies for a proposed future second-generation supersonic airliner to be developed in the 21st Century. The aircraft's initial flight phase began in June 1996 and concluded in February 1998 after 19 research flights. A shorter follow-on program involving seven flights began in September 1998 and concluded in April 1999. All flights were conducted in Russia from Tupolev's facility at the Zhukovsky Air Development Center near Moscow. The centerpiece of the research program was the Tu 144LL, a first-generation Russian supersonic jetliner that was modified by its developer/builder, Tupolev ANTK (aviatsionnyy nauchno-tekhnicheskiy kompleks-roughly, aviation technical complex), into a flying laboratory for supersonic research. Using the Tu-144LL to conduct flight research experiments, researchers compared full-scale supersonic aircraft flight data with results from models in wind tunnels, computer-aided techniques, and other flight tests. The experiments provided unique aerodynamic, structures, acoustics, and operating environment data on supersonic passenger aircraft. Data collected from the research program was being used to develop the technology base for a proposed future American-built supersonic jetliner. Although actual development of such an advanced supersonic transport (SST) is currently on hold, commercial aviation experts estimate that a market for up to 500 such aircraft could develop by the third decade of the 21st Century. The Tu-144LL used in the NASA-sponsored research program was a 'D' model with different engines than were used in production-model aircraft. Fifty experiments were proposed

  20. Purgeable organic compounds at or near the Idaho Nuclear Technology and Engineering Center, Idaho National Laboratory, Idaho, 2015

    USGS Publications Warehouse

    Maimer, Neil V.; Bartholomay, Roy C.

    2016-05-25

    During 2015, the U.S. Geological Survey, in cooperation with the U.S. Department of Energy, collected groundwater samples from 31 wells at or near the Idaho Nuclear Technology and Engineering Center (INTEC) at the Idaho National Laboratory for purgeable organic compounds (POCs). The samples were collected and analyzed for the purpose of evaluating whether purge water from wells located inside an areal polygon established downgradient of the INTEC must be treated as a Resource Conservation and Recovery Act listed waste. POC concentrations in water samples from 29 of 31 wells completed in the eastern Snake River Plain aquifer were greater than their detection limit, determined from detection and quantitation calculation software, for at least one to four POCs. Of the 29 wells with concentrations greater than their detection limits, only 20 had concentrations greater than the laboratory reporting limit as calculated with detection and quantitation calculation software. None of the concentrations exceeded any maximum contaminant levels established for public drinking water supplies. The most commonly detected compounds were 1,1,1-trichloroethane, 1,1-dichloroethene, and trichloroethene.

  1. Analog Computer Laboratory with Biological Examples.

    ERIC Educational Resources Information Center

    Strebel, Donald E.

    1979-01-01

    The use of biological examples in teaching applications of the analog computer is discussed and several examples from mathematical ecology, enzyme kinetics, and tracer dynamics are described. (Author/GA)

  2. The Use and Benefits of Computer Aided Learning in the Assessment of the Laboratory Exercise "Enzyme Induction in Escherichia coli".

    ERIC Educational Resources Information Center

    Pamula, F.; And Others

    1995-01-01

    Describes an interactive computer program written to provide accurate and immediate feedback to students while they are processing experimental data. Discusses the problems inherent in laboratory courses that led to the development of this program. Advantages of the software include allowing students to work at their own pace in a nonthreatening…

  3. Computer Center: It's Time to Take Inventory.

    ERIC Educational Resources Information Center

    Spain, James D.

    1984-01-01

    Describes typical instructional applications of computers. Areas considered include: (1) instructional simulations and animations; (2) data analysis; (3) drill and practice; (4) student evaluation; (5) development of computer models and simulations; (6) biometrics or biostatistics; and (7) direct data acquisition and analysis. (JN)

  4. Natural and laboratory compaction bands in porous carbonates: a three-dimensional characterization using synchrotron X-ray computed microtomography

    NASA Astrophysics Data System (ADS)

    Cilona, A.; Arzilli, F.; Mancini, L.; Emanuele, T.

    2014-12-01

    Porous carbonates form important reservoirs for water and hydrocarbons. The fluid flow properties of carbonate reservoirs may be affected by post-depositional processes (e.g., mechanical and chemical), which need to be quantified. Field-based studies described bed-parallel compaction bands (CBs) within carbonates with a wide range of porosities. These burial-related structures accommodate volumetric strain by grain rotation, translation, pore collapse and pressure solution. Recently, the same structures have been reproduced for the first time in the laboratory by performing triaxial compaction experiments on porous grainstones. These laboratory studies characterized and compared the microstructures of natural and laboratory CBs, but no analysis of pore connectivity has been performed. In this paper, we use an innovative approach to characterize the pore networks (e.g., porosity, connectivity) of natural and laboratory CBs and compare them with those of the host rock. We collected the data using the synchrotron X-ray computed microtomography technique at the SYRMEP beamline of the Elettra-Sincrotrone Trieste Laboratory (Italy). Quantitative analyses of the samples were performed with the Pore3D software library. The porosity was calculated from segmented 3D images of pristine and deformed carbonates. A process of skeletonization was then applied to quantify the number of connected pores within the rock volume. The analysis of the skeleton allowed us to highlight the differences between natural and laboratory CBs, and to investigate how pore connectivity evolves as a function of different deformation pathways. Both pore volume and connectivity are reduced within the CBs with respect to the pristine rock, and the natural CB has a lower porosity than the laboratory one. The grain contacts in the natural CB are welded, whereas in the laboratory one they have more irregular shapes and grain crushing is the predominant process.
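
    The two quantities central to the abstract, porosity and pore connectivity, can be illustrated with a simplified NumPy/SciPy sketch on a segmented binary volume; this stands in for, and is not equivalent to, the Pore3D workflow actually used in the study, and the random volume below is a placeholder for real micro-CT data.

    ```python
    # Porosity and a simple connectivity proxy from a segmented 3-D volume.
    import numpy as np
    from scipy import ndimage

    # pores == True, solid == False; a random volume stands in for real CT data
    rng = np.random.default_rng(1)
    volume = rng.random((64, 64, 64)) < 0.25

    porosity = volume.mean()                      # pore fraction of the total volume

    # label connected pore clusters (26-connectivity in 3-D)
    structure = np.ones((3, 3, 3), dtype=bool)
    labels, n_clusters = ndimage.label(volume, structure=structure)
    cluster_sizes = np.bincount(labels.ravel())[1:]        # skip background label 0
    largest_fraction = cluster_sizes.max() / volume.sum()  # connectivity proxy

    print(f"porosity = {porosity:.3f}, clusters = {n_clusters}, "
          f"largest cluster holds {largest_fraction:.1%} of pore voxels")
    ```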

  5. An Analysis of Cloud Computing with Amazon Web Services for the Atmospheric Science Data Center

    NASA Astrophysics Data System (ADS)

    Gleason, J. L.; Little, M. M.

    2013-12-01

    NASA science and engineering efforts rely heavily on compute and data handling systems. The nature of NASA science data is such that it is not restricted to NASA users; instead, it is widely shared across a globally distributed user community including scientists, educators, policy decision makers, and the public. Therefore NASA science computing is a candidate use case for cloud computing, where compute resources are outsourced to an external vendor. Amazon Web Services (AWS) is a commercial cloud computing service developed to use excess computing capacity at Amazon, and potentially provides an alternative to costly and potentially underutilized dedicated acquisitions whenever NASA scientists or engineers require additional data processing. AWS desires to provide a simplified avenue for NASA scientists and researchers to share large, complex data sets with external partners and the public. AWS has been extensively used by JPL for a wide range of computing needs and was previously tested on a NASA Agency basis during the Nebula testing program. Its ability to support Langley Science Directorate needs must be evaluated by integrating it with real-world operational needs across NASA and the maturity that would come with that. The strengths and weaknesses of this architecture and its ability to support general science and engineering applications have been demonstrated during the previous testing. The Langley Office of the Chief Information Officer, in partnership with the Atmospheric Science Data Center (ASDC), has established a pilot business interface to utilize AWS cloud computing resources on an organization- and project-level, pay-per-use model. This poster discusses an effort to evaluate the feasibility of the pilot business interface from a project-level perspective by specifically using a processing scenario involving the Clouds and Earth's Radiant Energy System (CERES) project.
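
    To make the pay-per-use access pattern concrete, the hypothetical sketch below lists objects staged in an S3 bucket and estimates their monthly storage cost; the bucket name, prefix, and per-GB price are invented for illustration and do not refer to any real ASDC or Langley resource.

    ```python
    # Hypothetical sketch of summing the size of staged granules in S3 and
    # estimating storage cost under an assumed per-GB-month price.
    import boto3

    s3 = boto3.client("s3")  # credentials come from the environment or instance profile

    paginator = s3.get_paginator("list_objects_v2")
    total_bytes = 0
    for page in paginator.paginate(Bucket="example-ceres-pilot", Prefix="granules/2013/"):
        for obj in page.get("Contents", []):
            total_bytes += obj["Size"]

    assumed_price_per_gb_month = 0.023   # assumption, not a quoted AWS rate
    print(f"{total_bytes / 1e9:.1f} GB staged, "
          f"~${total_bytes / 1e9 * assumed_price_per_gb_month:.2f}/month")
    ```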

  6. PNNL streamlines energy-guzzling computers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beckman, Mary T.; Marquez, Andres

    In a room the size of a garage, two rows of six-foot-tall racks holding supercomputer hard drives sit back-to-back. Thin tubes and wires snake off the hard drives, slithering into the corners. Stepping between the rows, a rush of heat whips around you -- the air from fans blowing off processing heat. But walk farther in, between the next racks of hard drives, and the temperature drops noticeably. These drives are being cooled by a non-conducting liquid that runs right over the hardworking processors. The liquid carries the heat away in tubes, saving the air a few degrees. This is the Energy Smart Data Center at Pacific Northwest National Laboratory. The bigger, faster, and meatier supercomputers get, the more energy they consume. PNNL's Andres Marquez has developed this test bed to learn how to train the behemoths in energy efficiency. The work will help supercomputers perform better as well. Processors have to keep cool or suffer from "thermal throttling," says Marquez. "That's the performance threshold where the computer is too hot to run well. That threshold is an industry secret." The center at EMSL, DOE's national scientific user facility at PNNL, harbors several ways of experimenting with energy usage. For example, the room's air conditioning is isolated from the rest of EMSL -- pipes running beneath the floor carry temperature-controlled water through heat exchangers to cooling towers outside. "We can test whether it's more energy efficient to cool directly on the processing chips or out in the water tower," says Marquez. The hard drives feed energy and temperature data to a network server running specially designed software that controls and monitors the data center. To test the center's limits, the team runs the processors flat out -- not only on carefully controlled test programs in the Energy Smart computers, but also on real world software from other EMSL research, such as regional weather forecasting models. Marquez's group is also

  7. Application Modernization at LLNL and the Sierra Center of Excellence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neely, J. Robert; de Supinski, Bronis R.

    We report that in 2014, Lawrence Livermore National Laboratory began acquisition of Sierra, a pre-exascale system from IBM and Nvidia. It marks a significant shift in direction for LLNL by introducing the concept of heterogeneous computing via GPUs. LLNL’s mission requires application teams to prepare for this paradigm shift. Thus, the Sierra procurement required a proposed Center of Excellence that would align the expertise of the chosen vendors with laboratory personnel representing the application developers, system software, and tool providers in a concentrated effort to prepare the laboratory’s codes in advance of the system transitioning to production in 2018. Finally, this article presents LLNL’s overall application strategy, with a focus on how LLNL is collaborating with IBM and Nvidia to ensure a successful transition of its mission-oriented applications into the exascale era.

  8. Application Modernization at LLNL and the Sierra Center of Excellence

    DOE PAGES

    Neely, J. Robert; de Supinski, Bronis R.

    2017-09-01

    We report that in 2014, Lawrence Livermore National Laboratory began acquisition of Sierra, a pre-exascale system from IBM and Nvidia. It marks a significant shift in direction for LLNL by introducing the concept of heterogeneous computing via GPUs. LLNL’s mission requires application teams to prepare for this paradigm shift. Thus, the Sierra procurement required a proposed Center of Excellence that would align the expertise of the chosen vendors with laboratory personnel representing the application developers, system software, and tool providers in a concentrated effort to prepare the laboratory’s codes in advance of the system transitioning to production in 2018. Finally, this article presents LLNL’s overall application strategy, with a focus on how LLNL is collaborating with IBM and Nvidia to ensure a successful transition of its mission-oriented applications into the exascale era.

  9. Changing the batch system in a Tier 1 computing center: why and how

    NASA Astrophysics Data System (ADS)

    Chierici, Andrea; Dal Pra, Stefano

    2014-06-01

    At the Italian Tier 1 Center at CNAF we are evaluating the possibility of changing the current production batch system. This activity is motivated mainly because we are looking for a more flexible licensing model as well as to avoid vendor lock-in. We performed a technology tracking exercise and among many possible solutions we chose to evaluate Grid Engine as an alternative because its adoption is increasing in the HEPiX community and because it is supported by the EMI middleware that we currently use on our computing farm. Another INFN site evaluated Slurm and we will compare our results in order to understand the pros and cons of the two solutions. We will present the results of our evaluation of Grid Engine, in order to understand if it can fit the requirements of a Tier 1 center, compared to the solution we adopted long ago. We performed a survey and a critical re-evaluation of our farming infrastructure: many production software components (most notably accounting and monitoring) rely on our current solution, and changing it required us to write new wrappers and adapt the infrastructure to the new system. We believe the results of this investigation can be very useful to other Tier-1 and Tier-2 centers in a similar situation, where the effort of switching may appear too daunting. We will provide guidelines in order to understand how difficult this operation can be and how long the change may take.

  10. Characteristics of the Navy Laboratory Warfare Center Technical Workforce

    DTIC Science & Technology

    2013-09-29

    The technical workforce was sub-divided into six broad occupational groups, including Life Science, Physical Science, Engineering, Mathematics and Information Science (M&IS), Computer Science, and Social Science. Occupational series in the M&IS group include Actuarial Science (1510), Computer Science (1550), General Mathematics & Statistics (1501), Mathematics (1520), and Operations Research; representative occupations include network systems and data communication analysts, actuaries, mathematicians, operations research analysts, and statisticians.

  11. Human factors in computing systems: focus on patient-centered health communication at the ACM SIGCHI conference.

    PubMed

    Wilcox, Lauren; Patel, Rupa; Chen, Yunan; Shachak, Aviv

    2013-12-01

    Health Information Technologies, such as electronic health records (EHR) and secure messaging, have already transformed interactions among patients and clinicians. In addition, technologies supporting asynchronous communication outside of clinical encounters, such as email, SMS, and patient portals, are being increasingly used for follow-up, education, and data reporting. Meanwhile, patients are increasingly adopting personal tools to track various aspects of health status and therapeutic progress, wishing to review these data with clinicians during consultations. These issues have drawn increasing interest from the human-computer interaction (HCI) community, with special focus on critical challenges in patient-centered interactions and design opportunities that can address these challenges. We saw this community presenting and interacting at the ACM SIGCHI 2013 Conference on Human Factors in Computing Systems (also known as CHI), held April 27-May 2, 2013, at the Palais des Congrès de Paris in France. CHI 2013 featured many formal avenues to pursue patient-centered health communication: a well-attended workshop, tracks of original research, and a lively panel discussion. In this report, we highlight these events and the main themes we identified. We hope that it will help bring the health care communication and the HCI communities closer together. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  12. Powder X-ray diffraction laboratory, Reston, Virginia

    USGS Publications Warehouse

    Piatak, Nadine M.; Dulong, Frank T.; Jackson, John C.; Folger, Helen W.

    2014-01-01

    The powder x-ray diffraction (XRD) laboratory is managed jointly by the Eastern Mineral and Environmental Resources and Eastern Energy Resources Science Centers. Laboratory scientists collaborate on a wide variety of research problems involving other U.S. Geological Survey (USGS) science centers and government agencies, universities, and industry. Capabilities include identification and quantification of crystalline and amorphous phases, and crystallographic and atomic structure analysis for a wide variety of sample media. Customized laboratory procedures and analyses commonly are used to characterize non-routine samples including, but not limited to, organic and inorganic components in petroleum source rocks, ore and mine waste, clay minerals, and glassy phases. Procedures can be adapted to meet a variety of research objectives.

  13. Optimized resolved rate control of seven-degree-of-freedom Laboratory Telerobotic Manipulator (LTM) with application to three-dimensional graphics simulation

    NASA Technical Reports Server (NTRS)

    Barker, L. Keith; Mckinney, William S., Jr.

    1989-01-01

    The Laboratory Telerobotic Manipulator (LTM) is a seven-degree-of-freedom robot arm. Two of the arms were delivered to Langley Research Center for ground-based research to assess the use of redundant degree-of-freedom robot arms in space operations. Resolved-rate control equations for the LTM are derived. The equations are based on a scheme developed at the Oak Ridge National Laboratory for computing optimized joint angle rates in real time. The optimized joint angle rates actually represent a trade-off, as the hand moves, between small rates (least-squares solution) and those rates which work toward satisfying a specified performance criterion of joint angles. In singularities where the optimization scheme cannot be applied, alternate control equations are devised. The equations developed were evaluated using a real-time computer simulation to control a 3-D graphics model of the LTM.
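
    The trade-off described above, between the least-squares (pseudoinverse) joint rates and rates that work toward a secondary joint-angle criterion, is commonly written as a gradient-projection scheme. The sketch below is a generic textbook version of that idea, not the exact ORNL/LTM formulation, and the joint-centering criterion and gain are illustrative choices.

    ```python
    # Generic redundancy-resolved rate control: pseudoinverse solution plus a
    # null-space term that nudges joints toward a secondary criterion H(q).
    import numpy as np

    def resolved_rates(J, v_hand, q, q_mid, k=0.5):
        """Joint rates for a redundant (e.g., 7-DOF) arm.

        J      : 6x7 Jacobian at the current configuration
        v_hand : commanded 6-vector hand twist (linear + angular rates)
        q      : current joint angles (7,)
        q_mid  : preferred (e.g., mid-range) joint angles used as the criterion (7,)
        k      : gain trading hand tracking against the secondary criterion
        """
        J_pinv = np.linalg.pinv(J)                       # least-squares solution
        grad_H = -(q - q_mid)                            # descend on H = 0.5*|q - q_mid|^2
        null_proj = np.eye(J.shape[1]) - J_pinv @ J      # null-space projector
        # rates that track the commanded hand motion while centering the joints
        return J_pinv @ v_hand + k * null_proj @ grad_H

    # Example with a random, non-singular Jacobian standing in for the real arm:
    rng = np.random.default_rng(2)
    J = rng.standard_normal((6, 7))
    qdot = resolved_rates(J, v_hand=np.array([0.05, 0, 0, 0, 0, 0]),
                          q=rng.uniform(-1, 1, 7), q_mid=np.zeros(7))
    print(qdot)
    ```

    Near singularities this pseudoinverse form degrades, which is consistent with the abstract's note that alternate control equations are used there.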

  14. University of Rochester, Laboratory for Laser Energetics

    NASA Astrophysics Data System (ADS)

    1987-01-01

    In FY86 the Laboratory produced a list of accomplishments in which it takes pride. LLE has met every laser-fusion program milestone to date in a program of research for direct-drive ultraviolet laser fusion originally formulated in 1981. LLE scientists authored or co-authored 135 scientific papers during 1985 to 1986. The collaborative experiments with NRL, LANL, and LLNL have led to a number of important ICF results. The cryogenic target system developed by KMS Fusion for LLE will be used in future high-density experiments on OMEGA to demonstrate the compression of thermonuclear fuel to 100 to 200 times that of the solid (20 to 40 g/cm³) in a test of the direct-drive concept, as noted in the National Academy of Sciences' report. The excellence of the advanced technology efforts at LLE is illustrated by the establishment of the Ultrafast Science Center by the Department of Defense through the Air Force Office of Scientific Research. Research in the Center will concentrate on bridging the gap between high-speed electronics and ultrafast optics by providing education, research, and development in areas critical to future communications and high-speed computer systems. The Laboratory for Laser Energetics continues its pioneering work on the interaction of intense radiation with matter. This includes inertial-fusion and advanced optical and optical electronics research; training people in the technology and applications of high-power, short-pulse lasers; and interacting with the scientific community, business, industry, and government to promote the growth of laser technology.

  15. Laboratory and Field Investigations of Small Crater Repair Technologies

    DTIC Science & Technology

    2007-09-01

    Repair technologies investigated include caps over debris backfill or specially placed or compacted backfill, structural systems to bridge craters, foamed crater backfills, and structural … The report was prepared by … Jeb S. Tingle, and Timothy J. McCaffrey of the Geotechnical and Structures Laboratory (GSL), U.S. Army Engineer Research and Development Center (ERDC), 3909 Halls Ferry Road, Vicksburg, MS. The findings and recommendations presented

  16. ANL statement of site strategy for computing workstations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fenske, K.R.; Boxberger, L.M.; Amiot, L.W.

    1991-11-01

    This Statement of Site Strategy describes the procedure at Argonne National Laboratory for defining, acquiring, using, and evaluating scientific and office workstations and related equipment and software in accord with DOE Order 1360.1A (5-30-85), and Laboratory policy. It is Laboratory policy to promote the installation and use of computing workstations to improve productivity and communications for both programmatic and support personnel, to ensure that computing workstation acquisitions meet the expressed need in a cost-effective manner, and to ensure that acquisitions of computing workstations are in accord with Laboratory and DOE policies. The overall computing site strategy at ANL is to develop a hierarchy of integrated computing system resources to address the current and future computing needs of the laboratory. The major system components of this hierarchical strategy are: supercomputers, parallel computers, centralized general purpose computers, distributed multipurpose minicomputers, and computing workstations and office automation support systems. Computing workstations include personal computers, scientific and engineering workstations, computer terminals, microcomputers, word processing and office automation electronic workstations, and associated software and peripheral devices costing less than $25,000 per item.

  17. Guidelines for biosafety laboratory competency: CDC and the Association of Public Health Laboratories.

    PubMed

    Delany, Judy R; Pentella, Michael A; Rodriguez, Joyce A; Shah, Kajari V; Baxley, Karen P; Holmes, David E

    2011-04-15

    These guidelines for biosafety laboratory competency outline the essential skills, knowledge, and abilities required for working with biologic agents at the three highest biosafety levels (BSLs) (levels 2, 3, and 4). The competencies are tiered to a worker's experience at three levels: entry level, midlevel (experienced), and senior level (supervisory or managerial positions). These guidelines were developed on behalf of CDC and the Association of Public Health Laboratories (APHL) by an expert panel comprising 27 experts representing state and federal public health laboratories, private sector clinical and research laboratories, and academic centers. They were then reviewed by approximately 300 practitioners representing the relevant fields. The guidelines are intended for laboratorians working with hazardous biologic agents, obtained from either samples or specimens that are maintained and manipulated in clinical, environmental, public health, academic, and research laboratories.

  18. Evaluating the distance between the femoral tunnel centers in anatomic double-bundle anterior cruciate ligament reconstruction using a computer simulation

    PubMed Central

    Tashiro, Yasutaka; Okazaki, Ken; Iwamoto, Yukihide

    2015-01-01

    Purpose We aimed to clarify the distance between the anteromedial (AM) bundle and posterolateral (PL) bundle tunnel-aperture centers by simulating the anatomical femoral tunnel placement during double-bundle anterior cruciate ligament reconstruction using 3-D computer-aided design models of the knee, in order to discuss the risk of tunnel overlap. Relationships between the AM to PL center distance, body height, and sex difference were also analyzed. Patients and methods The positions of the AM and PL tunnel centers were defined based on previous studies using the quadrant method, and were superimposed anatomically onto the 3-D computer-aided design knee models from 68 intact femurs. The distance between the tunnel centers was measured using the 3-D DICOM software package. The correlation between the AM–PL distance and the subject’s body height was assessed, and a cutoff height value for a higher risk of overlap of the AM and PL tunnel apertures was identified. Results The distance between the AM and PL centers was 10.2±0.6 mm in males and 9.4±0.5 mm in females (P<0.01). The AM–PL center distance demonstrated good correlation with body height in both males (r=0.66, P<0.01) and females (r=0.63, P<0.01). When 9 mm was defined as the critical distance between the tunnel centers to preserve a 2 mm bony bridge between the two tunnels, the cutoff value was calculated to be a height of 160 cm in males and 155 cm in females. Conclusion When AM and PL tunnels were placed anatomically in simulated double-bundle anterior cruciate ligament reconstruction, the distance between the two tunnel centers showed a strong positive correlation with body height. In cases with relatively short stature, the AM and PL tunnel apertures are considered to be at a higher risk of overlap when surgeons choose the double-bundle technique. PMID:26170727
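
    The analysis above reduces to a correlation between tunnel-center distance and body height, plus a height cutoff at which the fitted distance reaches the 9 mm threshold. The sketch below reproduces that mechanics on synthetic data; the generated heights, the assumed linear relation, and the noise level are invented for illustration and are not the study's measurements.

    ```python
    # Synthetic-data sketch: correlate AM-PL tunnel-center distance with height
    # and read off the height at which the fitted distance drops to 9 mm.
    import numpy as np
    from scipy.stats import linregress

    rng = np.random.default_rng(3)
    height_cm = rng.uniform(150, 185, 40)                          # hypothetical cohort
    distance_mm = 0.06 * height_cm - 0.6 + rng.normal(0, 0.3, 40)  # assumed relation

    fit = linregress(height_cm, distance_mm)
    print(f"r = {fit.rvalue:.2f}, p = {fit.pvalue:.3g}")

    cutoff_height = (9.0 - fit.intercept) / fit.slope   # height where distance = 9 mm
    print(f"estimated cutoff height for a 9 mm distance: {cutoff_height:.0f} cm")
    ```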

  19. The Virtual Geophysics Laboratory (VGL): Scientific Workflows Operating Across Organizations and Across Infrastructures

    NASA Astrophysics Data System (ADS)

    Cox, S. J.; Wyborn, L. A.; Fraser, R.; Rankine, T.; Woodcock, R.; Vote, J.; Evans, B.

    2012-12-01

    The Virtual Geophysics Laboratory (VGL) is a web portal that provides geoscientists with an integrated online environment that: seamlessly accesses geophysical and geoscience data services from the AuScope national geoscience information infrastructure; loosely couples these data to a variety of geoscience software tools; and provides large scale processing facilities via cloud computing. VGL is a collaboration between CSIRO, Geoscience Australia, National Computational Infrastructure, Monash University, Australian National University and the University of Queensland. The VGL provides a distributed system whereby a user can enter an online virtual laboratory to seamlessly connect to OGC web services for geoscience data. The data is supplied in open standards formats using international standards like GeoSciML. A VGL user uses a web mapping interface to discover and filter the data sources using spatial and attribute filters to define a subset. Once the data is selected the user is not required to download the data. VGL collates the service query information for later in the processing workflow, where it will be staged directly to the computing facilities. The combination of deferring data download and access to cloud computing enables VGL users to access their data at higher resolutions and to undertake larger scale inversions, more complex models and simulations than their own local computing facilities might allow. Inside the Virtual Geophysics Laboratory, the user has access to a library of existing models, complete with exemplar workflows for specific scientific problems based on those models. For example, the user can load a geological model published by Geoscience Australia, apply a basic deformation workflow provided by a CSIRO scientist, and have it run in a scientific code from Monash. Finally the user can publish these results to share with a colleague or cite in a paper. This opens new opportunities for access and collaboration as all the resources (models
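
    The OGC-standard, spatially filtered data access that VGL builds on can be illustrated with a plain WFS GetFeature request; the service URL and feature-type name below are placeholders, not real AuScope endpoints.

    ```python
    # Hypothetical OGC WFS GetFeature request with a bounding-box (spatial) filter.
    import requests

    WFS_URL = "https://example.org/geoserver/wfs"       # placeholder endpoint

    params = {
        "service": "WFS",
        "version": "1.1.0",
        "request": "GetFeature",
        "typeName": "gsml:Borehole",                    # illustrative GeoSciML feature type
        "bbox": "130.0,-30.0,140.0,-20.0,EPSG:4326",    # spatial filter
        "outputFormat": "GML2",
        "maxFeatures": "100",
    }

    response = requests.get(WFS_URL, params=params, timeout=60)
    response.raise_for_status()
    print(f"received {len(response.content)} bytes of GML")
    # In VGL it is this query description, rather than the downloaded data itself,
    # that is carried forward and staged to the cloud processing job.
    ```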

  20. A Framework for CS1 Closed Laboratories

    ERIC Educational Resources Information Center

    Soh, Leen-Kiat; Samal, Ashok; Nugent, Gwen

    2005-01-01

    Closed laboratories are becoming an increasingly popular approach to teaching introductory computer science courses, as they facilitate structured problem-solving and cooperation. However, most closed laboratories have been designed and implemented without embedded instructional research components for constant evaluation of the laboratories'…