Laboratory Computing Resource Center
Mathematics and Computer Science | Argonne National Laboratory
Center for Computing Research Summer Research Proceedings 2015.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bradley, Andrew Michael; Parks, Michael L.
2015-12-18
The Center for Computing Research (CCR) at Sandia National Laboratories organizes a student program each summer, in coordination with the Computer Science Research Institute (CSRI) and the Cyber Engineering Research Institute (CERI).
Jaschob, Daniel; Riffle, Michael
2012-07-30
Laboratories engaged in computational biology or bioinformatics frequently need to run lengthy, multistep, and user-driven computational jobs. Each job can tie up a computer for a few minutes to several days, and many laboratories lack the expertise or resources to build and maintain a dedicated computer cluster. JobCenter is a client-server application and framework for job management and distributed job execution. The client and server components are both written in Java and are cross-platform and relatively easy to install. All communication with the server is client-driven, which allows worker nodes to run anywhere (even behind external firewalls or "in the cloud") and provides inherent load balancing. Adding a worker node to the worker pool is as simple as dropping the JobCenter client files onto any computer and performing basic configuration, which provides tremendous ease-of-use, flexibility, and limitless horizontal scalability. Each worker installation may be independently configured, including the types of jobs it is able to run. Executed jobs may be written in any language and may include multistep workflows. JobCenter is a versatile and scalable distributed job management system that allows laboratories to very efficiently distribute all computational work among available resources. JobCenter is freely available at http://code.google.com/p/jobcenter/.
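The client-driven model described above can be sketched in a few lines: the server never pushes work, so idle workers poll more often and busy ones simply don't ask, which is where the inherent load balancing comes from. This is a minimal in-memory sketch of that pattern, not JobCenter's actual Java API; the class and method names are hypothetical.

```python
import queue
import threading

class JobServer:
    """Holds pending jobs; never pushes work to workers (client-driven model)."""
    def __init__(self, jobs):
        self._pending = queue.Queue()
        for job in jobs:
            self._pending.put(job)
        self.results = {}
        self._lock = threading.Lock()

    def request_job(self):
        """Called by a worker when it wants work; returns None when drained."""
        try:
            return self._pending.get_nowait()
        except queue.Empty:
            return None

    def report_result(self, job_id, result):
        with self._lock:
            self.results[job_id] = result

def worker_loop(server, handler):
    """A worker polls for jobs until the server has nothing left."""
    while True:
        job = server.request_job()
        if job is None:
            break
        job_id, payload = job
        server.report_result(job_id, handler(payload))

# Example: square ten numbers across two worker threads.
server = JobServer([(i, i) for i in range(10)])
threads = [threading.Thread(target=worker_loop, args=(server, lambda x: x * x))
           for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(sorted(server.results.items())[:3])  # → [(0, 0), (1, 1), (2, 4)]
```

Adding a worker here is just starting another thread pointed at the server, which mirrors how JobCenter scales horizontally by dropping client files onto new machines.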
NASA Technical Reports Server (NTRS)
Boyle, W. G.; Barton, G. W.
1979-01-01
The feasibility of computerized automation of the Analytical Laboratories Section at NASA's Lewis Research Center was considered. Since that laboratory's duties are not routine, the automation goals were set with that in mind. Four instruments were selected as the most likely automation candidates: an atomic absorption spectrophotometer, an emission spectrometer, an X-ray fluorescence spectrometer, and an X-ray diffraction unit. Two options for computer automation were described: a time-shared central computer and a system with microcomputers for each instrument connected to a central computer. A third option, presented for future planning, expands the microcomputer version. Costs and benefits for each option were considered. It was concluded that the microcomputer version best fits the goals and duties of the laboratory and that such an automated system is needed to meet the laboratory's future requirements.
3D Object Recognition: Symmetry and Virtual Views
1992-12-01
Artificial Intelligence Laboratory and Center for Biological and Computational Learning, A.I. Memo No. 1409, C.B.C.L. Paper No. 76, 545 Technology Square, Cambridge, December 1992. The research was done within the Center for Biological and Computational Learning in the Department of Brain and Cognitive Sciences, and at the Artificial Intelligence Laboratory.
The NASA Lewis Research Center High Temperature Fatigue and Structures Laboratory
NASA Technical Reports Server (NTRS)
Mcgaw, M. A.; Bartolotta, P. A.
1987-01-01
The physical organization of the NASA Lewis Research Center High Temperature Fatigue and Structures Laboratory is described. Particular attention is given to uniaxial test systems, high cycle/low cycle testing systems, axial torsional test systems, computer system capabilities, and a laboratory addition. The proposed addition will double the floor area of the present laboratory and will be equipped with its own control room.
The Development of University Computing in Sweden 1965-1985
NASA Astrophysics Data System (ADS)
Dahlstrand, Ingemar
In 1965-70 the government agency, Statskontoret, set up five university computing centers, as service bureaux financed by grants earmarked for computer use. The centers were well equipped and staffed and caused a surge in computer use. When the yearly flow of grant money stagnated at 25 million Swedish crowns, the centers had to find external income to survive and acquire time-sharing. But the charging system led to the computers not being fully used. The computer scientists lacked equipment for laboratory use. The centers were decentralized and the earmarking abolished. Eventually they got new tasks like running computers owned by the departments, and serving the university administration.
Argonne Research Library | Argonne National Laboratory
Publications Researchers Postdocs Exascale Computing Institute for Molecular Engineering at Argonne Work with Scientific Publications Researchers Postdocs Exascale Computing Institute for Molecular Engineering at IMEInstitute for Molecular Engineering JCESRJoint Center for Energy Storage Research MCSGMidwest Center for
The Petascale Data Storage Institute
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gibson, Garth; Long, Darrell; Honeyman, Peter
2013-07-01
Petascale computing infrastructures for scientific discovery make petascale demands on information storage capacity, performance, concurrency, reliability, availability, and manageability. The Petascale Data Storage Institute focuses on the data storage problems found in petascale scientific computing environments, with special attention to community issues such as interoperability, community buy-in, and shared tools. The Petascale Data Storage Institute is a collaboration between researchers at Carnegie Mellon University, the National Energy Research Scientific Computing Center, Pacific Northwest National Laboratory, Oak Ridge National Laboratory, Sandia National Laboratories, Los Alamos National Laboratory, the University of Michigan, and the University of California at Santa Cruz.
Transportation Research and Analysis Computing Center (TRACC) Year 6 Quarter 4 Progress Report
DOT National Transportation Integrated Search
2013-03-01
Argonne National Laboratory initiated a FY2006-FY2009 multi-year program with the US Department of Transportation (USDOT) on October 1, 2006, to establish the Transportation Research and Analysis Computing Center (TRACC). As part of the TRACC project...
DOE Office of Scientific and Technical Information (OSTI.GOV)
De Conti, C.; Barbero, C.; Galeão, A. P.
In this work we compute the one-nucleon-induced nonmesonic hypernuclear decay rates of ⁵ΛHe, ¹²ΛC, and ¹³ΛC using a formalism based on the independent-particle shell model in terms of laboratory coordinates. To ascertain the correctness and precision of the method, these results are compared with those obtained using a formalism in terms of center-of-mass coordinates, which has been previously reported in the literature. The formalism in terms of laboratory coordinates will be useful in the shell-model approach to two-nucleon-induced transitions.
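The comparison between laboratory-frame and center-of-mass formalisms rests on the standard two-body coordinate transformation. A minimal numeric sketch of that transformation and its inverse (generic two-body kinematics only; the shell-model machinery in the work above is far more involved):

```python
# Standard two-body transformation between laboratory coordinates (r1, r2)
# and center-of-mass / relative coordinates (R, r):
#   R = (m1*r1 + m2*r2) / (m1 + m2),  r = r1 - r2

def to_cm(m1, r1, m2, r2):
    """Return (R, r): center-of-mass and relative coordinates."""
    M = m1 + m2
    R = tuple((m1 * a + m2 * b) / M for a, b in zip(r1, r2))
    r = tuple(a - b for a, b in zip(r1, r2))
    return R, r

def to_lab(m1, m2, R, r):
    """Invert the transformation back to laboratory coordinates."""
    M = m1 + m2
    r1 = tuple(Ri + (m2 / M) * ri for Ri, ri in zip(R, r))
    r2 = tuple(Ri - (m1 / M) * ri for Ri, ri in zip(R, r))
    return r1, r2

R, r = to_cm(1.0, (1.0, 0.0), 2.0, (4.0, 3.0))
print(R, r)  # → (3.0, 2.0) (-3.0, -3.0)
```

Round-tripping through `to_lab` recovers the laboratory coordinates (up to floating-point rounding), which is the invertibility that lets the two formalisms be checked against each other.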
National Wind Technology Center Provides Dual Axis Resonant Blade Testing
Felker, Fort
2018-01-16
NREL's Structural Testing Laboratory at the National Wind Technology Center (NWTC) provides experimental laboratories, computer facilities for analytical work, space for assembling components and turbines for atmospheric testing as well as office space for industry researchers. Fort Felker, center director at the NWTC, discusses NREL's state-of-the-art structural testing capabilities and shows a flapwise and edgewise blade test in progress.
ERIC Educational Resources Information Center
Cottrell, William B.; And Others
The Nuclear Safety Information Center (NSIC) is a highly sophisticated scientific information center operated at Oak Ridge National Laboratory (ORNL) for the U.S. Atomic Energy Commission. Its information file, which consists of both data and bibliographic information, is computer stored and numerous programs have been developed to facilitate the…
Utilization of Educationally Oriented Microcomputer Based Laboratories
ERIC Educational Resources Information Center
Fitzpatrick, Michael J.; Howard, James A.
1977-01-01
Describes one approach to supplying engineering and computer science educators with an economical portable digital systems laboratory centered around microprocessors. Expansion of the microcomputer based laboratory concept to include Learning Resource Aided Instruction (LRAI) systems is explored. (Author)
Join the Center for Applied Scientific Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gamblin, Todd; Bremer, Timo; Van Essen, Brian
The Center for Applied Scientific Computing serves as Livermore Lab’s window to the broader computer science, computational physics, applied mathematics, and data science research communities. In collaboration with academic, industrial, and other government laboratory partners, we conduct world-class scientific research and development on problems critical to national security. CASC applies the power of high-performance computing and the efficiency of modern computational methods to the realms of stockpile stewardship, cyber and energy security, and knowledge discovery for intelligence applications.
Report on Computing and Networking in the Space Science Laboratory by the SSL Computer Committee
NASA Technical Reports Server (NTRS)
Gallagher, D. L. (Editor)
1993-01-01
The Space Science Laboratory (SSL) at Marshall Space Flight Center is a multiprogram facility. Scientific research is conducted in four discipline areas: earth science and applications, solar-terrestrial physics, astrophysics, and microgravity science and applications. Representatives from each of these discipline areas participate in a Laboratory computer requirements committee, which developed this document. Its purpose is to establish and discuss Laboratory objectives for computing and networking in support of science, and to lay the foundation for a collective, multiprogram approach to providing these services. Special recognition is given to the importance of the national and international efforts of our research communities toward the development of interoperable, network-based computer applications.
Webinar: Delivering Transformational HPC Solutions to Industry
Streitz, Frederick
2018-01-16
Dr. Frederick Streitz, director of the High Performance Computing Innovation Center, discusses Lawrence Livermore National Laboratory computational capabilities and expertise available to industry in this webinar.
Institute for scientific computing research;fiscal year 1999 annual report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keyes, D
2000-03-28
Large-scale scientific computation, and all of the disciplines that support it and help to validate it, have been placed at the focus of Lawrence Livermore National Laboratory by the Accelerated Strategic Computing Initiative (ASCI). The Laboratory operates the computer with the highest peak performance in the world and has undertaken some of the largest and most compute-intensive simulations ever performed. Computers at the architectural extremes, however, are notoriously difficult to use efficiently. Even such successes as the Laboratory's two Bell Prizes awarded in November 1999 only emphasize the need for much better ways of interacting with the results of large-scale simulations. Advances in scientific computing research have, therefore, never been more vital to the core missions of the Laboratory than at present. Computational science is evolving so rapidly along every one of its research fronts that to remain on the leading edge, the Laboratory must engage researchers at many academic centers of excellence. In FY 1999, the Institute for Scientific Computing Research (ISCR) has expanded the Laboratory's bridge to the academic community in the form of collaborative subcontracts, visiting faculty, student internships, a workshop, and a very active seminar series. ISCR research participants are integrated almost seamlessly with the Laboratory's Center for Applied Scientific Computing (CASC), which, in turn, addresses computational challenges arising throughout the Laboratory. Administratively, the ISCR flourishes under the Laboratory's University Relations Program (URP). Together with the other four Institutes of the URP, it must navigate a course that allows the Laboratory to benefit from academic exchanges while preserving national security.
Although FY 1999 brought more than its share of challenges to the operation of an academic-like research enterprise within the context of a national security laboratory, the results declare the challenges well met and well worth the continued effort. A change of administration for the ISCR occurred during FY 1999. Acting Director John Fitzgerald retired from LLNL in August after 35 years of service, including the last two at the helm of the ISCR. David Keyes, who has been a regular visitor in conjunction with ASCI scalable algorithms research since October 1997, overlapped with John for three months and serves half-time as the new Acting Director.
High performance computing for advanced modeling and simulation of materials
NASA Astrophysics Data System (ADS)
Wang, Jue; Gao, Fei; Vazquez-Poletti, Jose Luis; Li, Jianjiang
2017-02-01
The First International Workshop on High Performance Computing for Advanced Modeling and Simulation of Materials (HPCMS2015) was held in Austin, Texas, USA, Nov. 18, 2015. HPCMS 2015 was organized by Computer Network Information Center (Chinese Academy of Sciences), University of Michigan, Universidad Complutense de Madrid, University of Science and Technology Beijing, Pittsburgh Supercomputing Center, China Institute of Atomic Energy, and Ames Laboratory.
Computational Science News | Computational Science | NREL
February 28, 2018. NREL Launches New Website for High-Performance Computing System Users: The National Renewable Energy Laboratory (NREL) Computational Science Center has launched a revamped website for users of the lab's high-performance computing (HPC) systems.
NASA Technical Reports Server (NTRS)
Davis, V. Leon; Nordeen, Ross
1988-01-01
A laboratory for developing robotics technology for hazardous and repetitive Shuttle and payload processing activities is discussed. An overview of the computer hardware and software responsible for integrating the laboratory systems is given. The center's anthropomorphic robot is placed on a track allowing it to be moved to different stations. Various aspects of the laboratory equipment are described, including industrial robot arm control, smart systems integration, the supervisory computer, programmable process controller, real-time tracking controller, image processing hardware, and control display graphics. Topics of research include: automated loading and unloading of hypergolics for space vehicles and payloads; the use of mobile robotics for security, fire fighting, and hazardous spill operations; nondestructive testing for SRB joint and seal verification; Shuttle Orbiter radiator damage inspection; and Orbiter contour measurements. The possibility of expanding the laboratory in the future is examined.
77 FR 38630 - Open Internet Advisory Committee
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-28
... Computer Science and Co-Founder of the Berkman Center for Internet and Society, Harvard University, is... of Technology Computer Science and Artificial Intelligence Laboratory, is appointed vice-chairperson... Jennifer Rexford, Professor of Computer Science, Princeton University Dennis Roberson, Vice Provost...
Saving Water at Los Alamos National Laboratory
Erickson, Andy
2018-01-16
Los Alamos National Laboratory decreased its water usage by 26 percent in 2014, with about one-third of the reduction attributable to using reclaimed water to cool a supercomputing center. The Laboratory's goal during 2014 was to use only re-purposed water to support the mission at the Strategic Computing Complex. Using reclaimed water from the Sanitary Effluent Reclamation Facility, or SERF, substantially decreased water usage and supported the overall mission. SERF collects industrial wastewater and treats it for reuse. The reclamation facility contributed more than 27 million gallons of re-purposed water to the Laboratory's computing center, a secured supercomputing facility that supports the Laboratory's national security mission and is one of the institution's larger water users. In addition to the strategic water reuse program at SERF, the Laboratory reduced water use in 2014 by focusing conservation efforts on areas that use the most water, upgrading to water-conserving fixtures, and repairing leaks identified in a biennial survey.
Rutkowski, Tomasz M
2015-08-01
This paper presents an applied concept of a brain-computer interface (BCI) student research laboratory (BCI-LAB) at the Life Science Center of TARA, University of Tsukuba, Japan. Several successful case studies of the student projects are reviewed together with the BCI Research Award 2014 winner case. The BCI-LAB design and project-based teaching philosophy is also explained. Future teaching and research directions summarize the review.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Muller, Richard P.
2017-07-01
Sandia National Laboratories has developed a broad set of capabilities in quantum information science (QIS), including elements of quantum computing, quantum communications, and quantum sensing. The Sandia QIS program is built atop unique DOE investments at the laboratories, including the MESA microelectronics fabrication facility, the Center for Integrated Nanotechnologies (CINT) facilities (joint with LANL), the Ion Beam Laboratory, and ASC High Performance Computing (HPC) facilities. Sandia has invested $75 M of LDRD funding over 12 years to develop unique, differentiating capabilities that leverage these DOE infrastructure investments.
ERIC Educational Resources Information Center
Buckley, Elizabeth; Johnston, Peter
In February 1977, computer assisted instruction (CAI) was introducted to the Great Neck Adult Learning Centers (GNALC) to promote greater cognitive and affective growth of educationally disadvantaged adults. The project expanded to include not only adult basic education (ABE) students studying in the learning laboratory, but also ABE students…
Origin of Marshall Space Flight Center (MSFC)
2004-04-15
Twelve scientific specialists of the Peenemuende team at the front of Building 4488, Redstone Arsenal, Huntsville, Alabama. They led the Army's space efforts at ABMA before transfer of the team to the National Aeronautics and Space Administration (NASA), George C. Marshall Space Flight Center (MSFC). (Left to right) Dr. Ernst Stuhlinger, Director, Research Projects Office; Dr. Helmut Hoelzer, Director, Computation Laboratory; Karl L. Heimburg, Director, Test Laboratory; Dr. Ernst Geissler, Director, Aeroballistics Laboratory; Erich W. Neubert, Director, Systems Analysis Reliability Laboratory; Dr. Walter Haeussermann, Director, Guidance and Control Laboratory; Dr. Wernher von Braun, Director, Development Operations Division; William A. Mrazek, Director, Structures and Mechanics Laboratory; Hans Hueter, Director, System Support Equipment Laboratory; Eberhard Rees, Deputy Director, Development Operations Division; Dr. Kurt Debus, Director, Missile Firing Laboratory; Hans H. Maus, Director, Fabrication and Assembly Engineering Laboratory.
Adiabatic Quantum Computation with Neutral Atoms
NASA Astrophysics Data System (ADS)
Biedermann, Grant
2013-03-01
We are implementing a new platform for adiabatic quantum computation (AQC)[2] based on trapped neutral atoms whose coupling is mediated by the dipole-dipole interactions of Rydberg states. Ground state cesium atoms are dressed by laser fields in a manner conditional on the Rydberg blockade mechanism,[3,4] thereby providing the requisite entangling interactions. As a benchmark we study a Quadratic Unconstrained Binary Optimization (QUBO) problem whose solution is found in the ground state spin configuration of an Ising-like model. In collaboration with Lambert Parazzoli, Sandia National Laboratories; Aaron Hankin, Center for Quantum Information and Control (CQuIC), University of New Mexico; James Chin-Wen Chou, Yuan-Yu Jau, Peter Schwindt, Cort Johnson, and George Burns, Sandia National Laboratories; Tyler Keating, Krittika Goyal, and Ivan Deutsch, Center for Quantum Information and Control (CQuIC), University of New Mexico; and Andrew Landahl, Sandia National Laboratories. This work was supported by the Laboratory Directed Research and Development program at Sandia National Laboratories
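The QUBO benchmark mentioned above has a compact statement: minimize an energy that is linear in binary variables plus pairwise couplings, whose minimizer corresponds to the ground-state spin configuration of an Ising-like model. For a small instance the ground state can be found by exhaustive search, as in this sketch (the matrix below is illustrative, not one from the work described):

```python
import itertools

# Brute-force ground-state search for a small QUBO instance:
#   minimize E(x) = sum_i Q[i][i]*x_i + sum_{i<j} Q[i][j]*x_i*x_j,  x_i in {0,1}
# (upper-triangular Q; illustrative values, not from the experiment above)
Q = [
    [-1,  2,  0],
    [ 0, -1,  2],
    [ 0,  0, -1],
]

def energy(x, Q):
    """QUBO energy of binary configuration x under upper-triangular Q."""
    n = len(x)
    e = 0
    for i in range(n):
        e += Q[i][i] * x[i]
        for j in range(i + 1, n):
            e += Q[i][j] * x[i] * x[j]
    return e

def ground_state(Q):
    """Exhaustively search all 2^n configurations for the minimum energy."""
    n = len(Q)
    return min(itertools.product((0, 1), repeat=n), key=lambda x: energy(x, Q))

best = ground_state(Q)
print(best, energy(best, Q))  # → (1, 0, 1) -2
```

Exhaustive search scales as 2^n, which is exactly why hardware approaches such as the AQC platform above are of interest for larger instances.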
Programmer's Reference Manual for Dynamic Display Software System
DOT National Transportation Integrated Search
1971-01-01
In 1968, the display systems group of the Systems Laboratory of the NASA/Electronics Research Center undertook a research task in the area of computer controlled flight information systems for aerospace application. The display laboratory of the Trans...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhattacharya, Papri; Prokopchuk, Demyan E.; Mock, Michael T.
2017-03-01
This review examines the synthesis and acid reactivity of transition metal dinitrogen complexes bearing diphosphine ligands containing pendant amine groups in the second coordination sphere. This manuscript is a review of the work performed in the Center for Molecular Electrocatalysis. This work was supported as part of the Center for Molecular Electrocatalysis, an Energy Frontier Research Center funded by the U.S. Department of Energy (U.S. DOE), Office of Science, Office of Basic Energy Sciences. EPR studies on Fe were performed using EMSL, a national scientific user facility sponsored by the DOE's Office of Biological and Environmental Research and located at PNNL. Computational resources were provided by the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory. Pacific Northwest National Laboratory is operated by Battelle for the U.S. DOE.
High Performance Computing Meets Energy Efficiency - Continuum Magazine | NREL
[Image caption: wind turbine simulation by Patrick J. Moriarty and Matthew J. Churchfield, NREL.] The new High Performance Computing Data Center at the National Renewable Energy Laboratory (NREL) hosts high-speed, high-volume data...
FY04 Engineering Technology Reports Laboratory Directed Research and Development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sharpe, R M
2005-01-27
This report summarizes the science and technology research and development efforts in Lawrence Livermore National Laboratory's Engineering Directorate for FY2004, and exemplifies Engineering's more than 50-year history of developing the technologies needed to support the Laboratory's missions. Engineering has been a partner in every major program and project at the Laboratory throughout its existence and has prepared for this role with a skilled workforce and the technical resources developed through venues like the Laboratory Directed Research and Development Program (LDRD). This accomplishment is well summarized by Engineering's mission: ''Enable program success today and ensure the Laboratory's vitality tomorrow''. Engineering's investment in technologies is carried out through two programs, the ''Tech Base'' program and the LDRD program. LDRD is the vehicle for creating those technologies and competencies that are cutting edge. These require a significant level of research or contain some unknown that needs to be fully understood. Tech Base is used to apply technologies to a Laboratory need. The term commonly used for Tech Base projects is ''reduction to practice''. Therefore, the LDRD report covered here has a strong research emphasis. Areas that are presented all fall into those needed to accomplish our mission. For FY2004, Engineering's LDRD projects were focused on mesoscale target fabrication and characterization, development of engineering computational capability, material studies and modeling, remote sensing and communications, and microtechnology and nanotechnology for national security applications. Engineering's five Centers, in partnership with the Division Leaders and Department Heads, are responsible for guiding the long-term science and technology investments for the Directorate.
The Centers represent technologies that have been identified as critical for the present and future work of the Laboratory, and are chartered to develop their respective areas. Their LDRD projects are the key resources to attain this competency, and, as such, nearly all of Engineering's portfolio falls under one of the five Centers. The Centers and their Directors are: (1) Center for Computational Engineering: Robert M. Sharpe; (2) Center for Microtechnology and Nanotechnology: Raymond P. Mariella, Jr.; (3) Center for Nondestructive Characterization: Harry E. Martz, Jr.; (4) Center for Precision Engineering: Keith Carlisle; and (5) Center for Complex Distributed Systems: Gregory J. Suski, Acting Director.
Final Report. Center for Scalable Application Development Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mellor-Crummey, John
2014-10-26
The Center for Scalable Application Development Software (CScADS) was established as a partnership between Rice University, Argonne National Laboratory, University of California Berkeley, University of Tennessee – Knoxville, and University of Wisconsin – Madison. CScADS pursued an integrated set of activities with the aim of increasing the productivity of DOE computational scientists by catalyzing the development of systems software, libraries, compilers, and tools for leadership computing platforms. Principal Center activities were workshops to engage the research community in the challenges of leadership computing, research and development of open-source software, and work with computational scientists to help them develop codes for leadership computing platforms. This final report summarizes CScADS activities at Rice University in these areas.
NASA Technical Reports Server (NTRS)
Jansen, B. J., Jr.
1998-01-01
The features of the data acquisition and control systems of the NASA Langley Research Center's Jet Noise Laboratory are presented. The Jet Noise Laboratory is a facility that simulates realistic mixed flow turbofan jet engine nozzle exhaust systems in simulated flight. The system is capable of acquiring data for a complete take-off assessment of noise and nozzle performance. This paper describes the development of an integrated system to control and measure the behavior of model jet nozzles featuring dual independent high pressure combusting air streams with wind tunnel flow. The acquisition and control system is capable of simultaneous measurement of forces, moments, static and dynamic model pressures and temperatures, and jet noise. The design concepts for the coordination of the control computers and multiple data acquisition computers and instruments are discussed. The control system design and implementation are explained, describing the features, equipment, and the experiences of using a primarily Personal Computer based system. Areas for future development are examined.
37. Photograph of plan for repairs to computer room, 1958, ...
37. Photograph of plan for repairs to computer room, 1958, prepared by the Public Works Office, Underwater Sound Laboratory. Drawing on file at Caretaker Site Office, Naval Undersea Warfare Center, New London. Copyright-free. - Naval Undersea Warfare Center, Bowditch Hall, 600 feet east of Smith Street & 350 feet south of Columbia Cove, West bank of Thames River, New London, New London County, CT
Removing the center from computing: biology's new mode of digital knowledge production.
November, Joseph
2011-06-01
This article shows how the USA's National Institutes of Health (NIH) helped to bring about a major shift in the way computers are used to produce knowledge and in the design of computers themselves as a consequence of its early 1960s efforts to introduce information technology to biologists. Starting in 1960 the NIH sought to reform the life sciences by encouraging researchers to make use of digital electronic computers, but despite generous federal support biologists generally did not embrace the new technology. Initially the blame fell on biologists' lack of appropriate (i.e. digital) data for computers to process. However, when the NIH consulted MIT computer architect Wesley Clark about this problem, he argued that the computer's quality as a device that was centralized posed an even greater challenge to potential biologist users than did the computer's need for digital data. Clark convinced the NIH that if the agency hoped to effectively computerize biology, it would need to satisfy biologists' experimental and institutional needs by providing them the means to use a computer without going to a computing center. With NIH support, Clark developed the 1963 Laboratory Instrument Computer (LINC), a small, real-time interactive computer intended to be used inside the laboratory and controlled entirely by its biologist users. Once built, the LINC provided a viable alternative to the 1960s norm of large computers housed in computing centers. As such, the LINC not only became popular among biologists, but also served in later decades as an important precursor of today's computing norm in the sciences and far beyond, the personal computer.
Sandia National Laboratories: Advanced Simulation and Computing
Research Networks and Technology Migration (RESNETSII)
2004-07-01
Lawrence Berkeley National Laboratory (LBNL), the International Computer Science Institute (ICSI) Center for Internet Research (ICIR): DARWIN, developing protocols and ... degradation in network loss, delay, and throughput; AT&T Center for Internet Research at ICSI (ACIRI), AT&T Labs-Research, University of Massachusetts.
Reinventing patient-centered computing for the twenty-first century.
Goldberg, H S; Morales, A; Gottlieb, L; Meador, L; Safran, C
2001-01-01
Despite evidence over the past decade that patients like and will use patient-centered computing systems in managing their health, patients have remained forgotten stakeholders in advances in clinical computing systems. We present a framework for patient empowerment and the technical realization of that framework in an architecture called CareLink. In an evaluation of the initial deployment of CareLink in the support of neonatal intensive care, we have demonstrated a reduction in the length of stay for very-low birthweight infants, and an improvement in family satisfaction with care delivery. With the ubiquitous adoption of the Internet into the general culture, patient-centered computing provides the opportunity to mend broken health care relationships and reconnect patients to the care delivery process. CareLink itself provides functionality to support both clinical care and research, and provides a living laboratory for the further study of patient-centered computing.
Center for space microelectronics technology
NASA Technical Reports Server (NTRS)
1993-01-01
The 1992 Technical Report of the Jet Propulsion Laboratory Center for Space Microelectronics Technology summarizes the technical accomplishments, publications, presentations, and patents of the center during the past year. The report lists 187 publications, 253 presentations, and 111 new technology reports and patents in the areas of solid-state devices, photonics, advanced computing, and custom microcircuits.
Laboratory Directed Research & Development (LDRD)
The role of dedicated data computing centers in the age of cloud computing
NASA Astrophysics Data System (ADS)
Caramarcu, Costin; Hollowell, Christopher; Strecker-Kellogg, William; Wong, Antonio; Zaytsev, Alexandr
2017-10-01
Brookhaven National Laboratory (BNL) anticipates significant growth in scientific programs with large computing and data storage needs in the near future and has recently reorganized support for scientific computing to meet these needs. A key component is the enhanced role of the RHIC-ATLAS Computing Facility (RACF) in support of high-throughput and high-performance computing (HTC and HPC) at BNL. This presentation discusses the evolving role of the RACF at BNL, in light of its growing portfolio of responsibilities and its increasing integration with cloud (academic and for-profit) computing activities. We also discuss BNL’s plan to build a new computing center to support the new responsibilities of the RACF and present a summary of the cost benefit analysis done, including the types of computing activities that benefit most from a local data center vs. cloud computing. This analysis is partly based on an updated cost comparison of Amazon EC2 computing services and the RACF, which was originally conducted in 2012.
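The cost-benefit analysis mentioned above weighs amortized local data-center costs against on-demand cloud rates. A minimal sketch of such a comparison is below; all dollar figures, core counts, and rates are hypothetical placeholders, not the BNL/RACF or Amazon EC2 numbers.

```python
# Hypothetical cost comparison: local data center vs. on-demand cloud.
# All figures are illustrative placeholders, NOT the actual RACF/EC2 numbers.

def local_cost_per_core_hour(capex, lifetime_years, annual_opex, cores, utilization):
    """Amortized cost per delivered core-hour of a local cluster."""
    delivered_hours = lifetime_years * 8760 * utilization
    total_cost = capex + annual_opex * lifetime_years
    return total_cost / (cores * delivered_hours)

def breakeven_utilization(capex, lifetime_years, annual_opex, cores, cloud_rate):
    """Utilization above which the local cluster beats the cloud rate."""
    total_cost = capex + annual_opex * lifetime_years
    return total_cost / (cores * lifetime_years * 8760 * cloud_rate)

local = local_cost_per_core_hour(
    capex=2_000_000, lifetime_years=5, annual_opex=300_000,
    cores=10_000, utilization=0.85)
print(f"local: ${local:.4f}/core-hour vs. cloud: $0.05/core-hour")
print(f"break-even utilization: "
      f"{breakeven_utilization(2_000_000, 5, 300_000, 10_000, 0.05):.1%}")
```

With these placeholder inputs the local cluster wins at high utilization; the break-even function captures the usual finding that steady, high-throughput workloads favor a local center while bursty ones favor the cloud.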
Staff | Computational Science | NREL
An information retrieval system for research file data
Joan E. Lengel; John W. Koning
1978-01-01
Research file data have been successfully retrieved at the Forest Products Laboratory through a high-speed cross-referencing system involving the computer program FAMULUS as modified by the Madison Academic Computing Center at the University of Wisconsin. The method of data input, transfer to computer storage, system utilization, and effectiveness are discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tucker, T.C.
1980-06-01
The implementation of a version of the Rutherford Laboratory's magnetostatic computer code GFUN3D on the CDC 7600 at the National Magnetic Fusion Energy Computer Center is reported. A new iteration technique that greatly increases the probability of convergence and reduces computation time by about 30% for calculations with nonlinear, ferromagnetic materials is included. The use of GFUN3D on the NMFE network is discussed, and suggestions for future work are presented. Appendix A consists of revisions to the GFUN3D User Guide (published by Rutherford Laboratory) that are necessary to use this version. Appendix B contains input and output for some sample calculations. Appendix C is a detailed discussion of the old and new iteration techniques.
NASA Technical Reports Server (NTRS)
Young, Gerald W.; Clemons, Curtis B.
2004-01-01
The focus of this Cooperative Agreement between the Computational Materials Laboratory (CML) of the Processing Science and Technology Branch of the NASA Glenn Research Center (GRC) and the Department of Theoretical and Applied Mathematics at The University of Akron was in the areas of system development of the CML workstation environment, modeling of microgravity and earth-based material processing systems, and joint activities in laboratory projects. These efforts complement each other as the majority of the modeling work involves numerical computations to support laboratory investigations. Coordination and interaction between the modelers, system analysts, and laboratory personnel are essential toward providing the most effective simulations and communication of the simulation results. Toward these means, The University of Akron personnel involved in the agreement worked at the Applied Mathematics Research Laboratory (AMRL) in the Department of Theoretical and Applied Mathematics while maintaining a close relationship with the personnel of the Computational Materials Laboratory at GRC. Network communication between both sites has been established. A summary of the projects we undertook during the time period 9/1/03 - 6/30/04 is included.
Adaptive Mesh Experiments for Hyperbolic Partial Differential Equations
1990-02-01
Joseph E. Flaherty, Department of Computer Science, Rensselaer Polytechnic Institute, Troy, NY 12180-3590, and U.S. Army ARDEC, Close Combat Armaments Center, Benet Laboratories, NY 12189-4050. Report date: February 1990.
A FRAMEWORK FOR A COMPUTATIONAL TOXICOLOGY RESEARCH PROGRAM IN ORD
"A Framework for a Computational Toxicology Research Program in ORD" was drafted by a Technical Writing Team with representatives from all of ORD's Laboratories and Centers. The document describes a framework for the development of a program within ORD to utilize approaches d...
Data collection and evaluation for experimental computer science research
NASA Technical Reports Server (NTRS)
Zelkowitz, Marvin V.
1983-01-01
The Software Engineering Laboratory has been monitoring software development at NASA Goddard Space Flight Center since 1976. The data collection activities of the Laboratory and some of the difficulties of obtaining reliable data are described. In addition, the application of this data collection process to a current prototyping experiment is reviewed.
ERIC Educational Resources Information Center
Ryan, William C., Ed.
The 150 papers and 28 panel discussion reports in this collection focus on innovations, trends, and research in the use of computers in a variety of educational settings. Topics discussed include: computer centers and laboratories; use of computers at various levels from K-12 through the university, including inservice teacher training; use of…
Senior Computational Scientist | Center for Cancer Research
The Basic Science Program (BSP) pursues independent, multidisciplinary research in basic and applied molecular biology, immunology, retrovirology, cancer biology, and human genetics. Research efforts and support are an integral part of the Center for Cancer Research (CCR) at the Frederick National Laboratory for Cancer Research (FNLCR). The Cancer & Inflammation Program (CIP),
Proceedings of the 5. joint Russian-American computational mathematics conference
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1997-12-31
These proceedings contain a record of the talks presented and papers submitted by participants. The conference participants represented three institutions from the United States, Sandia National Laboratories (SNL), Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and two from Russia, Russian Federal Nuclear Center--All Russian Research Institute of Experimental Physics (RFNC-VNIIEF/Arzamas-16), and Russian Federal Nuclear Center--All Russian Research Institute of Technical Physics (RFNC-VNIITF/Chelyabinsk-70). The presentations and papers cover a wide range of applications from radiation transport to materials. Selected papers have been indexed separately for inclusion in the Energy Science and Technology Database.
Halligan, Brian D.; Geiger, Joey F.; Vallejos, Andrew K.; Greene, Andrew S.; Twigger, Simon N.
2009-01-01
One of the major difficulties for many laboratories setting up proteomics programs has been obtaining and maintaining the computational infrastructure required for the analysis of the large flow of proteomics data. We describe a system that combines distributed cloud computing and open source software to allow laboratories to set up scalable virtual proteomics analysis clusters without the investment in computational hardware or software licensing fees. Additionally, the pricing structure of distributed computing providers, such as Amazon Web Services, allows laboratories or even individuals to have large-scale computational resources at their disposal at a very low cost per run. We provide detailed step-by-step instructions on how to implement the virtual proteomics analysis clusters as well as a list of currently available preconfigured Amazon machine images containing the OMSSA and X!Tandem search algorithms and sequence databases on the Medical College of Wisconsin Proteomics Center website (http://proteomics.mcw.edu/vipdac). PMID:19358578
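The virtual-cluster approach described above boots preconfigured machine images as a head node plus workers. A minimal sketch of assembling such a launch specification follows; the AMI ID, instance type, and cluster size are hypothetical placeholders, and a real deployment would hand these parameter dicts to a cloud SDK such as boto3's `run_instances`.

```python
# Sketch of assembling launch parameters for a preconfigured analysis AMI,
# in the spirit of the virtual proteomics clusters described above.
# The AMI ID and instance type below are hypothetical placeholders.

def build_cluster_spec(ami_id, worker_count, instance_type="c5.xlarge"):
    """Return launch parameters for one head node plus identical workers."""
    if worker_count < 1:
        raise ValueError("need at least one worker")
    return {
        "head": {"ImageId": ami_id, "InstanceType": instance_type,
                 "MinCount": 1, "MaxCount": 1},
        "workers": {"ImageId": ami_id, "InstanceType": instance_type,
                    "MinCount": worker_count, "MaxCount": worker_count},
    }

spec = build_cluster_spec("ami-0123456789abcdef0", worker_count=4)
print(spec["workers"]["MaxCount"])  # 4
```

Because every node boots from the same preconfigured image, scaling the cluster is just a matter of raising `worker_count`, which mirrors the "low cost per run" pricing argument made in the abstract.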
Computational Toxicology as Implemented by the US EPA ...
Computational toxicology is the application of mathematical and computer models to help assess chemical hazards and risks to human health and the environment. Supported by advances in informatics, high-throughput screening (HTS) technologies, and systems biology, the U.S. Environmental Protection Agency (EPA) is developing robust and flexible computational tools that can be applied to the thousands of chemicals in commerce, and contaminant mixtures found in air, water, and hazardous-waste sites. The Office of Research and Development (ORD) Computational Toxicology Research Program (CTRP) is composed of three main elements. The largest component is the National Center for Computational Toxicology (NCCT), which was established in 2005 to coordinate research on chemical screening and prioritization, informatics, and systems modeling. The second element consists of related activities in the National Health and Environmental Effects Research Laboratory (NHEERL) and the National Exposure Research Laboratory (NERL). The third and final component consists of academic centers working on various aspects of computational toxicology and funded by the U.S. EPA Science to Achieve Results (STAR) program. Together these elements form the key components in the implementation of both the initial strategy, A Framework for a Computational Toxicology Research Program (U.S. EPA, 2003), and the newly released The U.S. Environmental Protection Agency's Strategic Plan for Evaluating the T
Laboratory Sequence in Computational Methods for Introductory Chemistry
NASA Astrophysics Data System (ADS)
Cody, Jason A.; Wiser, Dawn C.
2003-07-01
A four-exercise laboratory sequence for introductory chemistry integrating hands-on, student-centered experience with computer modeling has been designed and implemented. The progression builds from exploration of molecular shapes to intermolecular forces and the impact of those forces on chemical separations made with gas chromatography and distillation. The sequence ends with an exploration of molecular orbitals. The students use the computers as a tool; they build the molecules, submit the calculations, and interpret the results. Because of the construction of the sequence and its placement spanning the semester break, good laboratory notebook practices are reinforced and the continuity of course content and methods between semesters is emphasized. The inclusion of these techniques in the first year of chemistry has had a positive impact on student perceptions and student learning.
The Laboratory for Terrestrial Physics
NASA Technical Reports Server (NTRS)
2003-01-01
The Laboratory for Terrestrial Physics is dedicated to the advancement of knowledge in Earth and planetary science, by conducting innovative research using space technology. The Laboratory's mission and activities support the work and new initiatives at NASA's Goddard Space Flight Center (GSFC). The Laboratory's success contributes to the Earth Science Directorate as a national resource for studies of Earth from Space. The Laboratory is part of the Earth Science Directorate based at the GSFC in Greenbelt, MD. The Directorate itself is comprised of the Global Change Data Center (GCDC), the Space Data and Computing Division (SDCD), and four science Laboratories, including Laboratory for Terrestrial Physics, Laboratory for Atmospheres, and Laboratory for Hydrospheric Processes all in Greenbelt, MD. The fourth research organization, Goddard Institute for Space Studies (GISS), is in New York, NY. Relevant to NASA's Strategic Plan, the Laboratory ensures that all work undertaken and completed is within the vision of GSFC. The philosophy of the Laboratory is to balance the completion of near term goals, while building on the Laboratory's achievements as a foundation for the scientific challenges in the future.
NASA Technical Reports Server (NTRS)
1987-01-01
Philip Morris research center scientists use a computer program called CECTRP, for Chemical Equilibrium Composition and Transport Properties, to gain insight into the behavior of atoms as they progress along the reaction pathway. Use of the program lets the scientist accurately predict the behavior of a given molecule or group of molecules. Computer generated data must be checked by laboratory experiment, but the use of CECTRP saves the researchers hundreds of hours of laboratory time since experiments must run only to validate the computer's prediction. Philip Morris estimates that had CECTRP not been available, at least two man years would have been required to develop a program to perform similar free energy calculations.
Computer Center: Software Review: The DynaPulse 200M.
ERIC Educational Resources Information Center
Pankiewicz, Philip R., Ed.
1995-01-01
Reviews the DynaPulse 200M Education Edition microcomputer-based laboratory, which combines interactive software with curriculum and medical instrumentation to teach students about the cardiovascular system. (MKR)
Computational Toxicology at the US EPA | Science Inventory ...
Computational toxicology is the application of mathematical and computer models to help assess chemical hazards and risks to human health and the environment. Supported by advances in informatics, high-throughput screening (HTS) technologies, and systems biology, EPA is developing robust and flexible computational tools that can be applied to the thousands of chemicals in commerce, and contaminant mixtures found in America’s air, water, and hazardous-waste sites. The ORD Computational Toxicology Research Program (CTRP) is composed of three main elements. The largest component is the National Center for Computational Toxicology (NCCT), which was established in 2005 to coordinate research on chemical screening and prioritization, informatics, and systems modeling. The second element consists of related activities in the National Health and Environmental Effects Research Laboratory (NHEERL) and the National Exposure Research Laboratory (NERL). The third and final component consists of academic centers working on various aspects of computational toxicology and funded by the EPA Science to Achieve Results (STAR) program. Key intramural projects of the CTRP include digitizing legacy toxicity testing information toxicity reference database (ToxRefDB), predicting toxicity (ToxCast™) and exposure (ExpoCast™), and creating virtual liver (v-Liver™) and virtual embryo (v-Embryo™) systems models. The models and underlying data are being made publicly available t
Scientific Computing Strategic Plan for the Idaho National Laboratory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whiting, Eric Todd
Scientific computing is a critical foundation of modern science. Without innovations in the field of computational science, the essential missions of the Department of Energy (DOE) would go unrealized. Taking a leadership role in such innovations is Idaho National Laboratory’s (INL’s) challenge and charge, and is central to INL’s ongoing success. Computing is an essential part of INL’s future. DOE science and technology missions rely firmly on computing capabilities in various forms. Modeling and simulation, fueled by innovations in computational science and validated through experiment, are a critical foundation of science and engineering. Big data analytics from an increasing number of widely varied sources is opening new windows of insight and discovery. Computing is a critical tool in education, science, engineering, and experiments. Advanced computing capabilities in the form of people, tools, computers, and facilities will position INL competitively to deliver results and solutions on important national science and engineering challenges. A computing strategy must include much more than simply computers. The foundational enabling component of computing at many DOE national laboratories is the combination of a showcase-like data center facility coupled with a very capable supercomputer. In addition, network connectivity, disk storage systems, and visualization hardware are critical and generally tightly coupled to the computer system and co-located in the same facility. The existence of these resources in a single data center facility opens the doors to many opportunities that would not otherwise be possible.
Characteristics of the Navy Laboratory Warfare Center Technical Workforce
2013-09-29
Mathematics and Information Science (M&IS): Actuarial Science (1510), Computer Science (1550), Gen. Math & Statistics (1501), Mathematics (1520), Operations ... Admin.; Network Systems & Data Communication Analysts, Actuaries, Mathematicians, Operations Research Analysts, Statisticians; Social Science (SS) ... The workforce was sub-divided into six broad occupational groups: Life Science, Physical Science, Engineering, Mathematics, Computer Science and Information
Preparation for microgravity - The role of the Microgravity Material Science Laboratory
NASA Technical Reports Server (NTRS)
Johnston, J. Christopher; Rosenthal, Bruce N.; Meyer, Maryjo B.; Glasgow, Thomas K.
1988-01-01
Experiments at the NASA Lewis Research Center's Microgravity Material Science Laboratory using physical and mathematical models to delineate the effects of gravity on processes of scientific and commercial interest are discussed. Where possible, transparent model systems are used to visually track convection, settling, crystal growth, phase separation, agglomeration, vapor transport, diffusive flow, and polymer reactions. Materials studied include metals, alloys, salts, glasses, ceramics, and polymers. Specific technologies discussed include the General Purpose furnace used in the study of metals and crystal growth, the isothermal dendrite growth apparatus, the electromagnetic levitator/instrumented drop tube, the high temperature directional solidification furnace, the ceramics and polymer laboratories and the center's computing facilities.
Multiphasic Health Testing in the Clinic Setting
LaDou, Joseph
1971-01-01
The economy of automated multiphasic health testing (AMHT) activities patterned after the high-volume Kaiser program can be realized in low-volume settings. AMHT units have been operated at daily volumes of 20 patients in three separate clinical environments. These programs have displayed economics entirely compatible with cost figures published by the established high-volume centers. This experience, plus the expanding capability of small, general-purpose digital computers (minicomputers), indicates that a group of six or more physicians generating 20 laboratory appraisals per day can economically justify a completely automated multiphasic health testing facility. This system would reside in the clinic or hospital where it is used and can be configured to do analyses such as electrocardiography, generate laboratory reports, and communicate with large computer systems in university medical centers. Experience indicates that the most effective means of implementing these benefits of automation is to make them directly available to the medical community with the physician playing the central role. Economic justification of a dedicated computer through low-volume health testing then allows, as a side benefit, automation of administrative as well as other diagnostic activities—for example, patient billing, computer-aided diagnosis, and computer-aided therapeutics. PMID:4935771
ACSYNT - A standards-based system for parametric, computer aided conceptual design of aircraft
NASA Technical Reports Server (NTRS)
Jayaram, S.; Myklebust, A.; Gelhausen, P.
1992-01-01
A group of eight US aerospace companies, together with several NASA and Navy centers, led by the NASA Ames Systems Analysis Branch, and Virginia Tech's CAD Laboratory agreed in 1990, through the assistance of the American Technology Initiative, to form the ACSYNT (Aircraft Synthesis) Institute. The Institute is supported by a Joint Sponsored Research Agreement to continue the research and development in computer aided conceptual design of aircraft initiated by NASA Ames Research Center and Virginia Tech's CAD Laboratory. The result of this collaboration, a feature-based, parametric computer aided aircraft conceptual design code called ACSYNT, is described. The code is based on analysis routines begun at NASA Ames in the early 1970's. ACSYNT's CAD system is based entirely on the ISO standard Programmer's Hierarchical Interactive Graphics System and is graphics-device independent. The code includes a highly interactive graphical user interface, automatically generated Hermite and B-Spline surface models, and shaded image displays. Numerous features to enhance aircraft conceptual design are described.
ERIC Educational Resources Information Center
American School and University, 1983
1983-01-01
A campus computer center at Hofstra University (New York) that holds 70 terminals for student use was first a gymnasium, then a language laboratory. Strands of fiber optics are used for the necessary wiring. (MLF)
Postdoctoral Fellow | Center for Cancer Research
The Neuro-Oncology Branch (NOB), Center for Cancer Research (CCR), National Cancer Institute (NCI) of the National Institutes of Health (NIH) is seeking outstanding postdoctoral candidates interested in studying metabolic and cell signaling pathways in the context of brain cancers through construction of computational models amenable to formal computational analysis and simulation. The ability to closely collaborate with the modern metabolomics center developed at CCR provides a unique opportunity for a postdoctoral candidate with a strong theoretical background and interest in demonstrating the incredible potential of computational approaches to solve problems from scientific disciplines and improve lives. The candidate will be given the opportunity to both construct data-driven models, as well as biologically validate the models by demonstrating the ability to predict the effects of altering tumor metabolism in laboratory and clinical settings.
NASA Technical Reports Server (NTRS)
1998-01-01
Stirling Technology Company developed the components for its BeCOOL line of Cryocoolers with the help of a series of NASA SBIRs (Small Business Innovative Research), through Goddard Space Flight Center and Marshall Space Flight Center. Features include a hermetically sealed design, compact size, and silent operation. The company has already placed several units with commercial customers for computer applications and laboratory use.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sitek, M. A.; Lottes, S. A.; Bojanowski, C.
Computational fluid dynamics (CFD) modeling is widely used in industry for design and in the research community to support, complement, and extend the scope of experimental studies. Analysis of transportation infrastructure using high performance cluster computing with CFD and structural mechanics software is done at the Transportation Research and Analysis Computing Center (TRACC) at Argonne National Laboratory. These resources, available at TRACC, were used to perform advanced three-dimensional computational simulations of the wind tunnel laboratory at the Turner-Fairbank Highway Research Center (TFHRC). The goals were to verify the CFD model of the laboratory wind tunnel and then to use versions of the model to provide the capability to (1) perform larger parametric series of tests than can be easily done in the laboratory with available budget and time, (2) extend testing to wind speeds that cannot be achieved in the laboratory, and (3) run types of tests that are very difficult or impossible to run in the laboratory. Modern CFD software has many physics models and domain meshing options. Models, including the choice of turbulence and other physics models and settings, the computational mesh, and the solver settings, need to be validated against measurements to verify that the results are sufficiently accurate for use in engineering applications. The wind tunnel model was built and tested, by comparing to experimental measurements, to provide a valuable tool to perform these types of studies in the future as a complement and extension to TFHRC’s experimental capabilities. Wind tunnel testing at TFHRC is conducted in a subsonic open-jet wind tunnel with a 1.83 m (6 foot) by 1.83 m (6 foot) cross section. A three-component dual force-balance system is used to measure forces acting on tested models, and a three degree of freedom suspension system is used for dynamic response tests. Pictures of the room are shown in Figure 1-1 to Figure 1-4.
A detailed CAD geometry and CFD model of the wind tunnel laboratory at TFHRC was built and tested. Results were compared against experimental wind velocity measurements at a large number of locations around the room. This testing included an assessment of the air flow uniformity provided by the tunnel to the test zone and assessment of room geometry effects, such as the influence of the proximity of the room walls, the non-symmetrical position of the tunnel in the room, and the influence of the room setup on the air flow in the room. This information is useful both for simplifying the computational model and in deciding whether or not moving, or removing, some of the furniture or other movable objects in the room will change the flow in the test zone.
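Validating a CFD model against wind-tunnel measurements, as described above, typically includes a grid-convergence check before comparing to experiment. A minimal sketch using Roache's Grid Convergence Index (GCI) follows; the three sample solution values and the refinement ratio are illustrative, not taken from the TRACC/TFHRC study.

```python
import math

# Grid-convergence check commonly used when validating CFD results.
# Illustrative values only; not from the TRACC/TFHRC wind tunnel model.

def observed_order(f_fine, f_mid, f_coarse, r):
    """Observed order of convergence from three systematically refined grids."""
    return math.log((f_coarse - f_mid) / (f_mid - f_fine)) / math.log(r)

def gci_fine(f_fine, f_mid, r, p, fs=1.25):
    """Grid Convergence Index (Roache) on the fine grid, as a fraction."""
    return fs * abs((f_mid - f_fine) / f_fine) / (r**p - 1)

p = observed_order(1.00, 1.02, 1.06, r=2.0)
print(f"observed order p = {p:.2f}")
print(f"GCI = {gci_fine(1.00, 1.02, 2.0, p):.1%}")
```

A small GCI on the fine grid gives confidence that remaining differences from the laboratory measurements reflect modeling choices (turbulence model, boundary conditions) rather than discretization error.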
NASA Technical Reports Server (NTRS)
1993-01-01
Jack is an advanced human factors software package that provides a three-dimensional model for predicting how a human will interact with a given system or environment. It can be used for a broad range of computer-aided design applications. Jack was developed by the Computer Graphics Research Laboratory of the University of Pennsylvania with assistance from NASA's Johnson Space Center, Ames Research Center, and the Army. It is the University's first commercial product. Jack is still used for academic purposes at the University of Pennsylvania. Commercial rights were given to Transom Technologies, Inc.
Computer Software Management and Information Center
NASA Technical Reports Server (NTRS)
1983-01-01
Computer programs for passive anti-roll tank, earth resources laboratory applications, the NIMBUS-7 coastal zone color scanner derived products, transportable applications executive, plastic and failure analysis of composites, velocity gradient method for calculating velocities in an axisymmetric annular duct, an integrated procurement management system, data I/O PROM for the Motorola EXORciser, aerodynamic shock-layer shape, kinematic modeling, hardware library for a graphics computer, and a file archival system are documented.
HPCCP/CAS Workshop Proceedings 1998
NASA Technical Reports Server (NTRS)
Schulbach, Catherine; Mata, Ellen (Editor); Schulbach, Catherine (Editor)
1999-01-01
This publication is a collection of extended abstracts of presentations given at the HPCCP/CAS (High Performance Computing and Communications Program/Computational Aerosciences Project) Workshop held on August 24-26, 1998, at NASA Ames Research Center, Moffett Field, California. The objective of the Workshop was to bring together the aerospace high performance computing community, consisting of airframe and propulsion companies, independent software vendors, university researchers, and government scientists and engineers. The Workshop was sponsored by the HPCCP Office at NASA Ames Research Center. The Workshop consisted of over 40 presentations, including an overview of NASA's High Performance Computing and Communications Program and the Computational Aerosciences Project; ten sessions of papers representative of the high performance computing research conducted within the Program by the aerospace industry, academia, NASA, and other government laboratories; two panel sessions; and a special presentation by Mr. James Bailey.
Hardware survey for the avionics test bed
NASA Technical Reports Server (NTRS)
Cobb, J. M.
1981-01-01
A survey of major hardware items that could possibly be used in the development of an avionics test bed for space shuttle attached or autonomous large space structures was conducted in NASA Johnson Space Center building 16. The results of the survey are organized to show the hardware by laboratory usage. Computer systems in each laboratory are described in some detail.
The Center for Nanophase Materials Sciences
NASA Astrophysics Data System (ADS)
Lowndes, Douglas
2005-03-01
The Center for Nanophase Materials Sciences (CNMS) located at Oak Ridge National Laboratory (ORNL) will be the first DOE Nanoscale Science Research Center to begin operation, with construction to be completed in April 2005 and initial operations in October 2005. The CNMS' scientific program has been developed through workshops with the national community, with the goal of creating a highly collaborative research environment to accelerate discovery and drive technological advances. Research at the CNMS is organized under seven Scientific Themes selected to address challenges to understanding and to exploit particular ORNL strengths (see http://cnms.ornl.gov). These include extensive synthesis and characterization capabilities for soft, hard, nanostructured, magnetic and catalytic materials and their composites; neutron scattering at the Spallation Neutron Source and High Flux Isotope Reactor; computational nanoscience in the CNMS' Nanomaterials Theory Institute and utilizing facilities and expertise of the Center for Computational Sciences and the new Leadership Scientific Computing Facility at ORNL; a new CNMS Nanofabrication Research Laboratory; and a suite of unique and state-of-the-art instruments to be made reliably available to the national community for imaging, manipulation, and properties measurements on nanoscale materials in controlled environments. The new research facilities will be described together with the planned operation of the user research program, the latter illustrated by the current ``jump start'' user program that utilizes existing ORNL/CNMS facilities.
Collaborative Systems Biology Projects for the Military Medical Community.
Zalatoris, Jeffrey J; Scheerer, Julia B; Lebeda, Frank J
2017-09-01
This pilot study was conducted to examine, for the first time, the ongoing systems biology research and development projects within the laboratories and centers of the U.S. Army Medical Research and Materiel Command (USAMRMC). The analysis has provided an understanding of the breadth of systems biology activities, resources, and collaborations across all USAMRMC subordinate laboratories. The Systems Biology Collaboration Center at USAMRMC issued a survey regarding systems biology research projects to the eight U.S.-based USAMRMC laboratories and centers in August 2016. This survey included a data call worksheet to gather self-identified project and programmatic information. The general topics focused on the investigators and their projects, on the project's research areas, on omics and other large data types being collected and stored, on the analytical or computational tools being used, and on identifying intramural (i.e., USAMRMC) and extramural collaborations. Among seven of the eight laboratories, 62 unique systems biology studies were funded and active during the final quarter of fiscal year 2016. Of 29 preselected medical Research Task Areas, 20 were associated with these studies, some of which were applicable to two or more Research Task Areas. Overall, studies were categorized among six general types of objectives: biological mechanisms of disease, risk of/susceptibility to injury or disease, innate mechanisms of healing, diagnostic and prognostic biomarkers, host/patient responses to vaccines, and therapeutic strategies including host responses to therapies. We identified eight types of omics studies and four types of study subjects. Studies were categorized on a scale of increasing complexity from single study subject/single omics technology studies (23/62) to studies integrating results across two study subject types and two or more omics technologies (13/62).
Investigators at seven USAMRMC laboratories had collaborations with systems biology experts from 18 extramural organizations and three other USAMRMC laboratories. Collaborators from six USAMRMC laboratories and 58 extramural organizations were identified who provided additional research expertise to these systems biology studies. At the end of fiscal year 2016, USAMRMC laboratories self-reported 66 systems biology/computational biology studies (62 of which were unique) with 25 intramural and 81 extramural collaborators. Nearly two-thirds were led by or in collaboration with the U.S. Army Telemedicine and Advanced Technology Research Center/Department of Defense Biotechnology High-Performance Computing Software Applications Institute and U.S. Army Center for Environmental Health Research. The most common study objective addressed biological mechanisms of disease. The most common types of Research Task Areas addressed infectious diseases (viral and bacterial) and chemical agents (environmental toxicant exposures, and traditional and emerging chemical threats). More than 40% of the studies (27/62) involved collaborations between the reporting USAMRMC laboratory and one other organization. Nearly half of the studies (30/62) involved collaborations between the reporting USAMRMC laboratory and at least two other organizations. These survey results indicate that USAMRMC laboratories are compliant with data-centric policy and guidance documents whose goals are to prevent redundancy and promote collaborations by sharing data and leveraging capabilities. These results also serve as a foundation to make recommendations for future systems biology research efforts. Reprint & Copyright © 2017 Association of Military Surgeons of the U.S.
Cloudbursting - Solving the 3-body problem
NASA Astrophysics Data System (ADS)
Chang, G.; Heistand, S.; Vakhnin, A.; Huang, T.; Zimdars, P.; Hua, H.; Hood, R.; Koenig, J.; Mehrotra, P.; Little, M. M.; Law, E.
2014-12-01
Many science projects in the future will be accomplished through collaboration among two or more NASA centers along with, potentially, external scientists. Science teams will be composed of more geographically dispersed individuals and groups. However, the current computing environment does not make this easy and seamless. By being able to share computing resources among members of a multi-center team working on a science/engineering project, limited pre-competition funds could be more efficiently applied and technical work could be conducted more effectively with less time spent moving data or waiting for computing resources to free up. Based on the work from a NASA CIO IT Labs task, this presentation will highlight our prototype work in identifying the feasibility and the obstacles, both technical and management, to performing "Cloudbursting" among private clouds located at three different centers. We will demonstrate the use of private cloud computing infrastructure at the Jet Propulsion Laboratory, Langley Research Center, and Ames Research Center to provide elastic computation to each other to perform parallel Earth Science data imaging. We leverage elastic load balancing and auto-scaling features at each data center so that each location can independently define how many resources to allocate to a particular job that was "bursted" from another data center and demonstrate that compute capacity scales up and down with the job. We will also discuss future work in the area, which could include the use of cloud infrastructure from different cloud framework providers as well as other cloud service providers.
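The allocation decision described above (fill local capacity first, then burst the remainder to a partner center) can be sketched as a small planning function. The field names, capacities, and policy here are illustrative assumptions, not the actual NASA prototype's interface:

```python
def plan_burst(queued_tasks, tasks_per_worker, local_capacity, partner_capacity):
    """Split a parallel imaging job between the local private cloud and a
    partner center's cloud. All names and the policy itself are hypothetical."""
    needed = -(-queued_tasks // tasks_per_worker)   # ceiling division
    local = min(needed, local_capacity)             # fill local workers first
    burst = min(needed - local, partner_capacity)   # burst the overflow
    deferred = max(0, needed - local - burst) * tasks_per_worker
    return {"local_workers": local, "bursted_workers": burst,
            "deferred": deferred}

print(plan_burst(queued_tasks=120, tasks_per_worker=10,
                 local_capacity=8, partner_capacity=10))
# → {'local_workers': 8, 'bursted_workers': 4, 'deferred': 0}
```

An auto-scaler would re-run such a plan as queue depth changes, so capacity scales up and down with the job, as the prototype demonstrates.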
Accuracy of a laboratory-based computer implant guiding system.
Barnea, Eitan; Alt, Ido; Kolerman, Roni; Nissan, Joseph
2010-05-01
Computer-guided implant placement is a growing treatment modality in partially and totally edentulous patients, though data about the accuracy of some systems for computer-guided surgery is limited. The purpose of this study was to evaluate the accuracy of a laboratory computer-guided system. A laboratory-based computer guiding system (M Guide; MIS Technologies, Shlomi, Israel) was used to place implants in a fresh sheep mandible. A second computerized tomography (CT) scan was taken after placing the implants. The drill plan figures of the planned implants were positioned using assigned software (Med3D, Heidelberg, Germany) on the second CT scan to compare the implant position with the initial planning. Values representing the implant locations of the original drill plan were compared with those of the placed implants using SPSS software. Six measurements (3 vertical, 3 horizontal) were made on each implant to assess the deviation from the initial implant planning. A repeated-measures analysis of variance was performed comparing the location of measurement (center, abutment, apex) and type of deviation (vertical vs. horizontal). The vertical deviation (mean -0.168) was significantly smaller than the horizontal deviation (mean 1.148). The laboratory computer-based guiding system may be a viable treatment concept for placing implants. Copyright (c) 2010 Mosby, Inc. All rights reserved.
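The measurement scheme above (three vertical and three horizontal deviations per implant, at center, abutment, and apex) lends itself to a simple aggregation before any formal ANOVA. A sketch with hypothetical data chosen only to match the scale of the reported means, not the study's actual measurements:

```python
def mean_deviations(implants):
    """Each implant is a dict with 'vertical' and 'horizontal' lists of signed
    deviations (mm) at center, abutment, and apex (illustrative structure)."""
    v = [d for imp in implants for d in imp["vertical"]]
    h = [d for imp in implants for d in imp["horizontal"]]
    return sum(v) / len(v), sum(h) / len(h)

# Hypothetical deviations for two implants
implants = [
    {"vertical": [-0.1, -0.2, -0.2], "horizontal": [1.0, 1.2, 1.1]},
    {"vertical": [-0.2, -0.1, -0.2], "horizontal": [1.3, 1.1, 1.2]},
]
mv, mh = mean_deviations(implants)
print(f"mean vertical {mv:.3f} mm, mean horizontal {mh:.3f} mm")
```

Whether the vertical/horizontal difference is significant is then a question for the repeated-measures ANOVA the authors ran in SPSS.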
Energy Systems Integration Partnerships: NREL + Sandia + Johnson Controls
DOE Office of Scientific and Technical Information (OSTI.GOV)
NREL and Sandia National Laboratories partnered with Johnson Controls to deploy the company's BlueStream Hybrid Cooling System at ESIF's high-performance computing data center to reduce the water consumption of evaporative cooling towers.
Computer systems and software engineering
NASA Technical Reports Server (NTRS)
Mckay, Charles W.
1988-01-01
The High Technologies Laboratory (HTL) was established in the fall of 1982 at the University of Houston Clear Lake. Research conducted at the High Tech Lab is focused upon computer systems and software engineering. There is a strong emphasis on the interrelationship of these areas of technology and the United States' space program. In Jan. of 1987, NASA Headquarters announced the formation of its first research center dedicated to software engineering. Operated by the High Tech Lab, the Software Engineering Research Center (SERC) was formed at the University of Houston Clear Lake. The High Tech Lab/Software Engineering Research Center promotes cooperative research among government, industry, and academia to advance the edge-of-knowledge and the state-of-the-practice in key topics of computer systems and software engineering which are critical to NASA. The center also recommends appropriate actions, guidelines, standards, and policies to NASA in matters pertinent to the center's research. Results of the research conducted at the High Tech Lab/Software Engineering Research Center have given direction to many decisions made by NASA concerning the Space Station Program.
Development of the HERMIES III mobile robot research testbed at Oak Ridge National Laboratory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Manges, W.W.; Hamel, W.R.; Weisbin, C.R.
1988-01-01
The latest robot in the Hostile Environment Robotic Machine Intelligence Experiment Series (HERMIES) is now under development at the Center for Engineering Systems Advanced Research (CESAR) in the Oak Ridge National Laboratory. The HERMIES III robot incorporates a larger-than-human-size 7-degree-of-freedom manipulator mounted on a 2-degree-of-freedom mobile platform, including a variety of sensors and computers. The deployment of this robot represents a significant increase in research capabilities for the CESAR laboratory. The initial on-board computer capacity of the robot exceeds that of 20 VAX 11/780s. The navigation and vision algorithms under development make extensive use of the on-board NCUBE hypercube computer, while the sensors are interfaced through five VME computers running the OS-9 real-time, multitasking operating system. This paper describes the motivation, key issues, and detailed design trade-offs of implementing the first phase (basic functionality) of the HERMIES III robot. 10 refs., 7 figs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Srinath Vadlamani; Scott Kruger; Travis Austin
Extended magnetohydrodynamic (MHD) codes are used to model the large, slow-growing instabilities that are projected to limit the performance of the International Thermonuclear Experimental Reactor (ITER). The multiscale nature of the extended MHD equations requires an implicit approach. The current linear solvers needed for the implicit algorithm scale poorly because the resultant matrices are so ill-conditioned. A new solver is needed, especially one that scales to the petascale. The most successful scalable parallel processor solvers to date are multigrid solvers. Applying multigrid techniques to a set of equations whose fundamental modes are dispersive waves is a promising solution to CEMM problems. For Phase 1, we implemented multigrid preconditioners from the HYPRE project of the Center for Applied Scientific Computing at LLNL, via PETSc of the DOE SciDAC TOPS, for the real matrix systems of the extended MHD code NIMROD, which is one of the primary modeling codes of the OFES-funded Center for Extended Magnetohydrodynamic Modeling (CEMM) SciDAC. We implemented the multigrid solvers successfully on the fusion test problem that allows for real matrix systems, and in the process learned about the details of NIMROD data structures and the difficulties of inverting NIMROD operators. The further success of this project will allow for efficient usage of future petascale computers at the National Leadership Facilities: Oak Ridge National Laboratory, Argonne National Laboratory, and the National Energy Research Scientific Computing Center. The project will be a collaborative effort between computational plasma physicists and applied mathematicians at Tech-X Corporation, applied mathematicians at Front Range Scientific Computations, Inc. (who are collaborators on the HYPRE project), and other computational plasma physicists involved with the CEMM project.
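For readers unfamiliar with why multigrid scales so well on ill-conditioned elliptic-like systems, a minimal two-grid cycle on a 1D Poisson model problem illustrates the smooth-then-correct structure that HYPRE's preconditioners generalize. This is a conceptual analogue only, not the NIMROD/PETSc/HYPRE interface used in the project:

```python
import numpy as np

def residual(u, f, h):
    """Residual f - A*u for the 1D Poisson stencil [-1, 2, -1]/h^2
    with homogeneous Dirichlet boundaries."""
    up = np.pad(u, 1)
    return f - (2 * up[1:-1] - up[:-2] - up[2:]) / h**2

def jacobi(u, f, h, sweeps=3, w=2.0 / 3.0):
    """Weighted-Jacobi smoother: damps high-frequency error components."""
    for _ in range(sweeps):
        u = u + w * (h * h / 2.0) * residual(u, f, h)
    return u

def two_grid(u, f, h):
    """One cycle: pre-smooth, restrict residual, solve the coarse grid
    exactly, interpolate the correction back, post-smooth."""
    u = jacobi(u, f, h)
    r = residual(u, f, h)
    # Full-weighting restriction onto every other grid point
    rc = 0.25 * r[:-2:2] + 0.5 * r[1:-1:2] + 0.25 * r[2::2]
    nc, hc = rc.size, 2 * h
    Ac = (2 * np.eye(nc) - np.eye(nc, k=1) - np.eye(nc, k=-1)) / hc**2
    ec = np.linalg.solve(Ac, rc)       # exact coarse solve (recurse in practice)
    # Linear-interpolation prolongation, zero at the domain boundaries
    L = (u.size + 1) * h
    xf = np.arange(1, u.size + 1) * h
    xc = np.arange(1, nc + 1) * hc
    e = np.interp(xf, np.r_[0.0, xc, L], np.r_[0.0, ec, 0.0])
    return jacobi(u + e, f, h)

n, h = 31, 1.0 / 32                    # the coarse grid then has 15 points
x = np.arange(1, n + 1) * h
f = np.pi**2 * np.sin(np.pi * x)       # exact solution is sin(pi * x)
u = np.zeros(n)
for _ in range(5):
    u = two_grid(u, f, h)
print("max error vs sin(pi x):", np.max(np.abs(u - np.sin(np.pi * x))))
```

The per-cycle error reduction is independent of the grid size, which is what makes multigrid preconditioning attractive at the petascale; the hard part, as the abstract notes, is that the extended MHD operators are far less friendly than this model problem.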
MIT Laboratory for Computer Science Progress Report, July 1984-June 1985
1985-06-01
…larger (up to several thousand machines) multiprocessor systems. This facility was funded by the newly formed Strategic Computing Program of the Defense Advanced Research Projects Agency. The report also lists group personnel, including the Clinical Decision Making group (P. Szolovits, Group Leader; R. Patil) with collaborating investigators M. Criscitiello, M.D. (Tufts-New England Medical Center Hospital) and J. Dzierzanowski, Ph.D., and the Computation Structures group (J. B. Dennis, Group Leader; research staff W. B. Ackerman, G. A. Boughton, and W. Y-P. Lim; graduate students including T-A. Chu).
NASA Astrophysics Data System (ADS)
Crease, Robert P.
2008-04-01
There are few more dramatic illustrations of the vicissitudes of laboratory architecture than the contrast between Building 20 at the Massachusetts Institute of Technology (MIT) and its replacement, the Ray and Maria Stata Center. Building 20 was built hurriedly in 1943 as temporary housing for MIT's famous Rad Lab, the site of wartime radar research, and it remained a productive laboratory space for over half a century. A decade ago it was demolished to make way for the Stata Center, an architecturally striking building designed by Frank Gehry to house MIT's computer science and artificial intelligence labs (above). But in 2004 - just two years after the Stata Center officially opened - the building was criticized for being unsuitable for research and became the subject of still ongoing lawsuits alleging design and construction failures.
Software For Monitoring A Computer Network
NASA Technical Reports Server (NTRS)
Lee, Young H.
1992-01-01
SNMAT is a rule-based expert-system computer program designed to assist personnel in monitoring the status of a computer network and identifying defective computers, workstations, and other components of the network. It also assists in training network operators. The network monitored by SNMAT is located at the Space Flight Operations Center (SFOC) at NASA's Jet Propulsion Laboratory. SNMAT is intended to serve as a data-reduction system providing windows, menus, and graphs, enabling users to focus on relevant information. SNMAT is expected to be adaptable to other computer networks, for example in the management of repair, maintenance, and security, or in the administration of planning systems, billing systems, or archives.
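The rule-based approach behind such a monitor can be sketched in a few lines: declarative rules are evaluated against per-node status data, and only nodes that trigger a rule surface in the report. The rules, thresholds, and field names below are invented for illustration; SNMAT's actual rule base is not described in the abstract:

```python
# Hypothetical rules: (label, predicate over a node-status dict)
RULES = [
    ("unreachable", lambda n: n["pings_missed"] >= 3),
    ("high load",   lambda n: n["cpu_load"] > 0.95),
    ("disk full",   lambda n: n["disk_free_mb"] < 100),
]

def diagnose(nodes):
    """Return {node_name: [triggered rule labels]} for nodes needing attention,
    acting as a data-reduction step: healthy nodes are omitted entirely."""
    report = {}
    for name, status in nodes.items():
        hits = [label for label, test in RULES if test(status)]
        if hits:
            report[name] = hits
    return report

nodes = {
    "ws-01": {"pings_missed": 0, "cpu_load": 0.40, "disk_free_mb": 9000},
    "ws-02": {"pings_missed": 4, "cpu_load": 0.10, "disk_free_mb": 5000},
}
print(diagnose(nodes))   # → {'ws-02': ['unreachable']}
```

Keeping the rules as data rather than code is what makes such a system easy to retarget at other networks, as the abstract suggests.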
Center for Computational Structures Technology
NASA Technical Reports Server (NTRS)
Noor, Ahmed K.; Perry, Ferman W.
1995-01-01
The Center for Computational Structures Technology (CST) is intended to serve as a focal point for the diverse CST research activities. The CST activities include the use of numerical simulation and artificial intelligence methods in modeling, analysis, sensitivity studies, and optimization of flight-vehicle structures. The Center is located at NASA Langley and is an integral part of the School of Engineering and Applied Science of the University of Virginia. The key elements of the Center are: (1) conducting innovative research on advanced topics of CST; (2) acting as pathfinder by demonstrating to the research community what can be done (high-potential, high-risk research); (3) strong collaboration with NASA scientists and researchers from universities and other government laboratories; and (4) rapid dissemination of CST to industry, through integration of industrial personnel into the ongoing research efforts.
GIS Facility and Services at the Ronald Greeley Center for Planetary Studies
NASA Astrophysics Data System (ADS)
Nelson, D. M.; Williams, D. A.
2017-06-01
At the RGCPS, we established a Geographic Information Systems (GIS) computer laboratory, where we instruct researchers how to use GIS and image processing software. Seminars demonstrate viewing, integrating, and digitally mapping planetary data.
A model study of bridge hydraulics
DOT National Transportation Integrated Search
2010-08-01
Most flood studies in the United States use the Army Corps of Engineers HEC-RAS (Hydrologic Engineering Center's River Analysis System) computer program. This study was carried out to compare results of HEC-RAS bridge modeling with laboratory e...
Computational Structures Technology for Airframes and Propulsion Systems
NASA Technical Reports Server (NTRS)
Noor, Ahmed K. (Compiler); Housner, Jerrold M. (Compiler); Starnes, James H., Jr. (Compiler); Hopkins, Dale A. (Compiler); Chamis, Christos C. (Compiler)
1992-01-01
This conference publication contains the presentations and discussions from the joint University of Virginia (UVA)/NASA Workshops. The presentations included NASA Headquarters perspectives on High Speed Civil Transport (HSCT), goals and objectives of the UVA Center for Computational Structures Technology (CST), NASA and Air Force CST activities, CST activities for airframes and propulsion systems in industry, and CST activities at Sandia National Laboratory.
Fly-by-Wire Systems Enable Safer, More Efficient Flight
NASA Technical Reports Server (NTRS)
2012-01-01
Using the ultra-reliable Apollo Guidance Computer that enabled the Apollo Moon missions, Dryden Flight Research Center engineers, in partnership with industry leaders such as Cambridge, Massachusetts-based Draper Laboratory, demonstrated that digital computers could be used to fly aircraft. Digital fly-by-wire systems have since been incorporated into large airliners, military jets, revolutionary new aircraft, and even cars and submarines.
Knowledge management: Role of the the Radiation Safety Information Computational Center (RSICC)
NASA Astrophysics Data System (ADS)
Valentine, Timothy
2017-09-01
The Radiation Safety Information Computational Center (RSICC) at Oak Ridge National Laboratory (ORNL) is an information analysis center that collects, archives, evaluates, synthesizes and distributes information, data and codes that are used in various nuclear technology applications. RSICC retains more than 2,000 software packages that have been provided by code developers from various federal and international agencies. RSICC's customers (scientists, engineers, and students from around the world) obtain access to such computing codes (source and/or executable versions) and processed nuclear data files to promote on-going research, to ensure nuclear and radiological safety, and to advance nuclear technology. The role of such information analysis centers is critical for supporting and sustaining nuclear education and training programs both domestically and internationally, as the majority of RSICC's customers are students attending U.S. universities. Additionally, RSICC operates a secure CLOUD computing system to provide access to sensitive export-controlled modeling and simulation (M&S) tools that support both domestic and international activities. This presentation will provide a general review of RSICC's activities, services, and systems that support knowledge management and education and training in the nuclear field.
Biomedical Computing Technology Information Center: introduction and report of early progress
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maskewitz, B.F.; Henne, R.L.; McClain, W.J.
1976-01-01
In July 1975, the Biomedical Computing Technology Information Center (BCTIC) was established by the Division of Biomedical and Environmental Research of the U. S. Energy Research and Development Administration (ERDA) at the Oak Ridge National Laboratory. BCTIC collects, organizes, evaluates, and disseminates information on computing technology pertinent to biomedicine, providing needed routes of communication between installations and serving as a clearinghouse for the exchange of biomedical computing software, data, and interface designs. This paper presents BCTIC's functions and early progress to the MUMPS Users' Group in order to stimulate further discussion and cooperation between the two organizations. (BCTIC services are available to its sponsors and their contractors and to any individual/group willing to participate in mutual exchange.) 1 figure.
Reeves, Rustin E; Aschenbrenner, John E; Wordinger, Robert J; Roque, Rouel S; Sheedlo, Harold J
2004-05-01
The need to increase the efficiency of dissection in the gross anatomy laboratory has been the driving force behind the technologic changes we have recently implemented. With the introduction of an integrated systems-based medical curriculum and a reduction in laboratory teaching hours, anatomy faculty at the University of North Texas Health Science Center (UNTHSC) developed a computer-based dissection manual to adjust to these curricular changes and time constraints. At each cadaver workstation, Apple iMac computers were added and a new dissection manual, running in a browser-based format, was installed. Within the text of the manual, anatomical structures required for dissection were linked to digital images from prosected materials; in addition, for each body system, the dissection manual included images from cross sections, radiographs, CT scans, and histology. Although we have placed a high priority on computerization of the anatomy laboratory, we remain strong advocates of the importance of cadaver dissection. It is our belief that the utilization of computers for dissection is a natural evolution of technology and fosters creative teaching strategies adapted for anatomy laboratories in the 21st century. Our strategy has significantly enhanced the independence and proficiency of our students, the efficiency of their dissection time, and the quality of laboratory instruction by the faculty. Copyright 2004 Wiley-Liss, Inc.
Goddard Visiting Scientist Program
NASA Technical Reports Server (NTRS)
2000-01-01
Under this Indefinite Delivery Indefinite Quantity (IDIQ) contract, USRA was expected to provide short term (from 1 day up to 1 year) personnel as required to provide a Visiting Scientists Program to support the Earth Sciences Directorate (Code 900) at the Goddard Space Flight Center. The Contractor was to have a pool, or have access to a pool, of scientific talent, both domestic and international, at all levels (graduate student to senior scientist), that would support the technical requirements of the following laboratories and divisions within Code 900: 1) Global Change Data Center (902); 2) Laboratory for Atmospheres (Code 910); 3) Laboratory for Terrestrial Physics (Code 920); 4) Space Data and Computing Division (Code 930); 5) Laboratory for Hydrospheric Processes (Code 970). The research activities described below for each organization within Code 900 were intended to comprise the general scope of effort covered under the Visiting Scientist Program.
A model study of bridge hydraulics : technical summary.
DOT National Transportation Integrated Search
2010-08-01
Most flood studies in the United States use the Army Corps of Engineers Hydrologic Engineering Center's River Analysis System (HEC-RAS) computer program. This report is the second edition. The first edition of the report considered the laboratory m...
Sandia National Laboratories: Research: Materials Science
Our research uses Sandia's experimental, theoretical, and computational capabilities to...
Recommendations for Establishing the Texas Roadway Research Implementation Center
DOT National Transportation Integrated Search
1998-07-01
The overall objective of the Roadway Research Initiative study was to describe an advanced testing capability, one that would speed implementation of the results from traditional computer and laboratory-based research efforts by providing a reusable t...
NASA Technical Reports Server (NTRS)
Thomas, V. C.
1986-01-01
A Vibroacoustic Data Base Management Center has been established at the Jet Propulsion Laboratory (JPL). The center utilizes the Vibroacoustic Payload Environment Prediction System (VAPEPS) software package to manage a data base of shuttle and expendable launch vehicle flight and ground test data. Remote terminal access over telephone lines to a dedicated VAPEPS computer system has been established to provide the payload community a convenient means of querying the global VAPEPS data base. This guide describes the functions of the JPL Data Base Management Center and contains instructions for utilizing the resources of the center.
US Department of Energy High School Student Supercomputing Honors Program: A follow-up assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1987-01-01
The US DOE High School Student Supercomputing Honors Program was designed to recognize high school students with superior skills in mathematics and computer science and to provide them with formal training and experience with advanced computer equipment. This document reports on the participants who attended the first such program, which was held at the National Magnetic Fusion Energy Computer Center at the Lawrence Livermore National Laboratory (LLNL) during August 1985.
Green Supercomputing at Argonne
Beckman, Pete
2018-02-07
Pete Beckman, head of Argonne's Leadership Computing Facility (ALCF), talks about Argonne National Laboratory's green supercomputing: everything from designing algorithms to use fewer kilowatts per operation to using cold Chicago winter air to cool the machine more efficiently. Argonne was recognized for green computing in the 2009 HPCwire Readers' Choice Awards. More at http://www.anl.gov/Media_Center/News/2009/news091117.html Read more about the Argonne Leadership Computing Facility at http://www.alcf.anl.gov/
Systems engineering technology for networks
NASA Technical Reports Server (NTRS)
1994-01-01
The report summarizes research pursued within the Systems Engineering Design Laboratory at Virginia Polytechnic Institute and State University between May 16, 1993 and January 31, 1994. The project was proposed in cooperation with the Computational Science and Engineering Research Center at Howard University. Its purpose was to investigate emerging systems engineering tools and their applicability in analyzing the NASA Network Control Center (NCC) on the basis of metrics and measures.
ERIC Educational Resources Information Center
Haider, Md. Zulfeqar; Chowdhury, Takad Ahmed
2012-01-01
This study is based on a survey of the Communicative English Language Certificate (CELC) course run by the Foreign Language Training Center (FLTC), a Project under the Ministry of Education, Bangladesh. FLTC is working to promote the teaching and learning of English through its eleven computer-based and state-of-the-art language laboratories. As…
1983-09-01
Report AI-TR-346, Artificial Intelligence Laboratory, Massachusetts Institute of Technology, Cambridge, Massachusetts, June 19… [A. Guzman-Arenas]… Testbed Coordinator, 415/859-4395, Artificial Intelligence Center, Computer Science and Technology Division. Prepared for: Defense Advanced Research… to support processing of aerial photographs for such military applications as cartography, intelligence, weapon guidance, and targeting. A key
ISCR Annual Report: Fiscal Year 2004
DOE Office of Scientific and Technical Information (OSTI.GOV)
McGraw, J R
2005-03-03
Large-scale scientific computation and all of the disciplines that support and help to validate it have been placed at the focus of Lawrence Livermore National Laboratory (LLNL) by the Advanced Simulation and Computing (ASC) program of the National Nuclear Security Administration (NNSA) and the Scientific Discovery through Advanced Computing (SciDAC) initiative of the Office of Science of the Department of Energy (DOE). The maturation of computational simulation as a tool of scientific and engineering research is underscored in the November 2004 statement of the Secretary of Energy that, ''high performance computing is the backbone of the nation's science and technology enterprise''. LLNL operates several of the world's most powerful computers--including today's single most powerful--and has undertaken some of the largest and most compute-intensive simulations ever performed. Ultrascale simulation has been identified as one of the highest priorities in DOE's facilities planning for the next two decades. However, computers at architectural extremes are notoriously difficult to use efficiently. Furthermore, each successful terascale simulation only points out the need for much better ways of interacting with the resulting avalanche of data. Advances in scientific computing research have, therefore, never been more vital to LLNL's core missions than at present. Computational science is evolving so rapidly along every one of its research fronts that to remain on the leading edge, LLNL must engage researchers at many academic centers of excellence. In Fiscal Year 2004, the Institute for Scientific Computing Research (ISCR) served as one of LLNL's main bridges to the academic community with a program of collaborative subcontracts, visiting faculty, student internships, workshops, and an active seminar series.
The ISCR identifies researchers from the academic community for computer science and computational science collaborations with LLNL and hosts them for short- and long-term visits with the aim of encouraging long-term academic research agendas that address LLNL's research priorities. Through such collaborations, ideas and software flow in both directions, and LLNL cultivates its future workforce. The Institute strives to be LLNL's ''eyes and ears'' in the computer and information sciences, keeping the Laboratory aware of and connected to important external advances. It also attempts to be the ''feet and hands'' that carry those advances into the Laboratory and incorporates them into practice. ISCR research participants are integrated into LLNL's Computing and Applied Research (CAR) Department, especially into its Center for Applied Scientific Computing (CASC). In turn, these organizations address computational challenges arising throughout the rest of the Laboratory. Administratively, the ISCR flourishes under LLNL's University Relations Program (URP). Together with the other five institutes of the URP, it navigates a course that allows LLNL to benefit from academic exchanges while preserving national security. While it is difficult to operate an academic-like research enterprise within the context of a national security laboratory, the results declare the challenges well met and worth the continued effort.
Twelve Scientific Specialists of the Peenemuende Team
NASA Technical Reports Server (NTRS)
2004-01-01
Twelve scientific specialists of the Peenemuende team at the front of Building 4488, Redstone Arsenal, Huntsville, Alabama. They led the Army's space efforts at ABMA before transfer of the team to the National Aeronautics and Space Administration (NASA), George C. Marshall Space Flight Center (MSFC). (Left to right) Dr. Ernst Stuhlinger, Director, Research Projects Office; Dr. Helmut Hoelzer, Director, Computation Laboratory; Karl L. Heimburg, Director, Test Laboratory; Dr. Ernst Geissler, Director, Aeroballistics Laboratory; Erich W. Neubert, Director, Systems Analysis Reliability Laboratory; Dr. Walter Haeussermann, Director, Guidance and Control Laboratory; Dr. Wernher von Braun, Director, Development Operations Division; William A. Mrazek, Director, Structures and Mechanics Laboratory; Hans Hueter, Director, System Support Equipment Laboratory; Eberhard Rees, Deputy Director, Development Operations Division; Dr. Kurt Debus, Director, Missile Firing Laboratory; Hans H. Maus, Director, Fabrication and Assembly Engineering Laboratory
Los Alamos National Laboratory Search Site submit About Mission Business Newsroom Publications Los Innovation in New Mexico Los Alamos Collaboration for Explosives Detection (LACED) SensorNexus Exascale Computing Project (ECP) User Facilities Center for Integrated Nanotechnologies (CINT) Los Alamos Neutron
Sandia National Laboratories: Careers: Materials Science
Technology Partnerships Business, Industry, & Non-Profits Government Universities Cooperative Research and Development Agreement (CRADA) Strategic Partnership Projects, Non-Federal Entity (SPP/NFE) Agreements New Sandia's experimental, theoretical, and computational capabilities to establish the state of the art in
Proceedings of the Thirteenth Annual Software Engineering Workshop
NASA Technical Reports Server (NTRS)
1988-01-01
Topics covered in the workshop included studies and experiments conducted in the Software Engineering Laboratory (SEL), a cooperative effort of NASA Goddard Space Flight Center, the University of Maryland, and Computer Sciences Corporation; software models; software products; and software tools.
Center for the Integration of Optical Computing
1992-03-15
their photorefractive properties, calculating the possible interconnect capacities, and collaborating with industry (Brimrose Corp. and Hughes Research...cooperation with Hughes Research Laboratories and Brimrose Corporation we have proceeded with a basic study of CdTe, ZnTe, and the mixed crystals Cd
The Astromaterials X-Ray Computed Tomography Laboratory at Johnson Space Center
NASA Astrophysics Data System (ADS)
Zeigler, R. A.; Blumenfeld, E. H.; Srinivasan, P.; McCubbin, F. M.; Evans, C. A.
2018-04-01
The Astromaterials Curation Office has recently begun incorporating X-ray CT data into the curation processes for lunar and meteorite samples, and long-term curation of that data and serving it to the public represent significant technical challenges.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lottes, S.A.; Bojanowski, C.; Shen, J.
2012-04-09
The computational fluid dynamics (CFD) and computational structural mechanics (CSM) focus areas at Argonne's Transportation Research and Analysis Computing Center (TRACC) initiated a project to support and complement the experimental programs at the Turner-Fairbank Highway Research Center (TFHRC) with high-performance computing based analysis capabilities in August 2010. The project was established with a new interagency agreement between the Department of Energy and the Department of Transportation to provide collaborative research, development, and benchmarking of advanced three-dimensional computational mechanics analysis methods to the aerodynamics and hydraulics laboratories at TFHRC for a period of five years, beginning in October 2010. The analysis methods employ well-benchmarked and supported commercial computational mechanics software. Computational mechanics encompasses the areas of Computational Fluid Dynamics (CFD), Computational Wind Engineering (CWE), Computational Structural Mechanics (CSM), and Computational Multiphysics Mechanics (CMM) applied in Fluid-Structure Interaction (FSI) problems. The major areas of focus of the project are wind and water effects on bridges - superstructure, deck, cables, and substructure (including soil), primarily during storms and flood events - and the risks that these loads pose to structural failure. For flood events at bridges, another major focus of the work is assessment of the risk to bridges caused by scour of stream and riverbed material away from the foundations of a bridge. Other areas of current research include modeling of flow through culverts to improve design allowing for fish passage, modeling of the salt spray transport into bridge girders to address suitability of using weathering steel in bridges, and CFD analysis of the operation of the wind tunnel in the TFHRC wind engineering laboratory.
This quarterly report documents technical progress on the project tasks for the period of October through December 2011.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lottes, S.A.; Bojanowski, C.; Shen, J.
2012-06-28
The computational fluid dynamics (CFD) and computational structural mechanics (CSM) focus areas at Argonne's Transportation Research and Analysis Computing Center (TRACC) initiated a project to support and complement the experimental programs at the Turner-Fairbank Highway Research Center (TFHRC) with high-performance computing based analysis capabilities in August 2010. The project was established with a new interagency agreement between the Department of Energy and the Department of Transportation to provide collaborative research, development, and benchmarking of advanced three-dimensional computational mechanics analysis methods to the aerodynamics and hydraulics laboratories at TFHRC for a period of five years, beginning in October 2010. The analysis methods employ well-benchmarked and supported commercial computational mechanics software. Computational mechanics encompasses the areas of Computational Fluid Dynamics (CFD), Computational Wind Engineering (CWE), Computational Structural Mechanics (CSM), and Computational Multiphysics Mechanics (CMM) applied in Fluid-Structure Interaction (FSI) problems. The major areas of focus of the project are wind and water effects on bridges - superstructure, deck, cables, and substructure (including soil), primarily during storms and flood events - and the risks that these loads pose to structural failure. For flood events at bridges, another major focus of the work is assessment of the risk to bridges caused by scour of stream and riverbed material away from the foundations of a bridge. Other areas of current research include modeling of flow through culverts to improve design allowing for fish passage, modeling of the salt spray transport into bridge girders to address suitability of using weathering steel in bridges, and CFD analysis of the operation of the wind tunnel in the TFHRC wind engineering laboratory.
This quarterly report documents technical progress on the project tasks for the period of January through March 2012.
Predictive Model and Methodology for Heat Treatment Distortion Final Report CRADA No. TC-298-92
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nikkel, D. J.; McCabe, J.
This project was a multi-lab, multi-partner CRADA involving LLNL, Los Alamos National Laboratory, Sandia National Laboratories, Oak Ridge National Laboratory, Martin Marietta Energy Systems, and the industrial partner, the National Center for Manufacturing Sciences (NCMS). A number of member companies of NCMS participated, including General Motors Corporation, Ford Motor Company, The Torrington Company, Gear Research, the Illinois Institute of Technology Research Institute, and Deformation Control Technology. LLNL was the lead laboratory for metrology technology used for validation of the computational tool/methodology. LLNL was also the lead laboratory for the development of the software user interface for the computational tool. This report focuses on the participation of LLNL and NCMS. The purpose of the project was to develop a computational tool/methodology that engineers would use to predict the effects of heat treatment on the size and shape of industrial parts made of quench hardenable alloys. Initially, the target application of the tool was gears for automotive power trains.
The use of graphics in the design of the human-telerobot interface
NASA Technical Reports Server (NTRS)
Stuart, Mark A.; Smith, Randy L.
1989-01-01
The Man-Systems Telerobotics Laboratory (MSTL) of NASA's Johnson Space Center employs computer graphics tools in their design and evaluation of the Flight Telerobotic Servicer (FTS) human/telerobot interface on the Shuttle and on the Space Station. It has been determined by the MSTL that the use of computer graphics can promote more expedient and less costly design endeavors. Several specific examples of computer graphics applied to the FTS user interface by the MSTL are described.
NASA Center for Intelligent Robotic Systems for Space Exploration
NASA Technical Reports Server (NTRS)
1990-01-01
NASA's program for the civilian exploration of space is a challenge to scientists and engineers to help maintain and further develop the United States' position of leadership in a focused sphere of space activity. Such an ambitious plan requires the contribution and further development of many scientific and technological fields. One research area essential for the success of these space exploration programs is Intelligent Robotic Systems. These systems represent a class of autonomous and semi-autonomous machines that can perform human-like functions with or without human interaction. They are fundamental for activities too hazardous for humans or too distant or complex for remote telemanipulation. To meet this challenge, Rensselaer Polytechnic Institute (RPI) has established an Engineering Research Center for Intelligent Robotic Systems for Space Exploration (CIRSSE). The Center was created with a five year $5.5 million grant from NASA submitted by a team of the Robotics and Automation Laboratories. The Robotics and Automation Laboratories of RPI are the result of the merger of the Robotics and Automation Laboratory of the Department of Electrical, Computer, and Systems Engineering (ECSE) and the Research Laboratory for Kinematics and Robotic Mechanisms of the Department of Mechanical Engineering, Aeronautical Engineering, and Mechanics (ME,AE,&M), in 1987. This report is an examination of the activities that are centered at CIRSSE.
Unique life sciences research facilities at NASA Ames Research Center
NASA Technical Reports Server (NTRS)
Mulenburg, G. M.; Vasques, M.; Caldwell, W. F.; Tucker, J.
1994-01-01
The Life Science Division at NASA's Ames Research Center has a suite of specialized facilities that enable scientists to study the effects of gravity on living systems. This paper describes some of these facilities and their use in research. Seven centrifuges, each with its own unique abilities, allow testing of a variety of parameters on test subjects ranging from single cells through hardware to humans. The Vestibular Research Facility allows the study of both centrifugation and linear acceleration on animals and humans. The Biocomputation Center uses computers for 3D reconstruction of physiological systems, and interactive research tools for virtual reality modeling. Psychophysiological, cardiovascular, exercise physiology, and biomechanical studies are conducted in the 12-bed Human Research Facility, and samples are analyzed in the certified Central Clinical Laboratory and other laboratories at Ames. Human bedrest, water immersion, and lower body negative pressure equipment are also available to study physiological changes associated with weightlessness. These and other weightlessness models are used in specialized laboratories for the study of basic physiological mechanisms, metabolism, and cell biology. Visual-motor performance, perception, and adaptation are studied using ground-based models as well as short-term weightlessness experiments (parabolic flights). The unique combination of Life Science research facilities, laboratories, and equipment at Ames Research Center is described in detail in relation to their research contributions.
The Lister Hill National Center for Biomedical Communications.
Smith, K A
1994-09-01
On August 3, 1968, the Joint Resolution of the Congress established the program and construction of the Lister Hill National Center for Biomedical Communications. The facility, dedicated in 1980, contains the latest in computer and communications technologies. The history, program requirements, construction management, and general planning are discussed, including technical issues regarding cabling, systems functions, the heating, ventilation, and air conditioning (HVAC) system, fire suppression, and research and development laboratories, among others.
Environmental Management System
Los Alamos National Laboratory Search Site submit About Mission Business Newsroom Publications Los Innovation in New Mexico Los Alamos Collaboration for Explosives Detection (LACED) SensorNexus Exascale Computing Project (ECP) User Facilities Center for Integrated Nanotechnologies (CINT) Los Alamos Neutron
Los Alamos National Laboratory Search Site submit About Mission Business Newsroom Publications Los Innovation in New Mexico Los Alamos Collaboration for Explosives Detection (LACED) SensorNexus Exascale Computing Project (ECP) User Facilities Center for Integrated Nanotechnologies (CINT) Los Alamos Neutron
Los Alamos National Laboratory Search Site submit About Mission Business Newsroom Publications Los Innovation in New Mexico Los Alamos Collaboration for Explosives Detection (LACED) SensorNexus Exascale Computing Project (ECP) User Facilities Center for Integrated Nanotechnologies (CINT) Los Alamos Neutron
Los Alamos National Laboratory Search Site submit About Mission Business Newsroom Publications Los Innovation in New Mexico Los Alamos Collaboration for Explosives Detection (LACED) SensorNexus Exascale Computing Project (ECP) User Facilities Center for Integrated Nanotechnologies (CINT) Los Alamos Neutron
Los Alamos National Laboratory Search Site submit About Mission Business Newsroom Publications Los Innovation in New Mexico Los Alamos Collaboration for Explosives Detection (LACED) SensorNexus Exascale Computing Project (ECP) User Facilities Center for Integrated Nanotechnologies (CINT) Los Alamos Neutron
Los Alamos National Laboratory Search Site submit About Mission Business Newsroom Publications Los Innovation in New Mexico Los Alamos Collaboration for Explosives Detection (LACED) SensorNexus Exascale Computing Project (ECP) User Facilities Center for Integrated Nanotechnologies (CINT) Los Alamos Neutron
Los Alamos National Laboratory Search Site submit About Mission Business Newsroom Publications Los Innovation in New Mexico Los Alamos Collaboration for Explosives Detection (LACED) SensorNexus Exascale Computing Project (ECP) User Facilities Center for Integrated Nanotechnologies (CINT) Los Alamos Neutron
Los Alamos National Laboratory Search Site submit About Mission Business Newsroom Publications Los Innovation in New Mexico Los Alamos Collaboration for Explosives Detection (LACED) SensorNexus Exascale Computing Project (ECP) User Facilities Center for Integrated Nanotechnologies (CINT) Los Alamos Neutron
Los Alamos National Laboratory Search Site submit About Mission Business Newsroom Publications Los Innovation in New Mexico Los Alamos Collaboration for Explosives Detection (LACED) SensorNexus Exascale Computing Project (ECP) User Facilities Center for Integrated Nanotechnologies (CINT) Los Alamos Neutron
Los Alamos National Laboratory Search Site submit About Mission Business Newsroom Publications Los Innovation in New Mexico Los Alamos Collaboration for Explosives Detection (LACED) SensorNexus Exascale Computing Project (ECP) User Facilities Center for Integrated Nanotechnologies (CINT) Los Alamos Neutron
Los Alamos National Laboratory Search Site submit About Mission Business Newsroom Publications Los Innovation in New Mexico Los Alamos Collaboration for Explosives Detection (LACED) SensorNexus Exascale Computing Project (ECP) User Facilities Center for Integrated Nanotechnologies (CINT) Los Alamos Neutron
Los Alamos National Laboratory Search Site submit About Mission Business Newsroom Publications Los Innovation in New Mexico Los Alamos Collaboration for Explosives Detection (LACED) SensorNexus Exascale Computing Project (ECP) User Facilities Center for Integrated Nanotechnologies (CINT) Los Alamos Neutron
Los Alamos National Laboratory Search Site submit About Mission Business Newsroom Publications Los Innovation in New Mexico Los Alamos Collaboration for Explosives Detection (LACED) SensorNexus Exascale Computing Project (ECP) User Facilities Center for Integrated Nanotechnologies (CINT) Los Alamos Neutron
Los Alamos National Laboratory Search Site submit About Mission Business Newsroom Publications Los Innovation in New Mexico Los Alamos Collaboration for Explosives Detection (LACED) SensorNexus Exascale Computing Project (ECP) User Facilities Center for Integrated Nanotechnologies (CINT) Los Alamos Neutron
Los Alamos National Laboratory Search Site submit About Mission Business Newsroom Publications Los Innovation in New Mexico Los Alamos Collaboration for Explosives Detection (LACED) SensorNexus Exascale Computing Project (ECP) User Facilities Center for Integrated Nanotechnologies (CINT) Los Alamos Neutron
Los Alamos National Laboratory Search Site submit About Mission Business Newsroom Publications Los Innovation in New Mexico Los Alamos Collaboration for Explosives Detection (LACED) SensorNexus Exascale Computing Project (ECP) User Facilities Center for Integrated Nanotechnologies (CINT) Los Alamos Neutron
Los Alamos National Laboratory Search Site submit About Mission Business Newsroom Publications Los Innovation in New Mexico Los Alamos Collaboration for Explosives Detection (LACED) SensorNexus Exascale Computing Project (ECP) User Facilities Center for Integrated Nanotechnologies (CINT) Los Alamos Neutron
Los Alamos National Laboratory Search Site submit About Mission Business Newsroom Publications Los Innovation in New Mexico Los Alamos Collaboration for Explosives Detection (LACED) SensorNexus Exascale Computing Project (ECP) User Facilities Center for Integrated Nanotechnologies (CINT) Los Alamos Neutron
Los Alamos National Laboratory Search Site submit About Mission Business Newsroom Publications Los Innovation in New Mexico Los Alamos Collaboration for Explosives Detection (LACED) SensorNexus Exascale Computing Project (ECP) User Facilities Center for Integrated Nanotechnologies (CINT) Los Alamos Neutron
Los Alamos National Laboratory Search Site submit About Mission Business Newsroom Publications Los Innovation in New Mexico Los Alamos Collaboration for Explosives Detection (LACED) SensorNexus Exascale Computing Project (ECP) User Facilities Center for Integrated Nanotechnologies (CINT) Los Alamos Neutron
Los Alamos National Laboratory Search Site submit About Mission Business Newsroom Publications Los Innovation in New Mexico Los Alamos Collaboration for Explosives Detection (LACED) SensorNexus Exascale Computing Project (ECP) User Facilities Center for Integrated Nanotechnologies (CINT) Los Alamos Neutron
Los Alamos National Laboratory Search Site submit About Mission Business Newsroom Publications Los Innovation in New Mexico Los Alamos Collaboration for Explosives Detection (LACED) SensorNexus Exascale Computing Project (ECP) User Facilities Center for Integrated Nanotechnologies (CINT) Los Alamos Neutron
Sandia National Laboratories: National Security Missions: Nuclear Weapons
Technology Partnerships: Business, Industry, & Non-Profits; Government; Universities; Cooperative Research and Development Agreement (CRADA); Strategic Partnership Projects, Non-Federal Entity (SPP/NFE) Agreements. A mission in which fundamental science, computer models, and unique experimental facilities come together.
’Do-It-Yourself’ Fallout/Blast Shelter Evaluation
1984-03-01
Lawrence Livermore National Laboratory (security classification of this report: Unclassified). The system reads the data from the transient recorder memory through the Computer Automated Measurement and Control (CAMAC) data bus and stores them on an 8-inch... Distribution included: Command and Control Technical Center, Department of Defense, The Pentagon; Emergency Technology Division, Oak Ridge National Laboratory (Attn: Librarian).
Effect of Graphene with Nanopores on Metal Clusters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Hu; Chen, Xianlang; Wang, Lei
Porous graphene, a novel type of defective graphene, shows excellent potential as a support material for metal clusters. In this work, the stability and electronic structures of metal clusters (Pd, Ir, Rh) supported on pristine graphene and on graphene with nanopores of different sizes were investigated by first-principles density functional theory (DFT) calculations. CO adsorption and oxidation on the Pd-graphene system were then chosen to evaluate its catalytic performance. Graphene with nanopores can strongly stabilize the metal clusters and causes a substantial downshift of the d-band center of the clusters, thus decreasing CO adsorption. Binding energies, d-band centers, and adsorption energies all change linearly with nanopore size: a larger nanopore corresponds to metal clusters bound more strongly to the graphene, a smaller downshift of the d-band center, and weaker CO adsorption. With a suitably sized nanopore, Pd clusters supported on graphene have similar CO and O2 adsorption abilities, leading to superior CO tolerance. The DFT-calculated reaction energy barriers show that graphene with nanopores is a superior catalyst for CO oxidation. These properties can play an important role in guiding the preparation of graphene-supported metal catalysts to prevent diffusion or agglomeration of the metal clusters and enhance catalytic performance. This work was supported by the National Basic Research Program of China (973 Program) (2013CB733501) and the National Natural Science Foundation of China (NSFC-21176221, 21136001, 21101137, 21306169, and 91334013). D. Mei acknowledges support from the US Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences & Biosciences. Pacific Northwest National Laboratory (PNNL) is a multiprogram national laboratory operated for DOE by Battelle.
Computing time was granted by the grand challenge of computational catalysis of the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) and by the National Energy Research Scientific Computing Center (NERSC).
Pulmonary Testing Laboratory Computer Application
Johnson, Martin E.
1980-01-01
An interactive computer application reporting patient pulmonary function data has been developed by Washington, D.C. VA Medical Center staff. A permanent on-line database of patient demographics, lung capacity, flows, diffusion, arterial blood gases, and physician interpretation is maintained by a minicomputer at the hospital. A user-oriented application program resulted from development in concert with the clinical users. Rapid program development resulted from employing a newly developed time-saving technique that has found wide application at other VA Medical Centers. Careful attention to user interaction has produced an application program that requires little training and has been used satisfactorily by a number of clinicians.
Ammonia Oxidation by Abstraction of Three Hydrogen Atoms from a Mo–NH 3 Complex
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhattacharya, Papri; Heiden, Zachariah M.; Wiedner, Eric S.
We report ammonia oxidation by homolytic cleavage of all three H atoms from a Mo-15NH3 complex using the 2,4,6-tri-tert-butylphenoxyl radical to afford a Mo-alkylimido (Mo=15NR) complex (R = 2,4,6-tri-t-butylcyclohexa-2,5-dien-1-one). Reductive cleavage of Mo=15NR generates a terminal Mo≡N nitride, and a [Mo-15NH]+ complex is formed by protonation. Computational analysis describes the energetic profile for the stepwise removal of three H atoms from the Mo-15NH3 complex and the formation of Mo=15NR. Acknowledgment. This work was supported as part of the Center for Molecular Electrocatalysis, an Energy Frontier Research Center funded by the U.S. Department of Energy (U.S. DOE), Office of Science, Office of Basic Energy Sciences. EPR and mass spectrometry experiments were performed using EMSL, a national scientific user facility sponsored by the DOE's Office of Biological and Environmental Research and located at PNNL. The authors thank Dr. Eric D. Walter and Dr. Rosalie Chu for assistance in performing EPR and mass spectrometry analysis, respectively. Computational resources were provided by the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory. Pacific Northwest National Laboratory is operated by Battelle for the U.S. DOE.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Dean N.
2011-04-02
This report summarizes work carried out by the Earth System Grid Center for Enabling Technologies (ESG-CET) from October 1, 2010 through March 31, 2011. It discusses ESG-CET highlights for the reporting period, overall progress, period goals, and collaborations, and lists papers and presentations. To learn more about our project and to find previous reports, please visit the ESG-CET Web sites: http://esg-pcmdi.llnl.gov/ and/or https://wiki.ucar.edu/display/esgcet/Home. This report will be forwarded to managers in the Department of Energy (DOE) Scientific Discovery through Advanced Computing (SciDAC) program and the Office of Biological and Environmental Research (OBER), as well as national and international collaborators and stakeholders (e.g., those involved in the Coupled Model Intercomparison Project, phase 5 (CMIP5) for the Intergovernmental Panel on Climate Change (IPCC) 5th Assessment Report (AR5); the Community Earth System Model (CESM); the Climate Science Computational End Station (CCES); SciDAC II: A Scalable and Extensible Earth System Model for Climate Change Science; the North American Regional Climate Change Assessment Program (NARCCAP); the Atmospheric Radiation Measurement (ARM) program; the National Aeronautics and Space Administration (NASA); and the National Oceanic and Atmospheric Administration (NOAA)), and also to researchers working on a variety of other climate model and observation evaluation activities. The ESG-CET executive committee consists of Dean N. Williams, Lawrence Livermore National Laboratory (LLNL); Ian Foster, Argonne National Laboratory (ANL); and Don Middleton, National Center for Atmospheric Research (NCAR).
The ESG-CET team is a group of researchers and scientists with diverse domain knowledge, whose home institutions include eight laboratories and two universities: ANL, Los Alamos National Laboratory (LANL), Lawrence Berkeley National Laboratory (LBNL), LLNL, NASA/Jet Propulsion Laboratory (JPL), NCAR, Oak Ridge National Laboratory (ORNL), Pacific Marine Environmental Laboratory (PMEL)/NOAA, Rensselaer Polytechnic Institute (RPI), and University of Southern California, Information Sciences Institute (USC/ISI). All ESG-CET work is accomplished under DOE open-source guidelines and in close collaboration with the project's stakeholders, domain researchers, and scientists. Through the ESG project, the ESG-CET team has developed and delivered a production environment for climate data from multiple sources (e.g., climate model data such as CMIP (IPCC) and CESM, ocean model data such as the Parallel Ocean Program, observation data such as the Atmospheric Infrared Sounder and Microwave Limb Sounder, and analysis and visualization tools) that serves a worldwide climate research community. Data holdings are distributed across multiple sites including LANL, LBNL, LLNL, NCAR, and ORNL, as well as unfunded partner sites such as the Australian National University (ANU) National Computational Infrastructure (NCI), the British Atmospheric Data Center (BADC), the Geophysical Fluid Dynamics Laboratory/NOAA, the Max Planck Institute for Meteorology (MPI-M), the German Climate Computing Centre (DKRZ), and NASA/JPL. As we transition from development activities to production and operations, the ESG-CET team is tasked with making data available to all users who want to understand it, process it, extract value from it, visualize it, and/or communicate it to others. This ongoing effort is extremely large and complex, but it will be incredibly valuable for building 'science gateways' to critical climate resources (such as CESM, CMIP5, ARM, NARCCAP, Atmospheric Infrared Sounder (AIRS), etc.) for processing the next IPCC assessment report. Continued ESG progress will result in a production-scale system that will empower scientists to attempt new and exciting data exchanges, which could ultimately lead to breakthrough climate science discoveries.
Performance assessment of KORAT-3D on the ANL IBM-SP computer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alexeyev, A.V.; Zvenigorodskaya, O.A.; Shagaliev, R.M.
1999-09-01
The TENAR code is currently being developed at the Russian Federal Nuclear Center (VNIIEF) as a coupled dynamics code for the simulation of transients in VVER and RBMK systems and other nuclear systems. The neutronic module in this code system is KORAT-3D. This module is also one of the most computationally intensive components of the code system. A parallel version of KORAT-3D has been implemented to achieve the goal of obtaining transient solutions in reasonable computational time, particularly for RBMK calculations that involve the application of >100,000 nodes. An evaluation of the KORAT-3D code performance was recently undertaken on the Argonne National Laboratory (ANL) IBM ScalablePower (SP) parallel computer located in the Mathematics and Computer Science Division of ANL. At the time of the study, the ANL IBM-SP computer had 80 processors. This study was conducted under the auspices of a technical staff exchange program sponsored by the International Nuclear Safety Center (INSC).
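Performance assessments of this kind are usually summarized as speedup and parallel efficiency. A minimal sketch of those two metrics, using hypothetical wall-clock timings rather than figures from the KORAT-3D study:

```python
# Speedup and parallel efficiency from wall-clock timings.
# The timing values below are illustrative placeholders, not
# measurements from the KORAT-3D evaluation.

def speedup(t_serial, t_parallel):
    """Classic speedup: serial wall time over parallel wall time."""
    return t_serial / t_parallel

def efficiency(t_serial, t_parallel, n_procs):
    """Parallel efficiency: speedup normalized by processor count."""
    return speedup(t_serial, t_parallel) / n_procs

t1 = 3600.0   # hypothetical single-processor wall time (s)
t80 = 60.0    # hypothetical wall time on 80 processors (s)

s = speedup(t1, t80)          # 60.0
e = efficiency(t1, t80, 80)   # 0.75
print(f"speedup = {s:.1f}x, efficiency = {e:.0%}")
```

Efficiency below 1.0 reflects communication and load-imbalance overheads, which typically grow with processor count for fixed problem size.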
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keyes, D E; McGraw, J R
2006-02-02
Large-scale scientific computation and all of the disciplines that support and help validate it have been placed at the focus of Lawrence Livermore National Laboratory (LLNL) by the Advanced Simulation and Computing (ASC) program of the National Nuclear Security Administration (NNSA) and the Scientific Discovery through Advanced Computing (SciDAC) initiative of the Office of Science of the Department of Energy (DOE). The maturation of simulation as a fundamental tool of scientific and engineering research is underscored in the President's Information Technology Advisory Committee (PITAC) June 2005 finding that "computational science has become critical to scientific leadership, economic competitiveness, and national security". LLNL operates several of the world's most powerful computers, including today's single most powerful, and has undertaken some of the largest and most compute-intensive simulations ever performed, most notably the molecular dynamics simulation that sustained more than 100 Teraflop/s and won the 2005 Gordon Bell Prize. Ultrascale simulation has been identified as one of the highest priorities in DOE's facilities planning for the next two decades. However, computers at architectural extremes are notoriously difficult to use in an efficient manner. Furthermore, each successful terascale simulation only points out the need for much better ways of interacting with the resulting avalanche of data. Advances in scientific computing research have, therefore, never been more vital to the core missions of LLNL than at present. Computational science is evolving so rapidly along every one of its research fronts that to remain on the leading edge, LLNL must engage researchers at many academic centers of excellence.
In FY 2005, the Institute for Scientific Computing Research (ISCR) served as one of LLNL's main bridges to the academic community with a program of collaborative subcontracts, visiting faculty, student internships, workshops, and an active seminar series. The ISCR identifies researchers from the academic community for computer science and computational science collaborations with LLNL and hosts them for both brief and extended visits with the aim of encouraging long-term academic research agendas that address LLNL research priorities. Through these collaborations, ideas and software flow in both directions, and LLNL cultivates its future workforce. The Institute strives to be LLNL's "eyes and ears" in the computer and information sciences, keeping the Laboratory aware of and connected to important external advances. It also attempts to be the "hands and feet" that carry those advances into the Laboratory and incorporate them into practice. ISCR research participants are integrated into LLNL's Computing Applications and Research (CAR) Department, especially into its Center for Applied Scientific Computing (CASC). In turn, these organizations address computational challenges arising throughout the rest of the Laboratory. Administratively, the ISCR flourishes under LLNL's University Relations Program (URP). Together with the other four institutes of the URP, the ISCR navigates a course that allows LLNL to benefit from academic exchanges while preserving national security. While it is difficult to operate an academic-like research enterprise within the context of a national security laboratory, the results declare the challenges well met and worth the continued effort. The pages of this annual report summarize the activities of the faculty members, postdoctoral researchers, students, and guests from industry and other laboratories who participated in LLNL's computational mission under the auspices of the ISCR during FY 2005.
2015-06-01
Approved for public release; distribution is unlimited. The US Army Engineer Research and Development Center (ERDC) solves the nation's toughest engineering and... Framework (PIAF), Timothy K. Perkins and Chris C. Rewerts, Construction Engineering Research Laboratory, U.S. Army Engineer Research and Development Center. Prepared for U.S. Army Corps of Engineers, Washington, DC 20314-1000, under Project P2 335530, "Cultural Reasoning and Ethnographic Analysis for the
LTSS compendium: an introduction to the CDC 7600 and the Livermore Timesharing System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fong, K. W.
1977-08-15
This report is an introduction to the CDC 7600 computer and to the Livermore Timesharing System (LTSS) used by the National Magnetic Fusion Energy Computer Center (NMFECC) and the Lawrence Livermore Laboratory Computer Center (LLLCC or Octopus network) on their 7600s. This report is based on a document originally written specifically about the system as it is implemented at NMFECC but has been broadened to point out differences in implementation at LLLCC. It also contains information about LLLCC not relevant to NMFECC. This report is written for computational physicists who want to prepare large production codes to run under LTSS on the 7600s. The generalized discussion of the operating system focuses on creating and executing controllees. This document and its companion, UCID-17557, CDC 7600 LTSS Programming Stratagems, provide a basis for understanding more specialized documents about individual parts of the system.
Modeling Subsurface Reactive Flows Using Leadership-Class Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mills, Richard T; Hammond, Glenn; Lichtner, Peter
2009-01-01
We describe our experiences running PFLOTRAN - a code for simulation of coupled hydro-thermal-chemical processes in variably saturated, non-isothermal, porous media - on leadership-class supercomputers, including initial experiences running on the petaflop incarnation of Jaguar, the Cray XT5 at the National Center for Computational Sciences at Oak Ridge National Laboratory. PFLOTRAN utilizes fully implicit time-stepping and is built on top of the Portable, Extensible Toolkit for Scientific Computation (PETSc). We discuss some of the hurdles to 'at scale' performance with PFLOTRAN and the progress we have made in overcoming them on leadership-class computer architectures.
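PFLOTRAN's fully implicit time-stepping means every step requires a nonlinear solve, which the real code delegates to PETSc's solvers. The idea can be sketched for a scalar reaction equation; the rate constant and step size below are illustrative, and this is a stand-in for the concept, not PFLOTRAN's actual interface:

```python
# One backward-Euler step solved with Newton's method for du/dt = -k*u**2,
# a scalar stand-in for the fully implicit stepping PFLOTRAN performs
# (the real code hands the coupled nonlinear solve to PETSc).

def backward_euler_step(u_old, dt, k, tol=1e-12, max_iter=50):
    """Solve the residual r(u) = u - u_old + dt*k*u**2 = 0 for the new state."""
    u = u_old  # initial Newton guess: previous state
    for _ in range(max_iter):
        r = u - u_old + dt * k * u * u     # implicit residual
        drdu = 1.0 + 2.0 * dt * k * u      # scalar Jacobian dr/du
        du = -r / drdu                     # Newton update
        u += du
        if abs(du) < tol:
            break
    return u

u, k, dt = 1.0, 1.0, 0.1   # illustrative initial state, rate, step size
for _ in range(10):
    u = backward_euler_step(u, dt, k)
print(f"u after 10 implicit steps: {u:.6f}")
```

The payoff of the implicit form is unconditional stability for stiff chemistry, at the cost of a Jacobian solve per step; in PFLOTRAN that solve is distributed across thousands of cores.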
The flight robotics laboratory
NASA Technical Reports Server (NTRS)
Tobbe, Patrick A.; Williamson, Marlin J.; Glaese, John R.
1988-01-01
The Flight Robotics Laboratory of the Marshall Space Flight Center is described in detail. This facility, containing an eight-degree-of-freedom manipulator, precision air bearing floor, teleoperated motion base, reconfigurable operator's console, and VAX 11/750 computer system, provides simulation capability to study human/system interactions of remote systems. The facility hardware, software, and subsequent integration of these components into a real-time man-in-the-loop simulation for the evaluation of spacecraft contact proximity and dynamics are described.
Laboratory for Computer Science Progress Report 21, July 1983-June 1984.
1984-06-01
Contents include: Distributed Systems; Distributed Consensus; Election of a Leader in a Distributed Ring of Processors; Distributed Network Algorithms; Diagnosis... multiprocessor systems. This facility, funded by the newly formed Strategic Computing Program of the Defense Advanced Research Projects Agency, will enable... Academic Staff: P. Szolovits, Group Leader; R. Patil. Collaborating Investigators: M. Criscitiello, M.D., Tufts-New England Medical Center Hospital; R
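One of the topics listed in this report, election of a leader in a distributed ring of processors, is commonly illustrated with the Chang-Roberts algorithm. The synchronous simulation below is a sketch of that classic algorithm, not an algorithm taken from the report itself; it assumes unique process identifiers:

```python
# Chang-Roberts leader election on a unidirectional ring, simulated
# synchronously. Each process forwards identifiers larger than its own
# and drops the rest; the process whose identifier travels the whole
# ring and returns is elected. Identifiers are assumed unique.

def chang_roberts(ids):
    """Return the elected leader (the maximum id) for a ring of unique ids."""
    n = len(ids)
    messages = list(ids)        # messages[i]: identifier held at process i
    leader = None
    while leader is None:
        nxt = [None] * n
        for i in range(n):
            msg = messages[i]
            if msg is None:
                continue
            j = (i + 1) % n     # single successor on the unidirectional ring
            if msg == ids[j]:
                leader = msg    # id came full circle: process j is elected
            elif msg > ids[j]:
                nxt[j] = msg    # forward identifiers larger than our own
            # smaller identifiers are silently dropped
        messages = nxt
    return leader

print(chang_roberts([3, 1, 4, 7, 5, 9, 2, 6]))
```

Only the largest identifier survives a full trip around the ring, so the algorithm elects the maximum; message complexity is O(n^2) worst case, O(n log n) on average.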
Blast Computations over a Hemicylindrical Aircraft Shelter
1981-07-01
Westmoreland, C.D., "The HULL Hydrodynamics Computer Code", AFWL-TR-76-183, U.S. Air Force Weapons Laboratory, Kirtland Air Force Base, NM (September... Distribution list (excerpt): Commander, Defense Technical Information Center; Director, Weapons Systems Evaluation Group; Director, US Army TRADOC Systems Analysis Activity; Commander, US Army Foreign Science and Technology
Tiny plastic lung mimics human pulmonary function
Science and Innovation at Los Alamos
Public Reading Room: Environmental Documents, Reports
The Sky's the Limit in Math-Related Careers.
ERIC Educational Resources Information Center
Askew, Judy
This booklet introduces readers--particularly women--to some jobs that use mathematical training, in laboratories, computer centers, universities, insurance companies, and government offices. Based on information from women working in mathematics-related fields, it is designed to help women consider various career choices. Sections focus on…
NASA Technical Reports Server (NTRS)
Matthews, Christine G.; Posenau, Mary-Anne; Leonard, Desiree M.; Avis, Elizabeth L.; Debure, Kelly R.; Stacy, Kathryn; Vonofenheim, Bill
1992-01-01
The intent is to provide an introduction to the image processing capabilities available at the Langley Research Center (LaRC) Central Scientific Computing Complex (CSCC). Various image processing software components are described. Information is given concerning the use of these components in the Data Visualization and Animation Laboratory at LaRC.
Building Blueprints: A Clear View of Technology.
ERIC Educational Resources Information Center
College Planning & Management, 2002
2002-01-01
Describes the design of the technology center at Laney College in Oakland, California, which was renovated from a welding shop. The building, which illustrates a "transparency" theme, houses the computer information systems department and serves as a multimedia teaching laboratory for the entire campus and local businesses. Includes…
Press Releases | Argonne National Laboratory
Topics: Electrochemical Energy Science; Center for Transportation Research; Chain Reaction Innovations; Computation; renewable energy such as wind and solar power. April 25, 2018: John Carlisle, director of Chain Reaction Innovations... April 18: Argonne announces second cohort of Chain Reaction Innovations.
Science Education: An Experiment in Facilitating the Learning of Neurophysiology.
ERIC Educational Resources Information Center
Levitan, Herbert
1981-01-01
Summarizes the experiences of a zoology professor attempting to construct a student-centered course in neurophysiology. Various aspects of the organization and conduct of the course are described, including the beginning experience, topics of interest, lecture, laboratory, computer simulation, examinations, student lectures. Evaluation of the…
DCDM1: Lessons Learned from the World's Most Energy Efficient Data Center
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sickinger, David E; Van Geet, Otto D; Carter, Thomas
This presentation discusses the holistic approach used to design the world's most energy-efficient data center, located at the U.S. Department of Energy National Renewable Energy Laboratory (NREL). This high-performance computing (HPC) data center has achieved a trailing twelve-month average power usage effectiveness (PUE) of 1.04 and features a chiller-less design, component-level warm-water liquid cooling, and waste heat capture and reuse. We provide details of the demonstrated PUE and energy reuse effectiveness (ERE) and lessons learned during four years of production operation. Recent efforts to dramatically reduce the water footprint are also discussed. Johnson Controls partnered with NREL and Sandia National Laboratories to deploy a thermosyphon cooler (TSC) as a test bed at NREL's HPC data center, which resulted in a 50% reduction in water usage during the first year of operation. The Thermosyphon Cooler Hybrid System (TCHS) integrates the control of a dry heat rejection device with an open cooling tower.
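PUE and ERE have standard definitions: total facility energy over IT equipment energy, and the same numerator less any energy reused elsewhere. A minimal sketch with illustrative energy figures, not NREL's actual metered values:

```python
# Power usage effectiveness (PUE) and energy reuse effectiveness (ERE),
# per The Green Grid's standard definitions. The energy figures below
# are illustrative, not NREL's measurements.

def pue(total_facility_energy, it_energy):
    """PUE = total facility energy / IT equipment energy (>= 1.0)."""
    return total_facility_energy / it_energy

def ere(total_facility_energy, reused_energy, it_energy):
    """ERE = (total facility energy - energy reused elsewhere) / IT energy."""
    return (total_facility_energy - reused_energy) / it_energy

it = 1000.0      # MWh of IT load (illustrative)
total = 1040.0   # MWh total facility energy, giving a PUE of 1.04
reused = 200.0   # MWh of captured waste heat reused for heating (illustrative)

print(f"PUE = {pue(total, it):.2f}")          # 1.04
print(f"ERE = {ere(total, reused, it):.2f}")  # 0.84
```

Unlike PUE, ERE can fall below 1.0 because reused waste heat is credited against the facility's consumption, which is exactly the effect a heat-capture design targets.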
Laboratory on Legs: An Architecture for Adjustable Morphology with Legged Robots
2012-04-01
For mobile robots, the essential units of actuation, computation, and sensing must be designed to fit within the body of the robot. Additional capabilities will largely depend upon a given activity and should be easily reconfigurable to maximize... PackBot, among others. Two parallel rails, 40 cm long and spaced at a center-to-center distance of 14 cm, span the length of each robot's body.
Panel: If I Only Knew Then What I Know Now
Interactive Spreadsheets in JCE Webware
ERIC Educational Resources Information Center
Coleman, William F.; Fedosky, Edward W.
2005-01-01
A description of the Microsoft Excel spreadsheet simulation Anharmonicity.xls, which can be used to smoothly and continuously switch between a plotted function and its quadratic approximation, is presented. It can be used in a classroom demonstration or incorporated into a student-centered computer-laboratory exercise to examine the qualitative behavior of…
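The function-versus-quadratic comparison the spreadsheet animates can be sketched in a few lines. The Morse potential below is a common anharmonic example; the parameters are illustrative (roughly H2-like) and are not taken from Anharmonicity.xls:

```python
# Morse potential vs. its quadratic (harmonic) Taylor approximation at
# the minimum -- the kind of comparison the Anharmonicity spreadsheet
# animates. Parameters are illustrative, not from the spreadsheet.
import math

D = 4.618    # well depth (eV), illustrative
a = 1.94     # width parameter (1/Angstrom), illustrative
r_e = 0.74   # equilibrium bond length (Angstrom), illustrative

def morse(r):
    """V(r) = D * (1 - exp(-a*(r - r_e)))**2, zero at the minimum."""
    return D * (1.0 - math.exp(-a * (r - r_e))) ** 2

def harmonic(r):
    """Quadratic expansion about r_e: (1/2) V''(r_e) (r-r_e)^2 = D a^2 (r-r_e)^2."""
    return D * a * a * (r - r_e) ** 2

# Near the minimum the two curves agree; away from it the Morse curve
# flattens toward the dissociation limit D while the parabola diverges.
for r in (0.74, 0.80, 1.2, 2.0):
    print(f"r={r:4.2f}  Morse={morse(r):8.4f}  harmonic={harmonic(r):8.4f}")
```

Plotting both curves over a range of r reproduces the qualitative lesson of the spreadsheet: the harmonic model is only trustworthy near equilibrium.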
Laboratory Information Management System (LIMS): A case study
NASA Technical Reports Server (NTRS)
Crandall, Karen S.; Auping, Judith V.; Megargle, Robert G.
1987-01-01
In the late 70's, a refurbishment of the analytical laboratories serving the Materials Division at NASA Lewis Research Center was undertaken. As part of the modernization efforts, a Laboratory Information Management System (LIMS) was to be included. Preliminary studies indicated that a custom-designed system was the best choice to satisfy all of the requirements. A scaled-down version of the original design has been in operation since 1984. The LIMS, a combination of computer hardware and software, provides the chemical characterization laboratory with an information database, a report generator, a user interface, and networking capabilities. This paper is an account of the processes involved in designing and implementing that LIMS.
NASA Technical Reports Server (NTRS)
Barker, L. Keith; Mckinney, William S., Jr.
1989-01-01
The Laboratory Telerobotic Manipulator (LTM) is a seven-degree-of-freedom robot arm. Two of the arms were delivered to Langley Research Center for ground-based research to assess the use of redundant degree-of-freedom robot arms in space operations. Resolved-rate control equations for the LTM are derived. The equations are based on a scheme developed at the Oak Ridge National Laboratory for computing optimized joint angle rates in real time. The optimized joint angle rates actually represent a trade-off, as the hand moves, between small rates (least-squares solution) and those rates which work toward satisfying a specified performance criterion of joint angles. In singularities where the optimization scheme cannot be applied, alternate control equations are devised. The equations developed were evaluated using a real-time computer simulation to control a 3-D graphics model of the LTM.
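The trade-off described above, between least-squares joint rates and rates that work toward a joint-angle criterion, is commonly written with the Jacobian pseudoinverse plus a null-space projection. The sketch below is that generic textbook formulation, not the ORNL-optimized scheme the paper derives; the Jacobian, gain, and preferred angles are illustrative:

```python
# Resolved-rate control for a redundant arm: joint rates from a desired
# hand velocity via the Jacobian pseudoinverse, plus a null-space term
# that nudges joints toward preferred angles without moving the hand.
# A generic textbook formulation, not the ORNL scheme the paper derives.
import numpy as np

def resolved_rate(J, x_dot, q, q_preferred, k=0.5):
    """q_dot = J+ x_dot + (I - J+ J) z, with z = k*(q_preferred - q)."""
    J_pinv = np.linalg.pinv(J)             # least-squares (minimum-norm) inverse
    n = J.shape[1]
    null_proj = np.eye(n) - J_pinv @ J     # projector onto the Jacobian null space
    z = k * (q_preferred - q)              # gradient of the secondary criterion
    return J_pinv @ x_dot + null_proj @ z

# 6-D hand velocity, 7 joints (like the LTM); random full-rank test Jacobian
rng = np.random.default_rng(0)
J = rng.standard_normal((6, 7))
q = np.zeros(7)                            # current joint angles (illustrative)
q_pref = np.full(7, 0.1)                   # preferred joint angles (illustrative)
x_dot = np.array([0.1, 0.0, 0.0, 0.0, 0.0, 0.0])

q_dot = resolved_rate(J, x_dot, q, q_pref)
print(np.allclose(J @ q_dot, x_dot))       # hand velocity is achieved exactly
```

Because the second term lies in the Jacobian's null space, it alters joint motion without disturbing the commanded hand velocity; near singularities the pseudoinverse blows up, which is why the paper devises alternate control equations there.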
NASA Technical Reports Server (NTRS)
1997-01-01
In 1990, Lewis Research Center jointly sponsored a conference with the U.S. Air Force Wright Laboratory focused on high speed imaging. This conference, and early funding by Lewis Research Center, helped to spur work by Silicon Mountain Design, Inc. to break the performance barriers of imaging speed, resolution, and sensitivity through innovative technology. Later, under a Small Business Innovation Research contract with the Jet Propulsion Laboratory, the company designed a real-time image enhancing camera that yields superb, high quality images in 1/30th of a second while limiting distortion. The result is a rapidly available, enhanced image showing significantly greater detail compared to image processing executed on digital computers. Current applications include radiographic and pathology-based medicine, industrial imaging, x-ray inspection devices, and automated semiconductor inspection equipment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fong, K. W.
1977-08-15
This report deals with some techniques in applied programming using the Livermore Timesharing System (LTSS) on the CDC 7600 computers at the National Magnetic Fusion Energy Computer Center (NMFECC) and the Lawrence Livermore Laboratory Computer Center (LLLCC or Octopus network). This report is based on a document originally written specifically about the system as it is implemented at NMFECC but has been revised to accommodate differences between LLLCC and NMFECC implementations. Topics include: maintaining programs, debugging, recovering from system crashes, and using the central processing unit, memory, and input/output devices efficiently and economically. Routines that aid in these procedures are mentioned. The companion report, UCID-17556, An LTSS Compendium, discusses the hardware and operating system and should be read before reading this report.
Nonperturbative methods in HZE ion transport
NASA Technical Reports Server (NTRS)
Wilson, John W.; Badavi, Francis F.; Costen, Robert C.; Shinn, Judy L.
1993-01-01
A nonperturbative analytic solution of the high charge and energy (HZE) Green's function is used to implement a computer code for laboratory ion beam transport. The code is established to operate on the Langley Research Center nuclear fragmentation model used in engineering applications. Computational procedures are established to generate linear energy transfer (LET) distributions for a specified ion beam and target for comparison with experimental measurements. The code is highly efficient and compares well with the perturbation approximations.
Computational Methods for Crashworthiness
NASA Technical Reports Server (NTRS)
Noor, Ahmed K. (Compiler); Carden, Huey D. (Compiler)
1993-01-01
Presentations and discussions from the joint UVA/NASA Workshop on Computational Methods for Crashworthiness held at Langley Research Center on 2-3 Sep. 1992 are included. The presentations addressed activities in the area of impact dynamics. Workshop attendees represented NASA, the Army and Air Force, the Lawrence Livermore and Sandia National Laboratories, the aircraft and automotive industries, and academia. The workshop objectives were to assess the state-of-technology in the numerical simulation of crash and to provide guidelines for future research.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lottes, S.A.; Kulak, R.F.; Bojanowski, C.
2011-12-09
The computational fluid dynamics (CFD) and computational structural mechanics (CSM) focus areas at Argonne's Transportation Research and Analysis Computing Center (TRACC) initiated a project to support and complement the experimental programs at the Turner-Fairbank Highway Research Center (TFHRC) with high-performance computing-based analysis capabilities in August 2010. The project was established with a new interagency agreement between the Department of Energy and the Department of Transportation to provide collaborative research, development, and benchmarking of advanced three-dimensional computational mechanics analysis methods to the aerodynamics and hydraulics laboratories at TFHRC for a period of five years, beginning in October 2010. The analysis methods employ well-benchmarked and supported commercial computational mechanics software. Computational mechanics encompasses the areas of Computational Fluid Dynamics (CFD), Computational Wind Engineering (CWE), Computational Structural Mechanics (CSM), and Computational Multiphysics Mechanics (CMM) applied in Fluid-Structure Interaction (FSI) problems. The major areas of focus of the project are wind and water effects on bridges - superstructure, deck, cables, and substructure (including soil), primarily during storms and flood events - and the risks that these loads pose to structural failure. For flood events at bridges, another major focus of the work is assessment of the risk to bridges caused by scour of stream and riverbed material away from the foundations of a bridge.
Other areas of current research include modeling of flow through culverts to assess them for fish passage, modeling of salt spray transport into bridge girders to address the suitability of using weathering steel in bridges, CFD analysis of the operation of the wind tunnel in the TFHRC wind engineering laboratory, vehicle stability under high wind loading, and the use of electromagnetic shock absorbers to improve vehicle stability under high wind conditions. This quarterly report documents technical progress on the project tasks for the period of July through September 2011.
CSI computer system/remote interface unit acceptance test results
NASA Technical Reports Server (NTRS)
Sparks, Dean W., Jr.
1992-01-01
The validation tests conducted on the Control/Structures Interaction (CSI) Computer System (CCS)/Remote Interface Unit (RIU) are discussed. The CCS/RIU consists of a commercially available, Langley Research Center (LaRC)-programmed, space-flight-qualified computer and a flight data acquisition and filtering computer developed at LaRC. The tests were performed in the Space Structures Research Laboratory (SSRL) and included open-loop excitation, closed-loop control, safing, RIU digital filtering, and RIU stand-alone testing with the CSI Evolutionary Model (CEM) Phase-0 testbed. The test results indicated that the CCS/RIU system is comparable to ground-based systems in performing real-time control-structure experiments.
FY04 Engineering Technology Reports Technology Base
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sharpe, R M
2005-01-27
Lawrence Livermore National Laboratory's Engineering Directorate has two primary discretionary avenues for its investment in technologies: the Laboratory Directed Research and Development (LDRD) program and the "Tech Base" program. This volume summarizes progress on the projects funded for technology-base efforts in FY2004. The Engineering Technical Reports exemplify Engineering's more than 50-year history of researching and developing (LDRD), and reducing to practice (technology-base), the engineering technologies needed to support the Laboratory's missions. Engineering has been a partner in every major program and project at the Laboratory throughout its existence, and has prepared for this role with a skilled workforce and technical resources. This accomplishment is well summarized by Engineering's mission: "Enable program success today and ensure the Laboratory's vitality tomorrow". LDRD is the vehicle for creating those technologies and competencies that are cutting edge. These require a significant level of research or contain some unknown that needs to be fully understood. Tech Base is used to apply those technologies, or adapt them to a Laboratory need. The term commonly used for Tech Base projects is "reduction to practice". Tech Base projects effect the natural transition to reduction-to-practice of scientific or engineering methods that are well understood and established. They represent discipline-oriented, core competency activities that are multi-programmatic in application, nature, and scope.
The objectives of technology-base funding include: (1) the development and enhancement of tools and processes to provide Engineering support capability, such as code maintenance and improved fabrication methods; (2) support of Engineering science and technology infrastructure, such as the installation or integration of a new capability; (3) support for technical and administrative leadership through our technology Centers; and (4) the initial scoping and exploration of selected technology areas with high strategic potential, such as assessment of university, laboratory, and industrial partnerships. Engineering's five Centers, in partnership with the Division Leaders and Department Heads, focus and guide longer-term investments within Engineering. The Centers attract and retain top staff, develop and maintain critical core technologies, and enable programs. Through their technology-base projects, they oversee the application of known engineering approaches and techniques to scientific and technical problems. The Centers and their Directors are as follows: (1) Center for Computational Engineering: Robert M. Sharpe; (2) Center for Microtechnology and Nanotechnology: Raymond P. Mariella, Jr.; (3) Center for Nondestructive Characterization: Harry E. Martz, Jr.; (4) Center for Precision Engineering: Keith Carlisle; and (5) Center for Complex Distributed Systems: Gregory J. Suski, Acting Director.
1981-01-01
Spacelab was a versatile laboratory carried in the Space Shuttle's cargo bay for special research flights. Its various elements could be combined to accommodate the many types of scientific research that could best be performed in space. Spacelab consisted of an enclosed, pressurized laboratory module and open U-shaped pallets located at the rear of the laboratory module. The laboratory module contained utilities, computers, work benches, and instrument racks to conduct scientific experiments in astronomy, physics, chemistry, biology, medicine, and engineering. Equipment, such as telescopes, antennas, and sensors, was mounted on pallets for direct exposure to space. A 1-meter (3.3-ft.) diameter aluminum tunnel, resembling a z-shaped tube, connected the crew compartment (mid deck) to the module. The reusable Spacelab allowed scientists to bring experiment samples back to Earth for post-flight analysis. Spacelab was a cooperative venture of the European Space Agency (ESA) and NASA. ESA was responsible for funding, developing, and building Spacelab, while NASA was responsible for the launch and operational use of Spacelab. Spacelab missions were cooperative efforts between scientists and engineers from around the world. Teams from NASA centers, universities, private industry, government agencies and international space organizations designed the experiments. The Marshall Space Flight Center was NASA's lead center for monitoring the development of Spacelab and managing the program.
Marginal and internal fits of fixed dental prostheses zirconia retainers.
Beuer, Florian; Aggstaller, Hans; Edelhoff, Daniel; Gernet, Wolfgang; Sorensen, John
2009-01-01
CAM (computer-aided manufacturing) and CAD (computer-aided design)/CAM systems facilitate the use of zirconia substructure materials for all-ceramic fixed partial dentures. This in vitro study compared the precision of fit of frameworks milled from semi-sintered zirconia blocks that were designed and machined with two CAD/CAM and one CAM system. Three-unit posterior fixed dental prostheses (FDP) (n=10) were fabricated for standardized dies by: a milling center CAD/CAM system (Etkon), a laboratory CAD/CAM system (Cerec InLab), and a laboratory CAM system (Cercon). After adaptation by a dental technician, the FDP were cemented on definitive dies, embedded and sectioned. The marginal and internal fits were measured under an optical microscope at 50x magnification. A one-way analysis of variance (ANOVA) was used to compare data (alpha=0.05). The mean (S.D.) for the marginal fit and internal fit adaptation were: 29.1 µm (14.0) and 62.7 µm (18.9) for the milling center system, 56.6 µm (19.6) and 73.5 µm (20.6) for the laboratory CAD/CAM system, and 81.4 µm (20.3) and 119.2 µm (37.5) for the laboratory CAM system. One-way ANOVA showed significant differences between systems for marginal fit (P<0.001) and internal fit (P<0.001). All systems showed marginal gaps below 120 µm and were therefore considered clinically acceptable. The CAD/CAM systems were more precise than the CAM system.
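The comparison above is a textbook use of one-way ANOVA. The sketch below reproduces the shape of that analysis on simulated data drawn from the reported means and standard deviations (the raw measurements are not in the abstract, so the samples here are illustrative only).

```python
import numpy as np
from scipy.stats import f_oneway

# Simulated marginal-fit samples (µm) drawn from the reported means/SDs,
# n = 10 per system as in the study. Illustrative only, not the real data.
rng = np.random.default_rng(1)
milling_center = rng.normal(29.1, 14.0, 10)   # milling center CAD/CAM
lab_cadcam     = rng.normal(56.6, 19.6, 10)   # laboratory CAD/CAM
lab_cam        = rng.normal(81.4, 20.3, 10)   # laboratory CAM

F, p = f_oneway(milling_center, lab_cadcam, lab_cam)
print(f"F = {F:.1f}, p = {p:.3g}")  # expect p well below 0.05 for gaps this large
```

A significant omnibus F, as reported (P<0.001), would normally be followed by post-hoc pairwise tests to locate which systems differ.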
NASA Technical Reports Server (NTRS)
Davis, Bruce E.; Elliot, Gregory
1989-01-01
Jackson State University recently established the Center for Spatial Data Research and Applications, a Geographical Information System (GIS) and remote sensing laboratory. Taking advantage of new technologies and new directions in the spatial (geographic) sciences, JSU is building a Center of Excellence in Spatial Data Management. New opportunities for research, applications, and employment are emerging. GIS requires fundamental shifts and new demands in traditional computer science and geographic training. The Center is not merely another computer lab but is one setting the pace in a new applied frontier. GIS and its associated technologies are discussed. The Center's facilities are described. An ARC/INFO GIS runs on a Vax mainframe, with numerous workstations. Image processing packages include ELAS, LIPS, VICAR, and ERDAS. A host of hardware and software peripherals is used in support. Numerous projects are underway, such as the construction of a Gulf of Mexico environmental data base, development of AI in image processing, a land use dynamics study of metropolitan Jackson, and others. A new academic interdisciplinary program in Spatial Data Management is under development, combining courses in Geography and Computer Science. The broad range of JSU's GIS and remote sensing activities is addressed. The impacts on changing paradigms in the university and in the professional world conclude the discussion.
Brookhaven National Laboratory technology transfer report, fiscal year 1987
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1987-01-01
The Brookhaven Office of Research and Technology Applications (ORTA) inaugurated two major initiatives. The effort by our ORTA in collaboration with the National Synchrotron Light Source (NSLS) has succeeded in alerting American industry to the potential of using a synchrotron x-ray source for high resolution lithography. We are undertaking a preconstruction study for the construction of a prototype commercial synchrotron and development of an advanced commercial cryogenic synchrotron (XLS). ORTA sponsored a technology transfer workshop where industry expressed its views on how to transfer accelerator technology during the construction of the prototype commercial machine. The Northeast Regional Utility Initiative brought 14 utilities to a workshop at the Laboratory in November. One recommendation of this workshop was to create a Center at the Laboratory for research support on issues of interest to utilities in the region where BNL has unique capability. The ORTA has initiated discussions with the New York State Science and Technology Commission, Cornell University's world-renowned Nanofabrication Center, and the computer-aided design capabilities at SUNY at Stony Brook to create, centered around the NSLS and the XLS, the leading-edge semiconductor process technology development center when the XLS becomes operational in two and a half years. 1 fig.
Evolution in a centralized transfusion service.
AuBuchon, James P; Linauts, Sandra; Vaughan, Mimi; Wagner, Jeffrey; Delaney, Meghan; Nester, Theresa
2011-12-01
The metropolitan Seattle area has utilized a centralized transfusion service model throughout the modern era of blood banking. This approach has used four laboratories to serve over 20 hospitals and clinics, providing greater capabilities for all at a lower consumption of resources than if each depended on its own laboratory and staff for these functions. In addition, this centralized model has facilitated wider use of the medical capabilities of the blood center's physicians, and a county-wide network of transfusion safety officers is now being developed to increase the impact of the blood center's transfusion expertise at the patient's bedside. Medical expectations and traffic have led the blood center to evolve the centralized model to include on-site laboratories at facilities with complex transfusion requirements (e.g., a children's hospital) and to implement in all the others a system of remote allocation. This new capability places a refrigerator stocked with uncrossmatched units in the hospital but retains control over the dispensing of these through the blood center's computer system; the correct unit can be electronically cross-matched and released on demand, obviating the need for transportation to the hospital and thus speeding transfusion. This centralized transfusion model has withstood the test of time and continues to evolve to meet new situations and ensure optimal patient care. © 2011 American Association of Blood Banks.
STS-133 crew members Mike Barratt and Nicole Stott in cupola
2010-06-08
JSC2010-E-090701 (8 June 2010) --- Several computer monitors are featured in this image photographed during an STS-133 exercise in the systems engineering simulator in the Avionics Systems Laboratory at NASA's Johnson Space Center. The facility includes moving scenes of full-sized International Space Station components over a simulated Earth.
A Cloud Computing Based Patient Centric Medical Information System
NASA Astrophysics Data System (ADS)
Agarwal, Ankur; Henehan, Nathan; Somashekarappa, Vivek; Pandya, A. S.; Kalva, Hari; Furht, Borko
This chapter discusses an emerging concept of a cloud computing based Patient Centric Medical Information System framework that will allow various authorized users to securely access patient records from various Care Delivery Organizations (CDOs) such as hospitals, urgent care centers, doctors, laboratories, and imaging centers, among others, from any location. Such a system must seamlessly integrate all patient records including images such as CT scans and MRIs, which can easily be accessed from any location and reviewed by any authorized user. In such a scenario the storage and transmission of medical records will have to be conducted in a totally secure and safe environment with a very high standard of data integrity, protecting patient privacy and complying with all Health Insurance Portability and Accountability Act (HIPAA) regulations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maxwell, Don E; Ezell, Matthew A; Becklehimer, Jeff
While sites generally have systems in place to monitor the health of Cray computers themselves, often the cooling systems are ignored until a computer failure requires investigation into the source of the failure. The Liebert XDP units used to cool the Cray XE/XK models as well as the Cray proprietary cooling system used for the Cray XC30 models provide data useful for health monitoring. Unfortunately, this valuable information is often available only to custom solutions not accessible by a center-wide monitoring system or is simply ignored entirely. In this paper, methods and tools used to harvest the monitoring data available are discussed, and the implementation needed to integrate the data into a center-wide monitoring system at the Oak Ridge National Laboratory is provided.
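The integration step the paper describes amounts to tagging raw cooling telemetry with provenance and forwarding it to a shared metrics service. The sketch below shows that pattern in generic form; the XDP interfaces and the ORNL monitoring stack are not described in the abstract, so the HTTP/JSON endpoint shape here is an assumption, not the actual implementation.

```python
import json
import time
import urllib.request

# Hypothetical sketch: assume cooling units expose readings we can poll,
# and a center-wide metrics gateway accepts JSON POSTs.

def build_payload(unit_id, metrics, ts=None):
    """Tag raw cooling readings with a source id and timestamp so the
    center-wide system can correlate them with compute-node events."""
    return {"source": unit_id, "ts": ts if ts is not None else time.time(), **metrics}

def forward(gateway_url, payload):
    """POST one tagged reading to the (hypothetical) metrics gateway."""
    req = urllib.request.Request(
        gateway_url,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=5) as resp:
        return resp.status

payload = build_payload("xdp-rack-07", {"coolant_out_c": 18.4, "flow_lpm": 210.0}, ts=0)
print(payload["source"], payload["coolant_out_c"])
```

In a deployment, a poller would call `build_payload` on each sampling interval and `forward` the result, letting the existing monitoring system alarm on coolant temperature or flow excursions alongside node health data.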
NASA Scientists Push the Limits of Computer Technology
NASA Technical Reports Server (NTRS)
1998-01-01
Dr. Donald Frazier, a NASA researcher, uses a blue laser shining through a quartz window into a special mix of chemicals to generate a polymer film on the inside quartz surface. As the chemicals respond to the laser light, they adhere to the glass surface, forming optical films. Dr. Frazier and Dr. Mark S. Paley developed the process in the Space Sciences Laboratory at NASA's Marshall Space Flight Center in Huntsville, AL. Working aboard the Space Shuttle, a science team led by Dr. Frazier formed thin films potentially useful in optical computers with fewer impurities than those formed on Earth. Patterns of these films can be traced onto the quartz surface. In the optical computers of the future, these films could replace electronic circuits and wires, making the systems more efficient and cost-effective, as well as lighter and more compact. Photo credit: NASA/Marshall Space Flight Center.
DOE Office of Scientific and Technical Information (OSTI.GOV)
East, D. R.; Sexton, J.
This was a collaborative effort between Lawrence Livermore National Security, LLC as manager and operator of Lawrence Livermore National Laboratory (LLNL) and IBM TJ Watson Research Center to research, assess feasibility, and develop an implementation plan for a High Performance Computing Innovation Center (HPCIC) in the Livermore Valley Open Campus (LVOC). The ultimate goal of this work was to help advance the State of California and U.S. commercial competitiveness in the arena of High Performance Computing (HPC) by accelerating the adoption of computational science solutions, consistent with recent DOE strategy directives. The desired result of this CRADA was a well-researched, carefully analyzed market evaluation that would identify those firms in core sectors of the US economy seeking to adopt or expand their use of HPC to become more competitive globally, and to define how those firms could be helped by the HPCIC with IBM as an integral partner.
Tropical Ocean and Global Atmosphere (TOGA) heat exchange project: A summary report
NASA Technical Reports Server (NTRS)
Liu, W. T.; Niiler, P. P.
1985-01-01
A pilot data center to compute ocean-atmosphere heat exchange over the tropical ocean is proposed at the Jet Propulsion Laboratory (JPL) in response to the scientific needs of the Tropical Ocean and Global Atmosphere (TOGA) Program. Optimal methods will be used to estimate sea surface temperature (SST), surface wind speed, and humidity from spaceborne observations. A monthly summary of these parameters will be used to compute ocean-atmosphere latent heat exchanges. Monthly fields of surface heat flux over tropical oceans will be constructed using estimations of latent heat exchanges and short wave radiation from satellite data. Verification of all satellite data sets with in situ measurements at a few locations will be provided. The data center will be an experimental active archive where the quality and quantity of data required for TOGA flux computation are managed. The center is essential to facilitate the construction of composite data sets from global measurements taken from different sensors on various satellites. It will provide efficient utilization and easy access to the large volume of satellite data available for studies of ocean-atmosphere energy exchanges.
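Latent heat flux is conventionally estimated from SST, wind speed, and humidity with a bulk aerodynamic formula, which is the kind of computation such a center would apply to its monthly satellite-derived fields. The coefficient values below are typical textbook numbers, not those of the TOGA project.

```python
# Bulk aerodynamic estimate of latent heat flux:
#   Q_E = rho_air * L_v * C_E * U * (q_s - q_a)   [W m^-2]
# Constants are representative values, chosen for illustration.

RHO_AIR = 1.2      # air density, kg m^-3
L_V     = 2.5e6    # latent heat of vaporization, J kg^-1
C_E     = 1.3e-3   # bulk transfer (Dalton) coefficient, dimensionless

def latent_heat_flux(wind_speed, q_sea, q_air):
    """q_sea: saturation specific humidity at the SST (kg/kg);
    q_air: specific humidity of the near-surface air (kg/kg);
    wind_speed: near-surface wind (m/s)."""
    return RHO_AIR * L_V * C_E * wind_speed * (q_sea - q_air)

# 7 m/s wind and typical tropical humidities
q = latent_heat_flux(7.0, 0.020, 0.015)
print(round(q, 1))  # 136.5 W m^-2
```

Monthly flux fields follow by evaluating this formula gridpoint by gridpoint on the monthly-mean SST, wind, and humidity summaries, then combining with shortwave radiation estimates.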
Dehydration of 1-octadecanol over H-BEA: A combined experimental and computational study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Song, Wenji; Liu, Yuanshuai; Barath, Eszter
Liquid phase dehydration of 1-octadecanol, which is intermediately formed during the hydrodeoxygenation of microalgae oil, has been explored in a combined experimental and computational study. The alkyl chain of the C18 alcohol interacts with acid sites during diffusion inside the zeolite pores, resulting in an inefficient utilization of the Brønsted acid sites for samples with high acid site concentrations. The parallel intra- and inter-molecular dehydration pathways having different activation energies pass through alternative reaction intermediates. Formation of surface-bound alkoxide species is the rate-limiting step during intramolecular dehydration, whereas intermolecular dehydration proceeds via a bulky dimer intermediate. Octadecene is the primary dehydration product over H-BEA at 533 K. Despite the main contribution of Brønsted acid sites towards both dehydration pathways, Lewis acid sites are also active in the formation of dioctadecyl ether. The intramolecular dehydration to octadecene and cleavage of the intermediately formed ether, however, require strong BAS. L. Wang, D. Mei and J. A. Lercher acknowledge the partial support from the US Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences & Biosciences. Pacific Northwest National Laboratory (PNNL) is a multiprogram national laboratory operated for DOE by Battelle. Computing time was granted by the grand challenge of computational catalysis of the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) and by the National Energy Research Scientific Computing Center (NERSC). EMSL is a national scientific user facility located at Pacific Northwest National Laboratory (PNNL) and sponsored by DOE's Office of Biological and Environmental Research.
NASA Astrophysics Data System (ADS)
Schulthess, Thomas C.
2013-03-01
The continued thousand-fold improvement in sustained application performance per decade on modern supercomputers keeps opening new opportunities for scientific simulations. But supercomputers have become very complex machines, built with thousands or tens of thousands of complex nodes consisting of multiple CPU cores or, most recently, a combination of CPU and GPU processors. Efficient simulations on such high-end computing systems require tailored algorithms that optimally map numerical methods to particular architectures. These intricacies will be illustrated with simulations of strongly correlated electron systems, where the development of quantum cluster methods, Monte Carlo techniques, as well as their optimal implementation by means of algorithms with improved data locality and high arithmetic density have gone hand in hand with evolving computer architectures. The present work would not have been possible without continued access to computing resources at the National Center for Computational Science of Oak Ridge National Laboratory, which is funded by the Facilities Division of the Office of Advanced Scientific Computing Research, and the Swiss National Supercomputing Center (CSCS) that is funded by ETH Zurich.
Data Visualization and Animation Lab (DVAL) overview
NASA Technical Reports Server (NTRS)
Stacy, Kathy; Vonofenheim, Bill
1994-01-01
The general capabilities of the Langley Research Center Data Visualization and Animation Laboratory are described. These capabilities include digital image processing, 3-D interactive computer graphics, data visualization and analysis, video-rate acquisition and processing of video images, photo-realistic modeling and animation, video report generation, and color hardcopies. A specialized video image processing system is also discussed.
ERIC Educational Resources Information Center
Hacisalihoglu, Gokhan; Hilgert, Uwe; Nash, E. Bruce; Micklos, David A.
2008-01-01
Today's biology educators face the challenge of training their students in modern molecular biology techniques including genomics and bioinformatics. The Dolan DNA Learning Center (DNALC) of Cold Spring Harbor Laboratory has developed and disseminated a bench- and computer-based plant genomics curriculum for biology faculty. In 2007, a five-day…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Franz, James A.; O'Hagan, Molly J.; Ho, Ming-Hsun
2013-12-09
The [Ni(PR2NR′2)2]2+ catalysts (where PR2NR′2 is 1,5-R′-3,7-R-1,5-diaza-3,7-diphosphacyclooctane) are some of the fastest reported for hydrogen production and oxidation; however, chair/boat isomerization and the presence of a fifth solvent ligand have the potential to slow catalysis by incorrectly positioning the pendant amines or blocking the addition of hydrogen. Here, we report the structural dynamics of a series of [Ni(PR2NR′2)2]n+ complexes, characterized by NMR spectroscopy and theoretical modeling. A fast exchange process was observed for the [Ni(CH3CN)(PR2NR′2)2]2+ complexes which depends on the ligand. This exchange process was identified to occur through a three-step mechanism including dissociation of the acetonitrile, boat/chair isomerization of each of the four rings identified by the phosphine ligands (including nitrogen inversion), and reassociation of acetonitrile on the opposite side of the complex. The rate of the chair/boat inversion can be influenced by varying the substituent on the nitrogen atom, but the rate of the overall exchange process is at least an order of magnitude faster than the catalytic rate in acetonitrile, demonstrating that the structural dynamics of the [Ni(PR2NR′2)2]2+ complexes do not hinder catalysis. This material is based upon work supported as part of the Center for Molecular Electrocatalysis, an Energy Frontier Research Center funded by the US Department of Energy, Office of Science, Office of Basic Energy Sciences under FWP56073. Research by J.A.F., M.O., M-H. H., M.L.H, D.L.D., A.M.A., S.R. and R.M.B. was carried out in the Center for Molecular Electrocatalysis, an Energy Frontier Research Center funded by the U.S. Department of Energy, Office of Science. W.J.S. and S.L. were funded by the DOE Office of Science Early Career Research Program through the Office of Basic Energy Sciences. T.L.
was supported by the US Department of Energy, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences & Biosciences. Pacific Northwest National Laboratory (PNNL) is a multiprogram national laboratory operated for DOE by Battelle. Computational resources were provided at W. R. Wiley Environmental Molecular Science Laboratory (EMSL), a national scientific user facility sponsored by the Department of Energy’s Office of Biological and Environmental Research located at Pacific Northwest National Laboratory; the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory; and the Jaguar supercomputer at Oak Ridge National Laboratory (INCITE 2008-2011 award supported by the Office of Science of the U.S. DOE under Contract No. DE-AC0500OR22725).« less
Russ, Alissa L; Weiner, Michael; Russell, Scott A; Baker, Darrell A; Fahner, W Jeffrey; Saleem, Jason J
2012-12-01
Although the potential benefits of more usable health information technologies (HIT) are substantial (reduced HIT support costs, increased work efficiency, and improved patient safety), human factors methods to improve usability are rarely employed. The US Department of Veterans Affairs (VA) has emerged as an early leader in establishing usability laboratories to inform the design of HIT, including its electronic health record. Experience with a usability laboratory at a VA Medical Center provides insights on how to design, implement, and leverage usability laboratories in the health care setting. The VA Health Services Research and Development Service Human-Computer Interaction & Simulation Laboratory emerged as one of the first VA usability laboratories and was intended to provide research-based findings about HIT designs. This laboratory supports rapid prototyping, formal usability testing, and analysis tools to assess existing technologies, alternative designs, and potential future technologies. Although the laboratory has maintained a research focus, it has become increasingly integrated with VA operations, both within the medical center and on a national VA level. With this resource, data-driven recommendations have been provided for the design of HIT applications before and after implementation. The demand for usability testing of HIT is increasing, and information on how to develop usability laboratories for the health care setting is often needed. This article may assist other health care organizations that want to invest in usability resources to improve HIT. The establishment and utilization of usability laboratories in the health care setting may improve HIT designs and promote safe, high-quality care for patients.
Communication and computing technology in biocontainment laboratories using the NEIDL as a model.
McCall, John; Hardcastle, Kath
2014-07-01
The National Emerging Infectious Diseases Laboratories (NEIDL), Boston University, is a globally unique biocontainment research facility housing biosafety level 2 (BSL-2), BSL-3, and BSL-4 laboratories. Located in the BioSquare area at the University's Medical Campus, it is part of a national network of secure facilities constructed to study infectious diseases of major public health concern. The NEIDL allows for basic, translational, and clinical phases of research to be carried out in a single facility with the overall goal of accelerating understanding, treatment, and prevention of infectious diseases. The NEIDL will also act as a center of excellence providing training and education in all aspects of biocontainment research. Within every detail of NEIDL operations is a primary emphasis on safety and security. The ultramodern NEIDL has required a new approach to communications technology solutions in order to ensure safety and security and meet the needs of investigators working in this complex building. This article discusses the implementation of secure wireless networks and private cloud computing to promote operational efficiency, biosecurity, and biosafety with additional energy-saving advantages. The utilization of a dedicated data center, virtualized servers, virtualized desktop integration, multichannel secure wireless networks, and a NEIDL-dedicated Voice over Internet Protocol (VoIP) network are all discussed. © 2014 Federation of European Microbiological Societies. Published by John Wiley & Sons Ltd. All rights reserved.
Venus - Global View Centered at 180 degrees
1996-11-26
This global view of the surface of Venus is centered at 180 degrees east longitude. Magellan synthetic aperture radar mosaics from the first cycle of Magellan mapping, and a 5 degree latitude-longitude grid, are mapped onto a computer-simulated globe to create this image. Data gaps are filled with Pioneer-Venus Orbiter data, or a constant mid-range value. The image was produced by the Solar System Visualization project and the Magellan Science team at the JPL Multimission Image Processing Laboratory. http://photojournal.jpl.nasa.gov/catalog/PIA00478
How to Quickly Import CAD Geometry into Thermal Desktop
NASA Technical Reports Server (NTRS)
Wright, Shonte; Beltran, Emilio
2002-01-01
There are several groups at JPL (Jet Propulsion Laboratory) that are committed to concurrent design efforts, two are featured here. Center for Space Mission Architecture and Design (CSMAD) enables the practical application of advanced process technologies in JPL's mission architecture process. Team I functions as an incubator for projects that are in the Discovery, and even pre-Discovery proposal stages. JPL's concurrent design environment is to a large extent centered on the CAD (Computer Aided Design) file. During concurrent design sessions CAD geometry is ported to other more specialized engineering design packages.
Astronauts Working in Spacelab
NASA Technical Reports Server (NTRS)
1999-01-01
This Quick Time movie captures astronaut Jan Davis and her fellow crew members working in the Spacelab, a versatile laboratory carried in the Space Shuttle's cargo bay for special research flights. Its various elements can be combined to accommodate the many types of scientific research that can best be performed in space. Spacelab consisted of an enclosed, pressurized laboratory module and open U-shaped pallets located at the rear of the laboratory module. The laboratory module contained utilities, computers, work benches, and instrument racks to conduct scientific experiments in astronomy, physics, chemistry, biology, medicine, and engineering. Equipment, such as telescopes, antennas, and sensors, is mounted on pallets for direct exposure to space. A 1-meter (3.3-ft.) diameter aluminum tunnel, resembling a z-shaped tube, connected the crew compartment (mid deck) to the module. The reusable Spacelab allowed scientists to bring experiment samples back to Earth for post-flight analysis. Spacelab was a cooperative venture of the European Space Agency (ESA) and NASA. ESA was responsible for funding, developing, and building Spacelab, while NASA was responsible for the launch and operational use of Spacelab. Spacelab missions were cooperative efforts between scientists and engineers from around the world. Teams from NASA centers, universities, private industry, government agencies and international space organizations designed the experiments. The Marshall Space Flight Center was NASA's lead center for monitoring the development of Spacelab and managing the program.
A Future State for NASA Laboratories - Working in the 21st Century
NASA Technical Reports Server (NTRS)
Kegelman, Jerome T.; Harris, Charles E.; Antcliff, Richard R.; Bushnell, Dennis M.; Dwoyer, Douglas L.
2009-01-01
The name "21st Century Laboratory" is an emerging concept of how NASA (and the world) will conduct research in the very near future. Our approach is to carefully plan for significant technological changes in products, organization, and society. The NASA mission can be the beneficiary of these changes, provided the Agency prepares for the role of 21st Century laboratories in research and technology development and its deployment in this new age. It has been clear for some time now that the technology revolutions, the technology "megatrends" we are now in the midst of, all have a common element centered on advanced computational modeling of small-scale physics. Whether it is nanotechnology, biotechnology, or advanced computational technology, all of these megatrends are converging on science at the very small scale, where it is profoundly important to consider the quantum effects at play with physics at that scale. Whether it is the biotechnology creation of "nanites" designed to mimic our immune system or the creation of nanoscale infotechnology devices allowing an order-of-magnitude increase in computational capability, all involve the quantum physics that serves as the heart of these revolutionary changes.
Virtual Environments in Scientific Visualization
NASA Technical Reports Server (NTRS)
Bryson, Steve; Lisinski, T. A. (Technical Monitor)
1994-01-01
Virtual environment technology is a new way of approaching the interface between computers and humans. Emphasizing display and user control that conform to the user's natural ways of perceiving and thinking about space, virtual environment technologies enhance the ability to perceive and interact with computer-generated graphic information. This enhancement potentially has a major effect on the field of scientific visualization. Current examples of this technology include the Virtual Windtunnel being developed at NASA Ames Research Center. Other major institutions such as the National Center for Supercomputing Applications and SRI International are also exploring this technology. This talk will describe several implementations of virtual environments for use in scientific visualization. Examples include the visualization of unsteady fluid flows (the virtual windtunnel), the visualization of geodesics in curved spacetime, surface manipulation, and examples developed at various laboratories.
Final report: Prototyping a combustion corridor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rutland, Christopher J.; Leach, Joshua
2001-12-15
The Combustion Corridor is a concept in which researchers in combustion and thermal sciences have unimpeded access to large volumes of remote computational results. This will enable remote, collaborative analysis and visualization of state-of-the-art combustion science results. The Engine Research Center (ERC) at the University of Wisconsin - Madison partnered with Lawrence Berkeley National Laboratory, Argonne National Laboratory, Sandia National Laboratories, and several other universities to build and test the first stages of a combustion corridor. The ERC served two important functions in this partnership. First, we work extensively with combustion simulations, so we were able to provide real-world research data sets for testing the Corridor concepts. Second, the ERC was part of an extension of the high-bandwidth DOE National Laboratory connections to universities.
How Data Becomes Physics: Inside the RACF
Ernst, Michael; Rind, Ofer; Rajagopalan, Srini; Lauret, Jerome; Pinkenburg, Chris
2018-06-22
The RHIC & ATLAS Computing Facility (RACF) at the U.S. Department of Energy's (DOE) Brookhaven National Laboratory sits at the center of a global computing network. It connects more than 2,500 researchers around the world with the data generated by millions of particle collisions taking place each second at Brookhaven Lab's Relativistic Heavy Ion Collider (RHIC, a DOE Office of Science User Facility for nuclear physics research), and the ATLAS experiment at the Large Hadron Collider in Europe. Watch this video to learn how the people and computing resources of the RACF serve these scientists to turn petabytes of raw data into physics discoveries.
Training the Future - Swamp Work Activities
2017-07-19
In the Swamp Works laboratory at NASA's Kennedy Space Center in Florida, student interns, from the left, Jeremiah House, Thomas Muller and Austin Langdon are joining agency scientists, contributing in the area of Exploration Research and Technology. House is studying computer/electrical engineering at John Brown University in Siloam Springs, Arkansas. Muller is pursuing a degree in computer engineering and control systems at Florida Tech. Langdon is an electrical engineering major at the University of Kentucky. The agency attracts its future workforce through the NASA Internships, Fellowships and Scholarships (NIFS) Program.
A practical VEP-based brain-computer interface.
Wang, Yijun; Wang, Ruiping; Gao, Xiaorong; Hong, Bo; Gao, Shangkai
2006-06-01
This paper introduces the development of a practical brain-computer interface at Tsinghua University. The system uses frequency-coded steady-state visual evoked potentials to determine the gaze direction of the user. To ensure more universal applicability of the system, approaches for reducing user variation in system performance have been proposed. The information transfer rate (ITR) has been evaluated both in the laboratory and at the Rehabilitation Center of China. The system has proved applicable to more than 90% of people, with a high ITR in living environments.
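The information transfer rate cited in the abstract above is conventionally computed with the Wolpaw ITR formula, which combines the number of selectable targets, classification accuracy, and selection speed. The sketch below is illustrative only: the target count, accuracy, and selection rate are assumed example values, not figures reported by this paper.

```python
import math

def wolpaw_itr_bits_per_selection(n_targets, accuracy):
    """Wolpaw ITR in bits per selection for n equally likely targets.

    ITR = log2(N) + P*log2(P) + (1-P)*log2((1-P)/(N-1))
    """
    n, p = n_targets, accuracy
    if p >= 1.0:
        return math.log2(n)          # perfect accuracy: full log2(N) bits
    if p <= 1.0 / n:                 # at or below chance: no information
        return 0.0
    return (math.log2(n) + p * math.log2(p)
            + (1 - p) * math.log2((1 - p) / (n - 1)))

def itr_bits_per_minute(n_targets, accuracy, selections_per_minute):
    """Scale per-selection ITR by the selection rate."""
    return wolpaw_itr_bits_per_selection(n_targets, accuracy) * selections_per_minute
```

For example, a hypothetical 12-target SSVEP speller at 90% accuracy and 15 selections per minute yields roughly 2.77 bits per selection, or about 41.5 bits/min.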
Merging the Machines of Modern Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wolf, Laura; Collins, Jim
Two recent projects have harnessed supercomputing resources at the US Department of Energy’s Argonne National Laboratory in a novel way to support major fusion science and particle collider experiments. Using leadership computing resources, one team ran fine-grid analysis of real-time data to make near-real-time adjustments to an ongoing experiment, while a second team is working to integrate Argonne’s supercomputers into the Large Hadron Collider/ATLAS workflow. Together these efforts represent a new paradigm of the high-performance computing center as a partner in experimental science.
Space and Earth Sciences, Computer Systems, and Scientific Data Analysis Support, Volume 1
NASA Technical Reports Server (NTRS)
Estes, Ronald H. (Editor)
1993-01-01
This Final Progress Report covers the specific technical activities of Hughes STX Corporation for the last contract triannual period of 1 June through 30 Sep. 1993, in support of assigned task activities at Goddard Space Flight Center (GSFC). It also provides a brief summary of work throughout the contract period of performance on each active task. Technical activity is presented in Volume 1, while financial and level-of-effort data is presented in Volume 2. Technical support was provided to all Division and Laboratories of Goddard's Space Sciences and Earth Sciences Directorates. Types of support include: scientific programming, systems programming, computer management, mission planning, scientific investigation, data analysis, data processing, data base creation and maintenance, instrumentation development, and management services. Mission and instruments supported include: ROSAT, Astro-D, BBXRT, XTE, AXAF, GRO, COBE, WIND, UIT, SMM, STIS, HEIDI, DE, URAP, CRRES, Voyagers, ISEE, San Marco, LAGEOS, TOPEX/Poseidon, Pioneer-Venus, Galileo, Cassini, Nimbus-7/TOMS, Meteor-3/TOMS, FIFE, BOREAS, TRMM, AVHRR, and Landsat. Accomplishments include: development of computing programs for mission science and data analysis, supercomputer applications support, computer network support, computational upgrades for data archival and analysis centers, end-to-end management for mission data flow, scientific modeling and results in the fields of space and Earth physics, planning and design of GSFC VO DAAC and VO IMS, fabrication, assembly, and testing of mission instrumentation, and design of mission operations center.
NCAR's Experimental Real-time Convection-allowing Ensemble Prediction System
NASA Astrophysics Data System (ADS)
Schwartz, C. S.; Romine, G. S.; Sobash, R.; Fossell, K.
2016-12-01
Since April 2015, the National Center for Atmospheric Research's (NCAR's) Mesoscale and Microscale Meteorology (MMM) Laboratory, in collaboration with NCAR's Computational Information Systems Laboratory (CISL), has been producing daily, real-time, 10-member, 48-hr ensemble forecasts with 3-km horizontal grid spacing over the conterminous United States (http://ensemble.ucar.edu). These computationally-intensive, next-generation forecasts are produced on the Yellowstone supercomputer, have been embraced by both amateur and professional weather forecasters, are widely used by NCAR and university researchers, and receive considerable attention on social media. Initial conditions are supplied by NCAR's Data Assimilation Research Testbed (DART) software and the forecast model is NCAR's Weather Research and Forecasting (WRF) model; both WRF and DART are community tools. This presentation will focus on cutting-edge research results leveraging the ensemble dataset, including winter weather predictability, severe weather forecasting, and power outage modeling. Additionally, the unique design of the real-time analysis and forecast system and computational challenges and solutions will be described.
Closed-Loop HIRF Experiments Performed on a Fault Tolerant Flight Control Computer
NASA Technical Reports Server (NTRS)
Belcastro, Celeste M.
1997-01-01
Closed-loop HIRF experiments were performed on a fault tolerant flight control computer (FCC) at the NASA Langley Research Center. The FCC used in the experiments was a quad-redundant flight control computer executing B737 Autoland control laws. The FCC was placed in one of the mode-stirred reverberation chambers in the HIRF Laboratory and interfaced to a computer simulation of the B737 flight dynamics, engines, sensors, actuators, and atmosphere in the Closed-Loop Systems Laboratory. Disturbances to the aircraft associated with wind gusts and turbulence were simulated during tests. Electrical isolation between the FCC under test and the simulation computer was achieved via a fiber optic interface for the analog and discrete signals. Closed-loop operation of the FCC enabled flight dynamics and atmospheric disturbances affecting the aircraft to be represented during tests. Upset was induced in the FCC as a result of exposure to HIRF, and the effect of upset on the simulated flight of the aircraft was observed and recorded. This paper presents a description of these closed-loop HIRF experiments, upset data obtained from the FCC during these experiments, and closed-loop effects on the simulated flight of the aircraft.
Department of Defense In-House RDT and E Activities
1976-10-30
2003-04-24
KENNEDY SPACE CENTER, FLA. - Jim Lloyd, with the Mars Exploration Rover (MER) program, places on MER-1 a computer chip with about 35,000 laser-engraved signatures of visitors to the rovers at the Jet Propulsion Laboratory. The signatures include those of senators, artists, and John Glenn. The identical Mars rovers are scheduled to launch June 5 and June 25 from Cape Canaveral Air Force Station.
NASA Technical Reports Server (NTRS)
1985-01-01
Exactatron, an accurate weighing and spotting system in bowling ball manufacture, was developed by Ebonite International engineers with the assistance of a NASA computer search which identified Jet Propulsion Laboratory (JPL) technology. The JPL research concerned a means of determining the center of an object's mass, and an apparatus for measuring liquid viscosity, enabling Ebonite to identify the exact spotting of the drilling point for top weighting.
Using Experiment and Computer Modeling to Determine the Off-Axis Magnetic Field of a Solenoid
ERIC Educational Resources Information Center
Lietor-Santos, Juan Jose
2014-01-01
The study of the ideal solenoid is a common topic among introductory-based physics textbooks and a typical current arrangement in laboratory hands-on experiences where the magnetic field inside a solenoid is determined at different currents and at different distances from its center using a magnetic probe. It additionally provides a very simple…
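Off the axis, the field of a solenoid has no elementary closed form (an exact treatment requires elliptic integrals), which is why the laboratory exercise pairs measurement with computer modeling. A minimal numerical sketch of such a model, assuming the solenoid is represented as a stack of discrete circular current loops and integrating the Biot-Savart law segment by segment (all parameter values below are illustrative, not from the article):

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability (T*m/A)

def solenoid_field(r_obs, z_obs, radius, length, n_turns, current,
                   n_loops=200, n_segments=360):
    """Biot-Savart field of a solenoid modeled as n_loops stacked loops.

    Returns (B_r, B_z) in tesla at the cylindrical point (r_obs, z_obs),
    with z measured from the solenoid center. The observation point is
    taken in the x-z plane, so B_x is the radial component.
    """
    loop_current = current * n_turns / n_loops  # current carried by each model loop
    br = bz = 0.0
    for i in range(n_loops):
        z_loop = -length / 2 + (i + 0.5) * length / n_loops
        for j in range(n_segments):
            phi = 2 * math.pi * (j + 0.5) / n_segments
            dphi = 2 * math.pi / n_segments
            # wire segment position and direction vector dl (dl_z = 0)
            sx, sy = radius * math.cos(phi), radius * math.sin(phi)
            dlx = -radius * math.sin(phi) * dphi
            dly = radius * math.cos(phi) * dphi
            # displacement from segment to observation point
            rx, ry, rz = r_obs - sx, -sy, z_obs - z_loop
            dist = math.sqrt(rx * rx + ry * ry + rz * rz)
            k = MU0 * loop_current / (4 * math.pi * dist ** 3)
            # dB = k * (dl x r); the y-component cancels by symmetry
            br += k * dly * rz
            bz += k * (dlx * ry - dly * rx)
    return br, bz
```

As a sanity check, on the axis of a long solenoid (length much greater than radius) this sum converges to the textbook result B_z = mu0 * n * I, with n the turns per unit length, and the interior field stays nearly uniform off axis.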
Space shuttle engineering and operations support. Avionics system engineering
NASA Technical Reports Server (NTRS)
Broome, P. A.; Neubaur, R. J.; Welsh, R. T.
1976-01-01
The shuttle avionics integration laboratory (SAIL) requirements for supporting the Spacelab/orbiter avionics verification process are defined. The principal topics are a Spacelab avionics hardware assessment, test operations center/electronic systems test laboratory (TOC/ESL) data processing requirements definition, SAIL (Building 16) payload accommodations study, and projected funding and test scheduling. Because of the complex nature of the Spacelab/orbiter computer systems, the PCM data link, and the high rate digital data system hardware/software relationships, early avionics interface verification is required. The SAIL is a prime candidate test location to accomplish this early avionics verification.
JSC earth resources data analysis capabilities available to EOD revision B
NASA Technical Reports Server (NTRS)
1974-01-01
A list and summary description of all Johnson Space Center electronic laboratory and photographic laboratory capabilities available to earth resources division personnel for processing earth resources data are provided. The electronic capabilities pertain to those facilities and systems that use electronic and/or photographic products as output. The photographic capabilities pertain to equipment that uses photographic images as input and produces electronic and/or photographic products as output; a table summarizes the processing steps. A general hardware description is presented for each of the data processing systems, and the titles of computer programs are used to identify the capabilities and data flow.
A Urinalysis Result Reporting System for a Clinical Laboratory
Sullivan, James E.; Plexico, Perry S.; Blank, David W.
1987-01-01
A menu driven Urinalysis Result Reporting System based on multiple IBM-PC Workstations connected together by a local area network was developed for the Clinical Chemistry Section of the Clinical Pathology Department at the National Institutes of Health's Clinical Center. Two Network File Servers redundantly save the test results of each urine specimen. When all test results for a specimen are entered into the system, the results are transmitted to the Department's Laboratory Computer System where they are made available to the ordering physician. The Urinalysis Data Management System has proven easy to learn and use.
Kavlock, Robert; Dix, David
2010-02-01
Computational toxicology is the application of mathematical and computer models to help assess chemical hazards and risks to human health and the environment. Supported by advances in informatics, high-throughput screening (HTS) technologies, and systems biology, the U.S. Environmental Protection Agency (EPA) is developing robust and flexible computational tools that can be applied to the thousands of chemicals in commerce, and contaminant mixtures found in air, water, and hazardous-waste sites. The Office of Research and Development (ORD) Computational Toxicology Research Program (CTRP) is composed of three main elements. The largest component is the National Center for Computational Toxicology (NCCT), which was established in 2005 to coordinate research on chemical screening and prioritization, informatics, and systems modeling. The second element consists of related activities in the National Health and Environmental Effects Research Laboratory (NHEERL) and the National Exposure Research Laboratory (NERL). The third and final component consists of academic centers working on various aspects of computational toxicology and funded by the U.S. EPA Science to Achieve Results (STAR) program. Together these elements form the key components in the implementation of both the initial strategy, A Framework for a Computational Toxicology Research Program (U.S. EPA, 2003), and the newly released The U.S. Environmental Protection Agency's Strategic Plan for Evaluating the Toxicity of Chemicals (U.S. EPA, 2009a). Key intramural projects of the CTRP include digitizing legacy toxicity testing information into the toxicity reference database (ToxRefDB), predicting toxicity (ToxCast) and exposure (ExpoCast), and creating virtual liver (v-Liver) and virtual embryo (v-Embryo) systems models. U.S. EPA-funded STAR centers are also providing bioinformatics, computational toxicology data and models, and developmental toxicity data and models.
The models and underlying data are being made publicly available through the Aggregated Computational Toxicology Resource (ACToR), the Distributed Structure-Searchable Toxicity (DSSTox) Database Network, and other U.S. EPA websites. While initially focused on improving the hazard identification process, the CTRP is placing increasing emphasis on using high-throughput bioactivity profiling data in systems modeling to support quantitative risk assessments, and in developing complementary higher throughput exposure models. This integrated approach will enable analysis of life-stage susceptibility, and understanding of the exposures, pathways, and key events by which chemicals exert their toxicity in developing systems (e.g., endocrine-related pathways). The CTRP will be a critical component in next-generation risk assessments utilizing quantitative high-throughput data and providing a much higher capacity for assessing chemical toxicity than is currently available.
Use of ``virtual'' field trips in teaching introductory geology
NASA Astrophysics Data System (ADS)
Hurst, Stephen D.
1998-08-01
We designed a series of case studies for Introductory Geology Laboratory courses using computer visualization techniques integrated with traditional laboratory materials. These consist of a comprehensive case study which requires three two-hour long laboratory periods to complete, and several shorter case studies requiring one or two, two-hour laboratory periods. Currently we have prototypes of the Yellowstone National Park, Hawaii volcanoes and the Mid-Atlantic Ridge case studies. The Yellowstone prototype can be used to learn about a wide variety of rocks and minerals, about geothermal activity and hydrology, about volcanic hazards and the hot-spot theory of plate tectonics. The Hawaiian exercise goes into more depth about volcanoes, volcanic rocks and their relationship to plate movements. The Mid-Atlantic Ridge project focuses on formation of new ocean crust and mineral-rich hydrothermal deposits at spreading centers. With new improvements in visualization technology that are making their way to personal computers, we are now closer to the ideal of a "virtual" field trip. We are currently making scenes of field areas in Hawaii and Yellowstone which allow the student to pan around the area and zoom in on interesting objects. Specific rocks in the scene will be able to be "picked up" and studied in three dimensions. This technology improves the ability of the computer to present a realistic simulation of the field area and allows the student to have more control over the presentation. This advanced interactive technology is intuitive to control, relatively cheap and easy to add to existing computer programs and documents.
Testimony to the House Science Space and Technology Committee.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Church, Michael Kenton; Tannenbaum, Benn
Chairman Smith, Ranking Member Johnson, and distinguished members of the Committee on Science, Space, and Technology, I thank you for the opportunity to testify today on the role of science, engineering, and research at Sandia National Laboratories, one of the nation's premier national labs and the nation's largest Federally Funded Research and Development Center (FFRDC) laboratory. I am Dr. Susan Seestrom, Sandia's Associate Laboratories Director for Advanced Science & Technology (AST) and Chief Research Officer (CRO). As CRO I am responsible for research strategy, Laboratory Directed Research & Development (LDRD), partnerships strategy, and technology transfer. As director and line manager for AST I manage capabilities and mission delivery across a variety of the physical and mathematical sciences and engineering disciplines, such as pulsed power, radiation effects, major environmental testing, high performance computing, and modeling and simulation.
National Kidney Registry: 213 transplants in three years.
Veale, Jeffrey; Hil, Garet
2010-01-01
Since its establishment in 2008, the National Kidney Registry has facilitated 213 kidney transplants between unrelated living donors and recipients at 28 transplant centers. Rapid innovations in matching strategies, advanced computer technologies, good communication and an evolving understanding of the processes at participating transplant centers and histocompatibility laboratories are among the factors driving the success of the NKR. Virtual cross match accuracy has improved from 43% to 91% as a result of changes to the HLA typing requirements for potential donors and improved mechanisms to list unacceptable HLA antigens for sensitized patients. A uniform financial agreement among participating centers eliminated a major roadblock to facilitate unbalanced donor kidney exchanges among centers. The NKR transplanted 64% of the patients registered since 2008 and the average waiting time for those transplanted in 2010 was 11 months.
1998-02-27
NASA researcher Dr. Donald Frazier uses a blue laser shining through a quartz window into a special mix of chemicals to generate a polymer film on the inside quartz surface. As the chemicals respond to the laser light, they adhere to the glass surface, forming optical films. Dr. Frazier and Dr. Mark S. Paley developed the process in the Space Sciences Laboratory at NASA's Marshall Space Flight Center in Huntsville, AL. Working aboard the Space Shuttle, a science team led by Dr. Frazier formed thin films potentially useful in optical computers with fewer impurities than those formed on Earth. Patterns of these films can be traced onto the quartz surface. In the optical computers of the future, these films could replace electronic circuits and wires, making the systems more efficient and cost-effective, as well as lighter and more compact. Photo credit: NASA/Marshall Space Flight Center
1999-05-26
NASA researcher Dr. Donald Frazier uses a blue laser shining through a quartz window into a special mix of chemicals to generate a polymer film on the inside quartz surface. As the chemicals respond to the laser light, they adhere to the glass surface, forming optical films. Dr. Frazier and Dr. Mark S. Paley developed the process in the Space Sciences Laboratory at NASA's Marshall Space Flight Center in Huntsville, AL. Working aboard the Space Shuttle, a science team led by Dr. Frazier formed thin films potentially useful in optical computers with fewer impurities than those formed on Earth. Patterns of these films can be traced onto the quartz surface. In the optical computers of the future, these films could replace electronic circuits and wires, making the systems more efficient and cost-effective, as well as lighter and more compact. Photo credit: NASA/Marshall Space Flight Center
DePaolo, Donald J. (Director, Center for Nanoscale Control of Geologic CO2); NCGC Staff
2017-12-09
'Carbon in Underland' was submitted by the Center for Nanoscale Control of Geologic CO2 (NCGC) to the 'Life at the Frontiers of Energy Research' video contest at the 2011 Science for Our Nation's Energy Future: Energy Frontier Research Centers (EFRCs) Summit and Forum. Twenty-six EFRCs created short videos to highlight their mission and their work. This video was selected as one of five winners by a distinguished panel of judges for its 'entertaining animation and engaging explanations of carbon sequestration'. NCGC, an EFRC directed by Donald J. DePaolo at Lawrence Berkeley National Laboratory, is a partnership of scientists from seven institutions: LBNL (lead), Massachusetts Institute of Technology, Lawrence Livermore National Laboratory, Oak Ridge National Laboratory, University of California, Davis, Ohio State University, and Washington University in St. Louis. The Office of Basic Energy Sciences in the U.S. Department of Energy's Office of Science established the 46 Energy Frontier Research Centers (EFRCs) in 2009. These collaboratively organized centers conduct fundamental research focused on 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The overall purpose is to accelerate scientific progress toward meeting the nation's critical energy challenges. The mission of the Center for Nanoscale Control of Geologic CO2 is 'to use new investigative tools, combined with experiments and computer simulations, to build a fundamental understanding of molecular-to-pore-scale processes in fluid-rock systems, and to demonstrate the ability to control critical aspects of flow, transport, and mineralization in porous rock media as applied to geologic sequestration of CO2'. Research topics are: bio-inspired, CO2 (store), greenhouse gas, and interfacial characterization.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mann, R.C.; Weisbin, C.R.; Pin, F.G.
1989-01-01
This paper reviews ongoing and planned research with mobile autonomous robots at the Oak Ridge National Laboratory (ORNL), Center for Engineering Systems Advanced Research (CESAR). Specifically we report on results obtained with the robot HERMIES-IIB in navigation, intelligent sensing, learning, and on-board parallel computing in support of these functions. We briefly summarize an experiment with HERMIES-IIB that demonstrates the capability of smooth transitions between robot autonomy and tele-operation. This experiment results from collaboration among teams at the Universities of Florida, Michigan, Tennessee, and Texas; and ORNL in a program targeted at robotics for advanced nuclear power stations. We conclude by summarizing ongoing R&D with our new mobile robot HERMIES-III, which is equipped with a seven degree-of-freedom research manipulator arm. 12 refs., 4 figs.
Application Modernization at LLNL and the Sierra Center of Excellence
Neely, J. Robert; de Supinski, Bronis R.
2017-09-01
In 2014, Lawrence Livermore National Laboratory began acquisition of Sierra, a pre-exascale system from IBM and Nvidia. It marks a significant shift in direction for LLNL by introducing the concept of heterogeneous computing via GPUs. LLNL's mission requires application teams to prepare for this paradigm shift. Thus, the Sierra procurement required a proposed Center of Excellence that would align the expertise of the chosen vendors with laboratory personnel representing the application developers, system software, and tool providers in a concentrated effort to prepare the laboratory's codes in advance of the system transitioning to production in 2018. Finally, this article presents LLNL's overall application strategy, with a focus on how LLNL is collaborating with IBM and Nvidia to ensure a successful transition of its mission-oriented applications into the exascale era.
Reengineering the project design process
NASA Astrophysics Data System (ADS)
Kane Casani, E.; Metzger, Robert M.
1995-01-01
In response to the National Aeronautics and Space Administration's goal of working faster, better, and cheaper, the Jet Propulsion Laboratory (JPL) has developed extensive plans to minimize cost, maximize customer and employee satisfaction, and implement small- and moderate-size missions. These plans include improved management structures and processes, enhanced technical design processes, the incorporation of new technology, and the development of more economical space- and ground-system designs. The Laboratory's new Flight Projects Implementation Development Office has been chartered to oversee these innovations and the reengineering of JPL's project design process, including establishment of the Project Design Center (PDC) and the Flight System Testbed (FST). Reengineering at JPL implies a cultural change whereby the character of the Laboratory's design process will change from sequential to concurrent and from hierarchical to parallel. The Project Design Center will support missions offering high science return, design to cost, demonstrations of new technology, and rapid development. Its computer-supported environment will foster high-fidelity project life-cycle development and more accurate cost estimating. These improvements signal JPL's commitment to meeting the challenges of space exploration in the next century.
2013-07-26
ISS036-E-025034 (26 July 2013) --- From the International Space Station's Destiny laboratory, European Space Agency astronaut Luca Parmitano, Expedition 36 flight engineer, uses a computer as he partners with Ames Research Center to remotely control a surface rover in California. The experiment, called Surface Telerobotics, will help scientists plan future missions where a robotic rover could prepare a site on a moon or a planet for a crew.
2013-07-26
ISS036-E-025030 (26 July 2013) --- From the International Space Station's Destiny laboratory, European Space Agency astronaut Luca Parmitano, Expedition 36 flight engineer, uses a computer as he partners with Ames Research Center to remotely control a surface rover in California. The experiment, called Surface Telerobotics, will help scientists plan future missions where a robotic rover could prepare a site on a moon or a planet for a crew.
2013-07-26
ISS036-E-025012 (26 July 2013) --- From the International Space Station's Destiny laboratory, European Space Agency astronaut Luca Parmitano, Expedition 36 flight engineer, uses a computer as he partners with Ames Research Center to remotely control a surface rover in California. The experiment, called Surface Telerobotics, will help scientists plan future missions where a robotic rover could prepare a site on a moon or a planet for a crew.
Training the Future - Interns Harvesting & Testing Plant Experim
2017-07-19
In the Space Life Sciences Laboratory at NASA's Kennedy Space Center in Florida, student interns such as Ayla Grandpre are joining agency scientists, contributing in the area of plant growth research for food production in space. Grandpre is majoring in computer science and chemistry at Rocky Mountain College in Billings, Montana. The agency attracts its future workforce through the NASA Internship, Fellowships and Scholarships, or NIFS, Program.
NAVO MSRC Navigator. Fall 2001
2001-01-01
On the cover: virtual environment built by the NAVO MSRC Visualization Center for the Concurrent Computing Laboratory for Materials Simulation at Louisiana State University. This application allows the researchers to visualize a million-atom simulation of an indentor puncturing a block of gallium. A view from the VR Juggler simulator of the CAVE shows particles indicating snow (white) and ice (blue); rainfall is shown on the terrain, and clouds as ...
User manual for semi-circular compact range reflector code: Version 2
NASA Technical Reports Server (NTRS)
Gupta, Inder J.; Burnside, Walter D.
1987-01-01
A computer code has been developed at the Ohio State University ElectroScience Laboratory to analyze a semi-circular paraboloidal reflector with or without a rolled edge at the top and a skirt at the bottom. The code can be used to compute the total near field of the reflector or its individual components at a given distance from the center of the paraboloid. The code computes the fields along a radial, horizontal, vertical or axial cut at that distance. Thus, it is very effective in computing the size of the sweet spot for a semi-circular compact range reflector. This report describes the operation of the code. Various input and output statements are explained. Some results obtained using the computer code are presented to illustrate the code's capability as well as being samples of input/output sets.
Real-Time Hardware-in-the-Loop Simulation of Ares I Launch Vehicle
NASA Technical Reports Server (NTRS)
Tobbe, Patrick; Matras, Alex; Walker, David; Wilson, Heath; Fulton, Chris; Alday, Nathan; Betts, Kevin; Hughes, Ryan; Turbe, Michael
2009-01-01
The Ares Real-Time Environment for Modeling, Integration, and Simulation (ARTEMIS) has been developed for use by the Ares I launch vehicle System Integration Laboratory at the Marshall Space Flight Center. The primary purpose of the Ares System Integration Laboratory is to test the vehicle avionics hardware and software in a hardware-in-the-loop environment to certify that the integrated system is prepared for flight. ARTEMIS has been designed to be the real-time simulation backbone to stimulate all required Ares components for verification testing. ARTEMIS provides high-fidelity dynamics, actuator, and sensor models to simulate an accurate flight trajectory in order to ensure realistic test conditions. ARTEMIS has been designed to take advantage of the advances in underlying computational power now available to support hardware-in-the-loop testing to achieve real-time simulation with unprecedented model fidelity. A modular real-time design relying on a fully distributed computing architecture has been implemented.
Smart Computer-Assisted Markets
NASA Astrophysics Data System (ADS)
McCabe, Kevin A.; Rassenti, Stephen J.; Smith, Vernon L.
1991-10-01
The deregulation movement has motivated the experimental study of auction markets designed for interdependent network industries such as natural gas pipelines or electric power systems. Decentralized agents submit bids to buy commodity and offers to sell transportation and commodity to a computerized dispatch center. Computer algorithms determine prices and allocations that maximize the gains from exchange in the system relative to the submitted bids and offers. The problem is important, because traditionally the scale and coordination economies in such industries were thought to require regulation. Laboratory experiments are used to study feasibility, limitations, incentives, and performance of proposed market designs for deregulation, providing motivation for new theory.
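The surplus-maximizing allocation described above can be illustrated, in a much simplified single-commodity setting without the transportation network, by the textbook double-auction clearing rule: sort bids descending and offers ascending, and trade while the marginal bid still exceeds the marginal offer. The numbers below are invented.

```python
# Minimal double-auction clearing sketch: maximize gains from exchange
# for one commodity. The experimental gas/electricity markets also
# allocate transportation, which this deliberately omits.

def clear_market(bids, offers):
    bids = sorted(bids, reverse=True)    # highest willingness to pay first
    offers = sorted(offers)              # lowest asking price first
    trades, surplus = 0, 0.0
    for b, s in zip(bids, offers):
        if b < s:                        # no further gains from exchange
            break
        trades += 1
        surplus += b - s
    return trades, surplus

trades, surplus = clear_market([10, 8, 6, 4], [3, 5, 7, 9])
print(trades, surplus)  # 2 trades, total surplus (10-3)+(8-5) = 10.0
```

Any price between the last accepted bid and offer clears these two trades; the dispatch-center algorithms in the experiments solve the network-constrained generalization of this problem.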
Networked Instructional Chemistry: Using Technology To Teach Chemistry
NASA Astrophysics Data System (ADS)
Smith, Stanley; Stovall, Iris
1996-10-01
Networked multimedia microcomputers provide new ways to help students learn chemistry and to help instructors manage the learning environment. This technology is used to replace some traditional laboratory work, collect on-line experimental data, enhance lectures and quiz sections with multimedia presentations, provide prelaboratory training for the beginning non-chemistry-major organic laboratory, provide electronic homework for organic chemistry students, give graduate students access to real NMR data for analysis, and provide access to molecular modeling tools. The integration of all of these activities into an active learning environment is made possible by a client-server network of hundreds of computers. This requires not only instructional software but also classroom and course management software, computers, networking, and room management. Combining computer-based work with traditional course material is made possible with software management tools that allow the instructor to monitor the progress of each student and make available an on-line gradebook so students can see their grades and class standing. This client-server based system extends the capabilities of the earlier mainframe-based PLATO system, which was used for instructional computing. This paper outlines the components of a technology center used to support over 5,000 students per semester.
NASA Technical Reports Server (NTRS)
Murray, S.
1999-01-01
In this project, we worked with the University of California at Berkeley/Center for Extreme Ultraviolet Astrophysics and five science museums (the National Air and Space Museum, the Science Museum of Virginia, the Lawrence Hall of Science, the Exploratorium, and the New York Hall of Science) to formulate plans for computer-based laboratories located at these museums. These Science Learning Laboratories would be networked and provided with real Earth and space science observations, as well as appropriate lesson plans, that would allow the general public to directly access and manipulate the actual remote sensing data, much as a scientist would.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sigrin, Benjamin O
High customer acquisition costs remain a persistent challenge in the U.S. residential solar industry. Effective customer acquisition in the residential solar market is increasingly achieved with the help of data analysis and machine learning, whether that means more targeted advertising, understanding customer motivations, or responding to competitors. New research by the National Renewable Energy Laboratory, Sandia National Laboratories, Vanderbilt University, University of Pennsylvania, and the California Center for Sustainable Energy and funded through the U.S. Department of Energy's Solar Energy Evolution and Diffusion (SEEDS) program demonstrates novel computational methods that can help drive down costs in the residential solar industry.
Software process improvement in the NASA software engineering laboratory
NASA Technical Reports Server (NTRS)
Mcgarry, Frank; Pajerski, Rose; Page, Gerald; Waligora, Sharon; Basili, Victor; Zelkowitz, Marvin
1994-01-01
The Software Engineering Laboratory (SEL) was established in 1976 for the purpose of studying and measuring software processes with the intent of identifying improvements that could be applied to the production of ground support software within the Flight Dynamics Division (FDD) at the National Aeronautics and Space Administration (NASA)/Goddard Space Flight Center (GSFC). The SEL has three member organizations: NASA/GSFC, the University of Maryland, and Computer Sciences Corporation (CSC). The concept of process improvement within the SEL focuses on the continual understanding of both process and product as well as goal-driven experimentation and analysis of process change within a production environment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vrieling, P. Douglas
2016-01-01
The Livermore Valley Open Campus (LVOC), a joint initiative of the National Nuclear Security Administration (NNSA), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL), enhances the national security missions of NNSA by promoting greater collaboration between world-class scientists at the national security laboratories, and their partners in industry and academia. Strengthening the science, technology, and engineering (ST&E) base of our nation is one of the NNSA's top goals. By conducting coordinated and collaborative programs, LVOC enhances both the NNSA and the broader national science and technology base, and helps to ensure the health of core capabilities at LLNL and SNL. These capabilities must remain strong to enable the laboratories to execute their primary mission for NNSA.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCallen, R; Salari, K; Ortega, J
2003-05-01
A Working Group Meeting on Heavy Vehicle Aerodynamic Drag was held at Lawrence Livermore National Laboratory on May 29-30, 2003. The purpose of the meeting was to present and discuss suggested guidance and direction for the design of drag reduction devices determined from experimental and computational studies. Representatives from the Department of Energy (DOE)/Office of Energy Efficiency and Renewable Energy/Office of FreedomCAR & Vehicle Technologies, Lawrence Livermore National Laboratory (LLNL), Sandia National Laboratories (SNL), NASA Ames Research Center (NASA), University of Southern California (USC), California Institute of Technology (Caltech), Georgia Tech Research Institute (GTRI), Argonne National Laboratory (ANL), Clarkson University, and PACCAR participated in the meeting. This report contains the technical presentations (viewgraphs) delivered at the Meeting, briefly summarizes the comments and conclusions, provides some highlighted items, and outlines the future action items.
Open-Loop HIRF Experiments Performed on a Fault Tolerant Flight Control Computer
NASA Technical Reports Server (NTRS)
Koppen, Daniel M.
1997-01-01
During the third quarter of 1996, the Closed-Loop Systems Laboratory was established at the NASA Langley Research Center (LaRC) to study the effects of High Intensity Radiated Fields on complex avionic systems and control system components. This new facility provided a link and expanded upon the existing capabilities of the High Intensity Radiated Fields Laboratory at LaRC that were constructed and certified during 1995-96. The scope of the Closed-Loop Systems Laboratory is to place highly integrated avionics instrumentation into a high intensity radiated field environment, interface the avionics to a real-time flight simulation that incorporates aircraft dynamics, engines, sensors, actuators and atmospheric turbulence, and collect, analyze, and model aircraft performance. This paper describes the layout and functionality of the Closed-Loop Systems Laboratory, and the open-loop calibration experiments that led up to the commencement of closed-loop real-time flight experiments.
Interior of the U.S. Laboratory / Destiny module
2001-02-11
STS98-E-5113 (11 February 2001) --- This wide shot, photographed with a digital still camera, shows the interior of the newly attached Destiny laboratory. The crews of Atlantis and the International Space Station opened the laboratory on Feb. 11 and spent the first full day of what are planned to be years of work ahead inside the orbiting science and command center. Station commander William M. (Bill) Shepherd opened the Destiny hatch, and he and shuttle commander Kenneth D. Cockrell ventured inside at 8:38 a.m. (CST), Feb. 11. As depicted in subsequent digital images in this series, members of both crews went to work quickly inside the new module, activating air systems, fire extinguishers, alarm systems, computers and internal communications. The crew also continued equipment transfers from the shuttle to the station.
A hardware/software environment to support R&D in intelligent machines and mobile robotic systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mann, R.C.
1990-01-01
The Center for Engineering Systems Advanced Research (CESAR) serves as a focal point at the Oak Ridge National Laboratory (ORNL) for basic and applied research in intelligent machines. R&D at CESAR addresses issues related to autonomous systems, unstructured (i.e. incompletely known) operational environments, and multiple performing agents. Two mobile robot prototypes (HERMIES-IIB and HERMIES-III) are being used to test new developments in several robot component technologies. This paper briefly introduces the computing environment at CESAR, which includes three hypercube concurrent computers (two on-board the mobile robots), a graphics workstation, a VAX, and multiple VME-based systems (several on-board the mobile robots). The current software environment at CESAR is intended to satisfy several goals, e.g.: code portability, re-usability in different experimental scenarios, modularity, concurrent computer hardware transparent to the applications programmer, future support for multiple mobile robots, support for human-machine interface modules, and support for integration of software from other, geographically disparate laboratories with different hardware set-ups. 6 refs., 1 fig.
1998-11-04
Computer simulation of atmospheric flow corresponds well to images taken during the second Geophysical Fluid Flow Cell (GFFC) mission. The top shows a view from the pole, while the bottom shows a view from the equator. Red corresponds to hot fluid rising while blue shows cold fluid falling. This simulation was developed by Anil Deane of the University of Maryland, College Park and Paul Fischer of Argonne National Laboratory. Credit: NASA/Goddard Space Flight Center
NASA Gulf of Mexico Initiative Hypoxia Research
NASA Technical Reports Server (NTRS)
Armstrong, Curtis D.
2012-01-01
The Applied Science & Technology Project Office at Stennis Space Center (SSC) manages NASA's Gulf of Mexico Initiative (GOMI). Addressing short-term crises and long-term issues, GOMI participants seek to understand the environment using remote sensing, in-situ observations, laboratory analyses, field observations and computational models. New capabilities are transferred to end-users to help them make informed decisions. Some GOMI activities of interest to the hypoxia research community are highlighted.
2003-04-24
KENNEDY SPACE CENTER, FLA. - Jim Lloyd, with the Mars Exploration Rover (MER) program, points to the place on MER-1 where he will place a computer chip with about 35,000 laser-engraved signatures of visitors to the rovers at the Jet Propulsion Laboratory. The signatures include those of senators, artists, and John Glenn. The identical Mars rovers are scheduled to launch June 5 and June 25 from Cape Canaveral Air Force Station.
2003-04-24
KENNEDY SPACE CENTER, FLA. - This hand points to the place on the Mars Exploration Rover 1 where a computer chip with about 35,000 laser-engraved signatures of visitors to the Jet Propulsion Laboratory will be placed. The first rover already has one. The signatures include those of senators, artists, and John Glenn. The identical Mars rovers are scheduled to launch June 5 and June 25 from Cape Canaveral Air Force Station.
Fourier spectroscopy with a one-million-point transformation
NASA Technical Reports Server (NTRS)
Connes, J.; Delouis, H.; Connes, P.; Guelachvili, G.; Maillard, J.; Michel, G.
1972-01-01
A new type of interferometer for use in Fourier spectroscopy has been devised at the Aime Cotton Laboratory of the National Center for Scientific Research (CNRS), Orsay, France. With this interferometer and newly developed computational techniques, interferograms comprising as many as one million samples can now be transformed. The techniques are described, and examples of spectra of thorium and holmium, derived from one-million-point interferograms, are presented.
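For a sense of the scale involved, a modern FFT library transforms a synthetic one-million-point interferogram in well under a second on commodity hardware. The sketch below is not the CNRS code; the two fringe frequencies are invented and chosen to fall exactly on FFT bins so the line positions come out cleanly.

```python
# Illustration only: transform a synthetic one-million-point
# interferogram of two cosine fringes and locate the two lines.
import numpy as np

n = 2 ** 20                                # ~one million samples
x = np.arange(n)
f1, f2 = 1024 / n, 4096 / n                # exact FFT-bin frequencies
interferogram = (np.cos(2 * np.pi * f1 * x)
                 + 0.5 * np.cos(2 * np.pi * f2 * x))

spectrum = np.abs(np.fft.rfft(interferogram))
peaks = sorted(np.argsort(spectrum)[-2:])  # indices of the two strongest bins
print(peaks)  # [1024, 4096]
```

Real interferogram lines rarely sit on exact bins, so practical reductions apply apodization and phase correction before locating peaks.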
What’s Wrong With Automatic Speech Recognition (ASR) and How Can We Fix It?
2013-03-01
Jordan Cohen, International Computer Science Institute, 1947 Center Street, Suite 600, Berkeley, CA 94704. March 2013 final report. ... This report was cleared for public release by the 88th Air Base Wing Public Affairs Office and is available to the general public, including foreign ... 711th Human Performance Wing, Air Force Research Laboratory. This report is published in the interest of scientific and technical ...
NASA Technical Reports Server (NTRS)
Cohen, Jarrett
1999-01-01
Parallel computers built out of mass-market parts are cost-effectively performing data processing and simulation tasks. The Supercomputing (now known as "SC") series of conferences celebrated its 10th anniversary last November. While vendors have come and gone, the dominant paradigm for tackling big problems still is a shared-resource, commercial supercomputer. Growing numbers of users needing a cheaper or dedicated-access alternative are building their own supercomputers out of mass-market parts. Such machines are generally called Beowulf-class systems after the 11th century epic. This modern-day Beowulf story began in 1994 at NASA's Goddard Space Flight Center. At this laboratory for the Earth and space sciences, computing managers threw down a gauntlet to develop a $50,000 gigaFLOPS workstation for processing satellite data sets. Soon, Thomas Sterling and Don Becker were working on the Beowulf concept at the University Space Research Association (USRA)-run Center of Excellence in Space Data and Information Sciences (CESDIS). Beowulf clusters mix three primary ingredients: commodity personal computers or workstations, low-cost Ethernet networks, and the open-source Linux operating system. One of the larger Beowulfs is Goddard's Highly-parallel Integrated Virtual Environment, or HIVE for short.
Yang, Yunpeng; Jiang, Shan; Yang, Zhiyong; Yuan, Wei; Dou, Huaisu; Wang, Wei; Zhang, Daguang; Bian, Yuan
2017-04-01
Biopsy is currently the decisive method of lung cancer diagnosis, but lung biopsy is time-consuming, complex, and inaccurate. A computed tomography-compatible robot for rapid and precise lung biopsy is therefore developed in this article. According to the actual operation process, the robot is divided into two modules: a 4-degree-of-freedom position module for locating the puncture point, suitable for almost all patient positions, and a 3-degree-of-freedom tendon-based orientation module with a remote center of motion, compact and computed tomography-compatible, to orient and insert the needle automatically inside the computed tomography bore. The workspace of the robot surrounds the patient's thorax, and the needle tip traces a cone under the patient's skin. A new error model of the robot based on screw theory is proposed, which treats structure error and actuation error as screw motions. Simulation is carried out to verify the precision of the error model, contrasted with compensation via inverse kinematics. The results of an insertion experiment on a specific phantom prove the feasibility of the robot, with a mean error of 1.373 mm in a laboratory environment, which is accurate enough to replace manual operation.
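The basic operation behind such screw-theoretic models is the SE(3) exponential map, which turns a twist (a rotation axis plus a translation component) and a magnitude into a rigid-body transform. The sketch below computes it in the standard Rodrigues-based closed form with invented inputs; it does not reproduce the paper's specific error model.

```python
# SE(3) exponential of a twist (unit rotation axis w, translation
# component v, magnitude theta), in the standard closed form.
import numpy as np

def hat(w):
    """Skew-symmetric (cross-product) matrix of a 3-vector."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def exp_twist(w, v, theta):
    """4x4 homogeneous transform exp([xi] theta) for a unit axis w."""
    W = hat(w)
    R = np.eye(3) + np.sin(theta) * W + (1 - np.cos(theta)) * W @ W
    G = (np.eye(3) * theta + (1 - np.cos(theta)) * W
         + (theta - np.sin(theta)) * W @ W)
    T = np.eye(4)
    T[:3, :3] = R        # rotation block
    T[:3, 3] = G @ v     # translation block
    return T

# A quarter-turn about the z axis with no translation component:
T = exp_twist(np.array([0.0, 0.0, 1.0]), np.zeros(3), np.pi / 2)
print(np.round(T, 3))
```

An error model in this framework composes small twists like this one onto the nominal kinematics and propagates them to the needle tip.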
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Jeff
"Carbon in Underland" was submitted by the Center for Nanoscale Controls on Geologic CO2 (NCGC) to the "Life at the Frontiers of Energy Research" video contest at the 2011 Science for Our Nation's Energy Future: Energy Frontier Research Centers (EFRCs) Summit and Forum. Twenty-six EFRCs created short videos to highlight their mission and their work. This video was selected as one of five winners by a distinguished panel of judges for its "entertaining animation and engaging explanations of carbon sequestration". NCGC, an EFRC directed by Donald J. DePaolo at Lawrence Berkeley National Laboratory is a partnership of scientists from sevenmore » institutions: LBNL (lead) Massachusetts Institute of Technology, Lawrence Livermore National Laboratory, Oak Ridge National Laboratory, University of California, Davis, Ohio State University, and Washington University in St. Louis. The Office of Basic Energy Sciences in the U.S. Department of Energy's Office of Science established the 46 Energy Frontier Research Centers (EFRCs) in 2009. These collaboratively-organized centers conduct fundamental research focused on 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The overall purpose is to accelerate scientific progress toward meeting the nation's critical energy challenges. The mission of the Center for Nanoscale Control of Geologic CO2 is 'to use new investigative tools, combined with experiments and computer simulations, to build a fundamental understanding of molecular-to-pore-scale processes in fluid-rock systems, and to demonstrate the ability to control critical aspects of flow, transport, and mineralization in porous rock media as applied to geologic sequestration of CO2. Research topics are: bio-inspired, CO2 (store), greenhouse gas, and interfacial characterization.« less
The Nuclear Energy Advanced Modeling and Simulation Enabling Computational Technologies FY09 Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Diachin, L F; Garaizar, F X; Henson, V E
2009-10-12
In this document we report on the status of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Enabling Computational Technologies (ECT) effort. In particular, we provide the context for ECT in the broader NEAMS program and describe the three pillars of the ECT effort, namely, (1) tools and libraries, (2) software quality assurance, and (3) computational facility (computers, storage, etc.) needs. We report on our FY09 deliverables to determine the needs of the integrated performance and safety codes (IPSCs) in these three areas and lay out the general plan for software quality assurance to meet the requirements of DOE and the DOE Advanced Fuel Cycle Initiative (AFCI). We conclude with a brief description of our interactions with the Idaho National Laboratory computer center to determine what is needed to expand their role as a NEAMS user facility.
Einstein, Andrew J.; Berman, Daniel S.; Min, James K.; Hendel, Robert C.; Gerber, Thomas C.; Carr, J. Jeffrey; Cerqueira, Manuel D.; Cullom, S. James; DeKemp, Robert; Dickert, Neal; Dorbala, Sharmila; Garcia, Ernest V.; Gibbons, Raymond J.; Halliburton, Sandra S.; Hausleiter, Jörg; Heller, Gary V.; Jerome, Scott; Lesser, John R.; Fazel, Reza; Raff, Gilbert L.; Tilkemeier, Peter; Williams, Kim A.; Shaw, Leslee J.
2014-01-01
Objective To identify key components of a radiation accountability framework fostering patient-centered imaging and shared decision-making in cardiac imaging. Background An NIH-NHLBI/NCI-sponsored symposium was held in November 2012 to address these issues. Methods Symposium participants, working in three tracks, identified key components of a framework to target critical radiation safety issues for the patient, the laboratory, and the larger population of patients with known or suspected cardiovascular disease. Results Use of ionizing radiation during an imaging procedure should be disclosed to all patients by the ordering provider at the time of ordering, and reinforced by the performing provider team. An imaging protocol with effective dose ≤3 mSv is considered very low risk, not warranting extensive discussion or written consent. However, a protocol effective dose ≥20 mSv was proposed as a level requiring particular attention in terms of shared decision-making and either formal discussion or written informed consent. Laboratory reporting of radiation dosimetry is a critical component of creating a quality laboratory fostering a patient-centered environment with transparent procedural methodology. Efforts should be directed to avoiding testing involving radiation in patients with inappropriate indications. Standardized reporting and diagnostic reference levels for computed tomography and nuclear cardiology are important for the goal of public reporting of laboratory radiation dose levels in conjunction with diagnostic performance. Conclusions The development of cardiac imaging technologies revolutionized cardiology practice by allowing routine, noninvasive assessment of myocardial perfusion and anatomy. It is now incumbent upon the imaging community to create an accountability framework to safely drive appropriate imaging utilization. PMID:24530677
Computational Analyses of Offset Stream Nozzles for Noise Reduction
NASA Technical Reports Server (NTRS)
Dippold, Vance, III; Foster, Lancert; Wiese, Michael
2007-01-01
The Wind computational fluid dynamics code was used to perform a series of simulations on two offset stream nozzle concepts for jet noise reduction. The first concept used an S-duct to direct the secondary stream to the lower side of the nozzle. The second concept used vanes to turn the secondary flow downward. The analyses were completed in preparation of tests conducted in the NASA Glenn Research Center Aeroacoustic Propulsion Laboratory. The offset stream nozzles demonstrated good performance and reduced the amount of turbulence on the lower side of the jet plume. The computer analyses proved instrumental in guiding the development of the final test configurations and giving insight into the flow mechanics of offset stream nozzles. The computational predictions were compared with flowfield results from the jet rig testing and showed excellent agreement.
Network-based approaches to climate knowledge discovery
NASA Astrophysics Data System (ADS)
Budich, Reinhard; Nyberg, Per; Weigel, Tobias
2011-11-01
Climate Knowledge Discovery Workshop; Hamburg, Germany, 30 March to 1 April 2011 Do complex networks combined with semantic Web technologies offer the next generation of solutions in climate science? To address this question, a first Climate Knowledge Discovery (CKD) Workshop, hosted by the German Climate Computing Center (Deutsches Klimarechenzentrum (DKRZ)), brought together climate and computer scientists from major American and European laboratories, data centers, and universities, as well as representatives from industry, the broader academic community, and the semantic Web communities. The participants, representing six countries, were concerned with large-scale Earth system modeling and computational data analysis. The motivation for the meeting was the growing problem that climate scientists generate data faster than it can be interpreted and the need to prepare for further exponential data increases. Current analysis approaches are focused primarily on traditional methods, which are best suited for large-scale phenomena and coarse-resolution data sets. The workshop focused on the open discussion of ideas and technologies to provide the next generation of solutions to cope with the increasing data volumes in climate science.
Telemetry Data Collection from Oscar Satellite
NASA Technical Reports Server (NTRS)
Haddock, Paul C.; Horan, Stephen
1998-01-01
This paper discusses the design, configuration, and operation of a satellite station built for the Center for Space Telemetering and Telecommunications Laboratory in the Klipsch School of Electrical and Computer Engineering at New Mexico State University (NMSU). This satellite station consists of a computer-controlled antenna tracking system, 2m/70cm transceiver, satellite tracking software, and a demodulator. The satellite station receives satellite telemetry, allows for voice communications, and will be used in future classes. Currently this satellite station is receiving telemetry from an amateur radio satellite, UoSAT-OSCAR-11. Amateur radio satellites are referred to as Orbiting Satellites Carrying Amateur Radio (OSCAR) satellites, as discussed in the next section.
Developing the human-computer interface for Space Station Freedom
NASA Technical Reports Server (NTRS)
Holden, Kritina L.
1991-01-01
For the past two years, the Human-Computer Interaction Laboratory (HCIL) at the Johnson Space Center has been involved in prototyping and prototype reviews in support of the definition phase of the Space Station Freedom program. On the Space Station, crew members will be interacting with multi-monitor workstations where interaction with several displays at one time will be common. The HCIL has conducted several experiments to begin to address design issues for this complex system. Experiments have dealt with the design of ON/OFF indicators, the movement of the cursor across multiple monitors, and the importance of various windowing capabilities for users performing multiple tasks simultaneously.
Laboratory for Atmospheres: Philosophy, Organization, Major Activities, and 2001 Highlights
NASA Technical Reports Server (NTRS)
Hoegy, Walter R.; Cote, Charles, E.
2002-01-01
How can we improve our ability to predict the weather? How is the Earth's climate changing? What can the atmospheres of other planets teach us about our own? The Laboratory for Atmospheres is helping to answer these and other scientific questions. The Laboratory conducts a broad theoretical and experimental research program studying all aspects of the atmospheres of the Earth and other planets, including their structural, dynamical, radiative, and chemical properties. Vigorous research is central to NASA's exploration of the frontiers of knowledge. NASA scientists play a key role in conceiving new space missions, providing mission requirements, and carrying out research to explore the behavior of planetary systems, including, notably, the Earth's. Our Laboratory's scientists also supply outside scientists with technical assistance and scientific data to further investigations not immediately addressed by NASA itself. The Laboratory for Atmospheres is a vital participant in NASA's research program. The Laboratory is part of the Earth Sciences Directorate based at NASA's Goddard Space Flight Center in Greenbelt, Maryland. The Directorate itself comprises the Global Change Data Center; the Earth and Space Data Computing Division; three laboratories: the Laboratory for Atmospheres, the Laboratory for Terrestrial Physics, and the Laboratory for Hydrospheric Processes; and the Goddard Institute for Space Studies (GISS) in New York, New York. In this report, you will find a statement of our philosophy and a description of our role in NASA's mission. You'll also find a broad description of our research and a summary of our scientists' major accomplishments in 2001. The report also presents useful information on human resources, scientific interactions, and outreach activities with the outside community. For your convenience, we have published a version of this report on the Internet.
Our Web site includes links to additional information about the Laboratory's Offices and Branches. You can find us on the World Wide Web at http://atmospheres.gsfc.nasa.gov.
Mask Matching for Linear Feature Detection.
1987-01-01
decide which matched masks are part of a linear feature by simple thresholding of the confidence measures. However, it is shown in a companion report... Laboratory, Center for Automation Research, University of Maryland, January 1987.
Recursive Gradient Estimation Using Splines for Navigation of Autonomous Vehicles.
1985-07-01
Recursive Gradient Estimation Using Splines for Navigation of Autonomous Vehicles, C. N. Shen, July 1985, US Army Armament Research and Development Center, Large Caliber Weapon Systems Laboratory. ...which require autonomous vehicles. Essential to these robotic vehicles is an adequate and efficient computer vision system. A potentially more
Training the Future - Swamp Work Activities
2017-07-19
In the Swamp Works laboratory at NASA's Kennedy Space Center in Florida, student interns such as Thomas Muller, left, and Austin Langdon are joining agency scientists, contributing in the area of Exploration Research and Technology. Muller is pursuing a degree in computer engineering and control systems at Florida Tech. Langdon is an electrical engineering major at the University of Kentucky. The agency attracts its future workforce through the NASA Internships, Fellowships and Scholarships (NIFS) Program.
Langley applications experiments data management system study. [for space shuttles
NASA Technical Reports Server (NTRS)
Lanham, C. C., Jr.
1975-01-01
A data management system study is presented that defines, in functional terms, the most cost effective ground data management system to support Advanced Technology Laboratory (ATL) flights of the space shuttle. Results from each subtask performed and the recommended system configuration for reformatting the experiment instrumentation tapes to computer compatible tape are examined. Included are cost factors for development of a mini control center for real-time support of the ATL flights.
2005 Precision Strike Technology Symposium
2005-10-20
Radars in production; mission computer software functionality to drive mission system requirements; liquid cooling, expanded cooling capability and flow... Targeting demonstration using the APL Precision Target Locator Demonstrator, Mr. Ben Huguenin and Mr. Joe Schissler, Johns Hopkins University Applied Physics Laboratory... October 18-20, 2005, Kossiakoff Conference Center, The Johns Hopkins University/Applied Physics Laboratory, Laurel, MD. David K. Sanders
Journal of Naval Science. Volume 2, Number 2. April 1976
1976-04-01
with bold lines to permit reduction in block making. A recent photograph and biographical note of the Author(s) will also be welcomed. Views and... Research Laboratory and of the Naval Underwater Systems Center aboard. The U.S. National Aeronautics and Space Administration (NASA) provided... F. Garcia, Fault Isolation Computer Methods, NASA Contractor Report CR-1758, February 1971. P. A. Payne, D. R. Towill and K. J. Baker
2000-01-01
Contents: One Is Best?; Meet Your Customer Assistance Team; ERDC MSRC Computer Systems Are Gems; ERDC MSRC Contributions to SC99. The... Customer Assistance Center (CAC) for several years and has achieved a reputation as an onsite Kerberos and portable batch system (PBS) expert within... Laboratory to send its mobile unit to obtain blood donations for a young woman in the Vicksburg community who was in desperate need of blood in January
NASA Technical Reports Server (NTRS)
Jani, Yashvant
1992-01-01
As part of the Research Institute for Computing and Information Systems (RICIS) activity, the reinforcement learning techniques developed at Ames Research Center are being applied to proximity and docking operations using the Shuttle and Solar Max satellite simulation. This activity is carried out in the software technology laboratory utilizing the Orbital Operations Simulator (OOS). This interim report provides the status of the project and outlines the future plans.
Using a medical simulation center as an electronic health record usability laboratory
Landman, Adam B; Redden, Lisa; Neri, Pamela; Poole, Stephen; Horsky, Jan; Raja, Ali S; Pozner, Charles N; Schiff, Gordon; Poon, Eric G
2014-01-01
Usability testing is increasingly being recognized as a way to increase the usability and safety of health information technology (HIT). Medical simulation centers can serve as testing environments for HIT usability studies. We integrated the quality assurance version of our emergency department (ED) electronic health record (EHR) into our medical simulation center and piloted a clinical care scenario in which emergency medicine resident physicians evaluated a simulated ED patient and documented electronically using the ED EHR. Meticulous planning and close collaboration with expert simulation staff was important for designing test scenarios, pilot testing, and running the sessions. Similarly, working with information systems teams was important for integration of the EHR. Electronic tools are needed to facilitate entry of fictitious clinical results while the simulation scenario is unfolding. EHRs can be successfully integrated into existing simulation centers, which may provide realistic environments for usability testing, training, and evaluation of human–computer interactions. PMID:24249778
Interfacing LabVIEW With Instrumentation for Electronic Failure Analysis and Beyond
NASA Technical Reports Server (NTRS)
Buchanan, Randy K.; Bryan, Coleman; Ludwig, Larry
1996-01-01
The Laboratory Virtual Instrument Engineering Workbench (LabVIEW) software is designed such that equipment and processes related to control systems can be operationally linked and controlled by the use of a computer. Various processes within the failure analysis laboratories of NASA's Kennedy Space Center (KSC) demonstrate the need for modernization and, in some cases, automation, using LabVIEW. An examination of procedures and practices within the Failure Analysis Laboratory resulted in the conclusion that some device was necessary to elevate potential users of LabVIEW to an operational level in minimum time. This paper outlines the process involved in creating a tutorial application to enable personnel to apply LabVIEW to their specific projects. Suggestions for furthering the extent to which LabVIEW is used are provided in the areas of data acquisition and process control.
NASA Astrophysics Data System (ADS)
Cox, S. J.; Wyborn, L. A.; Fraser, R.; Rankine, T.; Woodcock, R.; Vote, J.; Evans, B.
2012-12-01
The Virtual Geophysics Laboratory (VGL) is a web portal that provides geoscientists with an integrated online environment that: seamlessly accesses geophysical and geoscience data services from the AuScope national geoscience information infrastructure; loosely couples these data to a variety of geoscience software tools; and provides large-scale processing facilities via cloud computing. VGL is a collaboration between CSIRO, Geoscience Australia, National Computational Infrastructure, Monash University, Australian National University and the University of Queensland. The VGL provides a distributed system whereby a user can enter an online virtual laboratory to seamlessly connect to OGC web services for geoscience data. The data is supplied in open standards formats using international standards like GeoSciML. A VGL user uses a web mapping interface to discover and filter the data sources using spatial and attribute filters to define a subset. Once the data is selected, the user is not required to download it. VGL collates the service query information for later use in the processing workflow, where it will be staged directly to the computing facilities. The combination of deferred data download and access to cloud computing enables VGL users to access their data at higher resolutions and to undertake larger-scale inversions, more complex models and simulations than their own local computing facilities might allow. Inside the Virtual Geophysics Laboratory, the user has access to a library of existing models, complete with exemplar workflows for specific scientific problems based on those models. For example, the user can load a geological model published by Geoscience Australia, apply a basic deformation workflow provided by a CSIRO scientist, and have it run in a scientific code from Monash. Finally, the user can publish these results to share with a colleague or cite in a paper.
This opens new opportunities for access and collaboration, as all the resources (models, code, data, processing) are shared in the one virtual laboratory. VGL provides end users with an intuitive, user-centered interface that leverages cloud storage and cloud and cluster processing from both the research communities and commercial suppliers (e.g. Amazon). As the underlying data and information services are agnostic of the scientific domain, they can support many other data types. This fundamental characteristic results in a highly reusable virtual laboratory infrastructure that could also be used for, for example, natural hazards, satellite processing, soil geochemistry, climate modeling, and agricultural crop modeling.
NASA Technical Reports Server (NTRS)
Oliver, Michael J.
2014-01-01
The National Aeronautics and Space Administration (NASA) conducted a full-scale ice crystal icing turbofan engine test using an obsolete Allied Signal ALF502-R5 engine in the Propulsion Systems Laboratory (PSL) at NASA Glenn Research Center. The test article used was the exact engine that experienced a loss of power event after the ingestion of ice crystals while operating at high altitude during a 1997 Honeywell flight test campaign investigating the turbofan engine ice crystal icing phenomenon. The test plan included test points conducted at the known flight test campaign field event pressure altitude and at various pressure altitudes ranging from low to high throughout the engine operating envelope. The test article experienced a loss of power event at each of the altitudes tested. For each pressure altitude test point conducted, the ambient static temperature was predicted using a NASA engine icing risk computer model for the given ambient static pressure while maintaining the engine speed.
Quantification of Uncertainty in Extreme Scale Computations (QUEST)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ghanem, Roger
QUEST was a SciDAC Institute comprising Sandia National Laboratories, Los Alamos National Laboratory, the University of Southern California, the Massachusetts Institute of Technology, the University of Texas at Austin, and Duke University. The mission of QUEST was to: (1) develop a broad class of uncertainty quantification (UQ) methods/tools, and (2) provide UQ expertise and software to other SciDAC projects, thereby enabling/guiding their UQ activities. The USC effort centered on the development of reduced models and efficient algorithms for implementing various components of the UQ pipeline. USC personnel were responsible for the development of adaptive bases, adaptive quadrature, and reduced models to be used in estimation and inference.
Simulations of Laboratory Astrophysics Experiments using the CRASH code
NASA Astrophysics Data System (ADS)
Trantham, Matthew; Kuranz, Carolyn; Fein, Jeff; Wan, Willow; Young, Rachel; Keiter, Paul; Drake, R. Paul
2015-11-01
Computer simulations can assist in the design and analysis of laboratory astrophysics experiments. The Center for Radiative Shock Hydrodynamics (CRASH) at the University of Michigan developed a code that has been used to design and analyze high-energy-density experiments on OMEGA, NIF, and other large laser facilities. This Eulerian code uses block-adaptive mesh refinement (AMR) with implicit multigroup radiation transport, electron heat conduction and laser ray tracing. This poster will demonstrate some of the experiments the CRASH code has helped design or analyze including: Kelvin-Helmholtz, Rayleigh-Taylor, magnetized flows, jets, and laser-produced plasmas. This work is funded by the following grants: DEFC52-08NA28616, DE-NA0001840, and DE-NA0002032.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lottes, S.A.; Kulak, R.F.; Bojanowski, C.
2011-08-26
The computational fluid dynamics (CFD) and computational structural mechanics (CSM) focus areas at Argonne's Transportation Research and Analysis Computing Center (TRACC) initiated a project to support and complement the experimental programs at the Turner-Fairbank Highway Research Center (TFHRC) with high performance computing based analysis capabilities in August 2010. The project was established with a new interagency agreement between the Department of Energy and the Department of Transportation to provide collaborative research, development, and benchmarking of advanced three-dimensional computational mechanics analysis methods to the aerodynamics and hydraulics laboratories at TFHRC for a period of five years, beginning in October 2010. The analysis methods employ well-benchmarked and supported commercial computational mechanics software. Computational mechanics encompasses the areas of Computational Fluid Dynamics (CFD), Computational Wind Engineering (CWE), Computational Structural Mechanics (CSM), and Computational Multiphysics Mechanics (CMM) applied in Fluid-Structure Interaction (FSI) problems. The major areas of focus of the project are wind and water loads on bridges - superstructure, deck, cables, and substructure (including soil), primarily during storms and flood events - and the risks that these loads pose to structural failure. For flood events at bridges, another major focus of the work is assessment of the risk to bridges caused by scour of stream and riverbed material away from the foundations of a bridge. Other areas of current research include modeling of flow through culverts to assess them for fish passage, modeling of salt spray transport into bridge girders to address the suitability of using weathering steel in bridges, vehicle stability under high wind loading, and the use of electromagnetic shock absorbers to improve vehicle stability under high wind conditions.
This quarterly report documents technical progress on the project tasks for the period of April through June 2011.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-13
... DEPARTMENT OF HEALTH AND HUMAN SERVICES Centers for Disease Control and Prevention Clinical Laboratory Improvement Advisory Committee, Centers for Disease Control and Prevention: Notice of Charter..., that the Clinical Laboratory Improvement Advisory Committee, Centers for Disease Control and Prevention...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Maxine D.; Leigh, Jason
2014-02-17
The Blaze high-performance visual computing system serves the high-performance computing research and education needs of the University of Illinois at Chicago (UIC). Blaze consists of a state-of-the-art, networked computer cluster and an ultra-high-resolution visualization system called CAVE2(TM) that is currently not available anywhere else in Illinois. This system is connected via a high-speed 100-Gigabit network to the State of Illinois' I-WIRE optical network, as well as to national and international high-speed networks, such as Internet2 and the Global Lambda Integrated Facility. This enables Blaze to serve as an on-ramp to national cyberinfrastructure, such as the National Science Foundation's Blue Waters petascale computer at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign and the Department of Energy's Argonne Leadership Computing Facility (ALCF) at Argonne National Laboratory. DOE award # DE-SC005067, leveraged with NSF award #CNS-0959053 for "Development of the Next-Generation CAVE Virtual Environment (NG-CAVE)," enabled us to create a first-of-its-kind high-performance visual computing system. The UIC Electronic Visualization Laboratory (EVL) worked with two U.S. companies to advance their commercial products and maintain U.S. leadership in the global information technology economy. New applications are being enabled with the CAVE2/Blaze visual computing system that are advancing scientific research and education in the U.S. and globally and helping to train the next-generation workforce.
Mercury: Photomosaic of the Shakespeare Quadrangle of Mercury (Southern Half) H-3
NASA Technical Reports Server (NTRS)
1974-01-01
This computer generated photomosaic from Mariner 10 is of the southern half of Mercury's Shakespeare Quadrangle, named for the ancient Shakespeare crater located on the upper edge to the left of center. This portion of the quadrangle covers the geographic region from 20 to 45 degrees north latitude and from 90 to 180 degrees longitude. The photomosaic was produced using computer techniques and software developed in the Image Processing Laboratory of NASA's Jet Propulsion Laboratory. The pictures have been high-pass filtered and contrast enhanced to accentuate surface detail, and geometrically transformed into a Lambert conformal projection.
Well defined bright streaks or ray systems radiating away from craters constitute another distinctive feature of the Mercurian surface, remarkably similar to the Moon. The rays cut across and are superimposed on all other surface features, indicating that the source craters are the youngest topographic features on the surface of Mercury. The above material was taken from the following publication: Davies, M. E., S. E. Dwornik, D. E. Gault, and R. G. Strom, Atlas of Mercury, NASA SP-423 (1978). The Mariner 10 mission was managed by the Jet Propulsion Laboratory for NASA's Office of Space Science.
NASA Technical Reports Server (NTRS)
Levine, A. L.
1981-01-01
An engineer and a computer expert from Goddard Space Flight Center were assigned to provide technical assistance in the design and installation of a computer-assisted system for dispatching and communicating with fire department personnel and equipment in Baltimore City. Primary contributions were in decision making and management processes. The project is analyzed from four perspectives: (1) fire service; (2) technology transfer; (3) public administration; and (4) innovation. The city benefited substantially from the approach and competence of the NASA personnel. Given the proper conditions, there are distinct advantages in having a nearby Federal laboratory provide assistance to a city on a continuing basis, as is done in the Baltimore Applications Project.
Preferential superior surface motion in wear simulations of the Charité total disc replacement.
Goreham-Voss, Curtis M; Vicars, Rachel; Hall, Richard M; Brown, Thomas D
2012-06-01
Laboratory wear simulations of the dual-bearing surface Charité total disc replacement (TDR) are complicated by the non-specificity of the device's center of rotation (CoR). Previous studies have suggested that articulation of the Charité preferentially occurs at the superior-bearing surface, although it is not clear how sensitive this phenomenon is to lubrication conditions or CoR location. In this study, a computational wear model is used to study the articulation kinematics and wear of the Charité TDR. Implant wear was found to be insensitive to the CoR location, although seemingly non-physiologic endplate motion can result. Articulation and wear were biased significantly to the superior-bearing surface, even in the presence of significant perturbations of loading and friction. The computational wear model provides novel insight into the mechanics and wear of the Charité TDR, allowing for better interpretation of in vivo results, and giving useful insight for designing future laboratory physical tests.
FORTRAN plotting subroutines for the space plasma laboratory
NASA Technical Reports Server (NTRS)
Williams, R.
1983-01-01
The computer program known as PLOTRW was custom made to satisfy some of the graphics requirements for the data collected in the Space Plasma Laboratory at the Johnson Space Center (JSC). The general requirements for the program were as follows: (1) all subroutines shall be callable through a FORTRAN source program; (2) all graphs shall fill one page and be properly labeled; (3) there shall be options for linear axes and logarithmic axes; (4) each axis shall have tick marks equally spaced with numeric values printed at the beginning tick mark and at the last tick mark; and (5) there shall be three options for plotting: point plot, line plot, and point-line plot. The subroutines were written in FORTRAN IV for the Digital Equipment Corporation (DEC) LSI-11 computer. The program is now operational and can be run on any TEKTRONIX graphics terminal that uses a DEC Real-Time-11 (RT-11) operating system.
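Requirement (4) above, equally spaced tick marks with numeric values at the first and last marks, for either linear or logarithmic axes, can be sketched in modern terms. The Python below is only an illustrative stand-in for the original FORTRAN IV subroutines; the function names are hypothetical:

```python
def linear_ticks(lo, hi, n):
    """Return n equally spaced tick values from lo to hi inclusive;
    labels would be printed at the first and last marks."""
    if n < 2:
        raise ValueError("need at least two tick marks")
    step = (hi - lo) / (n - 1)
    return [lo + i * step for i in range(n)]

def log_ticks(lo_exp, hi_exp):
    """Return decade tick values 10**lo_exp through 10**hi_exp
    for a logarithmic axis."""
    return [10.0 ** e for e in range(lo_exp, hi_exp + 1)]
```

For example, `linear_ticks(0.0, 10.0, 6)` yields six marks at intervals of 2, and `log_ticks(0, 3)` yields one mark per decade from 1 to 1000.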
NASA Technical Reports Server (NTRS)
Hickey, J. S.
1983-01-01
The Mesoscale Analysis and Space Sensor (MASS) Data Management and Analysis System developed by Atsuko Computing International (ACI) on the MASS HP-1000 Computer System within the Systems Dynamics Laboratory of the Marshall Space Flight Center is described. The MASS Data Management and Analysis System was successfully implemented and is utilized daily by atmospheric scientists to graphically display and analyze large volumes of conventional and satellite-derived meteorological data. The scientists can interactively process various atmospheric data (Sounding, Single Level, Grid, and Image) utilizing the MASS (AVE80) software's shared common data and user inputs, thereby reducing overhead, optimizing execution time, and thus enhancing user flexibility, usability, and understandability of the total system/software capabilities. In addition, ACI installed eight APPLE III graphics/imaging computer terminals in individual scientists' offices and integrated them into the MASS HP-1000 Computer System, thus providing significant enhancement to the overall research environment.
Evolutionary Computation for the Identification of Emergent Behavior in Autonomous Systems
NASA Technical Reports Server (NTRS)
Terrile, Richard J.; Guillaume, Alexandre
2009-01-01
Over the past several years the Center for Evolutionary Computation and Automated Design at the Jet Propulsion Laboratory has developed a technique based on Evolutionary Computational Methods (ECM) that allows for the automated optimization of complex computationally modeled systems. An important application of this technique is for the identification of emergent behaviors in autonomous systems. Mobility platforms such as rovers or airborne vehicles are now being designed with autonomous mission controllers that can find trajectories over a solution space that is larger than can reasonably be tested. It is critical to identify control behaviors that are not predicted and can have surprising results (both good and bad). These emergent behaviors need to be identified, characterized and either incorporated into or isolated from the acceptable range of control characteristics. We use cluster analysis of automatically retrieved solutions to identify isolated populations of solutions with divergent behaviors.
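The final step described above, isolating small populations of solutions with divergent behaviors, can be illustrated with a deliberately simplified outlier test. The abstract does not specify the clustering algorithm used at JPL, so the Python below is a hypothetical sketch of the idea rather than the actual method:

```python
def flag_divergent(behaviors, k=1.5):
    """Flag behavior scores lying more than k population standard
    deviations from the mean -- a simplified stand-in for cluster
    analysis that isolates divergent solution populations."""
    n = len(behaviors)
    mean = sum(behaviors) / n
    std = (sum((b - mean) ** 2 for b in behaviors) / n) ** 0.5
    return [b for b in behaviors if std > 0 and abs(b - mean) > k * std]
```

Solutions flagged this way would then be examined and either incorporated into or isolated from the acceptable range of control characteristics.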
ASR4: A computer code for fitting and processing 4-gage anelastic strain recovery data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Warpinski, N.R.
A computer code for analyzing four-gage Anelastic Strain Recovery (ASR) data has been modified for use on a personal computer. This code fits the viscoelastic model of Warpinski and Teufel to measured ASR data, calculates the stress orientation directly, and computes stress magnitudes if sufficient input data are available. The code also calculates the stress orientation using strain-rosette equations, and it calculates stress magnitudes using Blanton's approach, assuming sufficient input data are available. The program is written in FORTRAN, compiled with Ryan-McFarland Version 2.4. Graphics use PLOT88 software by Plotworks, Inc., but the graphics software must be obtained by the user because of licensing restrictions. A version without graphics can also be run. This code is available through the National Energy Software Center (NESC), operated by Argonne National Laboratory. 5 refs., 3 figs.
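The strain-rosette step can be illustrated with the textbook formula for the common 0°/45°/90° rectangular rosette, a three-gage simplification of ASR4's four-gage input; the FORTRAN code itself is not reproduced here:

```python
import math

def principal_angle(e_a, e_b, e_c):
    """Direction of the principal strain axis, in radians measured
    from gage a, for a rectangular rosette with gages a, b, c at
    0, 45, and 90 degrees (standard two-argument arctangent form)."""
    return 0.5 * math.atan2(2.0 * e_b - e_a - e_c, e_a - e_c)
```

For instance, equal readings on the 0° and 90° gages with an elevated 45° gage place the principal axis at 45°.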
Enhancement of computer system for applications software branch
NASA Technical Reports Server (NTRS)
Bykat, Alex
1987-01-01
Presented is a compilation of the history of a two-month project concerned with a survey, evaluation, and specification of a new computer system for the Applications Software Branch of the Software and Data Management Division of the Information and Electronic Systems Laboratory at Marshall Space Flight Center, NASA. Information gathering consisted of discussions and surveys of branch activities, evaluation of computer manufacturer literature, and presentations by vendors. Information gathering was followed by evaluation of the candidate systems. The criteria for the latter were: the (tentative) architecture selected for the new system, the type of network architecture supported, software tools, and to some extent the price. The information received from the vendors, as well as additional research, led to the detailed design of a suitable system. This design included considerations of hardware and software environments as well as personnel issues such as training. Design of the system culminated in a recommendation for a new computing system for the Branch.
Laboratory for Computer Science Progress Report 24, July 1986-June 1987
1987-06-01
Szolovits, Group Leader R. Patil Collaborating Investigators M. Criscitiello, M.D., Tufts-New England Medical Center Hospital M. Eckman, M.D., Tufts-New...solutions from previously seen patients to aid in the treatment of subsequent patients. Finally, some knowledge (e.g., strategic and treatment, but...Academic Staff Arvind, Group Leader J.B. Dennis R.S. Nikhil Research Staff G.A. Boughton Graduate Students P.S. Barth G.K. Maa S.A. Brobst D.R. Morals
The 1970/71 spectral data management programs
NASA Technical Reports Server (NTRS)
Marshall, A. A.
1971-01-01
The data management programs used by the Stanford Remote Sensing Laboratory to access, modify, and reduce the data obtained from both the NASA IR airborne spectrometer and Stanford's SG-4 field spectrometer are reported. Many details covered in previous reports are not repeated; references are provided. These programs are written in FORTRAN IV and S/360 Assembler Language and are currently running on an S/360 Model 67 (operating under OS/MFT) at the Stanford Computation Center Campus Facility.
A visiting scientist program in atmospheric sciences for the Goddard Space Flight Center
NASA Technical Reports Server (NTRS)
Davis, M. H.
1989-01-01
A visiting scientist program was conducted in the atmospheric sciences and related areas at the Goddard Laboratory for Atmospheres. Research was performed in mathematical analysis as applied to computer modeling of the atmospheres; development of atmospheric modeling programs; analysis of remotely sensed atmospheric, surface, and oceanic data and its incorporation into atmospheric models; development of advanced remote sensing instrumentation; and related research areas. The specific research efforts are detailed by tasks.
2003-04-24
KENNEDY SPACE CENTER, FLA. - This closeup shows the size of the computer chip that holds about 35,000 laser-engraved signatures of visitors to the Mars Exploration Rovers at the Jet Propulsion Laboratory. It will be placed on the second rover to be launched to Mars; the first rover already has one. The signatures include those of senators, artists, and John Glenn. The identical Mars rovers are scheduled to launch June 5 and June 25 from Cape Canaveral Air Force Station.
2003-04-24
KENNEDY SPACE CENTER, FLA. - Jim Lloyd, with the Mars Exploration Rover program, holds a computer chip with about 35,000 laser-engraved signatures of visitors to the Jet Propulsion Laboratory. The chip will be placed on the second rover to be launched to Mars (MER-1/MER-B); the first rover already has one. The signatures include those of senators, artists, and John Glenn. The identical Mars rovers are scheduled to launch June 5 and June 25 from Cape Canaveral Air Force Station.
2003-08-18
KENNEDY SPACE CENTER, FLA. - Dr. Grant Gilmore, Dynamac Corp., utilizes a laptop computer to explain aspects of the underwater acoustic research under way in the Launch Complex 39 turn basin. Several government agencies, including NASA, NOAA, the Navy, the Coast Guard, and the Florida Fish and Wildlife Commission are involved in the testing. The research involves demonstrations of passive and active sensor technologies, with applications in fields ranging from marine biological research to homeland security. The work is also serving as a pilot project to assess the cooperation between the agencies involved. Equipment under development includes a passive acoustic monitor developed by NASA’s Jet Propulsion Laboratory, and mobile robotic sensors from the Navy’s Mobile Diving and Salvage Unit.
Experimental and Analytical Determination of the Geometric Far Field for Round Jets
NASA Technical Reports Server (NTRS)
Koch, L. Danielle; Bridges, James E.; Brown, Clifford E.; Khavaran, Abbas
2005-01-01
An investigation was conducted at the NASA Glenn Research Center using a set of three round jets operating under unheated subsonic conditions to address the question: "How close is too close?" Although sound sources are distributed at various distances throughout a jet plume downstream of the nozzle exit, at great distances from the nozzle the sound will appear to emanate from a point and the inverse-square law can be properly applied. Examination of normalized sound spectra at different distances from a jet, from experiments and from computational tools, established the required minimum distance for valid far-field measurements of the sound from subsonic round jets. Experimental data were acquired in the Aeroacoustic Propulsion Laboratory at the NASA Glenn Research Center. The WIND computer program solved the Reynolds-Averaged Navier-Stokes equations for aerodynamic computations; the MGBK jet-noise prediction computer code was used to predict the sound pressure levels. Results from both the experiments and the analytical exercises indicated that while the shortest measurement arc (with radius approximately 8 nozzle diameters) was already in the geometric far field for high-frequency sound (Strouhal number >5), low-frequency sound (Strouhal number <0.2) reached the geometric far field at a measurement radius of at least 50 nozzle diameters because of its extended source distribution.
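The two quantities at the heart of this far-field criterion, the Strouhal number and the inverse-square law, can be sketched as follows (all parameter values below are hypothetical, not taken from the experiments):

```python
import math

def strouhal(freq_hz, diameter_m, velocity_ms):
    """Strouhal number St = f * D / U for a round jet."""
    return freq_hz * diameter_m / velocity_ms

def spl_at(spl_ref_db, r_ref, r):
    """Inverse-square law: sound pressure level falls 6 dB per doubling
    of distance, valid only once the radius is in the geometric far field."""
    return spl_ref_db - 20.0 * math.log10(r / r_ref)

# Hypothetical jet: a 5 cm nozzle at 250 m/s; 1 kHz then corresponds to
# St = 0.2, the low-frequency regime that demands the largest radius.
st = strouhal(1000.0, 0.05, 250.0)
spl_far = spl_at(100.0, 8.0, 50.0)  # extrapolate from 8 to 50 diameters
```

The study's finding is precisely that `spl_at`-style extrapolation is only trustworthy beyond roughly 8 diameters for St > 5 but beyond roughly 50 diameters for St < 0.2.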
Jacobs, J; Weir, C; Evans, R S; Staes, C
2014-01-01
Following liver transplantation, patients require lifelong immunosuppressive care and monitoring. Computerized clinical decision support (CDS) has been shown to improve post-transplant immunosuppressive care processes and outcomes. The readiness of transplant information systems to implement computerized CDS to support post-transplant care is unknown. Our objectives were to: a) describe the current clinical information system functionality and the manual and automated processes for laboratory monitoring of immunosuppressive care; b) describe the use of guidelines that may be used to produce computable logic and the use of computerized alerts to support guideline adherence; and c) explore barriers to implementation of CDS in U.S. liver transplant centers. We developed a web-based survey using cognitive interviewing techniques. We surveyed 119 U.S. transplant programs that performed at least five liver transplantations per year during 2010-2012. Responses were summarized using descriptive analyses; barriers were identified using qualitative methods. Respondents from 80 programs (67% response rate) completed the survey. While 98% of programs reported having an electronic health record (EHR), all programs used paper-based manual processes to receive or track immunosuppressive laboratory results. Most programs (85%) reported that 30% or more of their patients used external laboratories for routine testing. Few programs (19%) received most external laboratory results as discrete data via electronic interfaces, while most (80%) manually entered laboratory results into the EHR; less than half (42%) could integrate internal and external laboratory results. Nearly all programs had guidelines regarding pre-specified target ranges (92%) or testing schedules (97%) for managing immunosuppressive care. Few programs used computerized alerting to notify transplant coordinators of out-of-range (27%) or overdue laboratory results (20%).
Use of EHRs is common, yet all liver transplant programs were largely dependent on manual paper-based processes to monitor immunosuppression for post-liver transplant patients. Similar immunosuppression guidelines provide opportunities for sharing CDS once integrated laboratory data are available.
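The guideline checks the survey asks about, pre-specified target ranges and testing schedules, reduce to simple computable logic once laboratory data are available as discrete values. A hypothetical sketch (the target range and schedule below are illustrative, not drawn from the survey):

```python
from datetime import date, timedelta

def check_result(value, low, high):
    """Compare one laboratory result against a pre-specified target range."""
    if value < low:
        return "below range"
    if value > high:
        return "above range"
    return "in range"

def overdue(last_drawn, schedule_days, today):
    """True when the next scheduled draw is past due."""
    return today > last_drawn + timedelta(days=schedule_days)

# Hypothetical target range (5-15 units) and 30-day testing schedule
status = check_result(18.2, 5.0, 15.0)                   # "above range"
late = overdue(date(2024, 1, 1), 30, date(2024, 3, 1))   # True: past due
```

Logic of this shape is what the 27% (out-of-range) and 20% (overdue) of programs with computerized alerting have encoded; the survey's point is that the remaining programs track the same rules on paper.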
Redirecting Under-Utilised Computer Laboratories into Cluster Computing Facilities
ERIC Educational Resources Information Center
Atkinson, John S.; Spenneman, Dirk H. R.; Cornforth, David
2005-01-01
Purpose: To provide administrators at an Australian university with data on the feasibility of redirecting under-utilised computer laboratories facilities into a distributed high performance computing facility. Design/methodology/approach: The individual log-in records for each computer located in the computer laboratories at the university were…
NREL's Building-Integrated Supercomputer Provides Heating and Efficient Computing (Fact Sheet)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2014-09-01
NREL's Energy Systems Integration Facility (ESIF) is meant to investigate new ways to integrate energy sources so they work together efficiently, and one of the key tools in that investigation, a new supercomputer, is itself a prime example of energy systems integration. NREL teamed with Hewlett-Packard (HP) and Intel to develop the innovative warm-water, liquid-cooled Peregrine supercomputer, which not only operates efficiently but also serves as the primary source of building heat for ESIF offices and laboratories. This innovative high-performance computer (HPC) can perform more than a quadrillion calculations per second as part of the world's most energy-efficient HPC data center.
COMET-AR User's Manual: COmputational MEchanics Testbed with Adaptive Refinement
NASA Technical Reports Server (NTRS)
Moas, E. (Editor)
1997-01-01
The COMET-AR User's Manual provides a reference manual for the Computational Structural Mechanics Testbed with Adaptive Refinement (COMET-AR), a software system developed jointly by Lockheed Palo Alto Research Laboratory and NASA Langley Research Center under contract NAS1-18444. The COMET-AR system is an extended version of an earlier finite element based structural analysis system called COMET, also developed by Lockheed and NASA. The primary extensions are the adaptive mesh refinement capabilities and a new "object-like" database interface that makes COMET-AR easier to extend further. This User's Manual provides a detailed description of the user interface to COMET-AR from the viewpoint of a structural analyst.
Simulations of Laboratory Astrophysics Experiments using the CRASH code
NASA Astrophysics Data System (ADS)
Trantham, Matthew; Kuranz, Carolyn; Manuel, Mario; Keiter, Paul; Drake, R. P.
2014-10-01
Computer simulations can assist in the design and analysis of laboratory astrophysics experiments. The Center for Radiative Shock Hydrodynamics (CRASH) at the University of Michigan developed a code that has been used to design and analyze high-energy-density experiments on OMEGA, NIF, and other large laser facilities. This Eulerian code uses block-adaptive mesh refinement (AMR) with implicit multigroup radiation transport, electron heat conduction and laser ray tracing. This poster/talk will demonstrate some of the experiments the CRASH code has helped design or analyze including: Kelvin-Helmholtz, Rayleigh-Taylor, imploding bubbles, and interacting jet experiments. This work is funded by the Predictive Sciences Academic Alliances Program in NNSA-ASC via Grant DEFC52-08NA28616, by the NNSA-DS and SC-OFES Joint Program in High-Energy-Density Laboratory Plasmas, Grant Number DE-NA0001840, and by the National Laser User Facility Program, Grant Number DE-NA0000850.
Hatch leading into U.S. Laboratory / Destiny module
2001-02-11
STS98-E-5114 (11 February 2001) --- This medium close-up shot, photographed with a digital still camera, shows Unity's closed hatch to the newly delivered Destiny laboratory. The crews of Atlantis and the International Space Station opened the laboratory, shortly after this photo was made on Feb. 11, and the astronauts and cosmonauts spent the first full day of what are planned to be years of work ahead inside the orbiting science and command center. Station commander William M. (Bill) Shepherd opened the Destiny hatch, and he and shuttle commander Kenneth D. Cockrell ventured inside at 8:38 a.m. (CST), Feb. 11. As depicted in subsequent digital images in this series, members of both crews went to work quickly inside the new module, activating air systems, fire extinguishers, alarm systems, computers and internal communications. The crew also continued equipment transfers from the shuttle to the station.
NASA Technical Reports Server (NTRS)
Cetin, Haluk
1999-01-01
The purpose of this project was to establish a new hyperspectral remote sensing laboratory at the Mid-America Remote Sensing Center (MARC), dedicated to in situ and laboratory measurements of environmental samples and to the manipulation, analysis, and storage of remotely sensed data for environmental monitoring and for research in ecological modeling using hyperspectral remote sensing. MARC is one of three research facilities of the Center of Reservoir Research at Murray State University (MSU), a Kentucky Commonwealth Center of Excellence. The equipment purchased, a FieldSpec FR portable spectroradiometer with peripherals and ENVI hyperspectral data processing software, allowed MARC to provide hands-on experience, education, and training for students of the Department of Geosciences in quantitative remote sensing using hyperspectral data, Geographic Information Systems (GIS), digital image processing (DIP), and computer-based geological and geophysical mapping; to provide field support to researchers and students collecting in situ and laboratory measurements of environmental data; to create a spectral library of the cover types; and to establish a World Wide Web server to provide the spectral library to other academic, state, and Federal institutions. Much of the research will soon be published in scientific journals. A World Wide Web page has been created at the MARC web site. Results of this project fall into two categories: education accomplishments and research accomplishments. The Principal Investigator (PI) modified the remote sensing and DIP courses to use the new equipment, introducing students to in situ field spectra and laboratory remote sensing studies for environmental monitoring in the region. The PI also collected in situ measurements with the spectroradiometer for the ER-2 mission to Puerto Rico project for the Moderate Resolution Imaging Spectrometer (MODIS) Airborne Simulator (MAS).
Currently MARC is mapping water quality in Kentucky Lake and vegetation in Land Between the Lakes (LBL) using Landsat-TM data. A Landsat-TM scene from the same day was obtained to relate ground measurements to the satellite data. A spectral library has been created for overstory species in LBL. Several methods are being tested using the laboratory, including the NPDF and IDFD techniques for spectral unmixing and for reducing the effects of shadows in classifications, comparison of hyperspectral classification techniques, and linear and nonlinear spectral unmixing techniques.
Performance Assessment Institute-NV
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lombardo, Joesph
2012-12-31
The National Supercomputing Center for Energy and the Environment intends to purchase a multi-purpose computer cluster in support of the Performance Assessment Institute (PA Institute). The PA Institute will serve as a research consortium located in Las Vegas, Nevada, with membership that includes national laboratories, universities, industry partners, and domestic and international governments. This center will provide a one-of-a-kind centralized facility for the accumulation of information for use by institutions of higher learning, the U.S. Government, regulatory agencies, and approved users. This initiative will enhance and extend High Performance Computing (HPC) resources in Nevada to support critical national and international needs in "scientific confirmation". The PA Institute will be promoted as the leading modeling, learning, and research center worldwide. The program proposes to utilize the existing supercomputing capabilities and alliances of the University of Nevada, Las Vegas as a base, and to extend these resources and capabilities through a collaborative relationship with its membership. The PA Institute will provide an academic setting for interactive sharing, learning, mentoring, and monitoring of multi-disciplinary performance assessment and performance confirmation information. The role of the PA Institute is to facilitate research, knowledge-increase, and knowledge-sharing among users.
Introducing Computational Approaches in Intermediate Mechanics
NASA Astrophysics Data System (ADS)
Cook, David M.
2006-12-01
In the winter of 2003, we at Lawrence University moved Lagrangian mechanics and rigid body dynamics from a required sophomore course to an elective junior/senior course, freeing 40% of the time for computational approaches to ordinary differential equations (trajectory problems, the large amplitude pendulum, non-linear dynamics); evaluation of integrals (finding centers of mass and moment of inertia tensors, calculating gravitational potentials for various sources); and finding eigenvalues and eigenvectors of matrices (diagonalizing the moment of inertia tensor, finding principal axes), and to generating graphical displays of computed results. Further, students begin to use LaTeX to prepare some of their submitted problem solutions. Placed in the middle of the sophomore year, this course provides the background that permits faculty members as appropriate to assign computer-based exercises in subsequent courses. Further, students are encouraged to use our Computational Physics Laboratory on their own initiative whenever that use seems appropriate. (Curricular development supported in part by the W. M. Keck Foundation, the National Science Foundation, and Lawrence University.)
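The matrix exercise mentioned above, diagonalizing a moment-of-inertia tensor to find principal axes, has a closed form in the planar case; a minimal sketch for a symmetric 2×2 block (example values hypothetical):

```python
import math

def eig2_sym(a, b, c):
    """Eigenvalues (descending) and major-axis angle of the symmetric
    matrix [[a, b], [b, c]], e.g. a planar moment-of-inertia block."""
    avg = 0.5 * (a + c)
    r = math.hypot(0.5 * (a - c), b)
    theta = 0.5 * math.atan2(2.0 * b, a - c)  # angle of the major eigenvector
    return avg + r, avg - r, theta

# A pure off-diagonal term couples the axes: the principal axes
# sit at 45 degrees to the coordinate axes.
i_max, i_min, angle = eig2_sym(0.0, 1.0, 0.0)
```

For the full 3×3 tensor the course would presumably reach for a numerical eigensolver, which is exactly the kind of task the computational approach makes routine.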
NASA Astrophysics Data System (ADS)
Laws, Priscilla
2010-02-01
In June 1986 Ronald Thornton (at the Tufts University Center for Science and Mathematics Teaching) and Priscilla Laws (at Dickinson College) applied independently for grants to develop curricular materials based on both the outcomes of Physics Education Research and the use of Microcomputer-Based Laboratory (MBL) tools developed by Robert Tinker, Ron Thornton and others at Technical Education Research Centers (TERC). Thornton proposed to develop a series of Tools for Scientific Thinking (TST) laboratory exercises to address known learning difficulties using carefully sequenced MBL observations. These TST laboratories were to be beta tested at several types of institutions. Laws proposed to develop a Workshop Physics Activity Guide for a two-semester calculus-based introductory course sequence centering on MBL-based guided inquiry. Workshop Physics was designed to replace traditional lectures and separate labs in relatively small classes and was to be tested at Dickinson College. In September 1986 a project officer at the Fund for the Improvement of Postsecondary Education (FIPSE) awarded grants to Laws and Thornton provided that they would collaborate. David Sokoloff (at the University of Oregon) joined Thornton to develop and test the TST laboratories. This talk will describe the 23-year collaboration between Thornton, Laws, and Sokoloff that led to the development of a suite of Activity Based Physics curricular materials, new apparatus, and enhanced computer tools for real-time graphing, data collection, and mathematical modeling. The Suite includes the TST Labs, the Workshop Physics Activity Guide, the RealTime Physics Laboratory Modules, and a series of Interactive Lecture Demonstrations. A textbook and a guide to using the Suite were also developed. The vital importance of obtaining continued grant support, doing continuous research on student learning, collaborating with instructors at other institutions, and forging relationships with vendors and publishers will be described.
Genome annotation in a community college cell biology lab.
Beagley, C Timothy
2013-01-01
The Biology Department at Salt Lake Community College has used the IMG-ACT toolbox to introduce a genome mapping and annotation exercise into the laboratory portion of its Cell Biology course. This project provides students with an authentic inquiry-based learning experience while introducing them to computational biology and contemporary learning skills. Additionally, the project strengthens student understanding of the scientific method and contributes to student learning gains in curricular objectives centered around basic molecular biology, specifically, the Central Dogma. Importantly, inclusion of this project in the laboratory course provides students with a positive learning environment and allows for the use of cooperative learning strategies to increase overall student success. Copyright © 2012 International Union of Biochemistry and Molecular Biology, Inc.
2018-01-31
Michael Watkins, Director of NASA's Jet Propulsion Laboratory, left, Susan Finley, who began working at NASA's Jet Propulsion Laboratory in January 1958 as a "human computer", center, and Thomas Zurbuchen, Associate Administrator for NASA's Science Mission Directorate, right, pose for a picture with a replica of the Explorer 1 satellite during an event celebrating the 60th Anniversary of the Explorer 1 mission and the discovery of Earth's radiation belts, Wednesday, Jan. 31, 2018, at the National Academy of Sciences in Washington. The first U.S. satellite, Explorer 1, was launched from Cape Canaveral on January 31, 1958. The 30-pound satellite would yield a major scientific discovery, the Van Allen radiation belts circling our planet, and begin six decades of groundbreaking space science and human exploration. (NASA/Joel Kowsky)
STS-98 and Expedition One crew prepare to open U.S. Lab hatch
2001-02-11
STS098-352-0025 (11 February 2001) --- STS-98 mission commander Kenneth D. Cockrell (left) assists as Expedition One commander William M. (Bill) Shepherd opens the hatch to the newly attached Destiny laboratory. The crews of Atlantis and the International Space Station entered the laboratory shortly after this photo was made on February 11; and the astronauts and cosmonauts spent the first full day of what are planned to be years of work ahead inside the orbiting science and command center. Members of both crews went to work quickly inside the new module, activating air systems, fire extinguishers, alarm systems, computers and internal communications. The crew also continued equipment transfers from the shuttle to the station.
Study and Development of an Air Conditioning System Operating on a Magnetic Heat Pump Cycle
NASA Technical Reports Server (NTRS)
Wang, Pao-Lien
1991-01-01
This report describes the design of a laboratory scale demonstration prototype of an air conditioning system operating on a magnetic heat pump cycle. Design parameters were selected through studies performed by a Kennedy Space Center (KSC) System Simulation Computer Model. The heat pump consists of a rotor turning through four magnetic fields that are created by permanent magnets. Gadolinium was selected as the working material for this demonstration prototype. The rotor was designed to be constructed of flat parallel disks of gadolinium with very little space in between. The rotor rotates in an aluminum housing. The laboratory scale demonstration prototype is designed to provide a theoretical Carnot Cycle efficiency of 62 percent and a Coefficient of Performance of 16.55.
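The Carnot-cycle figures quoted above follow from the textbook relation for an ideal refrigeration cycle, COP = T_cold / (T_hot - T_cold); a minimal sketch with hypothetical reservoir temperatures (the report's 62 percent and 16.55 figures are its own, derived from the KSC simulation model):

```python
def carnot_cop_cooling(t_hot_k, t_cold_k):
    """Ideal (Carnot) cooling COP between two absolute temperatures (K)."""
    return t_cold_k / (t_hot_k - t_cold_k)

# Hypothetical reservoir temperatures, for illustration only:
# 280 K indoor (cold) side rejecting to a 300 K outdoor (hot) side.
cop_ideal = carnot_cop_cooling(300.0, 280.0)   # 14.0
cop_actual = 0.62 * cop_ideal                  # scaled by a second-law efficiency
```

The small temperature lift across a magnetic regenerator is what makes such high ideal COPs attainable in principle.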
Global view of Venus from Magellan, Pioneer, and Venera data
1991-10-29
This global view of Venus, centered at 270 degrees east longitude, is a compilation of data from several sources. Magellan synthetic aperature radar mosaics from the first cycle of Magellan mapping are mapped onto a computer-simulated globe to create the image. Data gaps are filled with Pioneer-Venus orbiter data, or a constant mid-range value. Simulated color is used to enhance small-scale structure. The simulated hues are based on color images recorded by the Soviet Venera 13 and 14 spacecraft. The image was produced at the Jet Propulsion Laboratory (JPL) Multimission Image Processing Laboratory and is a single frame from a video released at the JPL news conference, 10-29-91. View provided by JPL with alternate number P-39225 MGN81.
First-Principles Thermodynamics Study of Spinel MgAl2O4 Surface Stability
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cai, Qiuxia; Wang, Jian-guo; Wang, Yong
The surface stability of all possible terminations for the three low-index (111, 110, 100) structures of spinel MgAl2O4 has been studied using a first-principles-based thermodynamic approach. The surface Gibbs free energy results indicate that the 100_AlO2 termination is the most stable surface structure under ultra-high vacuum at T = 1100 K regardless of an Al-poor or Al-rich environment. With increasing oxygen pressure, the 111_O2(Al) termination becomes the most stable surface in the Al-rich environment. Oxygen vacancy formation is thermodynamically favorable on the 100_AlO2 and 111_O2(Al) terminations and on the (111) structure with Mg/O connected terminations. On the basis of the surface Gibbs free energies for both perfect and defective surface terminations, 100_AlO2 and 111_O2(Al) are the most dominant surfaces in the Al-rich environment under atmospheric conditions. This is also consistent with our previously reported experimental observations. This work was supported by a Laboratory Directed Research and Development (LDRD) project of the Pacific Northwest National Laboratory (PNNL). The computing time was granted by the National Energy Research Scientific Computing Center (NERSC). Part of the computing time was also granted by a scientific theme user proposal in the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL), a U.S. Department of Energy national scientific user facility located at PNNL in Richland, Washington.
Automated real-time software development
NASA Technical Reports Server (NTRS)
Jones, Denise R.; Walker, Carrie K.; Turkovich, John J.
1993-01-01
A Computer-Aided Software Engineering (CASE) system has been developed at the Charles Stark Draper Laboratory (CSDL) under the direction of the NASA Langley Research Center. The CSDL CASE tool provides an automated method of generating source code and hard copy documentation from functional application engineering specifications. The goal is to significantly reduce the cost of developing and maintaining real-time scientific and engineering software while increasing system reliability. This paper describes CSDL CASE and discusses demonstrations that used the tool to automatically generate real-time application code.
MIT Laboratory for Computer Science Progress Report No. 23, July 1985-June 1986
1986-06-01
Staff P. Szolovits, Group Leader R. Patil Collaborating Investigators M. Criscitiello, M.D., Tufts-New England Medical Center Hospital M. Eckman, M.D...COMMON SYSTEM Academic Staff D. Clark, Leader S. Ward D. Gifford W. Weihl B. Liskov R. Zippel Research Staff T. Bloom R. Scheifler I. Greif K. Sollins...Group Leader J.B. Dennis R.S. Nikhil Research Staff W.B. Ackerman R.A. Iannucci G.A. Boughton J.T. Pinkerton Graduate Students M.J. Beckerle B.C
Facilities at Indian Institute of Astrophysics and New Initiatives
NASA Astrophysics Data System (ADS)
Bhatt, Bhuwan Chandra
2018-04-01
The Indian Institute of Astrophysics is a premier national institute of India for study and research in astronomy, astrophysics, and related subjects. The Institute's main campus in Bangalore city in southern India houses the main administrative setup, the library and computer center, the photonics lab, and a state-of-the-art mechanical workshop. IIA operates a network of laboratories and observatories located across India, including Kodaikanal (Tamil Nadu), Kavalur (Tamil Nadu), Gauribidanur (Karnataka), Leh & Hanle (Jammu & Kashmir), and Hosakote (Karnataka).
NASA Technical Reports Server (NTRS)
Hwang, James; Campbell, Perry; Ross, Mike; Price, Charles R.; Barron, Don
1989-01-01
An integrated operating environment was designed to incorporate three general purpose robots, sensors, and end effectors, including Force/Torque Sensors, Tactile Array sensors, Tactile force sensors, and Force-sensing grippers. The design and implementation of: (1) the teleoperation of a general purpose PUMA robot; (2) an integrated sensor hardware/software system; (3) the force-sensing gripper control; (4) the host computer system for dual Robotic Research arms; and (5) the Ethernet integration are described.
Training the Future - Interns Harvesting & Testing Plant Experim
2017-07-19
In the Space Life Sciences Laboratory at NASA's Kennedy Space Center in Florida, student interns such as Ayla Grandpre, left, and Payton Barnwell are joining agency scientists, contributing in the area of plant growth research for food production in space. Grandpre is pursuing a degree in computer science and chemistry at Rocky Mountain College in Billings, Montana. Barnwell is a mechanical engineering and nanotechnology major at Florida Polytechnic University. The agency attracts its future workforce through the NASA Internship, Fellowships and Scholarships, or NIFS, Program.
Taylor, David; Valenza, John A; Spence, James M; Baber, Randolph H
2007-10-11
Simulation has been used for many years in dental education, but the educational context is typically a laboratory divorced from the clinical setting, which impairs the transfer of learning. Here we report on a true simulation clinic with multimedia communication from a central teaching station. Each of the 43 fully-functioning student operatories includes a thin-client networked computer with access to an Electronic Patient Record (EPR).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kozubal, Eric J
LiquidCool Solutions (LCS) has developed liquid submerged server (LSS) technology that changes the way computer electronics are cooled. The technology provides an option to cool electronics by the direct contact flow of dielectric fluid (coolant) into a sealed enclosure housing all the electronics of a single server. The intimate dielectric fluid contact with electronics improves the effectiveness of heat removal from the electronics.
Conversion of the CALAP (Computer Aided Landform Analysis Program) Program from FORTRAN to DUCK.
1986-09-01
Keywords: DUCK, artificial intelligence, logic programming. An expert advisor program named CALAP...The original program was developed in FORTRAN on an HP-1000, a minicomputer. CALAP was reprogrammed in an Artificial Intelligence (AI) language called DUCK...at the Artificial Intelligence Center, U.S. Army Engineer Topographic Laboratory, Fort Belvoir.
Advanced computational tools for 3-D seismic analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barhen, J.; Glover, C.W.; Protopopescu, V.A.
1996-06-01
The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis and to test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY 1993-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the computational innovations that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities and in close coordination with other national laboratories and oil industry partners.
THE NASA AMES POLYCYCLIC AROMATIC HYDROCARBON INFRARED SPECTROSCOPIC DATABASE: THE COMPUTED SPECTRA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bauschlicher, C. W.; Ricca, A.; Boersma, C.
The astronomical emission features, formerly known as the unidentified infrared bands, are now commonly ascribed to polycyclic aromatic hydrocarbons (PAHs). The laboratory experiments and computational modeling done at the NASA Ames Research Center to create a collection of PAH IR spectra relevant to testing and refining the PAH hypothesis have been assembled into a spectroscopic database. This database now contains over 800 PAH spectra spanning 2-2000 μm (5000-5 cm⁻¹). These data are now available on the World Wide Web at www.astrochem.org/pahdb. This paper presents an overview of the computational spectra in the database and the tools developed to analyze and interpret astronomical spectra using the database. A description of the online and offline user tools available on the Web site is also presented.
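The database's stated spectral span can be checked with the standard wavelength-to-wavenumber conversion, wavenumber (cm⁻¹) = 10^4 / wavelength (μm):

```python
def um_to_wavenumber(wavelength_um):
    """Convert a wavelength in micrometers to a wavenumber in cm^-1."""
    return 1.0e4 / wavelength_um

# The stated 2-2000 micrometer span maps to 5000-5 cm^-1
hi = um_to_wavenumber(2.0)      # 5000.0
lo = um_to_wavenumber(2000.0)   # 5.0
```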
Computer laboratory in medical education for medical students.
Hercigonja-Szekeres, Mira; Marinović, Darko; Kern, Josipa
2009-01-01
Five generations of second year students at the Zagreb University School of Medicine were interviewed through an anonymous questionnaire on their use of personal computers, Internet, computer laboratories and computer-assisted education in general. Results show an advance in students' usage of information and communication technology during the period from 1998/99 to 2002/03. However, their positive opinion about computer laboratory depends on installed capacities: the better the computer laboratory technology, the better the students' acceptance and use of it.
Science alliance: A vital ORNL-UT partnership
DOE Office of Scientific and Technical Information (OSTI.GOV)
Richmond, C.R.; Riedinger, L.; Garritano, T.
1991-01-01
Partnerships between Department of Energy national laboratories and universities have long been keys to advancing scientific research and education in the United States. Perhaps the most enduring and closely knit of these relationships is the one between Oak Ridge National Laboratory and the University of Tennessee at Knoxville. Since its birth in the 1940s, ORNL has had a very special relationship with UT, and today the two institutions have closer ties than virtually any other university and national laboratory. Seven years ago, ORNL and UT began a new era of cooperation by creating the Science Alliance, a Center of Excellence at UT sponsored by the Tennessee Higher Education Commission. As the oldest and largest of these centers, the Science Alliance is the primary vehicle through which Tennessee promotes research and educational collaboration between UT and ORNL. By letting the two institutions pool their intellectual and financial resources, the alliance creates a more fertile scientific environment than either could achieve on its own. Part of the UT College of Liberal Arts, the Science Alliance is composed of four divisions (Biological Sciences, Chemical Sciences, Physical Sciences, and Mathematics and Computer Science) that team 100 of the university's top faculty with their outstanding colleagues from ORNL.
Strain, J J; Felciano, R M; Seiver, A; Acuff, R; Fagan, L
1996-01-01
Approximately 30 minutes of computer access time are required by surgical residents at Stanford University Medical Center (SUMC) to examine the lab values of all patients on a surgical intensive care unit (ICU) service, a task that must be performed several times a day. To reduce the time accessing this information and simultaneously increase the readability and currency of the data, we have created a mobile, pen-based user interface and software system that delivers lab results to surgeons in the ICU. The ScroungeMaster system, loaded on a portable tablet computer, retrieves lab results for a subset of patients from the central laboratory computer and stores them in a local database cache. The cache can be updated on command; this update takes approximately 2.7 minutes for all ICU patients being followed by the surgeon, and can be performed as a background task while the user continues to access selected lab results. The user interface presents lab results according to physiologic system. Which labs are displayed first is governed by a layout selection algorithm based on previous accesses to the patient's lab information, physician preferences, and the nature of the patient's medical condition. Initial evaluation of the system has shown that physicians prefer the ScroungeMaster interface to that of existing systems at SUMC and are satisfied with the system's performance. We discuss the evolution of ScroungeMaster and make observations on changes to physician work flow with the presence of mobile, pen-based computing in the ICU.
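The cache-and-refresh pattern described above (reads served from a local store, updates fetched from the central laboratory computer as a background task) can be sketched in a few lines. This is a hypothetical illustration, not ScroungeMaster's actual code; all class and field names are invented.

```python
import threading

# Hypothetical sketch of a ScroungeMaster-style local cache: lab results are
# fetched from a central laboratory system into a local store, and the refresh
# runs as a background task while reads continue against the cached copy.
class LabResultCache:
    def __init__(self, fetch_fn):
        self._fetch = fetch_fn          # callable: patient_id -> lab results
        self._cache = {}
        self._lock = threading.Lock()

    def get(self, patient_id):
        # Reads always come from the local cache (fast, possibly stale).
        with self._lock:
            return self._cache.get(patient_id)

    def refresh(self, patient_ids):
        # Update the cache in the background so the user can keep working.
        def worker():
            for pid in patient_ids:
                results = self._fetch(pid)
                with self._lock:
                    self._cache[pid] = results
        t = threading.Thread(target=worker, daemon=True)
        t.start()
        return t

# Demo with a stand-in for the central laboratory computer.
central_db = {"pt1": {"Na": 140}, "pt2": {"K": 4.1}}
cache = LabResultCache(lambda pid: central_db[pid])
cache.refresh(["pt1", "pt2"]).join()
print(cache.get("pt1"))  # {'Na': 140}
```

The key design point mirrored here is that the slow fetch never blocks a read: stale data stays available until the background worker swaps in fresh results.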
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mei, Donghai
2013-05-20
Molecular adsorption of formate and carboxyl on the stoichiometric CeO2(111) and CeO2(110) surfaces was studied using periodic density functional theory (DFT+U) calculations. Two distinguishable adsorption modes (strong and weak) of formate are identified. The bidentate configuration is more stable than the monodentate adsorption configuration. Both formate and carboxyl bind more strongly at the more open CeO2(110) surface. The calculated vibrational frequencies of the two adsorbed species are consistent with experimental measurements. Finally, the effects of the U parameter on the adsorption of formate and carboxyl over both CeO2 surfaces were investigated. We found that the geometrical configurations of the two adsorbed species are not affected by different U parameters (U = 0, 5, and 7). However, the calculated adsorption energy of carboxyl increases markedly with the U value, while the adsorption energy of formate changes only slightly (<0.2 eV). Bader charge analysis shows that opposite charge transfer occurs for formate and carboxyl adsorption: the adsorbed formate is negatively charged while the adsorbed carboxyl is positively charged. Interestingly, the amount of transferred charge also increases with the U parameter. This work was supported by the Laboratory Directed Research and Development (LDRD) program of the Pacific Northwest National Laboratory (PNNL) and by a Cooperative Research and Development Agreement (CRADA) with General Motors. The computations were performed using the Molecular Science Computing Facility in the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL), a U.S. Department of Energy national scientific user facility located at PNNL in Richland, Washington. Part of the computing time was also granted by the National Energy Research Scientific Computing Center (NERSC).
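The adsorption energies compared in this kind of study follow the standard DFT definition E_ads = E(slab + adsorbate) - E(slab) - E(adsorbate), where a more negative value means stronger binding. The total energies below are invented placeholders, not values from the paper:

```python
# Standard adsorption-energy bookkeeping used in surface DFT studies.
# E_ads = E(slab + adsorbate) - E(slab) - E(adsorbate); the numbers here
# are made-up placeholder total energies (in eV), not results of this work.
def adsorption_energy(e_total, e_slab, e_adsorbate):
    # More negative = stronger binding of the adsorbate to the surface.
    return e_total - e_slab - e_adsorbate

e_ads = adsorption_energy(-1052.7, -1050.0, -0.5)
print(round(e_ads, 2))  # -2.2
```

Repeating this bookkeeping at several U values is how the U-dependence of the carboxyl and formate binding strengths reported above would be tabulated.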
Spira, Thomas; Lindegren, Mary Lou; Ferris, Robert; Habiyambere, Vincent; Ellerbrock, Tedd
2009-06-01
The expansion of HIV/AIDS care and treatment in resource-constrained countries, especially in sub-Saharan Africa, has generally developed in a top-down manner. Further expansion will involve primary health centers where human and other resources are limited. This article describes the World Health Organization/President's Emergency Plan for AIDS Relief collaboration formed to help scale up HIV services in primary health centers in high-prevalence, resource-constrained settings. It reviews the contents of the Operations Manual developed, with emphasis on the Laboratory Services chapter, which discusses essential laboratory services, both at the center and the district hospital level, laboratory safety, laboratory testing, specimen transport, how to set up a laboratory, human resources, equipment maintenance, training materials, and references. The chapter provides specific information on essential tests and generic job aids for them. It also includes annexes containing a list of laboratory supplies for the health center and sample forms.
Next Generation Workload Management System For Big Data on Heterogeneous Distributed Computing
NASA Astrophysics Data System (ADS)
Klimentov, A.; Buncic, P.; De, K.; Jha, S.; Maeno, T.; Mount, R.; Nilsson, P.; Oleynik, D.; Panitkin, S.; Petrosyan, A.; Porter, R. J.; Read, K. F.; Vaniachine, A.; Wells, J. C.; Wenaus, T.
2015-05-01
The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe, and were recently credited for the discovery of a Higgs boson. ATLAS and ALICE are the largest collaborations ever assembled in the sciences and are at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, both experiments rely on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses the PanDA (Production and Data Analysis) Workload Management System (WMS) for managing the workflow for all data processing on hundreds of data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. The scale is demonstrated by the following numbers: PanDA manages O(10²) sites, O(10⁵) cores, O(10⁸) jobs per year, and O(10³) users, and the ATLAS data volume is O(10¹⁷) bytes. In 2013 we started an ambitious program to expand PanDA to all available computing resources, including opportunistic use of commercial and academic clouds and Leadership Computing Facilities (LCF). The project, titled ‘Next Generation Workload Management and Analysis System for Big Data’ (BigPanDA), is funded by DOE ASCR and HEP. Extending PanDA to clouds and LCF presents new challenges in managing heterogeneity and supporting workflow. The BigPanDA project is underway to set up and tailor PanDA at the Oak Ridge Leadership Computing Facility (OLCF) and at the National Research Center "Kurchatov Institute" together with ALICE distributed computing and ORNL computing professionals. Our approach to integration of HPC platforms at the OLCF and elsewhere is to reuse, as much as possible, existing components of the PanDA system.
We will present our current accomplishments with running the PanDA WMS at OLCF and other supercomputers and demonstrate our ability to use PanDA as a portal independent of the computing facilities infrastructure for High Energy and Nuclear Physics as well as other data-intensive science applications.
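The pull model that lets systems like PanDA present scattered, heterogeneous data centers as one facility can be illustrated with a toy queue: workers request jobs from a central broker rather than having work pushed to them. This is a minimal sketch of the dispatch pattern, not the real PanDA API; all names are invented.

```python
import queue

# Toy illustration of pull-style workload dispatch: pilots running on
# heterogeneous resources (grid site, cloud VM, HPC node) each pull jobs
# from a central queue until it is drained, so load balances itself.
jobs = queue.Queue()
for i in range(5):
    jobs.put({"id": i, "task": f"process dataset {i}"})

def pilot(name, results):
    # Each pilot keeps pulling until no work remains.
    while True:
        try:
            job = jobs.get_nowait()
        except queue.Empty:
            return
        results.append((name, job["id"]))

results = []
pilot("grid-site", results)
pilot("cloud-vm", results)
print(len(results))  # 5
```

Because all communication is worker-initiated, a new resource joins the pool simply by starting another pilot; the broker never needs to reach into the worker's network.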
Michigan/Air Force Research Laboratory (AFRL) Collaborative Center in Control Science (MACCCS)
2016-09-01
AFRL-RQ-WP-TR-2016-0139. Final report, 18 April 2007 - 30 September 2016. Author: Anouck Girard. The report describes work to amplify an internationally recognized center of excellence in control science research and education, through interaction between the faculty and ...
The National Kidney Registry: 175 transplants in one year.
Veale, Jeffrey; Hil, Garet
2011-01-01
Since organizing its first swap in 2008, the National Kidney Registry had facilitated 389 kidney transplants by the end of 2011 across 45 U.S. transplant centers. Rapid innovations, advanced computer technologies, and an evolving understanding of the processes at participating transplant centers and histocompatibility laboratories are among the factors driving the success of the NKR. Virtual cross match accuracy has improved from 43% to 94% as a result of improvements in the HLA typing process for donor antigens and enhanced mechanisms to list unacceptable HLA antigens for sensitized patients. By the end of 2011, the NKR had transplanted 66% of the patients enrolled since 2008. The 2011 wait time (from enrollment to transplant) for the 175 patients transplanted that year averaged 5 months.
NASA Technical Reports Server (NTRS)
1997-01-01
I-FORCE, a computer peripheral from Immersion Corporation, was derived from virtual environment and human factors research at the Advanced Displays and Spatial Perception Laboratory at Ames Research Center in collaboration with the Stanford University Center for Design Research. Entrepreneur Louis Rosenberg, a former Stanford researcher, now president of Immersion, collaborated with Dr. Bernard Adelstein at Ames on studies of perception in virtual reality. The result was an inexpensive way to incorporate motors and a sophisticated microprocessor into joysticks and other game controllers. These devices can emulate the feel of a car in a skid, a crashing plane, the bounce of a ball, compressed springs, or other physical phenomena. The first products incorporating I-FORCE technology include CH Products' line of FlightStick and CombatStick controllers.
NASA Astrophysics Data System (ADS)
Trochimczuk, R.
2017-02-01
This paper presents an analysis of a parallelogram mechanism commonly used to provide a kinematic remote center of motion in surgical telemanipulators. Selected types of parallel manipulator designs, encountered in commercial and laboratory-made systems described in the medical robotics literature, will serve as the research material. Among other tools, computer simulations employing the finite element method in the ANSYS 13.0 CAD/CAE software environment will be used. The kinematics of the manipulator with the parallelogram mechanism will be determined in order to provide a more complete description. These results will form the basis for deciding whether a parallelogram mechanism can be applied in an original prototype of a telemanipulator arm.
Using the Computer as a Laboratory Instrument.
ERIC Educational Resources Information Center
Collings, Peter J.; Greenslade, Thomas B., Jr.
1989-01-01
Reports experiences during a two-year period in introducing the computer to the laboratory and students to the computer as a laboratory instrument. Describes a working philosophy, data acquisition system, and experiments. Summarizes the laboratory procedures of nine experiments, covering mechanics, heat, electromagnetism, and optics. (YP)
Artificial intelligence within AFSC
NASA Technical Reports Server (NTRS)
Gersh, Mark A.
1990-01-01
Information on artificial intelligence research in the Air Force Systems Command is given in viewgraph form. Specific research that is being conducted at the Rome Air Development Center, the Space Technology Center, the Human Resources Laboratory, the Armstrong Aerospace Medical Research Laboratory, the Armament Laboratory, and the Wright Research and Development Center is noted.
1993-12-01
United States Air Force Summer Research Program, 1993: Summer Research Program Final Reports, Volume 16, Arnold Engineering Development Center, Frank J. Seiler Research Laboratory, Wilford Hall Medical Center. Research & Development Laboratories, 5800 Uplander Way, Culver City, CA 90230-6608.
System Finds Horizontal Location of Center of Gravity
NASA Technical Reports Server (NTRS)
Johnston, Albert S.; Howard, Richard T.; Brewster, Linda L.
2006-01-01
An instrumentation system rapidly and repeatedly determines the horizontal location of the center of gravity of a laboratory vehicle that slides horizontally on three air bearings (see Figure 1). Typically, knowledge of the horizontal center-of-mass location of such a vehicle is needed in order to balance the vehicle properly for an experiment and/or to assess the dynamic behavior of the vehicle. The system includes a load cell above each air bearing, electronic circuits that generate digital readings of the weight on each load cell, and a computer equipped with software that processes the readings. The total weight and, hence, the mass of the vehicle are computed from the sum of the load-cell weight readings. Then the horizontal position of the center of gravity is calculated straightforwardly as the weighted sum of the known position vectors of the air bearings, the contribution of each bearing being proportional to the weight on that bearing. In the initial application for which this system was devised, the center-of-mass calculation is particularly simple because the air bearings are located at the corners of an equilateral triangle. However, the system is not restricted to this simple geometry. The system acquires and processes weight readings at a rate of 800 Hz for each load cell. The total weight and the horizontal location of the center of gravity are updated at a rate of 800/3 ≈ 267 Hz. In a typical application, a technician would use the center-of-mass output of this instrumentation system as a guide to the manual placement of small weights on the vehicle to shift the center of gravity to a desired horizontal position. Usually, the desired horizontal position is that of the geometric center. Alternatively, this instrumentation system could be used to provide position feedback for a control system that would cause weights to be shifted automatically (see Figure 2) in an effort to keep the center of gravity at the geometric center.
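The weighted-sum calculation described above fits in a few lines: each bearing's position vector contributes in proportion to the weight it carries. The coordinates and weights below are illustrative placeholders (a unit equilateral triangle, equal loads), not values from the actual system.

```python
import math

# Sketch of the center-of-gravity calculation: the horizontal CG is the
# weighted sum of the bearing positions, each weighted by the fraction of
# total weight on that bearing. Placeholder geometry: equilateral triangle.
bearings = [(0.0, 0.0), (1.0, 0.0), (0.5, math.sqrt(3) / 2)]  # positions (m)
weights = [100.0, 100.0, 100.0]                               # load cells (N)

total = sum(weights)
cg_x = sum(w * x for w, (x, y) in zip(weights, bearings)) / total
cg_y = sum(w * y for w, (x, y) in zip(weights, bearings)) / total
print(round(cg_x, 3), round(cg_y, 3))  # 0.5 0.289
```

With equal weights the CG lands at the triangle's centroid, which is the "geometric center" target the technician balances toward; unequal load-cell readings shift it toward the heavier bearing.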
Astronomical Data Center Bulletin, volume 1, number 2
NASA Technical Reports Server (NTRS)
Nagy, T. A.; Warren, W. H., Jr.; Mead, J. M.
1981-01-01
Work in progress on astronomical catalogs is presented in 16 papers. Topics cover astronomical data center operations; automatic astronomical data retrieval at GSFC; interactive computer reference search of astronomical literature 1950-1976; formatting, checking, and documenting machine-readable catalogs; interactive catalog of UV, optical, and HI data for 201 Virgo cluster galaxies; machine-readable version of the general catalog of variable stars, third edition; galactic latitude and magnitude distribution of two astronomical catalogs; the catalog of open star clusters; infrared astronomical data base and catalog of infrared observations; the Air Force geophysics laboratory; revised magnetic tape of the N30 catalog of 5,268 standard stars; positional correlation of the two-micron sky survey and Smithsonian Astrophysical Observatory catalog sources; search capabilities for the catalog of stellar identifications (CSI) 1979 version; CSI statistics: blue magnitude versus spectral type; catalogs available from the Astronomical Data Center; and status report on machine-readable astronomical catalogs.
An Overview of the Computational Physics and Methods Group at Los Alamos National Laboratory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baker, Randal Scott
CCS Division was formed to strengthen the visibility and impact of computer science and computational physics research on strategic directions for the Laboratory. Both computer science and computational science are now central to scientific discovery and innovation. They have become indispensable tools for all other scientific missions at the Laboratory. CCS Division forms a bridge between external partners and Laboratory programs, bringing new ideas and technologies to bear on today's important problems and attracting high-quality technical staff members to the Laboratory. The Computational Physics and Methods Group (CCS-2) conducts methods research and develops scientific software aimed at the latest and emerging HPC systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Labios, Liezel A.; Heiden, Zachariah M.; Mock, Michael T.
2015-05-04
The synthesis of a series of PEtPNRR' (PEtPNRR' = Et₂PCH₂CH₂P(CH₂NRR')₂; R = H, R' = Ph or 2,4-difluorophenyl; R = R' = Ph or iPr) diphosphine ligands containing mono- and disubstituted pendant amine groups, and the preparation of their corresponding molybdenum bis(dinitrogen) complexes trans-Mo(N₂)₂(PMePh₂)₂(PEtPNRR'), is described. In situ IR and multinuclear NMR spectroscopic studies monitoring the stepwise addition of triflic acid (HOTf) to trans-Mo(N₂)₂(PMePh₂)₂(PEtPNRR') complexes in THF at -40 °C show that the electronic and steric properties of the R and R' groups of the pendant amines influence whether the complexes are protonated at Mo, a pendant amine, a coordinated N₂ ligand, or a combination of these sites. For example, complexes containing mono-aryl-substituted pendant amines are protonated at Mo and at a pendant amine to generate mono- and dicationic Mo-H species. Protonation of the complex containing less basic diphenyl-substituted pendant amines exclusively generates a monocationic hydrazido (Mo(NNH₂)) product, indicating preferential protonation of an N₂ ligand. Addition of HOTf to the complex featuring more basic diisopropyl amines primarily produces a monocationic product protonated at a pendant amine site, as well as a trace amount of a dicationic Mo(NNH₂) product that contains protonated pendant amines. In addition, trans-Mo(N₂)₂(PMePh₂)₂(depe) (depe = Et₂PCH₂CH₂PEt₂), without a pendant amine, was synthesized and treated with HOTf, generating a monocationic Mo(NNH₂) product. Protonolysis experiments conducted on select complexes in the series afforded trace amounts of NH₄⁺. Computational analysis of the series of trans-Mo(N₂)₂(PMePh₂)₂(PEtPNRR') complexes provides further insight into the proton affinity values of the metal center, N₂ ligand, and pendant amine sites, rationalizing the differing reactivity profiles. This research was supported as part of the Center for Molecular Electrocatalysis, an Energy Frontier Research Center funded by the U.S. Department of Energy, Office of Science, Office of Basic Energy Sciences. Computational resources were provided by the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory. Pacific Northwest National Laboratory is operated by Battelle for the U.S. Department of Energy.
ASCR Cybersecurity for Scientific Computing Integrity - Research Pathways and Ideas Workshop
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peisert, Sean; Potok, Thomas E.; Jones, Todd
At the request of the U.S. Department of Energy's (DOE) Office of Science (SC) Advanced Scientific Computing Research (ASCR) program office, a workshop was held June 2-3, 2015, in Gaithersburg, MD, to identify the potential long-term (10 to 20+ year) cybersecurity fundamental research and development challenges, strategies, and roadmap facing future high performance computing (HPC), networks, data centers, and extreme-scale scientific user facilities. This workshop was a follow-on to the workshop held January 7-9, 2015, in Rockville, MD, which examined higher-level ideas about scientific computing integrity specific to the mission of the DOE Office of Science. Issues included research computation and simulation that takes place on ASCR computing facilities and networks, as well as network-connected scientific instruments, such as those run by various DOE Office of Science programs. Workshop participants included researchers and operational staff from DOE national laboratories, as well as academic researchers and industry experts. Participants were selected based on the submission of abstracts relating to the topics discussed in the previous workshop report [1] and also from other ASCR reports, including "Abstract Machine Models and Proxy Architectures for Exascale Computing" [27], the DOE "Preliminary Conceptual Design for an Exascale Computing Initiative" [28], and the January 2015 machine learning workshop [29]. The workshop was also attended by several observers from DOE and other government agencies. The workshop was divided into three topic areas: (1) Trustworthy Supercomputing, (2) Extreme-Scale Data, Knowledge, and Analytics for Understanding and Improving Cybersecurity, and (3) Trust within High-end Networking and Data Centers. Participants were divided into three corresponding teams based on the category of their abstracts.
The workshop began with a series of talks from the program manager and workshop chair, followed by the leaders for each of the three topics and a representative of each of the four major DOE Office of Science Advanced Scientific Computing Research Facilities: the Argonne Leadership Computing Facility (ALCF), the Energy Sciences Network (ESnet), the National Energy Research Scientific Computing Center (NERSC), and the Oak Ridge Leadership Computing Facility (OLCF). The rest of the workshop consisted of topical breakout discussions and focused writing periods that produced much of this report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robinson, Alastair; Regnier, Cindy; Settlemyre, Kevin
Massachusetts Institute of Technology (MIT) partnered with the U.S. Department of Energy (DOE) to develop and implement solutions to retrofit existing buildings to reduce energy consumption by at least 30% as part of DOE's Commercial Building Partnerships (CBP) Program. Lawrence Berkeley National Laboratory (LBNL) provided technical expertise in support of this DOE program. MIT is one of the U.S.'s foremost higher education institutions, occupying a campus that is nearly 100 years old, with a building floor area totaling more than 12 million square feet. The CBP project focused on improving the energy performance of two campus buildings, the Ray and Maria Stata Center (RMSC) and the Building W91 (BW91) data center. A key goal of the project was to identify energy saving measures that could be applied to other buildings both within MIT's portfolio and at other higher education institutions. The CBP retrofits at MIT are projected to reduce energy consumption by approximately 48%, including a reduction of around 72% in RMSC lighting energy and a reduction of approximately 55% in RMSC server room HVAC energy. The energy efficiency measure (EEM) package proposed for the BW91 data center is expected to reduce heating, ventilation, and air-conditioning (HVAC) energy use by 30% to 50%, depending on the final air intake temperature that is established for the server racks. The RMSC, an iconic building designed by Frank Gehry, houses the Computer Science and Artificial Intelligence Laboratory, the Laboratory for Information and Decision Systems, and the Department of Linguistics and Philosophy.
ANL statement of site strategy for computing workstations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fenske, K.R.; Boxberger, L.M.; Amiot, L.W.
1991-11-01
This Statement of Site Strategy describes the procedure at Argonne National Laboratory for defining, acquiring, using, and evaluating scientific and office workstations and related equipment and software in accord with DOE Order 1360.1A (5-30-85) and Laboratory policy. It is Laboratory policy to promote the installation and use of computing workstations to improve productivity and communications for both programmatic and support personnel, to ensure that computing workstation acquisitions meet the expressed need in a cost-effective manner, and to ensure that acquisitions of computing workstations are in accord with Laboratory and DOE policies. The overall computing site strategy at ANL is to develop a hierarchy of integrated computing system resources to address the current and future computing needs of the Laboratory. The major system components of this hierarchical strategy are: supercomputers, parallel computers, centralized general-purpose computers, distributed multipurpose minicomputers, and computing workstations and office automation support systems. Computing workstations include personal computers, scientific and engineering workstations, computer terminals, microcomputers, word processing and office automation electronic workstations, and associated software and peripheral devices costing less than $25,000 per item.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gerber, Richard; Allcock, William; Beggio, Chris
2014-10-17
U.S. Department of Energy (DOE) High Performance Computing (HPC) facilities are on the verge of a paradigm shift in the way they deliver systems and services to science and engineering teams. Research projects are producing a wide variety of data at unprecedented scale and level of complexity, with community-specific services that are part of the data collection and analysis workflow. On June 18-19, 2014, representatives from six DOE HPC centers met in Oakland, CA, at the DOE High Performance Operational Review (HPCOR) to discuss how they can best provide facilities and services to enable large-scale data-driven scientific discovery at the DOE national laboratories. The report contains findings from that review.
NASA Technical Reports Server (NTRS)
Horvitz, Eric; Ruokangas, Corinne; Srinivas, Sampath; Barry, Matthew
1993-01-01
We describe a collaborative research and development effort between the Palo Alto Laboratory of the Rockwell Science Center, Rockwell Space Operations Company, and the Propulsion Systems Section of NASA JSC to design computational tools that can manage the complexity of information displayed to human operators in high-stakes, time-critical decision contexts. We shall review an application from NASA Mission Control and describe how we integrated a probabilistic diagnostic model and a time-dependent utility model, with techniques for managing the complexity of computer displays. Then, we shall describe the behavior of VPROP, a system constructed to demonstrate promising display-management techniques. Finally, we shall describe our current research directions on the Vista 2 follow-on project.
Cloud computing: a new business paradigm for biomedical information sharing.
Rosenthal, Arnon; Mork, Peter; Li, Maya Hao; Stanford, Jean; Koester, David; Reynolds, Patti
2010-04-01
We examine how the biomedical informatics (BMI) community, especially consortia that share data and applications, can take advantage of a new resource called "cloud computing". Clouds generally offer resources on demand. In most clouds, charges are pay per use, based on large farms of inexpensive, dedicated servers, sometimes supporting parallel computing. Substantial economies of scale potentially yield costs much lower than dedicated laboratory systems or even institutional data centers. Overall, even with conservative assumptions, for applications that are not I/O intensive and do not demand a fully mature environment, the numbers suggest that clouds can sometimes provide major improvements and should be seriously considered for BMI. Methodologically, it was very advantageous to formulate analyses in terms of component technologies; focusing on these specifics enabled us to bypass the cacophony of alternative definitions (e.g., exactly what a cloud includes) and to analyze alternatives that employ some of the component technologies (e.g., an institution's data center). Relative analyses were another great simplifier. Rather than listing the absolute strengths and weaknesses of cloud-based systems (e.g., for security or data preservation), we focus on the changes from a particular starting point, e.g., individual lab systems. We often find a rough parity (in principle), but one needs to examine individual acquisitions: is a loosely managed lab moving to a well-managed cloud, or a tightly managed hospital data center moving to a poorly safeguarded cloud?
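The pay-per-use versus owned-hardware trade-off described above reduces to a simple break-even comparison: an owned lab server is a roughly fixed cost whether busy or idle, while cloud charges scale with usage. The sketch below illustrates this with invented placeholder prices, not real quotes from any provider.

```python
# Toy break-even comparison between an owned lab server (fixed cost over
# its lifetime) and pay-per-use cloud time. All prices are made-up
# placeholders for illustration only.
def lab_server_cost(busy_hours, purchase=6000.0, years=3, admin_per_year=500.0):
    # An owned machine costs the same whether it is busy or idle.
    return purchase + years * admin_per_year

def cloud_cost(busy_hours, rate_per_h=0.40):
    # Pay-per-use: cost scales with actual usage, with no idle cost.
    return busy_hours * rate_per_h

for h in (1000, 10000, 20000):
    cheaper = "cloud" if cloud_cost(h) < lab_server_cost(h) else "lab"
    print(f"{h:>6} busy hours over 3 years -> {cheaper} is cheaper")
```

The pattern matches the abstract's conclusion: lightly used lab systems favor the cloud, while sustained high utilization can tip the balance back toward owned hardware.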
NASA Astrophysics Data System (ADS)
Phister, P. W., Jr.
1983-12-01
Development of the Air Force Institute of Technology's Digital Engineering Laboratory Network (DELNET) was continued with the development of an initial draft of a protocol standard for all seven layers specified by the International Organization for Standardization (ISO) Reference Model for Open Systems Interconnection. This effort centered on restructuring the Network Layer to perform datagram routing and to conform to the developed protocol standards, and on actual software module development of the upper four protocol layers residing within the DELNET Monitor (Zilog MCZ 1/25 Computer System). Within the guidelines of the ISO Reference Model, the Transport Layer was developed by combining the Internet Header Format (IHF) with the Transmission Control Protocol (TCP) to create a 128-byte datagram. Also, a limited Application Layer was created to pass the Gettysburg Address through the DELNET. This study formulated a first draft of the DELNET Protocol Standard and designed, implemented, and tested the Network, Transport, and Application Layers to conform to these protocol standards.
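Building a fixed-size 128-byte datagram like the one described can be sketched with Python's struct module. The abstract does not give DELNET's actual field layout, so the header below (version, source, destination, sequence number, payload length) is entirely hypothetical.

```python
import struct

# Hypothetical sketch of packing a fixed 128-byte datagram: a small binary
# header followed by a zero-padded payload, so every datagram is the same
# size on the wire. The field layout is invented, not DELNET's real format.
HEADER = struct.Struct("!BBBBH")        # version, src, dst, seq, payload len
DATAGRAM_SIZE = 128
PAYLOAD_SIZE = DATAGRAM_SIZE - HEADER.size

def make_datagram(src, dst, seq, payload: bytes) -> bytes:
    if len(payload) > PAYLOAD_SIZE:
        raise ValueError("payload too large for one datagram")
    header = HEADER.pack(1, src, dst, seq, len(payload))
    return header + payload.ljust(PAYLOAD_SIZE, b"\x00")

# A text such as the Gettysburg Address would be split across several
# such datagrams; here we pack just its opening words.
dg = make_datagram(2, 7, 0, b"Four score and seven years ago...")
print(len(dg))  # 128
```

Recording the true payload length in the header lets the receiver strip the zero padding, which is the usual price of a fixed-size datagram format.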
A Data-Driven Framework for Incorporating New Tools for ...
This talk was given during the “Exposure-Based Toxicity Testing” session at the annual meeting of the International Society for Exposure Science. It provided an update on the state of the science and tools that may be employed in risk-based prioritization efforts. It outlined knowledge gained from the data provided using these high-throughput tools to assess chemical bioactivity and to predict chemical exposures and also identified future needs. It provided an opportunity to showcase ongoing research efforts within the National Exposure Research Laboratory and the National Center for Computational Toxicology within the Office of Research and Development to an international audience. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.
NASA Technical Reports Server (NTRS)
1998-01-01
This video is a collection of computer animations and live footage showing the construction and assembly of the International Space Station (ISS). Computer animations show the following: (1) an ISS fly-around; (2) the ISS over a sunrise seen from space; (3) the launch of the Zarya Control Module; (4) a Proton rocket launch; (5) the Space Shuttle docking with Zarya and attaching Zarya to the Unity Node; (6) the docking of the Service Module, Zarya, and Unity to Soyuz; (7) the Space Shuttle docking to the ISS and installing the Z1 Truss segment and the Pressurized Mating Adapter (PMA); (8) Soyuz docking to the ISS; (9) the Transhab components; and (10) a complete ISS assembly. Live footage shows the construction of Zarya, the Proton rocket, Unity Node, PMA, Service Module, US Laboratory, Italian Multipurpose Logistics Module, US Airlock, and the US Habitation Module. STS-88 Mission Specialists Jerry Ross and James Newman are seen training in the Neutral Buoyancy Laboratory (NBL). The Expedition 1 crewmembers, William Shepherd, Yuri Gidzenko, and Sergei Krikalev, are shown training in the Black Sea and at Johnson Space Center for water survival.
NASA Technical Reports Server (NTRS)
1990-01-01
NASA formally launched Project LASER (Learning About Science, Engineering and Research) in March 1990, a program designed to help teachers improve science and mathematics education and to provide 'hands on' experiences. It featured the first LASER Mobile Teacher Resource Center (MTRC), which is designed to reach educators all over the nation. NASA hopes to operate several MTRCs with funds provided by private industry. The mobile unit is a 22-ton tractor-trailer stocked with NASA educational publications and outfitted with six work stations. Each work station, which can accommodate two teachers at a time, has a computer providing access to NASA Spacelink. Each also has video recorders and photocopy/photographic equipment for the teacher's use. The MTRC is only one of the five major elements within LASER. The others are: a Space Technology Course, to promote integration of space science studies with traditional courses; the Volunteer Databank, in which NASA employees are encouraged to volunteer as tutors, instructors, etc.; Mobile Discovery Laboratories that will carry simple laboratory equipment and computers to provide hands-on activities for students and demonstrations of classroom activities for teachers; and the Public Library Science Program, which will present library-based science and math programs.
NASA Astrophysics Data System (ADS)
Klieger, Aviva; Ben-Hur, Yehuda; Bar-Yossef, Nurit
2010-04-01
The study examines the professional development of junior-high-school teachers participating in the Israeli "Katom" (Computer for Every Class, Student and Teacher) Program, begun in 2004. A three-circle support and training model was developed for teachers' professional development. The first circle applies to all teachers in the program; the second, to all teachers at individual schools; the third, to teachers of specific disciplines. The study reveals and describes the attitudes of science teachers toward the integration of laptop computers and toward the accompanying professional development model. Semi-structured interviews were conducted with eight science teachers from the four schools participating in the program. The interviews were analyzed according to an internal relational framework derived from the information that emerged in them. Two factors influenced science teachers' professional development: (1) the introduction of laptops to teachers and students, and (2) the support and training system. Interview analysis shows that the disciplinary training is most relevant to teachers and that they are very interested in belonging to the professional science teachers' community. They also prefer face-to-face meetings in their school. Among the difficulties they noted were the new learning environment, including control of student computers, computer integration in laboratory work, and technical problems. Laptop computers contributed significantly to teachers' professional and personal development and to a shift from teacher-centered to student-centered teaching. One-to-one laptops also changed the schools' digital culture. The findings are important for designing concepts and models for professional development when introducing technological innovation into the educational system.
Process Engineering Technology Center Initiative
NASA Technical Reports Server (NTRS)
Centeno, Martha A.
2001-01-01
NASA's Kennedy Space Center (KSC) is developing as a world-class Spaceport Technology Center (STC). From a process engineering (PE) perspective, the facilities used for flight hardware processing at KSC are NASA's premier factories. The products of these factories are safe, successful shuttle and expendable vehicle launches carrying state-of-the-art payloads. PE is devoted to process design, process management, and process improvement, rather than product design. PE also emphasizes the relationships of workers with systems and processes. Thus, it is difficult to speak of having a laboratory for PE at KSC because the entire facility is practically a laboratory when observed from a macro level perspective. However, it becomes necessary, at times, to show and display how KSC has benefited from PE and how KSC has contributed to the development of PE; hence, it has been proposed that a Process Engineering Technology Center (PETC) be developed to offer a place with a centralized focus on PE projects, and a place where KSC's PE capabilities can be showcased, and a venue where new Process Engineering technologies can be investigated and tested. Graphics for showcasing PE capabilities have been designed, and two initial test beds for PE technology research have been identified. Specifically, one test bed will look into the use of wearable computers with head mounted displays to deliver work instructions; the other test bed will look into developing simulation models that can be assembled into one to create a hierarchical model.
Electrochemistry for Energy Conversion
NASA Astrophysics Data System (ADS)
O'Hayre, Ryan
2010-10-01
Imagine a laptop computer that runs for 30 hours on a single charge. Imagine a world where you plug your house into your car and power lines are a distant memory. These dreams motivate today's fuel cell research. While some dreams (like powering your home with your fuel cell car) may be distant, others (like a 30-hour fuel cell laptop) may be closer than you think. If you are curious about fuel cells---how they work, when you might start seeing them in your daily life---this talk is for you. Learn about the state of the art in fuel cells, and where the technology is likely to be headed in the next 20 years. You'll also be treated to several ``behind-the-scenes'' glimpses of cutting-edge research projects under development in the Renewable Energy Materials Center at the Colorado School of Mines---projects like an ``ionic transistor'' that works with protons instead of electrons, and a special ceramic membrane material that enables the ``uphill'' diffusion of steam. Associate Professor Ryan O'Hayre's laboratory at the Colorado School of Mines develops new materials and devices to enable alternative energy technologies including fuel cells and solar cells. Prof. O'Hayre and his students collaborate with the Colorado Fuel Cell Center, the Colorado Center for Advanced Ceramics, the Renewable Energy Materials Science and Engineering Center, and the National Renewable Energy Laboratory. In collaboration with Ann Deml, Jianhua Tong, Svitlana Pylypenko, Archana Subramaniyan, Michael Sanders, Jason Fish, and Annette Bunge, Colorado School of Mines.
Watanabe, S; Tanaka, M; Wada, Y; Suzuki, H; Takagi, S; Mori, S; Fukai, K; Kanazawa, Y; Takagi, M; Hirakawa, K; Ogasawara, K; Tsumura, K; Ogawa, K; Matsumoto, K; Nagaoka, S; Suzuki, T; Shimura, D; Yamashita, M; Nishio, S
1994-07-01
The telescience testbed experiments were carried out to test and investigate tele-manipulation techniques in the intracellular potential recording of amphibian eggs. The telescience testbed was set up in two separated laboratories of the Tsukuba Space Center of NASDA, which were connected by telecommunication links. Manipulators for a microelectrode and for the sample stage of a microscope were moved by computers whose command signals were transmitted from a computer in a remote control room. The computer in the control room was operated by an investigator (PI) who remotely controlled the movement of each manipulator. A stereoscopic view of the microscope image was provided by a head-mounted display (HMD) and was indispensable to the intracellular single-cell recording. The fertilization potential of amphibian eggs was successfully obtained through the remote operating system.
Interactive information processing for NASA's mesoscale analysis and space sensor program
NASA Technical Reports Server (NTRS)
Parker, K. G.; Maclean, L.; Reavis, N.; Wilson, G.; Hickey, J. S.; Dickerson, M.; Karitani, S.; Keller, D.
1985-01-01
The Atmospheric Sciences Division (ASD) of the Systems Dynamics Laboratory at NASA's Marshall Space Flight Center (MSFC) is currently involved in interactive information processing for the Mesoscale Analysis and Space Sensor (MASS) program. Specifically, the ASD is engaged in the development and implementation of new space-borne remote sensing technology to observe and measure mesoscale atmospheric processes. These space measurements and conventional observational data are being processed together to gain an improved understanding of the mesoscale structure and the dynamical evolution of the atmosphere relative to cloud development and precipitation processes. To satisfy its vast data processing requirements, the ASD has developed a Researcher Computer System consisting of three primary computer systems which provides over 20 scientists with a wide range of capabilities for processing and displaying large volumes of remote sensing data. Each of the computers performs a specific function according to its unique capabilities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Warpinski, N.R.
A computer code for calculating hydraulic fracture height and width in a stressed-layer medium has been modified for easy use on a personal computer. HSTRESS allows for up to 51 layers having different thicknesses, stresses, and fracture toughnesses. The code can calculate fracture height versus pressure or pressure versus fracture height, depending on the design model in which the data will be used. At any pressure/height, a width profile is calculated and an equivalent width factor and flow resistance factor are determined. This program is written in FORTRAN. Graphics use PLOT88 software by Plotworks, Inc., but the graphics software must be obtained by the user because of licensing restrictions. A version without graphics can also be run. This code is available through the National Energy Software Center (NESC), operated by Argonne National Laboratory. 14 refs., 21 figs.
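The core idea behind a layered fracture-height calculation, a fracture growing only through layers whose closure stress the fluid pressure exceeds, can be caricatured in a few lines. This sketch merely sums the thickness of contiguous openable layers; the actual HSTRESS code also accounts for fracture toughness and computes width profiles, both omitted here:

```python
def open_height(layers, pressure, start):
    """Simplified sketch: sum the thickness of contiguous layers (growing up
    and down from a starting layer index) whose closure stress is below the
    fluid pressure. layers: list of (thickness_m, closure_stress_MPa)."""
    if pressure <= layers[start][1]:
        return 0.0                      # fracture cannot open in the start layer
    height = layers[start][0]
    i = start - 1                       # grow upward until a high-stress barrier
    while i >= 0 and pressure > layers[i][1]:
        height += layers[i][0]
        i -= 1
    j = start + 1                       # grow downward until a high-stress barrier
    while j < len(layers) and pressure > layers[j][1]:
        height += layers[j][0]
        j += 1
    return height

# Hypothetical three-layer profile: (thickness in m, closure stress in MPa)
layers = [(10.0, 30.0), (5.0, 20.0), (8.0, 25.0)]
```

Sweeping `pressure` over a range and recording `open_height` reproduces the height-versus-pressure style of curve the abstract describes, in crude form.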
An, Gary; Bartels, John; Vodovotz, Yoram
2011-03-01
The clinical translation of promising basic biomedical findings, whether derived from reductionist studies in academic laboratories or as the product of extensive high-throughput and -content screens in the biotechnology and pharmaceutical industries, has reached a period of stagnation in which ever higher research and development costs are yielding ever fewer new drugs. Systems biology and computational modeling have been touted as potential avenues by which to break through this logjam. However, few mechanistic computational approaches are utilized in a manner that is fully cognizant of the inherent clinical realities in which the drugs developed through this ostensibly rational process will be ultimately used. In this article, we present a Translational Systems Biology approach to inflammation. This approach is based on the use of mechanistic computational modeling centered on inherent clinical applicability, namely that a unified suite of models can be applied to generate in silico clinical trials, individualized computational models as tools for personalized medicine, and rational drug and device design based on disease mechanism.
Calibration Laboratory Capabilities Listing as of April 2009
NASA Technical Reports Server (NTRS)
Kennedy, Gary W.
2009-01-01
This document reviews the Calibration Laboratory capabilities for various NASA centers (i.e., Glenn Research Center and Plum Brook Test Facility; Kennedy Space Center; Marshall Space Flight Center; Stennis Space Center; and White Sands Test Facility). Some of the parameters reported are: alternating current, direct current, dimensional, mass, force, torque, pressure and vacuum, safety, and thermodynamics parameters. Some centers reported other parameters.
Exploring Midwives' Need and Intention to Adopt Electronic Integrated Antenatal Care.
Markam, Hosizah; Hochheiser, Harry; Kuntoro, Kuntoro; Notobroto, Hari Basuki
2018-01-01
Documentation requirements for the Indonesian integrated antenatal care (ANC) program suggest the need for electronic systems to address gaps in existing paper documentation practices. Our goals were to quantify midwives' documentation completeness in a primary healthcare center, understand documentation challenges, develop a tool, and assess intention to use the tool. We analyzed existing ANC records in a primary healthcare center in Bangkalan, East Java, and conducted interviews with stakeholders to understand needs for an electronic system in support of ANC. Development of the web-based Electronic Integrated ANC (e-iANC) system used the System Development Life Cycle method. Training on the use of the system was held in the computer laboratory for 100 midwives chosen from four primary healthcare centers in each of five regions. The Unified Theory of Acceptance and Use of Technology (UTAUT) questionnaire was used to assess their intention to adopt e-iANC. The midwives' intention to adopt e-iANC was significantly influenced by performance expectancy, effort expectancy and facilitating conditions. Age, education level, and computer literacy did not significantly moderate the effects of performance expectancy and effort expectancy on adoption intention. The UTAUT results indicated that the factors that might influence intention to adopt e-iANC are potentially addressable. Results suggest that e-iANC might well be accepted by midwives.
78 FR 44954 - Clinical Laboratory Improvement Advisory Committee (CLIAC)
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-25
Department of Health and Human Services, Centers for Disease Control and Prevention: notice of a Clinical Laboratory Improvement Advisory Committee (CLIAC) meeting. The committee advises the Secretary of Health and Human Services; the Assistant Secretary for Health; and the Director, Centers for Disease Control and Prevention, on matters of clinical laboratory quality and laboratory practice.
Putting the Laboratory at the Center of Teaching Chemistry
ERIC Educational Resources Information Center
Bopegedera, A. M. R. P.
2011-01-01
This article describes an effective approach to teaching chemistry by bringing the laboratory to the center of teaching, to bring the excitement of discovery to the learning process. The lectures and laboratories are closely integrated to provide a holistic learning experience. The laboratories progress from verification to open-inquiry and…
ERIC Educational Resources Information Center
Liu, Xiufeng
2006-01-01
Based on current theories of chemistry learning, this study intends to test a hypothesis that computer modeling enhanced hands-on chemistry laboratories are more effective than hands-on laboratories or computer modeling laboratories alone in facilitating high school students' understanding of chemistry concepts. Thirty-three high school chemistry…
Coastal Oceanography in the Beaufort Sea, Summer 1985.
1987-07-01
Report contributors (cover-page residue): Becker and G. R. Garrison, Applied Physics Laboratory, University of Washington, Seattle, Washington 98105; and R. K. Perry, Arctic Submarine Laboratory, Naval Ocean Systems Center, San Diego, California.
Chen, Tun-Chieh; Lin, Wei-Ru; Lu, Po-Liang; Lin, Chun-Yu; Lin, Shu-Hui; Lin, Chuen-Ju; Feng, Ming-Chu; Chiang, Horn-Che; Chen, Yen-Hsu; Huang, Ming-Shyan
2011-06-01
We investigated the impacts of introducing an expedited acid-fast bacilli (AFB) smear laboratory procedure and an automatic, real-time laboratory notification system by short message with mobile phones on delays in prompt isolation of patients with pulmonary tuberculosis (TB). We analyzed the data for all patients with active pulmonary tuberculosis at a hospital in Kaohsiung, Taiwan, a 1,600-bed medical center, during baseline (January 2004 to February 2005) and intervention (July 2005 to August 2006) phases. A total of 96 and 127 patients with AFB-positive TB were reported during the baseline and intervention phases, respectively. There were significant decreases in health care system delays (i.e., laboratory delays: reception of sputum to reporting, P < .001; response delays: reporting to patient isolation, P = .045; and interval from admission to patient isolation, P < .001) during the intervention phase. Significantly fewer nurses were exposed to each patient with active pulmonary TB during the intervention phase (P = .039). Implementation of expedited AFB smear laboratory procedures and an automatic, real-time laboratory mobile notification system significantly decreased delays in the diagnosis and isolation of patients with active TB. Copyright © 2011 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Mosby, Inc. All rights reserved.
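The intervention's notification rule can be sketched as a simple event handler: when the laboratory records an AFB-positive smear, a short message goes out immediately so isolation need not wait for chart review. Here `send_sms` stands in for whatever messaging gateway the hospital used, and the field names are assumptions, not the actual system's schema:

```python
def notify_on_afb_positive(result, send_sms):
    """Push an immediate short-message alert when a laboratory result is an
    AFB-positive smear. `result` is a dict like {"patient_id": ...,
    "test": "AFB smear", "value": "positive"}; `send_sms` is any callable
    (a hypothetical stand-in for the hospital's SMS gateway)."""
    if result.get("test") == "AFB smear" and result.get("value") == "positive":
        send_sms(f"AFB-positive smear for patient {result['patient_id']}: "
                 "initiate airborne isolation")
        return True     # alert sent
    return False        # no alert needed
```

Wiring such a handler to the point where results are committed, rather than to a periodic report, is what removes the reporting-to-isolation delay the study measured.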
Computer Implementation of the Bounding Surface Plasticity Model for Cohesive Soils.
1983-12-01
Contract # DACA 39-79-M-0059; the Civil Engineering Laboratory, Naval Construction Battalion Center (Re: Orders N62583-80-M-R478, N62583-81-M-R320, N62#74...). The corresponding expression in the equation of "Modification 5" (Section 1) is treated in a similar fashion; in general, this arbitrary action does not... Reference: Dafalias, Y.F., L.R. Herrmann and J.S. DeNatale, "Prediction of the Response of the Natural Clays X and Y" (Contract # DACA 39-79-M-0059), April 1980.
2003-04-24
KENNEDY SPACE CENTER, FLA. - Tom Shain, the MER ATLO logistics manager, holds a computer chip with about 35,000 laser-engraved signatures of visitors to the Mars Exploration Rovers at the Jet Propulsion Laboratory. He and Jim Lloyd, also with the program, will place the chip on the second rover to be launched to Mars (MER-1/MER-B); the first rover already has one. The signatures include those of senators, artists, and John Glenn. The identical Mars rovers are scheduled to launch June 5 and June 25 from Cape Canaveral Air Force Station.
Support of the Laboratory for Terrestrial Physics for Dynamics of the Solid Earth (DOSE)
NASA Technical Reports Server (NTRS)
Vandenberg, N. R.; Ma, C. (Technical Monitor)
2002-01-01
This final report summarizes the accomplishments during the contract period. Under the contract, Nepal, Inc. provided support to the VLBI group at NASA's Goddard Space Flight Center. The contract covered a period of approximately eight years during which geodetic and astrometric VLBI evolved through several major changes. This report is divided into five sections that correspond to major task areas in the contract: A) Coordination and Scheduling, B) Field System, C) Station Support, D) Analysis and Research and Development, and E) Computer Support.
The evolving trend in spacecraft health analysis
NASA Technical Reports Server (NTRS)
Kirkpatrick, Russell L.
1993-01-01
The Space Flight Operations Center inaugurated the concept of a central data repository for spacecraft data and the distribution of computing power to the end users for that data's analysis at the Jet Propulsion Laboratory. The Advanced Multimission Operations System is continuing the evolution of this concept as new technologies emerge. Constant improvements in data management tools, data visualization, and hardware lead to ever expanding ideas for improving the analysis of spacecraft health in an era of budget constrained mission operations systems. The foundation of this evolution, its history, and its current plans will be discussed.
2016-11-17
A test unit, or prototype, of NASA's Advanced Plant Habitat (APH) was delivered to the Space Station Processing Facility at the agency's Kennedy Space Center in Florida. Inside a laboratory, Engineering Services Contract engineers set up test parameters on computers. From left, are Glenn Washington, ESC quality engineer; Claton Grosse, ESC mechanical engineer; and Jeff Richards, ESC project scientist. The APH is the largest plant chamber built for the agency. It will have 180 sensors and four times the light output of Veggie. The APH will be delivered to the International Space Station in March 2017.
NASA Technical Reports Server (NTRS)
1996-01-01
Through Goddard Space Flight Center and Jet Propulsion Laboratory Small Business Innovation Research contracts, Irvine Sensors developed a three-dimensional memory system for a spaceborne data recorder and other applications for NASA. From these contracts, the company created the Memory Short Stack product, a patented technology for stacking integrated circuits that offers higher processing speeds and levels of integration, and lower power requirements. The product is a three-dimensional semiconductor package in which dozens of integrated circuits are stacked upon each other to form a cube. The technology is being used in various computer and telecommunications applications.
Majamanda, J; Ndhlovu, P; Shawa, I T
2013-12-01
Tuberculosis (TB) is caused by Mycobacterium tuberculosis and is transmitted mainly through aerosolization of infected sputum, which puts laboratory workers at risk: their risk of infection is 3 to 9 times higher than that of the general public. Laboratory safety should therefore be prioritized and optimized to provide sufficient protection to laboratory workers. The aim was to assess safety for laboratory workers in TB primary microscopy centres in Blantyre urban. TB primary microscopy centres in Blantyre urban were assessed on equipment availability, facility layout, and work practice, using a standardized WHO/AFRO ISO 15189 checklist for developing countries, which sets the minimum safety score at ≥80%. Each centre was graded according to the score it earned upon assessment. Only one (1) microscopy centre out of nine (9) reached the minimum safety requirement. Four (4) centres were awarded 1-star level, four (4) centres were awarded 2-star level, and only one (1) centre was awarded 3-star level. In Blantyre urban, 89% of the tuberculosis microscopy centres are failing to provide the minimum safety to laboratory workers. Government and other stakeholders should be committed to addressing the safety challenges of TB microscopy centres in the country to ensure safety for laboratory workers. It is recommended that the study be conducted at the regional or national level for both public and private laboratories in order to have a general picture of safety in TB microscopy centres across the country.
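The grading scheme described above, a minimum passing score plus star levels, amounts to a threshold lookup. The star band cut-offs below are illustrative placeholders, not the official WHO/AFRO values; only the ≥80% minimum comes from the abstract:

```python
def star_level(score_pct, bands=((95, 5), (85, 4), (75, 3), (65, 2), (55, 1))):
    """Map a checklist score (percent) to a star grade by walking the bands
    from highest to lowest. The cut-offs here are hypothetical examples."""
    for cutoff, stars in bands:
        if score_pct >= cutoff:
            return stars
    return 0    # below the lowest band: no stars

def meets_minimum_safety(score_pct, minimum=80):
    """The abstract's minimum safety requirement is a score of at least 80%."""
    return score_pct >= minimum
```

Under a scheme like this, a centre can earn one or two stars and still fall short of the safety minimum, which matches the study's finding that eight of nine starred centres failed the ≥80% requirement.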
The Russian effort in establishing large atomic and molecular databases
NASA Astrophysics Data System (ADS)
Presnyakov, Leonid P.
1998-07-01
The database activities in Russia have been developed in connection with UV and soft X-ray spectroscopic studies of extraterrestrial and laboratory (magnetically confined and laser-produced) plasmas. Two forms of database production are used: i) a set of computer programs to calculate radiative and collisional data for the general atom or ion, and ii) development of numeric database systems with the data stored in the computer. The first form is preferable for collisional data. At the Lebedev Physical Institute, an appropriate set of codes has been developed. It includes all electronic processes at collision energies from the threshold up to the relativistic limit. The ion-atom (and ion-ion) collisional data are calculated with recently developed methods. The program for calculating level populations and line intensities is used for spectral diagnostics of transparent plasmas. The second form of database production is widely used at the Institute of Physico-Technical Measurements (VNIIFTRI) and the Troitsk Center: the Institute of Spectroscopy and TRINITI. The main results obtained at these centers are reviewed. Plans for future developments jointly with international collaborations are discussed.
The Spider Center Wide File System; From Concept to Reality
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shipman, Galen M; Dillow, David A; Oral, H Sarp
2009-01-01
The Leadership Computing Facility (LCF) at Oak Ridge National Laboratory (ORNL) has a diverse portfolio of computational resources ranging from a petascale XT4/XT5 simulation system (Jaguar) to numerous other systems supporting development, visualization, and data analytics. In order to support the vastly different I/O needs of these systems, Spider, a Lustre-based center-wide file system, was designed and deployed to provide over 240 GB/s of aggregate throughput with over 10 petabytes of formatted capacity. A multi-stage InfiniBand network, dubbed the Scalable I/O Network (SION), with over 889 GB/s of bisectional bandwidth was deployed as part of Spider to provide connectivity to our simulation, development, visualization, and other platforms. To our knowledge, at the time of writing, Spider is the largest and fastest POSIX-compliant parallel file system in production. This paper details the overall architecture of the Spider system, challenges in deploying and initial testing of a file system of this scale, and novel solutions to these challenges, which offer key insights into file system design in the future.
Accelerating Science with the NERSC Burst Buffer Early User Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhimji, Wahid; Bard, Debbie; Romanus, Melissa
NVRAM-based Burst Buffers are an important part of the emerging HPC storage landscape. The National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory recently installed one of the first Burst Buffer systems as part of its new Cori supercomputer, collaborating with Cray on the development of the DataWarp software. NERSC has a diverse user base comprised of over 6500 users in 700 different projects spanning a wide variety of scientific computing applications. The use-cases of the Burst Buffer at NERSC are therefore also considerable and diverse. We describe here performance measurements and lessons learned from the Burst Buffer Early User Program at NERSC, which selected a number of research projects to gain early access to the Burst Buffer and exercise its capability to enable new scientific advancements. To the best of our knowledge this is the first time a Burst Buffer has been stressed at scale by diverse, real user workloads and therefore these lessons will be of considerable benefit to shaping the developing use of Burst Buffers at HPC centers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sheppy, M.; Lobato, C.; Van Geet, O.
2011-12-01
This publication details the design, implementation strategies, and continuous performance monitoring of NREL's Research Support Facility data center. Data centers are energy-intensive spaces that facilitate the transmission, receipt, processing, and storage of digital data. These spaces require redundancies in power and storage, as well as infrastructure, to cool computing equipment and manage the resulting waste heat (Tschudi, Xu, Sartor, and Stein, 2003). Data center spaces can consume more than 100 times the energy of standard office spaces (VanGeet 2011). The U.S. Environmental Protection Agency (EPA) reported that data centers used 61 billion kilowatt-hours (kWh) in 2006, which was 1.5% of the total electricity consumption in the U.S. (U.S. EPA, 2007). Worldwide, data centers now consume more energy annually than Sweden (New York Times, 2009). Given their high energy consumption and conventional operation practices, there is a potential for huge energy savings in data centers. The National Renewable Energy Laboratory (NREL) is world renowned for its commitment to green building construction. In June 2010, the laboratory finished construction of a 220,000-square-foot (ft²), LEED Platinum, Research Support Facility (RSF), which included a 1,900-ft² data center. The RSF will expand to 360,000 ft² with the opening of an additional wing in December 2011. The project's request for proposals (RFP) set a whole-building demand-side energy use requirement of a nominal 35 kBtu/ft² per year. On-site renewable energy generation will offset the annual energy consumption. To support the RSF's energy goals, NREL's new data center was designed to minimize its energy footprint without compromising service quality. Several implementation challenges emerged during the design, construction, and first 11 months of operation of the RSF data center. This document highlights these challenges and describes in detail how NREL successfully overcame them.
The IT settings and strategies outlined in this document have been used to significantly reduce data center energy requirements in the RSF; however, these can also be used in existing buildings and retrofits.
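The cited figures are easy to sanity-check: 61 billion kWh at 1.5% of national consumption implies a U.S. total of roughly 4.07 trillion kWh, and the 35 kBtu/ft² target over the expanded 360,000 ft² translates to about 3.7 million kWh per year (treating the target as site electricity, which is an assumption, since the abstract does not say whether the figure is site or source energy):

```python
DC_KWH_2006 = 61e9       # EPA figure cited for U.S. data centers, 2006
FRACTION = 0.015         # 1.5% of total U.S. electricity consumption

# Back out the implied total U.S. electricity consumption.
implied_total_kwh = DC_KWH_2006 / FRACTION          # ~4.07e12 kWh

# RSF whole-building demand-side target: a nominal 35 kBtu/ft^2 per year
# over the expanded 360,000 ft^2 footprint (1 kWh = 3.412 kBtu).
KBTU_PER_KWH = 3.412
rsf_target_kwh = 35 * 360_000 / KBTU_PER_KWH        # ~3.7e6 kWh per year
```

The arithmetic confirms the cited percentages and units are mutually consistent.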
End-to-end remote sensing at the Science and Technology Laboratory of John C. Stennis Space Center
NASA Technical Reports Server (NTRS)
Kelly, Patrick; Rickman, Douglas; Smith, Eric
1991-01-01
The Science and Technology Laboratory (STL) of Stennis Space Center (SSC) had been developing expertise in remote sensing for more than a decade. Capabilities at SSC/STL include all major areas of the field. STL includes the Sensor Development Laboratory (SDL), the Image Processing Center, a Learjet 23 flight platform, and on-staff scientific investigators.
A strategy for computer-assisted mental practice in stroke rehabilitation.
Gaggioli, Andrea; Meneghini, Andrea; Morganti, Francesca; Alcaniz, Mariano; Riva, Giuseppe
2006-12-01
To investigate the technical and clinical viability of using computer-facilitated mental practice in the rehabilitation of upper-limb hemiparesis following stroke. A single-case study. Academic-affiliated rehabilitation center. A 46-year-old man with stable motor deficit of the upper right limb following subcortical ischemic stroke. Three computer-enhanced mental practice sessions per week at the rehabilitation center, in addition to usual physical therapy. A custom-made virtual reality system equipped with arm-tracking sensors was used to guide mental practice. The system was designed to superimpose over the (unseen) paretic arm a virtual reconstruction of the movement registered from the nonparetic arm. The laboratory intervention was followed by a 1-month home-rehabilitation program, making use of a portable display device. Pretreatment and posttreatment clinical assessment measures were the upper-extremity scale of the Fugl-Meyer Assessment of Sensorimotor Impairment and the Action Research Arm Test. Performance of the affected arm was evaluated using the healthy arm as the control condition. The patient's paretic limb improved after the first phase of intervention, with modest increases after home rehabilitation, as indicated by functional assessment scores and sensor data. Results suggest that technology-supported mental training is a feasible and potentially effective approach for improving motor skills after stroke.
Assessment of the Mars Science Laboratory Entry, Descent, and Landing Simulation
NASA Technical Reports Server (NTRS)
Way, David W.; Davis, J. L.; Shidner, Jeremy D.
2013-01-01
On August 5, 2012, the Mars Science Laboratory rover, Curiosity, successfully landed inside Gale Crater. This landing was only the seventh successful landing and fourth rover to be delivered to Mars. Weighing nearly one metric ton, Curiosity is the largest and most complex rover ever sent to investigate another planet. Safely landing such a large payload required an innovative Entry, Descent, and Landing system, which included the first guided entry at Mars, the largest supersonic parachute ever flown at Mars, and a novel and untested Sky Crane landing system. A complete, end-to-end, six degree-of-freedom, multi-body computer simulation of the Mars Science Laboratory Entry, Descent, and Landing sequence was developed at the NASA Langley Research Center. In-flight data gathered during the successful landing is compared to pre-flight statistical distributions, predicted by the simulation. These comparisons provide insight into both the accuracy of the simulation and the overall performance of the vehicle.
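Comparisons of in-flight data against pre-flight statistical distributions can be illustrated with a simple empirical-percentile check: locate the reconstructed flight value within the Monte Carlo ensemble. The ensemble and observed value below are invented stand-ins, not MSL data.

```python
# Illustrative sketch (not the Langley simulation): comparing a single
# in-flight reconstructed value against a pre-flight Monte Carlo ensemble
# by computing its empirical percentile. All numbers are made up.
import random

random.seed(1)
# Pretend pre-flight Monte Carlo predictions of a landing dispersion (km)
predictions = [random.gauss(0.0, 2.0) for _ in range(5000)]

def empirical_percentile(sample: float, population: list) -> float:
    """Fraction of Monte Carlo cases at or below the observed value."""
    return sum(1 for x in population if x <= sample) / len(population)

observed = 1.0  # hypothetical reconstructed flight value
p = empirical_percentile(observed, predictions)
print(f"Observed value falls at the {100 * p:.0f}th percentile of predictions")
```

A flight value landing near the middle of the pre-flight distribution is evidence the simulation's dispersions were realistic; a value in an extreme tail flags a modeling gap.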
Preliminary assessment of the Mars Science Laboratory entry, descent, and landing simulation
NASA Astrophysics Data System (ADS)
Way, David W.
On August 5, 2012, the Mars Science Laboratory rover, Curiosity, successfully landed inside Gale Crater. This landing was the seventh successful landing and fourth rover to be delivered to Mars. Weighing nearly one metric ton, Curiosity is the largest and most complex rover ever sent to investigate another planet. Safely landing such a large payload required an innovative Entry, Descent, and Landing system, which included the first guided entry at Mars, the largest supersonic parachute ever flown at Mars, and the novel Sky Crane landing system. A complete, end-to-end, six degree-of-freedom, multi-body computer simulation of the Mars Science Laboratory Entry, Descent, and Landing sequence was developed at the NASA Langley Research Center. In-flight data gathered during the successful landing is compared to pre-flight statistical distributions, predicted by the simulation. These comparisons provide insight into both the accuracy of the simulation and the overall performance of the Entry, Descent, and Landing system.
Modeling Laboratory Astrophysics Experiments using the CRASH code
NASA Astrophysics Data System (ADS)
Trantham, Matthew; Drake, R. P.; Grosskopf, Michael; Bauerle, Matthew; Kuranz, Carolyn; Keiter, Paul; Malamud, Guy; Crash Team
2013-10-01
The understanding of high-energy-density systems can be advanced by laboratory astrophysics experiments, and computer simulations can assist in the design and analysis of these experiments. The Center for Radiative Shock Hydrodynamics (CRASH) at the University of Michigan developed a code that has been used to design and analyze high-energy-density experiments on OMEGA, NIF, and other large laser facilities. This Eulerian code uses block-adaptive mesh refinement (AMR) with implicit multigroup radiation transport and electron heat conduction. This poster/talk will demonstrate some of the experiments the CRASH code has helped design or analyze, including radiative shock, Kelvin-Helmholtz, Rayleigh-Taylor, plasma sheet, and interacting jet experiments. This work is funded by the Predictive Science Academic Alliance Program in NNSA-ASC via grant DE-FC52-08NA28616, by the NNSA-DS and SC-OFES Joint Program in High-Energy-Density Laboratory Plasmas, grant number DE-FG52-09NA29548, and by the National Laser User Facility Program, grant number DE-NA0000850.
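Block-AMR codes of this kind decide where to refine from local solution structure. A toy criterion (illustrative only, not CRASH's actual refinement logic) might flag a mesh block when the relative variation of a field across it exceeds a threshold:

```python
# Toy block-AMR refinement criterion (illustrative, not CRASH's):
# flag a mesh block for refinement when the relative variation of a
# field (here, density) across the block exceeds a threshold.

def needs_refinement(block_density, threshold=0.2):
    """Return True if the density contrast across the block is large."""
    lo, hi = min(block_density), max(block_density)
    return (hi - lo) / hi > threshold

print(needs_refinement([1.00, 1.02, 1.01]))  # smooth region -> False
print(needs_refinement([1.0, 1.5, 3.0]))     # shock-like jump -> True
```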
PLT Polansky looks through hatch at U.S. Laboratory / Destiny module
2001-02-11
STS98-E-5115 (11 February 2001) --- This medium shot, photographed with a digital still camera, shows STS-98 pilot Mark L. Polansky looking through the observation port on Unity's closed hatch to the newly attached Destiny laboratory. The crews of Atlantis and the International Space Station opened the laboratory shortly after this photo was made on Feb. 11; and the astronauts and cosmonauts spent the first full day of what are planned to be years of work ahead inside the orbiting science and command center. Station commander William M. (Bill) Shepherd opened the Destiny hatch, and he and shuttle commander Kenneth D. Cockrell ventured inside at 8:38 a.m. (CST), Feb. 11. As depicted in subsequent digital images in this series, members of both crews went to work quickly inside the new module, activating air systems, fire extinguishers, alarm systems, computers and internal communications. The crew also continued equipment transfers from the shuttle to the station.
Preliminary Assessment of the Mars Science Laboratory Entry, Descent, and Landing Simulation
NASA Technical Reports Server (NTRS)
Way, David W.
2013-01-01
On August 5, 2012, the Mars Science Laboratory rover, Curiosity, successfully landed inside Gale Crater. This landing was only the seventh successful landing and fourth rover to be delivered to Mars. Weighing nearly one metric ton, Curiosity is the largest and most complex rover ever sent to investigate another planet. Safely landing such a large payload required an innovative Entry, Descent, and Landing system, which included the first guided entry at Mars, the largest supersonic parachute ever flown at Mars, and a novel and untested Sky Crane landing system. A complete, end-to-end, six degree-of-freedom, multibody computer simulation of the Mars Science Laboratory Entry, Descent, and Landing sequence was developed at the NASA Langley Research Center. In-flight data gathered during the successful landing is compared to pre-flight statistical distributions, predicted by the simulation. These comparisons provide insight into both the accuracy of the simulation and the overall performance of the vehicle.
CDR Shepherd looks in hatch at U.S. Laboratory / Destiny module
2001-02-11
STS98-E-5121 (11 February 2001) --- This digital still camera shot shows Expedition One commander William M. (Bill) Shepherd looking through the observation port on Unity's closed hatch to the newly attached Destiny laboratory. Astronauts Kenneth D. Cockrell and Mark L. Polansky appear at the left and right edges, respectively. The crews of Atlantis and the International Space Station opened the laboratory shortly after this photo was made on Feb. 11, and the astronauts and cosmonauts spent the first full day of what are planned to be years of work ahead inside the orbiting science and command center. Shepherd opened the Destiny hatch, and he and shuttle commander Cockrell ventured inside at 8:38 a.m. (CST), Feb. 11. As depicted in subsequent digital images in this series, members of both crews went to work quickly inside the new module, activating air systems, fire extinguishers, alarm systems, computers and internal communications. The crew also continued equipment transfers from the shuttle to the station.
Energy Department Announces National Bioenergy Center
The Department of Energy's National Renewable Energy Laboratory (NREL) in Golden, Colo., and Oak Ridge National Laboratory (ORNL) in Oak Ridge, Tenn., will lead the Bioenergy Center. The center will link DOE-funded biomass
The University of Mississippi Geoinformatics Center (UMGC)
NASA Technical Reports Server (NTRS)
Easson, Gregory L.
2003-01-01
The overarching goal of the University of Mississippi Geoinformatics Center (UMGC) is to promote application of geospatial information technologies through technology education, research support, and infrastructure development. During the initial two-year phase of operation the UMGC has successfully met those goals and is uniquely positioned to continue operation and further expand the UMGC into additional academic programs. At the end of the first funding cycle, the goals of the UMGC have been and are being met through research and educational activities in the original four participating programs: Biology, Computer and Information Science, Geology and Geological Engineering, and Sociology and Anthropology, with the School of Business joining the UMGC in early 2001. Each of these departments is supporting graduate students conducting research, has created combination teaching and research laboratories, and has supported faculty during the summer months.
Wide band fiber-optic communications
NASA Technical Reports Server (NTRS)
Bates, Harry E.
1993-01-01
A number of optical communication lines are now in use at the Kennedy Space Center (KSC) for the transmission of voice, computer data, and video signals. At the present time most of these channels utilize a single carrier wavelength centered near 1300 nm. As a result of previous work, the bandwidth capacity of a number of these channels is being increased by transmitting another signal in the 1550 nm region on the same fiber. This is accomplished by means of wavelength division multiplexing (WDM). It is therefore important to understand the bandwidth properties of the installed fiber plant. This work developed new procedures for measuring the bandwidth of fibers in both the 1300 nm and 1550 nm regions. In addition, a preliminary study of fiber links terminating in the Engineering Development Laboratory was completed.
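For context, the usable bandwidth of such a link at a given wavelength is often limited by chromatic dispersion. A common back-of-the-envelope estimate (assumed here; this is not the KSC measurement procedure) divides a constant by the dispersion-induced pulse spread:

```python
# Back-of-the-envelope sketch (assumed textbook formulas, not the KSC
# procedure): chromatic-dispersion-limited bandwidth of a fiber link.
# Pulse spread: dt = |D| * L * d_lambda; usable bandwidth ~ 0.44 / dt.

def dispersion_limited_bw_ghz(D_ps_nm_km: float, length_km: float,
                              source_width_nm: float) -> float:
    dt_ps = abs(D_ps_nm_km) * length_km * source_width_nm  # spread in ps
    return 0.44 / dt_ps * 1000.0  # 0.44/dt in 1/ps -> GHz

# Standard single-mode fiber: D ~ 0 ps/(nm*km) near 1310 nm, ~17 near 1550 nm,
# which is why adding a 1550 nm WDM channel changes the bandwidth picture.
bw_1550 = dispersion_limited_bw_ghz(17.0, 10.0, 2.0)  # 10 km link, 2 nm source
print(f"~{bw_1550:.2f} GHz dispersion-limited bandwidth at 1550 nm")
```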
2001-06-05
This computer-generated image depicts the Materials Science Research Rack-1 (MSRR-1) being developed by NASA's Marshall Space Flight Center and the European Space Agency (ESA) for placement in the Destiny laboratory module aboard the International Space Station. The rack is part of the planned Materials Science Research Facility (MSRF) and is expected to include two furnace module inserts, a Quench Module Insert (being developed by NASA's Marshall Space Flight Center) to study directional solidification in rapidly cooled alloys and a Diffusion Module Insert (being developed by the European Space Agency) to study crystal growth, and a transparent furnace (being developed by NASA's Space Product Development program). Multi-user equipment in the rack is being developed under the auspices of NASA's Office of Biological and Physical Research (OBPR) and ESA. Key elements are labeled in other images (0101754, 0101830, and TBD).
2001-06-05
This computer-generated image depicts the Materials Science Research Rack-1 (MSRR-1) being developed by NASA's Marshall Space Flight Center and the European Space Agency (ESA) for placement in the Destiny laboratory module aboard the International Space Station. The rack is part of the planned Materials Science Research Facility (MSRF) and is expected to include two furnace module inserts, a Quench Module Insert (being developed by NASA's Marshall Space Flight Center) to study directional solidification in rapidly cooled alloys and a Diffusion Module Insert (being developed by the European Space Agency) to study crystal growth, and a transparent furnace (being developed by NASA's Space Product Development program). Multi-user equipment in the rack is being developed under the auspices of NASA's Office of Biological and Physical Research (OBPR) and ESA. Key elements are labeled in other images (0101754, 0101829, 0101830).
2001-06-05
This computer-generated image depicts the Materials Science Research Rack-1 (MSRR-1) being developed by NASA's Marshall Space Flight Center and the European Space Agency (ESA) for placement in the Destiny laboratory module aboard the International Space Station. The rack is part of the planned Materials Science Research Facility (MSRF) and is expected to include two furnace module inserts, a Quench Module Insert (being developed by NASA's Marshall Space Flight Center) to study directional solidification in rapidly cooled alloys and a Diffusion Module Insert (being developed by the European Space Agency) to study crystal growth, and a transparent furnace (being developed by NASA's Space Product Development program). Multi-user equipment in the rack is being developed under the auspices of NASA's Office of Biological and Physical Research (OBPR) and ESA. A larger image is available without labels (No. 0101755).
1996 NASA-ASEE-Stanford Summer Faculty Fellowship Program. Part 1
NASA Technical Reports Server (NTRS)
1996-01-01
As is customary, the final technical report for the NASA-ASEE Summer Faculty Fellowship Program at the Ames Research Center, Dryden Flight Research Center and Stanford University essentially consists of a compilation of the summary technical reports of all the fellows. More extended versions done either as NASA publications, archival papers, or other laboratory reports are not included here. The reader will note that the areas receiving emphasis were the life sciences, astronomy, remote sensing, aeronautics, fluid dynamics/aerophysics, and computer science. Of course, the areas of emphasis vary somewhat from year to year depending on the interests of the most qualified applicants. Once again, the work is of especially high quality. The reports of the first and second year fellows are grouped separately and are arranged alphabetically within each group.
Relativistic Collisions of Highly-Charged Ions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ionescu, Dorin; Belkacem, Ali
1998-11-19
The physics of elementary atomic processes in relativistic collisions between highly-charged ions and atoms or other ions is briefly discussed, and some recent theoretical and experimental results in this field are summarized. They include excitation, capture, ionization, and electron-positron pair creation. The numerical solution of the two-center Dirac equation in momentum space is shown to be a powerful nonperturbative method for describing atomic processes in relativistic collisions involving heavy and highly-charged ions. By propagating negative-energy wave packets in time, the evolution of the QED vacuum around heavy ions in relativistic motion is investigated. Recent results obtained from numerical calculations using massively parallel processing on the Cray-T3E supercomputer of the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory are presented.
Nielsen, Jens E.; Gunner, M. R.; Bertrand García-Moreno, E.
2012-01-01
The pKa Cooperative http://www.pkacoop.org was organized to advance development of accurate and useful computational methods for structure-based calculation of pKa values and electrostatic energy in proteins. The Cooperative brings together laboratories with expertise and interest in theoretical, computational and experimental studies of protein electrostatics. To improve structure-based energy calculations it is necessary to better understand the physical character and molecular determinants of electrostatic effects. The Cooperative thus intends to foment experimental research into fundamental aspects of proteins that depend on electrostatic interactions. It will maintain a depository for experimental data useful for critical assessment of methods for structure-based electrostatics calculations. To help guide the development of computational methods the Cooperative will organize blind prediction exercises. As a first step, computational laboratories were invited to reproduce an unpublished set of experimental pKa values of acidic and basic residues introduced in the interior of staphylococcal nuclease by site-directed mutagenesis. The pKa values of these groups are unique and challenging to simulate owing to the large magnitude of their shifts relative to normal pKa values in water. Many computational methods were tested in this 1st Blind Prediction Challenge and critical assessment exercise. A workshop was organized in the Telluride Science Research Center to assess objectively the performance of many computational methods tested on this one extensive dataset. This volume of PROTEINS: Structure, Function, and Bioinformatics introduces the pKa Cooperative, presents reports submitted by participants in the blind prediction challenge, and highlights some of the problems in structure-based calculations identified during this exercise. PMID:22002877
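The large pKa shifts that made this dataset challenging map directly onto electrostatic free energies through a standard thermodynamic relation, ΔΔG = 2.303·RT·ΔpKa, i.e. roughly 1.36 kcal/mol per pKa unit at 298 K. A minimal sketch of that conversion (a textbook relation, not a Cooperative-specific method):

```python
# Sketch of the standard thermodynamic relation between a pKa shift and
# the corresponding free-energy change: ddG = 2.303 * R * T * dpKa.
# This is a textbook conversion, not a pKa Cooperative method.
import math

R_KCAL = 1.987e-3  # gas constant, kcal/(mol*K)

def ddG_from_pka_shift(dpka: float, temp_k: float = 298.15) -> float:
    """Electrostatic free-energy cost (kcal/mol) of a pKa shift."""
    return math.log(10) * R_KCAL * temp_k * dpka

# e.g. a buried basic residue shifted by 5 pKa units from its value in water:
print(f"{ddG_from_pka_shift(5.0):.1f} kcal/mol")  # ~6.8 kcal/mol
```

This is why shifts of several pKa units are so demanding to reproduce: each unit of error in the prediction corresponds to well over a kcal/mol of electrostatic energy.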
NASA Technical Reports Server (NTRS)
Schoenenberger, Mark; VanNorman, John; Rhode, Matthew; Paulson, John
2013-01-01
On August 5, 2012, the Mars Science Laboratory (MSL) entry capsule successfully entered Mars' atmosphere and landed the Curiosity rover in Gale Crater. The capsule used a reaction control system (RCS) consisting of four pairs of hydrazine thrusters to fly a guided entry. The RCS provided bank control to fly along a flight path commanded by an onboard computer and also damped unwanted rates due to atmospheric disturbances and any dynamic instabilities of the capsule. A preliminary assessment of the MSL's flight data from entry showed that the capsule flew much as predicted. This paper will describe how the MSL aerodynamics team used engineering analyses, computational codes, and wind tunnel testing in concert to develop the RCS system and certify it for flight. Over the course of MSL's development, the RCS configuration underwent a number of design iterations to accommodate mechanical constraints, aeroheating concerns, and excessive aero/RCS interactions. A brief overview of the MSL RCS configuration design evolution is provided. Then, a brief description is presented of how the computational predictions of RCS jet interactions were validated. The primary work to certify that the RCS interactions were acceptable for flight was centered on validating computational predictions at hypersonic speeds. A comparison of computational fluid dynamics (CFD) predictions to wind tunnel force and moment data gathered in the NASA Langley 31-Inch Mach 10 Tunnel was the linchpin to validating the CFD codes used to predict aero/RCS interactions. Using the CFD predictions and experimental data, an interaction model was developed for Monte Carlo analyses using 6-degree-of-freedom trajectory simulation. The interaction model used in the flight simulation is presented.
UC Merced Center for Computational Biology Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Colvin, Michael; Watanabe, Masakatsu
Final report for the UC Merced Center for Computational Biology. The Center for Computational Biology (CCB) was established to support multidisciplinary scientific research and academic programs in computational biology at the new University of California campus in Merced. In 2003, the growing gap between biology research and education was documented in a report from the National Academy of Sciences, Bio2010: Transforming Undergraduate Education for Future Research Biologists. We believed that a new type of biological sciences undergraduate and graduate program that emphasized biological concepts and considered biology as an information science would have a dramatic impact in enabling the transformation of biology. UC Merced, the newest UC campus and the first new U.S. research university of the 21st century, was ideally suited to adopt an alternate strategy: to create a new Biological Sciences major and graduate group that incorporated the strong computational and mathematical vision articulated in the Bio2010 report. CCB aimed to leverage this strong commitment at UC Merced to develop a new educational program based on the principle of biology as a quantitative, model-driven science. We also expected that the center would enable the dissemination of computational biology course materials to other universities and feeder institutions, and foster research projects that exemplify a mathematical and computation-based approach to the life sciences. As this report describes, the CCB has been successful in achieving these goals, and multidisciplinary computational biology is now an integral part of UC Merced undergraduate, graduate, and research programs in the life sciences. The CCB began in fall 2004 with the aid of an award from the U.S. Department of Energy (DOE), under its Genomes to Life program of support for the development of research and educational infrastructure in the modern biological sciences.
This report to DOE describes the research and academic programs made possible by the CCB from its inception until August 2010, the end of the final extension. Although DOE support for the center ended in August 2010, the CCB will continue to exist and support its original objectives. The research and academic programs fostered by the CCB have led to additional extramural funding from other agencies, and we anticipate that CCB will continue to support quantitative and computational biology programs at UC Merced for many years to come. Since its inception in fall 2004, CCB research projects have maintained multi-institutional collaborations with Lawrence Livermore National Laboratory (LLNL) and the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign, as well as with individual collaborators at other sites. CCB-affiliated faculty cover a broad range of computational and mathematical research including molecular modeling, cell biology, applied math, evolutionary biology, bioinformatics, etc. The CCB sponsored the first distinguished speaker series at UC Merced, which had an important role in spreading the word about the computational biology emphasis at this new campus. One of CCB's original goals was to help train a new generation of biologists who bridge the gap between the computational and life sciences. To achieve this goal, a summer undergraduate internship program was established under CCB by summer 2006 to train biological science researchers in highly mathematical and computationally intensive methods. By the end of summer 2010, 44 undergraduate students had gone through this program. Of those participants, 11 students have been admitted to graduate schools and 10 more students are interested in pursuing graduate studies in the sciences.
The center is also continuing to facilitate the development and dissemination of undergraduate and graduate course materials based on the latest research in computational biology.
Micro-CT images reconstruction and 3D visualization for small animal studying
NASA Astrophysics Data System (ADS)
Gong, Hui; Liu, Qian; Zhong, Aijun; Ju, Shan; Fang, Quan; Fang, Zheng
2005-01-01
A small-animal x-ray micro computed tomography (micro-CT) system has been constructed to screen laboratory small animals and organs. The micro-CT system consists of dual fiber-optic taper-coupled CCD detectors with a field of view of 25×50 mm², a microfocus x-ray source, and a rotational subject holder. For accurate localization of the rotation center, the coincidence between the axis of rotation and the center of the image was calibrated with a polymethylmethacrylate cylinder. Feldkamp's filtered back-projection cone-beam algorithm was adopted for three-dimensional reconstruction, since the effective cone-beam angle of the micro-CT system is 5.67°. A 200×1024×1024 matrix of micro-CT data was obtained with a magnification of 1.77 and a pixel size of 31×31 μm². In our reconstruction software, the output image size of the micro-CT slice data, the magnification factor, and the sample rotation step can be modified to balance computational efficiency against the reconstruction region. The reconstructed image data are processed and visualized with the Visualization Toolkit (VTK). Surface rendering of the reconstructed data is parallelized in VTK to improve computing speed: processing a 512×512×512 dataset takes about 1/20 the time of the serial program when 30 CPUs are used. The voxel size is 54×54×108 μm³. Reconstruction and 3-D visualization images of a laboratory rat ear are presented.
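The geometric quantities quoted above (magnification, effective pixel size, cone angle) follow from the standard cone-beam relations: magnification is the source-to-detector distance over the source-to-object distance, and the effective pixel size at the object is the detector pitch divided by the magnification. The distances in this sketch are hypothetical, chosen only to reproduce a magnification near 1.77; this is not the authors' calibration code.

```python
# Cone-beam micro-CT geometry sketch (illustrative; distances are invented):
# M = SDD / SOD, effective pixel at the object = detector pitch / M.
import math

def magnification(sdd_mm: float, sod_mm: float) -> float:
    """Source-to-detector distance over source-to-object distance."""
    return sdd_mm / sod_mm

def effective_pixel_um(detector_pitch_um: float, m: float) -> float:
    """Detector pixel pitch projected back to the object plane."""
    return detector_pitch_um / m

def cone_half_angle_deg(detector_half_height_mm: float, sdd_mm: float) -> float:
    """Half-angle of the cone beam subtended by the detector."""
    return math.degrees(math.atan(detector_half_height_mm / sdd_mm))

m = magnification(sdd_mm=354.0, sod_mm=200.0)  # hypothetical distances
print(f"M = {m:.2f}, effective pixel ~ {effective_pixel_um(54.9, m):.1f} um")
```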
High-Performance Computing Data Center | Energy Systems Integration Facility | NREL
The Energy Systems Integration Facility's High-Performance Computing Data Center is home to Peregrine, the largest high-performance computing system in the world exclusively dedicated to advancing…
Conventional Microscopy vs. Computer Imagery in Chiropractic Education.
Cunningham, Christine M; Larzelere, Elizabeth D; Arar, Ilija
2008-01-01
As human tissue pathology slides become increasingly difficult to obtain, other methods of teaching microscopy in educational laboratories must be considered. The purpose of this study was to evaluate our students' satisfaction with newly implemented computer-imagery-based laboratory instruction and to obtain input from their perspective on the advantages and disadvantages of computerized vs. traditional microscope laboratories. This undertaking involved the creation of a new computer laboratory. Robbins and Cotran Pathologic Basis of Disease, 7th ed., was chosen as the required text, which gave students access to the Robbins Pathology website, including the complete content of the text, the Interactive Case Study Companion, and the Virtual Microscope. Students had experience with traditional microscopes in their histology and microbiology laboratory courses. Student satisfaction with computer-based learning was assessed using a 28-question survey which was administered to three successive trimesters of pathology students (n=193) using the computer survey website Zoomerang. Answers were given on a scale of 1-5 and statistically analyzed using weighted averages. The survey data indicated that students were satisfied with computer-based learning activities during pathology laboratory instruction. The most favorable aspect of computer imagery was 24-7 availability (weighted avg. 4.16), followed by clarification offered by accompanying text and captions (weighted avg. 4.08). Although advantages and disadvantages exist in using conventional microscopy and computer imagery, current pathology teaching environments warrant investigation of replacing traditional microscope exercises with computer applications. Chiropractic students supported the adoption of computer-assisted instruction in pathology laboratories.
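The weighted averages quoted for the 1-5 survey items are computed in the usual way: sum each response value times its count, divided by the number of respondents. A minimal sketch (the response counts below are invented, not the study's data):

```python
# Minimal sketch of weighted-average scoring for 1-5 Likert survey items.
# The response counts are invented for illustration, not the study's data.

def weighted_average(counts: dict) -> float:
    """counts maps response value (1-5) -> number of respondents."""
    total = sum(counts.values())
    return sum(score * n for score, n in counts.items()) / total

# e.g. a hypothetical item answered by 100 students:
counts = {5: 45, 4: 35, 3: 12, 2: 5, 1: 3}
print(f"weighted avg = {weighted_average(counts):.2f}")  # 4.14
```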
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carter, Thomas; Liu, Zan; Sickinger, David
The Thermosyphon Cooler Hybrid System (TCHS) integrates the control of a dry heat rejection device, the thermosyphon cooler (TSC), with an open cooling tower. A combination of equipment and controls, this new heat rejection system embraces the "smart use of water," using evaporative cooling when it is most advantageous and then saving water and modulating toward increased dry sensible cooling as system operations and ambient weather conditions permit. Innovative fan control strategies ensure the most economical balance between water savings and parasitic fan energy. The unique low-pressure-drop design of the TSC allows water to be cooled directly by the TSC evaporator without risk of bursting tubes in subfreezing ambient conditions. Johnson Controls partnered with the National Renewable Energy Laboratory (NREL) and Sandia National Laboratories to deploy the TSC as a test bed at NREL's high-performance computing (HPC) data center in the first half of 2016. Located in NREL's Energy Systems Integration Facility (ESIF), this HPC data center has achieved an annualized average power usage effectiveness rating of 1.06 or better since 2012. Warm-water liquid cooling is used to capture heat generated by computer systems direct to water; that waste heat is either reused as the primary heat source in the ESIF building or rejected using evaporative cooling. This data center is the single largest source of water and power demand on the NREL campus, using about 7,600 m³ (2.0 million gal) of water during the past year with an hourly average IT load of nearly 1 MW (3.4 million Btu/h), so dramatically reducing water use while continuing efficient data center operations is of significant interest. Because Sandia's climate is similar to NREL's, this new heat rejection system being deployed at NREL has gained interest at Sandia. Sandia's data centers utilize an hourly average of 8.5 MW (29 million Btu/h) and are also one of the largest consumers of water on Sandia's site.
In addition to describing the installation of the TSC and its integration into the ESIF, this paper focuses on the full heat rejection system simulation program used for hourly analysis of the energy and water consumption of the complete system under varying operating scenarios. A follow-up paper will detail the test results. The evaluation of the TSC's performance at NREL will also determine a path forward at Sandia for possible deployment in a large-scale system, not only for data center use but also possibly site wide.
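The two headline metrics in this kind of analysis are power usage effectiveness (PUE, total facility energy over IT energy) and water usage effectiveness (WUE, annual site water over IT energy). The sketch below applies the standard definitions to inputs loosely scaled from the figures quoted above (a ~1 MW average IT load, ~7,600 m³/yr of water); it is illustrative, not the paper's simulation program.

```python
# Hedged sketch of the standard PUE and WUE data center metrics.
# Inputs are illustrative, loosely scaled to the figures quoted for ESIF.

def pue(total_facility_kwh: float, it_kwh: float) -> float:
    """Power usage effectiveness: total facility energy / IT energy."""
    return total_facility_kwh / it_kwh

def wue_l_per_kwh(annual_water_m3: float, it_kwh: float) -> float:
    """Water usage effectiveness in liters per kWh of IT energy."""
    return annual_water_m3 * 1000.0 / it_kwh  # m^3 -> liters

it_kwh = 1_000.0 * 8760  # ~1 MW average IT load over a year
print(f"PUE = {pue(1.06 * it_kwh, it_kwh):.2f}")
print(f"WUE = {wue_l_per_kwh(7600, it_kwh):.2f} L/kWh")
```

Modulating from evaporative toward dry sensible cooling trades WUE against PUE: less water is evaporated, but fan energy rises, which is exactly the balance the hourly simulation explores.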
Bridging the PSI Knowledge Gap: A Multi-Scale Approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wirth, Brian D.
2015-01-08
Plasma-surface interactions (PSI) pose an immense scientific hurdle in magnetic confinement fusion, and our present understanding of PSI in confinement environments is highly inadequate; indeed, a recent Fusion Energy Sciences Advisory Committee report found that 4 of the top 5 fusion knowledge gaps were related to PSI. The time is appropriate to develop a concentrated and synergistic science effort that would expand, exploit, and integrate the wealth of laboratory ion-beam and plasma research, as well as exciting new computational tools, toward the goal of bridging the PSI knowledge gap. This effort would broadly advance plasma and material sciences, while providing critical knowledge toward progress in fusion PSI. This project involves the development of a Science Center focused on a new approach to PSI science; an approach that exploits access to state-of-the-art PSI experiments and modeling, as well as confinement devices. The organizing principle is to develop synergistic experimental and modeling tools that treat the truly coupled multi-scale aspect of the PSI issues in confinement devices. This is motivated by the simple observation that while typical lab experiments and models allow independent manipulation of controlling variables, the confinement PSI environment is essentially self-determined with few outside controls. This means that processes that may be treated independently in laboratory experiments, because they involve vastly different physical and time scales, will now affect one another in the confinement environment. Also, lab experiments cannot simultaneously match all exposure conditions found in confinement devices, typically forcing a linear extrapolation of lab results. At the same time, programmatic limitations prevent confinement experiments alone from answering many key PSI questions.
The resolution to this problem is to usefully exploit access to PSI science in lab devices, while retooling our thinking from a linear and de-coupled extrapolation to a multi-scale, coupled approach. The PSI Plasma Center consisted of three equal co-centers; one located at the MIT Plasma Science and Fusion Center, one at UC San Diego Center for Energy Research and one at the UC Berkeley Department of Nuclear Engineering, which moved to the University of Tennessee, Knoxville (UTK) with Professor Brian Wirth in July 2010. The Center had three co-directors: Prof. Dennis Whyte led the MIT co-center, the UCSD co-center was led by Dr. Russell Doerner, and Prof. Brian Wirth led the UCB/UTK center. The directors have extensive experience in PSI and material research, and have been internationally recognized in the magnetic fusion, materials and plasma research fields. The co-centers feature keystone PSI experimental and modeling facilities dedicated to PSI science: the DIONISOS/CLASS facility at MIT, the PISCES facility at UCSD, and the state-of-the-art numerical modeling capabilities at UCB/UTK. A collaborative partner in the center is Sandia National Laboratory at Livermore (SNL/CA), which has extensive capabilities with low energy ion beams and surface diagnostics, as well as supporting plasma facilities, including the Tritium Plasma Experiment, all of which significantly augment the Center. Interpretive, continuum material models are available through SNL/CA, UCSD and MIT. The participating institutions of MIT, UCSD, UCB/UTK, SNL/CA and LLNL brought a formidable array of experimental tools and personnel abilities into the PSI Plasma Center. Our work has focused on modeling activities associated with plasma surface interactions that are involved in effects of He and H plasma bombardment on tungsten surfaces. This involved performing computational material modeling of the surface evolution during plasma bombardment using molecular dynamics modeling. 
The principal outcomes of the research efforts within the combined experimental and modeling PSI center are to provide a knowledge base of the mechanisms of surface degradation and of the influence of the surface on plasma conditions.
Lunar ephemeris and selenographic coordinates of the earth and sun for 1971 and 1972
NASA Technical Reports Server (NTRS)
Hartung, A. D.
1972-01-01
Ephemeris data are presented for each month of 1971 and 1972 to provide a time history of lunar coordinates and related geometric information. A NASA Manned Spacecraft Center modification of the Jet Propulsion Laboratory ephemeris tape was used to calculate and plot coordinates of the earth, moon, and sun. The ephemeris is referenced to the mean vernal equinox at the nearest beginning of a Besselian year. Therefore, the reference equinox changes from one year to the next between 30 June and 1 July. The apparent discontinuity in the data is not noticeable in the graphical presentation, but can be observed in the digital output. The mean equator of epoch is used in all cases. The computer program used to compute and plot the ephemeris data is described.
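The reference-equinox convention described above can be made concrete. The sketch below (not part of the original report) uses the standard relation between Julian Date and Besselian epoch to show why an ephemeris referenced to "the nearest beginning of a Besselian year" switches equinox between 30 June and 1 July; the function names are our own.

```python
# Besselian epoch from Julian Date, using the standard relation
# B = 1900.0 + (JD - 2415020.31352) / 365.242198781
# (B1900.0 corresponds to JD 2415020.31352).

def besselian_epoch(jd: float) -> float:
    return 1900.0 + (jd - 2415020.31352) / 365.242198781

def nearest_besselian_year(jd: float) -> int:
    # The "nearest beginning of a Besselian year" flips mid-year,
    # around 30 June / 1 July.
    return round(besselian_epoch(jd))

# JD 2441317.5 is 1972 Jan 1.0 UT, just before B1972.0
print(nearest_besselian_year(2441317.5))  # prints 1972
# Half a year later the nearest boundary is B1973.0
print(nearest_besselian_year(2441500.5))  # prints 1973
```

The mid-year jump in `nearest_besselian_year` is the apparent discontinuity the abstract mentions: tables for the first half of a year and the second half are referenced to different equinoxes.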
Flowfield visualization for SSME hot gas manifold
NASA Technical Reports Server (NTRS)
Roger, Robert P.
1988-01-01
The objective of this research, as defined by NASA-Marshall Space Flight Center, was two-fold: (1) to numerically simulate viscous subsonic flow in a proposed elliptical two-duct version of the fuel-side Hot Gas Manifold (HGM) for the Space Shuttle Main Engine (SSME), and (2) to provide analytical support for SSME-related numerical computational experiments being performed by the Computational Fluid Dynamics staff in the Aerophysics Division of the Structures and Dynamics Laboratory at NASA-MSFC. Numerical results of the HGM calculations were to complement both water-flow and air-flow visualization experiments in two-duct geometries performed at NASA-MSFC and Rocketdyne. In addition, code modification and improvement efforts were to strengthen the CFD capabilities of NASA-MSFC for producing reliable predictions of flow environments within the SSME.
Final Report for DOE Award ER25756
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kesselman, Carl
2014-11-17
The SciDAC-funded Center for Enabling Distributed Petascale Science (CEDPS) was established to address technical challenges that arise due to the frequent geographic distribution of data producers (in particular, supercomputers and scientific instruments) and data consumers (people and computers) within the DOE laboratory system. Its goal is to produce technical innovations that meet DOE end-user needs for (a) rapid and dependable placement of large quantities of data within a distributed high-performance environment, and (b) the convenient construction of scalable science services that provide for the reliable and high-performance processing of computation and data analysis requests from many remote clients. The Center is also addressing (c) the important problem of troubleshooting these and other related ultra-high-performance distributed activities from the perspective of both performance and functionality.
Lunar ephemeris and selenographic coordinates of the earth and sun for 1973 and 1974
NASA Technical Reports Server (NTRS)
Hartung, A. D.
1972-01-01
Ephemeris data are presented for each month of 1973 and 1974 to provide a time history of lunar coordinates and related geometric information. A NASA Manned Spacecraft Center modification of the Jet Propulsion Laboratory ephemeris tape was used to calculate and plot coordinates of the earth, moon, and sun. The ephemeris is referenced to the mean vernal equinox at the nearest beginning of a Besselian year. Therefore, the reference equinox changes from one year to the next between 30 June and 1 July. The apparent discontinuity in the data is not noticeable in the graphical presentation, but can be observed in the digital output. The mean equator of epoch is used in all cases. The computer program used to compute and plot the ephemeris data is described.
NASA Technical Reports Server (NTRS)
Prasad, Sheo S.; Lee, Timothy J.
1994-01-01
The possible existence and chemistry of ClO·O2 was originally proposed to explain the Norrish-Neville effect, in which O2 suppresses the chlorine-photosensitized loss of ozone. It was also thought that ClO·O2 might have some significance for atmospheric chemistry. Recently, doubts have been cast on this proposal, because certain laboratory data seem to imply that the equilibrium constant of the title reaction is so small that ClO·O2 may be too unstable to matter. However, those data create only a superficial illusion to that effect, because on closer analysis they do not disprove a moderately stable and chemically significant ClO·O2. Furthermore, our state-of-the-science computational chemistry calculations also suggest that ClO·O2 may be a weakly bound ClOOO radical with a reactive ²A ground electronic state. There is therefore a need to design and perform definitive experimental tests of the existence and chemistry of the ClO·O2 species, which we discuss and which have the potential to mediate chlorine-catalyzed stratospheric ozone depletion.
A National Virtual Specimen Database for Early Cancer Detection
NASA Technical Reports Server (NTRS)
Crichton, Daniel; Kincaid, Heather; Kelly, Sean; Thornquist, Mark; Johnsey, Donald; Winget, Marcy
2003-01-01
Access to biospecimens is essential for enabling cancer biomarker discovery. The National Cancer Institute's (NCI) Early Detection Research Network (EDRN) comprises and integrates a large number of laboratories into a network in order to establish a collaborative scientific environment to discover and validate disease markers. The diversity of both the institutions and the collaborative focus has created the need for establishing cross-disciplinary teams focused on integrating expertise in biomedical research, computational and biostatistical methods, and computer science. Given the collaborative design of the network, the EDRN needed an informatics infrastructure. The Fred Hutchinson Cancer Research Center, the National Cancer Institute, and NASA's Jet Propulsion Laboratory (JPL) teamed up to build an informatics infrastructure creating a collaborative, science-driven research environment despite the geographic and structural differences among the information systems that existed within the diverse network. EDRN investigators identified the need to share biospecimen data captured across the country and managed in disparate databases. As a result, the informatics team initiated an effort to create a virtual tissue database whereby scientists could search and locate details about specimens located at collaborating laboratories. Each database, however, was locally implemented and integrated into collection processes and methods unique to each institution. This meant that efforts to integrate databases needed to be done in a manner that did not require redesign or re-implementation of existing systems.
Comparative Study in Laboratory Rats to Validate Sperm Quality Methods and Endpoints
NASA Technical Reports Server (NTRS)
Price, W. A.; Briggs, G. B.; Alexander, W. K.; Still, K. R.; Grasman, K. A.
2000-01-01
The Naval Health Research Center, Detachment (Toxicology) performs toxicity studies in laboratory animals to characterize the risk of exposure to chemicals of Navy interest. Research was conducted at the Toxicology Detachment at WPAFB, OH, in collaboration with Wright State University, Department of Biological Sciences, for the validation of new bioassay methods for evaluating reproductive toxicity. The Hamilton Thorne sperm analyzer was used to evaluate sperm damage produced by exposure to a known testicular toxic agent, methoxyacetic acid, and by inhalation exposure to JP-8 and JP-5 in laboratory rats. Sperm quality parameters were evaluated (sperm concentration, motility, and morphology) to provide evidence of sperm damage. The Hamilton Thorne sperm analyzer utilizes a DNA-specific fluorescent stain (similar to flow cytometry) and digitized optical computer analysis to detect sperm cell damage. Computer-assisted sperm analysis (CASA) is a more rapid, robust, predictive, and sensitive method for characterizing reproductive toxicity. The results presented in this poster report validation information showing that exposure to methoxyacetic acid causes reproductive toxicity, while inhalation exposure to JP-8 and JP-5 had no significant effects. The CASA method detects early changes that result in reproductive deficits, and these data will be used in a continuing program to characterize the toxicity of chemicals, and combinations of chemicals, of military interest to formulate permissible exposure limits.
Temporary Laboratory Office in Huntsville Industrial Center Building
NASA Technical Reports Server (NTRS)
1964-01-01
Marshall Space Flight Center (MSFC) occupied temporary quarters in the Huntsville Industrial Center (HIC) building in downtown Huntsville, Alabama, as the center grew. This image shows drafting specialists from the Propulsion and Vehicle Engineering Laboratory at work in the HIC building.
A Software Laboratory Environment for Computer-Based Problem Solving.
ERIC Educational Resources Information Center
Kurtz, Barry L.; O'Neal, Micheal B.
This paper describes a National Science Foundation-sponsored project at Louisiana Technological University to develop computer-based laboratories for "hands-on" introductions to major topics of computer science. The underlying strategy is to develop structured laboratory environments that present abstract concepts through the use of…
Human Centered Autonomous and Assistant Systems Testbed for Exploration Operations
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Mount, Frances; Carreon, Patricia; Torney, Susan E.
2001-01-01
The Engineering and Mission Operations Directorates at NASA Johnson Space Center are combining laboratories and expertise to establish the Human Centered Autonomous and Assistant Systems Testbed for Exploration Operations. This is a testbed for human-centered design, development, and evaluation of intelligent autonomous and assistant systems that will be needed for human exploration and development of space. This project will improve human-centered analysis, design, and evaluation methods for developing intelligent software. This software will support human-machine cognitive and collaborative activities in future interplanetary work environments where distributed computer and human agents cooperate. We are developing and evaluating prototype intelligent systems for distributed multi-agent mixed-initiative operations. The primary target domain is control of life support systems in a planetary base. Technical approaches will be evaluated for use during extended manned tests in the target domain, the Bioregenerative Advanced Life Support Systems Test Complex (BIO-Plex). A spinoff target domain is the International Space Station (ISS) Mission Control Center (MCC). Products of this project include human-centered intelligent software technology, innovative human interface designs, and human-centered software development processes, methods, and products. The testbed uses adjustable autonomy software and life support systems simulation models from the Adjustable Autonomy Testbed to represent operations on the remote planet. Ground operations prototypes and concepts will be evaluated in the Exploration Planning and Operations Center (ExPOC) and Jupiter Facility.
A collaborative institutional model for integrating computer applications in the medical curriculum.
Friedman, C. P.; Oxford, G. S.; Juliano, E. L.
1991-01-01
The introduction and promotion of information technology in an established medical curriculum with existing academic and technical support structures poses a number of challenges. The UNC School of Medicine has developed the Taskforce on Educational Applications in Medicine (TEAM) to coordinate this effort. TEAM works as a confederation of existing research and support units with interests in computers and education, along with a core of interested faculty with curricular responsibilities. Constituent units of the TEAM confederation include the medical center library, medical television studios, basic science teaching laboratories, educational development office, microcomputer and network support groups, academic affairs administration, and a subset of course directors and teaching faculty. Among our efforts have been the establishment of (1) a mini-grant program to support faculty-initiated development and implementation of computer applications in the curriculum, (2) a symposium series with visiting speakers to acquaint faculty with current developments in medical informatics and related curricular efforts at other institutions, (3) 20 computer workstations located in the multipurpose teaching labs where first- and second-year students do much of their academic work, and (4) a demonstration center for evaluation of courseware and technologically advanced delivery systems. The student workstations provide convenient access to electronic mail, University schedules and calendars, the CoSy computer conferencing system, and several software applications integral to their courses in pathology, histology, microbiology, biochemistry, and neurobiology. The progress achieved toward the primary goal has modestly exceeded our initial expectations, while the collegiality and interest expressed toward TEAM activities in the local environment stand as empirical measures of the success of the concept. PMID:1807705
Next Generation Workload Management System For Big Data on Heterogeneous Distributed Computing
Klimentov, A.; Buncic, P.; De, K.; ...
2015-05-22
The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe, and were recently credited for the discovery of a Higgs boson. ATLAS and ALICE are the largest collaborations ever assembled in the sciences and are at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, both experiments rely on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses the PanDA (Production and Data Analysis) Workload Management System (WMS) for managing the workflow for all data processing on hundreds of data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. The scale is demonstrated by the following numbers: PanDA manages O(10^2) sites, O(10^5) cores, O(10^8) jobs per year, and O(10^3) users, and the ATLAS data volume is O(10^17) bytes. In 2013 we started an ambitious program to expand PanDA to all available computing resources, including opportunistic use of commercial and academic clouds and Leadership Computing Facilities (LCF). The project titled 'Next Generation Workload Management and Analysis System for Big Data' (BigPanDA) is funded by DOE ASCR and HEP. Extending PanDA to clouds and LCF presents new challenges in managing heterogeneity and supporting workflow. The BigPanDA project is underway to set up and tailor PanDA at the Oak Ridge Leadership Computing Facility (OLCF) and at the National Research Center "Kurchatov Institute" together with ALICE distributed computing and ORNL computing professionals. Our approach to integration of HPC platforms at the OLCF and elsewhere is to reuse, as much as possible, existing components of the PanDA system.
Finally, we will present our current accomplishments with running the PanDA WMS at OLCF and other supercomputers and demonstrate our ability to use PanDA as a portal independent of the computing facilities infrastructure for High Energy and Nuclear Physics as well as other data-intensive science applications.
Expedition One CDR Shepherd with IMAX camera
2001-02-11
STS98-E-5164 (11 February 2001) --- Astronaut William M. (Bill) Shepherd documents activity onboard the newly attached Destiny laboratory using an IMAX motion picture camera. The crews of Atlantis and the International Space Station on February 11 opened the Destiny laboratory and spent the first full day of what are planned to be years of work ahead inside the orbiting science and command center. Shepherd opened the Destiny hatch, and he and Shuttle commander Kenneth D. Cockrell ventured inside at 8:38 a.m. (CST). Members of both crews went to work quickly inside the new module, activating air systems, fire extinguishers, alarm systems, computers and internal communications. The crew also continued equipment transfers from the shuttle to the station and filmed several scenes onboard the station using an IMAX camera. This scene was recorded with a digital still camera.
National Biocontainment Training Center
2014-08-01
Pictured: Dr. Christopher Kasanga, Virologist, SACIDS, SUA; Martha Betson, an instructor at Sokoine from the Royal Veterinary …; and Gargili (first row, center) with laboratory staff of the Pendik Veterinary Control Institute, a national research laboratory under the Turkish Ministry of Food, Agriculture and Livestock.
Catalytic N 2 Reduction to Silylamines and Thermodynamics of N 2 Binding at Square Planar Fe
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prokopchuk, Demyan E.; Wiedner, Eric S.; Walter, Eric D.
The geometric constraints imposed by a tetradentate P4N2 ligand play an essential role in stabilizing square planar Fe complexes through changes in metal oxidation state. A combination of high-pressure electrochemistry and variable-temperature UV-vis spectroscopy was used to obtain these thermodynamic measurements, while X-ray crystallography, 57Fe Mössbauer spectroscopy, and EPR spectroscopy were used to fully characterize these new compounds. Analysis of the Fe(0), Fe(I), and Fe(II) complexes reveals that the free energy of N2 binding across the three oxidation states spans more than 37 kcal mol-1. The square pyramidal Fe0(N2)(P4N2) complex catalyzes the conversion of N2 to N(SiR3)3 (R = Me, Et) at room temperature, representing the highest turnover number (TON) of any Fe-based N2 silylation catalyst to date (up to 65 equiv N(SiMe3)3 per Fe center). Elevated N2 pressures (> 1 atm) have a dramatic effect on catalysis, increasing N2 solubility and the thermodynamic N2 binding affinity at Fe0(N2)(P4N2). Acknowledgment. This research was supported as part of the Center for Molecular Electrocatalysis, an Energy Frontier Research Center funded by the U.S. Department of Energy (DOE), Office of Science, Office of Basic Energy Sciences. EPR experiments were performed using EMSL, a national scientific user facility sponsored by the DOE's Office of Biological and Environmental Research and located at Pacific Northwest National Laboratory (PNNL). PNNL is operated by Battelle for the U.S. DOE. Computational resources were provided by the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory. The authors thank Prof. Yisong Alex Guo at Carnegie Mellon University for recording Mössbauer data for some complexes; Emma Wellington and Kaye Kuphal for their assistance with the collection of Mössbauer data at Colgate University; Dr. Katarzyna Grubel for X-ray assistance; and Dr. Rosalie Chu for mass spectrometry assistance. The authors also thank Dr. Aaron Appel and Dr. Alex Kendall for helpful discussions.
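The reported 37 kcal/mol span in N2 binding free energy can be put in perspective with the standard thermodynamic relation K = exp(-ΔG/RT). The sketch below is illustrative only; the specific ΔG values fed in are not taken from the paper.

```python
import math

# Convert a binding free energy (kcal/mol) into an equilibrium constant
# via K = exp(-dG / RT), the standard thermodynamic relation.
R = 1.9872042586e-3   # gas constant, kcal mol^-1 K^-1
T = 298.15            # room temperature, K

def equilibrium_constant(delta_g_kcal: float) -> float:
    return math.exp(-delta_g_kcal / (R * T))

# A 37 kcal/mol difference in binding free energy (the spread reported
# across the three oxidation states) changes K by roughly 27 orders of
# magnitude at room temperature.
ratio = equilibrium_constant(-37.0) / equilibrium_constant(0.0)
print(f"K ratio for a 37 kcal/mol spread: {ratio:.2e}")
```

This is why the abstract emphasizes the oxidation-state dependence: a spread that large takes N2 binding from essentially irreversible to essentially nonexistent.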
NASA Astrophysics Data System (ADS)
Onuoha, Cajetan O.
The purpose of this research study was to determine the overall effectiveness of computer-based laboratories compared with traditional hands-on laboratories for improving students' science academic achievement and attitudes towards science subjects at the college and pre-college levels of education in the United States. Meta-analysis was used to synthesize the findings from 38 primary research studies conducted and/or reported in the United States between 1996 and 2006 that compared the effectiveness of computer-based laboratories with traditional hands-on laboratories on measures related to science academic achievement and attitudes towards science subjects. The 38 primary research studies, with a total of 3,824 subjects, generated 67 weighted individual effect sizes that were used in this meta-analysis. The study found that computer-based laboratories had a small positive effect over traditional hands-on laboratories on measures related to students' science academic achievement (ES = +0.26) and attitudes towards science subjects (ES = +0.22). It was also found that computer-based laboratories produced larger effects in the physical sciences than in the biological sciences (ES = +0.34 vs. +0.17).
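For readers unfamiliar with how "weighted individual effect sizes" are combined in a meta-analysis like this, a common approach is fixed-effect inverse-variance pooling. The sketch below uses hypothetical per-study numbers, not the data from this study:

```python
import math

# Fixed-effect inverse-variance pooling of standardized effect sizes.
# Each study contributes an effect estimate and its variance; studies
# with smaller variance (usually larger samples) get more weight.
effects = [0.31, 0.18, 0.42, 0.10]    # hypothetical per-study ES values
variances = [0.02, 0.05, 0.04, 0.03]  # hypothetical sampling variances

weights = [1.0 / v for v in variances]
pooled = sum(w * es for w, es in zip(weights, effects)) / sum(weights)
se = math.sqrt(1.0 / sum(weights))    # standard error of the pooled ES

print(f"pooled ES = {pooled:.2f}, SE = {se:.3f}")
```

A pooled value near +0.2 to +0.3, as in this sketch, is conventionally read as a small effect, matching how the abstract characterizes ES = +0.26.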
Investigations in Computer-Aided Instruction and Computer-Aided Controls. Final Report.
ERIC Educational Resources Information Center
Rosenberg, R.C.; And Others
These research projects, designed to delve into certain relationships between humans and computers, are focused on computer-assisted instruction and on man-computer interaction. One study demonstrates that within the limits of formal engineering theory, a computer simulated laboratory (Dynamic Systems Laboratory) can be built in which freshmen…
Facilities | Argonne National Laboratory
Research facilities include the Advanced Powertrain Research Facility, Center for Transportation Research, Distributed Energy Research Center, Engine Research Facility, Heat Transfer Laboratory, and Materials Engineering Research Facility.
Determination of Absolute Zero Using a Computer-Based Laboratory
ERIC Educational Resources Information Center
Amrani, D.
2007-01-01
We present a simple computer-based laboratory experiment for evaluating absolute zero in degrees Celsius, which can be performed in college and undergraduate physical science laboratory courses. With a computer, the absolute zero apparatus can help demonstrators or students observe the relationship between temperature and pressure and use…
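The extrapolation such an experiment performs can be sketched numerically: at constant volume, gas pressure varies linearly with Celsius temperature (Gay-Lussac's law), so fitting a line to pressure-temperature readings and finding where pressure vanishes estimates absolute zero. The readings below are hypothetical, not data from the paper:

```python
# Estimate absolute zero by linear extrapolation of constant-volume gas
# pressure vs. temperature. Data values are illustrative only.
temps_c = [0.0, 20.0, 40.0, 60.0, 80.0]          # temperature, °C
pressures = [101.3, 108.7, 116.1, 123.6, 131.0]  # pressure, kPa (hypothetical)

n = len(temps_c)
mean_t = sum(temps_c) / n
mean_p = sum(pressures) / n

# Least-squares slope a and intercept b of the line P = a*T + b
a = sum((t - mean_t) * (p - mean_p) for t, p in zip(temps_c, pressures)) / \
    sum((t - mean_t) ** 2 for t in temps_c)
b = mean_p - a * mean_t

# Absolute zero is the temperature at which the fitted line reaches P = 0
absolute_zero_c = -b / a
print(f"Estimated absolute zero: {absolute_zero_c:.1f} °C")
```

With real apparatus data the extrapolated value should land near -273 °C; the spread of student estimates around that value is itself a useful discussion point in such a course.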
Stocks, G. Malcolm (Director, Center for Defect Physics in Structural Materials); CDP Staff
2017-12-09
'Center for Defect Physics - Energy Frontier Research Center' was submitted by the Center for Defect Physics (CDP) to the 'Life at the Frontiers of Energy Research' video contest at the 2011 Science for Our Nation's Energy Future: Energy Frontier Research Centers (EFRCs) Summit and Forum. Twenty-six EFRCs created short videos to highlight their mission and their work. CDP is directed by G. Malcolm Stocks at Oak Ridge National Laboratory, and is a partnership of scientists from nine institutions: Oak Ridge National Laboratory (lead); Ames Laboratory; Brown University; University of California, Berkeley; Carnegie Mellon University; University of Illinois, Urbana-Champaign; Lawrence Livermore National Laboratory; Ohio State University; and University of Tennessee. The Office of Basic Energy Sciences in the U.S. Department of Energy's Office of Science established the 46 Energy Frontier Research Centers (EFRCs) in 2009. These collaboratively-organized centers conduct fundamental research focused on 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The overall purpose is to accelerate scientific progress toward meeting the nation's critical energy challenges.
Atmosphere of Freedom: Sixty Years at the NASA Ames Research Center
NASA Technical Reports Server (NTRS)
Bugos, Glenn E.; Launius, Roger (Technical Monitor)
2000-01-01
Throughout Ames history, four themes prevail: a commitment to hiring the best people; cutting-edge research tools; project management that gets things done faster, better, and cheaper; and outstanding research efforts that serve the scientific professions and the nation. More than any other NASA Center, Ames remains shaped by its origins in the NACA (National Advisory Committee for Aeronautics). Not that its missions remain the same. Sure, Ames still houses the world's greatest collection of wind tunnels and simulation facilities, its aerodynamicists remain among the best in the world, and pilots and engineers still come for advice on how to build better aircraft. But that is increasingly part of Ames' past. Ames people have embraced two other missions for its future. First, intelligent systems and information science will help NASA use new tools in supercomputing, networking, telepresence, and robotics. Second, astrobiology will explore the prospects for life on Earth and beyond. Both new missions leverage Ames' long-standing expertise in computation and in the life sciences, as well as its relations with the computing and biotechnology firms working in the Silicon Valley community that has sprung up around the Center. Rather than the NACA missions, it is the NACA culture that still permeates Ames. The Ames way of research management privileges the scientists and engineers working in the laboratories. They work in an atmosphere of freedom, laced with the expectation of integrity and responsibility. Ames researchers are free to define their research goals and define how they contribute to the national good. They are expected to keep their fingers on the pulse of their disciplines, to be ambitious yet frugal in organizing their efforts, and to always test their theories in the laboratory or in the field. Ames' leadership ranks, traditionally, are cultivated within this scientific community.
Rather than managing and supervising these researchers, Ames leadership merely guides them, represents them to NASA headquarters and the world outside, and then steps out of the way before they get run over.
Berkeley Lab - Materials Sciences Division
MSD centers and facilities include the Center for Computational Study of Excited-State Phenomena in Energy Materials, the Center for X-ray Optics, and facilities for ion and materials physics and for scattering and instrumentation science.
International External Quality Assurance for Laboratory Diagnosis of Diphtheria ▿
Neal, S. E.; Efstratiou, A.
2009-01-01
The diphtheria surveillance network (DIPNET) encompassing National Diphtheria Reference Centers from 25 European countries is a Dedicated Surveillance Network recognized by the European Commission. A key DIPNET objective is the quality assessment of microbiological procedures for diphtheria across the European Union and beyond. A detailed questionnaire on the level of reference laboratory services and an external quality assessment (EQA) panel comprising six simulated throat specimens were sent to 34 centers. Twenty-three centers are designated National Diphtheria Reference Centers, with the laboratory in the United Kingdom being the only WHO Collaborating Centre. A variety of screening and identification tests were used, including the cysteinase test (20/34 centers), pyrazinamidase test (17/34 centers), and commercial kits (25/34 centers). The classic Elek test for toxigenicity testing is mostly used (28/34 centers), with variations in serum sources and antitoxin concentrations. Many laboratories reported problems obtaining Elek reagents or media. Only six centers produced acceptable results for all six specimens. Overall, 21% of identification and 13% of toxigenicity reports were unacceptable. Many centers could not isolate the target organism, and most found difficulties with the specimens that contained Corynebacterium striatum as a commensal contaminant. Nineteen centers generated either false-positive or negative toxigenic results, which may have caused inappropriate medical management. The discrepancies in this diphtheria diagnostics EQA alarmingly reflect the urgent need to improve laboratory performance in diphtheria diagnostics in Europe, standardize feasible and robust microbiological methods, and build awareness among public health authorities. Therefore, DIPNET recommends that regular workshops and EQA distributions for diphtheria diagnostics should be supported and maintained. PMID:19828749
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vigil, Benny Manuel; Ballance, Robert; Haskell, Karen
Cielo is a massively parallel supercomputer funded by the DOE/NNSA Advanced Simulation and Computing (ASC) program, and operated by the Alliance for Computing at Extreme Scale (ACES), a partnership between Los Alamos National Laboratory (LANL) and Sandia National Laboratories (SNL). The primary Cielo compute platform is physically located at Los Alamos National Laboratory. This Cielo Computational Environment Usage Model documents the capabilities and the environment to be provided for the Q1 FY12 Level 2 Cielo Capability Computing (CCC) Platform Production Readiness Milestone. This document describes specific capabilities, tools, and procedures to support both local and remote users. The model is focused on the needs of the ASC user working in the secure computing environments at Lawrence Livermore National Laboratory (LLNL), Los Alamos National Laboratory, or Sandia National Laboratories, but also addresses the needs of users working in the unclassified environment. The Cielo Computational Environment Usage Model maps the provided capabilities to the tri-Lab ASC Computing Environment (ACE) Version 8.0 requirements. The ACE requirements reflect the high performance computing requirements for the Production Readiness Milestone user environment capabilities of the ASC community. A description of ACE requirements met, and those requirements that are not met, is included in each section of this document. The Cielo Computing Environment, along with the ACE mappings, has been issued and reviewed throughout the tri-Lab community.
Real Time Data in Synoptic Meteorology and Weather Forecasting Education
NASA Astrophysics Data System (ADS)
Campetella, C. M.; Gassmann, M. I.
2006-05-01
The Department of Atmospheric and Oceanographic Sciences (DAOS) of the University of Buenos Aires is the university component of the World Meteorological Organization (WMO) Regional Meteorological Training Center (RMTC) in Region III. In January, 2002 our RMTC was invited to take part in the MeteoForum pilot project that was developed jointly by the COMET and Unidata programs of the University Corporation for Atmospheric Research (UCAR). MeteoForum comprises an international network of WMO Region III and IV RMTCs working collaboratively with universities to enhance their roles of training and education through information technologies and multilingual collections of resources. The DAOS undertook to improve its infrastructure to be able to access hydro-meteorological information in real-time as part of the Unidata community. In 2003, the DAOS received some Unidata equipment grant funds to update its computer infrastructure, improving communications with an operationally quicker system. Departmental networking was upgraded to 100 Mb/s capability while, at the same time, new computation resources were purchased that increased the number of computers available for student use from 5 to 8. This upgrade has also resulted in more and better computers being available for student and faculty research. A video projection system, purchased with funds provided by the COMET program as part of Meteoforum, is used in classrooms with Internet connections for a variety of educational activities. The upgraded computing and networking facilities have contributed to the development of educational modules using real-time hydro-meteorological and other digital data for the classroom. 
With the aid of Unidata personnel, the Unidata Local Data Management (LDM) software was installed and configured to request and process real-time feeds of global observational data; global numerical model output from the US National Centers for Environmental Prediction (NCEP) models; and all imager channels from GOES-12 from the Unidata Internet Data Distribution (IDD) system. The data now being routinely received have impacted not only the meteorological education in the DAOS, but also have been instructive in techniques for Internet-based data sharing for our students. The DAOS has made a substantial effort to provide undergraduate students with experience in manipulating, displaying, and analyzing weather data in real-time through interactive displays of data using visualization tools provided by Unidata. Two of the specific courses whose curricula have been improved are synoptic meteorology and a laboratory on weather prediction. Some laboratory materials have been developed to reflect current data as applied to the lecture material. This talk will briefly describe the data compiled and the fields used to analyze an intense cyclogenesis event that occurred over the La Plata River in August, 2005. This event was used as a case study for discussions in the Synoptic Weather Laboratory degree course of Atmospheric Sciences Licentiate.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
A video on computer security is described. Lonnie Moore, the Computer Security Manager, CSSM/CPPM at Lawrence Livermore National Laboratory (LLNL), and Gale Warshawsky, the Coordinator for Computer Security Education and Awareness at LLNL, wanted to share topics such as computer ethics, software piracy, privacy issues, and protecting information in a format that would capture and hold an audience's attention. Four Computer Security Short Subject videos were produced, ranging from 1 to 3 minutes each. These videos are very effective education and awareness tools that can be used to generate discussions about computer security concerns and good computing practices.
5. AERIAL PHOTO OF THE COMPONENTS TEST LABORATORY DURING THE CONSTRUCTION OF THE EAST TEST AREA. 1955, FRED ORDWAY COLLECTION, U.S. SPACE AND ROCKET CENTER, HUNTSVILLE, AL. - Marshall Space Flight Center, East Test Area, Components Test Laboratory, Huntsville, Madison County, AL
Center of excellence for small robots
NASA Astrophysics Data System (ADS)
Nguyen, Hoa G.; Carroll, Daniel M.; Laird, Robin T.; Everett, H. R.
2005-05-01
The mission of the Unmanned Systems Branch of SPAWAR Systems Center, San Diego (SSC San Diego) is to provide network-integrated robotic solutions for Command, Control, Communications, Computers, Intelligence, Surveillance, and Reconnaissance (C4ISR) applications, serving and partnering with industry, academia, and other government agencies. We believe the most important criterion for a successful acquisition program is producing a value-added end product that the warfighter needs, uses and appreciates. Through our accomplishments in the laboratory and field, SSC San Diego has been designated the Center of Excellence for Small Robots by the Office of the Secretary of Defense Joint Robotics Program. This paper covers the background, experience, and collaboration efforts by SSC San Diego to serve as the "Impedance-Matching Transformer" between the robotic user and technical communities. Special attention is given to our Unmanned Systems Technology Imperatives for Research, Development, Testing and Evaluation (RDT&E) of Small Robots. Active projects, past efforts, and architectures are provided as success stories for the Unmanned Systems Development Approach.
2004-02-01
Andy Jenkins, an engineer for the Lab on a Chip Applications Development program, helped build the Applications Development Unit (ADU-25), a one-of-a-kind facility for controlling and analyzing processes on chips with extreme accuracy. Pressure is used to cause fluids to travel through a network of fluid pathways, or micro-channels, embossed on the chips through a process similar to the one used to print circuits on computer chips. To make customized chips for various applications, NASA has an agreement with the U.S. Army's Microdevices and Microfabrication Laboratory at Redstone Arsenal in Huntsville, Alabama, where NASA's Marshall Space Flight Center (MSFC) is located. The Marshall Center team is also collaborating with scientists at other NASA centers and at universities to develop custom chip designs for many applications, such as studying how fluidic systems work in spacecraft and identifying microbes in self-contained life support systems. Chips could even be designed for use on Earth, such as for detecting deadly microbes in heating and air systems. (NASA/MSFC/D.Stoffer)
Ho, Chi-Kung; Chen, Fu-Cheng; Chen, Yung-Lung; Wang, Hui-Ting; Lee, Chien-Ho; Chung, Wen-Jung; Lin, Cheng-Jui; Hsueh, Shu-Kai; Hung, Shin-Chiang; Wu, Kuan-Han; Liu, Chu-Feng; Kung, Chia-Te; Cheng, Cheng-I
2017-01-01
This study evaluated the impact on clinical outcomes of using a cloud computing system to reduce percutaneous coronary intervention hospital door-to-balloon (DTB) time for ST segment elevation myocardial infarction (STEMI). A total of 369 patients before and after implementation of the transfer protocol were enrolled. Of these patients, 262 were transferred through the protocol while the other 107 patients were transferred through the traditional referral process. There were no significant differences in DTB time, pain to door of STEMI receiving center arrival time, or pain to balloon time between the two groups. Pain to electrocardiography time in patients with Killip I/II and catheterization laboratory to balloon time in patients with Killip III/IV were significantly reduced in the transfer-protocol group compared with the traditional referral group (both p < 0.05). There were also no remarkable differences in complication rate or 30-day mortality between the two groups. The multivariate analysis revealed that the independent predictors of 30-day mortality were advanced age, advanced Killip score, and a higher troponin-I level. This study showed that patients transferred through the present protocol had reduced pain to electrocardiography time (Killip I/II) and reduced catheterization laboratory to balloon time (Killip III/IV). However, using the cloud computing system in the present protocol did not reduce DTB time.
Household wireless electroencephalogram hat
NASA Astrophysics Data System (ADS)
Szu, Harold; Hsu, Charles; Moon, Gyu; Yamakawa, Takeshi; Tran, Binh
2012-06-01
We applied Compressive Sensing to design an affordable, convenient Brain Machine Interface (BMI) that measures high-spatial-density Electroencephalogram (EEG) brainwaves and processes them in real time on a Smartphone. It is useful for therapeutic and mental health monitoring, learning disability biofeedback, handicap interfaces, and war gaming. Its specifications are adequate for a biomedical laboratory, without the cables hanging over the head and tethering the subject to a fixed computer terminal. We improved the intrinsic signal to noise ratio (SNR) by placing the measuring electrodes non-uniformly, exploiting the proximity of each measurement to its source. We computed a spatiotemporal average of the larger-magnitude EEG data centers within 0.3 seconds of tethered laboratory data, using fuzzy logic, and located the underlying brainwave sources by Independent Component Analysis (ICA). Consequently, we can overlay them with the non-uniform electrode distribution, enhancing the signal to noise ratio and therefore the degree of sparseness achieved by thresholding. We overcame the conflicting requirements between a high spatial electrode density, precise temporal resolution (beyond the Event Related Potential (ERP) P300 brainwave at 0.3 sec), and the Smartphone's wireless bottleneck in spatiotemporal throughput rate. Our main contribution in this paper is the quality and speed of an iterative compressed image recovery algorithm based on a Block Sparse Code (Baraniuk et al., IEEE/IT 2008). As a result, we achieved real-time wireless dynamic measurement of EEG brainwaves, matching well with traditionally tethered high density EEG.
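The averaging-then-thresholding step described above can be sketched in a few lines. This is an illustrative sketch only: the window length, threshold, and sample values are assumptions for demonstration, not the authors' parameters.

```python
def sparsify_eeg(samples, window, threshold):
    """Average samples over short windows (e.g. ~0.3 s of data), then
    zero out small-magnitude averages, increasing sparsity before
    compressive-sensing recovery."""
    averaged = []
    for i in range(0, len(samples), window):
        chunk = samples[i:i + window]
        averaged.append(sum(chunk) / len(chunk))
    # Hard threshold: keep only large-magnitude averages.
    return [v if abs(v) >= threshold else 0.0 for v in averaged]

# Hypothetical EEG samples: two strong bursts amid low-level noise.
signal = [0.1, 0.2, 5.0, 5.2, 0.05, -0.1, -4.8, -5.1]
print(sparsify_eeg(signal, window=2, threshold=1.0))
```

Only the two burst windows survive thresholding, so the vector handed to the recovery stage is sparse, which is what block-sparse recovery exploits.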
High-Performance Computing Data Center Warm-Water Liquid Cooling |
Computational Science | NREL. NREL's High-Performance Computing Data Center (HPC Data Center) is liquid-cooled: warm-water liquid cooling technologies offer a more energy-efficient solution.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhong, Xing; Wang, Lei; Zhou, Hu
A novel PtCo alloy etched in situ and embedded in graphene nanopores (PtCo/NPG) was reported as a high-performance catalyst for the oxygen reduction reaction (ORR). Graphene nanopores were fabricated in situ while forming PtCo nanoparticles that were uniformly embedded in the graphene nanopores. Given the synergistic effect between the PtCo alloy and the nanopores, PtCo/NPG exhibited 11.5 times higher mass activity than the commercial Pt/C cathode electrocatalyst. DFT calculations indicated that the nanopores in NPG not only stabilize the PtCo nanoparticles but also markedly change their electronic structure, thereby changing their adsorption abilities. This enhancement can lead to a favorable reaction pathway on PtCo/NPG for ORR. This study showed that PtCo/NPG is a potential candidate for the next generation of Pt-based catalysts in fuel cells. It also offered a promising alternative strategy for fabricating various kinds of metal/graphene-nanopore nanohybrids with potential applications in catalysis and other technological devices. The authors acknowledge the financial support from the National Basic Research Program (973 program, No. 2013CB733501), the Zhejiang Provincial Education Department Research Program (Y201326554), and the National Natural Science Foundation of China (No. 21306169, 21101137, 21136001, 21176221 and 91334013). D. Mei acknowledges the support from the US Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences & Biosciences. Pacific Northwest National Laboratory (PNNL) is a multiprogram national laboratory operated for DOE by Battelle. Computing time was granted by the grand challenge of computational catalysis of the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) and by the National Energy Research Scientific Computing Center (NERSC).
A Comparison of the Apple Macintosh and IBM PC in Laboratory Applications.
ERIC Educational Resources Information Center
Williams, Ron
1986-01-01
Compares Apple Macintosh and IBM PC microcomputers in terms of their usefulness in the laboratory. No attempt is made to equalize the two computer systems since they represent opposite ends of the computer spectrum. Indicates that the IBM PC is the most useful general-purpose personal computer for laboratory applications. (JN)
Voting with Their Seats: Computer Laboratory Design and the Casual User
ERIC Educational Resources Information Center
Spennemann, Dirk H. R.; Atkinson, John; Cornforth, David
2007-01-01
Student computer laboratories are provided by most teaching institutions around the world; however, what is the most effective layout for such facilities? The log-in data files from computer laboratories at a regional university in Australia were analysed to determine whether there was a pattern in student seating. In particular, it was…
ERIC Educational Resources Information Center
Newby, Michael; Marcoulides, Laura D.
2008-01-01
Purpose: The purpose of this paper is to model the relationship between student performance, student attitudes, and computer laboratory environments. Design/methodology/approach: Data were collected from 234 college students enrolled in courses that involved the use of a computer to solve problems and provided the laboratory experience by means of…
ERIC Educational Resources Information Center
Conlon, Michael P.; Mullins, Paul
2011-01-01
The Computer Science Department at Slippery Rock University created a laboratory for its Computer Networks and System Administration and Security courses under relaxed financial constraints. This paper describes the department's experience designing and using this laboratory, including lessons learned and descriptions of some student projects…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stoffel, T.; Reda, I.
2013-05-01
The NREL Pyrheliometer Comparisons for 2012 (NPC-2012) were held at the Solar Radiation Research Laboratory in Golden, Colorado, from September 24 through October 5 for the purpose of transferring the World Radiometric Reference (WRR) to participating instruments. Twenty scientists and engineers operated 32 absolute cavity radiometers and 18 conventional thermopile-based pyrheliometers to simultaneously measure clear-sky direct normal irradiance during the comparisons. The transfer standard group of reference radiometers for NPC-2012 consisted of four NREL radiometers with direct traceability to the WRR, having participated in the Eleventh International Pyrheliometer Comparisons (IPC-XI) hosted by the World Radiation Center in the fall of 2010. As a result of NPC-2012, each participating absolute cavity radiometer was assigned a new WRR transfer factor, computed as the reference irradiance determined by the transfer standard group divided by the observed irradiance from the participating radiometer. The performance of the transfer standard group during NPC-2012 was consistent with previous comparisons, including IPC-XI. The measurement performance of the transfer standard group allowed the transfer of the WRR to each participating radiometer with an estimated uncertainty of +/- 0.33% with respect to the International System of Units.
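The WRR transfer factor described above is a simple ratio of simultaneous readings. A minimal sketch, using hypothetical irradiance values rather than actual NPC-2012 data:

```python
def wrr_transfer_factor(reference_irradiance, observed_irradiance):
    """WRR transfer factor: reference irradiance (from the transfer
    standard group) divided by the participating radiometer's
    simultaneous reading."""
    if observed_irradiance <= 0:
        raise ValueError("observed irradiance must be positive")
    return reference_irradiance / observed_irradiance

# Simultaneous clear-sky direct-normal readings in W/m^2 (hypothetical).
reference = 1000.0      # transfer standard group consensus value
participant = 997.5     # participating cavity radiometer reading
factor = wrr_transfer_factor(reference, participant)
# Corrected measurements multiply the raw reading by this factor.
print(round(factor, 6))
```

A factor slightly above 1 means the participating radiometer reads low relative to the WRR, and its future readings are scaled up accordingly.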
The National Program of Educational Laboratories. Final Report.
ERIC Educational Resources Information Center
Chase, Francis S.
This report presents results of a critical analysis of 20 regional educational laboratories and nine university research and development centers established under ESEA Title IV. Observations, supported by specific examples, are made concerning the laboratories and centers and deal with their roles, program definitions, impact on educational…
Materials Science Research Rack-1 (MSRR-1)
NASA Technical Reports Server (NTRS)
2001-01-01
This computer-generated image depicts the Materials Science Research Rack-1 (MSRR-1) being developed by NASA's Marshall Space Flight Center and the European Space Agency (ESA) for placement in the Destiny laboratory module aboard the International Space Station. The rack is part of the planned Materials Science Research Facility (MSRF) and is expected to include two furnace module inserts, a Quench Module Insert (being developed by NASA's Marshall Space Flight Center) to study directional solidification in rapidly cooled alloys and a Diffusion Module Insert (being developed by the European Space Agency) to study crystal growth, and a transparent furnace (being developed by NASA's Space Product Development program). Multi-user equipment in the rack is being developed under the auspices of NASA's Office of Biological and Physical Research (OBPR) and ESA. Key elements are labeled in other images (0101754, 0101829, 0101830, and TBD).
Materials Science Research Rack-1 (MSRR-1)
NASA Technical Reports Server (NTRS)
2001-01-01
This computer-generated image depicts the Materials Science Research Rack-1 (MSRR-1) being developed by NASA's Marshall Space Flight Center and the European Space Agency (ESA) for placement in the Destiny laboratory module aboard the International Space Station. The rack is part of the planned Materials Science Research Facility (MSRF) and is expected to include two furnace module inserts, a Quench Module Insert (being developed by NASA's Marshall Space Flight Center) to study directional solidification in rapidly cooled alloys and a Diffusion Module Insert (being developed by the European Space Agency) to study crystal growth, and a transparent furnace (being developed by NASA's Space Product Development program). Multi-user equipment in the rack is being developed under the auspices of NASA's Office of Biological and Physical Research (OBPR) and ESA. A larger image is available without labels (No. 0101755).
Materials Science Research Rack-1 (MSRR-1)
NASA Technical Reports Server (NTRS)
2001-01-01
This computer-generated image depicts the Materials Science Research Rack-1 (MSRR-1) being developed by NASA's Marshall Space Flight Center and the European Space Agency (ESA) for placement in the Destiny laboratory module aboard the International Space Station. The rack is part of the planned Materials Science Research Facility (MSRF) and is expected to include two furnace module inserts, a Quench Module Insert (being developed by NASA's Marshall Space Flight Center) to study directional solidification in rapidly cooled alloys and a Diffusion Module Insert (being developed by the European Space Agency) to study crystal growth, and a transparent furnace (being developed by NASA's Space Product Development program). Multi-user equipment in the rack is being developed under the auspices of NASA's Office of Biological and Physical Research (OBPR) and ESA. Key elements are labeled in other images (0101754, 0101830, and TBD).
Materials Science Research Rack-1 (MSRR-1)
NASA Technical Reports Server (NTRS)
2001-01-01
This computer-generated image depicts the Materials Science Research Rack-1 (MSRR-1) being developed by NASA's Marshall Space Flight Center and the European Space Agency (ESA) for placement in the Destiny laboratory module aboard the International Space Station. The rack is part of the planned Materials Science Research Facility (MSRF) and is expected to include two furnace module inserts, a Quench Module Insert (being developed by NASA's Marshall Space Flight Center) to study directional solidification in rapidly cooled alloys and a Diffusion Module Insert (being developed by the European Space Agency) to study crystal growth, and a transparent furnace (being developed by NASA's Space Product Development program). Multi-user equipment in the rack is being developed under the auspices of NASA's Office of Biological and Physical Research (OBPR) and ESA. Key elements are labeled in other images (0101754, 0101829, 0101830).
NASA Technical Reports Server (NTRS)
Barclay, Rebecca O.; Pinelli, Thomas E.; Tan, Axel S. T.; Kennedy, John M.
1993-01-01
As part of Phase 4 of the NASA/DOD Aerospace Knowledge Diffusion Research Project, two studies were conducted that investigated the technical communications practices of Dutch and U.S. aerospace engineers and scientists. A self-administered questionnaire was distributed to aerospace engineers and scientists at the National Aerospace Laboratory (The Netherlands), NASA Ames Research Center (U.S.), and NASA Langley Research Center (U.S.). This paper presents responses of the Dutch and U.S. participants to selected questions about four of the seven project objectives: determining the importance of technical communications to aerospace engineering professionals, investigating the production of technical communications, examining the use and importance of computer and information technology, and exploring the use of electronic networks.
PNNL streamlines energy-guzzling computers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beckman, Mary T.; Marquez, Andres
In a room the size of a garage, two rows of six-foot-tall racks holding supercomputer hard drives sit back-to-back. Thin tubes and wires snake off the hard drives, slithering into the corners. Stepping between the rows, a rush of heat whips around you -- the air from fans blowing off processing heat. But walk farther in, between the next racks of hard drives, and the temperature drops noticeably. These drives are being cooled by a non-conducting liquid that runs right over the hardworking processors. The liquid carries the heat away in tubes, saving the air a few degrees. This is the Energy Smart Data Center at Pacific Northwest National Laboratory. The bigger, faster, and meatier supercomputers get, the more energy they consume. PNNL's Andres Marquez has developed this test bed to learn how to train the behemoths in energy efficiency. The work will help supercomputers perform better as well. Processors have to keep cool or suffer from "thermal throttling," says Marquez. "That's the performance threshold where the computer is too hot to run well. That threshold is an industry secret." The center at EMSL, DOE's national scientific user facility at PNNL, harbors several ways of experimenting with energy usage. For example, the room's air conditioning is isolated from the rest of EMSL -- pipes running beneath the floor carry temperature-controlled water through heat exchangers to cooling towers outside. "We can test whether it's more energy efficient to cool directly on the processing chips or out in the water tower," says Marquez. The hard drives feed energy and temperature data to a network server running specially designed software that controls and monitors the data center. To test the center's limits, the team runs the processors flat out -- not only on carefully controlled test programs in the Energy Smart computers, but also on real world software from other EMSL research, such as regional weather forecasting models.
Marquez's group is also developing "power aware computing", where the computer programs themselves perform calculations more energy efficiently. Maybe once computers get smart about energy, they'll have tips for their users.
Integration of Panda Workload Management System with supercomputers
NASA Astrophysics Data System (ADS)
De, K.; Jha, S.; Klimentov, A.; Maeno, T.; Mashinistov, R.; Nilsson, P.; Novikov, A.; Oleynik, D.; Panitkin, S.; Poyda, A.; Read, K. F.; Ryabinkin, E.; Teslyuk, A.; Velikhov, V.; Wells, J. C.; Wenaus, T.
2016-09-01
The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe, and were recently credited for the discovery of a Higgs boson. ATLAS, one of the largest collaborations ever assembled in the sciences, is at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, the ATLAS experiment is relying on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses PanDA (Production and Data Analysis) Workload Management System for managing the workflow for all data processing on over 140 data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. While PanDA currently uses more than 250,000 cores with a peak performance of 0.3+ petaFLOPS, the next LHC data-taking runs will require more resources than Grid computing can possibly provide. To alleviate these challenges, LHC experiments are engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We will describe a project aimed at the integration of PanDA WMS with supercomputers in the United States, Europe, and Russia (in particular with the Titan supercomputer at the Oak Ridge Leadership Computing Facility (OLCF), the supercomputer at the National Research Center "Kurchatov Institute", IT4 in Ostrava, and others). The current approach utilizes a modified PanDA pilot framework for job submission to the supercomputers' batch queues and local data management, with light-weight MPI wrappers to run single-threaded workloads in parallel on Titan's multi-core worker nodes.
This implementation was tested with a variety of Monte-Carlo workloads on several supercomputing platforms. We will present our current accomplishments in running PanDA WMS at supercomputers and demonstrate our ability to use PanDA as a portal independent of the computing facility's infrastructure for High Energy and Nuclear Physics, as well as other data-intensive science applications, such as bioinformatics and astro-particle physics.
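The rank-based fan-out that such light-weight MPI wrappers perform can be illustrated without an actual MPI stack: each rank independently selects its share of single-threaded payloads from a common list, so no inter-rank communication is needed. The names and the round-robin policy below are illustrative assumptions, not PanDA's actual implementation.

```python
def jobs_for_rank(jobs, rank, size):
    """Round-robin partition: rank r of a communicator of the given
    size takes every size-th job, letting single-threaded payloads
    run in parallel across a node's cores with no coordination."""
    return [job for i, job in enumerate(jobs) if i % size == rank]

# Hypothetical batch of 10 single-threaded Monte-Carlo tasks on 4 ranks.
jobs = ["task-%d" % i for i in range(10)]
for rank in range(4):
    print(rank, jobs_for_rank(jobs, rank, size=4))
```

In a real MPI wrapper, `rank` and `size` would come from the communicator (e.g. `COMM_WORLD`), and each rank would then exec its assigned payloads sequentially.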
Final Report: Quantification of Uncertainty in Extreme Scale Computations (QUEST)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marzouk, Youssef; Conrad, Patrick; Bigoni, Daniele
QUEST (www.quest-scidac.org) is a SciDAC Institute that is focused on uncertainty quantification (UQ) in large-scale scientific computations. Our goals are to (1) advance the state of the art in UQ mathematics, algorithms, and software; and (2) provide modeling, algorithmic, and general UQ expertise, together with software tools, to other SciDAC projects, thereby enabling and guiding a broad range of UQ activities in their respective contexts. QUEST is a collaboration among six institutions (Sandia National Laboratories, Los Alamos National Laboratory, the University of Southern California, Massachusetts Institute of Technology, the University of Texas at Austin, and Duke University) with a history of joint UQ research. Our vision encompasses all aspects of UQ in leadership-class computing. This includes the well-founded setup of UQ problems; characterization of the input space given available data/information; local and global sensitivity analysis; adaptive dimensionality and order reduction; forward and inverse propagation of uncertainty; handling of application code failures, missing data, and hardware/software fault tolerance; and model inadequacy, comparison, validation, selection, and averaging. The nature of the UQ problem requires the seamless combination of data, models, and information across this landscape in a manner that provides a self-consistent quantification of requisite uncertainties in predictions from computational models. Accordingly, our UQ methods and tools span an interdisciplinary space across applied math, information theory, and statistics. The MIT QUEST effort centers on statistical inference and methods for surrogate or reduced-order modeling. MIT personnel have been responsible for the development of adaptive sampling methods, methods for approximating computationally intensive models, and software for both forward uncertainty propagation and statistical inverse problems.
A key software product of the MIT QUEST effort is the MIT Uncertainty Quantification library, called MUQ (muq.mit.edu).
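Of the UQ tasks listed above, forward propagation of uncertainty has the simplest Monte Carlo form: draw input samples from their distribution, evaluate the model on each, and summarize the outputs. The following is a generic sketch of that idea, not MUQ's API; the toy model and distribution are assumptions for illustration.

```python
import random
import statistics

def propagate(model, sampler, n=10000, seed=0):
    """Monte Carlo forward UQ: draw n inputs from the given sampler,
    push them through the model, and report the output mean and
    standard deviation."""
    rng = random.Random(seed)
    outputs = [model(sampler(rng)) for _ in range(n)]
    return statistics.mean(outputs), statistics.stdev(outputs)

# Toy model y = x^2 with x ~ Normal(0, 1); analytically E[y] = 1.
mean, std = propagate(lambda x: x * x, lambda rng: rng.gauss(0.0, 1.0))
print(round(mean, 2), round(std, 2))
```

Surrogate modeling, another MIT focus above, attacks the cost of this loop: when each `model` call is an expensive simulation, a cheap approximation is substituted for most of the n evaluations.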
An, Gary; Bartels, John; Vodovotz, Yoram
2011-01-01
The clinical translation of promising basic biomedical findings, whether derived from reductionist studies in academic laboratories or as the product of extensive high-throughput and high-content screens in the biotechnology and pharmaceutical industries, has reached a period of stagnation in which ever higher research and development costs are yielding ever fewer new drugs. Systems biology and computational modeling have been touted as potential avenues by which to break through this logjam. However, few mechanistic computational approaches are utilized in a manner that is fully cognizant of the inherent clinical realities in which the drugs developed through this ostensibly rational process will be ultimately used. In this article, we present a Translational Systems Biology approach to inflammation. This approach is based on the use of mechanistic computational modeling centered on inherent clinical applicability, namely that a unified suite of models can be applied to generate in silico clinical trials, individualized computational models as tools for personalized medicine, and rational drug and device design based on disease mechanism. PMID:21552346