Sample records for advanced computing center

  1. High-Performance Computing Data Center | Energy Systems Integration

    Science.gov Websites

    The Energy Systems Integration Facility's High-Performance Computing Data Center at NREL is home to Peregrine, the largest high-performance computing system in the world exclusively dedicated to advancing

  2. Advanced Biomedical Computing Center (ABCC) | DSITP

    Cancer.gov

    The Advanced Biomedical Computing Center (ABCC), located in Frederick, Maryland, provides HPC resources for both NIH/NCI intramural scientists and the extramural biomedical research community. Its mission is to provide HPC support, engage in collaborative research, and conduct in-house research in various areas of computational biology and biomedical research.

  3. Center for Advanced Computational Technology

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    2000-01-01

    The Center for Advanced Computational Technology (ACT) was established to serve as a focal point for diverse research activities pertaining to the application of advanced computational technology to future aerospace systems. These activities include the use of numerical simulations, artificial intelligence methods, multimedia and synthetic environments, and computational intelligence, in the modeling, analysis, sensitivity studies, optimization, design, and operation of future aerospace systems. The Center is located at NASA Langley and is an integral part of the School of Engineering and Applied Science of the University of Virginia. The Center has four specific objectives: 1) conduct innovative research on applications of advanced computational technology to aerospace systems; 2) act as a pathfinder by demonstrating to the research community what can be done (high-potential, high-risk research); 3) help in identifying future directions of research in support of the aeronautical and space missions of the twenty-first century; and 4) help in the rapid transfer of research results to industry and in broadening awareness among researchers and engineers of the state-of-the-art in applications of advanced computational technology to the analysis, design, prototyping, and operation of aerospace and other high-performance engineering systems. In addition to research, Center activities include helping in the planning and coordination of the activities of a multi-center team of NASA and JPL researchers who are developing an intelligent synthesis environment for future aerospace systems; organizing workshops and national symposia; as well as writing state-of-the-art monographs and NASA special publications on timely topics.

  4. Center for Technology for Advanced Scientific Component Software (TASCS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Damevski, Kostadin

    A resounding success of the Scientific Discovery through Advanced Computing (SciDAC) program is that high-performance computational science is now universally recognized as a critical aspect of scientific discovery [71], complementing both theoretical and experimental research. As scientific communities prepare to exploit unprecedented computing capabilities of emerging leadership-class machines for multi-model simulations at the extreme scale [72], it is more important than ever to address the technical and social challenges of geographically distributed teams that combine expertise in domain science, applied mathematics, and computer science to build robust and flexible codes that can incorporate changes over time. The Center for Technology for Advanced Scientific Component Software (TASCS) tackles these issues by exploiting component-based software development to facilitate collaborative high-performance scientific computing.

  5. High performance computing for advanced modeling and simulation of materials

    NASA Astrophysics Data System (ADS)

    Wang, Jue; Gao, Fei; Vazquez-Poletti, Jose Luis; Li, Jianjiang

    2017-02-01

    The First International Workshop on High Performance Computing for Advanced Modeling and Simulation of Materials (HPCMS 2015) was held in Austin, Texas, USA, Nov. 18, 2015. HPCMS 2015 was organized by Computer Network Information Center (Chinese Academy of Sciences), University of Michigan, Universidad Complutense de Madrid, University of Science and Technology Beijing, Pittsburgh Supercomputing Center, China Institute of Atomic Energy, and Ames Laboratory.

  6. White paper: A plan for cooperation between NASA and DARPA to establish a center for advanced architectures

    NASA Technical Reports Server (NTRS)

    Denning, P. J.; Adams, G. B., III; Brown, R. L.; Kanerva, P.; Leiner, B. M.; Raugh, M. R.

    1986-01-01

    Large, complex computer systems require many years of development. It is recognized that large scale systems are unlikely to be delivered in useful condition unless users are intimately involved throughout the design process. A mechanism is described that will involve users in the design of advanced computing systems and will accelerate the insertion of new systems into scientific research. This mechanism is embodied in a facility called the Center for Advanced Architectures (CAA). CAA would be a division of RIACS (Research Institute for Advanced Computer Science) and would receive its technical direction from a Scientific Advisory Board established by RIACS. The CAA described here is a possible implementation of a center envisaged in a proposed cooperation between NASA and DARPA.

  7. Cornell University Center for Advanced Computing

    Science.gov Websites


  8. Current state and future direction of computer systems at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Rogers, James L. (Editor); Tucker, Jerry H. (Editor)

    1992-01-01

    Computer systems have advanced at a rate unmatched by any other area of technology. As performance has dramatically increased there has been an equally dramatic reduction in cost. This constant cost-performance improvement has precipitated the pervasiveness of computer systems into virtually all areas of technology. This improvement is due primarily to advances in microelectronics. Most people are now convinced that the new generation of supercomputers will be built using a large number (possibly thousands) of high performance microprocessors. Although the spectacular improvements in computer systems have come about because of these hardware advances, there has also been a steady improvement in software techniques. In an effort to understand how these hardware and software advances will affect research at NASA LaRC, the Computer Systems Technical Committee drafted this white paper to examine the current state and possible future directions of computer systems at the Center. This paper discusses selected important areas of computer systems including real-time systems, embedded systems, high performance computing, distributed computing networks, data acquisition systems, artificial intelligence, and visualization.

  9. CAROLINA CENTER FOR COMPUTATIONAL TOXICOLOGY

    EPA Science Inventory

    The Center will advance the field of computational toxicology through the development of new methods and tools, as well as through collaborative efforts. In each Project, new computer-based models will be developed and published that represent the state-of-the-art. The tools p...

  10. Advanced laptop and small personal computer technology

    NASA Technical Reports Server (NTRS)

    Johnson, Roger L.

    1991-01-01

    Advanced laptop and small personal computer technology is presented in the form of viewgraphs. The following areas of hand-carried computers and mobile workstation technology are covered: background, applications, high end products, technology trends, requirements for the Control Center application, and recommendations for the future.

  11. New developments in delivering public access to data from the National Center for Computational Toxicology at the EPA

    EPA Science Inventory

    Researchers at EPA’s National Center for Computational Toxicology integrate advances in biology, chemistry, and computer science to examine the toxicity of chemicals and help prioritize chemicals for further research based on potential human health risks. The goal of this researc...

  12. Displaying Computer Simulations Of Physical Phenomena

    NASA Technical Reports Server (NTRS)

    Watson, Val

    1991-01-01

    Paper discusses computer simulation as means of experiencing and learning to understand physical phenomena. Covers both present simulation capabilities and major advances expected in near future. Visual, aural, tactile, and kinesthetic effects used to teach such physical sciences as dynamics of fluids. Recommends classrooms in universities, government, and industry be linked to advanced computing centers so computer simulations integrated into education process.

  13. The Benefits of Making Data from the EPA National Center for Computational Toxicology available for reuse (ACS Fall meeting 3 of 12)

    EPA Science Inventory

    Researchers at EPA’s National Center for Computational Toxicology (NCCT) integrate advances in biology, chemistry, exposure and computer science to help prioritize chemicals for further research based on potential human health risks. The goal of this research is to quickly evalua...

  14. PCs: Key to the Future. Business Center Provides Sound Skills and Good Attitudes.

    ERIC Educational Resources Information Center

    Pay, Renee W.

    1991-01-01

    The Advanced Computing/Management Training Program at Jordan Technical Center (Sandy, Utah) simulates an automated office to teach five sets of skills: computer architecture and operating systems, word processing, data processing, communications skills, and management principles. (SK)

  15. CNC Turning Center Advanced Operations. Computer Numerical Control Operator/Programmer. 444-332.

    ERIC Educational Resources Information Center

    Skowronski, Steven D.; Tatum, Kenneth

    This student guide provides materials for a course designed to introduce the student to the operations and functions of a two-axis computer numerical control (CNC) turning center. The course consists of seven units. Unit 1 presents course expectations and syllabus, covers safety precautions, and describes the CNC turning center components, CNC…

  16. Computational mechanics and physics at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    South, Jerry C., Jr.

    1987-01-01

    An overview is given of computational mechanics and physics at NASA Langley Research Center. Computational analysis is a major component and tool in many of Langley's diverse research disciplines, as well as in the interdisciplinary research. Examples are given for algorithm development and advanced applications in aerodynamics, transition to turbulence and turbulence simulation, hypersonics, structures, and interdisciplinary optimization.

  17. Advanced Training Technologies and Learning Environments

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Compiler); Malone, John B. (Compiler)

    1999-01-01

    This document contains the proceedings of the Workshop on Advanced Training Technologies and Learning Environments held at NASA Langley Research Center, Hampton, Virginia, March 9-10, 1999. The workshop was jointly sponsored by the University of Virginia's Center for Advanced Computational Technology and NASA. Workshop attendees were from NASA, other government agencies, industry, and universities. The objective of the workshop was to assess the status and effectiveness of different advanced training technologies and learning environments.

  18. Embedded Data Processor and Portable Computer Technology testbeds

    NASA Technical Reports Server (NTRS)

    Alena, Richard; Liu, Yuan-Kwei; Goforth, Andre; Fernquist, Alan R.

    1993-01-01

    Attention is given to current activities in the Embedded Data Processor and Portable Computer Technology testbed configurations that are part of the Advanced Data Systems Architectures Testbed at the Information Sciences Division at NASA Ames Research Center. The Embedded Data Processor Testbed evaluates advanced microprocessors for potential use in mission and payload applications within the Space Station Freedom Program. The Portable Computer Technology (PCT) Testbed integrates and demonstrates advanced portable computing devices and data system architectures. The PCT Testbed uses both commercial and custom-developed devices to demonstrate the feasibility of functional expansion and networking for portable computers in flight missions.

  19. Remote Science Operation Center research

    NASA Technical Reports Server (NTRS)

    Banks, P. M.

    1986-01-01

    Progress in the following areas is discussed: the design, planning and operation of a remote science payload operations control center; design and planning of a data link via satellite; and the design and prototyping of an advanced workstation environment for multi-media (3-D computer aided design/computer aided engineering, voice, video, text) communications and operations.

  20. Applied Computational Fluid Dynamics at NASA Ames Research Center

    NASA Technical Reports Server (NTRS)

    Holst, Terry L.; Kwak, Dochan (Technical Monitor)

    1994-01-01

    The field of Computational Fluid Dynamics (CFD) has advanced to the point where it can now be used for many applications in fluid mechanics research and aerospace vehicle design. A few applications being explored at NASA Ames Research Center will be presented and discussed. The examples presented will range in speed from hypersonic to low speed incompressible flow applications. Most of the results will be from numerical solutions of the Navier-Stokes or Euler equations in three space dimensions for general geometry applications. Computational results will be used to highlight the presentation as appropriate. Advances in computational facilities including those associated with NASA's CAS (Computational Aerosciences) Project of the Federal HPCC (High Performance Computing and Communications) Program will be discussed. Finally, opportunities for future research will be presented and discussed. All material will be taken from non-sensitive, previously-published and widely-disseminated work.

  1. Center of Excellence for Hypersonics Research

    DTIC Science & Technology

    2012-01-25

    detailed simulations of actual combustor configurations, and ultimately for the optimization of hypersonic air-breathing propulsion system flow paths... vehicle development programs. The Center engaged leading experts in experimental and computational analysis of hypersonic flows to provide research... advanced hypersonic vehicles and space access systems will require significant advances in the design methods and ground testing techniques to ensure

  2. Attitudes toward Advanced and Multivariate Statistics When Using Computers.

    ERIC Educational Resources Information Center

    Kennedy, Robert L.; McCallister, Corliss Jean

    This study investigated the attitudes toward statistics of graduate students who studied advanced statistics in a course in which the focus of instruction was the use of a computer program in class. The use of the program made it possible to provide an individualized, self-paced, student-centered, and activity-based course. The three sections…

  3. Investigating Impact Metrics for Performance for the US EPA National Center for Computational Toxicology (ACS Fall meeting)

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Computational Toxicology Program integrates advances in biology, chemistry, and computer science to help prioritize chemicals for further research based on potential human health risks. This work involves computational and data drive...

  4. Vanderbilt University Institute of Imaging Science Center for Computational Imaging XNAT: A multimodal data archive and processing environment.

    PubMed

    Harrigan, Robert L; Yvernault, Benjamin C; Boyd, Brian D; Damon, Stephen M; Gibney, Kyla David; Conrad, Benjamin N; Phillips, Nicholas S; Rogers, Baxter P; Gao, Yurui; Landman, Bennett A

    2016-01-01

    The Vanderbilt University Institute for Imaging Science (VUIIS) Center for Computational Imaging (CCI) has developed a database built on XNAT housing over a quarter of a million scans. The database provides a framework for (1) rapid prototyping, (2) large-scale batch processing of images, and (3) scalable project management. The system uses the web-based interfaces of XNAT and REDCap to allow for graphical interaction. A Python middleware layer, the Distributed Automation for XNAT (DAX) package, distributes computation across the Vanderbilt Advanced Computing Center for Research and Education high-performance computing center. All software is made available as open source for use in combining Portable Batch System (PBS) grids and XNAT servers.
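    The batch-distribution idea described in this abstract can be illustrated with a small sketch. This is not the actual DAX API; the function name, PBS resource strings, and pipeline command below are all hypothetical, and a real middleware layer would also handle job submission, monitoring, and result upload back to XNAT.

```python
# Illustrative sketch (not the DAX package): rendering a PBS batch
# script for one image-processing task pulled from an imaging archive.
# All names (job name, command, paths) are hypothetical.

def make_pbs_script(job_name: str, walltime: str, command: str) -> str:
    """Render a minimal PBS submission script for a single task."""
    return "\n".join([
        "#!/bin/bash",
        f"#PBS -N {job_name}",            # job name shown in the queue
        f"#PBS -l walltime={walltime}",   # wall-clock limit
        "#PBS -l nodes=1:ppn=1",          # one core on one node
        command,                          # the actual pipeline invocation
    ])

script = make_pbs_script("fmri_preproc", "02:00:00",
                         "run_pipeline --scan /data/scan_0001")
print(script.splitlines()[1])  # → #PBS -N fmri_preproc
```

    A middleware layer like the one the abstract describes would generate one such script per scan and hand each to the cluster scheduler (e.g. via `qsub`), which is what makes quarter-million-scan archives tractable to process.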

  5. NETL - Supercomputing: NETL Simulation Based Engineering User Center (SBEUC)

    ScienceCinema

    None

    2018-02-07

    NETL's Simulation-Based Engineering User Center, or SBEUC, integrates one of the world's largest high-performance computers with an advanced visualization center. The SBEUC offers a collaborative environment among researchers at NETL sites and those working through the NETL-Regional University Alliance.

  6. NETL - Supercomputing: NETL Simulation Based Engineering User Center (SBEUC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2013-09-30

    NETL's Simulation-Based Engineering User Center, or SBEUC, integrates one of the world's largest high-performance computers with an advanced visualization center. The SBEUC offers a collaborative environment among researchers at NETL sites and those working through the NETL-Regional University Alliance.

  7. Delivering an Informational Hub for Data at the National Center for Computational Toxicology (ACS Spring Meeting) 7 of 7

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Computational Toxicology Program integrates advances in biology, chemistry, and computer science to help prioritize chemicals for further research based on potential human health risks. This work involves computational and data drive...

  8. Development of Advanced Computational Aeroelasticity Tools at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Bartels, R. E.

    2008-01-01

    NASA Langley Research Center has continued to develop its long standing computational tools to address new challenges in aircraft and launch vehicle design. This paper discusses the application and development of those computational aeroelastic tools. Four topic areas will be discussed: 1) Modeling structural and flow field nonlinearities; 2) Integrated and modular approaches to nonlinear multidisciplinary analysis; 3) Simulating flight dynamics of flexible vehicles; and 4) Applications that support both aeronautics and space exploration.

  9. Development and applications of nondestructive evaluation at Marshall Space Flight Center

    NASA Technical Reports Server (NTRS)

    Whitaker, Ann F.

    1990-01-01

    A brief description of facility design and equipment, facility usage, and typical investigations are presented for the following: Surface Inspection Facility; Advanced Computer Tomography Inspection Station (ACTIS); NDE Data Evaluation Facility; Thermographic Test Development Facility; Radiographic Test Facility; Realtime Radiographic Test Facility; Eddy Current Research Facility; Acoustic Emission Monitoring System; Advanced Ultrasonic Test Station (AUTS); Ultrasonic Test Facility; and Computer Controlled Scanning (CONSCAN) System.

  10. Advanced Group Support Systems and Facilities

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Compiler); Malone, John B. (Compiler)

    1999-01-01

    The document contains the proceedings of the Workshop on Advanced Group Support Systems and Facilities held at NASA Langley Research Center, Hampton, Virginia, July 19-20, 1999. The workshop was jointly sponsored by the University of Virginia Center for Advanced Computational Technology and NASA. Workshop attendees came from NASA, other government agencies, industry, and universities. The objectives of the workshop were to assess the status of advanced group support systems and to identify the potential of these systems for use in future collaborative distributed design and synthesis environments. The presentations covered the current status and effectiveness of different group support systems.

  11. Activities of the Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2001-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center, Moffett Field, California. RIACS research focuses on the three cornerstones of IT research necessary to meet the future challenges of NASA missions: 1. Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. 2. Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities. 3. High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to analysis of large scientific datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply IT research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, visiting scientist programs and student summer programs, designed to encourage and facilitate collaboration between the university and NASA IT research communities.

  12. Delivering The Benefits of Chemical-Biological Integration in Computational Toxicology at the EPA (ACS Fall meeting)

    EPA Science Inventory

    Abstract: Researchers at the EPA’s National Center for Computational Toxicology integrate advances in biology, chemistry, and computer science to examine the toxicity of chemicals and help prioritize chemicals for further research based on potential human health risks. The intent...

  13. A Serious Game of Success

    ERIC Educational Resources Information Center

    Nikirk, Martin

    2006-01-01

    This article discusses a computer game design and animation pilot at Washington County Technical High School as part of the advanced computer applications completer program. The focus of the instructional program is to teach students the 16 components of computer game design through a team-centered, problem-solving instructional format. Among…

  14. Reinventing patient-centered computing for the twenty-first century.

    PubMed

    Goldberg, H S; Morales, A; Gottlieb, L; Meador, L; Safran, C

    2001-01-01

    Despite evidence over the past decade that patients like and will use patient-centered computing systems in managing their health, patients have remained forgotten stakeholders in advances in clinical computing systems. We present a framework for patient empowerment and the technical realization of that framework in an architecture called CareLink. In an evaluation of the initial deployment of CareLink in the support of neonatal intensive care, we have demonstrated a reduction in the length of stay for very-low birthweight infants, and an improvement in family satisfaction with care delivery. With the ubiquitous adoption of the Internet into the general culture, patient-centered computing provides the opportunity to mend broken health care relationships and reconnect patients to the care delivery process. CareLink itself provides functionality to support both clinical care and research, and provides a living laboratory for the further study of patient-centered computing.

  15. National Energy Research Scientific Computing Center (NERSC) 1998 Annual Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hules, John

    This 1998 annual report from the National Energy Research Scientific Computing Center (NERSC) presents the year in review of the following categories: Computational Science; Computer Science and Applied Mathematics; and Systems and Services. Also presented are science highlights in the following categories: Basic Energy Sciences; Biological and Environmental Research; Fusion Energy Sciences; High Energy and Nuclear Physics; and Advanced Scientific Computing Research and Other Projects.

  16. 76 FR 50460 - Privacy Act of 1974; Notice of a Computer Matching Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-15

    ...; Notice of a Computer Matching Program AGENCY: Defense Manpower Data Center, Department of Defense (DoD). ACTION: Notice of a Computer Matching Program. SUMMARY: Subsection (e)(12) of the Privacy Act of 1974, as amended, (5 U.S.C. 552a) requires agencies to publish advance notice of any proposed or revised computer...

  17. 76 FR 77811 - Privacy Act of 1974; Notice of a Computer Matching Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-14

    ...; Notice of a Computer Matching Program AGENCY: Defense Manpower Data Center, Department of Defense (DoD). ACTION: Notice of a Computer Matching Program. SUMMARY: Subsection (e)(12) of the Privacy Act of 1974, as amended, (5 U.S.C. 552a) requires agencies to publish advance notice of any proposed or revised computer...

  18. Center for space microelectronics technology

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The 1992 Technical Report of the Jet Propulsion Laboratory Center for Space Microelectronics Technology summarizes the technical accomplishments, publications, presentations, and patents of the center during the past year. The report lists 187 publications, 253 presentations, and 111 new technology reports and patents in the areas of solid-state devices, photonics, advanced computing, and custom microcircuits.

  19. The Role of Computers in Research and Development at Langley Research Center

    NASA Technical Reports Server (NTRS)

    Wieseman, Carol D. (Compiler)

    1994-01-01

    This document is a compilation of presentations given at a workshop on the role of computers in research and development at the Langley Research Center. The objectives of the workshop were to inform the Langley Research Center community of the current software systems and software practices in use at Langley. The workshop was organized in 10 sessions: Software Engineering; Software Engineering Standards, Methods, and CASE Tools; Solutions of Equations; Automatic Differentiation; Mosaic and the World Wide Web; Graphics and Image Processing; System Design Integration; CAE Tools; Languages; and Advanced Topics.

  20. NASA Center for Climate Simulation (NCCS) Advanced Technology AT5 Virtualized Infiniband Report

    NASA Technical Reports Server (NTRS)

    Thompson, John H.; Bledsoe, Benjamin C.; Wagner, Mark; Shakshober, John; Fromkin, Russ

    2013-01-01

    The NCCS is part of the Computational and Information Sciences and Technology Office (CISTO) of Goddard Space Flight Center's (GSFC) Sciences and Exploration Directorate. The NCCS's mission is to enable scientists to increase their understanding of the Earth, the solar system, and the universe by supplying state-of-the-art high performance computing (HPC) solutions. To accomplish this mission, the NCCS (https://www.nccs.nasa.gov) provides high performance compute engines, mass storage, and network solutions to meet the specialized needs of the Earth and space science user communities.

  1. Application of technology developed for flight simulation at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Cleveland, Jeff I., II

    1991-01-01

    In order to meet the stringent time-critical requirements for real-time man-in-the-loop flight simulation, computer processing operations including mathematical model computation and data input/output to the simulators must be deterministic and be completed in as short a time as possible. Personnel at NASA's Langley Research Center are currently developing the use of supercomputers for simulation mathematical model computation for real-time simulation. This, coupled with the use of an open systems software architecture, will advance the state-of-the-art in real-time flight simulation.

  2. Tutorial: Advanced fault tree applications using HARP

    NASA Technical Reports Server (NTRS)

    Dugan, Joanne Bechta; Bavuso, Salvatore J.; Boyd, Mark A.

    1993-01-01

    Reliability analysis of fault-tolerant computer systems for critical applications is complicated by several factors. These modeling difficulties are discussed, and dynamic fault tree modeling techniques for handling them are described and demonstrated. Several advanced fault-tolerant computer systems are described, and fault tree models for their analysis are presented. HARP (Hybrid Automated Reliability Predictor) is a software package developed at Duke University and NASA Langley Research Center that is capable of solving the fault tree models presented.
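    The combinatorial core of fault tree evaluation can be sketched briefly. Note that HARP's distinguishing capability is dynamic gates (sequence dependencies, spares) solved via Markov models; the toy below covers only static AND/OR gates with hypothetical failure probabilities and assumes independent basic events.

```python
# Illustrative sketch only: static fault tree evaluation with AND/OR
# gates over independent basic-event failure probabilities. HARP's
# real strength is *dynamic* gates solved via Markov models, which
# this toy does not attempt. All probabilities are hypothetical.
from math import prod

def gate_and(probs):
    """AND gate: the output fails only if every input fails."""
    return prod(probs)

def gate_or(probs):
    """OR gate: the output fails if at least one input fails."""
    return 1.0 - prod(1.0 - p for p in probs)

# Hypothetical system: a shared power supply (p = 0.001) OR the loss
# of all three redundant processors (p = 0.01 each) brings it down.
p_system = gate_or([0.001, gate_and([0.01, 0.01, 0.01])])
```

    Once sequence dependencies enter (a cold spare that cannot fail until activated, for instance), the independence assumption breaks and the state-space treatment that HARP automates becomes necessary.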

  3. Computer-Assisted Performance Evaluation for Navy Anti-Air Warfare Training: Concepts, Methods, and Constraints.

    ERIC Educational Resources Information Center

    Chesler, David J.

    An improved general methodological approach for the development of computer-assisted evaluation of trainee performance in the computer-based simulation environment is formulated in this report. The report focuses on the Tactical Advanced Combat Direction and Electronic Warfare system (TACDEW) at the Fleet Anti-Air Warfare Training Center at San…

  4. PEO Integration Acronym Book

    DTIC Science & Technology

    2011-02-01

    Command CASE Computer Aided Software Engineering CASEVAC Casualty Evacuation CASTFOREM Combined Arms And Support Task Force Evaluation Model CAT Center for Advanced Technologies CAT Civil Affairs Team CAT Combined Arms Training CAT Crew Integration CAT Crisis Action Team CATIA Computer-Aided Three-Dimensional Interactive Application CATOX Catalytic Oxidation CATS Combined Arms Training Strategy CATT Combined Arms Tactical Trainer CATT Computer

  5. Silicon Wafer Advanced Packaging (SWAP). Multichip Module (MCM) Foundry Study. Version 2

    DTIC Science & Technology

    1991-04-08

    Next Layer Dielectric Spacing - Additional Metal Thickness Impact on Dielectric Uniformity/Adhesion. The first step in the experimental design would be... design CAM - computer aided manufacturing CAE - computer aided engineering CALCE - computer aided life cycle engineering center CARMA - computer aided... expansion CVD - chemical vapor deposition DA - design automation DEC - Digital Equipment Corporation DFT - design for testability

  6. The Nuclear Energy Advanced Modeling and Simulation Enabling Computational Technologies FY09 Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Diachin, L F; Garaizar, F X; Henson, V E

    2009-10-12

    In this document we report on the status of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Enabling Computational Technologies (ECT) effort. In particular, we provide the context for ECT in the broader NEAMS program and describe the three pillars of the ECT effort, namely, (1) tools and libraries, (2) software quality assurance, and (3) computational facility (computers, storage, etc.) needs. We report on our FY09 deliverables to determine the needs of the integrated performance and safety codes (IPSCs) in these three areas and lay out the general plan for software quality assurance to meet the requirements of DOE and the DOE Advanced Fuel Cycle Initiative (AFCI). We conclude with a brief description of our interactions with the Idaho National Laboratory computer center to determine what is needed to expand their role as a NEAMS user facility.

  7. Research Institute for Advanced Computer Science: Annual Report October 1998 through September 1999

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.; Gross, Anthony R. (Technical Monitor)

    1999-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center (ARC). It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. ARC has been designated NASA's Center of Excellence in Information Technology. In this capacity, ARC is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA ARC and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth; (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities; (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution.
In addition, RIACS collaborates with NASA scientists to apply information technology research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, and visiting scientist programs, designed to encourage and facilitate collaboration between the university and NASA information technology research communities.

  8. Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2000-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center. It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. Ames has been designated NASA's Center of Excellence in Information Technology. In this capacity, Ames is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA Ames and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth; (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities; (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. 
In addition, RIACS collaborates with NASA scientists to apply information technology research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, and visiting scientist programs, designed to encourage and facilitate collaboration between the university and NASA information technology research communities.

  9. Intricacies of modern supercomputing illustrated with recent advances in simulations of strongly correlated electron systems

    NASA Astrophysics Data System (ADS)

    Schulthess, Thomas C.

    2013-03-01

    The continued thousand-fold improvement in sustained application performance per decade on modern supercomputers keeps opening new opportunities for scientific simulations. But supercomputers have become very complex machines, built with thousands or tens of thousands of complex nodes consisting of multiple CPU cores or, most recently, a combination of CPU and GPU processors. Efficient simulations on such high-end computing systems require tailored algorithms that optimally map numerical methods to particular architectures. These intricacies will be illustrated with simulations of strongly correlated electron systems, where the development of quantum cluster methods, Monte Carlo techniques, as well as their optimal implementation by means of algorithms with improved data locality and high arithmetic density have gone hand in hand with evolving computer architectures. The present work would not have been possible without continued access to computing resources at the National Center for Computational Science of Oak Ridge National Laboratory, which is funded by the Facilities Division of the Office of Advanced Scientific Computing Research, and the Swiss National Supercomputing Center (CSCS) that is funded by ETH Zurich.

  10. Television campaign.

    PubMed

    2006-01-01

    Virginia Hospital Center embarked on a branding effort in hopes of raising customer awareness of the hospital's state-of-the-art technologies in advanced medical care. The campaign launched a new phase of TV spots that highlight the facility's advanced services, such as the computed tomography angiogram, the argon plasma coagulator, and heart valve replacement surgery.

  11. Disabled Access to Technological Advances (DATA) Final Report.

    ERIC Educational Resources Information Center

    Cress, Cynthia J.

    Disabled Access to Technological Advances (DATA) was a 3-year federally funded project to demonstrate how the application of computer technology can increase the employability of severely disabled persons. Services were provided through the integrated efforts of four agencies in Dane County, Wisconsin: an independent living center, a…

  12. Final Technical Report - Center for Technology for Advanced Scientific Component Software (TASCS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sussman, Alan

    2014-10-21

    This is a final technical report for the University of Maryland work in the SciDAC Center for Technology for Advanced Scientific Component Software (TASCS). The Maryland work focused on software tools for coupling parallel software components built using the Common Component Architecture (CCA) APIs. Those tools are based on the Maryland InterComm software framework that has been used in multiple computational science applications to build large-scale simulations of complex physical systems that employ multiple separately developed codes.

  13. Advances in Computational Capabilities for Hypersonic Flows

    NASA Technical Reports Server (NTRS)

    Kumar, Ajay; Gnoffo, Peter A.; Moss, James N.; Drummond, J. Philip

    1997-01-01

    The paper reviews the growth and advances in computational capabilities for hypersonic applications over the period from the mid-1980's to the present day. The current status of the code development issues such as surface and field grid generation, algorithms, physical and chemical modeling, and validation is provided. A brief description of some of the major codes being used at NASA Langley Research Center for hypersonic continuum and rarefied flows is provided, along with their capabilities and deficiencies. A number of application examples are presented, and future areas of research to enhance accuracy, reliability, efficiency, and robustness of computational codes are discussed.

  14. Scholarly Information Centers in ARL Libraries. SPEC Kit 175.

    ERIC Educational Resources Information Center

    Allen, Nancy, Comp.; Godden, Irene, Comp.

    Noting that the rapid evolution of telecommunications technology, the relentless advancement of computing capabilities, and the seemingly endless proliferation of electronic data have had a profound impact on research libraries, this Systems and Procedures Exchange Center (SPEC) kit explores the extent to which these technologies have come…

  15. Computational structures technology and UVA Center for CST

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    1992-01-01

    Rapid advances in computer hardware have had a profound effect on various engineering and mechanics disciplines, including the materials, structures, and dynamics disciplines. A new technology, computational structures technology (CST), has recently emerged as an insightful blend between material modeling, structural and dynamic analysis and synthesis on the one hand, and other disciplines such as computer science, numerical analysis, and approximation theory, on the other hand. CST is an outgrowth of finite element methods developed over the last three decades. The focus of this presentation is on some aspects of CST which can impact future airframes and propulsion systems, as well as on the newly established University of Virginia (UVA) Center for CST. The background and goals for CST are described along with the motivations for developing CST, and a brief discussion is made on computational material modeling. We look at the future in terms of technical needs, computing environment, and research directions. The newly established UVA Center for CST is described. One of the research projects of the Center is described, and a brief summary of the presentation is given.

  16. QCCM Center for Quantum Algorithms

    DTIC Science & Technology

    2008-10-17

    algorithms (e.g., quantum walks and adiabatic computing), as well as theoretical advances relating algorithms to physical implementations (e.g...Park, NC 27709-2211 15. SUBJECT TERMS Quantum algorithms, quantum computing, fault-tolerant error correction Richard Cleve MITACS East Academic...0511200 Algebraic results on quantum automata A. Ambainis, M. Beaudry, M. Golovkins, A. Kikusts, M. Mercer, D. Thérien Theory of Computing Systems 39 (2006

  17. Computer Assisted Virtual Environment - CAVE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erickson, Phillip; Podgorney, Robert; Weingartner,

    Research at the Center for Advanced Energy Studies is taking on another dimension with a 3-D device known as a Computer Assisted Virtual Environment. The CAVE uses projection to display high-end computer graphics on three walls and the floor. By wearing 3-D glasses to create depth perception and holding a wand to move and rotate images, users can delve into data.

  18. Computer Assisted Virtual Environment - CAVE

    ScienceCinema

    Erickson, Phillip; Podgorney, Robert; Weingartner,

    2018-05-30

    Research at the Center for Advanced Energy Studies is taking on another dimension with a 3-D device known as a Computer Assisted Virtual Environment. The CAVE uses projection to display high-end computer graphics on three walls and the floor. By wearing 3-D glasses to create depth perception and holding a wand to move and rotate images, users can delve into data.

  19. Computational structural mechanics methods research using an evolving framework

    NASA Technical Reports Server (NTRS)

    Knight, N. F., Jr.; Lotts, C. G.; Gillian, R. E.

    1990-01-01

    Advanced structural analysis and computational methods that exploit high-performance computers are being developed in a computational structural mechanics research activity sponsored by the NASA Langley Research Center. These new methods are developed in an evolving framework and applied to representative complex structural analysis problems from the aerospace industry. An overview of the methods development environment is presented, and methods research areas are described. Selected application studies are also summarized.

  20. Multimedia Instructional Tools and Student Learning in a Computer Applications Course

    ERIC Educational Resources Information Center

    Chapman, Debra L.; Wang, Shuyan

    2015-01-01

    Advances in technology and changes in educational strategies have resulted in the integration of technology in the classroom. Multimedia instructional tools (MMIT) provide student-centered active-learning instructional activities. MMITs are common in introductory computer applications courses based on the premise that MMITs should increase student…

  1. Multimedia Instructional Tools and Student Learning in Computer Applications Courses

    ERIC Educational Resources Information Center

    Chapman, Debra Laier

    2013-01-01

    Advances in technology and changes in educational strategies have resulted in the integration of technology into the classroom. Multimedia instructional tools (MMIT) have been identified as a way to provide student-centered active-learning instructional material to students. MMITs are common in introductory computer applications courses based on…

  2. Activities of the Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Oliger, Joseph

    1994-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under contract with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. Research at RIACS is currently being done in the following areas: (1) parallel computing; (2) advanced methods for scientific computing; (3) high performance networks; and (4) learning systems. RIACS technical reports are usually preprints of manuscripts that have been submitted to research journals or conference proceedings. A list of these reports for the period January 1, 1994 through December 31, 1994 is in the Reports and Abstracts section of this report.

  3. Simulation Applications at NASA Ames Research Center

    NASA Technical Reports Server (NTRS)

    Inouye, M.

    1984-01-01

    Aeronautical applications of simulation technology at Ames Research Center are described. The largest wind tunnel in the world is used to determine the flow field and aerodynamic characteristics of various aircraft, helicopter, and missile configurations. Large computers are used to obtain similar results through numerical solutions of the governing equations. Capabilities are illustrated by computer simulations of turbulence, aileron buzz, and an exhaust jet. Flight simulators are used to assess the handling qualities of advanced aircraft, particularly during takeoff and landing.

  4. Bringing Precision Medicine to Community Oncologists.

    PubMed

    2017-01-01

    Quest Diagnostics has teamed up with Memorial Sloan Kettering Cancer Center and IBM Watson Health to offer IBM Watson Genomics to its network of community cancer centers and hospitals. This new service aims to advance precision medicine by combining genomic tumor sequencing with the power of cognitive computing. ©2017 American Association for Cancer Research.

  5. Soft computing in design and manufacturing of advanced materials

    NASA Technical Reports Server (NTRS)

    Cios, Krzysztof J.; Baaklini, George Y; Vary, Alex

    1993-01-01

    The potential of fuzzy sets and neural networks, often referred to as soft computing, for aiding in all aspects of manufacturing of advanced materials like ceramics is addressed. In design and manufacturing of advanced materials, it is desirable to find which of the many processing variables contribute most to the desired properties of the material. There is also interest in real time quality control of parameters that govern material properties during processing stages. The concepts of fuzzy sets and neural networks are briefly introduced and it is shown how they can be used in the design and manufacturing processes. These two computational methods are alternatives to other methods such as the Taguchi method. The two methods are demonstrated by using data collected at NASA Lewis Research Center. Future research directions are also discussed.
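To make the fuzzy-set side of soft computing concrete (a toy, hypothetical example with invented temperatures, not the NASA Lewis data or models from the report), a triangular membership function maps a crisp processing variable, such as a sintering temperature, to a degree of membership in a linguistic set like "optimal":

```python
def triangular_membership(x, a, b, c):
    """Degree of membership in a triangular fuzzy set that rises
    from zero at a, peaks at b, and falls back to zero at c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Hypothetical linguistic set "optimal sintering temperature",
# peaking at 1600 degrees C.
for temp in (1400, 1550, 1600, 1700):
    print(temp, triangular_membership(temp, 1450, 1600, 1750))
```

A fuzzy controller for a processing stage would combine several such membership degrees through if-then rules; this sketch shows only the fuzzification step.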

  6. A Distributed User Information System

    DTIC Science & Technology

    1990-03-01

    Department of Computer Science, University of Maryland, College Park, MD 20742. Abstract: Current user information database technology ...Transactions on Computer Systems, May 1988. [Sol89] K. Sollins. A plan for internet directory services. Technical report, DDN Network Information Center...2424 A Distributed User Information System Steven D. Miller, Scott Carson, and Leo Mark Institute for Advanced Computer Studies and

  7. Applications of Computer Technology in Complex Craniofacial Reconstruction.

    PubMed

    Day, Kristopher M; Gabrick, Kyle S; Sargent, Larry A

    2018-03-01

    To demonstrate our use of advanced 3-dimensional (3D) computer technology in the analysis, virtual surgical planning (VSP), 3D modeling (3DM), and treatment of complex congenital and acquired craniofacial deformities. We present a series of craniofacial defects treated at a tertiary craniofacial referral center utilizing state-of-the-art 3D computer technology. All patients treated at our center using computer-assisted VSP, prefabricated custom-designed 3DMs, and/or 3D printed custom implants (3DPCI) in the reconstruction of craniofacial defects were included in this analysis. We describe the use of 3D computer technology to precisely analyze, plan, and reconstruct 31 craniofacial deformities/syndromes caused by: Pierre-Robin (7), Treacher Collins (5), Apert's (2), Pfeiffer (2), Crouzon (1) Syndromes, craniosynostosis (6), hemifacial microsomia (2), micrognathia (2), multiple facial clefts (1), and trauma (3). In select cases where the available bone was insufficient for skeletal reconstruction, 3DPCIs were fabricated using 3D printing. We used VSP in 30, 3DMs in all 31, distraction osteogenesis in 16, and 3DPCIs in 13 cases. Utilizing these technologies, the above complex craniofacial defects were corrected without significant complications and with excellent aesthetic results. Modern 3D technology allows the surgeon to better analyze complex craniofacial deformities, precisely plan surgical correction with computer simulation of results, customize osteotomies, plan distractions, and print 3DPCI, as needed. The use of advanced 3D computer technology can be applied safely and potentially improve aesthetic and functional outcomes after complex craniofacial reconstruction. These techniques warrant further study and may be reproducible in various centers of care.

  8. NASA Space Engineering Research Center for VLSI systems design

    NASA Technical Reports Server (NTRS)

    1991-01-01

    This annual review reports the center's activities and findings on very large scale integration (VLSI) systems design for 1990, including project status, financial support, publications, the NASA Space Engineering Research Center (SERC) Symposium on VLSI Design, research results, and outreach programs. Processor chips completed or under development are listed. Research results summarized include a design technique to harden complementary metal oxide semiconductors (CMOS) memory circuits against single event upset (SEU); improved circuit design procedures; and advances in computer aided design (CAD), communications, computer architectures, and reliability design. Also described is a high school teacher program that exposes teachers to the fundamentals of digital logic design.

  9. From the Bench to the Clinic Part 1: Martin McIntosh, Ph.D., Introduces His Lab's Immunotherapy Research | Office of Cancer Genomics

    Cancer.gov

    The field of immunotherapy is rapidly advancing and genomics techniques are being incorporated to add a “precision” approach. OCG spoke with two CTD2 investigators from the Fred Hutchinson Cancer Research Center (FHCRC) about new advances in immunotherapy. For the first article of this two-part series, we interviewed Martin McIntosh, Ph.D., member of the Fred Hutchinson Translational Research program and previously Program Head in Computational Biology at FHCRC/University of Washington Comprehensive Cancer Center.

  10. Effect of Technological Changes in Information Transfer on the Delivery of Pharmacy Services.

    ERIC Educational Resources Information Center

    Barker, Kenneth N.; And Others

    1989-01-01

    Personal computer technology has arrived in health care. Specific technological advances are optical disc storage, smart cards, voice recognition, and robotics. This paper discusses computers in medicine, in nursing, in conglomerates, and with patients. Future health care will be delivered in primary care centers, medical supermarkets, specialized…

  11. Collaborative Research Goes to School: Guided Inquiry with Computers in Classrooms. Technical Report.

    ERIC Educational Resources Information Center

    Wiske, Martha Stone; And Others

    Twin aims--to advance theory and to improve practice in science, mathematics, and computing education--guided the Educational Technology Center's (ETC) research from its inception in 1983. These aims led ETC to establish collaborative research groups in which people whose primary interest was classroom teaching and learning, and researchers…

  12. Advances and trends in the development of computational models for tires

    NASA Technical Reports Server (NTRS)

    Noor, A. K.; Tanner, J. A.

    1985-01-01

    Status and some recent developments of computational models for tires are summarized. Discussion focuses on a number of aspects of tire modeling and analysis including: tire materials and their characterization; evolution of tire models; characteristics of effective finite element models for analyzing tires; analysis needs for tires; and impact of the advances made in finite element technology, computational algorithms, and new computing systems on tire modeling and analysis. An initial set of benchmark problems has been proposed in concert with the U.S. tire industry. Extensive sets of experimental data will be collected for these problems and used for evaluating and validating different tire models. Also, the new Aircraft Landing Dynamics Facility (ALDF) at NASA Langley Research Center is described.

  13. Automated Help System For A Supercomputer

    NASA Technical Reports Server (NTRS)

    Callas, George P.; Schulbach, Catherine H.; Younkin, Michael

    1994-01-01

    Expert-system software developed to provide automated system of user-helping displays in supercomputer system at Ames Research Center Advanced Computer Facility. Users located at remote computer terminals connected to supercomputer and each other via gateway computers, local-area networks, telephone lines, and satellite links. Automated help system answers routine user inquiries about how to use services of computer system. Available 24 hours per day and reduces burden on human experts, freeing them to concentrate on helping users with complicated problems.

  14. An Object-Oriented Software Reuse Tool

    DTIC Science & Technology

    1989-04-01

    Square Cambridge, MA 02139 11. CONTROLLING OFFICE NAME AND ADDRESS 12. REPORT DATE Advanced Research Projects Agency April 1989 1400 Wilson Blvd. IS...Office of Naval Research UNCLASSIFIED Information Systems Arlington, VA 22217 15a. DECLASSIFICATION/DOWNGRADING SCHEDULE 16. DISTRIBUTION STATEMENT (of...DISTRIBUTION: Defense Technical Information Center Computer Sciences Division ONR, Code 1133 Navy Center for Applied Research in Artificial

  15. Human Factors Model

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Jack is an advanced human factors software package that provides a three dimensional model for predicting how a human will interact with a given system or environment. It can be used for a broad range of computer-aided design applications. Jack was developed by the computer Graphics Research Laboratory of the University of Pennsylvania with assistance from NASA's Johnson Space Center, Ames Research Center and the Army. It is the University's first commercial product. Jack is still used for academic purposes at the University of Pennsylvania. Commercial rights were given to Transom Technologies, Inc.

  16. Algorithms for Port-of-Entry Inspection

    DTIC Science & Technology

    2007-05-29

    Devdatt Lad, Rutgers University, Center for Advanced Information Processing Mingyu Li, Rutgers University, Statistics Francesco Longo, University of...Industrial and Systems Engineering graduate student Devdatt Lad, Rutgers University, Electrical & Computer Engineering, graduate student Mingyu Li

  17. Advanced computational tools for 3-D seismic analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barhen, J.; Glover, C.W.; Protopopescu, V.A.

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis, and test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY 1993-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the computational innovations that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  18. Prediction of helicopter rotor discrete frequency noise: A computer program incorporating realistic blade motions and advanced acoustic formulation

    NASA Technical Reports Server (NTRS)

    Brentner, K. S.

    1986-01-01

    A computer program has been developed at the Langley Research Center to predict the discrete frequency noise of conventional and advanced helicopter rotors. The program, called WOPWOP, uses the most advanced subsonic formulation of Farassat that is less sensitive to errors and is valid for nearly all helicopter rotor geometries and flight conditions. A brief derivation of the acoustic formulation is presented along with a discussion of the numerical implementation of the formulation. The computer program uses realistic helicopter blade motion and aerodynamic loadings, input by the user, for noise calculation in the time domain. A detailed definition of all the input variables, default values, and output data is included. A comparison with experimental data shows good agreement between prediction and experiment; however, accurate aerodynamic loading is needed.

  19. Composite mechanics for engine structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    1987-01-01

    Recent research activities and accomplishments at Lewis Research Center on composite mechanics for engine structures are summarized. The activities focused mainly on developing procedures for the computational simulation of composite intrinsic and structural behavior. The computational simulation encompasses all aspects of composite mechanics, advanced three-dimensional finite-element methods, damage tolerance, composite structural and dynamic response, and structural tailoring and optimization.

  20. Composite mechanics for engine structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    1989-01-01

    Recent research activities and accomplishments at Lewis Research Center on composite mechanics for engine structures are summarized. The activities focused mainly on developing procedures for the computational simulation of composite intrinsic and structural behavior. The computational simulation encompasses all aspects of composite mechanics, advanced three-dimensional finite-element methods, damage tolerance, composite structural and dynamic response, and structural tailoring and optimization.

  1. Using SPEEDES to simulate the blue gene interconnect network

    NASA Technical Reports Server (NTRS)

    Springer, P.; Upchurch, E.

    2003-01-01

    JPL and the Center for Advanced Computer Architecture (CACR) are conducting application and simulation analyses of BG/L in order to establish a range of effectiveness for the Blue Gene/L MPP architecture in performing important classes of computations and to determine the design sensitivity of the global interconnect network in support of real-world ASCI application execution.

  2. Dynamic Voltage Frequency Scaling Simulator for Real Workflows Energy-Aware Management in Green Cloud Computing

    PubMed Central

    Cotes-Ruiz, Iván Tomás; Prado, Rocío P.; García-Galán, Sebastián; Muñoz-Expósito, José Enrique; Ruiz-Reyes, Nicolás

    2017-01-01

    Nowadays, the growing computational capabilities of Cloud systems rely on the reduction of the consumed power of their data centers to make them sustainable and economically profitable. The efficient management of computing resources is at the heart of any energy-aware data center, and of special relevance is the adaptation of its performance to workload. Intensive computing applications in diverse areas of science generate complex workloads called workflows, whose successful management in terms of energy saving is still in its infancy. WorkflowSim is currently one of the most advanced simulators for research on workflow processing, offering advanced features such as task clustering and failure policies. In this work, a power-aware extension of WorkflowSim is presented. This new tool integrates a power model based on a computing-plus-communication design to allow the optimization of new energy-saving management strategies that consider computing, reconfiguration and network costs as well as quality of service, and it incorporates the preeminent strategy for on-host energy saving: Dynamic Voltage Frequency Scaling (DVFS). The simulator is designed to be consistent in different real scenarios and to include a wide repertoire of DVFS governors. Results showing the validity of the simulator in terms of resource utilization, frequency and voltage scaling, power, energy and time saving are presented. Also, results achieved by the intra-host DVFS strategy with different governors are compared to those of the data center using a recent and successful DVFS-based inter-host scheduling strategy as a mechanism overlapping the intra-host DVFS technique. PMID:28085932
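The DVFS strategy this abstract refers to trades clock frequency (and the lower supply voltage a slower clock permits) against execution time. A back-of-the-envelope sketch of the classic CMOS dynamic-power model P ≈ C·V²·f (with illustrative constants and hypothetical operating points, not WorkflowSim's actual power model) shows why running slower can save energy on workloads with slack:

```python
def dynamic_energy(capacitance, voltage, freq_hz, cycles):
    """Energy in joules to execute `cycles` clock cycles under the
    classic CMOS dynamic-power model P = C * V^2 * f."""
    power = capacitance * voltage ** 2 * freq_hz   # watts
    runtime = cycles / freq_hz                     # seconds
    return power * runtime                         # = C * V^2 * cycles

# Hypothetical operating points: halving the frequency allows a
# lower supply voltage (values are illustrative, not real hardware).
cycles = 2e9
high = dynamic_energy(1e-9, 1.2, 2.0e9, cycles)   # fast, high voltage
low = dynamic_energy(1e-9, 0.9, 1.0e9, cycles)    # slow, low voltage

print(high, low)  # the low-frequency point uses roughly 44% less energy
```

Note that in this model the frequency cancels out of the energy term, so the saving comes entirely from the quadratic voltage reduction that the lower frequency enables; that is the core intuition behind DVFS governors.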

  4. Computer systems and software engineering

    NASA Technical Reports Server (NTRS)

    Mckay, Charles W.

    1988-01-01

    The High Technologies Laboratory (HTL) was established in the fall of 1982 at the University of Houston Clear Lake. Research conducted at the High Tech Lab is focused upon computer systems and software engineering. There is a strong emphasis on the interrelationship of these areas of technology and the United States' space program. In January 1987, NASA Headquarters announced the formation of its first research center dedicated to software engineering. Operated by the High Tech Lab, the Software Engineering Research Center (SERC) was formed at the University of Houston Clear Lake. The High Tech Lab/Software Engineering Research Center promotes cooperative research among government, industry, and academia to advance the edge-of-knowledge and the state-of-the-practice in key topics of computer systems and software engineering which are critical to NASA. The center also recommends appropriate actions, guidelines, standards, and policies to NASA in matters pertinent to the center's research. Results of the research conducted at the High Tech Lab/Software Engineering Research Center have given direction to many decisions made by NASA concerning the Space Station Program.

  5. Proposal for continued research in intelligent machines at the Center for Engineering Systems Advanced Research (CESAR) for FY 1988 to FY 1991

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weisbin, C.R.

    1987-03-01

    This document reviews research accomplishments achieved by the staff of the Center for Engineering Systems Advanced Research (CESAR) during the fiscal years 1984 through 1987. The manuscript also describes future CESAR objectives for the 1988-1991 planning horizon, and beyond. As much as possible, the basic research goals are derived from perceived Department of Energy (DOE) needs for increased safety, productivity, and competitiveness in the United States energy producing and consuming facilities. Research areas covered include the HERMIES-II Robot, autonomous robot navigation, hypercube computers, machine vision, and manipulators.

  6. Center for Computational Structures Technology

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Perry, Ferman W.

    1995-01-01

    The Center for Computational Structures Technology (CST) is intended to serve as a focal point for the diverse CST research activities. The CST activities include the use of numerical simulation and artificial intelligence methods in modeling, analysis, sensitivity studies, and optimization of flight-vehicle structures. The Center is located at NASA Langley and is an integral part of the School of Engineering and Applied Science of the University of Virginia. The key elements of the Center are: (1) conducting innovative research on advanced topics of CST; (2) acting as pathfinder by demonstrating to the research community what can be done (high-potential, high-risk research); (3) strong collaboration with NASA scientists and researchers from universities and other government laboratories; and (4) rapid dissemination of CST to industry, through integration of industrial personnel into the ongoing research efforts.

  7. Knowledge management: Role of the Radiation Safety Information Computational Center (RSICC)

    NASA Astrophysics Data System (ADS)

    Valentine, Timothy

    2017-09-01

    The Radiation Safety Information Computational Center (RSICC) at Oak Ridge National Laboratory (ORNL) is an information analysis center that collects, archives, evaluates, synthesizes and distributes information, data and codes that are used in various nuclear technology applications. RSICC retains more than 2,000 software packages that have been provided by code developers from various federal and international agencies. RSICC's customers (scientists, engineers, and students from around the world) obtain access to such computing codes (source and/or executable versions) and processed nuclear data files to promote on-going research, to ensure nuclear and radiological safety, and to advance nuclear technology. The role of such information analysis centers is critical for supporting and sustaining nuclear education and training programs both domestically and internationally, as the majority of RSICC's customers are students attending U.S. universities. Additionally, RSICC operates a secure CLOUD computing system to provide access to sensitive export-controlled modeling and simulation (M&S) tools that support both domestic and international activities. This presentation will provide a general review of RSICC's activities, services, and systems that support knowledge management and education and training in the nuclear field.

  8. Assessing Advanced Technology in CENATE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tallent, Nathan R.; Barker, Kevin J.; Gioiosa, Roberto

    PNNL's Center for Advanced Technology Evaluation (CENATE) is a new U.S. Department of Energy center whose mission is to assess and facilitate access to emerging computing technology. CENATE is assessing a range of advanced technologies, from evolutionary to disruptive. Technologies of interest include the processor socket (homogeneous and accelerated systems), memories (dynamic, static, memory cubes), motherboards, networks (network interface cards and switches), and input/output and storage devices. CENATE is developing a multi-perspective evaluation process based on integrating advanced system instrumentation, performance measurements, and modeling and simulation. We show evaluations of two emerging network technologies: silicon photonics interconnects and the Data Vortex network. CENATE's evaluation also addresses the question of which machine is best for a given workload under certain constraints. We show a performance-power tradeoff analysis of a well-known machine learning application on two systems.

  9. A Comparative Analysis of Computer-Assisted Instruction and Traditional Lecture Instruction for Administration and Management Topics in Physical Therapy Education

    ERIC Educational Resources Information Center

    Hyland, Matthew R.; Pinto-Zipp, Genevieve; Olson, Valerie; Lichtman, Steven W.

    2010-01-01

    Technological advancements and competition in student recruitment have challenged educational institutions to expand upon traditional teaching methods in order to attract, engage and retain students. One strategy to meet this shift from educator-directed teaching to student-centered learning is greater computer utilization as an integral aspect of…

  10. Assessment of the Unstructured Grid Software TetrUSS for Drag Prediction of the DLR-F4 Configuration

    NASA Technical Reports Server (NTRS)

    Pirzadeh, Shahyar Z.; Frink, Neal T.

    2002-01-01

    An application of the NASA unstructured grid software system TetrUSS is presented for the prediction of aerodynamic drag on a transport configuration. The paper briefly describes the underlying methodology and summarizes the results obtained on the DLR-F4 transport configuration recently presented in the first AIAA computational fluid dynamics (CFD) Drag Prediction Workshop. TetrUSS is a suite of loosely coupled unstructured grid CFD codes developed at the NASA Langley Research Center. The meshing approach is based on the advancing-front and the advancing-layers procedures. The flow solver employs a cell-centered, finite volume scheme for solving the Reynolds Averaged Navier-Stokes equations on tetrahedral grids. For the present computations, flow in the viscous sublayer has been modeled with an analytical wall function. The emphasis of the paper is placed on the practicality of the methodology for accurately predicting aerodynamic drag data.
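
    As a rough illustration of how an analytical wall function closes the near-wall problem without resolving the viscous sublayer (a generic log-law form is assumed here; this is not necessarily TetrUSS's actual formulation), the law of the wall u/u_tau = (1/kappa) ln(E y u_tau / nu) can be solved for the friction velocity u_tau by Newton iteration:

```python
import math

def friction_velocity(u, y, nu, kappa=0.41, E=9.8, tol=1e-10):
    """Solve the log law u/u_tau = (1/kappa) * ln(E * y * u_tau / nu)
    for u_tau via Newton iteration.
    u: tangential velocity at wall distance y; nu: kinematic viscosity."""
    u_tau = max(1e-6, math.sqrt(nu * u / y))  # laminar-flow initial guess
    for _ in range(100):
        f = u / u_tau - math.log(E * y * u_tau / nu) / kappa
        df = -u / u_tau ** 2 - 1.0 / (kappa * u_tau)
        step = f / df
        u_tau -= step
        if abs(step) < tol * u_tau:
            break
    return u_tau
```

    The wall shear stress tau_w = rho * u_tau**2 then supplies the near-wall closure that a fully resolved sublayer would otherwise provide, which is what makes wall functions attractive on tetrahedral grids.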

  11. Operational numerical weather prediction on the CYBER 205 at the National Meteorological Center

    NASA Technical Reports Server (NTRS)

    Deaven, D.

    1984-01-01

    The Development Division of the National Meteorological Center (NMC), which is responsible for maintaining and developing the center's numerical weather forecasting systems, is discussed. Because of NMC's mission, data products must be produced reliably and on time twice daily, free of surprises for forecasters. Development Division personnel are in a rather unique situation: they must develop new, advanced techniques for numerical analysis and prediction using current state-of-the-art methods, and implement them operationally without damaging the operations of the center. With the computational speeds and resources now available from the CYBER 205, Development Division personnel will be able to introduce advanced analysis and prediction techniques into the operational job suite without disrupting the daily schedule. The capabilities of the CYBER 205 are discussed.

  12. US Department of Energy High School Student Supercomputing Honors Program: A follow-up assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1987-01-01

    The US DOE High School Student Supercomputing Honors Program was designed to recognize high school students with superior skills in mathematics and computer science and to provide them with formal training and experience with advanced computer equipment. This document reports on the participants who attended the first such program, which was held at the National Magnetic Fusion Energy Computer Center at the Lawrence Livermore National Laboratory (LLNL) during August 1985.

  13. A cross-sectional study of the effects of load carriage on running characteristics and tibial mechanical stress: implications for stress fracture injuries in women

    DTIC Science & Technology

    2017-03-23

    performance computing resources made available by the US Department of Defense High Performance Computing Modernization Program at the Air Force… Department of Defense Biotechnology High Performance Computing Software Applications Institute, Telemedicine and Advanced Technology Research Center, United States Army Medical Research and Materiel Command, Fort Detrick, Maryland, USA. Full list of author information is available at the end of the article

  14. Converged photonic data storage and switch platform for exascale disaggregated data centers

    NASA Astrophysics Data System (ADS)

    Pitwon, R.; Wang, K.; Worrall, A.

    2017-02-01

    We report on a converged optically enabled Ethernet storage, switch and compute platform, which could support future disaggregated data center architectures. The platform includes optically enabled Ethernet switch controllers, an advanced electro-optical midplane and optically interchangeable generic end node devices. We demonstrate system level performance using optically enabled Ethernet disk drives and micro-servers across optical links of varied lengths.

  15. Water-Cooled Data Center Packs More Power Per Rack | Poster

    Cancer.gov

    By Frank Blanchard and Ken Michaels, Staff Writers Behind each tall, black computer rack in the data center at the Advanced Technology Research Facility (ATRF) is something both strangely familiar and oddly out of place: It looks like a radiator. The back door of each cabinet is gridded with the coils of the Liebert cooling system, which circulates chilled water to remove heat

  16. Recommendations for Establishing the Texas Roadway Research Implementation Center

    DOT National Transportation Integrated Search

    1998-07-01

    The overall objective of the Roadway Research Initiative study was to describe an advanced testing capability, one that would speed implementation of the results from traditional computer and laboratory-based research efforts by providing a reusable t...

  17. Diabat Interpolation for Polymorph Free-Energy Differences.

    PubMed

    Kamat, Kartik; Peters, Baron

    2017-02-02

    Existing methods to compute free-energy differences between polymorphs use harmonic approximations, advanced non-Boltzmann bias sampling techniques, and/or multistage free-energy perturbations. This work demonstrates how Bennett's diabat interpolation method (J. Comput. Phys. 1976, 22, 245) can be combined with energy gaps from lattice-switch Monte Carlo techniques (Phys. Rev. E 2000, 61, 906) to swiftly estimate polymorph free-energy differences. The new method requires only two unbiased molecular dynamics simulations, one for each polymorph. To illustrate the new method, we compute the free-energy difference between face-centered cubic and body-centered cubic polymorphs for a Gaussian core solid. We discuss the justification for parabolic models of the free-energy diabats and similarities to methods that have been used in studies of electron transfer.
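
    Under the parabolic (equal-curvature) diabat assumption mentioned in the abstract, the polymorph free-energy difference reduces to the linear-response average of the energy gap sampled in each structure. The sketch below illustrates this with synthetic Gaussian gap data; the function name and all numbers are illustrative, not taken from the paper.

```python
import random
from statistics import mean

def linear_response_dF(gaps_in_a, gaps_in_b):
    """Linear-response estimate dF = 0.5 * (<dE>_A + <dE>_B), valid when
    both free-energy diabats are parabolas of equal curvature.
    dE = E_B(x) - E_A(x) is the energy gap evaluated on each sampled frame."""
    return 0.5 * (mean(gaps_in_a) + mean(gaps_in_b))

# Synthetic gap samples standing in for the two unbiased MD trajectories:
random.seed(0)
gaps_a = [random.gauss(5.0, 1.0) for _ in range(100_000)]  # sampled in A
gaps_b = [random.gauss(3.0, 1.0) for _ in range(100_000)]  # sampled in B

dF = linear_response_dF(gaps_a, gaps_b)  # close to 4.0 for this synthetic data
```

    With real data, the two gap distributions would come from evaluating both polymorphs' lattice energies on frames of each trajectory, as in lattice-switch Monte Carlo.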

  18. Image Understanding Research and Its Application to Cartography and Computer-Based Analysis of Aerial Imagery

    DTIC Science & Technology

    1983-09-01

    Report AI-TR-346, Artificial Intelligence Laboratory, Massachusetts Institute of Technology, Cambridge, Massachusetts, June 19… A. Guzman-Arenas… Testbed Coordinator, 415/859-4395, Artificial Intelligence Center, Computer Science and Technology Division. Prepared for: Defense Advanced Research… to support processing of aerial photographs for such military applications as cartography, intelligence, weapon guidance, and targeting. A key

  19. An Advanced Framework for Improving Situational Awareness in Electric Power Grid Operation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yousu; Huang, Zhenyu; Zhou, Ning

    With the deployment of new smart grid technologies and the penetration of renewable energy in power systems, significant uncertainty and variability are being introduced into power grid operation. Traditionally, the Energy Management System (EMS) operates the power grid in a deterministic mode, and thus will not be sufficient for the future control center in a stochastic environment with faster dynamics. One of the main challenges is to improve situational awareness. This paper reviews the current status of power grid operation and presents a vision of improving wide-area situational awareness for a future control center. An advanced framework, consisting of parallel state estimation, state prediction, parallel contingency selection, parallel contingency analysis, and advanced visual analytics, is proposed to provide capabilities needed for better decision support by utilizing high performance computing (HPC) techniques and advanced visual analytic techniques. Research results are presented to support the proposed vision and framework.
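
    Parallel contingency analysis, one element of the proposed framework, is naturally expressed as an embarrassingly parallel screen over N-1 outage cases. The toy model below redistributes a lost line's flow uniformly over the survivors instead of re-solving a power flow, and uses threads to stand in for HPC ranks; all line names and numbers are illustrative.

```python
from concurrent.futures import ThreadPoolExecutor

# Toy system: line -> (flow_mw, limit_mw). A real contingency analysis
# re-solves a (DC or AC) power flow per outage; here the lost flow is
# simply spread uniformly across the surviving lines.
LINES = {"L1": (400, 600), "L2": (300, 450), "L3": (250, 500), "L4": (150, 250)}

def screen_contingency(outage):
    """Return (outage, overloaded lines) for a single N-1 case."""
    lost_flow = LINES[outage][0]
    survivors = [name for name in LINES if name != outage]
    share = lost_flow / len(survivors)
    overloaded = [name for name in survivors
                  if LINES[name][0] + share > LINES[name][1]]
    return outage, overloaded

def run_parallel_screening():
    """Screen every N-1 case concurrently."""
    with ThreadPoolExecutor() as pool:
        return dict(pool.map(screen_contingency, LINES))
```

    In this toy data, only the outage of L1 flags an overload (on L4); a production EMS would feed such flags to contingency selection and the visual-analytics layer described above.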

  20. Nano Goes Magnetic to Attract Big Business

    NASA Technical Reports Server (NTRS)

    2006-01-01

    Glenn Research Center has combined state-of-the-art electrical designs with complex, computer-aided analyses to develop some of today's most advanced power systems, in space and on Earth. The center's Power and On-Board Propulsion Technology Division is the brain behind many of these power systems. For space, this division builds technologies that help power the International Space Station, the Hubble Space Telescope, and Earth-orbiting satellites. For Earth, it has woven advanced aerospace power concepts into commercial energy applications that include solar and nuclear power generation, battery and fuel cell energy storage, communications and telecommunications satellites, cryocoolers, hybrid and electric vehicles, and heating and air-conditioning systems.

  1. New and emerging patient-centered CT imaging and image-guided treatment paradigms for maxillofacial trauma.

    PubMed

    Dreizin, David; Nam, Arthur J; Hirsch, Jeffrey; Bernstein, Mark P

    2018-06-20

    This article reviews the conceptual framework, available evidence, and practical considerations pertaining to nascent and emerging advances in patient-centered CT-imaging and CT-guided surgery for maxillofacial trauma. These include cinematic rendering-a novel method for advanced 3D visualization, incorporation of quantitative CT imaging into the assessment of orbital fractures, low-dose CT imaging protocols made possible with contemporary scanners and reconstruction techniques, the rapidly growing use of cone-beam CT, virtual fracture reduction with design software for surgical pre-planning, the use of 3D printing for fabricating models and implants, and new avenues in CT-guided computer-aided surgery.

  2. Overview of the NASA/Marshall Space Flight Center (MSFC) CFD Consortium for Applications in Propulsion Technology

    NASA Astrophysics Data System (ADS)

    McConnaughey, P. K.; Schutzenhofer, L. A.

    1992-07-01

    This paper presents an overview of the NASA/Marshall Space Flight Center (MSFC) Computational Fluid Dynamics (CFD) Consortium for Applications in Propulsion Technology (CAPT). The objectives of this consortium are discussed, as is the approach of managing resources and technology to achieve these objectives. Significant results by the three CFD CAPT teams (Turbine, Pump, and Combustion) are briefly highlighted with respect to the advancement of CFD applications, the development and evaluation of advanced hardware concepts, and the integration of these results and CFD as a design tool to support Space Transportation Main Engine and National Launch System development.

  3. NASA's Participation in the National Computational Grid

    NASA Technical Reports Server (NTRS)

    Feiereisen, William J.; Zornetzer, Steve F. (Technical Monitor)

    1998-01-01

    Over the last several years it has become evident that the character of NASA's supercomputing needs has changed. One of the major missions of the agency is to support the design and manufacture of aero- and space-vehicles with technologies that will significantly reduce their cost. It is becoming clear that improvements in the process of aerospace design and manufacturing will require a high performance information infrastructure that allows geographically dispersed teams to draw upon resources that are broader than traditional supercomputing. A computational grid draws together our information resources into one system. We can foresee the time when a Grid will allow engineers and scientists to use the tools of supercomputers, databases and online experimental devices in a virtual environment to collaborate with distant colleagues. The concept of a computational grid has been spoken of for many years, but several events in recent times are conspiring to allow us to actually build one. In late 1997 the National Science Foundation initiated the Partnerships for Advanced Computational Infrastructure (PACI), which is built around the idea of distributed high performance computing. The Alliance, led by the National Computational Science Alliance (NCSA), and the National Partnership for Advanced Computational Infrastructure (NPACI), led by the San Diego Supercomputing Center, have been instrumental in drawing together the "Grid Community" to identify the technology bottlenecks and propose a research agenda to address them. During the same period NASA has begun to reformulate parts of two major high performance computing research programs to concentrate on distributed high performance computing and has banded together with the PACI centers to address the research agenda in common.

  4. ISCR Annual Report: Fiscal Year 2004

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McGraw, J R

    2005-03-03

    Large-scale scientific computation and all of the disciplines that support and help to validate it have been placed at the focus of Lawrence Livermore National Laboratory (LLNL) by the Advanced Simulation and Computing (ASC) program of the National Nuclear Security Administration (NNSA) and the Scientific Discovery through Advanced Computing (SciDAC) initiative of the Office of Science of the Department of Energy (DOE). The maturation of computational simulation as a tool of scientific and engineering research is underscored in the November 2004 statement of the Secretary of Energy that "high performance computing is the backbone of the nation's science and technology enterprise". LLNL operates several of the world's most powerful computers--including today's single most powerful--and has undertaken some of the largest and most compute-intensive simulations ever performed. Ultrascale simulation has been identified as one of the highest priorities in DOE's facilities planning for the next two decades. However, computers at architectural extremes are notoriously difficult to use efficiently. Furthermore, each successful terascale simulation only points out the need for much better ways of interacting with the resulting avalanche of data. Advances in scientific computing research have, therefore, never been more vital to LLNL's core missions than at present. Computational science is evolving so rapidly along every one of its research fronts that to remain on the leading edge, LLNL must engage researchers at many academic centers of excellence. In Fiscal Year 2004, the Institute for Scientific Computing Research (ISCR) served as one of LLNL's main bridges to the academic community with a program of collaborative subcontracts, visiting faculty, student internships, workshops, and an active seminar series.
The ISCR identifies researchers from the academic community for computer science and computational science collaborations with LLNL and hosts them for short- and long-term visits with the aim of encouraging long-term academic research agendas that address LLNL's research priorities. Through such collaborations, ideas and software flow in both directions, and LLNL cultivates its future workforce. The Institute strives to be LLNL's "eyes and ears" in the computer and information sciences, keeping the Laboratory aware of and connected to important external advances. It also attempts to be the "feet and hands" that carry those advances into the Laboratory and incorporate them into practice. ISCR research participants are integrated into LLNL's Computing and Applied Research (CAR) Department, especially into its Center for Applied Scientific Computing (CASC). In turn, these organizations address computational challenges arising throughout the rest of the Laboratory. Administratively, the ISCR flourishes under LLNL's University Relations Program (URP). Together with the other five institutes of the URP, it navigates a course that allows LLNL to benefit from academic exchanges while preserving national security. While it is difficult to operate an academic-like research enterprise within the context of a national security laboratory, the results declare the challenges well met and worth the continued effort.

  5. High-Performance Computing Data Center Water Usage Efficiency

    Science.gov Websites

    An advanced dry cooler that uses refrigerant in a passive cycle to dissipate heat was installed to improve the data center's water-usage efficiency: the system uses wet cooling when it's hot and dry cooling when it's not.

  6. Sandia National Laboratories: Advanced Simulation and Computing

    Science.gov Websites

  7. U.S. Climate Change Technology Program: Strategic Plan

    DTIC Science & Technology

    2006-09-01

    and Long Term, provides details on the 85 technologies in the R&D portfolio (Figure 2-1). Continuing Process: The United States, in partnership with… locations may be centered near or in residential locations, and work processes and products may be more commonly communicated or delivered via digital… chemical properties, along with advanced methods to simulate processes, will stem from advances in computational technology. Current Portfolio: The current

  8. NASA Computational Fluid Dynamics Conference. Volume 1: Sessions 1-6

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Presentations from the NASA Computational Fluid Dynamics (CFD) Conference held at the NASA Ames Research Center, Moffett Field, California, March 7-9, 1989, are given. Topics covered include research facility overviews of CFD research and applications, validation programs, direct simulation of compressible turbulence, turbulence modeling, advances in Runge-Kutta schemes for solving 3-D Navier-Stokes equations, grid generation and inviscid flow computation around aircraft geometries, numerical simulation of rotorcraft, and viscous drag prediction for rotor blades.

  9. Advanced technologies impact on compressor design and development: A perspective

    NASA Technical Reports Server (NTRS)

    Ball, Calvin L.

    1989-01-01

    A historical perspective of the impact of advanced technologies on compression system design and development for aircraft gas turbine applications is presented. A bright view of the future is projected in which further advancements in compression system technologies will be made. These advancements will have a significant impact on the ability to meet the ever-more-demanding requirements being imposed on the propulsion system for advanced aircraft. Examples are presented of advanced compression system concepts now being studied. The status and potential impact of transitioning from an empirically derived design system to a computationally oriented system are highlighted. A current NASA Lewis Research Center program to enhance this transitioning is described.

  10. RIACS

    NASA Technical Reports Server (NTRS)

    Moore, Robert C.

    1998-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities that serves as a bridge between NASA and the academic community. Under a five-year co-operative agreement with NASA, research at RIACS is focused on areas that are strategically enabling to the Ames Research Center's role as NASA's Center of Excellence for Information Technology. The primary mission of RIACS is to carry out research and development in computer science. This work is devoted in the main to tasks that are strategically enabling with respect to NASA's bold mission in space exploration and aeronautics. There are three foci for this work: (1) Automated Reasoning, (2) Human-Centered Computing, and (3) High Performance Computing and Networking. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. Through its visiting scientist program, RIACS facilitates the participation of university-based researchers, including both faculty and students, in the research activities of NASA and RIACS. RIACS researchers work in close collaboration with NASA computer scientists on projects such as the Remote Agent Experiment on the Deep Space One mission, and Super-Resolution Surface Modeling.

  11. Center for Cancer Genomics | Office of Cancer Genomics

    Cancer.gov

    The Center for Cancer Genomics (CCG) was established to unify the National Cancer Institute's activities in cancer genomics, with the goal of advancing genomics research and translating findings into the clinic to improve the precise diagnosis and treatment of cancers. In addition to promoting genomic sequencing approaches, CCG aims to accelerate structural, functional and computational research to explore cancer mechanisms, discover new cancer targets, and develop new therapeutics.

  12. Nanobiotechnology

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Compiler)

    2000-01-01

    This document contains the proceedings of the Training Workshop on Nanobiotechnology held at NASA Langley Research Center, Hampton, Virginia, June 14-15, 2000. The workshop was jointly sponsored by the University of Virginia's Center for Advanced Computational Technology and NASA. Workshop attendees were from NASA, other government agencies, industry and universities. The objectives of the workshop were to give overviews of the diverse activities in nanobiotechnology and to identify their potential for future aerospace systems.

  13. Integrating Technology into K-12 School Design.

    ERIC Educational Resources Information Center

    Syvertsen, Ken

    2002-01-01

    Asserting that advanced technology in schools is no longer reserved solely for spaces such as computer labs, media centers, and libraries, discusses how technology integration affects school design, addressing areas such as installation, space and proportion, lighting, furniture, and flexibility and simplicity. (EV)

  14. Advanced intellect-augmentation techniques

    NASA Technical Reports Server (NTRS)

    Engelbart, D. C.

    1972-01-01

    User experience in applying our augmentation tools and techniques to various normal working tasks within our center is described so as to convey a subjective impression of what it is like to work in an augmented environment. It is concluded that working-support, computer-aided systems for augmenting individuals and teams are undoubtedly going to be widely developed and used. A very special role in this development is seen for multi-access computer networks.

  15. Contention Bounds for Combinations of Computation Graphs and Network Topologies

    DTIC Science & Technology

    2014-08-08

    member of STARnet, a Semiconductor Research Corporation program sponsored by MARCO and DARPA, and ASPIRE Lab industrial sponsors and affiliates Intel… Google, Nokia, NVIDIA, Oracle, MathWorks and Samsung. Also funded by U.S. DOE Office of Science, Office of Advanced Scientific Computing Research… DARPA Award Number HR0011-12-2-0016, the Center for Future Architecture Research, a member of STARnet, a Semiconductor Research Corporation

  16. SCinet Architecture: Featured at the International Conference for High Performance Computing, Networking, Storage and Analysis 2016

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lyonnais, Marc; Smith, Matt; Mace, Kate P.

    SCinet is the purpose-built network that operates during the International Conference for High Performance Computing, Networking, Storage and Analysis (Supercomputing, or SC). Created each year for the conference, SCinet brings to life a high-capacity network that supports the applications and experiments that are a hallmark of the SC conference. The network links the convention center to research and commercial networks around the world. This resource serves as a platform for exhibitors to demonstrate the advanced computing resources of their home institutions and elsewhere by supporting a wide variety of applications. Volunteers from academia, government and industry work together to design and deliver the SCinet infrastructure. Industry vendors and carriers donate millions of dollars in equipment and services needed to build and support the local and wide area networks. Planning begins more than a year in advance of each SC conference and culminates in a high-intensity installation in the days leading up to the conference. The SCinet architecture for SC16 illustrates a dramatic increase in participation from the vendor community, particularly those that focus on network equipment. Software-Defined Networking (SDN) and Data Center Networking (DCN) are present in nearly all aspects of the design.

  17. Functional and performance requirements of the next NOAA-Kansas City computer system

    NASA Technical Reports Server (NTRS)

    Mosher, F. R.

    1985-01-01

    The development of the Advanced Weather Interactive Processing System for the 1990's (AWIPS-90) will result in more timely and accurate forecasts with improved cost effectiveness. As part of the AWIPS-90 initiative, the National Meteorological Center (NMC), the National Severe Storms Forecast Center (NSSFC), and the National Hurricane Center (NHC) are to receive upgrades of interactive processing systems. This National Center Upgrade program will support the specialized inter-center communications, data acquisition, and processing needs of these centers. The missions, current capabilities and general functional requirements for the upgrade to the NSSFC are addressed. System capabilities are discussed along with the requirements for the upgraded system.

  18. Developments at the Advanced Design Technologies Testbed

    NASA Technical Reports Server (NTRS)

    VanDalsem, William R.; Livingston, Mary E.; Melton, John E.; Torres, Francisco J.; Stremel, Paul M.

    2003-01-01

    A report presents background and historical information, as of August 1998, on the Advanced Design Technologies Testbed (ADTT) at Ames Research Center. The ADTT is characterized as an activity initiated to facilitate improvements in aerospace design processes; provide a proving ground for product-development methods and computational software and hardware; develop bridging methods, software, and hardware that can facilitate integrated solutions to design problems; and disseminate lessons learned to the aerospace and information technology communities.

  19. New frontiers in design synthesis

    NASA Technical Reports Server (NTRS)

    Goldin, D. S.; Venneri, S. L.; Noor, A. K.

    1999-01-01

    The Intelligent Synthesis Environment (ISE), which is one of the major strategic technologies under development at NASA centers and the University of Virginia, is described. One of the major objectives of ISE is to significantly enhance the rapid creation of innovative affordable products and missions. ISE uses a synergistic combination of leading-edge technologies, including high performance computing, high capacity communications and networking, human-centered computing, knowledge-based engineering, computational intelligence, virtual product development, and product information management. The environment will link scientists, design teams, manufacturers, suppliers, and consultants who participate in the mission synthesis as well as in the creation and operation of the aerospace system. It will radically advance the process by which complex science missions are synthesized, and high-tech engineering systems are designed, manufactured and operated. The five major components critical to ISE are human-centered computing, infrastructure for distributed collaboration, rapid synthesis and simulation tools, life cycle integration and validation, and cultural change in both the engineering and science creative process. The five components and their subelements are described. Related U.S. government programs are outlined and the future impact of ISE on engineering research and education is discussed.

  20. A framework for developing and integrating effective routing strategies within the emergency management decision-support system : [research brief].

    DOT National Transportation Integrated Search

    2012-05-01

    The terrorist attacks on September 11th, as well as other coordinated attacks on transit centers in Madrid and London, have underscored the importance of evacuation planning to transportation professionals. With computer technology advancement, urb...

  1. An Ongoing Revolution: Resource Sharing and OCLC.

    ERIC Educational Resources Information Center

    Nevins, Kate

    1998-01-01

    Discusses early developments in the Online Computer Library Center (OCLC) interlibrary loan, including use of OCLC for verification and request transmittal, improved service to patrons, internal cost control, and effects on work flow and borrowing patterns. Describes advances in OCLC, including internationalization, electronic information access,…

  2. Advances in Toxico-Cheminformatics: Supporting a New Paradigm for Predictive Toxicology

    EPA Science Inventory

    EPA’s National Center for Computational Toxicology is building capabilities to support a new paradigm for toxicity screening and prediction through the harnessing of legacy toxicity data, creation of data linkages, and generation of new high-throughput screening (HTS) data. The D...

  3. Water-Cooled Data Center Packs More Power Per Rack | Poster

    Cancer.gov

    By Frank Blanchard and Ken Michaels, Staff Writers Behind each tall, black computer rack in the data center at the Advanced Technology Research Facility (ATRF) is something both strangely familiar and oddly out of place: It looks like a radiator. The back door of each cabinet is gridded with the coils of the Liebert cooling system, which circulates chilled water to remove heat generated by the high-speed, high-capacity, fault-tolerant equipment.
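
    As a rough sense of the heat loads such a system handles, chilled-water cooling capacity follows from Q = ṁ · c_p · ΔT. A minimal sketch of that arithmetic; the 30 kW rack load and 10 °C water temperature rise are illustrative assumptions, not figures from the article:

    ```python
    # Illustrative sizing estimate (not from the article): chilled-water flow
    # needed to carry away a rack's heat load, from Q = m_dot * c_p * dT.

    def water_flow_lpm(heat_kw: float, delta_t_c: float) -> float:
        """Water flow (liters/minute) to absorb heat_kw with a delta_t_c rise."""
        c_p = 4.186                          # kJ/(kg*K), specific heat of water
        kg_per_s = heat_kw / (c_p * delta_t_c)
        return kg_per_s * 60.0               # ~1 kg of water per liter

    # A hypothetical 30 kW rack cooled with a 10 C temperature rise:
    flow = water_flow_lpm(30.0, 10.0)        # ~43 L/min
    ```

    The point of the back-of-envelope figure is that even modest water flows move far more heat than the equivalent volume of air, which is why rear-door water cooling lets a data center pack more power per rack.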

  4. Biomedical informatics research network: building a national collaboratory to hasten the derivation of new understanding and treatment of disease.

    PubMed

    Grethe, Jeffrey S; Baru, Chaitan; Gupta, Amarnath; James, Mark; Ludaescher, Bertram; Martone, Maryann E; Papadopoulos, Philip M; Peltier, Steven T; Rajasekar, Arcot; Santini, Simone; Zaslavsky, Ilya N; Ellisman, Mark H

    2005-01-01

    Through support from the National Institutes of Health's National Center for Research Resources, the Biomedical Informatics Research Network (BIRN) is pioneering the use of advanced cyberinfrastructure for medical research. By synchronizing developments in advanced wide area networking, distributed computing, distributed database federation, and other emerging capabilities of e-science, the BIRN has created a collaborative environment that is paving the way for biomedical research and clinical information management. The BIRN Coordinating Center (BIRN-CC) is orchestrating the development and deployment of key infrastructure components for immediate and long-range support of biomedical and clinical research being pursued by domain scientists in three neuroimaging test beds.

  5. 1999 NCCS Highlights

    NASA Technical Reports Server (NTRS)

    Bennett, Jerome (Technical Monitor)

    2002-01-01

    The NASA Center for Computational Sciences (NCCS) is a high-performance scientific computing facility operated, maintained and managed by the Earth and Space Data Computing Division (ESDCD) of NASA Goddard Space Flight Center's (GSFC) Earth Sciences Directorate. The mission of the NCCS is to advance leading-edge science by providing the best people, computers, and data storage systems to NASA's Earth and space sciences programs and those of other U.S. Government agencies, universities, and private institutions. Among the many computationally demanding Earth science research efforts supported by the NCCS in Fiscal Year 1999 (FY99) are the NASA Seasonal-to-Interannual Prediction Project, the NASA Search and Rescue Mission, Earth gravitational model development efforts, the National Weather Service's North American Observing System program, Data Assimilation Office studies, a NASA-sponsored project at the Center for Ocean-Land-Atmosphere Studies, a NASA-sponsored microgravity project conducted by researchers at the City University of New York and the University of Pennsylvania, the completion of a satellite-derived global climate data set, simulations of a new geodynamo model, and studies of Earth's torque. This document presents highlights of these research efforts and an overview of the NCCS, its facilities, and its people.

  6. The evolution of process-based hydrologic models: historical challenges and the collective quest for physical realism

    NASA Astrophysics Data System (ADS)

    Clark, Martyn P.; Bierkens, Marc F. P.; Samaniego, Luis; Woods, Ross A.; Uijlenhoet, Remko; Bennett, Katrina E.; Pauwels, Valentijn R. N.; Cai, Xitian; Wood, Andrew W.; Peters-Lidard, Christa D.

    2017-07-01

    The diversity in hydrologic models has historically led to great controversy on the correct approach to process-based hydrologic modeling, with debates centered on the adequacy of process parameterizations, data limitations and uncertainty, and computational constraints on model analysis. In this paper, we revisit key modeling challenges on requirements to (1) define suitable model equations, (2) define adequate model parameters, and (3) cope with limitations in computing power. We outline the historical modeling challenges, provide examples of modeling advances that address these challenges, and define outstanding research needs. We illustrate how modeling advances have been made by groups using models of different type and complexity, and we argue for the need to more effectively use our diversity of modeling approaches in order to advance our collective quest for physically realistic hydrologic models.

  7. The evolution of process-based hydrologic models: historical challenges and the collective quest for physical realism

    NASA Astrophysics Data System (ADS)

    Clark, M. P.; Nijssen, B.; Wood, A.; Mizukami, N.; Newman, A. J.

    2017-12-01

    The diversity in hydrologic models has historically led to great controversy on the "correct" approach to process-based hydrologic modeling, with debates centered on the adequacy of process parameterizations, data limitations and uncertainty, and computational constraints on model analysis. In this paper, we revisit key modeling challenges on requirements to (1) define suitable model equations, (2) define adequate model parameters, and (3) cope with limitations in computing power. We outline the historical modeling challenges, provide examples of modeling advances that address these challenges, and define outstanding research needs. We illustrate how modeling advances have been made by groups using models of different type and complexity, and we argue for the need to more effectively use our diversity of modeling approaches in order to advance our collective quest for physically realistic hydrologic models.

  8. Laboratory for Computer Science Progress Report 21, July 1983-June 1984.

    DTIC Science & Technology

    1984-06-01

    Systems 269 4. Distributed Consensus 270 5. Election of a Leader in a Distributed Ring of Processors 273 6. Distributed Network Algorithms 274 7. Diagnosis...multiprocessor systems. This facility, funded by the newly formed Strategic Computing Program of the Defense Advanced Research Projects Agency, will enable...Academic Staff P. Szolovits, Group Leader R. Patil Collaborating Investigators M. Criscitiello, M.D., Tufts-New England Medical Center Hospital R

  9. Research and Development Annual Report, 1992

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Issued as a companion to Johnson Space Center's Research and Technology Annual Report, which reports JSC accomplishments under NASA Research and Technology Operating Plan (RTOP) funding, this report describes 42 additional JSC projects that are funded through sources other than the RTOP. Emerging technologies in four major disciplines are summarized: space systems technology, medical and life sciences, mission operations, and computer systems. Although these projects focus on support of human spacecraft design, development, and safety, most have wide civil and commercial applications in areas such as advanced materials, superconductors, advanced semiconductors, digital imaging, high density data storage, high performance computers, optoelectronics, artificial intelligence, robotics and automation, sensors, biotechnology, medical devices and diagnosis, and human factors engineering.

  10. The JSC Research and Development Annual Report 1993

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Issued as a companion to Johnson Space Center's Research and Technology Annual Report, which reports JSC accomplishments under NASA Research and Technology Operating Plan (RTOP) funding, this report describes 47 additional projects that are funded through sources other than the RTOP. Emerging technologies in four major disciplines are summarized: space systems technology, medical and life sciences, mission operations, and computer systems. Although these projects focus on support of human spacecraft design, development, and safety, most have wide civil and commercial applications in areas such as advanced materials, superconductors, advanced semiconductors, digital imaging, high density data storage, high performance computers, optoelectronics, artificial intelligence, robotics and automation, sensors, biotechnology, medical devices and diagnosis, and human factors engineering.

  11. Use of a Computer Program for Advance Care Planning with African American Participants.

    PubMed

    Markham, Sarah A; Levi, Benjamin H; Green, Michael J; Schubart, Jane R

    2015-02-01

    The authors wish to acknowledge the support and assistance of Dr. William Lawrence for his contribution to the MAUT model used in the decision aid, Making Your Wishes Known: Planning Your Medical Future (MYWK); Dr. Cheryl Dellasega for her leadership in focus group activities; Charles Sabatino for his review of legal aspects of MYWK; Dr. Robert Pearlman and his collaborative team for use of the advance care planning booklet "Your Life, Your Choices"; Megan Whitehead for assistance in grant preparation and project organization; and the Instructional Media Development Center at the University of Wisconsin as well as JPL Integrated Communications for production and programming of MYWK. For various cultural and historical reasons, African Americans are less likely than Caucasians to engage in advance care planning (ACP) for healthcare decisions. This pilot study tested whether an interactive computer program could help overcome barriers to effective ACP among African Americans. African American adults were recruited from traditionally Black churches to complete an interactive computer program on ACP, pre-/post-questionnaires, and a follow-up phone interview. Eighteen adults (mean age = 53.2 years, 83% female) completed the program without any problems. Knowledge about ACP significantly increased following the computer intervention (44.9% → 61.3%, p = 0.0004), as did individuals' sense of self-determination. Participants were highly satisfied with the ACP process (9.4; 1 = not at all satisfied, 10 = extremely satisfied), and reported that the computer-generated advance directive accurately reflected their wishes (6.4; 1 = not at all accurate, 7 = extremely accurate). Follow-up phone interviews found that >80% of participants reported having shared their advance directives with family members and spokespeople.
Preliminary evidence suggests that an interactive computer program can help African Americans engage in effective advance care planning, including creating an accurate advance directive document that will be shared with loved ones. © 2015 National Medical Association. Published by Elsevier Inc. All rights reserved.

  12. Computational Methods Development at Ames

    NASA Technical Reports Server (NTRS)

    Kwak, Dochan; Smith, Charles A. (Technical Monitor)

    1998-01-01

    This viewgraph presentation outlines the development at Ames Research Center of advanced computational methods to provide appropriate-fidelity computational analysis/design capabilities. Current thrusts of the Ames research include: 1) methods to enhance/accelerate viscous flow simulation procedures, and the development of hybrid/polyhedral-grid procedures for viscous flow; 2) the development of real-time transonic flow simulation procedures for a production wind tunnel, and intelligent data management technology; and 3) the validation of methods and the study of flow physics. The presentation gives historical precedents to the above research and speculates on its future course.

  13. CSM Testbed Development and Large-Scale Structural Applications

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Gillian, R. E.; Mccleary, Susan L.; Lotts, C. G.; Poole, E. L.; Overman, A. L.; Macy, S. C.

    1989-01-01

    A research activity called Computational Structural Mechanics (CSM) conducted at the NASA Langley Research Center is described. This activity is developing advanced structural analysis and computational methods that exploit high-performance computers. Methods are developed in the framework of the CSM Testbed software system and applied to representative complex structural analysis problems from the aerospace industry. An overview of the CSM Testbed methods development environment is presented and some new numerical methods developed on a CRAY-2 are described. Selected application studies performed on the NAS CRAY-2 are also summarized.

  14. Computational Modeling of Microstructural-Evolution in AISI 1005 Steel During Gas Metal Arc Butt Welding

    DTIC Science & Technology

    2013-05-01

    of ferrite possessing an acicular/lenticular-plate morphology which grows into the untransformed austenite from the austenite/austenite grain...ferrite and lenticular-shaped Widmanstätten plates advancing from the allotriomorphic ferrite/austenite interfaces toward the grain centers is depicted

  15. Design of a robotic vehicle with self-contained intelligent wheels

    NASA Astrophysics Data System (ADS)

    Poulson, Eric A.; Jacob, John S.; Gunderson, Robert W.; Abbott, Ben A.

    1998-08-01

    The Center for Intelligent Systems has developed a small robotic vehicle named the Advanced Rover Chassis 3 (ARC 3) with six identical intelligent wheel units attached to a payload via a passive linkage suspension system. All wheels are steerable, so the ARC 3 can move in any direction while rotating at any rate allowed by the terrain and motors. Each intelligent wheel unit contains a drive motor, steering motor, batteries, and computer. All wheel units are identical, so manufacturing, programming, and spare replacement are greatly simplified. The intelligent wheel concept would allow the number and placement of wheels on the vehicle to be changed with no changes to the control system, except to list the position of all the wheels relative to the vehicle center. The task of controlling the ARC 3 is distributed between one master computer and the wheel computers. Tasks such as controlling the steering motors and calculating the speed of each wheel relative to the vehicle speed in a corner depend on the location of a wheel relative to the vehicle center and are processed by the wheel computers. Conflicts between the wheels are eliminated by computing the vehicle velocity control in the master computer. Various approaches to this distributed control problem, and various low-level control methods, have been explored.
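
    The per-wheel calculation described above (each wheel deriving its own speed and steering angle from its position relative to the vehicle center) follows from planar rigid-body kinematics: the velocity of a wheel at offset r is v + ω × r. A minimal sketch of what each wheel computer would evaluate; the function name and interface are illustrative, not from the paper:

    ```python
    import math

    def wheel_command(wheel_pos, v, omega):
        """Speed and steering command for one wheel of a rigid vehicle.

        wheel_pos: (x, y) of the wheel relative to the vehicle center (m)
        v:         (vx, vy) commanded vehicle velocity (m/s)
        omega:     commanded yaw rate (rad/s, positive counter-clockwise)
        Returns (speed, steering_angle) in the vehicle frame.
        """
        x, y = wheel_pos
        vx, vy = v
        # Rigid-body kinematics: wheel velocity = v + omega x r
        wx = vx - omega * y
        wy = vy + omega * x
        return math.hypot(wx, wy), math.atan2(wy, wx)
    ```

    Because each wheel needs only its own offset plus the shared (v, ω) command, adding or moving wheels changes nothing but the position list, which is the property the abstract highlights.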

  16. Personalizing Drug Selection Using Advanced Clinical Decision Support

    PubMed Central

    Pestian, John; Spencer, Malik; Matykiewicz, Pawel; Zhang, Kejian; Vinks, Alexander A.; Glauser, Tracy

    2009-01-01

    This article describes the process of developing an advanced pharmacogenetics clinical decision support at one of the United States’ leading pediatric academic medical centers. This system, called CHRISTINE, combines clinical and genetic data to identify the optimal drug therapy when treating patients with epilepsy or Attention Deficit Hyperactivity Disorder. In the discussion a description of clinical decision support systems is provided, along with an overview of neurocognitive computing and how it is applied in this setting. PMID:19898682

  17. Demonstration Advanced Avionics System (DAAS) functional description. [Cessna 402B aircraft

    NASA Technical Reports Server (NTRS)

    1980-01-01

    A comprehensive set of general aviation avionics were defined for integration into an advanced hardware mechanization for demonstration in a Cessna 402B aircraft. Block diagrams are shown and system and computer architecture as well as significant hardware elements are described. The multifunction integrated data control center and electronic horizontal situation indicator are discussed. The functions that the DAAS will perform are examined. This function definition is the basis for the DAAS hardware and software design.

  18. RIACS

    NASA Technical Reports Server (NTRS)

    Moore, Robert C.

    1998-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities that serves as a bridge between NASA and the academic community. Under a five-year cooperative agreement with NASA, research at RIACS is focused on areas that are strategically enabling to the Ames Research Center's role as NASA's Center of Excellence for Information Technology. Research is carried out by a staff of full-time scientists, augmented by visitors, students, postdoctoral candidates and visiting university faculty. RIACS is chartered to carry out research and development in computer science, devoted in the main to tasks that are strategically enabling with respect to NASA's bold mission in space exploration and aeronautics. There are three foci for this work: automated reasoning, human-centered computing, and high-performance computing and networking. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. Through its visiting scientist program, RIACS facilitates the participation of university-based researchers, including both faculty and students, in the research activities of NASA and RIACS. RIACS researchers work in close collaboration with NASA computer scientists on projects such as the Remote Agent Experiment on the Deep Space One mission, and Super-Resolution Surface Modeling.

  19. Optimization of a Monte Carlo Model of the Transient Reactor Test Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Kristin; DeHart, Mark; Goluoglu, Sedat

    2017-03-01

    The ultimate goal of modeling and simulation is to obtain reasonable answers to problems that do not have easily evaluated representations, while minimizing the amount of computational resources. With the advances of large-scale computing centers during the last twenty years, researchers have had the ability to create a multitude of tools to minimize the number of approximations necessary when modeling a system. The tremendous power of these centers requires the user to possess an immense amount of knowledge to optimize the models for accuracy and efficiency. This paper seeks to evaluate the KENO model of TREAT to optimize calculational efforts.

  20. The Practical Impact of Recent Computer Advances on the Analysis and Design of Large Scale Networks

    DTIC Science & Technology

    1974-06-01

    Capacity Considerations," ARPA Network Information Center, Stanford Research Institute. 10. Gitman, I., R. M. VanSlyke, H. Frank: "On Splitting...281-285. 12. Gitman, I., "On the Capacity of Slotted ALOHA Networks and Some Design Problems", ARPANET Network Information Center, Stanford...sum of the average demands of that population." Gitman, Van Slyke, and Frank [3] have addressed the problem of splitting a channel between two

  1. IDEAS: A multidisciplinary computer-aided conceptual design system for spacecraft

    NASA Technical Reports Server (NTRS)

    Ferebee, M. J., Jr.

    1984-01-01

    During the conceptual development of advanced aerospace vehicles, many compromises must be considered to balance economy and performance of the total system. Subsystem tradeoffs may need to be made in order to satisfy system-sensitive attributes. Due to the increasingly complex nature of aerospace systems, these trade studies have become more difficult and time-consuming to complete and involve interactions of ever-larger numbers of subsystems, components, and performance parameters. Current advances in computer-aided synthesis, modeling and analysis techniques have greatly helped in the evaluation of competing design concepts. Langley Research Center's Space Systems Division is currently engaged in trade studies for a variety of systems which include advanced ground-launched space transportation systems, space-based orbital transfer vehicles, large space antenna concepts and space stations. The need for engineering analysis tools to aid in the rapid synthesis and evaluation of spacecraft has led to the development of the Interactive Design and Evaluation of Advanced Spacecraft (IDEAS) computer-aided design system. The IDEAS system has been used to perform trade studies of competing technologies and requirements in order to pinpoint possible beneficial areas for research and development. IDEAS is presented as a multidisciplinary tool for the analysis of advanced space systems. Capabilities range from model generation and structural and thermal analysis to subsystem synthesis and performance analysis.

  2. Enabling campus grids with open science grid technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weitzel, Derek; Bockelman, Brian; Swanson, David

    2011-01-01

    The Open Science Grid is a recognized key component of the US national cyberinfrastructure enabling scientific discovery through advanced high-throughput computing. The principles and techniques that underlie the Open Science Grid can also be applied to campus grids, since many of the requirements are the same even if the implementation technologies differ. We find five requirements for a campus grid: trust relationships, job submission, resource independence, accounting, and data management. The Holland Computing Center's campus grid at the University of Nebraska-Lincoln was designed to fulfill these requirements. A bridging daemon was designed to bring non-Condor clusters into a grid managed by Condor. Condor features which make it possible to bridge Condor sites into a multi-campus grid have been exploited at the Holland Computing Center as well.

  3. Advanced Technology for Engineering Education

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Compiler); Malone, John B. (Compiler)

    1998-01-01

    This document contains the proceedings of the Workshop on Advanced Technology for Engineering Education, held at the Peninsula Graduate Engineering Center, Hampton, Virginia, February 24-25, 1998. The workshop was jointly sponsored by the University of Virginia's Center for Advanced Computational Technology and NASA. Workshop attendees came from NASA, other government agencies, industry and universities. The objectives of the workshop were to assess the status of advanced technologies for engineering education and to explore the possibility of forming a consortium of interested individuals/universities for curriculum reform and development using advanced technologies. The presentations covered novel delivery systems and several implementations of new technologies for engineering education. Certain materials and products are identified in this publication in order to specify adequately the materials and products that were investigated in the research effort. In no case does such identification imply recommendation or endorsement of products by NASA, nor does it imply that the materials and products are the only ones or the best ones available for this purpose. In many cases equivalent materials and products are available and would probably produce equivalent results.

  4. ASCR Cybersecurity for Scientific Computing Integrity - Research Pathways and Ideas Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peisert, Sean; Potok, Thomas E.; Jones, Todd

    At the request of the U.S. Department of Energy's (DOE) Office of Science (SC) Advanced Scientific Computing Research (ASCR) program office, a workshop was held June 2-3, 2015, in Gaithersburg, MD, to identify potential long-term (10 to 20+ year) fundamental cybersecurity research and development challenges, strategies, and a roadmap for future high performance computing (HPC), networks, data centers, and extreme-scale scientific user facilities. This workshop was a follow-on to the workshop held January 7-9, 2015, in Rockville, MD, that examined higher-level ideas about scientific computing integrity specific to the mission of the DOE Office of Science. Issues included research computation and simulation that takes place on ASCR computing facilities and networks, as well as network-connected scientific instruments, such as those run by various DOE Office of Science programs. Workshop participants included researchers and operational staff from DOE national laboratories, as well as academic researchers and industry experts. Participants were selected based on the submission of abstracts relating to the topics discussed in the previous workshop report [1] and also from other ASCR reports, including "Abstract Machine Models and Proxy Architectures for Exascale Computing" [27], the DOE "Preliminary Conceptual Design for an Exascale Computing Initiative" [28], and the January 2015 machine learning workshop [29]. The workshop was also attended by several observers from DOE and other government agencies. The workshop was divided into three topic areas: (1) Trustworthy Supercomputing, (2) Extreme-Scale Data, Knowledge, and Analytics for Understanding and Improving Cybersecurity, and (3) Trust within High-end Networking and Data Centers. Participants were divided into three corresponding teams based on the category of their abstracts.
The workshop began with a series of talks from the program manager and workshop chair, followed by the leaders for each of the three topics and a representative of each of the four major DOE Office of Science Advanced Scientific Computing Research Facilities: the Argonne Leadership Computing Facility (ALCF), the Energy Sciences Network (ESnet), the National Energy Research Scientific Computing Center (NERSC), and the Oak Ridge Leadership Computing Facility (OLCF). The rest of the workshop consisted of topical breakout discussions and focused writing periods that produced much of this report.

  5. Innovative Materials for Aircraft Morphing

    NASA Technical Reports Server (NTRS)

    Simpson, J. O.; Wise, S. A.; Bryant, R. G.; Cano, R. J.; Gates, T. S.; Hinkley, J. A.; Rogowski, R. S.; Whitley, K. S.

    1997-01-01

    Reported herein is an overview of the research being conducted within the Materials Division at NASA Langley Research Center on the development of smart material technologies for advanced airframe systems. The research is a part of the Aircraft Morphing Program which is a new six-year research program to develop smart components for self-adaptive airframe systems. The fundamental areas of materials research within the program are computational materials; advanced piezoelectric materials; advanced fiber optic sensing techniques; and fabrication of integrated composite structures. This paper presents a portion of the ongoing research in each of these areas of materials research.

  6. A One-of-a-Kind Technology Expansion.

    ERIC Educational Resources Information Center

    Wiens, Janet

    2002-01-01

    Describes the design of the expansion of the National Center for Supercomputing Applications (NCSA) Advanced Computation Building at the University of Illinois, Champaign. Discusses how the design incorporated column-free space for flexibility, cooling capacity, a freight elevator, and a 6-foot raised access floor to neatly house airflow, wiring,…

  7. Embedded 100 Gbps Photonic Components

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuznia, Charlie

    This innovation to fiber optic component technology increases the performance, reduces the size and reduces the power consumption of optical communications within dense network systems, such as advanced distributed computing systems and data centers. VCSEL technology is enabling short-reach (< 100 m) and >100 Gbps optical interconnections over multi-mode fiber in commercial applications.

  8. Investigation of Superdetonative Ram Accelerator Drive Modes

    DTIC Science & Technology

    1989-12-15

    137. 18. Dwoyer, D.L., Kutler, P., and Povinelli, L.A., "Retooling CFD for Hypersonic Aircraft," Aerospace America, Vol. 25, Oct. 1987, pp. 32-35. 19... Povinelli, L.A., "Advanced Computational Techniques for Hypersonic Propulsion," NASA Technical Memorandum No. 102005, NASA Lewis Research Center, Sept

  9. The Research Exchange, 2001.

    ERIC Educational Resources Information Center

    Research Exchange, 2001

    2001-01-01

    These three newsletters focus on advances and challenges in disability research. The first issue focuses on the results of a survey that investigated how many consumers with disabilities had a computer available in their home and their Internet use. The study involved administrators of Independent Living Centers (ILC) and ILC consumers. Findings…

  10. CESAR research in intelligent machines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weisbin, C.R.

    1986-01-01

    The Center for Engineering Systems Advanced Research (CESAR) was established in 1983 as a national center for multidisciplinary, long-range research and development in machine intelligence and advanced control theory for energy-related applications. Intelligent machines of interest here are artificially created operational systems that are capable of autonomous decision making and action. The initial emphasis for research is remote operations, with specific application to dexterous manipulation in unstructured dangerous environments where explosives, toxic chemicals, or radioactivity may be present, or in other environments with significant risk such as coal mining or oceanographic missions. Potential benefits include reduced risk to man in hazardous situations, machine replication of scarce expertise, minimization of human error due to fear or fatigue, and enhanced capability using high resolution sensors and powerful computers. A CESAR goal is to explore the interface between the advanced teleoperation capability of today, and the autonomous machines of the future.

  11. Research in mobile robotics at ORNL/CESAR (Oak Ridge National Laboratory/Center for Engineering Systems Advanced Research)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mann, R.C.; Weisbin, C.R.; Pin, F.G.

    1989-01-01

    This paper reviews ongoing and planned research with mobile autonomous robots at the Oak Ridge National Laboratory (ORNL), Center for Engineering Systems Advanced Research (CESAR). Specifically we report on results obtained with the robot HERMIES-IIB in navigation, intelligent sensing, learning, and on-board parallel computing in support of these functions. We briefly summarize an experiment with HERMIES-IIB that demonstrates the capability of smooth transitions between robot autonomy and tele-operation. This experiment results from collaboration among teams at the Universities of Florida, Michigan, Tennessee, and Texas; and ORNL in a program targeted at robotics for advanced nuclear power stations. We conclude by summarizing ongoing R&D with our new mobile robot HERMIES-III which is equipped with a seven degree-of-freedom research manipulator arm. 12 refs., 4 figs.

  12. Use of a Food and Drug Administration-Approved Type 1 Diabetes Mellitus Simulator to Evaluate and Optimize a Proportional-Integral-Derivative Controller

    DTIC Science & Technology

    2012-11-01

    performance. The simulations confirm that the PID algorithm can be applied to this cohort without the risk of hypoglycemia. Funding: The study was... Performance Computing Software Applications Institute, Telemedicine and Advanced Technology Research Center, U.S. Army Medical Research and Materiel Command...safe operating region, type 1 diabetes mellitus simulator Corresponding Author: Jaques Reifman, Ph.D., DoD Biotechnology High-Performance Computing
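    The record above evaluates a proportional-integral-derivative (PID) controller against a type 1 diabetes simulator. As a minimal illustrative sketch only (the gains, setpoint, and class name below are hypothetical and not taken from the study), a discrete-time PID update has the standard form:

    ```python
    class PID:
        """Minimal discrete-time PID controller (illustrative sketch;
        all gains and the setpoint are hypothetical, not from the study)."""

        def __init__(self, kp, ki, kd, setpoint, dt):
            self.kp, self.ki, self.kd = kp, ki, kd
            self.setpoint = setpoint      # target value of the controlled variable
            self.dt = dt                  # sampling interval
            self.integral = 0.0
            self.prev_error = None

        def update(self, measurement):
            """Return the control output for one new measurement."""
            error = self.setpoint - measurement
            self.integral += error * self.dt
            derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / self.dt
            self.prev_error = error
            return self.kp * error + self.ki * self.integral + self.kd * derivative
    ```

    In glucose-control applications the output would be mapped to an insulin delivery rate, with safety limits applied; those details are study-specific and omitted here.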

  13. Research in the Aloha system

    NASA Technical Reports Server (NTRS)

    Abramson, N.

    1974-01-01

    The Aloha system was studied and developed and extended to advanced forms of computer communications networks. Theoretical and simulation studies of Aloha type radio channels for use in packet switched communications networks were performed. Improved versions of the Aloha communications techniques and their extensions were tested experimentally. A packet radio repeater suitable for use with the Aloha system operational network was developed. General studies of the organization of multiprocessor systems centered on the development of the BCC 500 computer were concluded.
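    The theoretical channel studies mentioned above concern the classical ALOHA random-access results: under a Poisson traffic model, pure ALOHA achieves throughput S = G e^(-2G) and slotted ALOHA S = G e^(-G), where G is the offered load. A small sketch of these textbook formulas (not code from the Aloha project itself):

    ```python
    import math

    def pure_aloha_throughput(G):
        """Pure ALOHA throughput S = G * exp(-2G); peaks at G = 0.5 (~18.4%)."""
        return G * math.exp(-2 * G)

    def slotted_aloha_throughput(G):
        """Slotted ALOHA throughput S = G * exp(-G); peaks at G = 1 (~36.8%)."""
        return G * math.exp(-G)
    ```

    These closed forms explain why the improved ALOHA variants studied in the project mattered: the unmodified random-access channel wastes most of its capacity at high load.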

  14. HPC Annual Report 2017

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dennig, Yasmin

    Sandia National Laboratories has a long history of significant contributions to the high performance community and industry. Our innovative computer architectures allowed the United States to become the first to break the teraFLOP barrier—propelling us to the international spotlight. Our advanced simulation and modeling capabilities have been integral in high consequence US operations such as Operation Burnt Frost. Strong partnerships with industry leaders, such as Cray, Inc. and Goodyear, have enabled them to leverage our high performance computing (HPC) capabilities to gain a tremendous competitive edge in the marketplace. As part of our continuing commitment to providing modern computing infrastructure and systems in support of Sandia missions, we made a major investment in expanding Building 725 to serve as the new home of HPC systems at Sandia. Work is expected to be completed in 2018 and will result in a modern facility of approximately 15,000 square feet of computer center space. The facility will be ready to house the newest National Nuclear Security Administration/Advanced Simulation and Computing (NNSA/ASC) Prototype platform being acquired by Sandia, with delivery in late 2019 or early 2020. This new system will enable continuing advances by Sandia science and engineering staff in the areas of operating system R&D, operation cost effectiveness (power and innovative cooling technologies), user environment and application code performance.

  15. The new Langley Research Center advanced real-time simulation (ARTS) system

    NASA Technical Reports Server (NTRS)

    Crawford, D. J.; Cleveland, J. I., II

    1986-01-01

    Based on a survey of current local area network technology with special attention paid to high bandwidth and very low transport delay requirements, NASA's Langley Research Center designed a new simulation subsystem using the computer automated measurement and control (CAMAC) network. This required significant modifications to the standard CAMAC system and development of a network switch, a clocking system, new conversion equipment, new consoles, supporting software, etc. This system is referred to as the advanced real-time simulation (ARTS) system. It is presently being built at LaRC. This paper provides a functional and physical description of the hardware and a functional description of the software. The requirements which drove the design are presented as well as present performance figures and status.

  16. Contour Mapping

    NASA Technical Reports Server (NTRS)

    1995-01-01

    In the early 1990s, the Ohio State University Center for Mapping, a NASA Center for the Commercial Development of Space (CCDS), developed a system for mobile mapping called the GPSVan. While driving, the users can map an area from the sophisticated mapping van equipped with satellite signal receivers, video cameras and computer systems for collecting and storing mapping data. George J. Igel and Company and the Ohio State University Center for Mapping advanced the technology for use in determining the contours of a construction site. The new system reduces the time required for mapping and staking, and can monitor the amount of soil moved.

  17. Research and educational initiatives at the Syracuse University Center for Hypersonics

    NASA Technical Reports Server (NTRS)

    Spina, E.; Lagraff, J.; Davidson, B.; Bogucz, E.; Dang, T.

    1995-01-01

    The Department of Mechanical, Aerospace, and Manufacturing Engineering and the Northeast Parallel Architectures Center of Syracuse University have been funded by NASA to establish a program to educate young engineers in the hypersonic disciplines. This goal is being achieved through a comprehensive five-year program that includes elements of undergraduate instruction, advanced graduate coursework, undergraduate research, and leading-edge hypersonics research. The research foci of the Syracuse Center for Hypersonics are three-fold; high-temperature composite materials, measurements in turbulent hypersonic flows, and the application of high-performance computing to hypersonic fluid dynamics.

  18. Stochastic Feedforward Control Technique

    NASA Technical Reports Server (NTRS)

    Halyo, Nesim

    1990-01-01

    Class of commanded trajectories modeled as stochastic process. Advanced Transport Operating Systems (ATOPS) research and development program conducted by NASA Langley Research Center aimed at developing capabilities for increases in capacities of airports, safe and accurate flight in adverse weather conditions including shear, winds, avoidance of wake vortexes, and reduced consumption of fuel. Advances in techniques for design of modern controls and increased capabilities of digital flight computers coupled with accurate guidance information from Microwave Landing System (MLS). Stochastic feedforward control technique developed within context of ATOPS program.

  19. High performance network and channel-based storage

    NASA Technical Reports Server (NTRS)

    Katz, Randy H.

    1991-01-01

    In the traditional mainframe-centered view of a computer system, storage devices are coupled to the system through complex hardware subsystems called input/output (I/O) channels. With the dramatic shift towards workstation-based computing, and its associated client/server model of computation, storage facilities are now found attached to file servers and distributed throughout the network. We discuss the underlying technology trends that are leading to high performance network-based storage, namely advances in networks, storage devices, and I/O controller and server architectures. We review several commercial systems and research prototypes that are leading to a new approach to high performance computing based on network-attached storage.

  20. The evolution of process-based hydrologic models: historical challenges and the collective quest for physical realism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clark, Martyn P.; Bierkens, Marc F. P.; Samaniego, Luis

    The diversity in hydrologic models has historically led to great controversy on the correct approach to process-based hydrologic modeling, with debates centered on the adequacy of process parameterizations, data limitations and uncertainty, and computational constraints on model analysis. Here, we revisit key modeling challenges on requirements to (1) define suitable model equations, (2) define adequate model parameters, and (3) cope with limitations in computing power. We outline the historical modeling challenges, provide examples of modeling advances that address these challenges, and define outstanding research needs. We also illustrate how modeling advances have been made by groups using models of different type and complexity, and we argue for the need to more effectively use our diversity of modeling approaches in order to advance our collective quest for physically realistic hydrologic models.

  1. The evolution of process-based hydrologic models: historical challenges and the collective quest for physical realism

    DOE PAGES

    Clark, Martyn P.; Bierkens, Marc F. P.; Samaniego, Luis; ...

    2017-07-11

    The diversity in hydrologic models has historically led to great controversy on the correct approach to process-based hydrologic modeling, with debates centered on the adequacy of process parameterizations, data limitations and uncertainty, and computational constraints on model analysis. Here, we revisit key modeling challenges on requirements to (1) define suitable model equations, (2) define adequate model parameters, and (3) cope with limitations in computing power. We outline the historical modeling challenges, provide examples of modeling advances that address these challenges, and define outstanding research needs. We also illustrate how modeling advances have been made by groups using models of different type and complexity, and we argue for the need to more effectively use our diversity of modeling approaches in order to advance our collective quest for physically realistic hydrologic models.

  2. User participation in the development of the human/computer interface for control centers

    NASA Technical Reports Server (NTRS)

    Broome, Richard; Quick-Campbell, Marlene; Creegan, James; Dutilly, Robert

    1996-01-01

    Technological advances coupled with the requirements to reduce operations staffing costs led to the demand for efficient, technologically-sophisticated mission operations control centers. The control center under development for the Earth Observing System (EOS) is considered. The users are involved in the development of a control center in order to ensure that it is cost-efficient and flexible. A number of measures were implemented in the EOS program in order to encourage user involvement in the area of human-computer interface development. The following user participation exercises carried out in relation to the system analysis and design are described: the shadow participation of the programmers during a day of operations; the flight operations personnel interviews; and the analysis of the flight operations team tasks. The user participation in the interface prototype development, the prototype evaluation, and the system implementation are reported on. The involvement of the users early in the development process enables the requirements to be better understood and the cost to be reduced.

  3. The National Center for Biomedical Ontology

    PubMed Central

    Noy, Natalya F; Shah, Nigam H; Whetzel, Patricia L; Chute, Christopher G; Story, Margaret-Anne; Smith, Barry

    2011-01-01

    The National Center for Biomedical Ontology is now in its seventh year. The goals of this National Center for Biomedical Computing are to: create and maintain a repository of biomedical ontologies and terminologies; build tools and web services to enable the use of ontologies and terminologies in clinical and translational research; educate their trainees and the scientific community broadly about biomedical ontology and ontology-based technology and best practices; and collaborate with a variety of groups who develop and use ontologies and terminologies in biomedicine. The centerpiece of the National Center for Biomedical Ontology is a web-based resource known as BioPortal. BioPortal makes available for research in computationally useful forms more than 270 of the world's biomedical ontologies and terminologies, and supports a wide range of web services that enable investigators to use the ontologies to annotate and retrieve data, to generate value sets and special-purpose lexicons, and to perform advanced analytics on a wide range of biomedical data. PMID:22081220

  4. A Look Inside Argonne's Center for Nanoscale Materials

    ScienceCinema

    Divan, Ralu; Rosenthal, Dan; Rose, Volker; Wai Hla

    2018-05-23

    At a very small, or "nano" scale, materials behave differently. The study of nanomaterials is much more than miniaturization - scientists are discovering how changes in size change a material's properties. From sunscreen to computer memory, the applications of nanoscale materials research are all around us. Researchers at Argonne's Center for Nanoscale Materials are creating new materials, methods and technologies to address some of the world's greatest challenges in energy security, lightweight but durable materials, high-efficiency lighting, information storage, environmental stewardship and advanced medical devices.

  5. Symposium on Automation, Robotics and Advanced Computing for the National Space Program (2nd) Held in Arlington, Virginia on 9-11 March 1987

    DTIC Science & Technology

    1988-02-28

    enormous investment in software. This is an area extremely important objective. We need additional where better methodologies, tools and theories...microscopy (SEM) and optical mi- [131 Hanson, A., et al. "A Methodology for the Develop- croscopy. Current activities include the study of SEM im- ment...through a phased knowledge engineering methodology Center (ARC) and NASA Johnson Space Center consisting of: prototype knowledge base develop- iJSC

  6. Johnson Space Center Research and Technology 1997 Annual Report

    NASA Technical Reports Server (NTRS)

    1998-01-01

    This report highlights key projects and technologies at Johnson Space Center for 1997. The report focuses on the commercial potential of the projects and technologies and is arranged by CorpTech Major Products Groups. Emerging technologies in these major disciplines are summarized: solar system sciences, life sciences, technology transfer, computer sciences, space technology, and human support technology. These NASA advances have a range of potential commercial applications, from a school internet manager for networks to a liquid metal mirror for optical measurements.

  7. A study of workstation computational performance for real-time flight simulation

    NASA Technical Reports Server (NTRS)

    Maddalon, Jeffrey M.; Cleveland, Jeff I., II

    1995-01-01

    With recent advances in microprocessor technology, some have suggested that modern workstations provide enough computational power to properly operate a real-time simulation. This paper presents the results of a computational benchmark, based on actual real-time flight simulation code used at Langley Research Center, which was executed on various workstation-class machines. The benchmark was executed on different machines from several companies including: CONVEX Computer Corporation, Cray Research, Digital Equipment Corporation, Hewlett-Packard, Intel, International Business Machines, Silicon Graphics, and Sun Microsystems. The machines are compared by their execution speed, computational accuracy, and porting effort. The results of this study show that the raw computational power needed for real-time simulation is now offered by workstations.

  8. Preliminary Computational Study for Future Tests in the NASA Ames 9-Foot x 7-Foot Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Pearl, Jason M.; Carter, Melissa B.; Elmiligui, Alaa A.; WInski, Courtney S.; Nayani, Sudheer N.

    2016-01-01

    The NASA Advanced Air Vehicles Program, Commercial Supersonics Technology Project seeks to advance tools and techniques to make over-land supersonic flight feasible. In this study, preliminary computational results are presented for future tests in the NASA Ames 9-foot x 7-foot supersonic wind tunnel to be conducted in early 2016. Shock-plume interactions and their effect on pressure signature are examined for six model geometries. Near-field pressure signatures are assessed using the CFD code USM3D to model the proposed test geometries in free-air. Additionally, results obtained using the commercial grid generation software Pointwise (registered trademark) are compared to results using VGRID, the NASA Langley Research Center in-house mesh generation program.

  9. Global satellite composites - 20 years of evolution

    NASA Astrophysics Data System (ADS)

    Kohrs, Richard A.; Lazzara, Matthew A.; Robaidek, Jerrold O.; Santek, David A.; Knuth, Shelley L.

    2014-01-01

    For two decades, the University of Wisconsin Space Science and Engineering Center (SSEC) and the Antarctic Meteorological Research Center (AMRC) have been creating global, regional and hemispheric satellite composites. These composites have proven useful in research, operational forecasting, commercial applications and educational outreach. Using the Man computer Interactive Data Access System (McIDAS) software developed at SSEC, infrared window composites were created by combining Geostationary Operational Environmental Satellite (GOES) and polar-orbiting data from the SSEC Data Center with polar data acquired at McMurdo and Palmer stations, Antarctica. Increased computer processing speed has allowed for more advanced algorithms to address the decision-making process for co-located pixels. The algorithms have evolved from a simplistic maximum brightness temperature to those that account for distance from the sub-satellite point, parallax displacement, pixel time and resolution. The composites are the state-of-the-art means for merging/mosaicking satellite imagery.
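    The co-located-pixel decision described above can be illustrated with a tiny sketch. This is a hedged example only: the field names (`zenith_deg`, `time`) and the particular tie-breaking rule are assumptions for illustration, not the actual SSEC/AMRC algorithm, which also weighs parallax displacement and resolution.

    ```python
    def select_pixel(candidates):
        """From co-located candidate pixels observed by overlapping satellites,
        pick the one nearest its satellite's sub-satellite point (smallest
        view-zenith angle), breaking ties by the most recent observation time.
        Field names are hypothetical; min() compares the key tuples in order."""
        return min(candidates, key=lambda p: (p["zenith_deg"], -p["time"]))
    ```

    Applied to every output grid cell, a rule of this kind replaces the older "maximum brightness temperature" choice with one that favors the least-distorted, freshest observation.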

  10. ISCR FY2005 Annual Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keyes, D E; McGraw, J R

    2006-02-02

    Large-scale scientific computation and all of the disciplines that support and help validate it have been placed at the focus of Lawrence Livermore National Laboratory (LLNL) by the Advanced Simulation and Computing (ASC) program of the National Nuclear Security Administration (NNSA) and the Scientific Discovery through Advanced Computing (SciDAC) initiative of the Office of Science of the Department of Energy (DOE). The maturation of simulation as a fundamental tool of scientific and engineering research is underscored in the President's Information Technology Advisory Committee (PITAC) June 2005 finding that ''computational science has become critical to scientific leadership, economic competitiveness, and national security''. LLNL operates several of the world's most powerful computers--including today's single most powerful--and has undertaken some of the largest and most compute-intensive simulations ever performed, most notably the molecular dynamics simulation that sustained more than 100 Teraflop/s and won the 2005 Gordon Bell Prize. Ultrascale simulation has been identified as one of the highest priorities in DOE's facilities planning for the next two decades. However, computers at architectural extremes are notoriously difficult to use in an efficient manner. Furthermore, each successful terascale simulation only points out the need for much better ways of interacting with the resulting avalanche of data. Advances in scientific computing research have, therefore, never been more vital to the core missions of LLNL than at present. Computational science is evolving so rapidly along every one of its research fronts that to remain on the leading edge, LLNL must engage researchers at many academic centers of excellence.
In FY 2005, the Institute for Scientific Computing Research (ISCR) served as one of LLNL's main bridges to the academic community with a program of collaborative subcontracts, visiting faculty, student internships, workshops, and an active seminar series. The ISCR identifies researchers from the academic community for computer science and computational science collaborations with LLNL and hosts them for both brief and extended visits with the aim of encouraging long-term academic research agendas that address LLNL research priorities. Through these collaborations, ideas and software flow in both directions, and LLNL cultivates its future workforce. The Institute strives to be LLNL's ''eyes and ears'' in the computer and information sciences, keeping the Laboratory aware of and connected to important external advances. It also attempts to be the ''hands and feet'' that carry those advances into the Laboratory and incorporate them into practice. ISCR research participants are integrated into LLNL's Computing Applications and Research (CAR) Department, especially into its Center for Applied Scientific Computing (CASC). In turn, these organizations address computational challenges arising throughout the rest of the Laboratory. Administratively, the ISCR flourishes under LLNL's University Relations Program (URP). Together with the other four institutes of the URP, the ISCR navigates a course that allows LLNL to benefit from academic exchanges while preserving national security. While it is difficult to operate an academic-like research enterprise within the context of a national security laboratory, the results declare the challenges well met and worth the continued effort. The pages of this annual report summarize the activities of the faculty members, postdoctoral researchers, students, and guests from industry and other laboratories who participated in LLNL's computational mission under the auspices of the ISCR during FY 2005.

  11. CESDIS

    NASA Technical Reports Server (NTRS)

    1994-01-01

    CESDIS, the Center of Excellence in Space Data and Information Sciences, was developed jointly by NASA, Universities Space Research Association (USRA), and the University of Maryland in 1988 to focus on the design of advanced computing techniques and data systems to support NASA Earth and space science research programs. CESDIS is operated by USRA under contract to NASA. The Director, Associate Director, Staff Scientists, and administrative staff are located on-site at NASA's Goddard Space Flight Center in Greenbelt, Maryland. The primary CESDIS mission is to increase the connection between computer science and engineering research programs at colleges and universities and NASA groups working with computer applications in Earth and space science. The 1993-94 CESDIS year included a broad range of computer science research applied to NASA problems. This report provides an overview of these research projects and programs as well as a summary of the various other activities of CESDIS in support of NASA and the university research community. We have had an exciting and challenging year.

  12. NASA Intelligent Systems Project: Results, Accomplishments and Impact on Science Missions.

    NASA Astrophysics Data System (ADS)

    Coughlan, J. C.

    2005-12-01

    The Intelligent Systems Project was responsible for much of NASA's programmatic investment in artificial intelligence and advanced information technologies. IS has completed three major project milestones which demonstrated increased capabilities in autonomy, human centered computing, and intelligent data understanding. Autonomy involves the ability of a robot to place an instrument on a remote surface with a single command cycle, human centered computing supported a collaborative, mission centric data and planning system for the Mars Exploration Rovers and data understanding has produced key components of a terrestrial satellite observation system with automated modeling and data analysis capabilities. This paper summarizes the technology demonstrations and metrics which quantify and summarize these new technologies which are now available for future NASA missions.

  13. NASA Intelligent Systems Project: Results, Accomplishments and Impact on Science Missions

    NASA Technical Reports Server (NTRS)

    Coughlan, Joseph C.

    2005-01-01

    The Intelligent Systems Project was responsible for much of NASA's programmatic investment in artificial intelligence and advanced information technologies. IS has completed three major project milestones which demonstrated increased capabilities in autonomy, human centered computing, and intelligent data understanding. Autonomy involves the ability of a robot to place an instrument on a remote surface with a single command cycle. Human centered computing supported a collaborative, mission centric data and planning system for the Mars Exploration Rovers and data understanding has produced key components of a terrestrial satellite observation system with automated modeling and data analysis capabilities. This paper summarizes the technology demonstrations and metrics which quantify and summarize these new technologies which are now available for future NASA missions.

  14. The Inside View

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Bio-Imaging Research, Inc., has been included in Spinoff 1990 and 1993 with spinoffs from their ACTIS (Advanced Computed Tomography Inspection System) product developed under a Marshall Space Flight Center SBIR (Small Business Innovative Research) contract. The latest application is noninvasive nuclear waste drum inspection. With the ACTIS CT (computed tomography, CAT scan) scanner, drums of radioactive waste are examined to verify that they contain less than one-half percent free liquid and that the drum wall has not lost integrity before the drums are moved across state lines or permanently disposed of.

  15. The Future is Hera: Analyzing Astronomical Data Over the Internet

    NASA Astrophysics Data System (ADS)

    Valencic, Lynne A.; Snowden, S.; Chai, P.; Shafer, R.

    2009-01-01

    Hera is the new data processing facility provided by the HEASARC at the NASA Goddard Space Flight Center for analyzing astronomical data. Hera provides all the preinstalled software packages, local disk space, and computing resources needed to do general processing of FITS format data files residing on the user's local computer, and to do advanced research using the publicly available data from High Energy Astrophysics missions. Qualified students, educators, and researchers may freely use the Hera services over the internet for research and educational purposes.

  16. Basic EMC Technology Advancement for C(3) Systems - SHIELD. Volume IV B. A Digital Computer Program for Computing Crosstalk between Shielded Cables

    DTIC Science & Technology

    1982-11-01

    your organization, please notify RADC OBCT) Griffiss AFB NY 13441. This will assist us in maintaining a current mailing list. Do not return copies of...RMING ORGANIZATION NAME r AND ADDRESS 10. PROGRAM ELEMENT. PROJECT. TASK Southeastern Center for Electrical AREA6WORKUNITNUMBERS Engineering Education...The program requires that the input data groups be organized as shown in Table 1 where the number of unshielded wires is U and the number of shielded

  17. Computer Support for Knowledge Communication in Science Exhibitions: Novel Perspectives from Research on Collaborative Learning

    ERIC Educational Resources Information Center

    Knipfer, Kristin; Mayr, Eva; Zahn, Carmen; Schwan, Stephan; Hesse, Friedrich W.

    2009-01-01

    In this article, the potentials of advanced technologies for learning in science exhibitions are outlined. For this purpose, we conceptualize science exhibitions as "dynamic information space for knowledge building" which includes three pathways of knowledge communication. This article centers on the second pathway, that is, knowledge…

  18. A Review of Models and Frameworks for Designing Mobile Learning Experiences and Environments

    ERIC Educational Resources Information Center

    Hsu, Yu-Chang; Ching, Yu-Hui

    2015-01-01

    Mobile learning has become increasingly popular in the past decade due to the unprecedented technological affordances achieved through the advancement of mobile computing, which makes ubiquitous and situated learning possible. At the same time, there have been research and implementation projects whose efforts centered on developing mobile…

  19. High Tech and the Upward Mobility of Non-Technical People.

    ERIC Educational Resources Information Center

    Kammire, Linda L.

    The social and psychological effects of rapid technological advancement in the computer industry is the subject of this paper, which focuses on the concerns of people with non-technical backgrounds. It describes the career series, High Tech for Non-Technical People, created by the Georgia State University Career Development Center. The three…

  20. Low-Cost Terminal Alternative for Learning Center Managers. Final Report.

    ERIC Educational Resources Information Center

    Nix, C. Jerome; And Others

    This study established the feasibility of replacing high performance and relatively expensive computer terminals with less expensive ones adequate for supporting specific tasks of Advanced Instructional System (AIS) at Lowry AFB, Colorado. Surveys of user requirements and available devices were conducted and the results used in a system analysis.…

  1. Edison - A New Cray Supercomputer Advances Discovery at NERSC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dosanjh, Sudip; Parkinson, Dula; Yelick, Kathy

    2014-02-06

    When a supercomputing center installs a new system, users are invited to make heavy use of the computer as part of the rigorous testing. In this video, find out what top scientists have discovered using Edison, a Cray XC30 supercomputer, and how NERSC's newest supercomputer will accelerate their future research.

  2. Army Maneuver Center of Excellence

    DTIC Science & Technology

    2012-10-18

    Describes strategic partnerships that benefit the Army materiel enterprise through agreements throughout DoD and with external agencies (DARPA, JIEDDO, DHS, FAA, DoE, NSA, NASA, SMDC, etc.), along with emerging science areas (neuroscience, network sciences, hierarchical computing, extreme energy science, autonomous systems technology, meso-scale grain studies) and improvements in Soldier-system overall performance through operational neuroscience and advanced simulation and training technologies.

  3. Edison - A New Cray Supercomputer Advances Discovery at NERSC

    ScienceCinema

    Dosanjh, Sudip; Parkinson, Dula; Yelick, Kathy; Trebotich, David; Broughton, Jeff; Antypas, Katie; Lukic, Zarija; Borrill, Julian; Draney, Brent; Chen, Jackie

    2018-01-16

    When a supercomputing center installs a new system, users are invited to make heavy use of the computer as part of the rigorous testing. In this video, find out what top scientists have discovered using Edison, a Cray XC30 supercomputer, and how NERSC's newest supercomputer will accelerate their future research.

  4. OCLC Research: 2012 Activity Report

    ERIC Educational Resources Information Center

    OCLC Online Computer Library Center, Inc., 2013

    2013-01-01

    The mission of the Online Computer Library Center (OCLC) Research is to expand knowledge that advances OCLC's public purposes of furthering access to the world's information and reducing library costs. OCLC Research is dedicated to three roles: (1)To act as a community resource for shared research and development (R&D); (2) To provide advanced…

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    East, D. R.; Sexton, J.

    This was a collaborative effort between Lawrence Livermore National Security, LLC, as manager and operator of Lawrence Livermore National Laboratory (LLNL), and the IBM TJ Watson Research Center to research, assess the feasibility of, and develop an implementation plan for a High Performance Computing Innovation Center (HPCIC) in the Livermore Valley Open Campus (LVOC). The ultimate goal of this work was to help advance the State of California and U.S. commercial competitiveness in the arena of High Performance Computing (HPC) by accelerating the adoption of computational science solutions, consistent with recent DOE strategy directives. The desired result of this CRADA was a well-researched, carefully analyzed market evaluation that would identify those firms in core sectors of the US economy seeking to adopt or expand their use of HPC to become more competitive globally, and define how those firms could be helped by the HPCIC with IBM as an integral partner.

  6. A static data flow simulation study at Ames Research Center

    NASA Technical Reports Server (NTRS)

    Barszcz, Eric; Howard, Lauri S.

    1987-01-01

    Demands in computational power, particularly in the area of computational fluid dynamics (CFD), led NASA Ames Research Center to study advanced computer architectures. One architecture being studied is the static data flow architecture based on research done by Jack B. Dennis at MIT. To improve understanding of this architecture, a static data flow simulator, written in Pascal, has been implemented for use on a Cray X-MP/48. A matrix multiply and a two-dimensional fast Fourier transform (FFT), two algorithms used in CFD work at Ames, have been run on the simulator. Execution times can vary by a factor of more than 2 depending on the partitioning method used to assign instructions to processing elements. Service time for matching tokens has proved to be a major bottleneck. Loop control and array address calculation overhead can double the execution time. The best sustained MFLOPS rates were less than 50% of the maximum capability of the machine.
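
    The firing rule at the heart of a static data flow machine (an instruction executes only once tokens have arrived on all of its input arcs) can be sketched in a few lines. This toy interpreter is purely illustrative and is not the Ames Pascal simulator:

```python
# Toy static dataflow interpreter (an illustrative sketch, not the Ames
# Pascal simulator). Each instruction fires once a token has arrived on
# every input arc; its result becomes a token on downstream arcs.
import operator

# Graph computing (a + b) * (a - b): node -> (op, input_nodes)
graph = {
    "add": (operator.add, ("a", "b")),
    "sub": (operator.sub, ("a", "b")),
    "mul": (operator.mul, ("add", "sub")),
}

def run(graph, inputs):
    tokens = dict(inputs)            # node name -> produced token value
    pending = set(graph)
    while pending:
        fired = {n for n, (op, srcs) in graph.items()
                 if n in pending and all(s in tokens for s in srcs)}
        if not fired:
            raise RuntimeError("deadlock: no node can fire")
        for n in fired:              # all ready nodes fire "in parallel"
            op, srcs = graph[n]
            tokens[n] = op(*(tokens[s] for s in srcs))
        pending -= fired
    return tokens

result = run(graph, {"a": 7, "b": 3})
print(result["mul"])  # (7+3)*(7-3) = 40
```

    In a real machine the token-matching step done here with a dictionary lookup is a hardware unit, which is why its service time can become the bottleneck the abstract describes.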

  7. The QUANTGRID Project (RO)—Quantum Security in GRID Computing Applications

    NASA Astrophysics Data System (ADS)

    Dima, M.; Dulea, M.; Petre, M.; Petre, C.; Mitrica, B.; Stoica, M.; Udrea, M.; Sterian, R.; Sterian, P.

    2010-01-01

    The QUANTGRID Project, financed through the National Center for Programme Management (CNMP-Romania), is the first attempt at using Quantum Crypted Communications (QCC) in large-scale operations, such as GRID computing, and conceivably in the years ahead in the banking sector and other security-tight communications. In collaboration with the GRID activities of the Center for Computing & Communications (Nat.'l Inst. Nucl. Phys.—IFIN-HH), the Quantum Optics Lab. (Nat.'l Inst. Plasma and Lasers—INFLPR), and the Physics Dept. (University Polytechnica—UPB), the project will build a demonstrator infrastructure for this technology. The status of the project in its incipient phase is reported, featuring tests of communications in classical security mode: socket-level communications under AES (Advanced Encryption Std.), both implemented as proprietary C++ code. An outline of the planned undertaking of the project is communicated, highlighting its impact in quantum physics, coherent optics and information technology.
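
    The classical-security tests pair socket-level communication with AES. The following sketch shows only the symmetric-key pattern involved; Python's standard library has no AES, so a SHA-256 counter-mode keystream stands in for it here, and the key and message are invented for illustration:

```python
# Symmetric stream-encryption sketch in the spirit of the project's
# AES-secured sockets. The stdlib lacks AES, so a SHA-256-based
# counter-mode keystream stands in; production code would use a real
# AES implementation (e.g. the `cryptography` package).
import hashlib

def keystream(key: bytes, nbytes: int) -> bytes:
    """Derive a pseudorandom keystream by hashing key || counter."""
    out = bytearray()
    counter = 0
    while len(out) < nbytes:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:nbytes])

def xor_crypt(key: bytes, data: bytes) -> bytes:
    """XOR with the keystream; the same call encrypts and decrypts."""
    ks = keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

key = b"shared-grid-key"                  # hypothetical pre-shared key
msg = b"job submission payload"           # hypothetical socket payload
ciphertext = xor_crypt(key, msg)          # sender side of the socket
assert xor_crypt(key, ciphertext) == msg  # receiver decrypts with same key
```

    In QCC the novelty is that the shared key itself is distributed over a quantum channel, while the bulk traffic stays on classical sockets as above.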

  8. Challenging Technology, and Technology Infusion into 21st Century

    NASA Technical Reports Server (NTRS)

    Chau, S. N.; Hunter, D. J.

    2001-01-01

    In preparing for the space exploration challenges of the next century, the National Aeronautics and Space Administration (NASA) Center for Integrated Space Micro-Systems (CISM) is chartered to develop advanced spacecraft systems that can be adapted for a large spectrum of future space missions. Enabling this task are revolutions in the miniaturization of electrical, mechanical, and computational functions. On the other hand, these revolutionary technologies usually have much lower readiness levels than those required by flight projects. The mission of the Advanced Micro Spacecraft (AMS) task in CISM is to bridge the readiness gap between advanced technologies and flight projects. Additional information is contained in the original extended abstract.

  9. Computational Science: A Research Methodology for the 21st Century

    NASA Astrophysics Data System (ADS)

    Orbach, Raymond L.

    2004-03-01

    Computational simulation - a means of scientific discovery that employs computer systems to simulate a physical system according to laws derived from theory and experiment - has attained peer status with theory and experiment. Important advances in basic science are accomplished by a new "sociology" for ultrascale scientific computing capability (USSCC), a fusion of sustained advances in scientific models, mathematical algorithms, computer architecture, and scientific software engineering. Expansion of current capabilities by factors of 100 - 1000 opens up new vistas for scientific discovery: long-term climatic variability and change, macroscopic material design from correlated behavior at the nanoscale, design and optimization of magnetic confinement fusion reactors, strong interactions on a computational lattice through quantum chromodynamics, and stellar explosions and element production. The "virtual prototype," made possible by this expansion, can markedly reduce time-to-market for industrial applications such as jet engines and safer, cleaner, more fuel-efficient cars. In order to develop USSCC, the National Energy Research Scientific Computing Center (NERSC) announced the competition "Innovative and Novel Computational Impact on Theory and Experiment" (INCITE), with no requirement for current DOE sponsorship. Fifty-nine proposals for grand-challenge scientific problems were submitted for a small number of awards. The successful grants, and their preliminary progress, will be described.

  10. Future experimental needs to support applied aerodynamics - A transonic perspective

    NASA Technical Reports Server (NTRS)

    Gloss, Blair B.

    1992-01-01

    Advancements in facilities, test techniques, and instrumentation are needed to provide data required for the development of advanced aircraft and to verify computational methods. An industry survey of major users of wind tunnel facilities at Langley Research Center (LaRC) was recently carried out to determine future facility requirements, test techniques, and instrumentation requirements; results from this survey are reflected in this paper. In addition, areas related to transonic testing at LaRC which are either currently being developed or are recognized as needing improvements are discussed.

  11. Advanced Plant Habitat

    NASA Image and Video Library

    2016-11-17

    A test unit, or prototype, of NASA's Advanced Plant Habitat (APH) was delivered to the Space Station Processing Facility at the agency's Kennedy Space Center in Florida. Inside a laboratory, Engineering Services Contract engineers set up test parameters on computers. From left, are Glenn Washington, ESC quality engineer; Claton Grosse, ESC mechanical engineer; and Jeff Richards, ESC project scientist. The APH is the largest plant chamber built for the agency. It will have 180 sensors and four times the light output of Veggie. The APH will be delivered to the International Space Station in March 2017.

  12. Advanced technology airfoil research, volume 1, part 2

    NASA Technical Reports Server (NTRS)

    1978-01-01

    This compilation contains papers presented at the NASA Conference on Advanced Technology Airfoil Research held at Langley Research Center on March 7-9, 1978, which have unlimited distribution. This conference provided a comprehensive review of all NASA airfoil research, conducted in-house and under grant and contract. A broad spectrum of airfoil research outside of NASA was also reviewed. The major thrust of the technical sessions was in three areas: development of computational aerodynamic codes for airfoil analysis and design, development of experimental facilities and test techniques, and all types of airfoil applications.

  13. Advanced Methodologies for NASA Science Missions

    NASA Astrophysics Data System (ADS)

    Hurlburt, N. E.; Feigelson, E.; Mentzel, C.

    2017-12-01

    Most of NASA's commitment to computational space science involves the organization and processing of Big Data from space-based satellites, and the calculations of advanced physical models based on these datasets. But considerable thought is also needed on what computations are needed. The science questions addressed by space data are so diverse and complex that traditional analysis procedures are often inadequate. The knowledge and skills of the statistician, applied mathematician, and algorithmic computer scientist must be incorporated into programs that currently emphasize engineering and physical science. NASA's culture and administrative mechanisms take full cognizance that major advances in space science are driven by improvements in instrumentation. But it is less well recognized that new instruments and science questions give rise to new challenges in the treatment of satellite data after it is telemetered to the ground. These issues might be divided into two stages: data reduction through software pipelines developed within NASA mission centers; and science analysis that is performed by hundreds of space scientists dispersed through NASA, U.S. universities, and abroad. Both stages benefit from the latest statistical and computational methods; in some cases, the science result is completely inaccessible using traditional procedures. This paper will review the current state of NASA and present example applications using modern methodologies.

  14. Time-efficient simulations of tight-binding electronic structures with Intel Xeon PhiTM many-core processors

    NASA Astrophysics Data System (ADS)

    Ryu, Hoon; Jeong, Yosang; Kang, Ji-Hoon; Cho, Kyu Nam

    2016-12-01

    Modelling of multi-million-atom semiconductor structures is important as it not only predicts the properties of physically realizable novel materials, but can also accelerate advanced device designs. This work elaborates a new Technology Computer-Aided Design (TCAD) tool for nanoelectronics modelling, which uses a sp3d5s∗ tight-binding approach to describe multi-million-atom structures and simulate their electronic structures with high performance computing (HPC), including atomic effects such as alloy and dopant disorders. Named the Quantum simulation tool for Advanced Nanoscale Devices (Q-AND), the tool shows good scalability on traditional multi-core HPC clusters, implying a strong capability for large-scale electronic structure simulations, with particularly remarkable performance enhancement on the latest clusters of Intel Xeon PhiTM coprocessors. A review of a recent modelling study conducted to understand an experimental work on highly phosphorus-doped silicon nanowires is presented to demonstrate the utility of Q-AND. Having been developed via an Intel Parallel Computing Center project, Q-AND will be opened to the public to establish a sound framework for nanoelectronics modelling on advanced many-core HPC clusters. With details of the development methodology and an exemplary study of dopant electronics, this work presents a practical guideline for TCAD development to researchers in the field of computational nanoelectronics.
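
    The tight-binding idea behind Q-AND can be illustrated on a deliberately tiny scale. A one-dimensional, single-orbital chain (a toy stand-in for the sp3d5s∗ basis, with invented parameter values) yields the textbook dispersion E(k) = eps - 2t cos(ka):

```python
# One-dimensional, single-orbital tight-binding chain: a toy analogue of
# the sp3d5s* basis used by Q-AND. Nearest-neighbour hopping t gives the
# dispersion E(k) = eps - 2*t*cos(k*a), hence a bandwidth of 4*t.
# Parameter values here are invented for illustration.
import math

eps, t, a = 0.0, 1.0, 1.0   # onsite energy, hopping, lattice spacing

def band_energy(k: float) -> float:
    return eps - 2.0 * t * math.cos(k * a)

# Sample the first Brillouin zone, k in [-pi/a, pi/a].
ks = [math.pi * i / 100 for i in range(-100, 101)]
energies = [band_energy(k) for k in ks]
bandwidth = max(energies) - min(energies)
print(round(bandwidth, 6))  # 4.0, i.e. 4*t
```

    A multi-million-atom sp3d5s∗ calculation replaces this scalar dispersion with the eigenvalue problem of a huge sparse Hamiltonian, which is what demands the HPC clusters described in the abstract.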

  15. Optical interconnection networks for high-performance computing systems

    NASA Astrophysics Data System (ADS)

    Biberman, Aleksandr; Bergman, Keren

    2012-04-01

    Enabled by silicon photonic technology, optical interconnection networks have the potential to be a key disruptive technology in computing and communication industries. The enduring pursuit of performance gains in computing, combined with stringent power constraints, has fostered the ever-growing computational parallelism associated with chip multiprocessors, memory systems, high-performance computing systems and data centers. Sustaining these parallelism growths introduces unique challenges for on- and off-chip communications, shifting the focus toward novel and fundamentally different communication approaches. Chip-scale photonic interconnection networks, enabled by high-performance silicon photonic devices, offer unprecedented bandwidth scalability with reduced power consumption. We demonstrate that the silicon photonic platforms have already produced all the high-performance photonic devices required to realize these types of networks. Through extensive empirical characterization in much of our work, we demonstrate such feasibility of waveguides, modulators, switches and photodetectors. We also demonstrate systems that simultaneously combine many functionalities to achieve more complex building blocks. We propose novel silicon photonic devices, subsystems, network topologies and architectures to enable unprecedented performance of these photonic interconnection networks. Furthermore, the advantages of photonic interconnection networks extend far beyond the chip, offering advanced communication environments for memory systems, high-performance computing systems, and data centers.

  16. Computational Toxicology as Implemented by the US EPA ...

    EPA Pesticide Factsheets

    Computational toxicology is the application of mathematical and computer models to help assess chemical hazards and risks to human health and the environment. Supported by advances in informatics, high-throughput screening (HTS) technologies, and systems biology, the U.S. Environmental Protection Agency (EPA) is developing robust and flexible computational tools that can be applied to the thousands of chemicals in commerce, and contaminant mixtures found in air, water, and hazardous-waste sites. The Office of Research and Development (ORD) Computational Toxicology Research Program (CTRP) is composed of three main elements. The largest component is the National Center for Computational Toxicology (NCCT), which was established in 2005 to coordinate research on chemical screening and prioritization, informatics, and systems modeling. The second element consists of related activities in the National Health and Environmental Effects Research Laboratory (NHEERL) and the National Exposure Research Laboratory (NERL). The third and final component consists of academic centers working on various aspects of computational toxicology and funded by the U.S. EPA Science to Achieve Results (STAR) program. Together these elements form the key components in the implementation of both the initial strategy, A Framework for a Computational Toxicology Research Program (U.S. EPA, 2003), and the newly released The U.S. Environmental Protection Agency's Strategic Plan for Evaluating the T

  17. Flow Solution for Advanced Separate Flow Nozzles Response A: Structured Grid Navier-Stokes Approach

    NASA Technical Reports Server (NTRS)

    Kenzakowski, D. C.; Shipman, J.; Dash, S. M.; Saiyed, Naseem (Technical Monitor)

    2001-01-01

    NASA Glenn Research Center funded a computational study to investigate the effect of chevrons and tabs on the exhaust plume from separate flow nozzles. Numerical studies were conducted at typical takeoff power and a flight Mach number of 0.28. The report provides numerical data and insights into the mechanisms responsible for increased mixing.

  18. How to Quickly Import CAD Geometry into Thermal Desktop

    NASA Technical Reports Server (NTRS)

    Wright, Shonte; Beltran, Emilio

    2002-01-01

    There are several groups at JPL (Jet Propulsion Laboratory) that are committed to concurrent design efforts, two are featured here. Center for Space Mission Architecture and Design (CSMAD) enables the practical application of advanced process technologies in JPL's mission architecture process. Team I functions as an incubator for projects that are in the Discovery, and even pre-Discovery proposal stages. JPL's concurrent design environment is to a large extent centered on the CAD (Computer Aided Design) file. During concurrent design sessions CAD geometry is ported to other more specialized engineering design packages.

  19. Lobachevsky Year at Kazan University: Center of Science, Education, Intellectual-Cognitive Tourism "Kazan - GeoNa - 2020+" and "Kazan-Moon-2020+" projects

    NASA Astrophysics Data System (ADS)

    Gusev, A.; Trudkova, N.

    2017-09-01

    Center "GeoNa" will enable scientists and teachers at Russian universities to engage with advanced achievements in science and information technology, and to establish scientific connections with foreign colleagues on high-technology and educational projects and intellectual-cognitive tourism. The project "Kazan - Moon - 2020+" is directed at solving fundamental problems in the celestial mechanics, selenodesy, and geophysics of the Moon, through complex theoretical research and computer modelling.

  20. Introduction to USRA

    NASA Technical Reports Server (NTRS)

    Davis, M. H. (Editor); Singy, A. (Editor)

    1994-01-01

    The Universities Space Research Association (USRA) was incorporated 25 years ago in the District of Columbia as a private nonprofit corporation under the auspices of the National Academy of Sciences. Institutional membership in the association has grown from 49 colleges and universities, when it was founded, to 76 in 1993. USRA provides a mechanism through which universities can cooperate effectively with one another, with the government, and with other organizations to further space science and technology and to promote education in these areas. Its mission is carried out through the institutes, centers, divisions, and programs that are described in detail in this booklet. These include the Lunar and Planetary Institute, the Institute for Computer Applications in Science and Engineering (ICASE), the Research Institute for Advanced Computer Science (RIACS), and the Center of Excellence in Space Data and Information Sciences (CESDIS).

  1. Computational fluid dynamics applications at McDonnell Douglas

    NASA Technical Reports Server (NTRS)

    Hakkinen, R. J.

    1987-01-01

    Representative examples are presented of applications and development of advanced Computational Fluid Dynamics (CFD) codes for aerodynamic design at the McDonnell Douglas Corporation (MDC). Transonic potential and Euler codes, interactively coupled with boundary layer computation, and solutions of slender-layer Navier-Stokes approximation are applied to aircraft wing/body calculations. An optimization procedure using evolution theory is described in the context of transonic wing design. Euler methods are presented for analysis of hypersonic configurations, and helicopter rotors in hover and forward flight. Several of these projects were accepted for access to the Numerical Aerodynamic Simulation (NAS) facility at the NASA-Ames Research Center.
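
    The abstract mentions an optimization procedure using evolution theory for transonic wing design. A minimal (1+1) evolution strategy conveys the idea; the quadratic objective below is an invented stand-in for an aerodynamic figure of merit, not MDC's actual procedure:

```python
# Minimal (1+1) evolution strategy: an illustrative sketch of
# evolution-theory optimization, not MDC's actual design method.
# The quadratic objective stands in for a drag-like figure of merit.
import random

def objective(x):
    return sum((xi - 1.0) ** 2 for xi in x)   # minimum at x = (1, 1, 1)

def evolve(dim=3, sigma=0.5, iters=2000, seed=42):
    rng = random.Random(seed)                 # fixed seed: deterministic run
    parent = [0.0] * dim
    best = objective(parent)
    for _ in range(iters):
        # Mutation: Gaussian perturbation of the single parent.
        child = [xi + rng.gauss(0.0, sigma) for xi in parent]
        score = objective(child)
        if score < best:                      # selection: keep only improvements
            parent, best = child, score
        else:
            sigma *= 0.999                    # crude step-size decay on failure
    return parent, best

x, fx = evolve()
```

    With the fixed seed the run is deterministic; the best objective value it returns falls far below the starting value of 3.0, illustrating how mutation plus selection alone can drive a design toward an optimum.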

  2. Binary Black Hole Mergers, Gravitational Waves, and LISA

    NASA Astrophysics Data System (ADS)

    Centrella, Joan; Baker, J.; Boggs, W.; Kelly, B.; McWilliams, S.; van Meter, J.

    2007-12-01

    The final merger of comparable mass binary black holes is expected to be the strongest source of gravitational waves for LISA. Since these mergers take place in regions of extreme gravity, we need to solve Einstein's equations of general relativity on a computer in order to calculate these waveforms. For more than 30 years, scientists have tried to compute black hole mergers using the methods of numerical relativity. The resulting computer codes have been plagued by instabilities, causing them to crash well before the black holes in the binary could complete even a single orbit. Within the past few years, however, this situation has changed dramatically, with a series of remarkable breakthroughs. We will present the results of new simulations of black hole mergers with unequal masses and spins, focusing on the gravitational waves emitted and the accompanying astrophysical "kicks." The magnitude of these kicks has bearing on the production and growth of supermassive black holes during the epoch of structure formation, and on the retention of black holes in stellar clusters. This work was supported by NASA grant 06-BEFS06-19, and the simulations were carried out using Project Columbia at the NASA Advanced Supercomputing Division (Ames Research Center) and at the NASA Center for Computational Sciences (Goddard Space Flight Center).

  3. Advanced ballistic range technology

    NASA Technical Reports Server (NTRS)

    Yates, Leslie A.

    1994-01-01

    The research conducted supported two facilities at NASA Ames Research Center: the Hypervelocity Free-Flight Aerodynamic Facility and the 16-Inch Shock Tunnel. During the grant period, a computerized film-reading system was developed, and five- and six-degree-of-freedom parameter-identification routines were written and successfully implemented. Studies of flow separation were conducted, and methods to extract phase shift information from finite-fringe interferograms were developed. Methods for constructing optical images from Computational Fluid Dynamics solutions were also developed, and these methods were used for one-to-one comparisons of experiment and computations.

  4. Computational Fluid Dynamics Symposium on Aeropropulsion

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Recognizing the considerable advances that have been made in computational fluid dynamics, the Internal Fluid Mechanics Division of NASA Lewis Research Center sponsored this symposium with the objective of providing a forum for exchanging information regarding recent developments in numerical methods, physical and chemical modeling, and applications. This conference publication is a compilation of 4 invited and 34 contributed papers presented in six sessions: algorithms one and two, turbomachinery, turbulence, components application, and combustors. Topics include numerical methods, grid generation, chemically reacting flows, turbulence modeling, inlets, nozzles, and unsteady flows.

  5. Advances in Human-Computer Interaction: Graphics and Animation Components for Interface Design

    NASA Astrophysics Data System (ADS)

    Cipolla Ficarra, Francisco V.; Nicol, Emma; Cipolla-Ficarra, Miguel; Richardson, Lucy

    We present an analysis of a communicability methodology for graphics and animation components in interface design, called CAN (Communicability, Acceptability and Novelty). This methodology was developed between 2005 and 2010, obtaining excellent results in cultural heritage, education and microcomputing contexts, in studies where there is a bi-directional interrelation between ergonomics, usability, user-centered design, software quality and human-computer interaction. We also present heuristic results on iconography and layout design in blogs and websites from the following countries: Spain, Italy, Portugal and France.

  6. Overview of High-Fidelity Modeling Activities in the Numerical Propulsion System Simulations (NPSS) Project

    NASA Technical Reports Server (NTRS)

    Veres, Joseph P.

    2002-01-01

    A high-fidelity simulation of a commercial turbofan engine has been created as part of the Numerical Propulsion System Simulation Project. The high-fidelity computer simulation utilizes computer models that were developed at NASA Glenn Research Center in cooperation with turbofan engine manufacturers. The average-passage (APNASA) Navier-Stokes based viscous flow computer code is used to simulate the 3D flow in the compressors and turbines of the advanced commercial turbofan engine. The 3D National Combustion Code (NCC) is used to simulate the flow and chemistry in the advanced aircraft combustor. The APNASA turbomachinery code and the NCC combustor code exchange boundary conditions at the interface planes at the combustor inlet and exit. This computer simulation technique can evaluate engine performance at steady operating conditions. The 3D flow models provide detailed knowledge of the airflow within the fan and compressor, the high and low pressure turbines, and the flow and chemistry within the combustor. The models simulate the performance of the engine at operating conditions that include sea level takeoff and the altitude cruise condition.

  7. Investigation of advanced counterrotation blade configuration concepts for high speed turboprop systems. Task 4: Advanced fan section aerodynamic analysis computer program user's manual

    NASA Technical Reports Server (NTRS)

    Crook, Andrew J.; Delaney, Robert A.

    1992-01-01

    The computer program user's manual for the ADPACAPES (Advanced Ducted Propfan Analysis Code-Average Passage Engine Simulation) program is included. The objective of the computer program is development of a three-dimensional Euler/Navier-Stokes flow analysis for fan section/engine geometries containing multiple blade rows and multiple spanwise flow splitters. An existing procedure developed by Dr. J. J. Adamczyk and associates at the NASA Lewis Research Center was modified to accept multiple spanwise splitter geometries and simulate engine core conditions. The numerical solution is based upon a finite volume technique with a four stage Runge-Kutta time marching procedure. Multiple blade row solutions are based upon the average-passage system of equations. The numerical solutions are performed on an H-type grid system, with meshes meeting the requirement of maintaining a common axisymmetric mesh for each blade row grid. The analysis was run on several geometry configurations ranging from one to five blade rows and from one to four radial flow splitters. The efficiency of the solution procedure was shown to be the same as the original analysis.
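
    The four-stage Runge-Kutta time marching named in the abstract can be sketched with the classical RK4 scheme on a model ODE. Production CFD codes typically use low-storage variants tuned for stability, so this is only an illustration of the stage structure:

```python
# Classical four-stage Runge-Kutta step for dy/dt = f(t, y): a model of
# the four-stage time marching used in the flow solver (production CFD
# codes typically use low-storage variants tuned for stability).
import math

def rk4_step(f, t, y, h):
    """Advance y by one step of size h using four stage evaluations."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h * k1 / 2)
    k3 = f(t + h / 2, y + h * k2 / 2)
    k4 = f(t + h, y + h * k3)
    return y + h * (k1 + 2 * k2 + 2 * k3 + k4) / 6

# March dy/dt = -y from y(0) = 1; the exact solution is exp(-t).
def f(t, y):
    return -y

y, t, h = 1.0, 0.0, 0.01
for _ in range(100):                   # integrate to t = 1
    y = rk4_step(f, t, y, h)
    t += h
print(abs(y - math.exp(-1.0)) < 1e-8)  # True: fourth-order accurate
```

    In the flow solver the scalar y becomes the vector of finite-volume cell states and f the residual of the average-passage equations, but the four-stage update has the same shape.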

  8. 2002 Computing and Interdisciplinary Systems Office Review and Planning Meeting

    NASA Technical Reports Server (NTRS)

    Lytle, John; Follen, Gregory; Lopez, Isaac; Veres, Joseph; Lavelle, Thomas; Sehra, Arun; Freeh, Josh; Hah, Chunill

    2003-01-01

    The technologies necessary to enable detailed numerical simulations of complete propulsion systems are being developed at the NASA Glenn Research Center in cooperation with NASA Glenn's Propulsion program, NASA Ames, industry, academia and other government agencies. Large-scale, detailed simulations will be of great value to the nation because they eliminate some of the costly testing required to develop and certify advanced propulsion systems. In addition, time and cost savings will be achieved by enabling design details to be evaluated early in the development process before a commitment is made to a specific design. This year's review meeting describes the current status of the NPSS and the Object Oriented Development Kit, with specific emphasis on the progress made over the past year on air-breathing propulsion applications for aeronautics and space transportation. Major accomplishments include the first 3-D simulation of the primary flow path of a large turbofan engine in less than 15 hours, and the formal release of NPSS Version 1.5, which includes elements of rocket engine systems and a visual-based syntax layer. NPSS and the Development Kit are managed by the Computing and Interdisciplinary Systems Office (CISO) at the NASA Glenn Research Center and were financially supported in fiscal year 2002 by the Computing, Networking and Information Systems (CNIS) project managed at NASA Ames, the Glenn Aerospace Propulsion and Power Program, and the Advanced Space Transportation Program.

  9. Scientific Computing Strategic Plan for the Idaho National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whiting, Eric Todd

    Scientific computing is a critical foundation of modern science. Without innovations in the field of computational science, the essential missions of the Department of Energy (DOE) would go unrealized. Taking a leadership role in such innovations is Idaho National Laboratory’s (INL’s) challenge and charge, and is central to INL’s ongoing success. Computing is an essential part of INL’s future. DOE science and technology missions rely firmly on computing capabilities in various forms. Modeling and simulation, fueled by innovations in computational science and validated through experiment, are a critical foundation of science and engineering. Big data analytics from an increasing number of widely varied sources is opening new windows of insight and discovery. Computing is a critical tool in education, science, engineering, and experiments. Advanced computing capabilities in the form of people, tools, computers, and facilities will position INL competitively to deliver results and solutions on important national science and engineering challenges. A computing strategy must include much more than simply computers. The foundational enabling component of computing at many DOE national laboratories is the combination of a showcase-like data center facility coupled with a very capable supercomputer. In addition, network connectivity, disk storage systems, and visualization hardware are critical and generally tightly coupled to the computer system and co-located in the same facility. The existence of these resources in a single data center facility opens the doors to many opportunities that would not otherwise be possible.

  10. Computational Toxicology at the US EPA | Science Inventory ...

    EPA Pesticide Factsheets

    Computational toxicology is the application of mathematical and computer models to help assess chemical hazards and risks to human health and the environment. Supported by advances in informatics, high-throughput screening (HTS) technologies, and systems biology, EPA is developing robust and flexible computational tools that can be applied to the thousands of chemicals in commerce, and contaminant mixtures found in America’s air, water, and hazardous-waste sites. The ORD Computational Toxicology Research Program (CTRP) is composed of three main elements. The largest component is the National Center for Computational Toxicology (NCCT), which was established in 2005 to coordinate research on chemical screening and prioritization, informatics, and systems modeling. The second element consists of related activities in the National Health and Environmental Effects Research Laboratory (NHEERL) and the National Exposure Research Laboratory (NERL). The third and final component consists of academic centers working on various aspects of computational toxicology and funded by the EPA Science to Achieve Results (STAR) program. Key intramural projects of the CTRP include digitizing legacy toxicity testing information into a toxicity reference database (ToxRefDB), predicting toxicity (ToxCast™) and exposure (ExpoCast™), and creating virtual liver (v-Liver™) and virtual embryo (v-Embryo™) systems models. The models and underlying data are being made publicly available.

  11. A Demonstration Advanced Avionics System for general aviation

    NASA Technical Reports Server (NTRS)

    Denery, D. G.; Callas, G. P.; Jackson, C. T.; Berkstresser, B. K.; Hardy, G. H.

    1979-01-01

    A program initiated within NASA has emphasized the use of a data bus, microprocessors, electronic displays and data entry devices for general aviation. A Demonstration Advanced Avionics System (DAAS) capable of evaluating critical and promising elements of an integrated system that will perform the functions of (1) automated guidance and navigation; (2) flight planning; (3) weight and balance performance computations; (4) monitoring and warning; and (5) storage of normal and emergency checklists and operational limitations is described. Consideration is given to two major parts of the DAAS instrument panel, the integrated data control center and an electronic horizontal situation indicator, and to the system architecture. The system is to be installed in the Ames Research Center's Cessna 402B in the latter part of 1980; engineering flight testing will begin in the first part of 1981.

  12. USRA/RIACS

    NASA Technical Reports Server (NTRS)

    Oliger, Joseph

    1992-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on 6 June 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under a cooperative agreement with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. A flexible scientific staff is provided through a university faculty visitor program, a postdoctoral program, and a student visitor program. Not only does this provide appropriate expertise, it also introduces scientists outside of NASA to NASA problems. A small group of core RIACS staff provides continuity and interacts with an ARC technical monitor and scientific advisory group to determine the RIACS mission. RIACS activities are reviewed and monitored by a USRA advisory council and ARC technical monitor. Research at RIACS is currently being done in the following areas: Parallel Computing; Advanced Methods for Scientific Computing; Learning Systems; High Performance Networks and Technology; Graphics, Visualization, and Virtual Environments.

  13. Computational Nanotechnology at NASA Ames Research Center, 1996

    NASA Technical Reports Server (NTRS)

    Globus, Al; Bailey, David; Langhoff, Steve; Pohorille, Andrew; Levit, Creon; Chancellor, Marisa K. (Technical Monitor)

    1996-01-01

    Some forms of nanotechnology appear to have enormous potential to improve aerospace and computer systems; computational nanotechnology, the design and simulation of programmable molecular machines, is crucial to progress. NASA Ames Research Center has begun a computational nanotechnology program including in-house work, external research grants, and grants of supercomputer time. Four goals have been established: (1) Simulate a hypothetical programmable molecular machine replicating itself and building other products. (2) Develop molecular manufacturing CAD (computer aided design) software and use it to design molecular manufacturing systems and products of aerospace interest, including computer components. (3) Characterize nanotechnologically accessible materials of aerospace interest. Such materials may have excellent strength and thermal properties. (4) Collaborate with experimentalists. Current in-house activities include: (1) Development of NanoDesign, software to design and simulate a nanotechnology based on functionalized fullerenes. Early work focuses on gears. (2) A design for high-density atomically precise memory. (3) Design of nanotechnology systems based on biology. (4) Characterization of diamondoid mechanosynthetic pathways. (5) Studies of the Laplacian of the electronic charge density to understand molecular structure and reactivity. (6) Studies of entropic effects during self-assembly. (7) Characterization of properties of matter for clusters up to sizes exhibiting bulk properties. In addition, the NAS (NASA Advanced Supercomputing) supercomputer division sponsored a workshop on computational molecular nanotechnology on March 4-5, 1996, held at NASA Ames Research Center. Finally, collaborations with Bill Goddard at Caltech, Ralph Merkle at Xerox PARC, Don Brenner at NCSU (North Carolina State University), Tom McKendree at Hughes, and Todd Wipke at UCSC are underway.

  14. Internal fluid mechanics research on supercomputers for aerospace propulsion systems

    NASA Technical Reports Server (NTRS)

    Miller, Brent A.; Anderson, Bernhard H.; Szuch, John R.

    1988-01-01

    The Internal Fluid Mechanics Division of the NASA Lewis Research Center is combining the key elements of computational fluid dynamics, aerothermodynamic experiments, and advanced computational technology to bring internal computational fluid mechanics (ICFM) to a state of practical application for aerospace propulsion systems. The strategies used to achieve this goal are to: (1) pursue an understanding of flow physics, surface heat transfer, and combustion via analysis and fundamental experiments, (2) incorporate improved understanding of these phenomena into verified 3-D CFD codes, and (3) utilize state-of-the-art computational technology to enhance experimental and CFD research. Presented is an overview of the ICFM program in high-speed propulsion, including work in inlets, turbomachinery, and chemical reacting flows. Ongoing efforts to integrate new computer technologies, such as parallel computing and artificial intelligence, into high-speed aeropropulsion research are described.

  15. Computer Model Predicts the Movement of Dust

    NASA Technical Reports Server (NTRS)

    2002-01-01

    A new computer model of the atmosphere can now actually pinpoint where global dust events come from, and can project where they're going. The model may help scientists better evaluate the impact of dust on human health, climate, ocean carbon cycles, ecosystems, and atmospheric chemistry. Also, by seeing where dust originates and where it blows, people with respiratory problems can get advance warning of approaching dust clouds. 'The model is physically more realistic than previous ones,' said Mian Chin, a co-author of the study and an Earth and atmospheric scientist at Georgia Tech and the Goddard Space Flight Center (GSFC) in Greenbelt, Md. 'It is able to reproduce the short-term day-to-day variations and long-term inter-annual variations of dust concentrations and distributions that are measured from field experiments and observed from satellites.' The above images show both aerosols measured from space (left) and the movement of aerosols predicted by the computer model for the same date (right). For more information, read New Computer Model Tracks and Predicts Paths Of Earth's Dust. Images courtesy of Paul Giroux, Georgia Tech/NASA Goddard Space Flight Center.
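Transport models of this kind advect a dust concentration field with the wind. A minimal one-dimensional first-order upwind sketch, not the model described above, just an illustration of the advection idea with invented grid and wind values:

```python
# First-order upwind advection of a dust "puff" on a periodic 1-D grid.
# n, u, dx, dt are illustrative values chosen so that u*dt/dx <= 1 (CFL condition).
n, u, dx, dt = 100, 1.0, 1.0, 0.5
c = [0.0] * n
c[10] = 1.0  # initial dust puff at cell 10

def step(c):
    """Advance the concentration field one time step (wind blows toward +x)."""
    cfl = u * dt / dx
    # c[i - 1] wraps around at i = 0, giving a periodic domain.
    return [c[i] - cfl * (c[i] - c[i - 1]) for i in range(len(c))]

for _ in range(40):
    c = step(c)

total = sum(c)  # upwind advection on a periodic grid conserves total dust mass
```

The puff drifts downwind (and smears out, since first-order upwind is diffusive) while the total mass stays fixed; production models add sources, sinks, and three dimensions on top of this same transport core.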

  16. NASA Rotor 37 CFD Code Validation: Glenn-HT Code

    NASA Technical Reports Server (NTRS)

    Ameri, Ali A.

    2010-01-01

    In order to advance the goals of NASA aeronautics programs, it is necessary to continuously evaluate and improve the computational tools used for research and design at NASA. One such code is the Glenn-HT code which is used at NASA Glenn Research Center (GRC) for turbomachinery computations. Although the code has been thoroughly validated for turbine heat transfer computations, it has not been utilized for compressors. In this work, Glenn-HT was used to compute the flow in a transonic compressor and comparisons were made to experimental data. The results presented here are in good agreement with this data. Most of the measures of performance are well within the measurement uncertainties and the exit profiles of interest agree with the experimental measurements.

  17. The Center for Advanced Systems and Engineering (CASE)

    DTIC Science & Technology

    2012-01-01

    ...targets from multiple sensors. Qinru Qiu, State University of New York at Binghamton: A Neuromorphic Approach for Intelligent Text Recognition... Rogers, SUNYIT: Basic Research, Development and Emulation of Derived Models of Neuromorphic Brain Processes to Investigate the Computational Architecture Issues They Present...

  18. Astronomical Information Center - Naval Oceanography Portal

    Science.gov Websites


  19. 2009 ALCF annual report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beckman, P.; Martin, D.; Drugan, C.

    2010-11-23

    This year the Argonne Leadership Computing Facility (ALCF) delivered nearly 900 million core-hours of science. The research conducted at this leadership-class facility touched our lives in both minute and massive ways, whether it was studying the catalytic properties of gold nanoparticles, predicting protein structures, or unearthing the secrets of exploding stars. The authors remained true to their vision of acting as a forefront computational center, extending science frontiers by solving pressing problems for our nation. Their success in this endeavor was due mainly to the Department of Energy's (DOE) INCITE (Innovative and Novel Computational Impact on Theory and Experiment) program. The program awards significant amounts of computing time to computationally intensive, unclassified research projects that can make high-impact scientific advances. This year, DOE allocated 400 million hours of time to 28 research projects at the ALCF. Scientists from around the world conducted the research, representing such esteemed institutions as the Princeton Plasma Physics Laboratory, the National Institute of Standards and Technology, and the European Center for Research and Advanced Training in Scientific Computation. Argonne also provided Director's Discretionary allocations for research challenges, addressing such issues as reducing aerodynamic noise, critical for next-generation 'green' energy systems. Intrepid, the ALCF's 557-teraflops IBM Blue Gene/P supercomputer, enabled astounding scientific solutions and discoveries. Intrepid went into full production five months ahead of schedule. As a result, the ALCF nearly doubled the days of production computing available to the DOE Office of Science, INCITE awardees, and Argonne projects. One of the fastest supercomputers in the world for open science, the energy-efficient system uses about one-third as much electricity as a machine of comparable size built with more conventional parts.
In October 2009, President Barack Obama recognized the excellence of the entire Blue Gene series by awarding it the National Medal of Technology and Innovation. Other noteworthy achievements included the ALCF's collaboration with the National Energy Research Scientific Computing Center (NERSC) to examine cloud computing as a potential new computing paradigm for scientists. Named Magellan, the DOE-funded initiative will explore which science application programming models work well within the cloud, as well as evaluate the challenges that come with this new paradigm. The ALCF also obtained approval for its next-generation machine, a 10-petaflops system to be delivered in 2012. This system will allow us to resolve ever more pressing problems even more expeditiously through breakthrough science in the years to come.

  20. Geoscience Through the Lens of Art: a collaborative course of science and art for undergraduates of various disciplines

    NASA Astrophysics Data System (ADS)

    Ellins, K. K.; Eriksson, S. C.; Samsel, F.; Lavier, L.

    2017-12-01

    A new undergraduate, upper-level geoscience course was developed and taught by faculty and staff of the UT Austin Jackson School of Geosciences, the Center for Agile Technology, and the Texas Advanced Computing Center. The course examined the role of the visual arts in placing the scientific process and knowledge in a broader context and introduced students to innovations in the visual arts that promote scientific investigation through collaboration between geoscientists and artists. The course addressed (1) the role of the visual arts in teaching geoscience concepts and promoting geoscience learning; (2) the application of innovative visualization and artistic techniques to large volumes of geoscience data to enhance scientific understanding and to move scientific investigation forward; and (3) the illustrative power of art to communicate geoscience to the public. In-class activities and discussions, computer lab instruction on the application of ParaView software, reading assignments, lectures, and group projects with presentations made up the two-credit, semester-long "special topics" course, which was taken by geoscience, computer science, and engineering students. Assessment of student learning was carried out by the instructors, and course evaluation was done by an external evaluator using rubrics, Likert-scale surveys, and focus groups. The course achieved its goal of teaching students the concepts and techniques of the visual arts. The final projects demonstrated this, along with the communication of geologic concepts using what the students had learned in the course. The basic skill of sketching for learning and the use of best practices in visual communication were used extensively and, in most cases, very effectively. The use of an advanced visualization tool, ParaView, received mixed reviews because of the lack of time to really learn the tool and the fact that it is not a tool used routinely in geoscience.
Those senior students with advanced computer skills saw the importance of this tool. Students worked in teams, more or less effectively, and made suggestions for improving future offerings of the course.

  1. Computer network access to scientific information systems for minority universities

    NASA Astrophysics Data System (ADS)

    Thomas, Valerie L.; Wakim, Nagi T.

    1993-08-01

    The evolution of computer networking technology has led to the establishment of a massive networking infrastructure which interconnects various types of computing resources at many government, academic, and corporate institutions. A large segment of this infrastructure has been developed to facilitate information exchange and resource sharing within the scientific community. The National Aeronautics and Space Administration (NASA) supports both the development and the application of computer networks which provide its community with access to many valuable multi-disciplinary scientific information systems and on-line databases. Recognizing the need to extend the benefits of this advanced networking technology to the under-represented community, the National Space Science Data Center (NSSDC) in the Space Data and Computing Division at the Goddard Space Flight Center has developed the Minority University-Space Interdisciplinary Network (MU-SPIN) Program: a major networking and education initiative for Historically Black Colleges and Universities (HBCUs) and Minority Universities (MUs). In this paper, we will briefly explain the various components of the MU-SPIN Program while highlighting how, by providing access to scientific information systems and on-line data, it promotes a higher level of collaboration among faculty, students, and NASA scientists.

  2. Test and control computer user's guide for a digital beam former test system

    NASA Technical Reports Server (NTRS)

    Alexovich, Robert E.; Mallasch, Paul G.

    1992-01-01

    A Digital Beam Former Test System was developed to determine the effects of noise, interferers and distortions, and digital implementations of beam forming as applied to the Tracking and Data Relay Satellite 2 (TDRS 2) architectures. The investigation of digital beam forming with application to TDRS 2 architectures, as described in TDRS 2 advanced concept design studies, was conducted by the NASA/Lewis Research Center for NASA/Goddard Space Flight Center. A Test and Control Computer (TCC) was used as the main controlling element of the Digital Beam Former Test System. The Test and Control Computer User's Guide for a Digital Beam Former Test System provides an organized description of the Digital Beam Former Test System commands. It is written for users who wish to conduct tests of the Digital Beam Forming Test processor using the TCC. The document describes the function, use, and syntax of the TCC commands available to the user while summarizing and demonstrating the use of the commands within DOS batch files.

  3. A parallel-processing approach to computing for the geographic sciences

    USGS Publications Warehouse

    Crane, Michael; Steinwand, Dan; Beckmann, Tim; Krpan, Greg; Haga, Jim; Maddox, Brian; Feller, Mark

    2001-01-01

    The overarching goal of this project is to build a spatially distributed infrastructure for information science research by forming a team of information science researchers and providing them with similar hardware and software tools to perform collaborative research. Four geographically distributed Centers of the U.S. Geological Survey (USGS) are developing their own clusters of low-cost personal computers into parallel computing environments that provide a cost-effective way for the USGS to increase participation in the high-performance computing community. Referred to as Beowulf clusters, these hybrid systems provide the robust computing power required for conducting research into various areas, such as advanced computer architecture, algorithms to meet the processing needs for real-time image and data processing, the creation of custom datasets from seamless source data, rapid turn-around of products for emergency response, and support for computationally intense spatial and temporal modeling.
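The data-parallel pattern behind such clusters (split a large dataset into chunks, process the chunks concurrently, merge the results) can be sketched on a single machine. A thread pool stands in here for cluster nodes, and the per-tile function is a hypothetical placeholder for real image processing:

```python
from concurrent.futures import ThreadPoolExecutor

def process_tile(tile):
    """Hypothetical per-tile work, e.g. summing pixel values of one image tile."""
    return sum(tile)

# Split a large "image" into tiles and farm them out to workers.
data = list(range(1_000))
tiles = [data[i:i + 100] for i in range(0, len(data), 100)]

with ThreadPoolExecutor(max_workers=4) as pool:
    partial_sums = list(pool.map(process_tile, tiles))

total = sum(partial_sums)  # merge step; equals the serial result
```

On a Beowulf cluster the same decomposition would typically be expressed with MPI, with each node receiving its tiles over the network rather than through shared memory.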

  4. Modeling Materials: Design for Planetary Entry, Electric Aircraft, and Beyond

    NASA Technical Reports Server (NTRS)

    Thompson, Alexander; Lawson, John W.

    2014-01-01

    NASA missions push the limits of what is possible. The development of high-performance materials must keep pace with the agency's demanding, cutting-edge applications. Researchers at NASA's Ames Research Center are performing multiscale computational modeling to accelerate development times and further the design of next-generation aerospace materials. Multiscale modeling combines several computationally intensive techniques ranging from the atomic level to the macroscale, passing output from one level as input to the next level. These methods are applicable to a wide variety of materials systems. For example: (a) Ultra-high-temperature ceramics for hypersonic aircraft-we utilized the full range of multiscale modeling to characterize thermal protection materials for faster, safer air- and spacecraft, (b) Planetary entry heat shields for space vehicles-we computed thermal and mechanical properties of ablative composites by combining several methods, from atomistic simulations to macroscale computations, (c) Advanced batteries for electric aircraft-we performed large-scale molecular dynamics simulations of advanced electrolytes for ultra-high-energy capacity batteries to enable long-distance electric aircraft service; and (d) Shape-memory alloys for high-efficiency aircraft-we used high-fidelity electronic structure calculations to determine phase diagrams in shape-memory transformations. Advances in high-performance computing have been critical to the development of multiscale materials modeling. We used nearly one million processor hours on NASA's Pleiades supercomputer to characterize electrolytes with a fidelity that would be otherwise impossible. For this and other projects, Pleiades enables us to push the physics and accuracy of our calculations to new levels.

  5. HPCC and the National Information Infrastructure: an overview.

    PubMed Central

    Lindberg, D A

    1995-01-01

    The National Information Infrastructure (NII) or "information superhighway" is a high-priority federal initiative to combine communications networks, computers, databases, and consumer electronics to deliver information services to all U.S. citizens. The NII will be used to improve government and social services while cutting administrative costs. Operated by the private sector, the NII will rely on advanced technologies developed under the direction of the federal High Performance Computing and Communications (HPCC) Program. These include computing systems capable of performing trillions of operations (teraops) per second and networks capable of transmitting billions of bits (gigabits) per second. Among other activities, the HPCC Program supports the national supercomputer research centers, the federal portion of the Internet, and the development of interface software, such as Mosaic, that facilitates access to network information services. Health care has been identified as a critical demonstration area for HPCC technology and an important application area for the NII. As an HPCC participant, the National Library of Medicine (NLM) assists hospitals and medical centers to connect to the Internet through projects directed by the Regional Medical Libraries and through an Internet Connections Program cosponsored by the National Science Foundation. In addition to using the Internet to provide enhanced access to its own information services, NLM sponsors health-related applications of HPCC technology. Examples include the "Visible Human" project and recently awarded contracts for test-bed networks to share patient data and medical images, telemedicine projects to provide consultation and medical care to patients in rural areas, and advanced computer simulations of human anatomy for training in "virtual surgery." PMID:7703935

  6. National Kidney Registry: 213 transplants in three years.

    PubMed

    Veale, Jeffrey; Hil, Garet

    2010-01-01

    Since its establishment in 2008, the National Kidney Registry has facilitated 213 kidney transplants between unrelated living donors and recipients at 28 transplant centers. Rapid innovations in matching strategies, advanced computer technologies, good communication and an evolving understanding of the processes at participating transplant centers and histocompatibility laboratories are among the factors driving the success of the NKR. Virtual cross match accuracy has improved from 43% to 91% as a result of changes to the HLA typing requirements for potential donors and improved mechanisms to list unacceptable HLA antigens for sensitized patients. A uniform financial agreement among participating centers eliminated a major roadblock to facilitate unbalanced donor kidney exchanges among centers. The NKR transplanted 64% of the patients registered since 2008 and the average waiting time for those transplanted in 2010 was 11 months.
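The matching problem behind such a registry can be pictured as a directed compatibility graph: an edge from pair A to pair B means A's donor is compatible with B's recipient, and a two-way exchange is a 2-cycle. A toy sketch of finding such swaps; the pair names and compatibility data are invented for illustration and are not NKR data:

```python
# Directed compatibility graph: compatible[a] lists the pairs whose recipient
# pair a's donor could give to. All data here is invented for illustration.
compatible = {
    "pair1": ["pair2"],
    "pair2": ["pair1", "pair3"],
    "pair3": ["pair4"],
    "pair4": ["pair3"],
}

def two_way_exchanges(compatible):
    """Find unordered pairs (a, b) where a's donor suits b AND b's donor suits a."""
    swaps = set()
    for a, targets in compatible.items():
        for b in targets:
            if a in compatible.get(b, []):
                swaps.add(tuple(sorted((a, b))))
    return sorted(swaps)

exchanges = two_way_exchanges(compatible)  # [('pair1', 'pair2'), ('pair3', 'pair4')]
```

Real registries also search for longer cycles and donor chains and must optimize which mutually exclusive exchanges to perform, which turns this into a weighted matching problem rather than a simple scan.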

  7. Documentary of MFENET, a national computer network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shuttleworth, B.O.

    1977-06-01

    The national Magnetic Fusion Energy Computer Network (MFENET) is a newly operational star network of geographically separated heterogeneous hosts and a communications subnetwork of PDP-11 processors. Host processors interfaced to the subnetwork currently include a CDC 7600 at the Central Computer Center (CCC) and several DECsystem-10's at User Service Centers (USC's). The network was funded by a U.S. government agency (ERDA) to provide in an economical manner the needed computational resources to magnetic confinement fusion researchers. Phase I operation of MFENET distributed the processing power of the CDC 7600 among the USC's through the provision of file transport between any two hosts and remote job entry to the 7600. Extending the capabilities of Phase I, MFENET Phase II provided interactive terminal access to the CDC 7600 from the USC's. A file management system is maintained at the CCC for all network users. The history and development of MFENET are discussed, with emphasis on the protocols used to link the host computers and the USC software. Comparisons are made of MFENET versus ARPANET (Advanced Research Projects Agency Computer Network) and DECNET (Digital Distributed Network Architecture). DECNET and MFENET host-to-host, host-to-CCP, and link protocols are discussed in detail. The USC-CCP interface is described briefly. 43 figures, 2 tables.

  8. Environmental impact statement Space Shuttle advanced solid rocket motor program

    NASA Technical Reports Server (NTRS)

    1989-01-01

    The proposed action is design, development, testing, and evaluation of Advanced Solid Rocket Motors (ASRM) to replace the motors currently used to launch the Space Shuttle. The proposed action includes design, construction, and operation of new government-owned, contractor-operated facilities for manufacturing and testing the ASRM's. The proposed action also includes transport of propellant-filled rocket motor segments from the manufacturing facility to the testing and launch sites and the return of used and/or refurbished segments to the manufacturing site. Sites being considered for the new facilities include John C. Stennis Space Center, Hancock County, Mississippi; the Yellow Creek site in Tishomingo County, Mississippi, which is currently in the custody and control of the Tennessee Valley Authority; and John F. Kennedy Space Center, Brevard County, Florida. TVA proposes to transfer its site to the custody and control of NASA if it is the selected site. All facilities need not be located at the same site. Existing facilities which may provide support for the program include Michoud Assembly Facility, New Orleans Parish, Louisiana; and Slidell Computer Center, St. Tammany Parish, Louisiana. NASA's preferred production location is the Yellow Creek site, and the preferred test location is the Stennis Space Center.

  9. Improving Conceptual Design for Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Olds, John R.

    1998-01-01

    This report summarizes activities performed during the second year of a three-year cooperative agreement between NASA - Langley Research Center and Georgia Tech. Year 1 of the project resulted in the creation of a new Cost and Business Assessment Model (CABAM) for estimating the economic performance of advanced reusable launch vehicles, including non-recurring costs, recurring costs, and revenue. The current year (second year) activities were focused on the evaluation of automated, collaborative design frameworks (computational architectures or computational frameworks) for automating the design process in advanced space vehicle design. Consistent with NASA's new thrust area in developing and understanding Intelligent Synthesis Environments (ISE), the goals of this year's research efforts were to develop and apply computer integration techniques and near-term computational frameworks for conducting advanced space vehicle design. NASA - Langley (VAB) has taken a lead role in developing a web-based computing architecture within which the designer can interact with disciplinary analysis tools through a flexible web interface. The advantages of this approach are: 1) flexible access to the designer interface through a simple web browser (e.g. Netscape Navigator), 2) the ability to include existing 'legacy' codes, and 3) the ability to include distributed analysis tools running on remote computers. To date, VAB's internal emphasis has been on developing this test system for the planetary entry mission under the joint Integrated Design System (IDS) program with NASA - Ames and JPL. Georgia Tech's complementary goals this year were to: 1) examine an alternate 'custom' computational architecture for the three-discipline IDS planetary entry problem to assess the advantages and disadvantages relative to the web-based approach, and 2) develop and examine a web-based interface and framework for a typical launch vehicle design problem.

  10. The Center for Nanophase Materials Sciences

    NASA Astrophysics Data System (ADS)

    Lowndes, Douglas

    2005-03-01

The Center for Nanophase Materials Sciences (CNMS) located at Oak Ridge National Laboratory (ORNL) will be the first DOE Nanoscale Science Research Center to begin operation, with construction to be completed in April 2005 and initial operations in October 2005. The CNMS' scientific program has been developed through workshops with the national community, with the goal of creating a highly collaborative research environment to accelerate discovery and drive technological advances. Research at the CNMS is organized under seven Scientific Themes selected to address challenges to understanding and to exploit particular ORNL strengths (see http://cnms.ornl.gov). These include extensive synthesis and characterization capabilities for soft, hard, nanostructured, magnetic and catalytic materials and their composites; neutron scattering at the Spallation Neutron Source and High Flux Isotope Reactor; computational nanoscience in the CNMS' Nanomaterials Theory Institute and utilizing facilities and expertise of the Center for Computational Sciences and the new Leadership Scientific Computing Facility at ORNL; a new CNMS Nanofabrication Research Laboratory; and a suite of unique and state-of-the-art instruments to be made reliably available to the national community for imaging, manipulation, and properties measurements on nanoscale materials in controlled environments. The new research facilities will be described together with the planned operation of the user research program, the latter illustrated by the current ``jump start'' user program that utilizes existing ORNL/CNMS facilities.

  11. Faster Aerodynamic Simulation With Cart3D

    NASA Technical Reports Server (NTRS)

    2003-01-01

    A NASA-developed aerodynamic simulation tool is ensuring the safety of future space operations while providing designers and engineers with an automated, highly accurate computer simulation suite. Cart3D, co-winner of NASA's 2002 Software of the Year award, is the result of over 10 years of research and software development conducted by Michael Aftosmis and Dr. John Melton of Ames Research Center and Professor Marsha Berger of the Courant Institute at New York University. Cart3D offers a revolutionary approach to computational fluid dynamics (CFD), the computer simulation of how fluids and gases flow around an object of a particular design. By fusing technological advancements in diverse fields such as mineralogy, computer graphics, computational geometry, and fluid dynamics, the software provides a new industrial geometry processing and fluid analysis capability with unsurpassed automation and efficiency.

  12. Updated Panel-Method Computer Program

    NASA Technical Reports Server (NTRS)

    Ashby, Dale L.

    1995-01-01

Panel code PMARC_12 (Panel Method Ames Research Center, version 12) computes potential-flow fields around complex three-dimensional bodies such as complete aircraft models. It contains several advanced features, including internal mathematical modeling of flow, a time-stepping wake model for simulating either steady or unsteady motions, capability for Trefftz-plane computation of induced drag, capability for computation of off-body and on-body streamlines, and capability for computation of boundary-layer parameters by use of a two-dimensional integral boundary-layer method along surface streamlines. Investigators interested in visual representations of phenomena may want to consider obtaining program GVS (ARC-13361), General Visualization System. GVS is a Silicon Graphics IRIS program created to support the scientific-visualization needs of PMARC_12. GVS is available separately from COSMIC. PMARC_12 is written in standard FORTRAN 77, with the exception of the NAMELIST extension used for input.
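
The kind of potential-flow solution a panel code produces can be illustrated with the classical closed-form case such codes are typically validated against: the surface pressure coefficient on a circular cylinder in incompressible flow. This is an analytic check, not PMARC_12 code.

```python
import math

def cylinder_cp(theta):
    """Surface pressure coefficient for incompressible potential flow
    past a circular cylinder (uniform stream plus doublet):
    Cp = 1 - 4 sin^2(theta). Panel methods recover this numerically."""
    return 1.0 - 4.0 * math.sin(theta) ** 2

# Stagnation points (theta = 0, pi) give Cp = 1; the suction peak on
# top of the cylinder (theta = pi/2) gives Cp = -3.
for theta in (0.0, math.pi / 2, math.pi):
    print(f"theta = {theta:5.3f}  Cp = {cylinder_cp(theta):6.3f}")
```

A panel code discretizes an arbitrary surface into source/doublet panels and solves for their strengths; comparing its output against closed-form cases like this one is the standard sanity check.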

  13. Extracting semantics from audio-visual content: the final frontier in multimedia retrieval.

    PubMed

    Naphade, M R; Huang, T S

    2002-01-01

    Multimedia understanding is a fast emerging interdisciplinary research area. There is tremendous potential for effective use of multimedia content through intelligent analysis. Diverse application areas are increasingly relying on multimedia understanding systems. Advances in multimedia understanding are related directly to advances in signal processing, computer vision, pattern recognition, multimedia databases, and smart sensors. We review the state-of-the-art techniques in multimedia retrieval. In particular, we discuss how multimedia retrieval can be viewed as a pattern recognition problem. We discuss how reliance on powerful pattern recognition and machine learning techniques is increasing in the field of multimedia retrieval. We review the state-of-the-art multimedia understanding systems with particular emphasis on a system for semantic video indexing centered around multijects and multinets. We discuss how semantic retrieval is centered around concepts and context and the various mechanisms for modeling concepts and context.

  14. A Historical Perspective on Dynamics Testing at the Langley Research Center

    NASA Technical Reports Server (NTRS)

    Horta, Lucas G.; Kvaternik, Raymond G.; Hanks, Brantley R.

    2000-01-01

The experience and advancement of structural dynamics testing for space system applications at the Langley Research Center of the National Aeronautics and Space Administration (NASA) over the past four decades is reviewed. This experience began in the 1960's with the development of a technology base using a variety of physical models to explore dynamic phenomena and to develop reliable analytical modeling capability for space systems. It continued through the 1970's and 80's with the development of rapid, computer-aided test techniques, the testing of low-natural-frequency, gravity-sensitive systems, the testing of integrated structures with active flexible motion control, and orbital flight measurements. It extended into the 1990's, when advanced computerized system identification methods were developed for estimating the dynamic states of complex, lightweight, flexible aerospace systems. The scope of discussion in this paper includes ground and flight tests and summarizes lessons learned in both successes and failures.

  15. AHPCRC (Army High Performance Computing Research Center) Bulletin. Volume 3, Issue 1

    DTIC Science & Technology

    2011-01-01

    release; distribution is unlimited. Multiscale Modeling of Materials The rotating reflector antenna associated with airport traffic control systems is...batteries and phased-array antennas . Power and efficiency studies evaluate on-board HPC systems and advanced image processing applications. 2010 marked...giving way in some applications to a newer technology called the phased array antenna system (sometimes called a beamformer, example shown at right

  16. Using a Genome-Scale Metabolic Network Model to Elucidate the Mechanism of Chloroquine Action in Plasmodium falciparum

    DTIC Science & Technology

    2017-03-22

    Department of Defense Biotechnology High Performance Computing Software Applications Institute, Telemedicine and Advanced Technology Research Center, US...2017 Available online 22 March 2017 Keywords: Plasmodium Chloroquine Metabolic network modeling Redox metabolism Carbon fixation* Corresponding... available (Antony and Parija, 2016), their efficacy has declined appreciably in the last few decades owing to widespread drug resistance developed by the

  17. Computational Study of Thrombus Formation and Clotting Factor Effects under Venous Flow Conditions

    DTIC Science & Technology

    2016-04-26

    Telemedicine and Advanced Technology Research Center, U.S. Army Medical Research and Materiel Command, Fort Detrick, MarylandABSTRACT A comprehensive... experimental study. The model allowed us to identify the distinct patterns character- izing the spatial distributions of thrombin, platelets, and fibrin...time, elevated fibrinogen levels may contribute to the development of thrombosis (4,6,12). Quantitative knowledge about the interactions between fibrin

  18. Development and Validation of a Fast, Accurate and Cost-Effective Aeroservoelastic Method on Advanced Parallel Computing Systems

    NASA Technical Reports Server (NTRS)

    Goodwin, Sabine A.; Raj, P.

    1999-01-01

Progress to date towards the development and validation of a fast, accurate and cost-effective aeroelastic method for advanced parallel computing platforms such as the IBM SP2 and the SGI Origin 2000 is presented in this paper. The ENSAERO code, developed at the NASA-Ames Research Center, has been selected for this effort. The code allows for the computation of aeroelastic responses by simultaneously integrating the Euler or Navier-Stokes equations and the modal structural equations of motion. To assess the computational performance and accuracy of the ENSAERO code, this paper reports the results of Navier-Stokes simulations of the transonic flow over a flexible aeroelastic wing-body configuration. In addition, a forced harmonic oscillation analysis in the frequency domain and an analysis in the time domain are done on a wing undergoing a rigid pitch and plunge motion. Finally, to demonstrate the ENSAERO flutter-analysis capability, aeroelastic Euler and Navier-Stokes computations on an L-1011 wind tunnel model including pylon, nacelle and empennage are underway. All computational solutions are compared with experimental data to assess the level of accuracy of ENSAERO. As the computations described above are performed, a meticulous log of computational performance in terms of wall clock time, execution speed, memory and disk storage is kept. Code scalability is also demonstrated by studying the impact of varying the number of processors on computational performance on the IBM SP2 and the Origin 2000 systems.
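
The coupled fluid/structure time integration described above can be shown in miniature. The following is an illustrative single-degree-of-freedom plunge model with made-up numbers and quasi-steady aerodynamic damping, time-marched with RK4; it conveys the coupling idea only and is in no way ENSAERO's Euler/Navier-Stokes formulation.

```python
# Illustrative only (all values hypothetical): one structural mode in
# plunge, with a quasi-steady aerodynamic force proportional to the
# plunge rate, integrated together with the structure in time.
m, k = 1.0, 100.0        # modal mass and stiffness
aero_damp = 0.5          # quasi-steady aerodynamic damping coefficient

def deriv(state):
    h, hdot = state
    lift = -aero_damp * hdot          # aerodynamic force opposes motion
    return (hdot, (lift - k * h) / m)

def rk4_step(state, dt):
    def add(s, d, f):
        return (s[0] + f * d[0], s[1] + f * d[1])
    k1 = deriv(state)
    k2 = deriv(add(state, k1, dt / 2))
    k3 = deriv(add(state, k2, dt / 2))
    k4 = deriv(add(state, k3, dt))
    return (state[0] + dt / 6 * (k1[0] + 2*k2[0] + 2*k3[0] + k4[0]),
            state[1] + dt / 6 * (k1[1] + 2*k2[1] + 2*k3[1] + k4[1]))

state, dt = (1.0, 0.0), 0.01
for _ in range(1000):                 # 10 s of simulated time
    state = rk4_step(state, dt)
# Plunge amplitude decays: positive aerodynamic damping is stabilizing.
print(f"plunge after 10 s: {state[0]:.4f}")
```

In a flutter analysis the sign of the effective damping is the quantity of interest: past the flutter point the aerodynamic term feeds energy into the structure and the response grows instead of decaying.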

  19. Techniques for animation of CFD results. [computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Horowitz, Jay; Hanson, Jeffery C.

    1992-01-01

    Video animation is becoming increasingly vital to the computational fluid dynamics researcher, not just for presentation, but for recording and comparing dynamic visualizations that are beyond the current capabilities of even the most powerful graphic workstation. To meet these needs, Lewis Research Center has recently established a facility to provide users with easy access to advanced video animation capabilities. However, producing animation that is both visually effective and scientifically accurate involves various technological and aesthetic considerations that must be understood both by the researcher and those supporting the visualization process. These considerations include: scan conversion, color conversion, and spatial ambiguities.

  20. Advanced Keyboard

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Using chordic technology, a data entry operator can finger key combinations for text or graphics input. Because only one hand is needed, a disabled person may use it. Strain and fatigue are less than when using a conventional keyboard; input is faster, and the system can be learned in about an hour. Infogrip, Inc. developed chordic input technology with Stennis Space Center (SSC). (NASA is interested in potentially faster human/computer interaction on spacecraft as well as a low cost tactile/visual training system for the handicapped.) The company is now marketing the BAT as an improved system for both disabled and non-disabled computer operators.
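
At its core, chordic input is a lookup from finger combinations to characters. A minimal sketch with a hypothetical chord table (this is not the actual BAT layout):

```python
# Hypothetical chord-to-character table: each chord is the set of
# fingers pressed simultaneously on one hand.
CHORDS = {
    frozenset({"index"}): "e",
    frozenset({"middle"}): "t",
    frozenset({"index", "middle"}): "a",
    frozenset({"index", "ring"}): "o",
    frozenset({"thumb", "index"}): " ",
}

def decode(chord_sequence):
    """Translate a sequence of one-hand chords into text."""
    return "".join(CHORDS[frozenset(c)] for c in chord_sequence)

print(decode([{"middle"}, {"index"}, {"index", "middle"}]))  # -> "tea"
```

With five keys per hand there are 31 possible chords, enough for the alphabet plus a few control codes, which is why a single hand suffices.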

  1. Recent advances at NASA in calculating the electronic spectra of diatomic molecules

    NASA Technical Reports Server (NTRS)

    Whiting, Ellis E.; Paterson, John A.

    1988-01-01

    Advanced entry vehicles, such as the proposed Aero-assisted Orbital Transfer Vehicle, provide new and challenging problems for spectroscopy. Large portions of the flow field about such vehicles will be characterized by chemical and thermal nonequilibrium. Only by considering the actual overlap of the atomic and rotational lines emitted by the species present can the impact of radiative transport within the flow field be assessed correctly. To help make such an assessment, a new computer program is described that can generate high-resolution, line-by-line spectra for any spin-allowed transitions in diatomic molecules. The program includes the matrix elements for the rotational energy and distortion to the fourth order; the spin-orbit, spin-spin, and spin-rotation interactions to first order; and the lambda splitting by a perturbation calculation. An overview of the Computational Chemistry Branch at Ames Research Center is also presented.
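
Line-by-line spectrum generation can be illustrated with a far simpler model than the program described: rigid-rotor R-branch line positions with Boltzmann-weighted intensities. The constants below are rough CO-like values chosen for illustration; the actual code adds centrifugal distortion, spin-orbit, spin-spin, and spin-rotation terms.

```python
import math

B = 1.93            # rotational constant, cm^-1 (roughly CO)
nu0 = 2143.0        # band origin, cm^-1
T = 300.0           # temperature, K
HC_OVER_K = 1.4388  # second radiation constant, cm*K

def r_branch_line(J):
    """Position (cm^-1) and relative intensity of the rigid-rotor R(J)
    line: nu = nu0 + 2B(J+1), intensity ~ (2J+1) exp(-B J(J+1) hc/kT)."""
    position = nu0 + 2.0 * B * (J + 1)
    intensity = (2 * J + 1) * math.exp(-B * J * (J + 1) * HC_OVER_K / T)
    return position, intensity

for J in range(5):
    pos, inten = r_branch_line(J)
    print(f"R({J}): {pos:8.2f} cm^-1  relative intensity {inten:6.3f}")
```

Summing many such lines, each with a line-shape function, and accounting for their overlap is what a line-by-line radiative transport assessment amounts to.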

  2. Beginning the 21st century with advanced Automatic Parts Identification (API)

    NASA Technical Reports Server (NTRS)

    Schramm, Fred; Roxby, Don

    1994-01-01

Under the direction of the NASA George C. Marshall Space Flight Center, Huntsville, Alabama, the development and commercialization of an advanced Automated Parts Identification (API) system is being undertaken by Rockwell International Corporation. The new API system is based on a variable-sized, machine-readable, two-dimensional matrix symbol that can be applied directly onto most metallic and nonmetallic materials using safe, permanent marking methods. Its checkerboard-like structure is the most space efficient of all symbologies. This high-data-density symbology can be applied to products of different material sizes and geometries using application-dependent, computer-driven marking devices. The high fidelity markings produced by these devices can then be captured using a specially designed camera linked to any IBM-compatible computer. Applications of compressed symbology technology will reduce costs and improve quality, productivity, and processes in a wide variety of federal and commercial applications.
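
The data-packing idea behind a two-dimensional matrix symbol can be sketched with a toy encoder. This is illustrative only: a real matrix symbology also specifies finder patterns, quiet zones, and error correction, none of which are modeled here.

```python
# Toy 2-D matrix "symbology" (NOT the actual API/Data Matrix spec):
# pack each character's 8 bits row-major into a square bit grid,
# then read them back out.
def encode(text, size=8):
    bits = [int(b) for ch in text for b in format(ord(ch), "08b")]
    assert len(bits) <= size * size, "symbol too small for message"
    bits += [0] * (size * size - len(bits))        # pad unused cells
    return [bits[r * size:(r + 1) * size] for r in range(size)]

def decode(grid, length):
    bits = [b for row in grid for b in row][: length * 8]
    return "".join(chr(int("".join(map(str, bits[i:i + 8])), 2))
                   for i in range(0, len(bits), 8))

grid = encode("PART-042")          # hypothetical part identifier
print(decode(grid, 8))             # round-trips the identifier
```

The square, cell-based layout is what makes such symbols so space efficient compared with linear bar codes: capacity grows with the area of the mark rather than its width.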

  3. National Laboratory for Advanced Scientific Visualization at UNAM - Mexico

    NASA Astrophysics Data System (ADS)

    Manea, Marina; Constantin Manea, Vlad; Varela, Alfredo

    2016-04-01

    In 2015, the National Autonomous University of Mexico (UNAM) joined the family of Universities and Research Centers where advanced visualization and computing plays a key role to promote and advance missions in research, education, community outreach, as well as business-oriented consulting. This initiative provides access to a great variety of advanced hardware and software resources and offers a range of consulting services that spans a variety of areas related to scientific visualization, among which are: neuroanatomy, embryonic development, genome related studies, geosciences, geography, physics and mathematics related disciplines. The National Laboratory for Advanced Scientific Visualization delivers services through three main infrastructure environments: the 3D fully immersive display system Cave, the high resolution parallel visualization system Powerwall, the high resolution spherical displays Earth Simulator. The entire visualization infrastructure is interconnected to a high-performance-computing-cluster (HPCC) called ADA in honor to Ada Lovelace, considered to be the first computer programmer. The Cave is an extra large 3.6m wide room with projected images on the front, left and right, as well as floor walls. Specialized crystal eyes LCD-shutter glasses provide a strong stereo depth perception, and a variety of tracking devices allow software to track the position of a user's hand, head and wand. The Powerwall is designed to bring large amounts of complex data together through parallel computing for team interaction and collaboration. This system is composed by 24 (6x4) high-resolution ultra-thin (2 mm) bezel monitors connected to a high-performance GPU cluster. The Earth Simulator is a large (60") high-resolution spherical display used for global-scale data visualization like geophysical, meteorological, climate and ecology data. 
The HPCC-ADA is a 1000+ core computing system that offers parallel computing resources to applications that require large amounts of memory as well as large and fast parallel storage systems. The system's temperature is controlled by an energy- and space-efficient cooling solution based on large rear-door liquid-cooled heat exchangers. This state-of-the-art infrastructure will boost research activities in the region, offer a powerful scientific tool for teaching at undergraduate and graduate levels, and enhance association and cooperation with business-oriented organizations.

  4. Computational fluid dynamics at NASA Ames and the numerical aerodynamic simulation program

    NASA Technical Reports Server (NTRS)

    Peterson, V. L.

    1985-01-01

Computers are playing an increasingly important role in the field of aerodynamics, such that they now serve as a major complement to wind tunnels in aerospace research and development. Factors pacing advances in computational aerodynamics are identified, including the amount of computational power required to take the next major step in the discipline. The four main areas of computational aerodynamics research at NASA Ames Research Center which are directed toward extending the state of the art are identified and discussed. Example results obtained from approximate forms of the governing equations are presented and discussed, both in the context of levels of computer power required and the degree to which they either further the frontiers of research or apply to programs of practical importance. Finally, the Numerical Aerodynamic Simulation Program--with its 1988 target of achieving a sustained computational rate of 1 billion floating-point operations per second--is discussed in terms of its goals, status, and its projected effect on the future of computational aerodynamics.

  5. Annual report

    NASA Technical Reports Server (NTRS)

    1992-01-01

The overall goal of the Tuskegee University Center for Food Production, Processing and Waste Management in Controlled Ecological Life Support Systems (CELSS) is to provide tested information and technologies applicable to bioregenerative food production systems for life support on long-term manned space missions. Specifically, the center is developing information, computer-simulated models, methodologies and technology for sweetpotato and peanut biomass production and processing, inclusive of waste management and recycling of these crops selected by NASA for CELSS. The Center is organized into interdisciplinary teams of life scientists and engineers that work together on specific objectives and long-term goals. Integral to the goal of the Center is the development of both basic and applied research information and the training of young scientists and engineers, especially underrepresented minorities, that will increase the professional pool in these disciplines and contribute to the advancement of space sciences and exploration.

  6. Advanced computed tomography inspection system (ACTIS): an overview of the technology and its applications

    NASA Astrophysics Data System (ADS)

    Beshears, Ronald D.; Hediger, Lisa H.

    1994-10-01

The Advanced Computed Tomography Inspection System (ACTIS) was developed by the Marshall Space Flight Center to support in-house solid propulsion test programs. ACTIS represents a significant advance in state-of-the-art inspection systems. Its flexibility and superior technical performance have made ACTIS very popular, both within and outside the aerospace community. Through Technology Utilization efforts, ACTIS has been applied to inspection problems in the commercial aerospace, lumber, automotive, and nuclear waste disposal industries. ACTIS has even been used to inspect items of historical interest. ACTIS has consistently produced valuable results, providing information which was unattainable through conventional inspection methods. Although many successes have already been demonstrated, the full potential of ACTIS has not yet been realized. It is currently being applied in the commercial aerospace industry by Boeing Aerospace Company. Smaller systems based on ACTIS technology are becoming increasingly available. This technology has much to offer small businesses and industry, especially in identifying design and process problems early in the product development cycle to prevent defects. Several options are available to businesses interested in pursuing this technology.

  7. Emerging aerospace technologies

    NASA Technical Reports Server (NTRS)

    Ballhaus, W. F., Jr.; Milov, L. A.

    1985-01-01

    The United States Government has a long history of promoting the advancement of technology to strengthen the economy and national defense. An example is NASA, which was formed in 1958 to establish and maintain U.S. space technology leadership. This leadership has resulted in technological benefits to many fields and the establishment of new commercial industries, such as satellite communications. Currently, NASA's leading technology development at Ames Research Center includes the Tilt Rotor XV-15, which provides the versatility of a helicopter with the speed of a turboprop aircraft; the Numerical Aerodynamic Simulator, which is pushing the state of the art in advanced computational mathematics and computer simulation; and the Advanced Automation and Robotics programs, which will improve all areas of space development as well as life on Earth. Private industry is involved in maintaining technological leadership through NASA's Commercial Use of Space Program, which provides for synergistic relationships among government, industry, and academia. The plan for a space station by 1992 has framed much of NASA's future goals and has provided new areas of opportunity for both domestic space technology and leadership improvement of life on Earth.

  8. Exploratory study on potential safeguards applications for shared ledger technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frazar, Sarah L.; Jarman, Kenneth D.; Joslyn, Cliff A.

The International Atomic Energy Agency (IAEA) is responsible for providing credible assurance that countries are meeting their obligations not to divert or misuse nuclear materials and facilities for non-peaceful purposes. To this end, the IAEA integrates information about States’ nuclear material inventories and transactions with other types of data to draw its safeguards conclusions. As the amount and variety of data and information has increased, the IAEA’s data acquisition, management, and analysis processes have greatly benefited from advancements in computer science, data management, and cybersecurity during the last 20 years. Despite these advancements, inconsistent use of advanced computer technologies as well as political concerns among certain IAEA Member States centered on trust, transparency, and IAEA authorities limit the overall effectiveness and efficiency of IAEA safeguards. As a result, there is an ongoing need to strengthen the effectiveness and efficiency of IAEA safeguards while improving Member State cooperation and trust in the safeguards system. These chronic safeguards needs could be met with some emerging technologies, specifically those associated with the digital currency bitcoin.
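
The shared-ledger property relevant to safeguards, tamper evidence, can be illustrated with a toy hash-chained record of (hypothetical) material transactions. This sketches the core idea only; it is not the bitcoin protocol and has no consensus or distribution layer.

```python
import hashlib
import json

# Toy hash-chained ledger: each record commits to the previous record's
# hash, so any later tampering with history breaks the chain.
def record_hash(record):
    return hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()).hexdigest()

def append(ledger, transaction):
    prev = ledger[-1]["hash"] if ledger else "0" * 64
    rec = {"prev": prev, "tx": transaction}
    rec["hash"] = record_hash({"prev": prev, "tx": transaction})
    ledger.append(rec)

def verify(ledger):
    prev = "0" * 64
    for rec in ledger:
        ok = (rec["prev"] == prev and
              rec["hash"] == record_hash({"prev": rec["prev"],
                                          "tx": rec["tx"]}))
        if not ok:
            return False
        prev = rec["hash"]
    return True

ledger = []
append(ledger, {"material": "UO2", "kg": 12.5, "from": "A", "to": "B"})
append(ledger, {"material": "UO2", "kg": 3.0, "from": "B", "to": "C"})
print(verify(ledger))          # True
ledger[0]["tx"]["kg"] = 99.0   # tamper with history...
print(verify(ledger))          # False: the hash chain breaks
```

It is this verifiable, append-only character, rather than the currency aspect, that makes ledger technology interesting for declarations and inventory reconciliation.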

  9. The Center of Excellence for Hypersonics Training and Research at the University of Texas at Austin

    NASA Technical Reports Server (NTRS)

    Dolling, David S.

    1993-01-01

    Over the period of this grant (1986-92), 23 graduate students were supported by the Center and received education and training in hypersonics through MS and Ph.D. programs. An additional 8 Ph.D. candidates and 2 MS candidates, with their own fellowship support, were attracted to The University of Texas and were recruited into the hypersonics program because of the Center. Their research, supervised by the 10 faculty involved in the Center, resulted in approximately 50 publications and presentations in journals and at national and international technical conferences. To provide broad-based training, a new hypersonics curriculum was created, enabling students to take 8 core classes in theoretical, computational, and experimental hypersonics, and other option classes over a two to four semester period. The Center also developed an active continuing education program. The Hypersonics Short Course was taught 3 times, twice in the USA and once in Europe. Approximately 300 persons were attracted to hear lectures by more than 25 of the leading experts in the field. In addition, a hypersonic aerodynamics short course was offered through AIAA, as well as short courses on computational fluid dynamics (CFD) and advanced CFD. The existence of the Center also enabled faculty to leverage a substantial volume of additional funds from other agencies, for research and graduate student training. Overall, this was a highly successful and highly visible program.

  10. Data management and analysis for the Earth System Grid

    NASA Astrophysics Data System (ADS)

    Williams, D. N.; Ananthakrishnan, R.; Bernholdt, D. E.; Bharathi, S.; Brown, D.; Chen, M.; Chervenak, A. L.; Cinquini, L.; Drach, R.; Foster, I. T.; Fox, P.; Hankin, S.; Henson, V. E.; Jones, P.; Middleton, D. E.; Schwidder, J.; Schweitzer, R.; Schuler, R.; Shoshani, A.; Siebenlist, F.; Sim, A.; Strand, W. G.; Wilhelmi, N.; Su, M.

    2008-07-01

    The international climate community is expected to generate hundreds of petabytes of simulation data within the next five to seven years. This data must be accessed and analyzed by thousands of analysts worldwide in order to provide accurate and timely estimates of the likely impact of climate change on physical, biological, and human systems. Climate change is thus not only a scientific challenge of the first order but also a major technological challenge. In order to address this technological challenge, the Earth System Grid Center for Enabling Technologies (ESG-CET) has been established within the U.S. Department of Energy's Scientific Discovery through Advanced Computing (SciDAC)-2 program, with support from the offices of Advanced Scientific Computing Research and Biological and Environmental Research. ESG-CET's mission is to provide climate researchers worldwide with access to the data, information, models, analysis tools, and computational capabilities required to make sense of enormous climate simulation datasets. Its specific goals are to (1) make data more useful to climate researchers by developing Grid technology that enhances data usability; (2) meet specific distributed database, data access, and data movement needs of national and international climate projects; (3) provide a universal and secure web-based data access portal for broad multi-model data collections; and (4) provide a wide-range of Grid-enabled climate data analysis tools and diagnostic methods to international climate centers and U.S. government agencies. Building on the successes of the previous Earth System Grid (ESG) project, which has enabled thousands of researchers to access tens of terabytes of data from a small number of ESG sites, ESG-CET is working to integrate a far larger number of distributed data providers, high-bandwidth wide-area networks, and remote computers in a highly collaborative problem-solving environment.

  11. Towards prediction of correlated material properties using quantum Monte Carlo methods

    NASA Astrophysics Data System (ADS)

    Wagner, Lucas

    Correlated electron systems offer a richness of physics far beyond noninteracting systems. If we would like to pursue the dream of designer correlated materials, or, even to set a more modest goal, to explain in detail the properties and effective physics of known materials, then accurate simulation methods are required. Using modern computational resources, quantum Monte Carlo (QMC) techniques offer a way to directly simulate electron correlations. I will show some recent results on a few extremely challenging materials including the metal-insulator transition of VO2, the ground state of the doped cuprates, and the pressure dependence of magnetic properties in FeSe. By using a relatively simple implementation of QMC, at least some properties of these materials can be described truly from first principles, without any adjustable parameters. Using the QMC platform, we have developed a way of systematically deriving effective lattice models from the simulation. This procedure is particularly attractive for correlated electron systems because the QMC methods treat the one-body and many-body components of the wave function and Hamiltonian on completely equal footing. I will show some examples of using this downfolding technique and the high accuracy of QMC to connect our intuitive ideas about interacting electron systems with high fidelity simulations. The work in this presentation was supported in part by NSF DMR 1206242, the U.S. Department of Energy, Office of Science, Office of Advanced Scientific Computing Research, Scientific Discovery through Advanced Computing (SciDAC) program under Award Number FG02-12ER46875, and the Center for Emergent Superconductivity, Department of Energy Frontier Research Center under Grant No. DEAC0298CH1088. Computing resources were provided by a Blue Waters Illinois grant and INCITE PhotSuper and SuperMatSim allocations.
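
The variational flavor of QMC can be illustrated on the simplest possible system: Metropolis sampling of a trial wavefunction for the one-dimensional harmonic oscillator. This is a toy, far removed from the many-electron calculations described above, but it shows the core loop of sampling |psi|^2 and averaging a local energy.

```python
import math
import random

def local_energy(x, alpha):
    """Local energy E_L = (H psi)/psi for psi = exp(-alpha x^2 / 2)
    in the harmonic oscillator potential V = x^2/2 (atomic-style units)."""
    return alpha / 2.0 + 0.5 * x * x * (1.0 - alpha * alpha)

def vmc_energy(alpha, steps=20000, seed=1):
    """Variational Monte Carlo estimate of the energy: Metropolis
    sampling of |psi|^2 = exp(-alpha x^2), averaging E_L."""
    rng = random.Random(seed)
    x, total = 0.0, 0.0
    for _ in range(steps):
        trial = x + rng.uniform(-1.0, 1.0)
        if rng.random() < math.exp(-alpha * (trial**2 - x**2)):
            x = trial
        total += local_energy(x, alpha)
    return total / steps

print(vmc_energy(1.0))  # 0.5 exactly: E_L is constant for the exact trial state
print(vmc_energy(0.8))  # non-optimal alpha raises the variational estimate
```

The zero-variance property visible at alpha = 1 (the local energy is constant when the trial state is exact) is one reason QMC estimates can be made very precise for good wavefunctions.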

  12. Aircraft integrated design and analysis: A classroom experience

    NASA Technical Reports Server (NTRS)

    Weisshaar, Terrence A.

    1989-01-01

    AAE 451 is the capstone course required of all senior undergraduates in the School of Aeronautics and Astronautics at Purdue University. During the past year the first steps of a long evolutionary process were taken to change the content and expectations of this course. These changes are the result of the availability of advanced computational capabilities and sophisticated electronic media availability at Purdue. This presentation will describe both the long range objectives and this year's experience using the High Speed Commercial Transport design, the AIAA Long Duration Aircraft design and RPV design proposal as project objectives. The central goal of these efforts is to provide a user-friendly, computer-software-based environment to supplement traditional design course methodology. The Purdue University Computer Center (PUCC), the Engineering Computer Network (ECN) and stand-alone PC's are being used for this development. This year's accomplishments center primarily on aerodynamics software obtained from NASA/Langley and its integration into the classroom. Word processor capability for oral and written work and computer graphics were also blended into the course. A total of ten HSCT designs were generated, ranging from twin-fuselage aircraft, forward swept wing aircraft to the more traditional delta and double-delta wing aircraft. Four Long Duration Aircraft designs were submitted, together with one RPV design tailored for photographic surveillance.

  13. Human Modeling for Ground Processing Human Factors Engineering Analysis

    NASA Technical Reports Server (NTRS)

    Stambolian, Damon B.; Lawrence, Brad A.; Stelges, Katrine S.; Steady, Marie-Jeanne O.; Ridgwell, Lora C.; Mills, Robert E.; Henderson, Gena; Tran, Donald; Barth, Tim

    2011-01-01

There have been many advancements and accomplishments over the last few years using human modeling for human factors engineering analysis for the design of spacecraft. The key methods used for this are motion capture and computer-generated human models. The focus of this paper is to explain the human modeling currently used at Kennedy Space Center (KSC), and to explain the future plans for human modeling for future spacecraft designs.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Svetlana Shasharina

    The goal of the Center for Technology for Advanced Scientific Component Software (TASCS) is to fundamentally change the way scientific software is developed and used by bringing component-based software development technologies to high-performance scientific and engineering computing. The role of Tech-X's work in the TASCS project is to provide outreach to accelerator physics and fusion applications by introducing TASCS tools into applications, testing the tools in those applications, and modifying the tools to be more usable.

  15. Computer Center Reference Manual. Volume 1

    DTIC Science & Technology

    1990-09-30

    (OCR-garbled report documentation page; only fragments are recoverable.) The manual describes network access, including a connection to the Internet (host tables allow transfer to some other networks), and OASYS, the DTRC Office Automation System. It also describes the EVE editor, which provides buffers, two windows, and some word processing commands; advanced editing commands are entered through the use of a command line.

  16. Minority University-Space Interdisciplinary Network Conference Proceedings of the Seventh Annual Users' Conference

    NASA Technical Reports Server (NTRS)

    Harrington, James L., Jr.; Brown, Robin L.; Shukla, Pooja

    1998-01-01

    Seventh annual conference proceedings of the Minority University-Space Interdisciplinary Network (MU-SPIN) conference. MU-SPIN is cosponsored by NASA Goddard Space Flight Center and the National Science Foundation, and is a comprehensive educational initiative for Historically Black Colleges and Universities and minority universities. MU-SPIN focuses on the transfer of advanced computer networking technologies to these institutions and their use for supporting multidisciplinary research.

  17. ILLIAC IV Applications Research

    DTIC Science & Technology

    1974-12-31

    (OCR-garbled excerpt; only fragments are recoverable.) The report cites "Reducing a Real Matrix to the Upper-Hessenberg Form," CAC Document No. 11, Center for Advanced Computation, University of Illinois at Urbana-Champaign. It also mentions a complex for small-scale interactive image analysis and the ARPA Network for decentralized user access to the system, followed by a list of contributor names.

  18. Active Control Technology at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Antcliff, Richard R.; McGowan, Anna-Marie R.

    2000-01-01

    NASA Langley has a long history of attacking important technical opportunities from a broad base of supporting disciplines. The research and development at Langley in this subject area ranges from the test tube to the test flight. The information covered here ranges from the development of innovative new materials, sensors, and actuators, to the incorporation of smart sensors and actuators in practical devices, to the optimization of the location of these devices, to, finally, a wide variety of applications of these devices utilizing Langley's facilities and expertise. Advanced materials are being developed for sensors and actuators, as well as polymers for integrating smart devices into composite structures. Contributions reside in three key areas: computational materials, advanced piezoelectric materials, and integrated composite structures. The computational materials effort is focused on developing predictive tools for the efficient design of new materials with the appropriate combination of properties for next-generation smart airframe systems. Research in the area of advanced piezoelectrics includes optimizing the efficiency, force output, use temperature, and energy transfer between the structure and device for both ceramic and polymeric materials. For structural health monitoring, advanced non-destructive techniques, including fiber optics, are being developed for the detection of delaminations, cracks, and environmental deterioration in aircraft structures. Innovative fabrication techniques for processing structural composites with sensor and actuator integration are also being developed.

  19. University of Washington/ Northwest National Marine Renewable Energy Center Tidal Current Technology Test Protocol, Instrumentation, Design Code, and Oceanographic Modeling Collaboration: Cooperative Research and Development Final Report, CRADA Number CRD-11-452

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Driscoll, Frederick R.

    The University of Washington (UW) - Northwest National Marine Renewable Energy Center (UW-NNMREC) and the National Renewable Energy Laboratory (NREL) will collaborate to advance research and development (R&D) of Marine Hydrokinetic (MHK) renewable energy technology, specifically renewable energy captured from ocean tidal currents. UW-NNMREC is endeavoring to establish infrastructure, capabilities and tools to support in-water testing of marine energy technology. NREL is leveraging its experience and capabilities in field testing of wind systems to develop protocols and instrumentation to advance field testing of MHK systems. Under this work, UW-NNMREC and NREL will work together to develop a common instrumentation system and testing methodologies, standards and protocols. UW-NNMREC is also establishing simulation capabilities for MHK turbines and turbine arrays. NREL has extensive experience in wind turbine array modeling and is developing several computer-based numerical simulation capabilities for MHK systems. Under this CRADA, UW-NNMREC and NREL will work together to augment single-device and array modeling codes. As part of this effort, UW-NNMREC will also work with NREL to run simulations on NREL's high performance computer system.

  20. Active Piezoelectric Structures for Tip Clearance Management Assessed

    NASA Technical Reports Server (NTRS)

    1995-01-01

    Managing blade tip clearance in turbomachinery stages is critical to developing advanced subsonic propulsion systems. Active casing structures with embedded piezoelectric actuators appear to be a promising solution. They can control static and dynamic tip clearance, compensate for uneven deflections, and accomplish electromechanical coupling at the material level. In addition, they have a compact design. To assess the feasibility of this concept and assist the development of these novel structures, the NASA Lewis Research Center developed in-house computational capabilities for composite structures with piezoelectric actuators and sensors, and subsequently used them to simulate candidate active casing structures. The simulations indicated the potential of active casings to modify the blade tip clearance enough to improve stage efficiency. They also provided valuable design information, such as preliminary actuator configurations (number and location) and the corresponding voltage patterns required to compensate for uneven casing deformations. An active ovalization of a casing with four discrete piezoceramic actuators attached on the outer surface is shown. The center figure shows the predicted radial displacements along the hoop direction that are induced when electrostatic voltage is applied at the piezoceramic actuators. This work, which has demonstrated the capabilities of in-house computational models to analyze and design active casing structures, is expected to contribute toward the development of advanced subsonic engines.
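    The compensation step described above, choosing actuator voltages so that the induced radial displacement cancels an uneven casing deformation, can be sketched as a small least-squares problem. The influence matrix, station count, and deformation shape below are illustrative assumptions, not values from the NASA Lewis models.

```python
import numpy as np

# Minimal sketch: pick voltages for 4 piezoceramic actuators so the induced
# radial displacement cancels a measured uneven casing deformation.
n_stations = 8   # hoop-direction measurement stations on the casing
n_actuators = 4  # discrete piezoceramic patches on the outer surface

theta = np.linspace(0.0, 2.0 * np.pi, n_stations, endpoint=False)

# influence[i, j]: radial displacement (microns) at station i per volt on
# actuator j, modeled here as a clipped-cosine lobe centered on the actuator.
centers = np.linspace(0.0, 2.0 * np.pi, n_actuators, endpoint=False)
influence = np.maximum(np.cos(theta[:, None] - centers[None, :]), 0.0)

# Measured uneven deformation to compensate (microns): an ovalization term.
deformation = 5.0 * np.cos(2.0 * theta)

# Least-squares voltages making (influence @ v) approximate -deformation.
voltages, *_ = np.linalg.lstsq(influence, -deformation, rcond=None)

residual = deformation + influence @ voltages
print("voltages [V]:", np.round(voltages, 2))
```

With four actuators at 90-degree spacing, the two-lobed ovalization is well within the actuators' span, so the residual is essentially zero; fewer or misplaced actuators would leave an uncorrectable residual, which is exactly the kind of design information the simulations provided.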

  1. Recent manufacturing advances for spiral bevel gears

    NASA Technical Reports Server (NTRS)

    Handschuh, Robert F.; Bill, Robert C.

    1991-01-01

    The U.S. Army Aviation Systems Command (AVSCOM), through the Propulsion Directorate at NASA Lewis Research Center, has recently sponsored projects to advance the manufacturing process for spiral bevel gears. This type of gear is a critical component in rotary-wing propulsion systems. Two successfully completed contracted projects are described. The first project addresses the automated inspection of spiral bevel gears through the use of coordinate measuring machines. The second project entails the computer-numerical-control (CNC) conversion of a spiral bevel gear grinding machine that is used for all aerospace spiral bevel gears. The results of these projects are described with regard to the savings effected in manufacturing time.

  2. Automatic braking system modification for the Advanced Transport Operating Systems (ATOPS) Transportation Systems Research Vehicle (TSRV)

    NASA Technical Reports Server (NTRS)

    Coogan, J. J.

    1986-01-01

    Modifications were designed for the B-737-100 Research Aircraft autobrake system hardware of the Advanced Transport Operating Systems (ATOPS) Program at Langley Research Center. These modifications will allow the on-board flight control computer to control the aircraft deceleration after landing to a continuously variable level for the purpose of executing automatic high-speed turnoffs from the runway. A breadboard version of the proposed modifications was built and tested in simulated stopping conditions. Test results for various aircraft weights, turnoff speeds, winds, and runway conditions show that the turnoff speeds are achieved generally with errors of less than 1 ft/sec.
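    The control idea, commanding a continuously variable deceleration so the aircraft reaches the turnoff speed at the runway exit, can be sketched as a simple closed loop that recomputes the required deceleration at every time step. All numbers (speeds, distances, brake limits, disturbance size) are illustrative assumptions, not ATOPS values or the breadboard's actual logic.

```python
# Minimal sketch of closed-loop deceleration control for an automatic
# high-speed turnoff. Units: ft, ft/s, ft/s^2.

def required_decel(speed, target_speed, dist_remaining):
    """Constant deceleration that reaches target_speed in dist_remaining."""
    if dist_remaining <= 0.0 or speed <= target_speed:
        return 0.0
    return (speed**2 - target_speed**2) / (2.0 * dist_remaining)

def simulate_rollout(v0=200.0, v_exit=60.0, exit_dist=5000.0,
                     bias=0.3, a_max=10.0, dt=0.05):
    """Integrate the rollout; bias is un-commanded extra deceleration
    (e.g., headwind or runway friction) the loop must absorb."""
    v, x = v0, 0.0
    while x < exit_dist and v > 0.0:
        a_cmd = min(required_decel(v, v_exit, exit_dist - x), a_max)
        v = max(v - (a_cmd + bias) * dt, 0.0)  # command plus disturbance act
        x += v * dt
    return v

print(round(simulate_rollout(), 1))  # turnoff speed close to the 60 ft/s target
```

Because the required deceleration is recomputed from the current speed and remaining distance, the loop absorbs unmodeled disturbances, which is consistent with the small turnoff-speed errors reported across weights, winds, and runway conditions.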

  3. Space Weather in the Machine Learning Era: A Multidisciplinary Approach

    NASA Astrophysics Data System (ADS)

    Camporeale, E.; Wing, S.; Johnson, J.; Jackman, C. M.; McGranaghan, R.

    2018-01-01

    The workshop entitled Space Weather: A Multidisciplinary Approach took place at the Lorentz Center, University of Leiden, Netherlands, on 25-29 September 2017. The aim of this workshop was to bring together members of the Space Weather, Mathematics, Statistics, and Computer Science communities to address the use of advanced techniques such as Machine Learning, Information Theory, and Deep Learning, to better understand the Sun-Earth system and to improve space weather forecasting. Although individual efforts have been made toward this goal, the community consensus is that establishing interdisciplinary collaborations is the most promising strategy for fully utilizing the potential of these advanced techniques in solving Space Weather-related problems.

  4. Earth System Grid II, Turning Climate Datasets into Community Resources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Middleton, Don

    2006-08-01

    The Earth System Grid (ESG) II project, funded by the Department of Energy’s Scientific Discovery through Advanced Computing program, has transformed climate data into community resources. ESG II has accomplished this goal by creating a virtual collaborative environment that links climate centers and users around the world to models and data via a computing Grid, which is based on the Department of Energy’s supercomputing resources and the Internet. Our project’s success stems from partnerships between climate researchers and computer scientists to advance basic and applied research in the terrestrial, atmospheric, and oceanic sciences. By interfacing with other climate science projects, we have learned that commonly used methods to manage and remotely distribute data among related groups lack infrastructure and under-utilize existing technologies. Knowledge and expertise gained from ESG II have helped the climate community plan strategies to manage a rapidly growing data environment more effectively. Moreover, approaches and technologies developed under the ESG project have impacted data-simulation integration in other disciplines, such as astrophysics, molecular biology and materials science.

  5. Turbulent Navier-Stokes Flow Analysis of an Advanced Semispan Diamond-Wing Model in Tunnel and Free Air at High-Lift Conditions

    NASA Technical Reports Server (NTRS)

    Ghaffari, Farhad; Biedron, Robert T.; Luckring, James M.

    2002-01-01

    Turbulent Navier-Stokes computational results are presented for an advanced diamond wing semispan model at low-speed, high-lift conditions. The numerical results are obtained in support of a wind-tunnel test that was conducted in the National Transonic Facility at the NASA Langley Research Center. The model incorporated a generic fuselage and was mounted on the tunnel sidewall using a constant-width non-metric standoff. The computations were performed at nominal approach and landing flow conditions. The computed high-lift flow characteristics for the model in both the tunnel and free-air environments are presented. The computed wing pressure distributions agreed well with the measured data, and both indicated a small effect due to tunnel wall interference. However, the wall interference effects were found to be relatively more pronounced in the measured and computed lift, drag, and pitching moment. Although the magnitudes of the computed forces and moment were slightly off compared to the measured data, the increments due to the wall interference effects were predicted reasonably well. Numerical results are also presented on the combined effects of the tunnel sidewall boundary layer and the standoff geometry on the fuselage forebody pressure distributions and the resulting impact on the configuration's longitudinal aerodynamic characteristics.
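    The increment methodology mentioned above can be sketched in a few lines: difference the paired in-tunnel and free-air CFD solutions to isolate the wall effect, then subtract that increment from the measured data. The coefficient values below are invented placeholders, not results from this test.

```python
# Minimal sketch: wall-interference increments from paired CFD solutions,
# applied as a correction to wind-tunnel balance measurements.
cfd_tunnel = {"CL": 1.02, "CD": 0.210, "Cm": -0.150}  # CFD, tunnel walls modeled
cfd_free   = {"CL": 0.98, "CD": 0.205, "Cm": -0.140}  # CFD, free air
measured   = {"CL": 1.05, "CD": 0.215, "Cm": -0.155}  # tunnel balance data

# Increment attributable to the tunnel walls (tunnel minus free air).
increment = {k: cfd_tunnel[k] - cfd_free[k] for k in cfd_tunnel}

# Even if CFD absolute levels are slightly off, the increments transfer well,
# so they give a free-air estimate of the measured coefficients.
corrected = {k: measured[k] - increment[k] for k in measured}

for k in corrected:
    print(f"{k}: measured {measured[k]:+.3f} -> free-air est. {corrected[k]:+.3f}")
```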

  6. RIACS/USRA

    NASA Technical Reports Server (NTRS)

    Oliger, Joseph

    1993-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on 6 June 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under contract with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. A flexible scientific staff is provided through a university faculty visitor program, a post doctoral program, and a student visitor program. Not only does this provide appropriate expertise but it also introduces scientists outside of NASA to NASA problems. A small group of core RIACS staff provides continuity and interacts with an ARC technical monitor and scientific advisory group to determine the RIACS mission. RIACS activities are reviewed and monitored by a USRA advisory council and ARC technical monitor. Research at RIACS is currently being done in the following areas: Parallel Computing, Advanced Methods for Scientific Computing, High Performance Networks and Technology, and Learning Systems. Parallel compiler techniques, adaptive numerical methods for flows in complicated geometries, and optimization were identified as important problems to investigate for ARC's involvement in the Computational Grand Challenges of the next decade.

  7. Education through the prism of computation

    NASA Astrophysics Data System (ADS)

    Kaurov, Vitaliy

    2014-03-01

    With the rapid development of technology, computation claims its irrevocable place among the research components of modern science. Thus, to foster a successful future scientist, engineer, or educator, we need to add computation to the foundations of scientific education. We will discuss what type of paradigm shifts it brings to these foundations using the example of the Wolfram Science Summer School. It is one of the most advanced computational outreach programs run by the Wolfram Foundation, welcoming participants of almost all ages and backgrounds. Centered on complexity science and physics, it also covers numerous adjacent and interdisciplinary fields such as finance, biology, medicine, and even music. We will talk about educational and research experiences in this program during the 12 years of its existence. We will review statistics and outputs the program has produced. Among these are interactive electronic publications at the Wolfram Demonstrations Project and contributions to the computational knowledge engine Wolfram|Alpha.

  8. Center of Excellence for Geospatial Information Science research plan 2013-18

    USGS Publications Warehouse

    Usery, E. Lynn

    2013-01-01

    The U.S. Geological Survey Center of Excellence for Geospatial Information Science (CEGIS) was created in 2006 and since that time has provided research primarily in support of The National Map. The presentations and publications of the CEGIS researchers document the research accomplishments that include advances in electronic topographic map design, generalization, data integration, map projections, sea level rise modeling, geospatial semantics, ontology, user-centered design, volunteer geographic information, and parallel and grid computing for geospatial data from The National Map. A research plan spanning 2013–18 has been developed extending the accomplishments of the CEGIS researchers and documenting new research areas that are anticipated to support The National Map of the future. In addition to extending the 2006–12 research areas, the CEGIS research plan for 2013–18 includes new research areas in data models, geospatial semantics, high-performance computing, volunteered geographic information, crowdsourcing, social media, data integration, and multiscale representations to support the Three-Dimensional Elevation Program (3DEP) and The National Map of the future of the U.S. Geological Survey.

  9. Virtual medicine: Utilization of the advanced cardiac imaging patient avatar for procedural planning and facilitation.

    PubMed

    Shinbane, Jerold S; Saxon, Leslie A

    Advances in imaging technology have led to a paradigm shift from planning of cardiovascular procedures and surgeries requiring the actual patient in a "brick and mortar" hospital to utilization of the digitalized patient in the virtual hospital. The digitalized 3-D patient representation of individual anatomy and physiology provided by cardiovascular computed tomographic angiography (CCTA) and cardiovascular magnetic resonance (CMR) serves as an avatar, allowing for virtual delineation of the most optimal approaches to cardiovascular procedures and surgeries prior to actual hospitalization. Pre-hospitalization reconstruction and analysis of anatomy and pathophysiology previously only accessible during the actual procedure could potentially limit the intrinsic risks related to time in the operating room, cardiac procedural laboratory, and overall hospital environment. Although applications are specific to areas of cardiovascular specialty focus, there are unifying themes related to the utilization of these technologies. The virtual patient avatar can also be used for procedural planning, computational modeling of anatomy, simulation of predicted therapeutic result, printing of 3-D models, and augmentation of real-time procedural performance. Examples of the above techniques are at various stages of development for application to the spectrum of cardiovascular disease processes, including percutaneous, surgical, and hybrid minimally invasive interventions. A multidisciplinary approach within medicine and engineering is necessary for creation of robust algorithms for maximal utilization of the virtual patient avatar in the digital medical center. Utilization of the virtual advanced cardiac imaging patient avatar will play an important role in the virtual health care system.
Although there has been a rapid proliferation of early data, advanced imaging applications require further assessment and validation of accuracy, reproducibility, standardization, safety, efficacy, quality, cost effectiveness, and overall value to medical care. Copyright © 2018 Society of Cardiovascular Computed Tomography. Published by Elsevier Inc. All rights reserved.

  10. Computer architectures for computational physics work done by Computational Research and Technology Branch and Advanced Computational Concepts Group

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Slides are reproduced that describe the importance of having high performance number crunching and graphics capability. They also indicate the types of research and development underway at Ames Research Center to ensure that, in the near term, Ames is a smart buyer and user, and in the long-term that Ames knows the best possible solutions for number crunching and graphics needs. The drivers for this research are real computational physics applications of interest to Ames and NASA. They are concerned with how to map the applications, and how to maximize the physics learned from the results of the calculations. The computer graphics activities are aimed at getting maximum information from the three-dimensional calculations by using the real time manipulation of three-dimensional data on the Silicon Graphics workstation. Work is underway on new algorithms that will permit the display of experimental results that are sparse and random, the same way that the dense and regular computed results are displayed.
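    The last goal mentioned, displaying sparse, randomly located experimental results the same way as dense, regular computed results, amounts to resampling scattered samples onto a regular grid. A minimal sketch using inverse-distance weighting follows; the grid size, sample count, and test field are illustrative assumptions, not the algorithms actually developed at Ames.

```python
import numpy as np

def idw_grid(points, values, grid_x, grid_y, power=2.0, eps=1e-12):
    """Resample scattered samples onto a regular grid by inverse-distance
    weighting. points: (n, 2) sample locations; values: (n,) measurements."""
    gx, gy = np.meshgrid(grid_x, grid_y)
    targets = np.column_stack([gx.ravel(), gy.ravel()])
    d = np.linalg.norm(targets[:, None, :] - points[None, :, :], axis=2)
    w = 1.0 / (d**power + eps)          # nearby samples dominate the average
    field = (w @ values) / w.sum(axis=1)
    return field.reshape(gy.shape)

# Sparse, random "experimental" samples of a smooth field.
rng = np.random.default_rng(0)
pts = rng.uniform(0.0, 1.0, size=(50, 2))
vals = np.sin(2.0 * np.pi * pts[:, 0]) * np.cos(2.0 * np.pi * pts[:, 1])

grid = idw_grid(pts, vals, np.linspace(0, 1, 16), np.linspace(0, 1, 16))
print(grid.shape)  # (16, 16) regular field, ready for surface display
```

Because each grid value is a convex combination of the samples, the resampled field never exceeds the measured extremes, a useful sanity property when overlaying experimental and computed surfaces.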

  11. Aerothermodynamics research at NASA Ames Research Center

    NASA Technical Reports Server (NTRS)

    Deiwert, George S.

    1987-01-01

    Research activity in the aerothermodynamics branch at the NASA Ames Research Center is reviewed. Advanced concepts and mission studies relating to the next generation aerospace transportation systems are summarized and directions for continued research identified. Theoretical and computational studies directed at determining flow fields and radiative and convective heating loads in real gases are described. Included are Navier-Stokes codes for equilibrium and thermochemical nonequilibrium air. Experimental studies in the 3.5-ft hypersonic wind tunnel, the ballistic ranges, and the electric arc driven shock tube are described. Tested configurations include generic hypersonic aerospace plane configurations, aeroassisted orbital transfer vehicle shapes and Galileo probe models.

  12. USGS science in Menlo Park -- a science strategy for the U.S. Geological Survey Menlo Park Science Center, 2005-2015

    USGS Publications Warehouse

    Brocher, Thomas M.; Carr, Michael D.; Halsing, David L.; John, David A.; Langenheim, V.E.; Mangan, Margaret T.; Marvin-DiPasquale, Mark C.; Takekawa, John Y.; Tiedeman, Claire

    2006-01-01

    In the spring of 2004, the U.S. Geological Survey (USGS) Menlo Park Center Council commissioned an interdisciplinary working group to develop a forward-looking science strategy for the USGS Menlo Park Science Center in California (hereafter also referred to as "the Center"). The Center has been the flagship research center for the USGS in the western United States for more than 50 years, and the Council recognizes that science priorities must be the primary consideration guiding critical decisions made about the future evolution of the Center. In developing this strategy, the working group consulted widely within the USGS and with external clients and collaborators, so that most stakeholders had an opportunity to influence the science goals and operational objectives. The Science Goals are to: (1) Natural Hazards: conduct natural-hazard research and assessments critical to effective mitigation planning, short-term forecasting, and event response; (2) Ecosystem Change: develop a predictive understanding of ecosystem change that advances ecosystem restoration and adaptive management; (3) Natural Resources: advance the understanding of natural resources in a geologic, hydrologic, economic, environmental, and global context; and (4) Modeling Earth System Processes: increase and improve capabilities for quantitative simulation, prediction, and assessment of Earth system processes. The strategy presents seven key Operational Objectives with specific actions to achieve the scientific goals: (1) provide a hub for technology, laboratories, and library services to support science in the Western Region; (2) increase advanced computing capabilities and promote sharing of these resources; (3) enhance the intellectual diversity, vibrancy, and capacity of the work force through improved recruitment and retention; (4) strengthen client and collaborative relationships in the community at an institutional level; (5) expand monitoring capability by increasing density, sensitivity, and efficiency and reducing costs of instruments and networks; (6) encourage a breadth of scientific capabilities in Menlo Park to foster interdisciplinary science; and (7) communicate USGS science to a diverse audience.

  13. Aircraft integrated design and analysis: A classroom experience

    NASA Technical Reports Server (NTRS)

    1988-01-01

    AAE 451 is the capstone course required of all senior undergraduates in the School of Aeronautics and Astronautics at Purdue University. During the past year the first steps of a long evolutionary process were taken to change the content and expectations of this course. These changes are the result of the availability of advanced computational capabilities and sophisticated electronic media availability at Purdue. This presentation will describe both the long range objectives and this year's experience using the High Speed Commercial Transport (HSCT) design, the AIAA Long Duration Aircraft design and a Remotely Piloted Vehicle (RPV) design proposal as project objectives. The central goal of these efforts was to provide a user-friendly, computer-software-based, environment to supplement traditional design course methodology. The Purdue University Computer Center (PUCC), the Engineering Computer Network (ECN), and stand-alone PC's were used for this development. This year's accomplishments centered primarily on aerodynamics software obtained from the NASA Langley Research Center and its integration into the classroom. Word processor capability for oral and written work and computer graphics were also blended into the course. A total of 10 HSCT designs were generated, ranging from twin-fuselage and forward-swept wing aircraft, to the more traditional delta and double-delta wing aircraft. Four Long Duration Aircraft designs were submitted, together with one RPV design tailored for photographic surveillance. Supporting these activities were three video satellite lectures beamed from NASA/Langley to Purdue. These lectures covered diverse areas such as an overview of HSCT design, supersonic-aircraft stability and control, and optimization of aircraft performance. Plans for next year's effort will be reviewed, including dedicated computer workstation utilization, remote satellite lectures, and university/industrial cooperative efforts.

  14. The development and use of a computer-interactive data acquisition and display system in a flight environment

    NASA Technical Reports Server (NTRS)

    Bever, G. A.

    1981-01-01

    The flight test data requirements at the NASA Dryden Flight Research Center increased in complexity, and more advanced instrumentation became necessary to accomplish mission goals. This paper describes the way in which an airborne computer was used to perform real-time calculations on critical flight test parameters during a flight test on a winglet-equipped KC-135A aircraft. With the computer, an airborne flight test engineer can select any sensor for airborne display in several formats, including engineering units. The computer is able to not only calculate values derived from the sensor outputs but also to interact with the data acquisition system. It can change the data cycle format and data rate, and even insert the derived values into the pulse code modulation (PCM) bit stream for recording.
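    The processing chain described above, calibrating a raw sensor reading, computing a derived value in engineering units, and inserting it back into the PCM bit stream, can be sketched as follows. The calibration constants, frame layout, and scaling are invented placeholders, not the actual Dryden setup.

```python
# Minimal sketch: raw counts -> engineering units -> derived parameter ->
# quantized word inserted into a spare slot of a PCM minor frame.

RHO0 = 0.002377  # sea-level air density, slug/ft^3

def counts_to_psf(counts, scale=0.05, offset=0.0):
    """Linear calibration: raw transducer counts -> dynamic pressure (lb/ft^2)."""
    return counts * scale + offset

def equivalent_airspeed(q_psf):
    """V_e = sqrt(2 q / rho0) in ft/s, a typical derived flight-test parameter."""
    return (2.0 * q_psf / RHO0) ** 0.5

def insert_word(frame, slot, value, full_scale, bits=10):
    """Quantize a derived value to an n-bit word and place it in the frame."""
    max_counts = (1 << bits) - 1
    frame[slot] = max(0, min(max_counts, round(value / full_scale * max_counts)))
    return frame

frame = [0] * 16                 # one minor frame of 16 words (illustrative)
q = counts_to_psf(840)           # 42 lb/ft^2
ve = equivalent_airspeed(q)      # about 188 ft/s
frame = insert_word(frame, slot=7, value=ve, full_scale=1000.0)
print(round(ve, 1), frame[7])
```

Inserting the derived value as an ordinary frame word is what lets it be recorded and displayed by the same downstream equipment that handles raw sensor channels, as the paper describes.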

  15. Biological Visualization, Imaging and Simulation(Bio-VIS) at NASA Ames Research Center: Developing New Software and Technology for Astronaut Training and Biology Research in Space

    NASA Technical Reports Server (NTRS)

    Smith, Jeffrey

    2003-01-01

    The Bio-Visualization, Imaging and Simulation (BioVIS) Technology Center at NASA's Ames Research Center is dedicated to developing and applying advanced visualization, computation and simulation technologies to support NASA Space Life Sciences research and the objectives of the Fundamental Biology Program. Research ranges from high resolution 3D cell imaging and structure analysis, virtual environment simulation of fine sensory-motor tasks, computational neuroscience and biophysics to biomedical/clinical applications. Computer simulation research focuses on the development of advanced computational tools for astronaut training and education. Virtual Reality (VR) and Virtual Environment (VE) simulation systems have become important training tools in many fields, from flight simulation to, more recently, surgical simulation. The type and quality of training provided by these computer-based tools ranges widely, but the value of real-time VE computer simulation as a method of preparing individuals for real-world tasks is well established. Astronauts routinely use VE systems for various training tasks, including Space Shuttle landings, robot arm manipulations and extravehicular activities (space walks). Currently, there are no VE systems to train astronauts for the basic and applied research experiments which are an important part of many missions. The Virtual Glovebox (VGX) is a prototype VE system for real-time physically-based simulation of the Life Sciences Glovebox, where astronauts will perform many complex tasks supporting research experiments aboard the International Space Station. The VGX consists of a physical display system utilizing dual LCD projectors and circular polarization to produce a desktop-sized 3D virtual workspace. Physically-based modeling tools (Arachi Inc.) provide real-time collision detection, rigid body dynamics, physical properties and force-based controls for objects. The human-computer interface consists of two magnetic tracking devices (Ascension Inc.) attached to instrumented gloves (Immersion Inc.) which co-locate the user's hands with hand/forearm representations in the virtual workspace. Force feedback is possible in a work volume defined by a Phantom Desktop device (SensAble Inc.). Graphics are written in OpenGL. The system runs on a 2.2 GHz Pentium 4 PC. The prototype VGX provides astronauts and support personnel with a real-time, physically-based VE system to simulate basic research tasks both on Earth and in the microgravity of space. The immersive virtual environment of the VGX also makes it a useful tool for virtual engineering applications, including CAD development, procedure design, and simulation of human-system interactions in a desktop-sized work volume.

  16. RIACS FY2002 Annual Report

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.; Gross, Anthony R. (Technical Monitor)

    2002-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. Operated by the Universities Space Research Association (a non-profit university consortium), RIACS is located at the NASA Ames Research Center, Moffett Field, California. It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in September 2003. Ames has been designated NASA's Center of Excellence in Information Technology. In this capacity, Ames is charged with the responsibility to build an Information Technology (IT) Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA Ames and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of IT research necessary to meet the future challenges of NASA missions: 1) Automated Reasoning for Autonomous Systems; 2) Human-Centered Computing; and 3) High Performance Computing and Networking. In addition, RIACS collaborates with NASA scientists to apply IT research to a variety of NASA application domains including aerospace technology, earth science, life sciences, and astrobiology. RIACS also engages in other activities, such as workshops, seminars, visiting scientist programs and student summer programs, designed to encourage and facilitate collaboration between the university and NASA IT research communities.

  17. Flight experience with a fail-operational digital fly-by-wire control system

    NASA Technical Reports Server (NTRS)

    Brown, S. R.; Szalai, K. J.

    1977-01-01

    The NASA Dryden Flight Research Center is flight testing a triply redundant digital fly-by-wire (DFBW) control system installed in an F-8 aircraft. The full-time, full-authority system performs three-axis flight control computations, including stability and command augmentation, autopilot functions, failure detection and isolation, and self-test functions. Advanced control law experiments include an active flap mode for ride smoothing and maneuver drag reduction. This paper discusses research being conducted on computer synchronization, fault detection, fault isolation, and recovery from transient faults. The F-8 DFBW system has demonstrated immunity from nuisance fault declarations while quickly identifying truly faulty components.
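
    The abstract does not reproduce the redundancy-management algorithms themselves; as a minimal, hypothetical sketch of the generic technique behind fault masking in a triplex channel set (function names and the tolerance are mine, not from the paper), mid-value selection in Python:

```python
def mid_value_select(a, b, c):
    """Return the median of three redundant channel outputs; a single
    faulty channel is masked because the median always lies between
    the two healthy values."""
    return sorted([a, b, c])[1]

def miscompare_flags(a, b, c, tolerance):
    """Flag channels whose output differs from the selected value by
    more than a tolerance. In a real system, only persistent flags
    (not a single sample) would drive fault isolation, avoiding
    nuisance declarations on transient faults."""
    selected = mid_value_select(a, b, c)
    return [abs(x - selected) > tolerance for x in (a, b, c)]
```

    A transient glitch on one channel perturbs a single frame's vote but leaves the selected output bounded by the two healthy channels, which is consistent with the abstract's point about immunity to nuisance fault declarations.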

  18. Computer-aided controllability assessment of generic manned Space Station concepts

    NASA Technical Reports Server (NTRS)

    Ferebee, M. J.; Deryder, L. J.; Heck, M. L.

    1984-01-01

    NASA's Concept Development Group assessment methodology for the on-orbit rigid-body controllability characteristics of each generic configuration proposed for the manned space station is presented; the preliminary results obtained represent the first step in the analysis of these eight configurations. Analytical computer models of each configuration were developed with the Interactive Design Evaluation of Advanced Spacecraft CAD system, which created three-dimensional geometry models of each configuration to establish dimensional requirements for module connectivity, payload accommodation, and Space Shuttle berthing. Mass, center-of-gravity, inertia, and aerodynamic drag areas were then derived. Attention was also given to the preferred flight attitude of each station concept.

  19. Data Acquisition Systems

    NASA Technical Reports Server (NTRS)

    1994-01-01

    In the mid-1980s, Kinetic Systems and Langley Research Center determined that high-speed CAMAC (Computer Automated Measurement and Control) data acquisition systems could significantly improve Langley's ARTS (Advanced Real Time Simulation) system. The ARTS system supports flight simulation R&D, and the CAMAC equipment allowed 32 high-performance simulators to be controlled by centrally located host computers. This technology broadened Kinetic Systems' capabilities and led to several commercial applications. One of them is General Atomics' fusion research program, where Kinetic Systems equipment allows tokamak data to be acquired 4 to 15 times more rapidly. Ford Motor Company uses the same technology to control and monitor transmission-testing facilities.

  20. Basic JCL for the CRAY-1 operating system (COS) with emphasis on making the transition from CDC 7600/SCOPE

    NASA Technical Reports Server (NTRS)

    Howe, G.; Saunders, D.

    1983-01-01

    Users of the CDC 7600 at Ames are assisted in making the transition to the CRAY-1. Similarities and differences in the basic JCL are summarized, and a dozen or so examples of typical batch jobs for the two systems are shown in parallel. Some changes to look for in FORTRAN programs and in the use of UPDATE are also indicated. No attempt is made to cover magnetic tape handling. The material here should not be considered a substitute for reading the more conventional manuals or the User's Guide for the Advanced Computational Facility, available from the Computer Information Center.

  1. Advances in the production of freeform optical surfaces

    NASA Astrophysics Data System (ADS)

    Tohme, Yazid E.; Luniya, Suneet S.

    2007-05-01

    Recent market demands for free-form optics have challenged the industry to find new methods and techniques to manufacture free-form optical surfaces with a high level of accuracy and reliability. Production techniques are becoming a mix of multi-axis single-point diamond machining centers or deterministic ultra-precision grinding centers coupled with capable measurement systems to accomplish the task. It has been determined that a complex software tool is required to seamlessly integrate all aspects of the manufacturing process chain. Advances in computational power and improved performance of computer-controlled precision machinery have driven the use of such software programs to measure, visualize, analyze, produce, and re-validate the 3D free-form design, thus making the manufacture of such complex surfaces a viable task. Consolidating the entire production cycle in a comprehensive software tool that can interact with all systems in the design, production, and measurement phases will enable manufacturers to solve these complex challenges, providing improved product quality, simplified processes, and enhanced performance. The work being presented describes the latest advancements in developing such a software package for the entire fabrication process chain for aspheric and free-form shapes. It applies a rational B-spline based kernel to transform an optical design in the form of a parametric definition (optical equation), a standard CAD format, or a cloud of points into a central format that drives the simulation. This software tool creates a closed loop for the fabrication process chain, integrating surface analysis and compensation, tool-path generation, and measurement analysis in one package.
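
    The rational B-spline kernel mentioned in the abstract is not detailed; as a minimal illustration of the underlying representation (function names and data are mine), the following Python sketch evaluates a point on a rational B-spline (NURBS) curve using the Cox-de Boor recursion:

```python
def bspline_basis(i, p, u, knots):
    """Cox-de Boor recursion for the B-spline basis function N_{i,p}(u),
    with the usual 0/0 := 0 convention for repeated knots."""
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = 0.0
    if knots[i + p] != knots[i]:
        left = ((u - knots[i]) / (knots[i + p] - knots[i])
                * bspline_basis(i, p - 1, u, knots))
    right = 0.0
    if knots[i + p + 1] != knots[i + 1]:
        right = ((knots[i + p + 1] - u) / (knots[i + p + 1] - knots[i + 1])
                 * bspline_basis(i + 1, p - 1, u, knots))
    return left + right

def nurbs_point(u, degree, knots, ctrl_pts, weights):
    """Evaluate a rational B-spline curve point: a weighted projective
    average of the control points, normalized by the weight sum."""
    num = [0.0] * len(ctrl_pts[0])
    den = 0.0
    for i, (pt, w) in enumerate(zip(ctrl_pts, weights)):
        nb = bspline_basis(i, degree, u, knots) * w
        den += nb
        num = [a + nb * c for a, c in zip(num, pt)]
    return [a / den for a in num]
```

    With unit weights this reduces to an ordinary B-spline; non-unit weights pull the curve toward (or away from) individual control points, which is what lets the same kernel represent conics and general free-form shapes exactly.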

  2. Computed tomographic colonography for colorectal cancer screening: risk factors for the detection of advanced neoplasia.

    PubMed

    Hassan, Cesare; Pooler, B Dustin; Kim, David H; Rinaldi, Antonio; Repici, Alessandro; Pickhardt, Perry J

    2013-07-15

    The objective of this study was to determine whether age, sex, a positive family history of colorectal cancer, and body mass index (BMI) are important predictors of advanced neoplasia in the setting of screening computed tomographic colonography (CTC). Consecutive patients who were referred for first-time screening CTC from 2004 to 2011 at a single medical center were enrolled. Results at pathology were recorded for all patients who underwent polypectomy. Logistic regression was used to identify significant predictor variables for advanced neoplasia (any adenoma ≥ 10 mm or with villous component, high-grade dysplasia, or adenocarcinoma). Odds ratios (ORs) were used to express associations between the study variables (age, sex, BMI, and a positive family history of colorectal cancer) and advanced neoplasia. In total, 7620 patients underwent CTC screening. Of these, 276 patients (3.6%; 95% confidence interval [CI], 3.2%-4.1%) ultimately were diagnosed with advanced neoplasia. At multivariate analysis, age (mean OR per 10-year increase, 1.8; 95% CI, 1.6-2.0) and being a man (OR, 1.7; 95% CI, 1.3-2.2) were independent predictors of advanced neoplasia, whereas BMI and a positive family history of colorectal cancer were not. The number needed to screen to detect 1 case of advanced neoplasia varied from 51 among women aged ≤ 55 years to 10 among men aged >65 years. The number of post-CTC colonoscopies needed to detect 1 case of advanced neoplasia varied from 2 to 4. Age and sex were identified as important independent predictors of advanced neoplasia risk in individuals undergoing screening CTC, whereas BMI and a positive family history of colorectal cancer were not. These results have implications for appropriate patient selection. © 2013 American Cancer Society.
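
    The number-needed-to-screen figures follow directly from subgroup detection rates; a small Python sketch using only the overall numbers quoted in the abstract:

```python
def number_needed_to_screen(cases, screened):
    """Patients screened per case of advanced neoplasia detected:
    the reciprocal of the per-patient detection rate."""
    return screened / cases

# Overall figures from the abstract: 276 cases among 7620 screened
# patients (3.6%), i.e. roughly 28 patients screened per case found.
# The subgroup rates reported above move this figure between 10
# (men over 65) and 51 (women 55 and under).
overall_nns = number_needed_to_screen(276, 7620)
```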

  3. Computational mechanics research and support for aerodynamics and hydraulics at TFHRC, year 2 quarter 1 progress report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lottes, S.A.; Bojanowski, C.; Shen, J.

    2012-04-09

    The computational fluid dynamics (CFD) and computational structural mechanics (CSM) focus areas at Argonne's Transportation Research and Analysis Computing Center (TRACC) initiated a project to support and complement the experimental programs at the Turner-Fairbank Highway Research Center (TFHRC) with high performance computing based analysis capabilities in August 2010. The project was established with a new interagency agreement between the Department of Energy and the Department of Transportation to provide collaborative research, development, and benchmarking of advanced three-dimensional computational mechanics analysis methods to the aerodynamics and hydraulics laboratories at TFHRC for a period of five years, beginning in October 2010. The analysis methods employ well-benchmarked and supported commercial computational mechanics software. Computational mechanics encompasses the areas of Computational Fluid Dynamics (CFD), Computational Wind Engineering (CWE), Computational Structural Mechanics (CSM), and Computational Multiphysics Mechanics (CMM) applied in Fluid-Structure Interaction (FSI) problems. The major areas of focus of the project are wind and water effects on bridges - superstructure, deck, cables, and substructure (including soil), primarily during storms and flood events - and the risks that these loads pose to structural failure. For flood events at bridges, another major focus of the work is assessment of the risk to bridges caused by scour of stream and riverbed material away from the foundations of a bridge. Other areas of current research include modeling of flow through culverts to improve designs allowing for fish passage, modeling of salt spray transport into bridge girders to address the suitability of using weathering steel in bridges, and CFD analysis of the operation of the wind tunnel in the TFHRC wind engineering laboratory.
This quarterly report documents technical progress on the project tasks for the period of October through December 2011.

  4. Computational mechanics research and support for aerodynamics and hydraulics at TFHRC, year 2 quarter 2 progress report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lottes, S.A.; Bojanowski, C.; Shen, J.

    2012-06-28

    The computational fluid dynamics (CFD) and computational structural mechanics (CSM) focus areas at Argonne's Transportation Research and Analysis Computing Center (TRACC) initiated a project to support and complement the experimental programs at the Turner-Fairbank Highway Research Center (TFHRC) with high performance computing based analysis capabilities in August 2010. The project was established with a new interagency agreement between the Department of Energy and the Department of Transportation to provide collaborative research, development, and benchmarking of advanced three-dimensional computational mechanics analysis methods to the aerodynamics and hydraulics laboratories at TFHRC for a period of five years, beginning in October 2010. The analysis methods employ well-benchmarked and supported commercial computational mechanics software. Computational mechanics encompasses the areas of Computational Fluid Dynamics (CFD), Computational Wind Engineering (CWE), Computational Structural Mechanics (CSM), and Computational Multiphysics Mechanics (CMM) applied in Fluid-Structure Interaction (FSI) problems. The major areas of focus of the project are wind and water effects on bridges - superstructure, deck, cables, and substructure (including soil), primarily during storms and flood events - and the risks that these loads pose to structural failure. For flood events at bridges, another major focus of the work is assessment of the risk to bridges caused by scour of stream and riverbed material away from the foundations of a bridge. Other areas of current research include modeling of flow through culverts to improve designs allowing for fish passage, modeling of salt spray transport into bridge girders to address the suitability of using weathering steel in bridges, and CFD analysis of the operation of the wind tunnel in the TFHRC wind engineering laboratory.
This quarterly report documents technical progress on the project tasks for the period of January through March 2012.

  5. Computational mechanics research and support for aerodynamics and hydraulics at TFHRC, year 1 quarter 3 progress report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lottes, S.A.; Kulak, R.F.; Bojanowski, C.

    2011-08-26

    The computational fluid dynamics (CFD) and computational structural mechanics (CSM) focus areas at Argonne's Transportation Research and Analysis Computing Center (TRACC) initiated a project to support and complement the experimental programs at the Turner-Fairbank Highway Research Center (TFHRC) with high performance computing based analysis capabilities in August 2010. The project was established with a new interagency agreement between the Department of Energy and the Department of Transportation to provide collaborative research, development, and benchmarking of advanced three-dimensional computational mechanics analysis methods to the aerodynamics and hydraulics laboratories at TFHRC for a period of five years, beginning in October 2010. The analysis methods employ well-benchmarked and supported commercial computational mechanics software. Computational mechanics encompasses the areas of Computational Fluid Dynamics (CFD), Computational Wind Engineering (CWE), Computational Structural Mechanics (CSM), and Computational Multiphysics Mechanics (CMM) applied in Fluid-Structure Interaction (FSI) problems. The major areas of focus of the project are wind and water loads on bridges - superstructure, deck, cables, and substructure (including soil), primarily during storms and flood events - and the risks that these loads pose to structural failure. For flood events at bridges, another major focus of the work is assessment of the risk to bridges caused by scour of stream and riverbed material away from the foundations of a bridge. Other areas of current research include modeling of flow through culverts to assess them for fish passage, modeling of salt spray transport into bridge girders to address the suitability of using weathering steel in bridges, vehicle stability under high wind loading, and the use of electromagnetic shock absorbers to improve vehicle stability under high wind conditions.
This quarterly report documents technical progress on the project tasks for the period of April through June 2011.

  6. Silicon photonics for high-performance interconnection networks

    NASA Astrophysics Data System (ADS)

    Biberman, Aleksandr

    2011-12-01

    We assert in the course of this work that silicon photonics has the potential to be a key disruptive technology in the computing and communication industries. The enduring pursuit of performance gains in computing, combined with stringent power constraints, has fostered the ever-growing computational parallelism associated with chip multiprocessors, memory systems, high-performance computing systems, and data centers. Sustaining this growth in parallelism introduces unique challenges for on- and off-chip communications, shifting the focus toward novel and fundamentally different communication approaches. This work showcases that chip-scale photonic interconnection networks, enabled by high-performance silicon photonic devices, provide unprecedented bandwidth scalability with reduced power consumption. We demonstrate that silicon photonic platforms have already produced all the high-performance photonic devices required to realize these types of networks. Through extensive empirical characterization, we demonstrate the feasibility of waveguides, modulators, switches, and photodetectors, as well as systems that combine many functionalities simultaneously to form more complex building blocks. Furthermore, we leverage the unique properties of available silicon photonic materials to create novel silicon photonic devices, subsystems, network topologies, and architectures that enable unprecedented performance of these photonic interconnection networks and computing systems. We show that the advantages of photonic interconnection networks extend far beyond the chip, offering advanced communication environments for memory systems, high-performance computing systems, and data centers. Finally, we explore the immense potential of all-optical functionalities implemented using parametric processing in the silicon platform, demonstrating unique methods that have the ability to revolutionize computation and communication.
Silicon photonics enables new sets of opportunities that we can leverage for performance gains, as well as new sets of challenges that we must solve. Leveraging its inherent compatibility with standard fabrication techniques of the semiconductor industry, combined with its capability of dense integration with advanced microelectronics, silicon photonics also offers a clear path toward commercialization through low-cost mass-volume production. Combining empirical validations of feasibility, demonstrations of massive performance gains in large-scale systems, and the potential for commercial penetration of silicon photonics, the impact of this work will become evident in the many decades that follow.

  7. Information Management for a Large Multidisciplinary Project

    NASA Technical Reports Server (NTRS)

    Jones, Kennie H.; Randall, Donald P.; Cronin, Catherine K.

    1992-01-01

    In 1989, NASA's Langley Research Center (LaRC) initiated the High-Speed Airframe Integration Research (HiSAIR) Program to develop and demonstrate an integrated environment for high-speed aircraft design using advanced multidisciplinary analysis and optimization procedures. The major goals of this program were to evolve the interactions among disciplines and promote sharing of information, to provide a timely exchange of information among aeronautical disciplines, and to increase the awareness of the effects each discipline has upon other disciplines. LaRC historically has emphasized the advancement of analysis techniques. HiSAIR was founded to synthesize these advanced methods into a multidisciplinary design process emphasizing information feedback among disciplines and optimization. Crucial to the development of such an environment are the definition of the required data exchanges and the methodology for both recording the information and providing the exchanges in a timely manner. These requirements demand extensive use of data management techniques, graphic visualization, and interactive computing. HiSAIR represents the first attempt at LaRC to promote interdisciplinary information exchange on a large scale using advanced data management methodologies combined with state-of-the-art, scientific visualization techniques on graphics workstations in a distributed computing environment. The subject of this paper is the development of the data management system for HiSAIR.

  8. 3-D Imaging In Virtual Environment: A Scientific Clinical and Teaching Tool

    NASA Technical Reports Server (NTRS)

    Ross, Muriel D.; DeVincenzi, Donald L. (Technical Monitor)

    1996-01-01

    The advent of powerful graphics workstations and computers has led to the advancement of scientific knowledge through three-dimensional (3-D) reconstruction and imaging of biological cells and tissues. The Biocomputation Center at NASA Ames Research Center pioneered the effort to produce an entirely computerized method for reconstruction of objects from serial sections studied in a transmission electron microscope (TEM). The software developed, ROSS (Reconstruction of Serial Sections), is now being distributed to users across the United States through Space Act Agreements, and is in use in fields as disparate as geology, botany, biology, and medicine. In the Biocomputation Center, ROSS serves as the basis for development of virtual environment technologies for scientific and medical use. This report describes the Virtual Surgery Workstation Project, ongoing with clinicians at Stanford University Medical Center, and the role of the Visible Human data in the project.

  9. Real-time automated failure identification in the Control Center Complex (CCC)

    NASA Technical Reports Server (NTRS)

    Kirby, Sarah; Lauritsen, Janet; Pack, Ginger; Ha, Anhhoang; Jowers, Steven; Mcnenny, Robert; Truong, The; Dell, James

    1993-01-01

    A system that will provide real-time failure management support to the Space Station Freedom program is described. The system's use of a simplified form of model-based reasoning qualifies it as an advanced automation system. However, it differs from most such systems in that it was designed from the outset to meet two sets of requirements. First, it must provide a useful increment to the fault management capabilities of the Johnson Space Center (JSC) Control Center Complex (CCC) Fault Detection Management system. Second, it must satisfy CCC operational environment constraints such as cost, computer resource requirements, verification, and validation. The need to meet both requirement sets presents a much greater design challenge than would have been the case had functionality been the sole design consideration. The paper overviews the choice of technology, aspects of that choice, and the process for migrating the system into the control center.

  10. Protecting genomic data analytics in the cloud: state of the art and opportunities.

    PubMed

    Tang, Haixu; Jiang, Xiaoqian; Wang, Xiaofeng; Wang, Shuang; Sofia, Heidi; Fox, Dov; Lauter, Kristin; Malin, Bradley; Telenti, Amalio; Xiong, Li; Ohno-Machado, Lucila

    2016-10-13

    The outsourcing of genomic data into public cloud computing settings raises concerns over privacy and security. Significant advancements in secure computation methods have emerged over the past several years, but such techniques need to be rigorously evaluated for their ability to support the analysis of human genomic data in an efficient and cost-effective manner. With respect to public cloud environments, there are concerns about the inadvertent exposure of human genomic data to unauthorized users. In analyses involving multiple institutions, there is additional concern about data being used beyond the agreed research scope and being processed in untrusted computational environments, which may not satisfy institutional policies. To systematically investigate these issues, the NIH-funded National Center for Biomedical Computing iDASH (integrating Data for Analysis, 'anonymization' and SHaring) hosted the second Critical Assessment of Data Privacy and Protection competition to assess the capacity of cryptographic technologies for protecting computation over human genomes in the cloud and promoting cross-institutional collaboration. Data scientists were challenged to design and engineer practical algorithms for secure outsourcing of genome computation tasks in working software, whereby analyses are performed only on encrypted data. They were also challenged to develop approaches to enable secure collaboration on data from genomic studies generated by multiple organizations (e.g., medical centers) to jointly compute aggregate statistics without sharing individual-level records. The results of the competition indicated that secure computation techniques can enable comparative analysis of human genomes, but greater efficiency (in terms of compute time and memory utilization) is needed before they are sufficiently practical for real-world environments.
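
    The competition itself evaluated cryptographic techniques for computing on encrypted data; as a simpler illustration of the stated goal of jointly computing aggregate statistics without sharing individual-level records, the following Python sketch uses additive secret sharing (a different technique from homomorphic encryption; the function names and modulus are mine):

```python
import random

PRIME = 2**61 - 1  # modulus comfortably larger than any count being summed

def share(value, n_parties):
    """Split an integer into n additive shares modulo PRIME.
    Any n-1 shares are uniformly random and reveal nothing alone."""
    partial = [random.randrange(PRIME) for _ in range(n_parties - 1)]
    partial.append((value - sum(partial)) % PRIME)
    return partial

def reconstruct(shares):
    """Recombine shares; only the sum of all shares is meaningful."""
    return sum(shares) % PRIME

def secure_sum(local_counts, n_parties=3):
    """Each center splits its local count into one share per server;
    each server adds its own column of shares, and only the joint
    total is ever reconstructed - no individual count is revealed."""
    all_shares = [share(c, n_parties) for c in local_counts]
    column_sums = [sum(col) % PRIME for col in zip(*all_shares)]
    return reconstruct(column_sums)
```

    In a deployment, each `share` call would happen at a different medical center and each column sum at a different non-colluding server; here everything runs in one process purely to show the arithmetic.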

  11. CESDIS

    NASA Technical Reports Server (NTRS)

    1994-01-01

    CESDIS, the Center of Excellence in Space Data and Information Sciences, was developed jointly by NASA, the Universities Space Research Association (USRA), and the University of Maryland in 1988 to focus on the design of advanced computing techniques and data systems to support NASA Earth and space science research programs. CESDIS is operated by USRA under contract to NASA. The Director, Associate Director, Staff Scientists, and administrative staff are located on-site at NASA's Goddard Space Flight Center in Greenbelt, Maryland. The primary CESDIS mission is to increase the connection between computer science and engineering research programs at colleges and universities and NASA groups working with computer applications in Earth and space science. Research areas of primary interest at CESDIS include: 1) High performance computing, especially software design and performance evaluation for massively parallel machines; 2) Parallel input/output and data storage systems for high performance parallel computers; 3) Database and intelligent data management systems for parallel computers; 4) Image processing; 5) Digital libraries; and 6) Data compression. CESDIS funds multiyear projects at U.S. universities and colleges. Proposals are accepted in response to calls for proposals and are selected on the basis of peer reviews. Funds are provided to support faculty and graduate students working at their home institutions. Project personnel visit Goddard during academic recess periods to attend workshops, present seminars, and collaborate with NASA scientists on research projects. Additionally, CESDIS takes on shorter-duration computer science research tasks requested by NASA Goddard scientists.

  12. NASA's Pleiades Supercomputer Crunches Data For Groundbreaking Analysis and Visualizations

    NASA Image and Video Library

    2016-11-23

    The Pleiades supercomputer at NASA's Ames Research Center, recently named the 13th fastest computer in the world, provides scientists and researchers high-fidelity numerical modeling of complex systems and processes. By using detailed analyses and visualizations of large-scale data, Pleiades is helping to advance human knowledge and technology, from designing the next generation of aircraft and spacecraft to understanding the Earth's climate and the mysteries of our galaxy.

  13. Langley applications experiments data management system study. [for space shuttles

    NASA Technical Reports Server (NTRS)

    Lanham, C. C., Jr.

    1975-01-01

    A data management system study is presented that defines, in functional terms, the most cost effective ground data management system to support Advanced Technology Laboratory (ATL) flights of the space shuttle. Results from each subtask performed and the recommended system configuration for reformatting the experiment instrumentation tapes to computer compatible tape are examined. Included are cost factors for development of a mini control center for real-time support of the ATL flights.

  14. Construction of Blaze at the University of Illinois at Chicago: A Shared, High-Performance, Visual Computer for Next-Generation Cyberinfrastructure-Accelerated Scientific, Engineering, Medical and Public Policy Research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Maxine D.; Leigh, Jason

    2014-02-17

    The Blaze high-performance visual computing system serves the high-performance computing research and education needs of the University of Illinois at Chicago (UIC). Blaze consists of a state-of-the-art networked computer cluster and an ultra-high-resolution visualization system called CAVE2(TM) that is currently not available anywhere else in Illinois. This system is connected via a high-speed 100-Gigabit network to the State of Illinois' I-WIRE optical network, as well as to national and international high-speed networks such as Internet2 and the Global Lambda Integrated Facility. This enables Blaze to serve as an on-ramp to national cyberinfrastructure, such as the National Science Foundation's Blue Waters petascale computer at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign and the Department of Energy's Argonne Leadership Computing Facility (ALCF) at Argonne National Laboratory. DOE award # DE-SC005067, leveraged with NSF award #CNS-0959053 for "Development of the Next-Generation CAVE Virtual Environment (NG-CAVE)," enabled us to create a first-of-its-kind high-performance visual computing system. The UIC Electronic Visualization Laboratory (EVL) worked with two U.S. companies to advance their commercial products and maintain U.S. leadership in the global information technology economy. New applications enabled by the CAVE2/Blaze visual computing system are advancing scientific research and education in the U.S. and globally, and helping to train the next-generation workforce.

  15. 76 FR 41234 - Advanced Scientific Computing Advisory Committee Charter Renewal

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-13

    ... Secretariat, General Services Administration, notice is hereby given that the Advanced Scientific Computing... advice and recommendations concerning the Advanced Scientific Computing program in response only to... Advanced Scientific Computing Research program and recommendations based thereon; --Advice on the computing...

  16. Interoperability through standardization: Electronic mail, and X Window systems

    NASA Technical Reports Server (NTRS)

    Amin, Ashok T.

    1993-01-01

    Since the introduction of computing machines, there have been continual advances in computer and communication technologies, which are now approaching limits. The user interface has evolved from a row of switches, through character-based interfaces on teletype and then video terminals, to the present-day graphical user interface. The next significant advances are expected to come in the availability of services, such as electronic mail and directory services, as standards for applications are developed, and in 'easy to use' interfaces, such as the graphical user interfaces of Windows and the X Window System, which are being standardized. Various proprietary electronic mail (email) systems are in use within organizations at each NASA center. Each system provides email services to users within an organization; however, support for email services across organizations and across centers exists to a varying degree and is often not easy to use. A recent NASA email initiative is intended 'to provide a simple way to send email across organizational boundaries without disruption of installed base.' The initiative calls for integration of existing organizational email systems through gateways connected by a message switch, supporting X.400 and SMTP protocols, to create a NASA-wide email system, and for implementation of NASA-wide email directory services based on the OSI standard X.500. A brief overview of MSFC efforts as a part of this initiative is given. Window-based graphical user interfaces make computers easy to use. The X Window protocol was developed at the Massachusetts Institute of Technology in 1984-1985 to provide a uniform window-based interface in a distributed computing environment with heterogeneous computers. It has since become a standard supported by a number of major manufacturers, and X Window systems, terminals, workstations, and applications are becoming available.
However, the impact of its use on network traffic in the local area network environment is not well understood. It is expected that the use of X Window systems will increase at MSFC, especially for Unix-based systems. An overview of the X Window protocol is presented and its impact on network traffic is examined. It is proposed that an analytical model of X Window systems in the network environment be developed and validated through measurements to generate application and user profiles.
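
    The analytical model proposed in the abstract is not specified; as a hypothetical first cut, delay for X protocol requests on a shared LAN segment is often approximated with an M/M/1 queue (all parameter values below are illustrative, not from the paper):

```python
def mm1_mean_delay(arrival_rate, service_rate):
    """Mean time a request spends in an M/M/1 queue (waiting plus
    service): T = 1 / (mu - lambda). Rates are in requests/second."""
    if arrival_rate >= service_rate:
        raise ValueError("offered load exceeds capacity; queue is unstable")
    return 1.0 / (service_rate - arrival_rate)

def utilization(arrival_rate, service_rate):
    """Fraction of time the link is busy: rho = lambda / mu."""
    return arrival_rate / service_rate
```

    Validating such a model against measured application and user profiles, as the abstract proposes, would replace the assumed Poisson arrivals with empirical X traffic traces.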

  17. Scientific Grid activities and PKI deployment in the Cybermedia Center, Osaka University.

    PubMed

    Akiyama, Toyokazu; Teranishi, Yuuichi; Nozaki, Kazunori; Kato, Seiichi; Shimojo, Shinji; Peltier, Steven T; Lin, Abel; Molina, Tomas; Yang, George; Lee, David; Ellisman, Mark; Naito, Sei; Koike, Atsushi; Matsumoto, Shuichi; Yoshida, Kiyokazu; Mori, Hirotaro

    2005-10-01

    The Cybermedia Center (CMC), Osaka University, is a research institution that offers knowledge and technology resources obtained from advanced research in the areas of large-scale computation, information and communication, multimedia content, and education. Currently, CMC is involved in Japanese national Grid projects such as JGN II (Japan Gigabit Network), NAREGI, and BioGrid. Not limited to Japan, CMC also takes an active part in international activities such as PRAGMA. In these projects and international collaborations, CMC has developed a Grid system that allows scientists to perform their analyses by remotely controlling the world's largest ultra-high-voltage electron microscope, located at Osaka University. In another undertaking, CMC has assumed a leadership role in BioGrid by sharing its experience and knowledge of system development for the area of biology. In this paper, we give an overview of the BioGrid project and introduce the progress of the Telescience unit, which collaborates with the Telescience Project led by the National Center for Microscopy and Imaging Research (NCMIR). Furthermore, CMC collaborates with seven computing centers in Japan, NAREGI, and the National Institute of Informatics to deploy a PKI-based authentication infrastructure. The current status of this project and future collaboration with Grid projects are also delineated in this paper.

  18. Realizing the potential of the CUAHSI Water Data Center to advance Earth Science

    NASA Astrophysics Data System (ADS)

    Hooper, R. P.; Seul, M.; Pollak, J.; Couch, A.

    2015-12-01

    The CUAHSI Water Data Center (WDC) has developed a cloud-based system for data publication, discovery, and access. Key features of this system are a semantically enabled catalog for discovering data across more than 100 different services and the delivery of data and metadata in a standard format. While this represents a significant technical achievement, the purpose of the system is to support data reanalysis for advancing science. A new web-based client, HydroClient, improves data access over previous clients. This client is envisioned as the first step in a workflow that can involve visualization and analysis using web-processing services, followed by download to local computers for further analysis. The release of the WaterML library in the CRAN R package repository is an initial attempt at linking the WDC services into a larger analysis workflow. We are seeking community input on other resources required to make the WDC services more valuable in scientific research and education.
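WaterML responses of the kind the WDC delivers are XML time series; as a sketch of what a downstream analysis client must do, the snippet below extracts (time, value) pairs with Python's standard library. The element layout shown is deliberately simplified and hypothetical; real WaterML 1.1 documents use namespaced elements and carry much more metadata.

```python
import xml.etree.ElementTree as ET

# Hypothetical, simplified WaterML-like payload used purely as a parsing sketch.
doc = """<timeSeriesResponse>
  <timeSeries>
    <values>
      <value dateTime="2015-01-01T00:00:00">1.32</value>
      <value dateTime="2015-01-01T00:15:00">1.35</value>
    </values>
  </timeSeries>
</timeSeriesResponse>"""

root = ET.fromstring(doc)
# Collect every observation as a (timestamp, numeric value) pair.
series = [(v.get("dateTime"), float(v.text)) for v in root.iter("value")]
```

In a real workflow the string would come from an HTTP request to a WDC service, and namespace-qualified tags would be passed to `iter`.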

  19. The Eukaryotic Pathogen Databases: a functional genomic resource integrating data from human and veterinary parasites.

    PubMed

    Harb, Omar S; Roos, David S

    2015-01-01

    Over the past 20 years, advances in high-throughput biological techniques and the availability of computational resources, including fast Internet access, have resulted in an explosion of large genome-scale data sets ("big data"). While such data are readily available for download, personal use, and analysis from a variety of repositories, such analysis often requires computational skills that are not widely available. As a result, a number of databases have emerged to provide scientists with online tools that enable the interrogation of data without the need for sophisticated computational skills beyond basic familiarity with an Internet browser. This chapter focuses on the Eukaryotic Pathogen Databases (EuPathDB: http://eupathdb.org) Bioinformatics Resource Center (BRC) and illustrates some of the available tools and methods.

  20. Advanced digital signal processing for short haul optical fiber transmission beyond 100G

    NASA Astrophysics Data System (ADS)

    Kikuchi, Nobuhiko

    2017-01-01

    A significant increase in intra- and inter-data-center traffic is expected from the rapid spread of network applications such as SNS, IoT, mobile, and cloud computing, and the need for ultra-high-speed, cost-effective short- to medium-reach optical fiber links beyond 100 Gbit/s is growing. Such high-speed links typically use multilevel modulation to lower the signaling speed, which in turn faces serious challenges in limited loss budget and waveform-distortion tolerance. One of the promising techniques to overcome them is advanced digital signal processing (DSP), and we review various DSP applications for short- to medium-reach links.

  1. A Hybrid Cloud Computing Service for Earth Sciences

    NASA Astrophysics Data System (ADS)

    Yang, C. P.

    2016-12-01

    Cloud computing is becoming the norm for providing computing capabilities for advancing the Earth sciences, including big Earth data management, processing, analytics, model simulation, and many other aspects. A hybrid spatiotemporal cloud computing service has been built at the George Mason NSF Spatiotemporal Innovation Center to meet these demands. This paper reports on several aspects of the service: 1) the hardware includes 500 computing servers and close to 2 PB of storage, as well as connections to XSEDE Jetstream and the Caltech experimental cloud computing environment for resource sharing; 2) the cloud service is geographically distributed across the east coast, the west coast, and the central region; 3) the cloud includes private clouds managed using OpenStack and Eucalyptus, with DC2 used to bridge these and the public AWS cloud for interoperability and for sharing computing resources when high demand surges; 4) the cloud service supports the NSF EarthCube program through the ECITE project, and ESIP through the ESIP cloud computing cluster, the semantics testbed cluster, and other clusters; 5) the cloud service is also available to the Earth science communities for conducting geoscience research. A brief introduction on how to use the cloud service is included.
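The bridging of private clouds to AWS "when high demand surges" amounts to a bursting policy. A minimal sketch of such a policy is below; the function name, node-count interface, and the 0.85 threshold are assumptions for illustration, not details of the center's DC2 configuration.

```python
# Sketch of a cloud-bursting placement policy: new jobs stay on the private
# cloud until its utilization surges past a threshold, then spill over to
# the public cloud. All names and the threshold value are assumed.
def place_job(private_used_nodes, private_total_nodes, threshold=0.85):
    """Return which cloud should receive the next job."""
    utilization = private_used_nodes / private_total_nodes
    return "private" if utilization < threshold else "public"

normal = place_job(300, 500)  # light load stays on the private cloud
surge = place_job(480, 500)   # a surge spills over to the public cloud
```

A production bridge would also weigh data locality and egress cost before spilling a job to the public side.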

  2. Collaborating and sharing data in epilepsy research.

    PubMed

    Wagenaar, Joost B; Worrell, Gregory A; Ives, Zachary; Dümpelmann, Matthias; Litt, Brian; Schulze-Bonhage, Andreas

    2015-06-01

    Technological advances are dramatically accelerating translational research in epilepsy. Neurophysiology, imaging, and metadata are now recorded digitally in most centers, enabling quantitative analysis. Basic and translational research opportunities to use these data are exploding, but academic and funding cultures prevent this potential from being realized. Research on epileptogenic networks, antiepileptic devices, and biomarkers could progress rapidly if collaborative efforts to digest this "big neuro data" could be organized. Higher temporal and spatial resolution data are driving the need for novel multidimensional visualization and analysis tools. Crowd-sourced science, of the same kind that drives innovation in computer science, could easily be mobilized for these tasks, were it not for competition for funding and attribution and the lack of standard data formats and platforms. As these efforts mature, there is a great opportunity to advance epilepsy research through data sharing and increased collaboration within the international research community.

  3. Computer aided system engineering for space construction

    NASA Technical Reports Server (NTRS)

    Racheli, Ugo

    1989-01-01

    This viewgraph presentation covers the following topics. Construction activities envisioned for the assembly of large platforms in space (as well as interplanetary spacecraft and bases on extraterrestrial surfaces) require computational tools that exceed the capability of conventional construction management programs. The Center for Space Construction is investigating the requirements for new computational tools and, at the same time, suggesting the expansion of graduate and undergraduate curricula to include proficiency in Computer Aided Engineering (CAE) through design courses and individual or team projects in advanced space systems design. In the center's research, special emphasis is placed on problems of constructability and of the interruptibility of planned activity sequences to be carried out by crews operating under hostile environmental conditions. The departure point for the planned work is the acquisition of the MCAE I-DEAS software, developed by the Structural Dynamics Research Corporation (SDRC), and its expansion to the level of capability denoted by the acronym IDEAS**2, currently used for configuration maintenance on Space Station Freedom. In addition to improving proficiency in the use of I-DEAS and IDEAS**2, it is contemplated that new software modules will be developed to expand the architecture of IDEAS**2. Such modules will deal with those analyses that require the integration of a space platform's configuration with a breakdown of planned construction activities and with a failure modes analysis to support computer aided system engineering (CASE) applied to space construction.

  4. Advanced biologically plausible algorithms for low-level image processing

    NASA Astrophysics Data System (ADS)

    Gusakova, Valentina I.; Podladchikova, Lubov N.; Shaposhnikov, Dmitry G.; Markin, Sergey N.; Golovan, Alexander V.; Lee, Seong-Whan

    1999-08-01

    At present, in computer vision, approaches based on modeling biological vision mechanisms are being extensively developed. However, up to now, real-world image processing has had no effective solution within either biologically inspired or conventional approaches. Evidently, new algorithms and system architectures based on advanced biological motivation should be developed to solve the computational problems related to this visual task. A basic problem that must be solved to create an effective artificial visual system for processing real-world images is the search for new algorithms of low-level image processing, which to a great extent determine system performance. In the present paper, the results of psychophysical experiments and several advanced biologically motivated algorithms for low-level processing are presented. These algorithms are based on local space-variant filtering, context encoding of the visual information presented at the center of the input window, and automatic detection of perceptually important image fragments. The core of the latter algorithm is the use of local feature conjunctions, such as non-collinear oriented segments, and composite feature-map formation. The developed algorithms were integrated into a foveal active vision model, MARR. It is supposed that the proposed algorithms may significantly improve model performance in real-world image processing during memorization, search, and recognition.

  5. Advanced interdisciplinary undergraduate program: light engineering

    NASA Astrophysics Data System (ADS)

    Bakholdin, Alexey; Bougrov, Vladislav; Voznesenskaya, Anna; Ezhova, Kseniia

    2016-09-01

    The advanced-level undergraduate educational program "Light Engineering" is focused on developing scientific learning outcomes and training professionals whose activities lie in the interdisciplinary fields of optical engineering and technical physics. The program gives practical experience in the transmission, reception, storage, processing, and display of information using optoelectronic devices, automation of optical system design, computer image modeling, and automated quality control and characterization of optical devices. The program is implemented in accordance with the educational standards of ITMO University. The specific features of the program are practice- and problem-based learning, implemented by engaging students in research and projects, internships at enterprises, and placements in leading Russian and international research and education centers. The modular structure of the program and a significant proportion of elective disciplines support an individual learning path for each student. Learning outcomes of the program's graduates include theoretical knowledge and skills in the natural sciences and core professional disciplines, deep knowledge of modern computer technologies, research expertise, design skills, and expertise in optical and optoelectronic systems and devices.

  6. Retinal imaging analysis based on vessel detection.

    PubMed

    Jamal, Arshad; Hazim Alkawaz, Mohammed; Rehman, Amjad; Saba, Tanzila

    2017-07-01

    With the increasing advancement of digital imaging and computing power, computationally intelligent technologies are in high demand in ophthalmologic care and treatment. In the current research, Retina Image Analysis (RIA) was developed for optometrists at the Eye Care Center of Management and Science University. This research aims to analyze the retina through vessel detection. The RIA assists in the analysis of retinal images, and specialists are offered various options, such as saving, processing, and analyzing retinal images, through its advanced interface layout. Additionally, RIA assists in the selection of vessel segments, processing these vessels by calculating their diameter, standard deviation, and length, and displaying the detected vessels on the retina. The Agile Unified Process was adopted as the development methodology. To conclude, Retina Image Analysis may help optometrists gain a better understanding when analyzing a patient's retina. The Retina Image Analysis procedure was developed using MATLAB (R2011b). Promising results are attained that are comparable with the state of the art. © 2017 Wiley Periodicals, Inc.
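The per-vessel measurements mentioned above (diameter, standard deviation, length) can be sketched from a binary segmentation mask. This is an assumed illustration in Python, not the paper's MATLAB code: each mask row is treated as one cross-section, so the diameter is the count of vessel pixels per row and the length is the number of rows the vessel crosses.

```python
from statistics import mean, pstdev

def vessel_stats(mask):
    """Estimate pixel-unit stats for one vessel segment in a binary mask."""
    # One diameter sample per cross-section (row) that intersects the vessel.
    diameters = [sum(row) for row in mask if any(row)]
    return {
        "length_px": len(diameters),          # rows intersecting the vessel
        "mean_diameter_px": mean(diameters),
        "std_diameter_px": pstdev(diameters),
    }

# Tiny hypothetical mask: a short vessel 2-3 pixels wide.
mask = [
    [0, 1, 1, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 1, 1, 0],
]
stats = vessel_stats(mask)
```

Real pipelines would measure the diameter perpendicular to the vessel centerline rather than along image rows, but the summary statistics are computed the same way.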

  7. Towards Portable Large-Scale Image Processing with High-Performance Computing.

    PubMed

    Huo, Yuankai; Blaber, Justin; Damon, Stephen M; Boyd, Brian D; Bao, Shunxing; Parvathaneni, Prasanna; Noguera, Camilo Bermudez; Chaganti, Shikha; Nath, Vishwesh; Greer, Jasmine M; Lyu, Ilwoo; French, William R; Newton, Allen T; Rogers, Baxter P; Landman, Bennett A

    2018-05-03

    High-throughput, large-scale medical image computing demands tight integration of high-performance computing (HPC) infrastructure for data storage, job distribution, and image processing. The Vanderbilt University Institute for Imaging Science (VUIIS) Center for Computational Imaging (CCI) has constructed a large-scale image storage and processing infrastructure composed of (1) a large-scale image database using the eXtensible Neuroimaging Archive Toolkit (XNAT), (2) a content-aware job scheduling platform using the Distributed Automation for XNAT (DAX) pipeline automation tool, and (3) a wide variety of encapsulated image processing pipelines called "spiders." The VUIIS CCI medical image data storage and processing infrastructure has housed and processed nearly half a million medical image volumes with the Vanderbilt Advanced Computing Center for Research and Education (ACCRE), the HPC facility at Vanderbilt University. The initial deployment was native (i.e., direct installation on bare-metal servers) within the ACCRE hardware and software environments, which led to issues of portability and sustainability. First, it could be laborious to deploy the entire VUIIS CCI medical image data storage and processing infrastructure to another HPC center with different hardware infrastructure, library availability, and software permission policies. Second, the spiders were not developed in an isolated manner, which led to software dependency issues during system upgrades or remote software installations. To address these issues, herein we describe recent innovations using containerization techniques with XNAT/DAX to isolate the VUIIS CCI medical image data storage and processing infrastructure from the underlying hardware and software environments. The newly presented XNAT/DAX solution has the following new features: (1) multi-level portability, from the system level to the application level; (2) flexible and dynamic software development and expansion; and (3) scalable spider deployment compatible with HPC clusters and local workstations.
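To make the containerization idea concrete, here is a sketch of how an isolated "spider" run might be assembled as a container invocation. The image name, mount points, and flags are hypothetical; DAX's actual invocation differs, so this shows only the general pattern of isolating a pipeline from the host environment.

```python
import shlex

def spider_command(image, in_dir, out_dir, args=()):
    """Build (but do not execute) a docker command for one pipeline run."""
    cmd = [
        "docker", "run", "--rm",
        "-v", f"{in_dir}:/input:ro",   # read-only input volume
        "-v", f"{out_dir}:/output",    # writable output volume
        image,
    ]
    cmd.extend(args)                   # pipeline-specific arguments
    return cmd

# Hypothetical spider image and subject identifier.
cmd = spider_command("vuiis/example-spider:1.0", "/data/in", "/data/out",
                     ("--subject", "S001"))
printable = shlex.join(cmd)            # shell-safe form for logging
```

Because every dependency lives inside the image, the same command runs unchanged on an HPC node or a local workstation, which is the portability property the abstract describes.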

  8. Qualification and Certification of 3D Printed Parts for Naval Ships

    DTIC Science & Technology

    2017-12-01

    advances in computer systems, power generators, missile capabilities and product construction, yet they rarely change how something is created, designed or...settings), but section 1.3 takes a look at what must be determined during the design process to ensure the best product can be created. As seen in Chapter...center as well as the printed products. Figure 33. Test Block Design. All test cubes were immediately labeled upon being removed from the

  9. Computational Simulation of High Energy Density Plasmas

    DTIC Science & Technology

    2009-10-30

    the imploding liner. The PFS depends on a lithium barrier foil slowing the advance of deuterium up the coaxial gun to the corner. There the plasma...the coaxial gun section, and Figure 4 shows the physical state of the plasma just prior to pinch. Figure 5 shows neutron yield reaching 10^14 in this...details the channel geometry between the center cylinder and coaxial gas gun. The deuterium injection starts when the pressure of the deuterium gas in

  10. High Performance Computing and Enabling Technologies for Nano and Bio Systems and Interfaces

    DTIC Science & Technology

    2014-12-12

    data analysis of protein–aptamer interaction systems were developed. All research investigations contributed to the research, education, and training...achieved a 3.5 GPA to 4.0 (4.0 max scale): Number of graduating undergraduates funded by a DoD funded Center of Excellence grant for Education, Research...Research, education and training of future US work force in such nano-bio systems have significant potential for advancement in medical and health

  11. A visiting scientist program in atmospheric sciences for the Goddard Space Flight Center

    NASA Technical Reports Server (NTRS)

    Davis, M. H.

    1989-01-01

    A visiting scientist program was conducted in the atmospheric sciences and related areas at the Goddard Laboratory for Atmospheres. Research was performed in mathematical analysis as applied to computer modeling of the atmospheres; development of atmospheric modeling programs; analysis of remotely sensed atmospheric, surface, and oceanic data and its incorporation into atmospheric models; development of advanced remote sensing instrumentation; and related research areas. The specific research efforts are detailed by tasks.

  12. National Combustion Code, a Multidisciplinary Combustor Design System, Will Be Transferred to the Commercial Sector

    NASA Technical Reports Server (NTRS)

    Steele, Gynelle C.

    1999-01-01

    The NASA Lewis Research Center and Flow Parametrics will enter into an agreement to commercialize the National Combustion Code (NCC). This multidisciplinary combustor design system utilizes computer-aided design (CAD) tools for geometry creation, advanced mesh generators for creating solid model representations, a common framework for fluid flow and structural analyses, modern postprocessing tools, and parallel processing. This integrated system can facilitate and enhance various phases of the design and analysis process.

  13. Design of an Advanced Modular Automated Evaluation System for Experimental High Power SGTOS

    DTIC Science & Technology

    2013-06-01

    POWER SGTOS Shelby Lacouture, Kevin Lawson, Stephen Bayne, Michael Giesselmann, Heather O'Brien, Aderinto Ogunniyi, Charles J...Travis T. Vollmer and Michael G. Giesselmann, "Rapid Capacitor Charging Power Supply for an 1800J PFN," Proceedings of the 2012 Power Modulator and High Voltage Conference, San Diego, CA, June 3-7, 2012...Scozzie, Center for Pulsed Power and Power Electronics, Department of Electrical & Computer Engineering, Texas Tech

  14. NASA Ames aerospace systems directorate research

    NASA Technical Reports Server (NTRS)

    Albers, James A.

    1991-01-01

    The Aerospace Systems Directorate is one of four research directorates at the NASA Ames Research Center. The Directorate conducts research and technology development for advanced aircraft and aircraft systems in intelligent computational systems and human-machine systems for aeronautics and space. The Directorate manages research and aircraft technology development projects, and operates and maintains major wind tunnels and flight simulation facilities. The Aerospace Systems Directorate's research and technology as it relates to NASA agency goals and specific strategic thrusts are discussed.

  15. AGENDA: A task organizer and scheduler

    NASA Technical Reports Server (NTRS)

    Fratter, Isabelle

    1993-01-01

    AGENDA will be the main tool used in running the SPOT 4 Earth Observation Satellite's Operational Control Center. It will reduce the operator's workload and make tasks easier. AGENDA sets up the work plan for a day of operations, automatically puts the day's tasks into sequence and monitors their progress in real time. Monitoring is centralized, and the tasks are run on different computers in the Center. Once informed of any problems, the operator can intervene at any time while an activity is taking place. To carry out the various functions, the operator has an advanced, efficient, ergonomic graphic interface based on X11 and OSF/MOTIF. Since AGENDA is the heart of the Center, it has to satisfy several constraints that have been taken into account during the various development phases. AGENDA is currently in its final development stages.

  16. Strategic Computing Computer Vision: Taking Image Understanding To The Next Plateau

    NASA Astrophysics Data System (ADS)

    Simpson, R. L., Jr.

    1987-06-01

    The overall objective of the Strategic Computing (SC) Program of the Defense Advanced Research Projects Agency (DARPA) is to develop and demonstrate a new generation of machine intelligence technology which can form the basis for more capable military systems in the future and also maintain a position of world leadership for the US in computer technology. Begun in 1983, SC represents a focused research strategy for accelerating the evolution of new technology and its rapid prototyping in realistic military contexts. Among the very ambitious demonstration prototypes being developed within the SC Program are: 1) the Pilot's Associate, which will aid the pilot in route planning, aerial target prioritization, evasion of missile threats, and aircraft emergency safety procedures during flight; 2) two battle management projects, the first of which, for the Army, is just getting started: the AirLand Battle Management (ALBM) program, which will use knowledge-based systems technology to assist in the generation and evaluation of tactical options and plans at the Corps level; 3) the second, more established battle management program, for the Navy: the Fleet Command Center Battle Management Program (FCCBMP) at Pearl Harbor. The FCCBMP is employing knowledge-based systems and natural language technology in an evolutionary testbed situated in an operational command center to demonstrate and evaluate intelligent decision aids which can assist in the evaluation of fleet readiness and explore alternatives during contingencies; and 4) the Autonomous Land Vehicle (ALV), which integrates, in a major robotic testbed, the technologies for dynamic image understanding and knowledge-based route planning with replanning during execution, hosted on new advanced parallel architectures.
The goal of the Strategic Computing computer vision technology base (SCVision) is to develop generic technology that will enable the construction of complete, robust, high performance image understanding systems to support a wide range of DoD applications. Possible applications include autonomous vehicle navigation, photointerpretation, smart weapons, and robotic manipulation. This paper provides an overview of the technical and program management plans being used in evolving this critical national technology.

  17. Distributed Observer Network

    NASA Technical Reports Server (NTRS)

    2008-01-01

    NASA's advanced visual simulations are essential for analyses associated with life cycle planning, design, training, testing, operations, and evaluation. Kennedy Space Center, in particular, uses simulations for ground services and space exploration planning in an effort to reduce risk and costs while improving safety and performance. However, it has been difficult to circulate and share the results of simulation tools among the field centers, and distance and travel expenses have made timely collaboration even harder. In response, NASA joined with Valador Inc. to develop the Distributed Observer Network (DON), a collaborative environment that leverages game technology to bring 3-D simulations to conventional desktop and laptop computers. DON enables teams of engineers working on design and operations to view and collaborate on 3-D representations of data generated by authoritative tools. DON takes models and telemetry from these sources and, using commercial game engine technology, displays the simulation results in a 3-D visual environment. Multiple widely dispersed users, working individually or in groups, can view and analyze simulation results on desktop and laptop computers in real time.

  18. Stability of hypersonic compression cones

    NASA Astrophysics Data System (ADS)

    Reed, Helen; Kuehl, Joseph; Perez, Eduardo; Kocian, Travis; Oliviero, Nicholas

    2012-11-01

    Our activities focus on the identification and understanding of the second-mode instability for representative configurations in hypersonic flight. These include the Langley 93-10 flared cone and the Purdue compression cone, both at 0 degrees angle of attack at Mach 6. Through application of nonlinear parabolized stability equations (NPSE) and linear parabolized stability equations (PSE) to both geometries, it is concluded that mean-flow distortion tends to amplify frequencies below the peak frequency and stabilize those above it by modifying the boundary-layer thickness. As the initial disturbance amplitude is increased and/or a broad-spectrum disturbance is introduced, direct numerical simulation (DNS) or NPSE appear to be the proper choices to model the evolution, and relative evolution, because these computational tools capture such nonlinear effects (mean-flow distortion). Support from the AFOSR/NASA National Center for Hypersonic Research in Laminar-Turbulent Transition through Grant FA9550-09-1-0341 is gratefully acknowledged. The authors also thank Pointwise, AeroSoft, and the Texas Advanced Computing Center (TACC).

  19. Patient-Centered e-Health Record over the Cloud.

    PubMed

    Koumaditis, Konstantinos; Themistocleous, Marinos; Vassilacopoulos, George; Prentza, Andrianna; Kyriazis, Dimosthenis; Malamateniou, Flora; Maglaveras, Nicos; Chouvarda, Ioanna; Mourouzis, Alexandros

    2014-01-01

    The purpose of this paper is to introduce the Patient-Centered e-Health (PCEH) conceptual aspects alongside a multidisciplinary project that combines state-of-the-art technologies such as cloud computing. The project, by combining several aspects of PCEH, such as (a) an electronic Personal Healthcare Record (e-PHR), (b) homecare telemedicine technologies, and (c) e-prescribing, e-referral, and e-learning, with advanced technologies like cloud computing and Service-Oriented Architecture (SOA), will lead to an innovative integrated e-health platform with many benefits for society, the economy, industry, and the research community. To achieve this, a consortium of experts, from both industry (two companies, one hospital, and one healthcare organization) and academia (three universities), was set up to investigate, analyse, design, build, and test the new platform. This paper provides insights into the PCEH concept and the current stage of the project. In doing so, we aim to increase awareness of this important endeavor and to share the lessons learned so far throughout our work.

  20. A Future State for NASA Laboratories - Working in the 21st Century

    NASA Technical Reports Server (NTRS)

    Kegelman, Jerome T.; Harris, Charles E.; Antcliff, Richard R.; Bushnell, Dennis M.; Dwoyer, Douglas L.

    2009-01-01

    The name "21st Century Laboratory" reflects an emerging concept of how NASA (and the world) will conduct research in the very near future. Our approach is to plan carefully for significant technological changes in products, organizations, and society. The NASA mission can benefit from these changes, provided the Agency prepares for the role of 21st Century laboratories in research and technology development and deployment in this new age. It has been clear for some time that the technology revolutions, the technology "mega-trends" that we are now in the midst of, all have a common element centered on advanced computational modeling of small-scale physics. Whether it is nanotechnology, biotechnology, or advanced computational technology, all of these megatrends are converging on science at the very small scale, where it is profoundly important to consider the quantum effects at play in physics at that scale. Whether it is the biotechnological creation of "nanites" designed to mimic our immune system or the creation of nanoscale information-technology devices allowing an order-of-magnitude increase in computational capability, all involve the quantum physics that lies at the heart of these revolutionary changes.

  1. 76 FR 31945 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-02

    ... DEPARTMENT OF ENERGY Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy... teleconference meeting of the Advanced Scientific Computing Advisory Committee (ASCAC). The Federal [email protected] . FOR FURTHER INFORMATION CONTACT: Melea Baker, Office of Advanced Scientific Computing...

  2. Description of MSFC engineering photographic analysis

    NASA Technical Reports Server (NTRS)

    Earle, Jim; Williams, Frank

    1988-01-01

    Utilizing a background that includes the development of basic launch and test photographic coverage and analysis procedures, the MSFC Photographic Evaluation Group has built a body of experience that enables it to effectively satisfy MSFC's engineering photographic analysis needs. By combining the basic soundness of reliable, proven techniques of the past with newer technical advances in computers and computer-related devices, the MSFC Photo Evaluation Group is positioned to continue providing photo and video analysis services center-wide and NASA-wide; to supply an improving photo-analysis product to meet the photo-evaluation needs of the future; and to set new standards in the state of the art of photo analysis of dynamic events.

  3. NAS Technical Summaries, March 1993 - February 1994

    NASA Technical Reports Server (NTRS)

    1995-01-01

    NASA created the Numerical Aerodynamic Simulation (NAS) Program in 1987 to focus resources on solving critical problems in aeroscience and related disciplines by utilizing the power of the most advanced supercomputers available. The NAS Program provides scientists with the necessary computing power to solve today's most demanding computational fluid dynamics problems and serves as a pathfinder in integrating leading-edge supercomputing technologies, thus benefitting other supercomputer centers in government and industry. The 1993-94 operational year concluded with 448 high-speed processor projects and 95 parallel projects representing NASA, the Department of Defense, other government agencies, private industry, and universities. This document provides a glimpse at some of the significant scientific results for the year.

  4. NAS technical summaries. Numerical aerodynamic simulation program, March 1992 - February 1993

    NASA Technical Reports Server (NTRS)

    1994-01-01

    NASA created the Numerical Aerodynamic Simulation (NAS) Program in 1987 to focus resources on solving critical problems in aeroscience and related disciplines by utilizing the power of the most advanced supercomputers available. The NAS Program provides scientists with the necessary computing power to solve today's most demanding computational fluid dynamics problems and serves as a pathfinder in integrating leading-edge supercomputing technologies, thus benefitting other supercomputer centers in government and industry. The 1992-93 operational year concluded with 399 high-speed processor projects and 91 parallel projects representing NASA, the Department of Defense, other government agencies, private industry, and universities. This document provides a glimpse at some of the significant scientific results for the year.

  5. Clinical and pathologic parameters predicting recurrence of facial basal cell carcinoma: a retrospective audit in an advanced care center.

    PubMed

    Troeltzsch, Matthias; Probst, Florian A; Knösel, Thomas; Mast, Gerson; Ehrenfeld, Michael; Otto, Sven

    2016-11-01

    This study was designed to investigate the associations between clinical, pathologic, and therapeutic parameters of facial basal cell carcinoma (BCC) and recurrence rates in patients treated at an advanced care center. A retrospective cohort study was performed. Patients who presented to an advanced care center within a 6-year period with facial BCC and who received surgical treatment were included for further review according to predefined inclusion criteria. The predictor variable was defined as "negative-margin (R0) resection after the first surgery". The primary outcome variable was defined as "BCC recurrence". Descriptive and inferential statistics were computed. The significance level was set at P ≤ 0.05. A total of 71 patients (29 female, 42 male; average age: 71.76 years) were found to meet all of the study inclusion criteria. All BCCs had been referred, and 50.7% had been submitted to previous surgery. The mean ± standard deviation tumor diameter was 2.3 ± 1.8 cm. Recurrence of BCC was observed in 11 patients (15.5%). Large tumor diameters, increased patient age, and failure to achieve R0 resection at the first surgical appointment significantly increased recurrence rates. Complete facial BCC excision at the first surgical appointment is pivotal in reducing the likelihood of recurrence. The influence of the anatomic location of facial BCC on recurrence rates may be limited. © 2016 The International Society of Dermatology.

  6. 75 FR 9887 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-04

    ... DEPARTMENT OF ENERGY Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy... Advanced Scientific Computing Advisory Committee (ASCAC). Federal Advisory Committee Act (Pub. L. 92-463... INFORMATION CONTACT: Melea Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building...

  7. 76 FR 9765 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-22

    ... DEPARTMENT OF ENERGY Advanced Scientific Computing Advisory Committee AGENCY: Office of Science... Advanced Scientific Computing Advisory Committee (ASCAC). The Federal Advisory Committee Act (Pub. L. 92... INFORMATION CONTACT: Melea Baker, Office of Advanced Scientific Computing Research, SC-21/Germantown Building...

  8. 77 FR 45345 - DOE/Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-31

    ... Recompetition results for Scientific Discovery through Advanced Computing (SciDAC) applications Co-design Public... DEPARTMENT OF ENERGY DOE/Advanced Scientific Computing Advisory Committee AGENCY: Office of... the Advanced Scientific Computing Advisory Committee (ASCAC). The Federal Advisory Committee Act (Pub...

  9. 75 FR 64720 - DOE/Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-20

    ... DEPARTMENT OF ENERGY DOE/Advanced Scientific Computing Advisory Committee AGENCY: Department of... the Advanced Scientific Computing Advisory Committee (ASCAC). Federal Advisory Committee Act (Pub. L.... FOR FURTHER INFORMATION CONTACT: Melea Baker, Office of Advanced Scientific Computing Research; SC-21...

  10. Pervasive brain monitoring and data sharing based on multi-tier distributed computing and linked data technology

    PubMed Central

    Zao, John K.; Gan, Tchin-Tze; You, Chun-Kai; Chung, Cheng-En; Wang, Yu-Te; Rodríguez Méndez, Sergio José; Mullen, Tim; Yu, Chieh; Kothe, Christian; Hsiao, Ching-Teng; Chu, San-Liang; Shieh, Ce-Kuen; Jung, Tzyy-Ping

    2014-01-01

    EEG-based brain-computer interfaces (BCIs) face basic challenges in real-world applications. The technical difficulties in developing truly wearable BCI systems capable of making reliable real-time predictions of users' cognitive states in dynamic real-life situations may seem almost insurmountable at times. Fortunately, recent advances in miniature sensors, wireless communication, and distributed computing technologies have offered promising ways to bridge these chasms. In this paper, we report an attempt to develop a pervasive on-line EEG-BCI system using state-of-the-art technologies including multi-tier Fog and Cloud Computing, semantic Linked Data search, and adaptive prediction/classification models. To verify our approach, we implemented a pilot system employing wireless dry-electrode EEG headsets and MEMS motion sensors as the front-end devices, Android mobile phones as the personal user interfaces, compact personal computers as the near-end Fog Servers, and the computer clusters hosted by the Taiwan National Center for High-performance Computing (NCHC) as the far-end Cloud Servers. We succeeded in conducting synchronous multi-modal global data streaming in March 2013 and then running a multi-player on-line EEG-BCI game in September 2013. We are currently working with the ARL Translational Neuroscience Branch to use our system in real-life personal stress monitoring and with the UCSD Movement Disorder Center to conduct in-home Parkinson's disease patient monitoring experiments. We shall proceed to develop the necessary BCI ontology and introduce automatic semantic annotation and progressive model refinement capabilities to our system. PMID:24917804

  11. Pervasive brain monitoring and data sharing based on multi-tier distributed computing and linked data technology.

    PubMed

    Zao, John K; Gan, Tchin-Tze; You, Chun-Kai; Chung, Cheng-En; Wang, Yu-Te; Rodríguez Méndez, Sergio José; Mullen, Tim; Yu, Chieh; Kothe, Christian; Hsiao, Ching-Teng; Chu, San-Liang; Shieh, Ce-Kuen; Jung, Tzyy-Ping

    2014-01-01

    EEG-based brain-computer interfaces (BCIs) face basic challenges in real-world applications. The technical difficulties in developing truly wearable BCI systems capable of making reliable real-time predictions of users' cognitive states in dynamic real-life situations may seem almost insurmountable at times. Fortunately, recent advances in miniature sensors, wireless communication, and distributed computing technologies have offered promising ways to bridge these chasms. In this paper, we report an attempt to develop a pervasive on-line EEG-BCI system using state-of-the-art technologies including multi-tier Fog and Cloud Computing, semantic Linked Data search, and adaptive prediction/classification models. To verify our approach, we implemented a pilot system employing wireless dry-electrode EEG headsets and MEMS motion sensors as the front-end devices, Android mobile phones as the personal user interfaces, compact personal computers as the near-end Fog Servers, and the computer clusters hosted by the Taiwan National Center for High-performance Computing (NCHC) as the far-end Cloud Servers. We succeeded in conducting synchronous multi-modal global data streaming in March 2013 and then running a multi-player on-line EEG-BCI game in September 2013. We are currently working with the ARL Translational Neuroscience Branch to use our system in real-life personal stress monitoring and with the UCSD Movement Disorder Center to conduct in-home Parkinson's disease patient monitoring experiments. We shall proceed to develop the necessary BCI ontology and introduce automatic semantic annotation and progressive model refinement capabilities to our system.

  12. Flight Avionics Hardware Roadmap

    NASA Technical Reports Server (NTRS)

    Some, Raphael; Goforth, Monte; Chen, Yuan; Powell, Wes; Paulick, Paul; Vitalpur, Sharada; Buscher, Deborah; Wade, Ray; West, John; Redifer, Matt

    2014-01-01

    The Avionics Technology Roadmap takes an 80% approach to technology investment in spacecraft avionics. It delineates a suite of technologies covering foundational, component, and subsystem-levels, which directly support 80% of future NASA space mission needs. The roadmap eschews high cost, limited utility technologies in favor of lower cost, and broadly applicable technologies with high return on investment. The roadmap is also phased to support future NASA mission needs and desires, with a view towards creating an optimized investment portfolio that matures specific, high impact technologies on a schedule that matches optimum insertion points of these technologies into NASA missions. The roadmap looks out over 15+ years and covers some 114 technologies, 58 of which are targeted for TRL6 within 5 years, with 23 additional technologies to be at TRL6 by 2020. Of that number, only a few are recommended for near term investment:

    1. Rad Hard High Performance Computing
    2. Extreme temperature capable electronics and packaging
    3. RFID/SAW-based spacecraft sensors and instruments
    4. Lightweight, low power 2D displays suitable for crewed missions
    5. Radiation tolerant Graphics Processing Unit to drive crew displays
    6. Distributed/reconfigurable, extreme temperature and radiation tolerant, spacecraft sensor controller and sensor modules
    7. Spacecraft to spacecraft, long link data communication protocols
    8. High performance and extreme temperature capable C&DH subsystem

    In addition, the roadmap team recommends several other activities that it believes are necessary to advance avionics technology across NASA:

    - Engage the OCT roadmap teams to coordinate avionics technology advances and infusion into these roadmaps and their mission set
    - Charter a team to develop a set of use cases for future avionics capabilities in order to decouple this roadmap from specific missions
    - Partner with the Software Steering Committee to coordinate computing hardware and software technology roadmaps and investment recommendations
    - Continue monitoring foundational technologies upon which future avionics technologies will be dependent, e.g., RHBD and COTS semiconductor technologies

  13. Structural Analysis Made 'NESSUSary'

    NASA Technical Reports Server (NTRS)

    2005-01-01

    Everywhere you look, chances are something that was designed and tested by a computer will be in plain view. Computers are now utilized to design and test just about everything imaginable, from automobiles and airplanes to bridges and boats, and from elevators and escalators to streets and skyscrapers. Computer-design engineering first emerged in the 1970s, in the automobile and aerospace industries. Because computers were still in their infancy, however, architects and engineers at the time were limited to producing only designs similar to hand-drafted drawings. (At the end of the 1970s, a typical computer-aided design system was a 16-bit minicomputer with a price tag of $125,000.) Eventually, computers became more affordable and related software became more sophisticated, offering designers the "bells and whistles" to go beyond the limits of basic drafting and rendering and venture into more skillful applications. One of the major advancements was the ability to test the objects being designed for the probability of failure. This advancement was especially important for the aerospace industry, where complicated and expensive structures are designed. The ability to perform reliability and risk assessment without extensive hardware testing is critical to design and certification. In 1984, NASA initiated the Probabilistic Structural Analysis Methods (PSAM) project at Glenn Research Center to develop analysis methods and computer programs for the probabilistic structural analysis of select engine components for the current Space Shuttle and future space propulsion systems. NASA envisioned that these methods and computational tools would play a critical role in establishing increased system performance and durability, and would assist in structural system qualification and certification. Not only was the PSAM project beneficial to aerospace, it paved the way for a commercial risk-probability tool that is evaluating risks in diverse, down-to-Earth applications.
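    The probability-of-failure idea behind probabilistic structural analysis can be sketched as a simple Monte Carlo estimate. The distributions, parameter values, and function below are illustrative assumptions for a toy stress-vs-strength problem, not PSAM's actual methods or data:

```python
import random

def failure_probability(strength_mu, strength_sigma, load_mu, load_sigma,
                        n=50_000, seed=42):
    """Estimate P(load > strength) by sampling both quantities from
    normal distributions and counting the fraction of failures."""
    rng = random.Random(seed)
    failures = sum(
        rng.gauss(load_mu, load_sigma) > rng.gauss(strength_mu, strength_sigma)
        for _ in range(n)
    )
    return failures / n

# A component whose strength (mean 500 MPa) comfortably exceeds its
# applied load (mean 300 MPa) should fail only very rarely.
p = failure_probability(500, 40, 300, 40)
```

    With these invented numbers the safety margin is large, so the estimated probability comes out near zero; tightening the margin or widening the scatter raises it, which is exactly the sensitivity a probabilistic analysis is meant to expose.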

  14. Human factors issues in the use of artificial intelligence in air traffic control. October 1990 Workshop

    NASA Technical Reports Server (NTRS)

    Hockaday, Stephen; Kuhlenschmidt, Sharon (Editor)

    1991-01-01

    The objective of the workshop was to explore the role of human factors in facilitating the introduction of artificial intelligence (AI) to advanced air traffic control (ATC) automation concepts. AI is an umbrella term that continually expands to cover a variety of techniques in which machines take actions based upon dynamic, external stimuli. AI methods can be implemented using more traditional programming languages such as LISP or PROLOG, or they can be implemented using state-of-the-art techniques such as object-oriented programming, neural nets (hardware or software), and knowledge-based expert systems. As this technology advances and as increasingly powerful computing platforms become available, the use of AI to enhance ATC systems can be realized. Substantial efforts along these lines are already being undertaken at the FAA Technical Center, NASA Ames Research Center, academic institutions, industry, and elsewhere. Although it is clear that the technology is ripe for bringing computer automation to ATC systems, the proper scope and role of automation are not at all apparent. The major concern is how to combine human controllers with computer technology. A wide spectrum of options exists, ranging from using automation only to provide extra tools to augment decision making by human controllers to turning over moment-by-moment control to automated systems and using humans as supervisors and system managers. Across this spectrum, it is now obvious that the difficulties that occur when tying human and automated systems together must be resolved so that automation can be introduced safely and effectively. The focus of the workshop was to further explore the role of injecting AI into ATC systems and to identify the human factors that need to be considered for successful application of the technology to present and future ATC systems.

  15. National Coalition of Advanced Technology Centers Proposal to the Nation.

    ERIC Educational Resources Information Center

    National Coalition of Advanced Technology Centers, Waco, TX.

    In 1988, nine institutions operating advanced technology centers (ATCs) to provide workers with up-to-date technical skills formed the National Coalition of Advanced Technology Centers (NCATC). The coalition was established to increase awareness of ATCs, serve as a forum for the discussion and demonstration of new and underused technologies,…

  16. Status of the Short-Pulse X-ray Project at the Advanced Photon Source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nassiri, A; Berenc, T G; Borland, M

    2012-07-01

    The Advanced Photon Source Upgrade (APS-U) Project at Argonne will include generation of short-pulse x-rays based on the Zholents deflecting-cavity scheme. We have chosen superconducting (SC) cavities in order to have a continuous train of crabbed bunches and flexibility of operating modes. In collaboration with Jefferson Laboratory, we are prototyping and testing a number of single-cell deflecting cavities and associated auxiliary systems with promising initial results. In collaboration with Lawrence Berkeley National Laboratory, we are working to develop the state-of-the-art timing, synchronization, and differential rf phase stability systems that are required for SPX. A collaboration with the Advanced Computations Department at the Stanford Linear Accelerator Center is looking into simulations of complex, multi-cavity geometries with lower- and higher-order-mode waveguide dampers using ACE3P. This contribution provides the current R&D status of the SPX project.

  17. T/BEST: Technology Benefit Estimator for Composites and Applications to Engine Structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos

    1997-01-01

    Progress in the field of aerospace propulsion has heightened the need to combine advanced technologies. Quantifying the benefits of such technologies will provide guidelines for identifying and prioritizing high-payoff research areas, will help manage research with limited resources, and will show the link between advanced and basic concepts. An effort was undertaken at the NASA Lewis Research Center to develop a formal computational method, T/BEST (Technology Benefit Estimator), to assess advanced aerospace technologies, such as fibrous composites, and credibly communicate the benefits of research. Fibrous composites are ideal for structural applications such as high-performance aircraft engine blades, where high strength-to-weight and stiffness-to-weight ratios are required. These factors - along with the flexibility to select the composite system and layup, and to favorably orient fiber directions - reduce the displacements and stresses caused by large rotational speeds in aircraft engines.

  18. A Framework for Managing Inter-Site Storage Area Networks using Grid Technologies

    NASA Technical Reports Server (NTRS)

    Kobler, Ben; McCall, Fritz; Smorul, Mike

    2006-01-01

    The NASA Goddard Space Flight Center and the University of Maryland Institute for Advanced Computer Studies are studying mechanisms for installing and managing Storage Area Networks (SANs) that span multiple independent collaborating institutions using Storage Area Network Routers (SAN Routers). We present a framework for managing inter-site distributed SANs that uses Grid Technologies to balance the competing needs to control local resources, share information, delegate administrative access, and manage the complex trust relationships between the participating sites.

  19. The Simulation of a Jumbo Jet Transport Aircraft. Volume 2: Modeling Data

    NASA Technical Reports Server (NTRS)

    Hanke, C. R.; Nordwall, D. R.

    1970-01-01

    The manned simulation of a large transport aircraft is described. Aircraft and systems data necessary to implement the mathematical model described in Volume I, along with a discussion of how these data are used in the model, are presented. The results of the real-time computations in the NASA Ames Research Center Flight Simulator for Advanced Aircraft are shown and compared to flight test data and to the results obtained in a training simulator known to be satisfactory.

  20. Engineering of Transition Metal Catalysts Confined in Zeolites

    PubMed Central

    2018-01-01

    Transition metal–zeolite composites are versatile catalytic materials for a wide range of industrial and lab-scale processes. Significant advances in fabrication and characterization of well-defined metal centers confined in zeolite matrixes have greatly expanded the library of available materials and, accordingly, their catalytic utility. In this review, we summarize recent developments in the field from the perspective of materials chemistry, focusing on synthesis, postsynthesis modification, (operando) spectroscopy characterization, and computational modeling of transition metal–zeolite catalysts. PMID:29861546

  1. Flight Planning

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Seagull Technology, Inc., Sunnyvale, CA, produced a computer program under a Langley Research Center Small Business Innovation Research (SBIR) grant called STAFPLAN (Seagull Technology Advanced Flight Plan) that plans optimal trajectory routes for small- to medium-sized airlines to minimize direct operating costs while complying with various airline operating constraints. STAFPLAN incorporates four input databases - weather, route data, aircraft performance, and flight-specific information (times, payload, crew, fuel cost) - to provide the correct amount of fuel, optimal cruise altitude, climb and descent points, optimal cruise speed, and flight path.
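    The trade-off such a planner optimizes, fuel cost against time-dependent operating cost, can be sketched as a toy direct-operating-cost minimization over candidate cruise conditions. All speeds, fuel flows, and prices below are invented for illustration and are not STAFPLAN data or logic:

```python
def direct_operating_cost(distance_nm, speed_kt, fuel_flow_kg_h,
                          fuel_price, time_cost):
    """DOC = fuel burned * fuel price + block time * hourly time cost."""
    hours = distance_nm / speed_kt
    return hours * fuel_flow_kg_h * fuel_price + hours * time_cost

def best_cruise(distance_nm, candidates, fuel_price=0.9, time_cost=1800.0):
    """Pick the (speed, fuel-flow) pair with the lowest DOC by
    exhaustive search over the candidate list."""
    return min(candidates,
               key=lambda c: direct_operating_cost(
                   distance_nm, c[0], c[1], fuel_price, time_cost))

# Hypothetical (cruise speed kt, fuel flow kg/h) pairs, one per
# candidate altitude: faster cruise saves time cost but burns more fuel.
candidates = [(420, 2600), (450, 2800), (470, 3100)]
speed, flow = best_cruise(1000, candidates)
```

    With these made-up figures the middle candidate minimizes total cost, illustrating why the optimum cruise condition depends on the ratio of fuel price to time-dependent costs.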

  2. EHR standards--A comparative study.

    PubMed

    Blobel, Bernd; Pharow, Peter

    2006-01-01

    To ensure the quality and efficiency of patient care, the care paradigm is moving from organization-centered through process-controlled towards personal care. This paradigm change in the health system leads to new paradigms for analyzing, designing, implementing, and deploying supporting health information systems, including EHR systems as the core application in a distributed eHealth environment. The paper defines the architectural paradigm for future-proof EHR systems. It compares advanced EHR architectures, referencing them against the Generic Component Model. The paper introduces the evolving paradigm of autonomous computing for self-organizing health information systems.

  3. The Si elegans project at the interface of experimental and computational Caenorhabditis elegans neurobiology and behavior

    NASA Astrophysics Data System (ADS)

    Petrushin, Alexey; Ferrara, Lorenzo; Blau, Axel

    2016-12-01

    Objective. In light of recent progress in mapping neural function to behavior, we briefly and selectively review past and present endeavors to reveal and reconstruct nervous system function in Caenorhabditis elegans through simulation. Approach. Rather than presenting an all-encompassing review of the mathematical modeling of C. elegans, this contribution collects snapshots of key pathfinding works and emerging technologies that recent single- and multi-center simulation initiatives are building on. We thereby point out a few general limitations and problems that these undertakings are faced with and discuss how these may be addressed and overcome. Main results. Lessons learned from past and current computational approaches to deciphering and reconstructing information flow in the C. elegans nervous system corroborate the need to refine neural response models and link them to intra- and extra-environmental interactions to better reflect and understand the actual biological, biochemical and biophysical events that lead to behavior. Together with single-center research efforts, the Si elegans and OpenWorm projects aim at providing the required, and in some cases complementary, tools for different hardware architectures to support advancement in this direction. Significance. Despite its seeming simplicity, the nervous system of the hermaphroditic nematode C. elegans with just 302 neurons gives rise to a rich behavioral repertoire. Besides controlling vital functions (feeding, defecation, reproduction), it encodes different stimuli-induced as well as autonomous locomotion modalities (crawling, swimming and jumping). For this dichotomy between system simplicity and behavioral complexity, C. elegans has challenged neurobiologists and computational scientists alike. 
Understanding the underlying mechanisms that lead to a context-modulated functionality of individual neurons would not only advance our knowledge on nervous system function and its failure in pathological states, but have directly exploitable benefits for robotics and the engineering of brain-mimetic computational architectures that are orthogonal to current von-Neumann-type machines.

  4. Novel Multidisciplinary Models Assess the Capabilities of Smart Structures to Manage Vibration, Sound, and Thermal Distortion in Aeropropulsion Components

    NASA Technical Reports Server (NTRS)

    Saravanos, Dimitris A.

    1997-01-01

    The development of aeropropulsion components that incorporate "smart" composite laminates with embedded piezoelectric actuators and sensors is expected to ameliorate critical problems in advanced aircraft engines related to vibration, noise emission, and thermal stability. To facilitate the analytical needs of this effort, the NASA Lewis Research Center has developed mechanics and multidisciplinary computational models to analyze the complicated electromechanical behavior of realistic smart-structure configurations operating in combined mechanical, thermal, and acoustic environments. The models have been developed to accommodate the particular geometries, environments, and technical challenges encountered in advanced aircraft engines, yet their unique analytical features are expected to facilitate application of this new technology in a variety of commercial applications.

  5. Active Control Technology at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Antcliff, Richard R.; McGowan, Anna-Marie R.

    2000-01-01

    NASA Langley has a long history of attacking important technical opportunities from a broad base of supporting disciplines. The research and development at Langley in this subject area range from the test tube to the test flight. The information covered here ranges from the development of innovative new materials, sensors, and actuators, to the incorporation of smart sensors and actuators in practical devices, to the optimization of the location of these devices, and, finally, to a wide variety of applications of these devices utilizing Langley's facilities and expertise. Advanced materials are being developed for sensors and actuators, as well as polymers for integrating smart devices into composite structures. Contributions reside in three key areas: computational materials; advanced piezoelectric materials; and integrated composite structures.

  6. Applications of iQID cameras

    NASA Astrophysics Data System (ADS)

    Han, Ling; Miller, Brian W.; Barrett, Harrison H.; Barber, H. Bradford; Furenlid, Lars R.

    2017-09-01

    iQID is an intensified quantum imaging detector developed in the Center for Gamma-Ray Imaging (CGRI). Originally called BazookaSPECT, iQID was designed for high-resolution gamma-ray imaging and preclinical gamma-ray single-photon emission computed tomography (SPECT). With the use of a columnar scintillator, an image intensifier, and modern CCD/CMOS sensors, iQID cameras feature outstanding intrinsic spatial resolution. In recent years, many advances have been achieved that greatly boost the performance of iQID, broadening its applications to cover nuclear and particle imaging for preclinical, clinical, and homeland security settings. This paper presents an overview of the recent advances of iQID technology and its applications in preclinical and clinical scintigraphy, preclinical SPECT, particle imaging (alpha, neutron, beta, and fission fragment), and digital autoradiography.

  7. Special issue on the "Consortium for Advanced Simulation of Light Water Reactors Research and Development Progress"

    NASA Astrophysics Data System (ADS)

    Turinsky, Paul J.; Martin, William R.

    2017-04-01

    In this special issue of the Journal of Computational Physics, the research and development completed at the time of manuscript submission by the Consortium for Advanced Simulation of Light Water Reactors (CASL) is presented. CASL is the first of several Energy Innovation Hubs that have been created by the Department of Energy. The Hubs are modeled after the strong scientific management characteristics of the Manhattan Project and AT&T Bell Laboratories, and function as integrated research centers that combine basic and applied research with engineering to accelerate scientific discovery that addresses critical energy issues. Lifetime of a Hub is expected to be five or ten years depending upon performance, with CASL being granted a ten year lifetime.

  8. Secure distributed genome analysis for GWAS and sequence comparison computation.

    PubMed

    Zhang, Yihua; Blanton, Marina; Almashaqbeh, Ghada

    2015-01-01

    The rapid increase in the availability and volume of genomic data makes significant advances in biomedical research possible, but sharing of genomic data poses challenges due to the highly sensitive nature of such data. To address the challenges, a competition for secure distributed processing of genomic data was organized by the iDASH research center. In this work we propose techniques for securing computation with real-life genomic data for minor allele frequency and chi-squared statistics computation, as well as distance computation between two genomic sequences, as specified by the iDASH competition tasks. We put forward novel optimizations, including a generalization of a version of mergesort, which might be of independent interest. We provide implementation results of our techniques based on secret sharing that demonstrate practicality of the suggested protocols and also report on performance improvements due to our optimization techniques. This work describes our techniques, findings, and experimental results developed and obtained as part of iDASH 2015 research competition to secure real-life genomic computations and shows feasibility of securely computing with genomic data in practice.
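    The allele-frequency task can be illustrated with additive secret sharing, the basic primitive underlying secret-sharing protocols like those the abstract describes. The modulus, party count, and toy cohort below are assumptions for illustration only, not the authors' actual protocol:

```python
import random

P = 2**31 - 1  # prime modulus for additive shares (illustrative choice)
_rng = random.Random(0)

def share(x, n=3):
    """Split integer x into n additive shares that sum to x mod P;
    any n-1 shares alone reveal nothing about x."""
    shares = [_rng.randrange(P) for _ in range(n - 1)]
    shares.append((x - sum(shares)) % P)
    return shares

def reconstruct(shares):
    return sum(shares) % P

# Each individual's genotype is the count of minor alleles (0, 1, or 2).
# Every party receives one share per individual, sums its shares locally,
# and only the aggregate allele count is ever reconstructed.
genotypes = [0, 1, 2, 1, 0, 2, 1, 1]                  # toy cohort
shared = [share(g) for g in genotypes]                # one share set each
party_sums = [sum(col) % P for col in zip(*shared)]   # local sums per party
minor_alleles = reconstruct(party_sums)
maf = minor_alleles / (2 * len(genotypes))            # minor allele frequency
```

    The key point is that addition commutes with sharing: parties can sum their shares independently, so the aggregate statistic is computed without any party seeing an individual genotype.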

  9. Secure distributed genome analysis for GWAS and sequence comparison computation

    PubMed Central

    2015-01-01

    Background The rapid increase in the availability and volume of genomic data makes significant advances in biomedical research possible, but sharing of genomic data poses challenges due to the highly sensitive nature of such data. To address the challenges, a competition for secure distributed processing of genomic data was organized by the iDASH research center. Methods In this work we propose techniques for securing computation with real-life genomic data for minor allele frequency and chi-squared statistics computation, as well as distance computation between two genomic sequences, as specified by the iDASH competition tasks. We put forward novel optimizations, including a generalization of a version of mergesort, which might be of independent interest. Results We provide implementation results of our techniques based on secret sharing that demonstrate practicality of the suggested protocols and also report on performance improvements due to our optimization techniques. Conclusions This work describes our techniques, findings, and experimental results developed and obtained as part of iDASH 2015 research competition to secure real-life genomic computations and shows feasibility of securely computing with genomic data in practice. PMID:26733307

  10. 2017 ISCB Accomplishment by a Senior Scientist Award: Pavel Pevzner

    PubMed Central

    Fogg, Christiana N.; Kovats, Diane E.; Berger, Bonnie

    2017-01-01

    The International Society for Computational Biology (ISCB) recognizes an established scientist each year with the Accomplishment by a Senior Scientist Award for significant contributions he or she has made to the field. This award honors scientists who have contributed to the advancement of computational biology and bioinformatics through their research, service, and education work. Pavel Pevzner, PhD, Ronald R. Taylor Professor of Computer Science and Director of the NIH Center for Computational Mass Spectrometry at the University of California, San Diego, has been selected as the winner of the 2017 Accomplishment by a Senior Scientist Award. The ISCB awards committee, chaired by Dr. Bonnie Berger of the Massachusetts Institute of Technology, selected Pevzner as the 2017 winner. Pevzner will receive his award and deliver a keynote address at the 2017 Intelligent Systems for Molecular Biology-European Conference on Computational Biology joint meeting (ISMB/ECCB 2017) held in Prague, Czech Republic, from July 21-July 25, 2017. ISMB/ECCB is a biennial joint meeting that brings together leading scientists in computational biology and bioinformatics from around the globe. PMID:28713548

  11. A brief overview of computational structures technology related activities at NASA Lewis Research Center

    NASA Technical Reports Server (NTRS)

    Hopkins, Dale A.

    1992-01-01

    The presentation gives a partial overview of research and development underway in the Structures Division of LeRC, which collectively is referred to as the Computational Structures Technology Program. The activities in the program are diverse and encompass four major categories: (1) composite materials and structures; (2) probabilistic analysis and reliability; (3) design optimization and expert systems; and (4) computational methods and simulation. The approach of the program is comprehensive and entails exploration of fundamental theories of structural mechanics to accurately represent the complex physics governing engine structural performance; formulation and implementation of computational techniques and integrated simulation strategies to provide accurate and efficient solutions of the governing theoretical models by exploiting emerging advances in computer technology; and validation and verification through numerical and experimental tests to establish confidence and define the qualities and limitations of the resulting theoretical models and computational solutions. The program comprises both in-house and sponsored research activities. The remainder of the presentation provides a sample of activities to illustrate the breadth and depth of the program and to demonstrate the accomplishments and benefits that have resulted.

  12. The 3rd International Workshop on Computational Electronics

    NASA Astrophysics Data System (ADS)

    Goodnick, Stephen M.

    1994-09-01

    The Third International Workshop on Computational Electronics (IWCE) was held at the Benson Hotel in downtown Portland, Oregon, on May 18, 19, and 20, 1994. The workshop was devoted to a broad range of topics in computational electronics related to the simulation of electronic transport in semiconductors and semiconductor devices, particularly those which use large computational resources. The workshop was supported by the National Science Foundation (NSF), the Office of Naval Research, and the Army Research Office, with local support from the Oregon Joint Graduate Schools of Engineering and the Oregon Center for Advanced Technology Education. There were over 100 participants in the Portland workshop, of whom more than one quarter represented research groups outside the United States, from Austria, Canada, France, Germany, Italy, Japan, Switzerland, and the United Kingdom. A total of 81 papers were presented at the workshop: 9 invited talks, 26 oral presentations, and 46 poster presentations. The emphasis of the contributions reflected the interdisciplinary nature of computational electronics, with researchers from the chemistry, computer science, mathematics, engineering, and physics communities participating in the workshop.

  13. The National Kidney Registry: 175 transplants in one year.

    PubMed

    Veale, Jeffrey; Hil, Garet

    2011-01-01

    Since organizing its first swap in 2008, the National Kidney Registry (NKR) had facilitated 389 kidney transplants by the end of 2011 across 45 U.S. transplant centers. Rapid innovations, advanced computer technologies, and an evolving understanding of the processes at participating transplant centers and histocompatibility laboratories are among the factors driving the success of the NKR. Virtual crossmatch accuracy has improved from 43% to 94% as a result of improvements in the HLA typing process for donor antigens and enhanced mechanisms to list unacceptable HLA antigens for sensitized patients. By the end of 2011, the NKR had transplanted 66% of the patients enrolled since 2008. The 2011 wait time (from enrollment to transplant) for the 175 patients transplanted that year averaged 5 months.
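
    At its core, the virtual crossmatch this abstract describes is a screening check of a donor's HLA antigens against a recipient's list of unacceptable antigens. The sketch below illustrates only that basic set comparison; the antigen names are examples, and this is not the NKR's actual matching algorithm.

```python
# Illustrative virtual crossmatch: a donor is "virtually compatible" with a
# recipient when none of the donor's HLA antigens appear on the recipient's
# list of unacceptable antigens (derived from antibody screening).

def virtual_crossmatch(donor_antigens, unacceptable_antigens):
    """Return (compatible, conflicts), where conflicts is the set of donor
    antigens the recipient has listed as unacceptable."""
    conflicts = set(donor_antigens) & set(unacceptable_antigens)
    return (len(conflicts) == 0, conflicts)

donor = {"A1", "A2", "B8", "B44", "DR4", "DR15"}       # example typing
recipient_unacceptable = {"B44", "DR11"}               # example antibody list

compatible, conflicts = virtual_crossmatch(donor, recipient_unacceptable)
print(compatible, sorted(conflicts))   # False ['B44']
```

    The accuracy improvements the abstract reports correspond to making both input sets more complete and precise, so that this in-silico prediction agrees more often with the physical crossmatch.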

  14. Force Feedback Joystick

    NASA Technical Reports Server (NTRS)

    1997-01-01

    I-FORCE, a computer peripheral from Immersion Corporation, was derived from virtual environment and human factors research at the Advanced Displays and Spatial Perception Laboratory at Ames Research Center in collaboration with the Stanford University Center for Design Research. Entrepreneur Louis Rosenberg, a former Stanford researcher, now president of Immersion, collaborated with Dr. Bernard Adelstein at Ames on studies of perception in virtual reality. The result was an inexpensive way to incorporate motors and a sophisticated microprocessor into joysticks and other game controllers. These devices can emulate the feel of a car in a skid, a crashing plane, the bounce of a ball, compressed springs, and other physical phenomena. The first products incorporating I-FORCE technology include CH Products' line of FlightStick and CombatStick controllers.

  15. Science at the Goddard Space Flight Center

    NASA Technical Reports Server (NTRS)

    White, Nicholas E.

    2012-01-01

    The Sciences and Exploration Directorate of the NASA Goddard Space Flight Center (GSFC) is the largest Earth and space science research organization in the world. Its scientists advance understanding of the Earth and its life-sustaining environment, the Sun, the solar system, and the wider universe beyond. Researchers in the Sciences and Exploration Directorate work with engineers, computer programmers, technologists, and other team members to develop the cutting-edge technology needed for space-based research. Instruments are also deployed on aircraft, balloons, and Earth's surface. I will give an overview of the current research activities and programs at GSFC, including the James Webb Space Telescope (JWST), future Earth observing programs, and experiments that are exploring our solar system and studying the interaction of the Sun with the Earth's magnetosphere.

  16. eHealth research from the user's perspective.

    PubMed

    Hesse, Bradford W; Shneiderman, Ben

    2007-05-01

    The application of information technology (IT) to issues of healthcare delivery has had a long and tortuous history in the United States. Within the field of eHealth, vanguard applications of advanced computing techniques, such as applications in artificial intelligence or expert systems, have languished in spite of a track record of scholarly publication and decisional accuracy. The problem is one of purpose, of asking the right questions for the science to solve. Historically, many computer science pioneers have been tempted to ask "what can the computer do?" New advances in eHealth are prompting developers to ask "what can people do?" How can eHealth take part in national goals for healthcare reform to empower relationships between healthcare professionals and patients, healthcare teams and families, and hospitals and communities to improve health equitably throughout the population? To do this, eHealth researchers must combine best evidence from the user sciences (human factors engineering, human-computer interaction, psychology, and usability) with best evidence in medicine to create transformational improvements in the quality of care that medicine offers. These improvements should follow recommendations from the Institute of Medicine to create a healthcare system that is (1) safe, (2) effective (evidence based), (3) patient centered, and (4) timely. Relying on the eHealth researcher's intuitive grasp of systems issues, improvements should be made with considerations of users and beneficiaries at the individual (patient-physician), group (family-staff), community, and broad environmental levels.

  17. Computational mechanics research and support for aerodynamics and hydraulics at TFHRC year 1 quarter 4 progress report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lottes, S.A.; Kulak, R.F.; Bojanowski, C.

    2011-12-09

    The computational fluid dynamics (CFD) and computational structural mechanics (CSM) focus areas at Argonne's Transportation Research and Analysis Computing Center (TRACC) initiated a project to support and complement the experimental programs at the Turner-Fairbank Highway Research Center (TFHRC) with high-performance-computing-based analysis capabilities in August 2010. The project was established with a new interagency agreement between the Department of Energy and the Department of Transportation to provide collaborative research, development, and benchmarking of advanced three-dimensional computational mechanics analysis methods to the aerodynamics and hydraulics laboratories at TFHRC for a period of five years, beginning in October 2010. The analysis methods employ well-benchmarked and supported commercial computational mechanics software. Computational mechanics encompasses the areas of Computational Fluid Dynamics (CFD), Computational Wind Engineering (CWE), Computational Structural Mechanics (CSM), and Computational Multiphysics Mechanics (CMM) applied in Fluid-Structure Interaction (FSI) problems. The major areas of focus of the project are wind and water effects on bridges - superstructure, deck, cables, and substructure (including soil), primarily during storms and flood events - and the risks that these loads pose to structural failure. For flood events at bridges, another major focus of the work is assessment of the risk to bridges caused by scour of stream and riverbed material away from the foundations of a bridge.
Other areas of current research include modeling of flow through culverts to assess them for fish passage, modeling of salt spray transport into bridge girders to address the suitability of using weathering steel in bridges, CFD analysis of the operation of the wind tunnel in the TFHRC wind engineering laboratory, vehicle stability under high wind loading, and the use of electromagnetic shock absorbers to improve vehicle stability under high wind conditions. This quarterly report documents technical progress on the project tasks for the period of July through September 2011.

  18. Artificial intelligence in medicine and cardiac imaging: harnessing big data and advanced computing to provide personalized medical diagnosis and treatment.

    PubMed

    Dilsizian, Steven E; Siegel, Eliot L

    2014-01-01

    Although advances in information technology in the past decade have come in quantum leaps in nearly every aspect of our lives, they seem to be coming at a slower pace in the field of medicine. However, the implementation of electronic health records (EHR) in hospitals is increasing rapidly, accelerated by the meaningful use initiatives associated with the Centers for Medicare & Medicaid Services EHR Incentive Programs. The transition to electronic medical records and availability of patient data has been associated with increases in the volume and complexity of patient information, as well as an increase in medical alerts, with resulting "alert fatigue" and increased expectations for rapid and accurate diagnosis and treatment. Unfortunately, these increased demands on health care providers create greater risk for diagnostic and therapeutic errors. In the near future, artificial intelligence (AI)/machine learning will likely assist physicians with differential diagnosis of disease, treatment option suggestions and recommendations, and, in the case of medical imaging, with cues in image interpretation. Mining and advanced analysis of "big data" in health care provide the potential not only to perform "in silico" research but also to provide "real time" diagnostic and (potentially) therapeutic recommendations based on empirical data. "On demand" access to high-performance computing and large health care databases will support and sustain our ability to achieve personalized medicine. The IBM Jeopardy! Challenge, which pitted the best all-time human players against the Watson computer, captured the imagination of millions of people across the world and demonstrated the potential to apply AI approaches to a wide variety of subject matter, including medicine. The combination of AI, big data, and massively parallel computing offers the potential to create a revolutionary way of practicing evidence-based, personalized medicine.

  19. U.S. Department of Energy's Bioenergy Research Centers: An Overview of the Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2009-07-01

    Alternative fuels from renewable cellulosic biomass--plant stalks, trunks, stems, and leaves--are expected to significantly reduce U.S. dependence on imported oil while enhancing national energy security and decreasing the environmental impacts of energy use. Ethanol and other advanced biofuels from cellulosic biomass are renewable alternatives that could increase domestic production of transportation fuels, revitalize rural economies, and reduce carbon dioxide and pollutant emissions. According to U.S. Secretary of Energy Steven Chu, 'Developing the next generation of biofuels is key to our effort to end our dependence on foreign oil and address the climate crisis while creating millions of new jobs that can't be outsourced'. In the United States, the Energy Independence and Security Act (EISA) of 2007 is an important driver for the sustainable development of renewable biofuels. As part of EISA, the Renewable Fuel Standard mandates that 36 billion gallons of biofuels are to be produced annually by 2022, of which 16 billion gallons are expected to come from cellulosic feedstocks. Although cellulosic ethanol production has been demonstrated on a pilot level, developing a cost-effective, commercial-scale cellulosic biofuel industry will require transformational science to significantly streamline current production processes. Woodchips, grasses, cornstalks, and other cellulosic biomass are widely abundant but more difficult to break down into sugars than corn grain--the primary source of U.S. ethanol fuel production today. Biological research is key to accelerating the deconstruction of cellulosic biomass into sugars that can be converted to biofuels. The Department of Energy (DOE) Office of Science has played a major role in inspiring, supporting, and guiding the biotechnology revolution over the past 25 years. The DOE Genomic Science Program is advancing a new generation of research focused on achieving whole-systems understanding for biology. 
This program is bringing together scientists in diverse fields to understand the complex biology underlying solutions to DOE missions in energy production, environmental remediation, and climate change science. New interdisciplinary research communities are emerging, as are knowledgebases and scientific and computational resources critical to advancing large-scale, genome-based biology. To focus the most advanced biotechnology-based resources on the biological challenges of biofuel production, DOE established three Bioenergy Research Centers (BRCs) in September 2007. Each center is pursuing the basic research underlying a range of high-risk, high-return biological solutions for bioenergy applications. Advances resulting from the BRCs will provide the knowledge needed to develop new biobased products, methods, and tools that the emerging biofuel industry can use. The scientific rationale for these centers and for other fundamental genomic research critical to the biofuel industry was established at a DOE workshop involving members of the research community (see sidebar, Biofuel Research Plan, below). The DOE BRCs have developed automated, high-throughput analysis pipelines that will accelerate scientific discovery for biology-based biofuel research. The three centers, which were selected through a scientific peer-review process, are based in geographically diverse locations--the Southeast, the Midwest, and the West Coast--with partners across the nation. DOE's Oak Ridge National Laboratory leads the BioEnergy Science Center (BESC) in Tennessee; the University of Wisconsin-Madison leads the Great Lakes Bioenergy Research Center (GLBRC); and DOE's Lawrence Berkeley National Laboratory leads the DOE Joint BioEnergy Institute (JBEI) in California. 
Each center represents a multidisciplinary partnership with expertise spanning the physical and biological sciences, including genomics, microbial and plant biology, analytical chemistry, computational biology and bioinformatics, and engineering. Institutional partners include DOE national laboratories, universities, private companies, and nonprofit organizations.

  20. The Ames Power Monitoring System

    NASA Technical Reports Server (NTRS)

    Osetinsky, Leonid; Wang, David

    2003-01-01

    The Ames Power Monitoring System (APMS) is a centralized system of power meters, computer hardware, and special-purpose software that collects and stores electrical power data from various facilities at Ames Research Center (ARC). This system is needed because of the large and varying nature of the overall ARC power demand, which has been observed to range from 20 to 200 MW. Large portions of peak demand can be attributed to just three wind tunnels, rated at 60, 180, and 100 MW. The APMS helps ARC avoid or minimize costly demand charges by enabling wind-tunnel operators, test engineers, and the power manager to monitor total demand for the center in real time. These persons receive the information they need to manage and schedule energy-intensive research in advance and to adjust loads in real time to ensure that the overall maximum allowable demand is not exceeded. The APMS (see figure) includes a server computer running the Windows NT operating system and can, in principle, include an unlimited number of power meters and client computers. As configured at the time of reporting the information for this article, the APMS includes more than 40 power meters monitoring all the major research facilities, plus 15 Windows-based client personal computers that display real-time and historical data to users via graphical user interfaces (GUIs). The power meters and client computers communicate with the server using Transmission Control Protocol/Internet Protocol (TCP/IP) on Ethernet networks, variously, through dedicated fiber-optic cables or through the pre-existing ARC local-area network (ARCLAN). The APMS has enabled ARC to achieve significant savings ($1.2 million in 2001) in the cost of power and electric energy by helping personnel to maintain total demand below monthly allowable levels, to manage the overall power factor to avoid low-power-factor penalties, and to use historical system data to identify opportunities for additional energy savings. 
The APMS also provides power engineers and electricians with the information they need to plan modifications in advance and perform day-to-day maintenance of the ARC electric-power distribution system.
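
    The central function this abstract describes - summing meter readings and comparing the total against an allowable peak so that loads can be shed in time - can be sketched as follows. The meter names, 200 MW limit, and 90% warning threshold are illustrative assumptions, not actual APMS values or APIs.

```python
# Hypothetical demand check of the kind a power-monitoring client might run:
# sum the latest readings from all meters and classify the total against a
# monthly allowable peak.

ALLOWABLE_DEMAND_MW = 200.0
WARNING_FRACTION = 0.9          # warn when within 10% of the allowable peak

def check_demand(meter_readings_mw):
    """meter_readings_mw: dict of meter name -> latest demand reading (MW)."""
    total = sum(meter_readings_mw.values())
    if total >= ALLOWABLE_DEMAND_MW:
        status = "EXCEEDED"
    elif total >= WARNING_FRACTION * ALLOWABLE_DEMAND_MW:
        status = "WARNING"
    else:
        status = "OK"
    return total, status

readings = {"wind_tunnel_A": 95.0, "wind_tunnel_B": 60.0, "site_base_load": 30.0}
total, status = check_demand(readings)
print(f"{total:.0f} MW -> {status}")   # 185 MW -> WARNING
```

    In a real deployment this check would run continuously against streaming meter data, and a WARNING status would prompt operators to reschedule or shed load before the demand charge threshold is crossed.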

  1. Computational studies of horizontal axis wind turbines in high wind speed condition using advanced turbulence models

    NASA Astrophysics Data System (ADS)

    Benjanirat, Sarun

    Next generation horizontal-axis wind turbines (HAWTs) will operate at very high wind speeds. Existing engineering approaches for modeling the flow phenomena are based on blade element theory, and cannot adequately account for 3-D separated, unsteady flow effects. Therefore, researchers around the world are beginning to model these flows using first-principles-based computational fluid dynamics (CFD) approaches. In this study, an existing first-principles-based Navier-Stokes approach is being enhanced to model HAWTs at high wind speeds. The enhancements include improved grid topology, implicit time-marching algorithms, and advanced turbulence models. The advanced turbulence models include the Spalart-Allmaras one-equation model and the k-epsilon, k-omega, and Shear Stress Transport (k-omega SST) models. These models are also integrated with detached eddy simulation (DES) models. Results are presented for a range of wind speeds for the National Renewable Energy Laboratory Phase VI rotor configuration, tested at NASA Ames Research Center. Grid sensitivity studies are also presented. Additionally, effects of existing transition models on the predictions are assessed. Data presented include power/torque production, radial distribution of normal and tangential pressure forces, root bending moments, and surface pressure fields. Good agreement was obtained between the predictions and experiments for most of the conditions, particularly with the Spalart-Allmaras-DES model.

  2. From information technology to informatics: the information revolution in dental education.

    PubMed

    Schleyer, Titus K; Thyvalikakath, Thankam P; Spallek, Heiko; Dziabiak, Michael P; Johnson, Lynn A

    2012-01-01

    The capabilities of information technology (IT) have advanced precipitously in the last fifty years. Many of these advances have enabled new and beneficial applications of IT in dental education. However, conceptually, IT use in dental schools is only in its infancy. Challenges and opportunities abound for improving how we support clinical care, education, and research with IT. In clinical care, we need to move electronic dental records beyond replicating paper, connect information on oral health to that on systemic health, facilitate collaborative care through teledentistry, and help clinicians apply evidence-based dentistry and preventive management strategies. With respect to education, we should adopt an evidence-based approach to IT use for teaching and learning, share effective educational content and methods, leverage technology-mediated changes in the balance of power between faculty and students, improve technology support for clinical teaching, and build an information infrastructure centered on learners and organizations. In research, opportunities include reusing clinical care data for research studies, helping advance computational methods for research, applying generalizable research tools in dentistry, and reusing research data and scientific workflows. In the process, we transition from a focus on IT-the mere technical aspects of applying computer technology-to one on informatics: the what, how, and why of managing information.

  3. From Information Technology to Informatics: The Information Revolution in Dental Education

    PubMed Central

    Schleyer, Titus K.; Thyvalikakath, Thankam P.; Spallek, Heiko; Dziabiak, Michael P.; Johnson, Lynn A.

    2014-01-01

    The capabilities of information technology (IT) have advanced precipitously in the last fifty years. Many of these advances have enabled new and beneficial applications of IT in dental education. However, conceptually, IT use in dental schools is only in its infancy. Challenges and opportunities abound for improving how we support clinical care, education, and research with IT. In clinical care, we need to move electronic dental records beyond replicating paper, connect information on oral health to that on systemic health, facilitate collaborative care through teledentistry, and help clinicians apply evidence-based dentistry and preventive management strategies. With respect to education, we should adopt an evidence-based approach to IT use for teaching and learning, share effective educational content and methods, leverage technology-mediated changes in the balance of power between faculty and students, improve technology support for clinical teaching, and build an information infrastructure centered on learners and organizations. In research, opportunities include reusing clinical care data for research studies, helping advance computational methods for research, applying generalizable research tools in dentistry, and reusing research data and scientific workflows. In the process, we transition from a focus on IT—the mere technical aspects of applying computer technology—to one on informatics: the what, how, and why of managing information. PMID:22262557

  4. Nonlinear three-dimensional verification of the SPECYL and PIXIE3D magnetohydrodynamics codes for fusion plasmas

    NASA Astrophysics Data System (ADS)

    Bonfiglio, D.; Chacón, L.; Cappello, S.

    2010-08-01

    With the increasing impact of scientific discovery via advanced computation, there is presently a strong emphasis on ensuring the mathematical correctness of computational simulation tools. Such an endeavor, termed verification, is now at the center of most serious code development efforts. In this study, we present a cross-benchmark nonlinear verification study between two three-dimensional magnetohydrodynamics (3D MHD) codes for fluid modeling of fusion plasmas, SPECYL [S. Cappello and D. Biskamp, Nucl. Fusion 36, 571 (1996)] and PIXIE3D [L. Chacón, Phys. Plasmas 15, 056103 (2008)], in their common limit of application: the simple viscoresistive cylindrical approximation. SPECYL is a serial code in cylindrical geometry that features a spectral formulation in space and a semi-implicit temporal advance, and has been used extensively to date for reversed-field pinch studies. PIXIE3D is a massively parallel code in arbitrary curvilinear geometry that features a conservative, solenoidal finite-volume discretization in space, and a fully implicit temporal advance. The present study is, in our view, a first mandatory step in assessing the potential of any numerical 3D MHD code for fluid modeling of fusion plasmas. Excellent agreement is demonstrated over a wide range of parameters for several fusion-relevant cases in both two- and three-dimensional geometries.
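
    Cross-benchmark verification of the kind described here ultimately reduces to quantitative comparison of the same diagnostic produced by two independent codes. A generic sketch of such a comparison is below; the synthetic "code A/B" traces are stand-ins for illustration, not SPECYL or PIXIE3D output.

```python
import math

# Generic cross-code comparison metric: given the same diagnostic (e.g., a
# modal energy time trace) from two codes, report the relative L2 difference.

def relative_l2_difference(u, v):
    num = math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))
    den = math.sqrt(sum(a ** 2 for a in u))
    return num / den

t = [0.01 * k for k in range(100)]
code_a = [math.exp(-x) * math.cos(5 * x) for x in t]            # synthetic trace
code_b = [y * (1 + 1e-6) for y in code_a]                       # tiny discrepancy

err = relative_l2_difference(code_a, code_b)
print(f"relative L2 difference: {err:.2e}")   # ~1e-06
```

    In practice one would repeat this over a range of dissipation parameters and mode spectra, and also verify that the difference shrinks at the expected rate as the spatial and temporal resolutions of both codes are refined.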

  5. Nonlinear three-dimensional verification of the SPECYL and PIXIE3D magnetohydrodynamics codes for fusion plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bonfiglio, Daniele; Chacon, Luis; Cappello, Susanna

    2010-01-01

    With the increasing impact of scientific discovery via advanced computation, there is presently a strong emphasis on ensuring the mathematical correctness of computational simulation tools. Such an endeavor, termed verification, is now at the center of most serious code development efforts. In this study, we present a cross-benchmark nonlinear verification study between two three-dimensional magnetohydrodynamics (3D MHD) codes for fluid modeling of fusion plasmas, SPECYL [S. Cappello and D. Biskamp, Nucl. Fusion 36, 571 (1996)] and PIXIE3D [L. Chacon, Phys. Plasmas 15, 056103 (2008)], in their common limit of application: the simple viscoresistive cylindrical approximation. SPECYL is a serial code in cylindrical geometry that features a spectral formulation in space and a semi-implicit temporal advance, and has been used extensively to date for reversed-field pinch studies. PIXIE3D is a massively parallel code in arbitrary curvilinear geometry that features a conservative, solenoidal finite-volume discretization in space, and a fully implicit temporal advance. The present study is, in our view, a first mandatory step in assessing the potential of any numerical 3D MHD code for fluid modeling of fusion plasmas. Excellent agreement is demonstrated over a wide range of parameters for several fusion-relevant cases in both two- and three-dimensional geometries.

  6. In Pursuit of Improving Airburst and Ground Damage Predictions: Recent Advances in Multi-Body Aerodynamic Testing and Computational Tools Validation

    NASA Technical Reports Server (NTRS)

    Venkatapathy, Ethiraj; Gulhan, Ali; Aftosmis, Michael; Brock, Joseph; Mathias, Donovan; Need, Dominic; Rodriguez, David; Seltner, Patrick; Stern, Eric; Wiles, Sebastian

    2017-01-01

    An airburst from a large asteroid during entry can cause significant ground damage. The damage depends on the energy and the altitude of the airburst. Breakup of asteroids into fragments and their lateral spread have been observed. Modeling the underlying physics of fragmented bodies interacting at hypersonic speeds, and the spread of fragments, is needed for a true predictive capability. Current models predict airburst damage using heuristic arguments and assumptions, such as pancaking, point-source explosive energy release at a predetermined altitude, or an assumed fragmentation spread rate. A multi-year collaboration between the German Aerospace Center (DLR) and NASA has been established to develop validated computational tools to address the above challenge.
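
    The "pancaking" heuristic the abstract criticizes can be sketched as a textbook-style model: the body decelerates in an exponential atmosphere, breaks up once ram pressure exceeds its strength, then spreads laterally, which sharply increases drag and deposits kinetic energy over a narrow altitude band. All parameter values below (size, strength, spreading law) are illustrative assumptions of this class of model, not the DLR/NASA tools; gravity, ablation, and Earth curvature are neglected.

```python
import math

# Simplified "pancake" airburst sketch: integrate entry of a single body and
# report the altitude of peak kinetic-energy deposition rate (the heuristic
# "burst altitude").

RHO0, H_SCALE = 1.225, 7160.0          # sea-level density (kg/m^3), scale height (m)

def airburst_altitude(radius=10.0, density=3300.0, v=17e3,
                      strength=1e6, angle_deg=45.0, dt=1e-3):
    theta = math.radians(angle_deg)
    r = radius
    m = density * (4 / 3) * math.pi * r ** 3     # mass held fixed (no ablation)
    h, broken = 80e3, False
    peak_dedt, peak_h = 0.0, h
    while h > 0 and v > 500.0:
        rho_a = RHO0 * math.exp(-h / H_SCALE)
        if not broken and rho_a * v ** 2 > strength:
            broken = True                        # ram pressure exceeds strength
        if broken:                               # assumed lateral spreading law
            r += v * math.sqrt(rho_a / density) * dt
        area = math.pi * r ** 2
        drag = 0.5 * rho_a * v ** 2 * area       # drag coefficient ~ 1
        dedt = drag * v                          # kinetic-energy loss rate
        if dedt > peak_dedt:
            peak_dedt, peak_h = dedt, h
        v -= (drag / m) * dt
        h -= v * math.sin(theta) * dt
    return peak_h

print(f"peak energy deposition near {airburst_altitude() / 1e3:.1f} km altitude")
```

    The point of the abstract is that such single-body models bury the fragment-interaction physics in the assumed spreading law; the aerodynamic testing and CFD validation effort aims to replace that assumption with first-principles simulation of interacting fragments.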

  7. Using concepts from biology to improve problem-solving methods

    NASA Astrophysics Data System (ADS)

    Goodman, Erik D.; Rothwell, Edward J.; Averill, Ronald C.

    2011-06-01

    Observing nature has been a cornerstone of engineering design. Today, engineers look not only at finished products, but imitate the evolutionary process by which highly optimized artifacts have appeared in nature. Evolutionary computation began by capturing only the simplest ideas of evolution, but today, researchers study natural evolution and incorporate an increasing number of concepts in order to evolve solutions to complex engineering problems. At the new BEACON Center for the Study of Evolution in Action, studies in the lab and field and in silico are laying the groundwork for new tools for evolutionary engineering design. This paper, which accompanies a keynote address, describes various steps in development and application of evolutionary computation, particularly as regards sensor design, and sets the stage for future advances.

  8. Computational Nanoelectronics and Nanotechnology at NASA ARC

    NASA Technical Reports Server (NTRS)

    Saini, Subhash; Kutler, Paul (Technical Monitor)

    1998-01-01

    Both physical and economic considerations indicate that the scaling era of CMOS will run out of steam around the year 2010. However, physical laws also indicate that it is possible to compute at a rate a billion times present speeds with the expenditure of only one Watt of electrical power. NASA has long-term needs where ultra-small semiconductor devices are needed for critical applications: high-performance, low-power, compact computers for intelligent autonomous vehicles and Petaflop computing technology are some key examples. To advance the design, development, and production of future generation micro- and nano-devices, the IT Modeling and Simulation Group has been started at NASA Ames with a goal to develop an integrated simulation environment that addresses problems related to nanoelectronics and molecular nanotechnology. An overview of nanoelectronics and nanotechnology research activities being carried out at Ames Research Center will be presented. We will also present the vision and the research objectives of the IT Modeling and Simulation Group, including the applications of nanoelectronic-based devices relevant to NASA missions.

  9. Computational Nanoelectronics and Nanotechnology at NASA ARC

    NASA Technical Reports Server (NTRS)

    Saini, Subhash

    1998-01-01

    Both physical and economic considerations indicate that the scaling era of CMOS will run out of steam around the year 2010. However, physical laws also indicate that it is possible to compute at a rate a billion times present speeds with the expenditure of only one Watt of electrical power. NASA has long-term needs where ultra-small semiconductor devices are needed for critical applications: high-performance, low-power, compact computers for intelligent autonomous vehicles and Petaflop computing technology are some key examples. To advance the design, development, and production of future generation micro- and nano-devices, the IT Modeling and Simulation Group has been started at NASA Ames with a goal to develop an integrated simulation environment that addresses problems related to nanoelectronics and molecular nanotechnology. An overview of nanoelectronics and nanotechnology research activities being carried out at Ames Research Center will be presented. We will also present the vision and the research objectives of the IT Modeling and Simulation Group, including the applications of nanoelectronic-based devices relevant to NASA missions.

  10. Development of the HERMIES III mobile robot research testbed at Oak Ridge National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Manges, W.W.; Hamel, W.R.; Weisbin, C.R.

    1988-01-01

    The latest robot in the Hostile Environment Robotic Machine Intelligence Experiment Series (HERMIES) is now under development at the Center for Engineering Systems Advanced Research (CESAR) in the Oak Ridge National Laboratory. The HERMIES III robot incorporates a larger-than-human-size 7-degree-of-freedom manipulator mounted on a 2-degree-of-freedom mobile platform, along with a variety of sensors and computers. The deployment of this robot represents a significant increase in research capabilities for the CESAR laboratory. The initial on-board computer capacity of the robot exceeds that of 20 VAX 11/780s. The navigation and vision algorithms under development make extensive use of the on-board NCUBE hypercube computer, while the sensors are interfaced through five VME computers running the OS-9 real-time, multitasking operating system. This paper describes the motivation, key issues, and detailed design trade-offs of implementing the first phase (basic functionality) of the HERMIES III robot. 10 refs., 7 figs.

  11. Computational Fluid Dynamics Program at NASA Ames Research Center

    NASA Technical Reports Server (NTRS)

    Holst, Terry L.

    1989-01-01

    The Computational Fluid Dynamics (CFD) Program at NASA Ames Research Center is reviewed and discussed. The technical elements of the CFD Program are listed and briefly discussed. These elements include algorithm research, research and pilot code development, scientific visualization, advanced surface representation, volume grid generation, and numerical optimization. Next, the discipline of CFD is briefly discussed and related to other areas of research at NASA Ames, including experimental fluid dynamics, computer science research, computational chemistry, and numerical aerodynamic simulation. These areas combine with CFD to form a larger area of research, which might collectively be called computational technology. The ultimate goal of computational technology research at NASA Ames is to increase the physical understanding of the world in which we live, solve problems of national importance, and increase the technical capabilities of the aerospace community. Next, the major programs at NASA Ames that either use CFD technology or perform research in CFD are listed and discussed. Briefly, this list includes turbulent/transition physics and modeling, high-speed real gas flows, interdisciplinary research, turbomachinery demonstration computations, complete aircraft aerodynamics, rotorcraft applications, powered lift flows, high alpha flows, multiple body aerodynamics, and incompressible flow applications. Some of the individual problems actively being worked on in each of these areas are listed to help define the breadth and extent of CFD involvement in each of these major programs. State-of-the-art examples of various CFD applications are presented to highlight most of these areas. The main emphasis of this portion of the presentation is on examples which will not otherwise be treated at this conference by the individual presentations. Finally, a list of principal current limitations and expected future directions is given.

  12. Surface Modeling and Grid Generation of Orbital Sciences X34 Vehicle. Phase 1

    NASA Technical Reports Server (NTRS)

    Alter, Stephen J.

    1997-01-01

    The surface modeling and grid generation requirements, motivations, and methods used to develop Computational Fluid Dynamic volume grids for the X34-Phase 1 are presented. The requirements set forth by the Aerothermodynamics Branch at the NASA Langley Research Center serve as the basis for the final techniques used in the construction of all volume grids, including grids for parametric studies of the X34. The Integrated Computer Engineering and Manufacturing code for Computational Fluid Dynamics (ICEM/CFD), the Grid Generation code (GRIDGEN), the Three-Dimensional Multi-block Advanced Grid Generation System (3DMAGGS) code, and Volume Grid Manipulator (VGM) code are used to enable the necessary surface modeling, surface grid generation, volume grid generation, and grid alterations, respectively. All volume grids generated for the X34, as outlined in this paper, were used for CFD simulations within the Aerothermodynamics Branch.

  13. Advanced Software Techniques for Data Management Systems. Volume 2: Space Shuttle Flight Executive System: Functional Design

    NASA Technical Reports Server (NTRS)

    Pepe, J. T.

    1972-01-01

    A functional design of a software executive system for the space shuttle avionics computer is presented. Three primary functions of the executive are emphasized in the design: task management, I/O management, and configuration management. The executive system organization is based on the applications software and configuration requirements established during the Phase B definition of the Space Shuttle program. Although the primary features of the executive system architecture were derived from Phase B requirements, it was specified for implementation with the IBM 4 Pi EP aerospace computer and is expected to be incorporated into a breadboard data management computer system at NASA Manned Spacecraft Center's Information Systems Division. The executive system was structured for internal operation on the IBM 4 Pi EP system, with its external configuration and applications software assumed to be characteristic of the centralized quad-redundant avionics systems defined in Phase B.

  14. Benefits of cloud computing for PACS and archiving.

    PubMed

    Koch, Patrick

    2012-01-01

    The goal of cloud-based services is to provide easy, scalable access to computing resources and IT services. The healthcare industry requires a private cloud that adheres to government mandates designed to ensure privacy and security of patient data while enabling access by authorized users. Cloud-based computing in the imaging market has evolved from a service that provided cost-effective disaster recovery for archived data to fully featured PACS and vendor-neutral archiving services that can address the needs of healthcare providers of all sizes. Healthcare providers worldwide are now using the cloud to distribute images to remote radiologists while supporting advanced reading tools, deliver radiology reports and imaging studies to referring physicians, and provide redundant data storage. Vendor-managed cloud services eliminate large capital investments in equipment and maintenance, as well as staffing for the data center, creating a reduction in total cost of ownership for the healthcare provider.

  15. NHERI: Advancing the Research Infrastructure of the Multi-Hazard Community

    NASA Astrophysics Data System (ADS)

    Blain, C. A.; Ramirez, J. A.; Bobet, A.; Browning, J.; Edge, B.; Holmes, W.; Johnson, D.; Robertson, I.; Smith, T.; Zuo, D.

    2017-12-01

    The Natural Hazards Engineering Research Infrastructure (NHERI), supported by the National Science Foundation (NSF), is a distributed, multi-user national facility that provides the natural hazards research community with access to an advanced research infrastructure. NHERI comprises a Network Coordination Office (NCO), a cloud-based cyberinfrastructure (DesignSafe-CI), a computational modeling and simulation center (SimCenter), and eight Experimental Facilities (EFs), including a post-disaster, rapid response research facility (RAPID). Ultimately, NHERI enables researchers to explore and test ground-breaking concepts to protect homes, businesses and infrastructure lifelines from earthquakes, windstorms, tsunamis, and surge, enabling innovations that help prevent natural hazards from becoming societal disasters. When coupled with education and community outreach, NHERI will facilitate research and educational advances that contribute knowledge and innovation toward improving the resiliency of the nation's civil infrastructure to withstand natural hazards. The unique capabilities and coordinating activities over Year 1 between NHERI's DesignSafe-CI, the SimCenter, and individual EFs will be presented. Basic descriptions of each component are also found at https://www.designsafe-ci.org/facilities/. Also discussed are the various roles of the NCO in leading development of a 5-year multi-hazard science plan, coordinating facility scheduling and fostering the sharing of technical knowledge and best practices, leading education and outreach programs such as the recent Summer Institute and multi-facility REU program, ensuring a platform for technology transfer to practicing engineers, and developing strategic national and international partnerships to support a diverse multi-hazard research and user community.

  16. Graduate Automotive Technology Education (GATE) Program: Center of Automotive Technology Excellence in Advanced Hybrid Vehicle Technology at West Virginia University

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nigle N. Clark

    2006-12-31

    This report summarizes the technical and educational achievements of the Graduate Automotive Technology Education (GATE) Center at West Virginia University (WVU), which was created to emphasize Advanced Hybrid Vehicle Technology. The Center has supported the graduate studies of 17 students in the Department of Mechanical and Aerospace Engineering and the Lane Department of Computer Science and Electrical Engineering. These students have addressed topics such as hybrid modeling, construction of a hybrid sport utility vehicle (in conjunction with the FutureTruck program), a MEMS-based sensor, on-board data acquisition for hybrid design optimization, linear engine design and engine emissions. Courses have been developed in Hybrid Vehicle Design, Mobile Source Powerplants, Advanced Vehicle Propulsion, Power Electronics for Automotive Applications and Sensors for Automotive Applications, and have been responsible for 396 hours of graduate student coursework. The GATE program also enhanced the WVU participation in the U.S. Department of Energy Student Design Competitions, in particular FutureTruck and Challenge X. The GATE support for hybrid vehicle technology enhanced understanding of hybrid vehicle design and testing at WVU and encouraged the development of a research agenda in heavy-duty hybrid vehicles. As a result, WVU has now completed three programs in hybrid transit bus emissions characterization, and WVU faculty are leading the Transportation Research Board effort to define life cycle costs for hybrid transit buses. Research and enrollment records show that approximately 100 graduate students have benefited substantially from the hybrid vehicle GATE program at WVU.

  17. Center for Advanced Space Propulsion (CASP)

    NASA Technical Reports Server (NTRS)

    1988-01-01

    With a mission to initiate and conduct advanced propulsion research in partnership with industry, and a goal to strengthen U.S. national capability in propulsion technology, the Center for Advanced Space Propulsion (CASP) is the only NASA Center for Commercial Development of Space (CCDS) which focuses on propulsion and associated technologies. Meetings with industrial partners and NASA Headquarters personnel provided an assessment of the constraints placed on, and the opportunities afforded to, commercialization projects. Proprietary information, data rights, and patent rights were some of the areas where well defined information is crucial to project success and follow-on efforts. There were five initial CASP projects. At the end of the first year there are six active, two of which are approaching the ground test phase in their development. Progress in the current six projects has met all milestones and is detailed. Working closely with the industrial counterparts it was found that the endeavors in expert systems development, computational fluid dynamics, fluid management in microgravity, and electric propulsion were well received. One project with the Saturn Corporation, which dealt with expert systems applications in the assembly process, was placed on hold pending further direction from Saturn. The Contamination Measurement and Analysis project was not implemented since CASP was unable to identify an industrial participant. Additional propulsion and related projects were investigated during the year. A subcontract was let to a small business, MicroCraft, Inc., to study rocket engine certification standards. The study produced valuable results; however, based on a number of factors it was decided not to pursue this project further.

  18. Integrating three-dimensional digital technologies for comprehensive implant dentistry.

    PubMed

    Patel, Neal

    2010-06-01

    The increase in the popularity of and the demand for the use of dental implants to replace teeth has encouraged advancement in clinical technology and materials to improve patients' acceptance and clinical outcomes. Recent advances such as three-dimensional dental radiography with cone-beam computed tomography (CBCT), precision dental implant planning software and clinical execution with guided surgery all play a role in the success of implant dentistry. The author illustrates the technique of comprehensive implant dentistry planning through integration of computer-aided design/computer-aided manufacturing (CAD/CAM) and CBCT data. The technique includes clinical treatment with guided surgery, including the creation of a final restoration with a high-strength ceramic (IPS e.max CAD, Ivoclar Vivadent, Amherst, N.Y.). The author also introduces a technique involving CAD/CAM for fabricating custom implant abutments. The release of software integrating CEREC Acquisition Center with Bluecam (Sirona Dental Systems, Charlotte, N.C.) chairside CAD/CAM and Galileos CBCT imaging (Sirona Dental Systems) allows dentists to plan implant placement, perform implant dentistry with increased precision and provide predictable restorative results by using chairside IPS e.max CAD. The precision of clinical treatment provided by the integration of CAD/CAM and CBCT allows dentists to plan for ideal surgical placement and the appropriate thickness of restorative modalities before placing implants.

  19. MINIVER upgrade for the AVID system. Volume 3: EXITS user's and input guide

    NASA Technical Reports Server (NTRS)

    Pond, J. E.; Schmitz, C. P.

    1983-01-01

    The successful design of thermal protection systems for vehicles operating in atmosphere and near-space environments requires accurate analyses of heating rate and temperature histories encountered along a trajectory. For preliminary design calculations, however, the requirement for accuracy must be tempered by the need for speed and versatility in computational tools used to determine thermal environments and structural thermal response. The MINIVER program was found to provide the proper balance between versatility, speed and accuracy for an aerothermal prediction tool. The advancement in computer aided design concepts at Langley Research Center (LaRC) in the past few years has made it desirable to incorporate the MINIVER program into the LaRC Advanced Vehicle Integrated Design, AVID, system. In order to effectively incorporate MINIVER into the AVID system, several changes to MINIVER were made. The thermal conduction options in MINIVER were removed and a new Explicit Interactive Thermal Structures (EXITS) code was developed. Many upgrades to the MINIVER code were made and a new Langley version of MINIVER called LANMIN was created.

  20. GYROKINETIC PARTICLE SIMULATION OF TURBULENT TRANSPORT IN BURNING PLASMAS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Horton, Claude Wendell

    2014-06-10

    The SciDAC project at the IFS advanced the state of high performance computing for turbulent structures and turbulent transport. The team project with Prof. Zhihong Lin [PI] at the University of California, Irvine produced new understanding of turbulent electron transport. The simulations were performed at the Texas Advanced Computing Center (TACC) and the NERSC facility by Wendell Horton, Lee Leonard and the IFS graduate students working in that group. The research included a validation of the electron turbulent transport code using data from a steady-state experiment at Columbia University, in which detailed probe measurements of the turbulence in steady state were used over a wide range of temperature gradients for comparison with the simulation data. These results were published in a joint paper with Texas graduate student Dr. Xiangrong Fu using work from his PhD dissertation: X.R. Fu, W. Horton, Y. Xiao, Z. Lin, A.K. Sen and V. Sokolov, “Validation of electron temperature gradient turbulence in the Columbia Linear Machine,” Phys. Plasmas 19, 032303 (2012).

  1. MINIVER upgrade for the AVID system. Volume 1: LANMIN user's manual

    NASA Technical Reports Server (NTRS)

    Engel, C. D.; Praharaj, S. C.

    1983-01-01

    The successful design of thermal protection systems for vehicles operating in atmosphere and near space environments requires accurate analyses of heating rate and temperature histories encountered along a trajectory. For preliminary design calculations, however, the requirement for accuracy must be tempered by the need for speed and versatility in computational tools used to determine thermal environments and structural thermal response. The MINIVER program has been found to provide the proper balance between versatility, speed and accuracy for an aerothermal prediction tool. The advancement in computer aided design concepts at Langley Research Center (LaRC) in the past few years has made it desirable to incorporate the MINIVER program into the LaRC Advanced Vehicle Integrated Design, AVID, system. In order to effectively incorporate MINIVER into the AVID system, several changes to MINIVER were made. The thermal conduction options in MINIVER were removed and a new Explicit Interactive Thermal Structures (EXITS) code was developed. Many upgrades to the MINIVER code were made and a new Langley version of MINIVER called LANMIN was created. The theoretical methods and subroutine functions used in LANMIN are described.

  2. Correlating CFD Simulation with Wind Tunnel Test for the Full-Scale UH-60A Airloads Rotor

    NASA Technical Reports Server (NTRS)

    Romandr, Ethan; Norman, Thomas R.; Chang, I-Chung

    2011-01-01

    Data from the recent UH-60A Airloads Test in the National Full-Scale Aerodynamics Complex 40- by 80- Foot Wind Tunnel at NASA Ames Research Center are presented and compared to predictions computed by a loosely coupled Computational Fluid Dynamics (CFD)/Comprehensive analysis. Primary calculations model the rotor in free air, but initial calculations are presented including a model of the tunnel test section. The conditions studied include a speed sweep at constant lift up to an advance ratio of 0.4 and a thrust sweep at constant speed into deep stall. Predictions show reasonable agreement with measurement for integrated performance indicators such as power and propulsive force, but occasionally deviate significantly. Detailed analysis of sectional airloads reveals good correlation in overall trends for normal force and pitching moment, but the pitching moment mean often differs. Chord force is frequently plagued by mean shifts and an overprediction of drag on the advancing side. Locations of significant aerodynamic phenomena are predicted accurately, although the magnitude of individual events is often missed.

  3. Distributed simulation using a real-time shared memory network

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Mattern, Duane L.; Wong, Edmond; Musgrave, Jeffrey L.

    1993-01-01

    The Advanced Control Technology Branch of the NASA Lewis Research Center performs research in the area of advanced digital controls for aeronautic and space propulsion systems. This work requires the real-time implementation of both control software and complex dynamical models of the propulsion system. We are implementing these systems in a distributed, multi-vendor computer environment. Therefore, a need exists for real-time communication and synchronization between the distributed multi-vendor computers. A shared memory network is a potential solution which offers several advantages over other real-time communication approaches. A candidate shared memory network was tested for basic performance. The shared memory network was then used to implement a distributed simulation of a ramjet engine. The accuracy and execution time of the distributed simulation were measured and compared to the performance of the non-partitioned simulation. The ease of partitioning the simulation, the minimal time required to develop communication between the processors, and the resulting execution time all indicate that the shared memory network is a real-time communication technique worthy of serious consideration.
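    The exchange pattern the abstract describes can be sketched in miniature. The snippet below is an invented illustration, not the NASA Lewis implementation: two processes stand in for the distributed processors, a shared array stands in for the reflective-memory network, and barriers provide the per-frame synchronization.

    ```python
    from multiprocessing import Array, Barrier, Process

    def engine_half(shared, idx, other, barrier, steps):
        """One partition of the simulation: each frame, read the partner's
        last output from shared memory, then publish a new value."""
        for _ in range(steps):
            coupled = shared[other]      # read partner's previous output
            barrier.wait()               # all reads finish before any write
            shared[idx] = 0.5 * (shared[idx] + coupled)  # toy coupling law
            barrier.wait()               # all writes finish before next read

    def run_demo(steps=100):
        state = Array("d", [0.0, 10.0])  # one slot per simulated processor
        barrier = Barrier(2)
        procs = [Process(target=engine_half, args=(state, i, 1 - i, barrier, steps))
                 for i in (0, 1)]
        for p in procs:
            p.start()
        for p in procs:
            p.join()
        return list(state)

    if __name__ == "__main__":
        print(run_demo())  # the coupled halves relax to a common value
    ```

    The double barrier is the essential point: it plays the role of the network's frame synchronization, guaranteeing every partition sees a consistent snapshot of the previous frame before anyone overwrites it.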

  4. Local Integration of the National Atmospheric Release Advisory Center with Cities (LINC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ermak, D L; Tull, J E; Mosley-Rovi, R

    The objective of the "Local Integration of the National Atmospheric Release Advisory Center with Cities" (LINC) program is to demonstrate the capability for providing local government agencies with an advanced operational atmospheric plume prediction capability, which can be seamlessly integrated with appropriate federal agency support for homeland security applications. LINC is a Domestic Demonstration and Application Program (DDAP) funded by the Chemical and Biological National Security Program (CBNP), which is part of the Department of Energy's (DOE) National Nuclear Security Administration (NNSA). LINC will make use of capabilities that have been developed under the CBNP and integrated into the National Atmospheric Release Advisory Center (NARAC) at Lawrence Livermore National Laboratory (LLNL). NARAC tools and services will be provided to pilot study cities and counties to map plumes from terrorism threats. Support to these local agencies will include training and customized support for exercises, special events, and general emergencies. NARAC provides tools and services that map the probable spread of hazardous materials that have been accidentally or intentionally released into the atmosphere. Primarily supported by the DOE, NARAC is a national support and resource center for planning, real-time assessment and detailed studies of incidents involving a wide variety of hazards, including radiological, chemical, or biological releases. NARAC is a distributed system, providing modeling and geographical information tools for use on an end user's computer system, as well as real-time access to global meteorological and geographical databases and advanced three-dimensional model predictions.

  5. Numerical Simulations of Dynamical Mass Transfer in Binaries

    NASA Astrophysics Data System (ADS)

    Motl, P. M.; Frank, J.; Tohline, J. E.

    1999-05-01

    We will present results from our ongoing research project to simulate dynamically unstable mass transfer in near contact binaries with mass ratios different from one. We employ a fully three-dimensional self-consistent field technique to generate synchronously rotating polytropic binaries. With our self-consistent field code we can create equilibrium binaries where one component is, by radius, within about 99% of filling its Roche lobe, for example. These initial configurations are evolved using a three-dimensional, Eulerian hydrodynamics code. We make no assumptions about the symmetry of the subsequent flow, and the entire binary system is evolved self-consistently under the influence of its own gravitational potential. For a given mass ratio and polytropic index for the binary components, mass transfer via Roche lobe overflow can be predicted to be stable or unstable through simple theoretical arguments. The validity of the approximations made in the stability calculations is tested against our numerical simulations. We acknowledge support from the U.S. National Science Foundation through grants AST-9720771, AST-9528424, and DGE-9355007. This research has been supported, in part, by grants of high-performance computing time on NPACI facilities at the San Diego Supercomputer Center, the Texas Advanced Computing Center and through the PET program of the NAVOCEANO DoD Major Shared Resource Center in Stennis, MS.
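    The "simple theoretical arguments" for Roche-lobe overflow usually start from a fitting formula for the lobe's volume-equivalent radius as a function of mass ratio. A standard choice (an illustration here, not the authors' self-consistent field code) is Eggleton's 1983 approximation:

    ```python
    import math

    def roche_lobe_radius(q):
        """Eggleton (1983) fit for the volume-equivalent Roche-lobe
        radius of the donor, in units of the orbital separation, for
        mass ratio q = M_donor / M_accretor. Accurate to ~1% for all q."""
        q13 = q ** (1.0 / 3.0)
        q23 = q13 * q13
        return 0.49 * q23 / (0.6 * q23 + math.log(1.0 + q13))

    # A donor within ~99% (by radius) of filling its lobe, as in the
    # initial configurations described above, sits just inside R_L:
    for q in (0.5, 1.0, 2.0):
        print(f"q = {q}: R_L/a = {roche_lobe_radius(q):.4f}")
    ```

    Because R_L grows with q, losing mass from the donor shrinks its lobe fraction or expands it depending on how the stellar radius responds, which is the crux of the stability argument tested in the simulations.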

  6. 75 FR 43518 - Advanced Scientific Computing Advisory Committee; Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-26

    ... DEPARTMENT OF ENERGY Advanced Scientific Computing Advisory Committee; Meeting AGENCY: Office of... Scientific Computing Advisory Committee (ASCAC). Federal Advisory Committee Act (Pub. L. 92-463, 86 Stat. 770...: Melea Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U. S...

  7. New computing systems, future computing environment, and their implications on structural analysis and design

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Housner, Jerrold M.

    1993-01-01

    Recent advances in computer technology that are likely to impact structural analysis and design of flight vehicles are reviewed. A brief summary is given of the advances in microelectronics, networking technologies, and in the user-interface hardware and software. The major features of new and projected computing systems, including high performance computers, parallel processing machines, and small systems, are described. Advances in programming environments, numerical algorithms, and computational strategies for new computing systems are reviewed. The impact of the advances in computer technology on structural analysis and the design of flight vehicles is described. A scenario for future computing paradigms is presented, and the near-term needs in the computational structures area are outlined.

  8. Computational oncology.

    PubMed

    Lefor, Alan T

    2011-08-01

    Oncology research has traditionally been conducted using techniques from the biological sciences. The new field of computational oncology has forged a new relationship between the physical sciences and oncology to further advance research. By applying physics and mathematics to oncologic problems, new insights will emerge into the pathogenesis and treatment of malignancies. One major area of investigation in computational oncology centers on the acquisition and analysis of data, using improved computing hardware and software. Large databases of cellular pathways are being analyzed to understand the interrelationship among complex biological processes. Computer-aided detection is being applied to the analysis of routine imaging data, including mammography and chest imaging, to improve the accuracy and detection rate for population screening. The second major area of investigation uses computers to construct sophisticated mathematical models of individual cancer cells as well as larger systems using partial differential equations. These models are further refined with clinically available information to more accurately reflect living systems. One of the major obstacles in the partnership between physical scientists and the oncology community is communications. Standard ways to convey information must be developed. Future progress in computational oncology will depend on close collaboration between clinicians and investigators to further the understanding of cancer using these new approaches.
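    As a toy instance of the PDE-based modeling the article describes (the model choice and parameters are illustrative assumptions, not taken from the article), a one-dimensional Fisher-KPP equation combines logistic proliferation with diffusion-driven invasion:

    ```python
    import numpy as np

    # Fisher-KPP reaction-diffusion model of tumor invasion (sketch):
    #   du/dt = D * d2u/dx2 + r * u * (1 - u)
    # u = normalized cell density, D = motility, r = proliferation rate.
    D, r = 0.01, 1.0                  # assumed parameters
    nx, dx, dt, steps = 200, 0.1, 0.01, 2000

    u = np.zeros(nx)
    u[:10] = 1.0                      # initial tumor seed at the left edge

    for _ in range(steps):
        # second difference approximates the Laplacian
        lap = (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx**2
        lap[0] = lap[-1] = 0.0        # suppress periodic-wrap artifacts
        u += dt * (D * lap + r * u * (1.0 - u))  # explicit Euler step

    # The invaded region advances as a traveling wave of speed ~2*sqrt(D*r).
    print(f"fraction of domain above half density: {(u > 0.5).mean():.2f}")
    ```

    Even this minimal model shows the refinement loop the article points to: fitting D and r to clinically available imaging data is what turns such a sketch into a patient-specific prediction.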

  9. Berkeley Lab - Materials Sciences Division

    Science.gov Websites

    Research centers include the Center for Computational Study of Excited-State Phenomena in Energy Materials and the Center for X-ray Optics; MSD facilities cover ion and materials physics, and scattering and instrumentation science.

  10. Distributed computing testbed for a remote experimental environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Butner, D.N.; Casper, T.A.; Howard, B.C.

    1995-09-18

    Collaboration is increasing as physics research becomes concentrated on a few large, expensive facilities, particularly in magnetic fusion energy research, with national and international participation. These facilities are designed for steady state operation and interactive, real-time experimentation. We are developing tools to provide for the establishment of geographically distant centers for interactive operations; such centers would allow scientists to participate in experiments from their home institutions. A testbed is being developed for a Remote Experimental Environment (REE), a "Collaboratory." The testbed will be used to evaluate the ability of a remotely located group of scientists to conduct research on the DIII-D Tokamak at General Atomics. The REE will serve as a testing environment for advanced control and collaboration concepts applicable to future experiments. Process-to-process communications over high speed wide area networks provide real-time synchronization and exchange of data among multiple computer networks, while the ability to conduct research is enhanced by adding audio/video communication capabilities. The Open Software Foundation's Distributed Computing Environment is being used to test concepts in distributed control, security, naming, remote procedure calls and distributed file access using the Distributed File Services. We are exploring the technology and sociology of remotely participating in the operation of a large scale experimental facility.

  11. Data communication network at the ASRM facility

    NASA Technical Reports Server (NTRS)

    Moorhead, Robert J., II; Smith, Wayne D.; Nirgudkar, Ravi; Zhu, Zhifan; Robinson, Walter

    1993-01-01

    The main objective of the report is to present the overall communication network structure for the Advanced Solid Rocket Motor (ASRM) facility being built at Yellow Creek near Iuka, Mississippi. This report is compiled using information received from NASA/MSFC, LMSC, AAD, and RUST Inc. As per the information gathered, the overall network structure will have one logical FDDI ring acting as a backbone for the whole complex. The buildings will be grouped into two categories viz. manufacturing critical and manufacturing non-critical. The manufacturing critical buildings will be connected via FDDI to the Operational Information System (OIS) in the main computing center in B 1000. The manufacturing non-critical buildings will be connected by 10BASE-FL to the Business Information System (BIS) in the main computing center. The workcells will be connected to the Area Supervisory Computers (ASCs) through the nearest manufacturing critical hub and one of the OIS hubs. The network structure described in this report will be the basis for simulations to be carried out next year. The Comdisco's Block Oriented Network Simulator (BONeS) will be used for the network simulation. The main aim of the simulations will be to evaluate the loading of the OIS, the BIS, the ASCs, and the network links by the traffic generated by the workstations and workcells throughout the site.

  13. The Atmospheric Data Acquisition And Interpolation Process For Center-TRACON Automation System

    NASA Technical Reports Server (NTRS)

    Jardin, M. R.; Erzberger, H.; Denery, Dallas G. (Technical Monitor)

    1995-01-01

    The Center-TRACON Automation System (CTAS), an advanced new air traffic automation program, requires knowledge of spatial and temporal atmospheric conditions, such as the wind speed and direction, the temperature and the pressure, in order to accurately predict aircraft trajectories. Real-time atmospheric data are available in a grid format, so CTAS must interpolate between the grid points to estimate the atmospheric parameter values. The atmospheric data grid is generally not in the same coordinate system as the one used by CTAS, so coordinate conversions are required. Both the interpolation and coordinate conversion processes can introduce errors into the atmospheric data and reduce interpolation accuracy. More accurate algorithms may be computationally expensive or may require a prohibitively large amount of data storage capacity, so trade-offs must be made between accuracy and the available computational and data storage resources. The atmospheric data acquisition and processing employed by CTAS will be outlined in this report. The effects of atmospheric data processing on CTAS trajectory prediction will also be analyzed, and several examples of the trajectory prediction process will be given.
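    The interpolation step can be illustrated with a minimal bilinear scheme. This is a generic sketch, not the CTAS algorithm; the grid spacing, the synthetic wind field, and the function name are all invented for the example.

    ```python
    import numpy as np

    def bilinear(xs, ys, field, x, y):
        """Bilinearly interpolate field (shape [len(xs), len(ys)]) at
        (x, y), assuming xs and ys are uniformly increasing axes."""
        i = min(int((x - xs[0]) / (xs[1] - xs[0])), len(xs) - 2)
        j = min(int((y - ys[0]) / (ys[1] - ys[0])), len(ys) - 2)
        tx = (x - xs[i]) / (xs[i + 1] - xs[i])   # fractional offsets
        ty = (y - ys[j]) / (ys[j + 1] - ys[j])   # within the cell
        return ((1 - tx) * (1 - ty) * field[i, j]
                + tx * (1 - ty) * field[i + 1, j]
                + (1 - tx) * ty * field[i, j + 1]
                + tx * ty * field[i + 1, j + 1])

    xs = np.linspace(0.0, 100.0, 11)             # east-west axis, km
    ys = np.linspace(0.0, 100.0, 11)             # north-south axis, km
    wind_u = np.add.outer(0.1 * xs, 0.05 * ys)   # synthetic east wind, m/s

    # Bilinear interpolation reproduces a linear field exactly:
    print(bilinear(xs, ys, wind_u, 37.0, 54.0))
    ```

    The accuracy/cost trade-off the abstract mentions shows up directly here: moving to higher-order (e.g. cubic) interpolation reduces error on smooth fields but multiplies the per-query arithmetic and the stencil of grid points that must be kept in memory.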

  14. Accelerating Science with the NERSC Burst Buffer Early User Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhimji, Wahid; Bard, Debbie; Romanus, Melissa

    NVRAM-based Burst Buffers are an important part of the emerging HPC storage landscape. The National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory recently installed one of the first Burst Buffer systems as part of its new Cori supercomputer, collaborating with Cray on the development of the DataWarp software. NERSC has a diverse user base comprising over 6,500 users in 700 different projects spanning a wide variety of scientific computing applications. The use cases of the Burst Buffer at NERSC are therefore also numerous and diverse. We describe here performance measurements and lessons learned from the Burst Buffer Early User Program at NERSC, which selected a number of research projects to gain early access to the Burst Buffer and exercise its capability to enable new scientific advancements. To the best of our knowledge this is the first time a Burst Buffer has been stressed at scale by diverse, real user workloads, and therefore these lessons will be of considerable benefit to shaping the developing use of Burst Buffers at HPC centers.

  15. Supplemental final environmental impact statement for advanced solid rocket motor testing at Stennis Space Center

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Since the publication of the Final Environmental Impact Statement (FEIS) and the Record of Decision on the FEIS, which described the potential impacts to human health and the environment associated with the program, three factors have caused NASA to initiate additional studies regarding these issues. These factors are: (1) the U.S. Army Corps of Engineers and the Environmental Protection Agency (EPA) agreed to use the same comprehensive procedures to identify and delineate wetlands; (2) EPA has given NASA further guidance on how best to simulate the exhaust plume from Advanced Solid Rocket Motor (ASRM) testing through computer modeling, enabling more realistic analysis of emission impacts; and (3) public concerns have been raised about short- and long-term impacts on human health and the environment from ASRM testing.

  16. Improving Vacuum Cleaners

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Under a Space Act Agreement between the Kirby company and Lewis Research Center, NASA technology was applied to a commercial vacuum cleaner product line. Kirby engineers were interested in advanced operational concepts, such as particle flow behavior and vibration, critical factors in improving vacuum cleaner performance. An evaluation of the company's 1994 home care system, the Kirby G4, led to the refinement of the new G5 and future models. Under the cooperative agreement, Kirby had access to Lewis' holography equipment, which added insight into how long a vacuum cleaner fan would perform, as well as advanced computer software that can simulate the flow of air through fans. The collaboration resulted in several successes, including fan blade redesign and a continuing dialogue on how to improve air-flow traits in various nozzle designs.

  17. The flight telerobotic servicer and technology transfer

    NASA Technical Reports Server (NTRS)

    Andary, James F.; Bradford, Kayland Z.

    1991-01-01

    The Flight Telerobotic Servicer (FTS) project at the Goddard Space Flight Center is developing an advanced telerobotic system to assist in and reduce crew extravehicular activity (EVA) for Space Station Freedom (SSF). The FTS will provide a telerobotic capability in the early phases of the SSF program and will be employed for assembly, maintenance, and inspection applications. The current state of space technology and the general nature of the FTS tasks dictate that the FTS be designed with sophisticated teleoperational capabilities for its internal primary operating mode. However, technologies such as advanced computer vision and autonomous planning techniques would greatly enhance the FTS capabilities to perform autonomously in less structured work environments. Another objective of the FTS program is to accelerate technology transfer from research to U.S. industry.

  18. High-Performance Computing Data Center Warm-Water Liquid Cooling

    Science.gov Websites

    NREL's High-Performance Computing Data Center (HPC Data Center) is liquid cooled. Liquid cooling technologies offer a more energy-efficient solution that also allows for effective

  19. Women's Center Volunteer Intern Program: Building Community While Advancing Social and Gender Justice

    ERIC Educational Resources Information Center

    Murray, Margaret A.; Vlasnik, Amber L.

    2015-01-01

    This program description explores the purpose, structure, activities, and outcomes of the volunteer intern program at the Wright State University Women's Center. Designed to create meaningful, hands-on learning experiences for students and to advance the center's mission, the volunteer intern program builds community while advancing social and…

  20. The Magellan Final Report on Cloud Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coghlan, Susan; Yelick, Katherine

    The goal of Magellan, a project funded through the U.S. Department of Energy (DOE) Office of Advanced Scientific Computing Research (ASCR), was to investigate the potential role of cloud computing in addressing the computing needs of the DOE Office of Science (SC), particularly related to serving the needs of mid-range computing and future data-intensive computing workloads. A set of research questions was formed to probe various aspects of cloud computing, covering performance, usability, and cost. To address these questions, a distributed testbed infrastructure was deployed at the Argonne Leadership Computing Facility (ALCF) and the National Energy Research Scientific Computing Center (NERSC). The testbed was designed to be flexible and capable enough to explore a variety of computing models and hardware design points in order to understand the impact for various scientific applications. During the project, the testbed also served as a valuable resource to application scientists. Applications from a diverse set of projects, such as MG-RAST (a metagenomics analysis server), the Joint Genome Institute, the STAR experiment at the Relativistic Heavy Ion Collider, and the Laser Interferometer Gravitational Wave Observatory (LIGO), were used by the Magellan project for benchmarking within the cloud, but the project teams were also able to accomplish important production science utilizing the Magellan cloud resources.

  1. A Computational Methodology for Simulating Thermal Loss Testing of the Advanced Stirling Convertor

    NASA Technical Reports Server (NTRS)

    Reid, Terry V.; Wilson, Scott D.; Schifer, Nicholas A.; Briggs, Maxwell H.

    2012-01-01

    The U.S. Department of Energy (DOE) and Lockheed Martin Space Systems Company (LMSSC) have been developing the Advanced Stirling Radioisotope Generator (ASRG) for use as a power system for space science missions. This generator would use two high-efficiency Advanced Stirling Convertors (ASCs), developed by Sunpower Inc. and NASA Glenn Research Center (GRC). The ASCs convert thermal energy from a radioisotope heat source into electricity. As part of ground testing of these ASCs, different operating conditions are used to simulate expected mission conditions. These conditions require achieving a particular operating frequency, hot-end and cold-end temperatures, and specified electrical power output for a given net heat input. In an effort to improve net heat input predictions, numerous tasks have been performed that provided a more accurate value for net heat input into the ASCs, including the use of multidimensional numerical models. Validation test hardware has also been used to provide a direct comparison of numerical results and to validate the multidimensional numerical models used to predict convertor net heat input and efficiency. These validation tests were designed to simulate the temperature profile of an operating Stirling convertor and resulted in a measured net heat input of 244.4 W. The methodology was applied to the multidimensional numerical model, which resulted in a net heat input of 240.3 W. The computational methodology thus produced a value of net heat input 1.7 percent less than that measured during laboratory testing. The resulting computational methodology and results are discussed.
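    The reported 1.7 percent discrepancy follows directly from the two quoted heat inputs; a quick arithmetic check:

    ```python
    measured = 244.4   # W, net heat input from validation test hardware
    computed = 240.3   # W, net heat input from the multidimensional model

    # Relative under-prediction of the model versus the measurement, in percent.
    pct_diff = (measured - computed) / measured * 100  # ~1.68, i.e. 1.7 percent
    ```
    
    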

  2. Hera: High Energy Astronomical Data Analysis via the Internet

    NASA Astrophysics Data System (ADS)

    Valencic, Lynne A.; Chai, P.; Pence, W.; Snowden, S.

    2011-09-01

    The HEASARC at NASA Goddard Space Flight Center has developed Hera, a data processing facility for analyzing high energy astronomical data over the internet. Hera provides all the software packages, disk space, and computing resources needed to do general processing of and advanced research on publicly available data from High Energy Astrophysics missions. The data and data products are kept on a server at GSFC and can be downloaded to a user's local machine. This service is provided for free to students, educators, and researchers for educational and research purposes.

  3. Development of Telemedicine Capabilities for a Joint US-Russian Space Biomedical Center for Training and Research

    NASA Technical Reports Server (NTRS)

    DeBakey, Michael E.

    1998-01-01

    From the perspective of scheduling, some medical consultations can have asynchronous and synchronous components. Consultations frequently involve the compilation of patient data, its analysis, a consultant's report, and a real-time conference between the referring physician and the consultant. The bandwidth of the Internet connection with Moscow and advances in the hardware and software of personal computing now make possible telemedicine events with both store-and-forward and real-time components. These are hybrid telemedicine events, and this paper describes such a case.

  4. Space - A unique environment for process modeling R&D

    NASA Technical Reports Server (NTRS)

    Overfelt, Tony

    1991-01-01

    Process modeling, the application of advanced computational techniques to simulate real processes as they occur in regular use, e.g., welding, casting, and semiconductor crystal growth, is discussed. Using the low-gravity environment of space will accelerate the technical validation of the procedures and enable extremely accurate determinations of the many necessary thermophysical properties. Attention is given to NASA's centers for the commercial development of space: joint ventures of universities, industries, and government agencies to study the unique attributes of space that offer potential for applied R&D and eventual commercial exploitation.

  5. The evolving trend in spacecraft health analysis

    NASA Technical Reports Server (NTRS)

    Kirkpatrick, Russell L.

    1993-01-01

    At the Jet Propulsion Laboratory, the Space Flight Operations Center inaugurated the concept of a central repository for spacecraft data and the distribution of computing power to the end users for that data's analysis. The Advanced Multimission Operations System is continuing the evolution of this concept as new technologies emerge. Constant improvements in data management tools, data visualization, and hardware lead to ever-expanding ideas for improving the analysis of spacecraft health in an era of budget-constrained mission operations systems. The foundation of this evolution, its history, and its current plans will be discussed.

  6. The Fourth Annual Thermal and Fluids Analysis Workshop

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The Fourth Annual Thermal and Fluids Analysis Workshop was held from August 17-21, 1992, at NASA Lewis Research Center. The workshop consisted of classes, vendor demonstrations, and paper sessions. The classes and vendor demonstrations provided participants with the information on widely used tools for thermal and fluids analysis. The paper sessions provided a forum for the exchange of information and ideas among thermal and fluids analysts. Paper topics included advances and uses of established thermal and fluids computer codes (such as SINDA and TRASYS) as well as unique modeling techniques and applications.

  7. NASA and USGS invest in invasive species modeling to evaluate habitat for Africanized Honey Bees

    USGS Publications Warehouse

    2009-01-01

    Invasive non-native species, such as plants, animals, and pathogens, have long been of interest to the U.S. Geological Survey (USGS) and NASA. Invasive species cause harm to our economy (around $120 B/year), the environment (e.g., replacing native biodiversity, forest pathogens negatively affecting carbon storage), and human health (e.g., plague, West Nile virus). Five years ago, the USGS and NASA formed a partnership to improve ecological forecasting capabilities for the early detection and containment of the highest priority invasive species. Scientists from NASA Goddard Space Flight Center (GSFC) and the Fort Collins Science Center developed a long-term strategy to integrate remote sensing capabilities, high-performance computing capabilities, and new spatial modeling techniques to advance the science of ecological invasions [Schnase et al., 2002].

  8. NASA Lewis Research Center Workshop on Forced Response in Turbomachinery

    NASA Technical Reports Server (NTRS)

    Stefko, George L. (Compiler); Murthy, Durbha V. (Compiler); Morel, Michael (Compiler); Hoyniak, Dan (Compiler); Gauntner, Jim W. (Compiler)

    1994-01-01

    A summary of the NASA Lewis Research Center (LeRC) Workshop on Forced Response in Turbomachinery, held in August 1993, is presented. It was sponsored by the following NASA organizations: the Structures, Space Propulsion Technology, and Propulsion Systems Divisions of NASA LeRC and the Aeronautics and Advanced Concepts & Technology Offices of NASA Headquarters. In addition, the workshop was held in conjunction with the GUIde (Government/Industry/Universities) Consortium on Forced Response. The workshop was specifically designed to receive suggestions and comments from industry on current research at NASA LeRC in the area of forced vibratory response of turbomachinery blades, which includes both computational and experimental approaches. There were eight presentations and a code demonstration. Major areas of research included aeroelastic response, steady and unsteady fluid dynamics, mistuning, and corresponding experimental work.

  9. ED08-0109-08

    NASA Image and Video Library

    2008-05-01

    Ikhana fiber optic wing shape sensor team: clockwise from left, Anthony "Nino" Piazza, Allen Parker, William Ko and Lance Richards. The sensors, located along a fiber the thickness of a human hair, aren't visible in the center of the Ikhana aircraft's left wing. NASA Dryden Flight Research Center is evaluating an advanced fiber optic-based sensing technology installed on the wings of NASA's Ikhana aircraft. The fiber optic system measures and displays the shape of the aircraft's wings in flight. There are other potential safety applications for the technology, such as vehicle structural health monitoring. If an aircraft structure can be monitored with sensors and a computer can manipulate flight control surfaces to compensate for stresses on the wings, structural control can be established to prevent situations that might otherwise result in a loss of control.

  10. Numerical Viscous Flow Analysis of an Advanced Semispan Diamond-Wing Model at High-Lift Conditions

    NASA Technical Reports Server (NTRS)

    Ghaffari, F.; Biedron, R. T.; Luckring, J. M.

    2002-01-01

    Turbulent Navier-Stokes computational results are presented for an advanced diamond wing semispan model at low-speed, high-lift conditions. The numerical results are obtained in support of a wind-tunnel test that was conducted in the National Transonic Facility (NTF) at the NASA Langley Research Center. The model incorporated a generic fuselage and was mounted on the tunnel sidewall using a constant-width standoff. The analyses include: (1) the numerical simulation of the NTF empty-tunnel flow characteristics; (2) the semispan high-lift model with the standoff in the tunnel environment; (3) the semispan high-lift model with the standoff and viscous sidewall in free air; and (4) the semispan high-lift model without the standoff in free air. The computations were performed at conditions that correspond to a nominal approach and landing configuration. The wing surface pressure distributions computed for the model in both the tunnel and in free air agreed well with the corresponding experimental data, and both indicated small increments due to the wall interference effects. However, the wall interference effects were found to be more pronounced in the total measured and computed lift, drag, and pitching moment due to standard induced up-flow effects. Although the magnitudes of the computed forces and moment were slightly off compared to the measured data, the increments due to the wall interference effects were predicted well. Numerical predictions are also presented on the combined effects of the tunnel sidewall boundary layer and the standoff geometry on the fuselage fore-body pressure distributions and the resulting impact on the overall configuration longitudinal aerodynamic characteristics.

  11. Lack of communication and control: experiences of distance caregivers of parents with advanced cancer.

    PubMed

    Mazanec, Polly; Daly, Barbara J; Ferrell, Betty Rolling; Prince-Paul, Maryjo

    2011-05-01

    To explore the new and complex phenomenon of distance caregiving in the advanced cancer population. Qualitative. A large comprehensive cancer center in the midwestern region of the United States. 14 distance caregivers of parents with advanced cancer. Patients with advanced lung, gastrointestinal, and gynecologic malignancies consented to have their distance-caregiving adult children contacted to participate in the study. Responses to three open-ended questions guided the tape-recorded telephone interviews with the distance caregivers. Following transcription, content analysis with inductive coding was performed. Two major themes (communication and control) and five subthemes (benefits and burdens of distance caregiving, dealing with uncertainty, direct action through information seeking, protecting, and staying connected) emerged from the data. Distance caregivers experience some of the same stressors that local caregivers of patients with cancer experience. In addition, they have unique psychosocial needs related to the burden of geographic distance. Distance caregivers could benefit from nursing interventions targeted at their unique needs. Innovative interventions using Web-based computer technology for improved communication, as well as supportive care interventions, may be helpful.

  12. 78 FR 59927 - Next Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-30

    ... Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational, and Systems Biology..., Computational, and Systems Biology [External Review Draft]'' (EPA/600/R-13/214A). EPA is also announcing that... Advances in Molecular, Computational, and Systems Biology [External Review Draft]'' is available primarily...

  13. 78 FR 68058 - Next Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-13

    ... Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational, and Systems Biology... Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational, and Systems Biology..., computational, and systems biology data can better inform risk assessment. This draft document is available for...

  14. Making Advanced Computer Science Topics More Accessible through Interactive Technologies

    ERIC Educational Resources Information Center

    Shao, Kun; Maher, Peter

    2012-01-01

    Purpose: Teaching advanced technical concepts in a computer science program to students of different technical backgrounds presents many challenges. The purpose of this paper is to present a detailed experimental pedagogy in teaching advanced computer science topics, such as computer networking, telecommunications and data structures using…

  15. The pKa Cooperative: A Collaborative Effort to Advance Structure-Based Calculations of pKa values and Electrostatic Effects in Proteins

    PubMed Central

    Nielsen, Jens E.; Gunner, M. R.; Bertrand García-Moreno, E.

    2012-01-01

    The pKa Cooperative http://www.pkacoop.org was organized to advance development of accurate and useful computational methods for structure-based calculation of pKa values and electrostatic energy in proteins. The Cooperative brings together laboratories with expertise and interest in theoretical, computational and experimental studies of protein electrostatics. To improve structure-based energy calculations it is necessary to better understand the physical character and molecular determinants of electrostatic effects. The Cooperative thus intends to foment experimental research into fundamental aspects of proteins that depend on electrostatic interactions. It will maintain a depository for experimental data useful for critical assessment of methods for structure-based electrostatics calculations. To help guide the development of computational methods the Cooperative will organize blind prediction exercises. As a first step, computational laboratories were invited to reproduce an unpublished set of experimental pKa values of acidic and basic residues introduced in the interior of staphylococcal nuclease by site-directed mutagenesis. The pKa values of these groups are unique and challenging to simulate owing to the large magnitude of their shifts relative to normal pKa values in water. Many computational methods were tested in this 1st Blind Prediction Challenge and critical assessment exercise. A workshop was organized in the Telluride Science Research Center to assess objectively the performance of many computational methods tested on this one extensive dataset. This volume of PROTEINS: Structure, Function, and Bioinformatics introduces the pKa Cooperative, presents reports submitted by participants in the blind prediction challenge, and highlights some of the problems in structure-based calculations identified during this exercise. PMID:22002877
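    The Cooperative's structure-based methods are beyond the scope of this abstract, but the thermodynamic link between a calculated electrostatic free-energy change and the resulting pKa shift is standard; a minimal sketch (the constants and the 10 kJ/mol example value are illustrative, not taken from the study):

    ```python
    import math

    R = 8.314    # gas constant, J/(mol*K)
    T = 298.15   # temperature, K

    def delta_pka(ddg_kj_per_mol):
        """pKa shift produced by a free-energy change ddG (kJ/mol) of the
        protonation equilibrium: dpKa = ddG / (R * T * ln 10)."""
        return ddg_kj_per_mol * 1000 / (R * T * math.log(10))

    # A 10 kJ/mol destabilization of the charged form at 298 K shifts the
    # pKa by roughly 1.75 units, the scale of shift relevant to buried groups.
    shift = delta_pka(10)
    ```
    
    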

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Darrow, Ken; Hedman, Bruce

    Data centers represent a rapidly growing and very energy intensive activity in commercial, educational, and government facilities. In the last five years the growth of this sector was the electric power equivalent of seven new coal-fired power plants. Data centers consume 1.5% of the total power in the U.S. Growth over the next five to ten years is expected to require a similar increase in power generation. This energy consumption is concentrated in buildings that are 10-40 times more energy intensive than a typical office building. The sheer size of the market, the concentrated energy consumption per facility, and the tendency of facilities to cluster in 'high-tech' centers all contribute to a potential power infrastructure crisis for the industry. Meeting the energy needs of data centers is a moving target. Computing power is advancing rapidly, which reduces the energy requirements for data centers. A lot of work is going into improving the computing power of servers and other processing equipment. However, this increase in computing power is increasing the power densities of this equipment. While fewer pieces of equipment may be needed to meet a given data processing load, the energy density of a facility designed to house this higher efficiency equipment will be as high as or higher than it is today. In other words, while the data center of the future may have the IT power of ten data centers of today, it is also going to have higher power requirements and higher power densities. This report analyzes the opportunities for CHP technologies to assist primary power in making the data center more cost-effective and energy efficient. Broader application of CHP will lower the demand for electricity from central stations and reduce the pressure on electric transmission and distribution infrastructure.
    This report is organized into the following sections: (1) Data Center Market Segmentation--the description of the overall size of the market, the size and types of facilities involved, and the geographic distribution. (2) Data Center Energy Use Trends--a discussion of energy use and expected energy growth and the typical energy consumption and uses in data centers. (3) CHP Applicability--potential configurations, CHP case studies, applicable equipment, heat recovery opportunities (cooling), cost and performance benchmarks, and power reliability benefits. (4) CHP Drivers and Hurdles--evaluation of user benefits, social benefits, market structural issues and attitudes toward CHP, and regulatory hurdles. (5) CHP Paths to Market--discussion of technical needs, education, and strategic partnerships needed to promote CHP in the IT community.

  17. A collaborative institutional model for integrating computer applications in the medical curriculum.

    PubMed Central

    Friedman, C. P.; Oxford, G. S.; Juliano, E. L.

    1991-01-01

    The introduction and promotion of information technology in an established medical curriculum with existing academic and technical support structures poses a number of challenges. The UNC School of Medicine has developed the Taskforce on Educational Applications in Medicine (TEAM) to coordinate this effort. TEAM works as a confederation of existing research and support units with interests in computers and education, along with a core of interested faculty with curricular responsibilities. Constituent units of the TEAM confederation include the medical center library, medical television studios, basic science teaching laboratories, educational development office, microcomputer and network support groups, academic affairs administration, and a subset of course directors and teaching faculty. Among our efforts have been the establishment of (1) a mini-grant program to support faculty-initiated development and implementation of computer applications in the curriculum, (2) a symposium series with visiting speakers to acquaint faculty with current developments in medical informatics and related curricular efforts at other institutions, (3) 20 computer workstations located in the multipurpose teaching labs where first- and second-year students do much of their academic work, and (4) a demonstration center for evaluation of courseware and technologically advanced delivery systems. The student workstations provide convenient access to electronic mail, University schedules and calendars, the CoSy computer conferencing system, and several software applications integral to their courses in pathology, histology, microbiology, biochemistry, and neurobiology. The progress achieved toward the primary goal has modestly exceeded our initial expectations, while the collegiality and interest expressed toward TEAM activities in the local environment stand as empirical measures of the success of the concept. PMID:1807705

  18. eHealth Research from the User’s Perspective

    PubMed Central

    Hesse, Bradford W.; Shneiderman, Ben

    2007-01-01

    The application of Information Technology (IT) to issues of healthcare delivery has had a long and tortuous history in the U.S. Within the field of eHealth, vanguard applications of advanced computing techniques, such as applications in artificial intelligence or expert systems, have languished in spite of a track record of scholarly publication and decisional accuracy. The problem is one of purpose, of asking the right questions for the science to solve. Historically, many computer science pioneers have been tempted to ask “what can the computer do?” New advances in eHealth are prompting developers to ask “what can people do?” How can eHealth take part in national goals for healthcare reform to empower relationships between healthcare professionals and patients, healthcare teams and families, and hospitals and communities to improve health equitably throughout the population? To do this, eHealth researchers must combine best evidence from the user sciences (human factors engineering, human-computer interaction, psychology, and usability) with best evidence in medicine to create transformational improvements in the quality of care that medicine offers. These improvements should follow recommendations from the Institute of Medicine to create a health care system that is (a) safe, (b) effective (evidence-based), (c) patient-centered, and (d) timely. Relying on the eHealth researcher’s intuitive grasp of systems issues, improvements should be made with considerations of users and beneficiaries at the individual (patient/physician), group (family/staff), community, and broad environmental levels. PMID:17466825

  19. Human Centered Autonomous and Assistant Systems Testbed for Exploration Operations

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Mount, Frances; Carreon, Patricia; Torney, Susan E.

    2001-01-01

    The Engineering and Mission Operations Directorates at NASA Johnson Space Center are combining laboratories and expertise to establish the Human Centered Autonomous and Assistant Systems Testbed for Exploration Operations. This is a testbed for human-centered design, development and evaluation of intelligent autonomous and assistant systems that will be needed for human exploration and development of space. This project will improve human-centered analysis, design and evaluation methods for developing intelligent software. This software will support human-machine cognitive and collaborative activities in future interplanetary work environments where distributed computer and human agents cooperate. We are developing and evaluating prototype intelligent systems for distributed multi-agent mixed-initiative operations. The primary target domain is control of life support systems in a planetary base. Technical approaches will be evaluated for use during extended manned tests in the target domain, the Bioregenerative Advanced Life Support Systems Test Complex (BIO-Plex). A spinoff target domain is the International Space Station (ISS) Mission Control Center (MCC). Products of this project include human-centered intelligent software technology, innovative human interface designs, and human-centered software development processes, methods and products. The testbed uses adjustable autonomy software and life support systems simulation models from the Adjustable Autonomy Testbed to represent operations on the remote planet. Ground operations prototypes and concepts will be evaluated in the Exploration Planning and Operations Center (ExPOC) and Jupiter Facility.

  20. Advanced Computer Typography.

    DTIC Science & Technology

    1981-12-01

    Advanced Computer Typography, by A. V. Hershey. Naval Postgraduate School, Monterey, California. Report NPS012-81-005, December 1981; final report covering the period December 1979 to December 1981.

  1. Development and training of a learning expert system in an autonomous mobile robot via simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spelt, P.F.; Lyness, E.; DeSaussure, G.

    1989-11-01

    The Center for Engineering Systems Advanced Research (CESAR) conducts basic research in the area of intelligent machines. Recently at CESAR, a learning expert system was created to operate on board an autonomous robot working at a process control panel. The authors discuss the two-computer simulation system used to create, evaluate, and train this learning system. The simulation system has a graphics display of the current status of the process being simulated, and the same program that does the simulating also drives the actual control panel. Simulation results were validated on the actual robot. The speed and safety advantages of using a computerized simulator to train a learning computer, and future uses of the simulation system, are discussed.

  2. A Look at the Impact of High-End Computing Technologies on NASA Missions

    NASA Technical Reports Server (NTRS)

    Biswas, Rupak; Dunbar, Jill; Hardman, John; Bailey, F. Ron; Wheeler, Lorien; Rogers, Stuart

    2012-01-01

    From its bold start nearly 30 years ago and continuing today, the NASA Advanced Supercomputing (NAS) facility at Ames Research Center has enabled remarkable breakthroughs in the space agency's science and engineering missions. Throughout this time, NAS experts have influenced the state-of-the-art in high-performance computing (HPC) and related technologies such as scientific visualization, system benchmarking, batch scheduling, and grid environments. We highlight the pioneering achievements and innovations originating from and made possible by NAS resources and know-how, from early supercomputing environment design and software development, to long-term simulation and analyses critical to the design of safe Space Shuttle operations and associated spinoff technologies, to the highly successful Kepler Mission's discovery of new planets now capturing the world's imagination.

  3. A Concept for the Inclusion of Analytical and Computational Capability in Existing Systems for Measurement of Neutron Flux

    NASA Technical Reports Server (NTRS)

    Patrick, Clinton; Cooper, Anita E.; Powers, W. T.

    2005-01-01

    For approximately two decades, efforts have been sponsored by NASA's Marshall Space Flight Center to make possible high-speed, automated classification and quantification of constituent materials in various harsh environments. MSFC, along with the Air Force/Arnold Engineering Development Center, has led the work, developing and implementing systems that employ principles of emission and absorption spectroscopy to monitor molecular and atomic particulates in gas plasma of rocket engine flow fields. One such system identifies species and quantifies mass loss rates in H2/O2 rocket plumes. Other gases have been examined, and the physics of their detection under numerous conditions was made part of the knowledge base for the MSFC/USAF team. Additionally, efforts are underway to encode components of the data analysis tools in hardware in order to address real-time operational requirements for health monitoring and management. NASA has a significant investment in these systems, warranting a spiral approach that meshes current tools and experience with technological advancements. This paper addresses current systems - the Optical Plume Anomaly Detector (OPAD) and the Engine Diagnostic Filtering System (EDIFIS) - and discusses what is considered a natural progression: a concept for migrating them towards detection of high-energy particles, including neutrons and gamma rays. The proposal outlines system development to date, basic concepts for future advancements, and recommendations for accomplishing them.

  4. STARS: An Integrated, Multidisciplinary, Finite-Element, Structural, Fluids, Aeroelastic, and Aeroservoelastic Analysis Computer Program

    NASA Technical Reports Server (NTRS)

    Gupta, K. K.

    1997-01-01

    A multidisciplinary, finite element-based, highly graphics-oriented, linear and nonlinear analysis capability that includes such disciplines as structures, heat transfer, linear aerodynamics, computational fluid dynamics, and controls engineering has been achieved by integrating several new modules in the original STARS (STructural Analysis RoutineS) computer program. Each individual analysis module is general-purpose in nature and is effectively integrated to yield aeroelastic and aeroservoelastic solutions of complex engineering problems. Examples of advanced NASA Dryden Flight Research Center projects analyzed by the code in recent years include the X-29A, F-18 High Alpha Research Vehicle/Thrust Vectoring Control System, B-52/Pegasus Generic Hypersonics, National AeroSpace Plane (NASP), SR-71/Hypersonic Launch Vehicle, and High Speed Civil Transport (HSCT) projects. Extensive graphics capabilities exist for convenient model development and postprocessing of analysis results. The program is written in modular form in standard FORTRAN language to run on a variety of computers, such as the IBM RISC/6000, SGI, DEC, Cray, and personal computer; associated graphics codes use OpenGL and IBM/graPHIGS language for color depiction. This program is available from COSMIC, the NASA agency for distribution of computer programs.

  5. Educational Technology Network: a computer conferencing system dedicated to applications of computers in radiology practice, research, and education.

    PubMed

    D'Alessandro, M P; Ackerman, M J; Sparks, S M

    1993-11-01

    Educational Technology Network (ET Net) is a free, easy to use, on-line computer conferencing system organized and funded by the National Library of Medicine that is accessible via the SprintNet (SprintNet, Reston, VA) and Internet (Merit, Ann Arbor, MI) computer networks. It is dedicated to helping bring together, in a single continuously running electronic forum, developers and users of computer applications in the health sciences, including radiology. ET Net uses the Caucus computer conferencing software (Camber-Roth, Troy, NY) running on a microcomputer. This microcomputer is located in the National Library of Medicine's Lister Hill National Center for Biomedical Communications and is directly connected to the SprintNet and the Internet networks. The advanced computer conferencing software of ET Net allows individuals who are separated in space and time to unite electronically to participate, at any time, in interactive discussions on applications of computers in radiology. A computer conferencing system such as ET Net allows radiologists to maintain contact with colleagues on a regular basis when they are not physically together. Topics of discussion on ET Net encompass all applications of computers in radiological practice, research, and education. ET Net has been in successful operation for 3 years and has a promising future aiding radiologists in the exchange of information pertaining to applications of computers in radiology.

  6. Spacecraft applications of advanced global positioning system technology

    NASA Technical Reports Server (NTRS)

    1988-01-01

    This is the final report on the Texas Instruments Incorporated (TI) simulations study of Spacecraft Application of Advanced Global Positioning System (GPS) Technology. This work was conducted for the NASA Johnson Space Center (JSC) under contract NAS9-17781. GPS, in addition to its baselined capability as a highly accurate spacecraft navigation system, can provide traffic control, attitude control, structural control, and uniform time base. In Phase 1 of this program, another contractor investigated the potential of GPS in these four areas and compared GPS to other techniques. This contract was for the Phase 2 effort, to study the performance of GPS for these spacecraft applications through computer simulations. TI had previously developed simulation programs for GPS differential navigation and attitude measurement. These programs were adapted for these specific spacecraft applications. In addition, TI has extensive expertise in the design and production of advanced GPS receivers, including space-qualified GPS receivers. We have drawn on this background to augment the simulation results in the system level overview, which is Section 2 of this report.

  7. The advanced software development workstation project

    NASA Technical Reports Server (NTRS)

    Fridge, Ernest M., III; Pitman, Charles L.

    1991-01-01

    The Advanced Software Development Workstation (ASDW) task is researching and developing the technologies required to support Computer Aided Software Engineering (CASE) with the emphasis on those advanced methods, tools, and processes that will be of benefit to support all NASA programs. Immediate goals are to provide research and prototype tools that will increase productivity, in the near term, in projects such as the Software Support Environment (SSE), the Space Station Control Center (SSCC), and the Flight Analysis and Design System (FADS) which will be used to support the Space Shuttle and Space Station Freedom. Goals also include providing technology for development, evolution, maintenance, and operations. The technologies under research and development in the ASDW project are targeted to provide productivity enhancements during the software life cycle phase of enterprise and information system modeling, requirements generation and analysis, system design and coding, and system use and maintenance. On-line user's guides will assist users in operating the developed information system with knowledge base expert assistance.

  8. Next Generation CTAS Tools

    NASA Technical Reports Server (NTRS)

    Erzberger, Heinz

    2000-01-01

    The FAA's Free Flight Phase 1 Office is in the process of deploying the current generation of CTAS tools, which are the Traffic Management Advisor (TMA) and the passive Final Approach Spacing Tool (pFAST), at selected centers and airports. Research at NASA is now focused on extending the CTAS software and computer-human interfaces to provide more advanced capabilities. The Multi-center TMA (McTMA) is designed to operate at airports where arrival flows originate from two or more centers whose boundaries are in close proximity to the TRACON boundary. McTMA will also include techniques for routing arrival flows away from congested airspace and around airspace reserved for arrivals into other hub airports. NASA is working with the FAA and MITRE to build a prototype McTMA for the Philadelphia airport. The active Final Approach Spacing Tool (aFAST) provides speed and heading advisories to help controllers achieve accurate spacing between aircraft on final approach. These advisories will be integrated with those in the existing pFAST to provide a set of comprehensive advisories for controlling arrival traffic from the TRACON boundary to touchdown at complex, high-capacity airports. A research prototype of aFAST, designed for Dallas-Fort Worth, is in an advanced stage of development. The Expedite Departure Path (EDP) and Direct-To tools are designed to help controllers guide departing aircraft out of the TRACON airspace and climb to cruise altitude along the most efficient routes.
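
    The kind of speed advisory a final-approach spacing tool produces can be illustrated with basic kinematics. This is a simplified sketch, not the aFAST algorithm; the function name, parameters, and speed limits are invented for illustration. Given the lead aircraft's time to the runway threshold and a required in-trail spacing, the follower's advisory speed is just the speed that makes it arrive one spacing interval later.

```python
# Illustrative sketch only (hypothetical parameters, not the aFAST method).

def speed_advisory(dist_nm, lead_eta_s, spacing_s, min_kt=130.0, max_kt=250.0):
    """Return an advisory ground speed in knots for the trailing aircraft.

    dist_nm    -- follower's distance to the threshold (nautical miles)
    lead_eta_s -- lead aircraft's time to the threshold (seconds)
    spacing_s  -- required spacing behind the lead (seconds)
    """
    target_time_s = lead_eta_s + spacing_s        # when the follower should arrive
    required_kt = dist_nm / target_time_s * 3600  # nautical miles per hour
    return max(min_kt, min(max_kt, required_kt))  # clamp to a flyable range

# Follower 12 nm out; lead crosses the threshold in 120 s; 90 s in trail needed.
adv = speed_advisory(12.0, 120.0, 90.0)
print(round(adv, 1))  # 12 nm in 210 s -> ~205.7 kt
```

    A real spacing tool also accounts for winds, deceleration segments, and aircraft performance models; the clamp stands in for the flyable-speed constraints a controller would enforce.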

  9. SPOT4 Operational Control Center (CMP)

    NASA Technical Reports Server (NTRS)

    Zaouche, G.

    1993-01-01

    CNES (France) is responsible for the development of a new generation of Operational Control Center (CMP) which will operate the new heliosynchronous remote sensing satellite SPOT4. This Operational Control Center benefits greatly from the experience of the first-generation control center and from recent advances in computer technology and standards. The CMP is designed to operate two satellites at the same time with a reduced pool of controllers. The architecture of the CMP is simple, robust, and flexible, since it is based on powerful distributed workstations interconnected through an Ethernet LAN. The application software uses modern and formal software engineering methods in order to improve quality and reliability and to facilitate maintenance. The software is table driven, so it can be easily adapted to other operational needs. Operations tasks are automated to the maximum extent, so that the CMP can be operated almost automatically, with very limited human interference for supervision and decision making. This paper provides an overview of the SPOT4 mission and associated ground segment. It also details the CMP, its functions, and its software and hardware architecture.
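
    The "table driven" design mentioned above is a common control-center pattern, and a minimal sketch makes the idea concrete. All operation names and handlers below are hypothetical, not the actual CMP software: the point is that operational behavior lives in a data table, so adapting the center to another satellite or mission means editing the table rather than the code.

```python
# Hedged sketch of a table-driven control-center dispatcher (invented names).

def acquire_telemetry(sat):
    return f"{sat}: telemetry acquired"

def send_command(sat):
    return f"{sat}: command uplinked"

def schedule_pass(sat):
    return f"{sat}: ground pass scheduled"

# The driving table: operation name -> (handler, fully automated?)
OPERATIONS = {
    "TLM_ACQ":  (acquire_telemetry, True),
    "CMD_SEND": (send_command,      False),  # needs operator approval
    "PASS_SCH": (schedule_pass,     True),
}

def run(op_name, satellite, operator_approved=False):
    """Dispatch an operation; hold non-automated ones for a human decision."""
    handler, automated = OPERATIONS[op_name]
    if not automated and not operator_approved:
        return f"{satellite}: {op_name} held for operator decision"
    return handler(satellite)

print(run("TLM_ACQ", "SPOT4"))                          # automated
print(run("CMD_SEND", "SPOT4"))                         # held for a human
print(run("CMD_SEND", "SPOT4", operator_approved=True)) # approved, runs
```

    The automation flag in the table also captures the abstract's design goal: maximum automation, with human interference limited to supervision and decision making.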

  10. Using Web 2.0 (and Beyond?) in Space Flight Operations Control Centers

    NASA Technical Reports Server (NTRS)

    Scott, David W.

    2010-01-01

    Word processing was one of the earliest uses for small workstations, but we quickly learned that desktop computers were far more than e-typewriters. Similarly, "Web 2.0" capabilities, particularly advanced search engines, chats, wikis, blogs, social networking, and the like, offer tools that could significantly improve our efficiency at managing the avalanche of information and decisions needed to operate space vehicles in realtime. However, could does not necessarily equal should. We must wield two-edged swords carefully to avoid stabbing ourselves. This paper examines some Web 2.0 tools, with an emphasis on social media, and suggests which ones might be useful or harmful in real-time space operations control environments, based on the author's experience as a Payload Crew Communicator (PAYCOM) at Marshall Space Flight Center's (MSFC) Payload Operations Integration Center (POIC) for the International Space Station (ISS) and on discussions with other space flight operations control organizations and centers. There is also some discussion of an offering or two that may come from beyond the current cyber-horizon.

  11. Mailman Segal Center for Human Development | NSU

    Science.gov Websites


  12. Energy Innovation Hubs: A Home for Scientific Collaboration

    ScienceCinema

    Chu, Steven

    2017-12-11

    Secretary Chu will host a live, streaming Q&A session with the directors of the Energy Innovation Hubs on Tuesday, March 6, at 2:15 p.m. EST. The directors will be available for questions regarding their teams' work and the future of American energy. Ask your questions in the comments below, or submit them on Facebook, Twitter (@energy), or by e-mail to newmedia@hq.doe.gov, prior to or during the live event. The Energy Innovation Hubs are major integrated research centers, with researchers from many different institutions and technical backgrounds. Each Hub is focused on a specific high-priority goal, rapidly accelerating scientific discoveries and shortening the path from laboratory innovation to technological development and commercial deployment of critical energy technologies. Dr. Hank Foley is the director of the Greater Philadelphia Innovation Cluster for Energy-Efficient Buildings, which is pioneering new data-intensive techniques for designing and operating energy-efficient buildings, including advanced computer modeling. Dr. Douglas Kothe is the director of the Consortium for Advanced Simulation of Light Water Reactors, which uses powerful supercomputers to create "virtual" reactors that will help improve the safety and performance of both existing and new nuclear reactors. Dr. Nathan Lewis is the director of the Joint Center for Artificial Photosynthesis, which focuses on how to produce fuels from sunlight, water, and carbon dioxide.

  13. 75 FR 71463 - Dentek.Com, Inc. D/B/A Nsequence Center for Advanced Dentistry Reno, NV; Notice of Negative...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-23

    ... DEPARTMENT OF LABOR Employment and Training Administration [TA-W-73,963] Dentek.Com, Inc. D/B/A Nsequence Center for Advanced Dentistry Reno, NV; Notice of Negative Determination on Reconsideration By... applicable to workers and former workers at Dentek.com , Inc., d/b/a nSequence Center for Advanced Dentistry...

  14. An Investigation of the North Carolina Center for the Advancement of Teaching and Its Possible Influence on Beginning Teacher Retention: A Companion Dissertation

    ERIC Educational Resources Information Center

    Shook, Anna Lorraine Braverman

    2015-01-01

    An Investigation of the North Carolina Center for the Advancement of Teaching and its Influence on Beginning Teacher Retention: A Companion Dissertation. Shook, Anna, 2015. Dissertation, Gardner-Webb University, Adult Learning Theory/Adult Developmental Theory/Professional Development/Beginning Teacher/North Carolina Center for the Advancement of…

  15. Strategic Defense Initiative Demonstration/Validation Program: Environmental Assessments Summary

    DTIC Science & Technology

    1987-08-01

    Technology tests by facility, covering BSTS, SSTS, GSTS, SBI, ERIS, and BM/C3 technologies: Alabama - Advanced Research Center; California - Edwards Air Force Base, Vandenberg Air Force Base; Florida - Eglin Air Force Base, Kennedy Space Center; Maryland - Harry Diamond...

  16. Strategic Computing. New-Generation Computing Technology: A Strategic Plan for Its Development and Application to Critical Problems in Defense

    DTIC Science & Technology

    1983-10-28

    By seizing an opportunity to leverage recent advances in artificial intelligence, computer science, and microelectronics, the Agency plans... Advances have occurred in many separate areas of artificial intelligence, computer science, and microelectronics. Advances in "expert system" technology now... and expert knowledge. Advances in Artificial Intelligence: mechanization of speech recognition, vision, and natural language understanding.

  17. A Web-Based Treatment Decision Support Tool for Patients With Advanced Knee Arthritis: Evaluation of User Interface and Content Design

    PubMed Central

    Zheng, Hua; Rosal, Milagros C; Li, Wenjun; Borg, Amy; Yang, Wenyun; Ayers, David C

    2018-01-01

    Background Data-driven surgical decisions will ensure proper use and timing of surgical care. We developed a Web-based patient-centered treatment decision and assessment tool to guide treatment decisions among patients with advanced knee osteoarthritis who are considering total knee replacement surgery. Objective The aim of this study was to examine user experience and acceptance of the Web-based treatment decision support tool among older adults. Methods User-centered formative and summative evaluations were conducted for the tool. A sample of 28 patients who were considering total knee replacement participated in the study. Participants’ responses to the user interface design, the clarity of information, as well as usefulness, satisfaction, and acceptance of the tool were collected through qualitative (ie, individual patient interviews) and quantitative (ie, standardized Computer System Usability Questionnaire) methods. Results Participants were older adults with a mean age of 63 (SD 11) years. Three-quarters of them had no technical questions using the tool. User interface design recommendations included larger fonts, bigger buttons, less colors, simpler navigation without extra “next page” click, less mouse movement, and clearer illustrations with simple graphs. Color-coded bar charts and outcome-specific graphs with positive action were easiest for them to understand the outcomes data. Questionnaire data revealed high satisfaction with the tool usefulness and interface quality, and also showed ease of use of the tool, regardless of age or educational status. Conclusions We evaluated the usability of a patient-centered decision support tool designed for advanced knee arthritis patients to facilitate their knee osteoarthritis treatment decision making. The lessons learned can inform other decision support tools to improve interface and content design for older patients’ use. PMID:29712620

  18. Government Cloud Computing Policies: Potential Opportunities for Advancing Military Biomedical Research.

    PubMed

    Lebeda, Frank J; Zalatoris, Jeffrey J; Scheerer, Julia B

    2018-02-07

    This position paper summarizes the development and the present status of Department of Defense (DoD) and other government policies and guidances regarding cloud computing services. Due to the heterogeneous and growing biomedical big datasets, cloud computing services offer an opportunity to mitigate the associated storage and analysis requirements. Having on-demand network access to a shared pool of flexible computing resources creates a consolidated system that should reduce potential duplications of effort in military biomedical research. Interactive, online literature searches were performed with Google, at the Defense Technical Information Center, and at two National Institutes of Health research portfolio information sites. References cited within some of the collected documents also served as literature resources. We gathered, selected, and reviewed DoD and other government cloud computing policies and guidances published from 2009 to 2017. These policies were intended to consolidate computer resources within the government and reduce costs by decreasing the number of federal data centers and by migrating electronic data to cloud systems. Initial White House Office of Management and Budget information technology guidelines were developed for cloud usage, followed by policies and other documents from the DoD, the Defense Health Agency, and the Armed Services. Security standards from the National Institute of Standards and Technology, the Government Services Administration, the DoD, and the Army were also developed. Government Services Administration and DoD Inspectors General monitored cloud usage by the DoD. A 2016 Government Accountability Office report characterized cloud computing as being economical, flexible and fast. A congressionally mandated independent study reported that the DoD was active in offering a wide selection of commercial cloud services in addition to its milCloud system. 
Our findings from the Department of Health and Human Services indicated that the security infrastructure in cloud services may be more compliant with the Health Insurance Portability and Accountability Act of 1996 regulations than traditional methods. To gauge the DoD's adoption of cloud technologies, proposed metrics included cost factors, ease of use, automation, availability, accessibility, security, and policy compliance. Since 2009, plans and policies were developed for the use of cloud technology to help consolidate and reduce the number of data centers, which was expected to reduce costs, improve environmental factors, enhance information technology security, and maintain mission support for service members. Cloud technologies were also expected to improve employee efficiency and productivity. Federal cloud computing policies within the last decade also offered increased opportunities to advance military healthcare. It was assumed that these opportunities would benefit consumers of healthcare and health science data by allowing more access to centralized cloud computer facilities to store, analyze, search, and share relevant data, to enhance standardization, and to reduce potential duplications of effort. We recommend that cloud computing be considered by DoD biomedical researchers for increasing connectivity, presumably by facilitating communications and data sharing, among the various intra- and extramural laboratories. We also recommend that policies and other guidances be updated to include developing additional metrics that will help stakeholders evaluate the above-mentioned assumptions and expectations. Published by Oxford University Press on behalf of the Association of Military Surgeons of the United States 2018. This work is written by (a) US Government employee(s) and is in the public domain in the US.

  19. Laboratory Computing Resource Center

    Science.gov Websites


  20. Consolidating NASA's Arc Jets

    NASA Technical Reports Server (NTRS)

    Balboni, John A.; Gokcen, Tahir; Hui, Frank C. L.; Graube, Peter; Morrissey, Patricia; Lewis, Ronald

    2015-01-01

    The paper describes the consolidation of NASA's high-powered arc-jet testing at a single location. The existing plasma arc-jet wind tunnels located at the Johnson Space Center were relocated to Ames Research Center while maintaining NASA's technical capability to ground-test thermal protection system materials under simulated atmospheric entry convective heating. The testing conditions at JSC were reproduced and successfully demonstrated at ARC through close collaboration between the two centers. New equipment was installed at Ames to provide test gases of pure nitrogen mixed with pure oxygen, and for future nitrogen-carbon dioxide mixtures. A new control system was custom designed, installed, and tested. Tests demonstrated that the 10 MW constricted-segmented arc heater at Ames meets the requirements of the major customer, NASA's Orion program. Solutions from an advanced computational fluid dynamics code were used to aid in characterizing the properties of the plasma stream and the surface environment on the calorimeters in the supersonic flow stream produced by the arc heater.

  1. Distribution of man-machine controls in space teleoperation

    NASA Technical Reports Server (NTRS)

    Bejczy, A. K.

    1982-01-01

    The distribution of control between man and machine is dependent on the tasks, available technology, human performance characteristics and control goals. This dependency has very specific projections on systems designed for teleoperation in space. This paper gives a brief outline of the space-related issues and presents the results of advanced teleoperator research and development at the Jet Propulsion Laboratory (JPL). The research and development work includes smart sensors, flexible computer controls and intelligent man-machine interface devices in the area of visual displays and kinesthetic man-machine coupling in remote control of manipulators. Some of the development results have been tested at the Johnson Space Center (JSC) using the simulated full-scale Shuttle Remote Manipulator System (RMS). The research and development work for advanced space teleoperation is far from complete and poses many interdisciplinary challenges.

  2. Quantum technology past, present, future: quantum energetics (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Choi, Sang H.

    2017-04-01

    Since the development of quantum physics in the early part of the 1900s, this field of study has made remarkable contributions to our civilization. Some of these advances include lasers, light-emitting diodes (LED), sensors, spectroscopy, quantum dots, quantum gravity and quantum entanglements. In 1998, the NASA Langley Research Center established a quantum technology committee to monitor the progress in this area and initiated research to determine the potential of quantum technology for future NASA missions. The areas of interest in quantum technology at NASA included fundamental quantum-optics materials associated with quantum dots and quantum wells, device-oriented photonic crystals, smart optics, quantum conductors, quantum information and computing, teleportation theorem, and quantum energetics. A brief review of the work performed, the progress made in advancing these technologies, and the potential NASA applications of quantum technology will be presented.

  3. Role of High-End Computing in Meeting NASA's Science and Engineering Challenges

    NASA Technical Reports Server (NTRS)

    Biswas, Rupak; Tu, Eugene L.; Van Dalsem, William R.

    2006-01-01

    Two years ago, NASA was on the verge of dramatically increasing its HEC capability and capacity. With the 10,240-processor supercomputer, Columbia, now in production for 18 months, HEC has an even greater impact within the Agency, extending to partner institutions. Advanced science and engineering simulations in space exploration, shuttle operations, Earth sciences, and fundamental aeronautics research are occurring on Columbia, demonstrating its ability to accelerate NASA's exploration vision. This talk describes how the integrated production environment fostered at the NASA Advanced Supercomputing (NAS) facility at Ames Research Center is accelerating scientific discovery, achieving parametric analyses of multiple scenarios, and enhancing safety for NASA missions. We focus on Columbia's impact on two key engineering and science disciplines: aerospace and climate. We also discuss future mission challenges and plans for NASA's next-generation HEC environment.

  4. The Joint Space Operations Center Mission System and the Advanced Research, Collaboration, and Application Development Environment Status Update 2016

    NASA Astrophysics Data System (ADS)

    Murray-Krezan, Jeremy; Howard, Samantha; Sabol, Chris; Kim, Richard; Echeverry, Juan

    2016-05-01

    The Joint Space Operations Center (JSpOC) Mission System (JMS) is a service-oriented architecture (SOA) infrastructure with increased process automation and improved tools to enhance Space Situational Awareness (SSA) performed at the US-led JSpOC. The Advanced Research, Collaboration, and Application Development Environment (ARCADE) is a test-bed maintained and operated by the Air Force to (1) serve as a centralized test-bed for all research and development activities related to JMS applications, including algorithm development, data source exposure, service orchestration, and software services, and provide developers reciprocal access to relevant tools and data to accelerate technology development, (2) allow the JMS program to communicate user capability priorities and requirements to developers, (3) provide the JMS program with access to state-of-the-art research, development, and computing capabilities, and (4) support JMS Program Office-led market research efforts by identifying outstanding performers that are available to shepherd into the formal transition process. In this paper we share with the international remote sensing community some of the recent JMS and ARCADE developments that may contribute to greater SSA at the JSpOC in the future, and we highlight technical areas still in need of further development.

  5. Advances in the Remote Monitoring of Balloon Flights

    NASA Astrophysics Data System (ADS)

    Breeding, S.

    At the National Scientific Balloon Facility (NSBF), we must staff the Long Duration Balloon (LDB) control center 24 hours a day during LDB flights. This requires three daily shifts of two operators (balloon control and TDRSS scheduling). In addition, we also have one engineer on call as LDB Lead to resolve technical issues and one manager on call for flight management. These on-call periods are typically 48 to 72 hours in length. In the past, the on-call staff had to travel to the LDB control center in order to monitor the status of a flight in any detail. This becomes problematic as flight durations push beyond 20 to 30 days, as these staff members are not available for business travel during these periods. This paper describes recent advances which allow for the remote monitoring of scientific balloon flight ground station computer displays. This allows balloon flight managers and lead engineers to check flight status and performance from any location with a network or telephone connection. This capability frees key personnel from the NSBF base during flights. It also allows other interested parties to check on the flight status at their convenience.

  6. Motion Analysis System for Instruction of Nihon Buyo using Motion Capture

    NASA Astrophysics Data System (ADS)

    Shinoda, Yukitaka; Murakami, Shingo; Watanabe, Yuta; Mito, Yuki; Watanuma, Reishi; Marumo, Mieko

    The passing on and preserving of advanced technical skills has become an important issue in a variety of fields, and motion analysis using motion capture has recently become popular in the research of advanced physical skills. This research aims to construct a system having a high on-site instructional effect on dancers learning Nihon Buyo, a traditional dance in Japan, and to classify Nihon Buyo dancing according to style, school, and dancer's proficiency by motion analysis. We have been able to study motion analysis systems for teaching Nihon Buyo now that body-motion data can be digitized and stored by motion capture systems using high-performance computers. Thus, with the aim of developing a user-friendly instruction-support system, we have constructed a motion analysis system that displays a dancer's time series of body motions and center of gravity for instructional purposes. In this paper, we outline this instructional motion analysis system based on three-dimensional position data obtained by motion capture. We also describe motion analysis that we performed based on center-of-gravity data obtained by this system and motion analysis focusing on school and age group using this system.
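    The center-of-gravity display described above can be sketched in outline. The following is a minimal illustration, not the paper's implementation: it estimates a whole-body center of gravity as a mass-weighted mean of segment positions, with segment names and mass fractions chosen purely for illustration.

    ```python
    # Hypothetical sketch: whole-body center-of-gravity estimate from
    # motion-capture segment positions. Segment names and mass fractions
    # below are illustrative assumptions, not the paper's values.

    def center_of_gravity(segment_positions, mass_fractions):
        """Mass-weighted mean of segment center-of-mass positions.

        segment_positions: dict name -> (x, y, z) in meters
        mass_fractions:    dict name -> fraction of body mass
        """
        total = sum(mass_fractions[s] for s in segment_positions)
        cog = [0.0, 0.0, 0.0]
        for name, pos in segment_positions.items():
            w = mass_fractions[name] / total  # normalize in case fractions
            for i in range(3):                # do not sum exactly to 1
                cog[i] += w * pos[i]
        return tuple(cog)

    positions = {"trunk": (0.0, 0.0, 1.1), "head": (0.0, 0.05, 1.6),
                 "legs": (0.0, -0.02, 0.5)}
    fractions = {"trunk": 0.5, "head": 0.08, "legs": 0.42}
    print(center_of_gravity(positions, fractions))
    ```

    Tracking this point frame by frame over the captured time series gives the kind of center-of-gravity trajectory the instructional display would plot.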

  7. Advanced Communications Technology Satellite high burst rate link evaluation terminal experiment control and monitor software maintenance manual, version 1.0

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.

    1992-01-01

    The Experiment Control and Monitor (EC&M) software was developed at NASA Lewis Research Center to support the Advanced Communications Technology Satellite (ACTS) High Burst Rate Link Evaluation Terminal (HBR-LET). The HBR-LET is an experimenter's terminal to communicate with the ACTS for various investigations by government agencies, universities, and industry. The EC&M software is one segment of the Control and Performance Monitoring (C&PM) software system of the HBR-LET. The EC&M software allows users to initialize, control, and monitor the instrumentation within the HBR-LET using a predefined sequence of commands. Besides instrument control, the C&PM software system is also responsible for computer communication between the HBR-LET and the ACTS NASA Ground Station and for uplink power control of the HBR-LET to demonstrate power augmentation during rain fade events. The EC&M Software User's Guide, Version 1.0 (NASA-CR-189160) outlines the commands required to install and operate the EC&M software. Input and output file descriptions, operator commands, and error recovery procedures are discussed in the document. The EC&M Software Maintenance Manual, Version 1.0 (NASA-CR-189161) is a programmer's guide that describes current implementation of the EC&M software from a technical perspective. An overview of the EC&M software, computer algorithms, format representation, and computer hardware configuration are included in the manual.

  8. Advanced Avionics and Processor Systems for a Flexible Space Exploration Architecture

    NASA Technical Reports Server (NTRS)

    Keys, Andrew S.; Adams, James H.; Smith, Leigh M.; Johnson, Michael A.; Cressler, John D.

    2010-01-01

    The Advanced Avionics and Processor Systems (AAPS) project, formerly known as the Radiation Hardened Electronics for Space Environments (RHESE) project, endeavors to develop advanced avionic and processor technologies anticipated to be used by NASA's currently evolving space exploration architectures. The AAPS project is a part of the Exploration Technology Development Program, which funds an entire suite of technologies aimed at enabling NASA's ability to explore beyond low Earth orbit. NASA's Marshall Space Flight Center (MSFC) manages the AAPS project. AAPS uses a broad-scoped approach to developing avionic and processor systems. Investment areas include advanced electronic designs and technologies capable of providing environmental hardness, reconfigurable computing techniques, software tools for radiation effects assessment, and radiation environment modeling tools. Near-term emphasis within the multiple AAPS tasks focuses on developing prototype components using semiconductor processes and materials (such as silicon-germanium (SiGe)) to enhance a device's tolerance to radiation events and low-temperature environments. As the SiGe technology will culminate in a delivered prototype this fiscal year, the project is shifting its focus to developing low-power, high-efficiency total processor hardening techniques. In addition to processor development, the project endeavors to demonstrate techniques applicable to reconfigurable computing and partially reconfigurable Field Programmable Gate Arrays (FPGAs). This capability enables FPGA-based, radiation-tolerant processor boards that can serve in multiple physical locations throughout the spacecraft and perform multiple functions during the course of the mission. The individual tasks that comprise AAPS are diverse, yet united in the common endeavor to develop electronics capable of operating within the harsh environment of space. Specifically, the AAPS tasks for the Federal fiscal year of 2010 are: Silicon-Germanium (SiGe) Integrated Electronics for Extreme Environments; Modeling of Radiation Effects on Electronics; Radiation Hardened High Performance Processors (HPP); and Reconfigurable Computing.

  9. Alternative Fuels Data Center: Buying and Selling Pre-Owned Alternative

    Science.gov Websites

    Alternative Fuels Data Center: Buying and Selling Pre-Owned Alternative Fuel and Advanced Vehicles

  10. CD-ROM technology at the EROS data center

    USGS Publications Warehouse

    Madigan, Michael E.; Weinheimer, Mary C.

    1993-01-01

    The vast amount of digital spatial data often required by a single user has created a demand for media alternatives to 1/2" magnetic tape. One such medium recently adopted at the U.S. Geological Survey's EROS Data Center is the compact disc (CD). CDs are a versatile, dynamic, and low-cost method for providing a variety of data on a single medium and are compatible with various computer platforms. CD drives are available for personal computers, UNIX workstations, and mainframe systems, either directly connected or through a network. This medium furnishes a quick method of reproducing and distributing large amounts of data on a single CD. Several data sets are already available on CD, including collections of historical Landsat multispectral scanner data and biweekly composites of Advanced Very High Resolution Radiometer data for the conterminous United States. The EROS Data Center intends to provide even more data sets on CD. Plans include specific data sets on a customized disc to fulfill individual requests, and mass production of unique data sets for large-scale distribution. Requests for a single compact disc-read only memory (CD-ROM) containing a large volume of data, either for archiving or for one-time distribution, can be addressed with a CD-write once (CD-WO) unit. Mass production and large-scale distribution will require CD-ROM mastering and replication.

  11. The Development of University Computing in Sweden 1965-1985

    NASA Astrophysics Data System (ADS)

    Dahlstrand, Ingemar

    In 1965-70 the government agency, Statskontoret, set up five university computing centers, as service bureaux financed by grants earmarked for computer use. The centers were well equipped and staffed and caused a surge in computer use. When the yearly flow of grant money stagnated at 25 million Swedish crowns, the centers had to find external income to survive and acquire time-sharing. But the charging system led to the computers not being fully used. The computer scientists lacked equipment for laboratory use. The centers were decentralized and the earmarking abolished. Eventually they got new tasks like running computers owned by the departments, and serving the university administration.

  12. Image2000: A Free, Innovative, Java Based Imaging Package

    NASA Technical Reports Server (NTRS)

    Pell, Nicholas; Wheeler, Phil; Cornwell, Carl; Matusow, David; Obenschain, Arthur F. (Technical Monitor)

    2001-01-01

    The National Aeronautics and Space Administration (NASA) Goddard Space Flight Center's (GSFC) Scientific and Educational Endeavors (SEE) and the Center for Image Processing in Education (CIPE) use satellite image processing as part of their science lessons developed for students and educators. The image processing products that they use as part of these lessons no longer fulfill the needs of SEE and CIPE because these products are either dependent on a particular computing platform, hard to customize and extend, or lacking in functionality. SEE and CIPE began looking for what they considered the "perfect" image processing tool: one that was platform independent, rich in functionality, and easily extended and customized for their purposes. At the request of SEE, the Advanced Architectures and Automation Branch (Code 588) at NASA's GSFC developed Image2000, a powerful new Java-based image processing package.

  13. A global spacecraft control network for spacecraft autonomy research

    NASA Technical Reports Server (NTRS)

    Kitts, Christopher A.

    1996-01-01

    The development and implementation of the Automated Space System Experimental Testbed (ASSET) space operations and control network is reported on. This network will serve as a command and control architecture for spacecraft operations and will offer a real testbed for the application and validation of advanced autonomous spacecraft operations strategies. The proposed network will initially consist of globally distributed amateur radio ground stations at locations throughout North America and Europe. These stations will be linked via the Internet to various control centers. The Stanford (CA) control center will be capable of human- and computer-based decision making for the coordination of user experiments, resource scheduling, and fault management. The project's system architecture is described together with its proposed use as a command and control system, its value as a testbed for spacecraft autonomy research, and its current implementation.

  14. Time-Shifted Boundary Conditions Used for Navier-Stokes Aeroelastic Solver

    NASA Technical Reports Server (NTRS)

    Srivastava, Rakesh

    1999-01-01

    Under the Advanced Subsonic Technology (AST) Program, an aeroelastic analysis code (TURBO-AE) based on the Navier-Stokes equations is currently under development at NASA Lewis Research Center's Machine Dynamics Branch. For a blade row, aeroelastic instability can occur at any of the possible interblade phase angles (IBPAs). Analyzing small IBPAs is very computationally expensive because a large number of blade passages must be simulated. To reduce the computational cost of these analyses, we used time-shifted, or phase-lagged, boundary conditions in the TURBO-AE code. These conditions reduce the computational domain to a single blade passage by requiring the boundary conditions across the passage to be lagged according to the IBPA being analyzed. The time-shifted boundary conditions currently implemented are based on the direct-store method. This method requires large amounts of data to be stored over a period of the oscillation cycle. On CRAY computers this is not a major problem because solid-state devices can be used for fast input and output, allowing the data to be read from and written to disk instead of being stored in core memory.
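    The direct-store idea lends itself to a short sketch. The following is a hypothetical illustration, not the TURBO-AE implementation: boundary values from the single simulated passage are saved over one oscillation period in a ring buffer, and the periodic boundary is then fed those values shifted by the interblade phase angle.

    ```python
    import math

    # Hypothetical sketch of the direct-store phase-lag boundary condition:
    # store one oscillation period of boundary data, then read it back with
    # a time shift proportional to the interblade phase angle (IBPA).

    def lag_steps(ibpa_deg, steps_per_period):
        """Time-step offset corresponding to a given IBPA."""
        return int(round(ibpa_deg / 360.0 * steps_per_period))

    class DirectStoreBoundary:
        def __init__(self, steps_per_period, ibpa_deg):
            self.buffer = [None] * steps_per_period  # one period of data
            self.lag = lag_steps(ibpa_deg, steps_per_period)
            self.n = steps_per_period

        def store(self, step, value):
            self.buffer[step % self.n] = value

        def lagged(self, step):
            # value the neighboring passage saw `lag` steps earlier
            return self.buffer[(step - self.lag) % self.n]

    bc = DirectStoreBoundary(steps_per_period=360, ibpa_deg=90)
    for k in range(360):  # fill one period with a model oscillation
        bc.store(k, math.sin(2 * math.pi * k / 360))
    print(bc.lagged(90))  # reads the value stored at step 0
    ```

    The storage cost noted in the abstract is visible here: the buffer must hold a full oscillation period of boundary data for every boundary point.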

  15. A portable fNIRS system with eight channels

    NASA Astrophysics Data System (ADS)

    Si, Juanning; Zhao, Ruirui; Zhang, Yujin; Zuo, Nianming; Zhang, Xin; Jiang, Tianzi

    2015-03-01

    Extensive study of the brain's hemodynamic response has driven advances in the technologies used to measure it, and functional near-infrared spectroscopy (fNIRS) has benefited most. A variety of devices have been developed for different applications. Because portable fNIRS systems are better suited to measuring responses from special subject populations or in natural environments, several kinds of portable fNIRS systems have been reported. However, they have all required a separate computer for receiving data. The extra computer increases the cost of an fNIRS system and, more noticeably, the space required for it undermines the portability of the system. We therefore designed a self-contained eight-channel fNIRS system that does not require an external computer to receive and display data. Instead, the system is built around an ARM-core CPU, which organizes and saves the data and displays it on a touch screen. The system has been validated by experiments on phantoms and on subjects performing tasks.
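    As background (not a description of this particular device), fNIRS systems commonly convert measured optical-density changes at two wavelengths into hemoglobin concentration changes via the modified Beer-Lambert law. A minimal sketch, with placeholder extinction coefficients:

    ```python
    # Background sketch only: modified Beer-Lambert conversion from
    # optical-density changes to hemoglobin concentration changes.
    # Extinction coefficients below are illustrative placeholders,
    # not calibrated values from this system.

    def mbll(dOD, eps, distance_cm, dpf):
        """Solve a 2x2 system for (dHbO, dHbR) from two wavelengths.

        dOD: [dOD_w1, dOD_w2] optical-density changes
        eps: [[e_HbO_w1, e_HbR_w1], [e_HbO_w2, e_HbR_w2]]
        dpf: differential pathlength factor
        """
        L = distance_cm * dpf            # effective optical path length
        a, b = eps[0][0] * L, eps[0][1] * L
        c, d = eps[1][0] * L, eps[1][1] * L
        det = a * d - b * c
        dHbO = (d * dOD[0] - b * dOD[1]) / det
        dHbR = (-c * dOD[0] + a * dOD[1]) / det
        return dHbO, dHbR

    eps = [[1.5, 3.8], [2.5, 1.8]]  # placeholder extinction coefficients
    print(mbll([0.01, 0.02], eps, distance_cm=3.0, dpf=6.0))
    ```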

  16. Binary Black Holes, Gravitational Waves, and Numerical Relativity

    NASA Technical Reports Server (NTRS)

    Centrella, Joan

    2007-01-01

    Massive black hole (MBH) binaries are found at the centers of most galaxies. MBH mergers trace galaxy mergers and are strong sources of gravitational waves. Observing these sources with gravitational wave detectors requires that we know the radiation waveforms they emit. Since these mergers take place in regions of very strong gravitational fields, we need to solve Einstein's equations of general relativity on a computer in order to calculate these waveforms. For more than 30 years, scientists have tried to compute these waveforms using the methods of numerical relativity. The resulting computer codes were plagued by instabilities, causing them to crash well before the black holes in the binary could complete even a single orbit. Recently this situation has changed dramatically, with a series of amazing breakthroughs. This presentation shows how a spacetime is constructed on a computer to build a simulation laboratory for binary black hole mergers. The focus is on the recent advances that reveal these waveforms, and the potential for discoveries that arises when these sources are observed by LIGO and LISA.
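    As a toy analogy only (vastly simpler than Einstein's equations), the core idea of evolving a field on a discretized spacetime can be illustrated with a finite-difference scheme for the 1D wave equation:

    ```python
    import math

    # Toy analogy: numerical relativity evolves Einstein's equations on a
    # discretized spacetime. The same finite-difference idea, in its
    # simplest form, evolves u_tt = c^2 u_xx on a periodic 1D grid.

    def step_wave(u_prev, u, c, dt, dx):
        """One leapfrog time step of the 1D wave equation."""
        n = len(u)
        r2 = (c * dt / dx) ** 2  # must satisfy c*dt/dx <= 1 for stability
        u_next = [0.0] * n
        for i in range(n):
            lap = u[(i + 1) % n] - 2 * u[i] + u[(i - 1) % n]
            u_next[i] = 2 * u[i] - u_prev[i] + r2 * lap
        return u_next

    n, dx, dt, c = 64, 1.0 / 64, 0.005, 1.0
    u0 = [math.sin(2 * math.pi * i / n) for i in range(n)]
    u_prev, u = u0[:], u0[:]  # start from rest (zero initial velocity)
    for _ in range(100):
        u_prev, u = u, step_wave(u_prev, u, c, dt, dx)
    print(max(abs(v) for v in u))  # amplitude stays bounded (stable scheme)
    ```

    The instabilities mentioned in the abstract are the strong-field, nonlinear analogue of what happens to this toy scheme when the stability condition is violated: errors grow without bound and the evolution "crashes".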

  17. 78 FR 41046 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-09

    ... Services Administration, notice is hereby given that the Advanced Scientific Computing Advisory Committee will be renewed for a two-year period beginning on July 1, 2013. The Committee will provide advice to the Director, Office of Science (DOE), on the Advanced Scientific Computing Research Program managed...

  18. Computational toxicology as implemented by the U.S. EPA: providing high throughput decision support tools for screening and assessing chemical exposure, hazard and risk.

    PubMed

    Kavlock, Robert; Dix, David

    2010-02-01

    Computational toxicology is the application of mathematical and computer models to help assess chemical hazards and risks to human health and the environment. Supported by advances in informatics, high-throughput screening (HTS) technologies, and systems biology, the U.S. Environmental Protection Agency (EPA) is developing robust and flexible computational tools that can be applied to the thousands of chemicals in commerce, and contaminant mixtures found in air, water, and hazardous-waste sites. The Office of Research and Development (ORD) Computational Toxicology Research Program (CTRP) is composed of three main elements. The largest component is the National Center for Computational Toxicology (NCCT), which was established in 2005 to coordinate research on chemical screening and prioritization, informatics, and systems modeling. The second element consists of related activities in the National Health and Environmental Effects Research Laboratory (NHEERL) and the National Exposure Research Laboratory (NERL). The third and final component consists of academic centers working on various aspects of computational toxicology and funded by the U.S. EPA Science to Achieve Results (STAR) program. Together these elements form the key components in the implementation of both the initial strategy, A Framework for a Computational Toxicology Research Program (U.S. EPA, 2003), and the newly released The U.S. Environmental Protection Agency's Strategic Plan for Evaluating the Toxicity of Chemicals (U.S. EPA, 2009a). Key intramural projects of the CTRP include digitizing legacy toxicity testing information into the toxicity reference database (ToxRefDB), predicting toxicity (ToxCast) and exposure (ExpoCast), and creating virtual liver (v-Liver) and virtual embryo (v-Embryo) systems models. U.S. EPA-funded STAR centers are also providing bioinformatics, computational toxicology data and models, and developmental toxicity data and models.
The models and underlying data are being made publicly available through the Aggregated Computational Toxicology Resource (ACToR), the Distributed Structure-Searchable Toxicity (DSSTox) Database Network, and other U.S. EPA websites. While initially focused on improving the hazard identification process, the CTRP is placing increasing emphasis on using high-throughput bioactivity profiling data in systems modeling to support quantitative risk assessments, and in developing complementary higher throughput exposure models. This integrated approach will enable analysis of life-stage susceptibility, and understanding of the exposures, pathways, and key events by which chemicals exert their toxicity in developing systems (e.g., endocrine-related pathways). The CTRP will be a critical component in next-generation risk assessments utilizing quantitative high-throughput data and providing a much higher capacity for assessing chemical toxicity than is currently available.

  19. Systems Toxicology: Real World Applications and Opportunities.

    PubMed

    Hartung, Thomas; FitzGerald, Rex E; Jennings, Paul; Mirams, Gary R; Peitsch, Manuel C; Rostami-Hodjegan, Amin; Shah, Imran; Wilks, Martin F; Sturla, Shana J

    2017-04-17

    Systems Toxicology aims to change the basis of how adverse biological effects of xenobiotics are characterized from empirical end points to describing modes of action as adverse outcome pathways and perturbed networks. Toward this aim, Systems Toxicology entails the integration of in vitro and in vivo toxicity data with computational modeling. This evolving approach depends critically on data reliability and relevance, which in turn depends on the quality of experimental models and bioanalysis techniques used to generate toxicological data. Systems Toxicology involves the use of large-scale data streams ("big data"), such as those derived from omics measurements that require computational means for obtaining informative results. Thus, integrative analysis of multiple molecular measurements, particularly acquired by omics strategies, is a key approach in Systems Toxicology. In recent years, there have been significant advances centered on in vitro test systems and bioanalytical strategies, yet a frontier challenge concerns linking observed network perturbations to phenotypes, which will require understanding pathways and networks that give rise to adverse responses. This summary perspective from a 2016 Systems Toxicology meeting, an international conference held in the Alps of Switzerland, describes the limitations and opportunities of selected emerging applications in this rapidly advancing field. Systems Toxicology aims to change the basis of how adverse biological effects of xenobiotics are characterized, from empirical end points to pathways of toxicity. This requires the integration of in vitro and in vivo data with computational modeling. Test systems and bioanalytical technologies have made significant advances, but ensuring data reliability and relevance is an ongoing concern. The major challenge facing the new pathway approach is determining how to link observed network perturbations to phenotypic toxicity.

  20. Systems Toxicology: Real World Applications and Opportunities

    PubMed Central

    2017-01-01

    Systems Toxicology aims to change the basis of how adverse biological effects of xenobiotics are characterized from empirical end points to describing modes of action as adverse outcome pathways and perturbed networks. Toward this aim, Systems Toxicology entails the integration of in vitro and in vivo toxicity data with computational modeling. This evolving approach depends critically on data reliability and relevance, which in turn depends on the quality of experimental models and bioanalysis techniques used to generate toxicological data. Systems Toxicology involves the use of large-scale data streams (“big data”), such as those derived from omics measurements that require computational means for obtaining informative results. Thus, integrative analysis of multiple molecular measurements, particularly acquired by omics strategies, is a key approach in Systems Toxicology. In recent years, there have been significant advances centered on in vitro test systems and bioanalytical strategies, yet a frontier challenge concerns linking observed network perturbations to phenotypes, which will require understanding pathways and networks that give rise to adverse responses. This summary perspective from a 2016 Systems Toxicology meeting, an international conference held in the Alps of Switzerland, describes the limitations and opportunities of selected emerging applications in this rapidly advancing field. Systems Toxicology aims to change the basis of how adverse biological effects of xenobiotics are characterized, from empirical end points to pathways of toxicity. This requires the integration of in vitro and in vivo data with computational modeling. Test systems and bioanalytical technologies have made significant advances, but ensuring data reliability and relevance is an ongoing concern. The major challenge facing the new pathway approach is determining how to link observed network perturbations to phenotypic toxicity. PMID:28362102

  1. Center for Advanced Space Propulsion

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The Center for Advanced Space Propulsion (CASP) is part of the University of Tennessee-Calspan Center for Aerospace Research (CAR). It was formed in 1985 to take advantage of the extensive research faculty and staff of the University of Tennessee and Calspan Corporation. It is also one of sixteen NASA-sponsored Centers established to facilitate the Commercial Development of Space. Based on the investigators' qualifications in propulsion system development and on strong industry interest, the Center focused its efforts in the following technical areas: advanced chemical propulsion, electric propulsion, AI/expert systems, fluids management in microgravity, and propulsion materials processing. This annual report focuses its discussion on these technical areas.

  2. Reducing the Time and Cost of Testing Engines

    NASA Technical Reports Server (NTRS)

    2004-01-01

    Producing a new aircraft engine currently costs approximately $1 billion, with 3 years of development time for a commercial engine and 10 years for a military engine. The high development time and cost make it extremely difficult to transition advanced technologies for cleaner, quieter, and more efficient new engines. To reduce this time and cost, NASA created a vision for the future where designers would use high-fidelity computer simulations early in the design process in order to resolve critical design issues before building the expensive engine hardware. To accomplish this vision, NASA's Glenn Research Center initiated a collaborative effort with the aerospace industry and academia to develop its Numerical Propulsion System Simulation (NPSS), an advanced engineering environment for the analysis and design of aerospace propulsion systems and components. Partners estimate that using NPSS has the potential to dramatically reduce the time, effort, and expense necessary to design and test jet engines by generating sophisticated computer simulations of an aerospace object or system. These simulations permit an engineer to test various design options without having to conduct costly and time-consuming real-life tests. By accelerating and streamlining the engine system design analysis and test phases, NPSS facilitates bringing the final product to market faster. NASA's NPSS Version (V)1.X effort was a task within the Agency's Computational Aerospace Sciences project of the High Performance Computing and Communication program, which had a mission to accelerate the availability of high-performance computing hardware and software to the U.S. aerospace community for its use in design processes. The technology brings value back to NASA by improving methods of analyzing and testing space transportation components.

  3. Parallel Architectures for Planetary Exploration Requirements (PAPER)

    NASA Technical Reports Server (NTRS)

    Cezzar, Ruknet; Sen, Ranjan K.

    1989-01-01

    The Parallel Architectures for Planetary Exploration Requirements (PAPER) project is essentially research oriented towards technology insertion issues for NASA's unmanned planetary probes. It was initiated to complement and augment the long-term efforts for space exploration, with particular reference to NASA Langley Research Center's (NASA/LaRC) research needs for planetary exploration missions of the mid and late 1990s. The requirements for space missions as given in the somewhat dated Advanced Information Processing Systems (AIPS) requirements document are contrasted with the new requirements from JPL/Caltech involving sensor data capture and scene analysis. It is shown that more stringent requirements have arisen as a result of technological advancements. Two possible architectures, the AIPS Proof of Concept (POC) configuration and the MAX fault-tolerant dataflow multiprocessor, were evaluated. The main observation was that the AIPS design is biased towards fault tolerance and may not be an ideal architecture for planetary and deep space probes due to high cost and complexity. The MAX concept appears to be a promising candidate, except that more detailed information is required. The feasibility of adding neural computation capability to this architecture needs to be studied. Key impact issues for the architectural design of computing systems meant for planetary missions were also identified.

  4. Developing an Advanced Environment for Collaborative Computing

    NASA Technical Reports Server (NTRS)

    Becerra-Fernandez, Irma; Stewart, Helen; DelAlto, Martha; DelAlto, Martha; Knight, Chris

    1999-01-01

    Knowledge management, in general, tries to organize important know-how and make it available whenever and wherever it is needed. Today, organizations rely on decision-makers to produce "mission critical" decisions that are based on inputs from multiple domains. The ideal decision-maker has a profound understanding of the specific domains that influence the decision-making process, coupled with the experience to act quickly and decisively on the information. In addition, learning companies benefit by not repeating costly mistakes and by reducing time-to-market in research and development projects. Group decision-making tools can help companies make better decisions by capturing the knowledge from groups of experts. Furthermore, companies that capture their customers' preferences can improve their customer service, which translates to larger profits. Collaborative computing therefore provides a common communication space, improves the sharing of knowledge, provides a mechanism for real-time feedback on the tasks being performed, helps to optimize processes, and results in a centralized knowledge warehouse. This paper presents the research directions of a project which seeks to augment an advanced collaborative web-based environment called Postdoc with workflow capabilities. Postdoc is a "government-off-the-shelf" document management software package developed at NASA Ames Research Center (ARC).

  5. Aerospace Engineering Systems and the Advanced Design Technologies Testbed Experience

    NASA Technical Reports Server (NTRS)

    VanDalsem, William R.; Livingston, Mary E.; Melton, John E.; Torres, Francisco J.; Stremel, Paul M.

    1999-01-01

    Continuous improvement of aerospace product development processes is a driving requirement across much of the aerospace community. As up to 90% of the cost of an aerospace product is committed during the first 10% of the development cycle, there is a strong emphasis on capturing, creating, and communicating better information (both requirements and performance) early in the product development process. The community has responded by pursuing the development of computer-based systems designed to enhance the decision-making capabilities of product development individuals and teams. Recently, the historical foci on sharing the geometrical representation and on configuration management are being augmented: 1) Physics-based analysis tools for filling the design space database; 2) Distributed computational resources to reduce response time and cost; 3) Web-based technologies to relieve machine-dependence; and 4) Artificial intelligence technologies to accelerate processes and reduce process variability. The Advanced Design Technologies Testbed (ADTT) activity at NASA Ames Research Center was initiated to study the strengths and weaknesses of the technologies supporting each of these trends, as well as the overall impact of the combination of these trends on a product development event. Lessons learned and recommendations for future activities are reported.

  6. Theoretical Comparison Between Candidates for Dark Matter

    NASA Astrophysics Data System (ADS)

    McKeough, James; Hira, Ajit; Valdez, Alexandra

    2017-01-01

    Since the generally accepted view among astrophysicists is that the matter component of the universe is mostly dark matter, the search for dark matter particles continues unabated. The Large Underground Xenon (LUX) improvements, aided by advanced computer simulations at the U.S. Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab) National Energy Research Scientific Computing Center (NERSC) and Brown University's Center for Computation and Visualization (CCV), can potentially eliminate some particle models of dark matter. Generally, the proposed candidates fall into three categories: baryonic dark matter, hot dark matter, and cold dark matter. The Lightest Supersymmetric Particle (LSP) of supersymmetric models is a dark matter candidate and is classified as a Weakly Interacting Massive Particle (WIMP). Similar to the cosmic microwave background radiation left over from the Big Bang, there is a background of low-energy neutrinos in our Universe. According to some researchers, these may be the explanation for the dark matter. One advantage of the neutrino model is that neutrinos are known to exist. Dark matter made from neutrinos is termed ``hot dark matter''. We formulate a novel empirical function for the average density profile of cosmic voids, identified via the watershed technique in ΛCDM N-body simulations. This function adequately treats both void size and redshift, and describes the scale radius and the central density of voids. We started with a five-parameter model. Our research focuses mainly on the LSP and neutrino models.
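    The authors' five-parameter function is not given in the abstract. As an illustration only, a widely used empirical void density profile of this general kind from the literature (a Hamaus-Sutter-Wandelt-style form, not necessarily the function formulated here) looks like:

    ```python
    # Illustrative only: an empirical void density-contrast profile of the
    # general form used in the void literature. Parameter values below are
    # made up for demonstration, not fitted to any simulation.

    def void_density_contrast(r, delta_c, r_s, R_v, alpha, beta):
        """rho(r)/rho_mean - 1 for a void of effective radius R_v.

        delta_c: central density contrast (negative for an underdensity)
        r_s:     scale radius where the profile crosses the mean density
        alpha, beta: inner and outer slope parameters
        """
        return delta_c * (1.0 - (r / r_s) ** alpha) / (1.0 + (r / R_v) ** beta)

    # Deep underdensity at the void center, approaching the cosmic mean
    # density (contrast -> 0) far outside the void:
    print(void_density_contrast(0.01, delta_c=-0.9, r_s=0.8, R_v=1.0,
                                alpha=2.0, beta=8.0))
    ```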

  7. Computational fluid dynamics assessment: Volume 1, Computer simulations of the METC (Morgantown Energy Technology Center) entrained-flow gasifier: Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Celik, I.; Chattree, M.

    1988-07-01

    An assessment of the theoretical and numerical aspects of the computer code PCGC-2 is made, and the results of the application of this code to the Morgantown Energy Technology Center (METC) advanced gasification facility entrained-flow reactor, "the gasifier," are presented. PCGC-2 is a code suitable for simulating pulverized coal combustion or gasification under axisymmetric (two-dimensional) flow conditions. The governing equations for the gas and particulate phases have been reviewed. The numerical procedure and the related programming difficulties have been elucidated. A single-particle model similar to the one used in PCGC-2 has been developed, programmed, and applied to some simple situations in order to gain insight into the physics of coal particle heat-up, devolatilization, and char oxidation processes. PCGC-2 was applied to the METC entrained-flow gasifier to study numerically the flash pyrolysis of coal and the gasification of coal with steam or carbon dioxide. The results from the simulations are compared with measurements. The gas and particle residence times, particle temperature, and mass component history were also calculated and the results were analyzed. The results provide useful information for understanding the fundamentals of coal gasification and for assessing experimental results obtained using this reactor. 69 refs., 35 figs., 23 tabs.

  8. Data communication network at the ASRM facility

    NASA Astrophysics Data System (ADS)

    Moorhead, Robert J., II; Smith, Wayne D.

    1993-08-01

    This report describes the simulation of the overall communication network structure for the Advanced Solid Rocket Motor (ASRM) facility under construction at Yellow Creek near Iuka, Mississippi. The report is compiled using information received from NASA/MSFC, LMSC, AAD, and RUST Inc. Per the information gathered, the overall network structure will have one logical FDDI ring acting as a backbone for the whole complex. The buildings will be grouped into two categories: manufacturing-intensive and manufacturing non-intensive. The manufacturing-intensive buildings will be connected via FDDI to the Operational Information System (OIS) in the main computing center in B_1000. The manufacturing non-intensive buildings will be connected by 10BASE-FL to the OIS through the Business Information System (BIS) hub in the main computing center. All the devices inside B_1000 will communicate with the BIS. The workcells will be connected to the Area Supervisory Computers (ASCs) through the nearest manufacturing-intensive hub and one of the OIS hubs. Comdisco's Block Oriented Network Simulator (BONeS) has been used to simulate the performance of the network. BONeS models a network topology, traffic, data structures, and protocol functions using a graphical interface. The main aim of the simulations was to evaluate the loading of the OIS, the BIS, the ASCs, and the network links by the traffic generated by the workstations and workcells throughout the site.

  9. Data communication network at the ASRM facility

    NASA Technical Reports Server (NTRS)

    Moorhead, Robert J., II; Smith, Wayne D.

    1993-01-01

    This report describes the simulation of the overall communication network structure for the Advanced Solid Rocket Motor (ASRM) facility under construction at Yellow Creek near Iuka, Mississippi. The report is compiled using information received from NASA/MSFC, LMSC, AAD, and RUST Inc. Per the information gathered, the overall network structure will have one logical FDDI ring acting as a backbone for the whole complex. The buildings will be grouped into two categories: manufacturing-intensive and manufacturing non-intensive. The manufacturing-intensive buildings will be connected via FDDI to the Operational Information System (OIS) in the main computing center in B_1000. The manufacturing non-intensive buildings will be connected by 10BASE-FL to the OIS through the Business Information System (BIS) hub in the main computing center. All the devices inside B_1000 will communicate with the BIS. The workcells will be connected to the Area Supervisory Computers (ASCs) through the nearest manufacturing-intensive hub and one of the OIS hubs. Comdisco's Block Oriented Network Simulator (BONeS) has been used to simulate the performance of the network. BONeS models a network topology, traffic, data structures, and protocol functions using a graphical interface. The main aim of the simulations was to evaluate the loading of the OIS, the BIS, the ASCs, and the network links by the traffic generated by the workstations and workcells throughout the site.

  10. Implementation of an advanced clinical and administrative hospital information system.

    PubMed

    Vegoda, P R; Dyro, J F

    1986-01-01

    In the six years since University Hospital opened, the University Hospital Information System (UHIS) has continued to evolve into what is today an advanced administrative and clinical information system. At University Hospital, UHIS is the way of conducting business. A wide range of patient care applications are operational, including Patient Registration, ADT for Inpatient/Outpatient/Emergency Room visits, Advanced Order Entry/Result Reporting, Medical Records, Lab Automated Data Acquisition/Quality Control, Pharmacy, Radiology, Dietary, Respiratory Therapy, ECG, EEG, Cardiology, Physical/Occupational Therapy and Nursing. These systems and numerous financial systems have been installed in a highly tuned, efficient computer system. All applications are real-time, on-line, and database oriented. Each system is provided with multiple data security levels, forward file recovery, and dynamic transaction backout of in-flight tasks. Sensitive medical information is safeguarded by job-function passwords, identification codes, need-to-know master screens and terminal keylocks. University Hospital has an IBM 3083 CPU with five 3380 disk drives, four dual-density tape drives, and a 3705 network controller. The network of 300 terminals and 100 printers is connected to the computer center by an RF broadband cable. The software is configured around the IBM/MVS operating system using CICS as the telecommunication monitor, IMS as the database management system and PCS/ADS as the application enabling tool. The most extensive clinical system added to UHIS is the Physiological Monitoring/Patient Data Management System (PMS/PDMS), which serves 92 critical care beds. In keeping with the Hospital's philosophy of integrated computing, the PMS/PDMS with its network of minicomputers was linked to the UHIS system. 
In a pilot program, remote access to UHIS through the IBM personal computer has been implemented in several physician offices in the local community, further extending the communications horizons of University Hospital's Information System. The implications of remote access to PDMS through the IBM PC emulating a Siemens Model 420 Patient Data Management Terminal are being examined.

  11. Computer-based symptom assessment is feasible in patients with advanced cancer: results from an international multicenter study, the EPCRC-CSA.

    PubMed

    Hjermstad, Marianne Jensen; Lie, Hanne C; Caraceni, Augusto; Currow, David C; Fainsinger, Robin L; Gundersen, Odd Erik; Haugen, Dagny Faksvaag; Heitzer, Ellen; Radbruch, Lukas; Stone, Patrick C; Strasser, Florian; Kaasa, Stein; Loge, Jon Håvard

    2012-11-01

    Symptom assessment by computers is only effective if it provides valid results and is perceived as useful for clinical use by the end users: patients and health care providers. To identify factors associated with discontinuation, time expenditure, and patient preferences of the computerized symptom assessment used in an international multicenter data collection project: the European Palliative Care Research Collaborative-Computerized Symptom Assessment. Cancer patients with incurable metastatic or locally advanced disease were recruited from 17 centers in eight countries, providing 1017 records for analyses. Observer-based registrations and patient-reported measures on pain, depression, and physical function were entered on touch screen laptop computers. The entire assessment was completed by 94.9% (n = 965), with median age 63 years (range 18-91 years) and median Karnofsky Performance Status (KPS) score of 70 (range 20-100). Predictive factors for noncompletion were higher age, lower KPS, and more pain (P ≤ 0.012). Time expenditure among completers increased with higher age, male gender, Norwegian nationality, number of comorbidities, and lower physical functioning (P ≤ 0.007) but was inversely related to pain levels and tiredness (P ≤ 0.03). Need for assistance was predicted by higher age, nationality other than Norwegian, lower KPS, and lower educational level (P < 0.001). More than 50% of patients preferred computerized assessment to a paper and pencil version. The high completion rate shows that symptom assessment by computers is feasible in patients with advanced cancer. However, reduced performance status reduces compliance and increases the need for assistance. Future work should aim at identifying the minimum set of valid screening questions and refine the software to optimize symptom assessment and reduce respondent burden in frail patients. Copyright © 2012 U.S. Cancer Pain Relief Committee. Published by Elsevier Inc. All rights reserved.

  12. Alternative Fuels Data Center: State Alternative Fuel and Advanced Vehicle Laws and Incentives: 2014 Year in Review

    Science.gov Websites

    State Alternative Fuel and Advanced Vehicle Laws and Incentives: 2014 Year in Review.

  13. Using 3D infrared imaging to calibrate and refine computational fluid dynamic modeling for large computer and data centers

    NASA Astrophysics Data System (ADS)

    Stockton, Gregory R.

    2011-05-01

    Over the last 10 years, very large government, military, and commercial computer and data center operators have spent millions of dollars trying to cool data centers optimally, as each rack has begun to consume as much as 10 times more power than just a few years ago. In fact, the maximum amount of computation in a data center is becoming limited by the available power, space, and cooling capacity at some sites. Tens of millions of dollars, and megawatts of power, are spent annually to keep data centers cool. The cooling and air flows change dynamically, departing from any 3-D computational fluid dynamic model predicted during construction, and over time the efficiency and effectiveness of the actual cooling drift even farther from the predicted models. By using 3-D infrared (IR) thermal mapping and other techniques to calibrate and refine the computational fluid dynamic modeling and make appropriate corrections and repairs, the power required for data centers can be dramatically reduced, which lowers costs and also improves reliability.
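    The calibration idea described above can be sketched with a toy one-parameter fit: adjust a model input (here a hypothetical rack heat load) until the predicted temperatures best match a measured IR map. This is entirely illustrative; real data-center CFD calibration compares full 3-D temperature fields, not three point samples.

    ```python
    def model_temperatures(heat_load, airflow=1.0):
        """Toy surrogate for a CFD prediction: temperature (deg C) at three
        sensor points, rising linearly with rack heat load (kW)."""
        return [22.0 + heat_load * g / airflow for g in (0.8, 1.0, 1.3)]

    def calibrate(measured, candidates):
        """Pick the heat-load value minimizing squared error vs. the IR map."""
        def sse(h):
            return sum((m - p) ** 2
                       for m, p in zip(measured, model_temperatures(h)))
        return min(candidates, key=sse)

    measured_ir = [26.0, 27.0, 28.5]  # temperatures taken from the 3-D IR survey
    best = calibrate(measured_ir, [h / 10 for h in range(0, 101)])
    print(best)  # -> 5.0: the calibrated model now reproduces the measurements
    ```

    The same loop generalizes to many parameters (per-rack loads, tile flow rates) with a proper optimizer in place of the grid search.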

  14. A uniform approach for programming distributed heterogeneous computing systems

    PubMed Central

    Grasso, Ivan; Pellegrini, Simone; Cosenza, Biagio; Fahringer, Thomas

    2014-01-01

    Large-scale compute clusters of heterogeneous nodes equipped with multi-core CPUs and GPUs are getting increasingly popular in the scientific community. However, such systems require a combination of different programming paradigms making application development very challenging. In this article we introduce libWater, a library-based extension of the OpenCL programming model that simplifies the development of heterogeneous distributed applications. libWater consists of a simple interface, which is a transparent abstraction of the underlying distributed architecture, offering advanced features such as inter-context and inter-node device synchronization. It provides a runtime system which tracks dependency information enforced by event synchronization to dynamically build a DAG of commands, on which we automatically apply two optimizations: collective communication pattern detection and device-host-device copy removal. We assess libWater’s performance in three compute clusters available from the Vienna Scientific Cluster, the Barcelona Supercomputing Center and the University of Innsbruck, demonstrating improved performance and scaling with different test applications and configurations. PMID:25844015
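    The runtime behavior described, building a DAG of commands from event-synchronization information, can be illustrated with a minimal sketch. The class names below are hypothetical; libWater itself is a C/C++ extension of OpenCL, not this Python model.

    ```python
    class Command:
        """A device command that may wait on events signaled by earlier commands."""
        def __init__(self, name, wait_events=()):
            self.name = name
            self.wait_events = tuple(wait_events)  # events this command waits on
            self.event = object()                  # unique event signaled on completion

    class CommandDAG:
        """Builds a dependency DAG from the event lists attached to commands."""
        def __init__(self):
            self.commands = []
            self.edges = {}  # command name -> names of commands it depends on

        def enqueue(self, cmd):
            deps = [c.name for c in self.commands if c.event in cmd.wait_events]
            self.edges[cmd.name] = deps
            self.commands.append(cmd)

    # Example chain: host->device copy, a kernel waiting on it, device->host copy.
    h2d = Command("copy_h2d")
    kern = Command("kernel", wait_events=[h2d.event])
    d2h = Command("copy_d2h", wait_events=[kern.event])
    dag = CommandDAG()
    for c in (h2d, kern, d2h):
        dag.enqueue(c)
    print(dag.edges)
    # {'copy_h2d': [], 'kernel': ['copy_h2d'], 'copy_d2h': ['kernel']}
    ```

    On such a DAG, optimizations like the ones the paper names become graph rewrites: a device-to-host copy immediately followed by a host-to-device copy of the same buffer can be removed, and matching point-to-point transfers across nodes can be fused into a collective.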

  15. A uniform approach for programming distributed heterogeneous computing systems.

    PubMed

    Grasso, Ivan; Pellegrini, Simone; Cosenza, Biagio; Fahringer, Thomas

    2014-12-01

    Large-scale compute clusters of heterogeneous nodes equipped with multi-core CPUs and GPUs are getting increasingly popular in the scientific community. However, such systems require a combination of different programming paradigms making application development very challenging. In this article we introduce libWater, a library-based extension of the OpenCL programming model that simplifies the development of heterogeneous distributed applications. libWater consists of a simple interface, which is a transparent abstraction of the underlying distributed architecture, offering advanced features such as inter-context and inter-node device synchronization. It provides a runtime system which tracks dependency information enforced by event synchronization to dynamically build a DAG of commands, on which we automatically apply two optimizations: collective communication pattern detection and device-host-device copy removal. We assess libWater's performance in three compute clusters available from the Vienna Scientific Cluster, the Barcelona Supercomputing Center and the University of Innsbruck, demonstrating improved performance and scaling with different test applications and configurations.

  16. A PICKSC Science Gateway for enabling the common plasma physicist to run kinetic software

    NASA Astrophysics Data System (ADS)

    Hu, Q.; Winjum, B. J.; Zonca, A.; Youn, C.; Tsung, F. S.; Mori, W. B.

    2017-10-01

    Computer simulations offer tremendous opportunities for studying plasmas, ranging from simulations for students that illuminate fundamental educational concepts to research-level simulations that advance scientific knowledge. Nevertheless, there is a significant hurdle to using simulation tools. Users must navigate codes and software libraries, determine how to wrangle output into meaningful plots, and oftentimes confront a significant cyberinfrastructure with powerful computational resources. Science gateways offer a Web-based environment to run simulations without needing to learn or manage the underlying software and computing cyberinfrastructure. We discuss our progress on creating a Science Gateway for the Particle-in-Cell and Kinetic Simulation Software Center that enables users to easily run and analyze kinetic simulations with our software. We envision that this technology could benefit a wide range of plasma physicists, both in the use of our simulation tools as well as in its adaptation for running other plasma simulation software. Supported by NSF under Grant ACI-1339893 and by the UCLA Institute for Digital Research and Education.

  17. Civil propulsion technology for the next twenty-five years

    NASA Technical Reports Server (NTRS)

    Rosen, Robert; Facey, John R.

    1987-01-01

    The next twenty-five years will see major advances in civil propulsion technology that will result in completely new aircraft systems for domestic, international, commuter and high-speed transports. These aircraft will include advanced aerodynamic, structural, and avionic technologies resulting in major new system capabilities and economic improvements. Propulsion technologies will include high-speed turboprops in the near term, very high bypass ratio turbofans, high efficiency small engines and advanced cycles utilizing high temperature materials for high-speed propulsion. Key fundamental enabling technologies include increased temperature capability and advanced design methods. Increased temperature capability will be based on improved composite materials such as metal matrix, intermetallics, ceramics, and carbon/carbon as well as advanced heat transfer techniques. Advanced design methods will make use of advances in internal computational fluid mechanics, reacting flow computation, computational structural mechanics and computational chemistry. The combination of advanced enabling technologies, new propulsion concepts and advanced control approaches will provide major improvements in civil aircraft.

  18. Application of electrochemical energy storage in solar thermal electric generation systems

    NASA Technical Reports Server (NTRS)

    Das, R.; Krauthamer, S.; Frank, H.

    1982-01-01

    This paper assesses the status, cost, and performance of existing electrochemical energy storage systems, and projects the cost, performance, and availability of advanced storage systems for application in terrestrial solar thermal electric generation. A 10 MWe solar plant with five hours of storage is considered and the cost of delivered energy is computed for sixteen different storage systems. The results indicate that the five most attractive electrochemical storage systems use the following battery types: zinc-bromine (Exxon), iron-chromium redox (NASA/Lewis Research Center, LeRC), sodium-sulfur (Ford), sodium-sulfur (Dow), and zinc-chlorine (Energy Development Associates, EDA).

  19. Telerobotic control of the seven-degree-of-freedom CESAR manipulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Babcock, S.M.; Dubey, R.V.; Euler, J.A.

    1988-01-01

    The application of a computationally efficient kinematic control scheme for manipulators with redundant degrees of freedom to the unilateral telerobotic control of the seven-degree-of-freedom manipulator (CESARM) at the Oak Ridge National Laboratory Center for Engineering Systems Advanced Research is presented. The kinematic control scheme uses a gradient projection optimization method, which eliminates the need to determine the generalized inverse of the Jacobian when solving for joint velocities, given Cartesian end-effector velocities. A six-degree-of-freedom (nonreplica) master controller is used. Performance indices for redundancy resolution are discussed. 5 refs., 6 figs.
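    For reference, the textbook gradient-projection law that such schemes implement is q_dot = J+ x_dot + alpha (I - J+ J) grad_H, which tracks the end-effector velocity while using the null space of the Jacobian to improve a performance measure H. The cited method achieves the same projection without explicitly forming the generalized inverse J+; the sketch below uses the explicit pseudoinverse for clarity, with illustrative numbers.

    ```python
    import numpy as np

    def redundant_joint_velocities(J, x_dot, grad_H, alpha=1.0):
        """Gradient-projection redundancy resolution (textbook form).

        J      : m x n Jacobian, n > m (redundant manipulator)
        x_dot  : desired m-vector Cartesian end-effector velocity
        grad_H : gradient of a scalar performance measure H(q)
        Returns q_dot with J @ q_dot = x_dot, plus null-space motion along grad_H.
        """
        J_pinv = np.linalg.pinv(J)
        n = J.shape[1]
        null_proj = np.eye(n) - J_pinv @ J   # projector onto the null space of J
        return J_pinv @ x_dot + alpha * null_proj @ grad_H

    # 2-DOF task with a 3-DOF arm (made-up Jacobian, for illustration only)
    J = np.array([[1.0, 0.5, 0.2],
                  [0.0, 1.0, 0.4]])
    x_dot = np.array([0.3, -0.1])
    q_dot = redundant_joint_velocities(J, x_dot, grad_H=np.array([0.0, 0.0, 1.0]))
    print(np.allclose(J @ q_dot, x_dot))  # True: the task velocity is preserved
    ```

    The null-space term changes the joint motion (e.g., to avoid joint limits) without disturbing the commanded Cartesian velocity, which is the property the performance indices in the paper evaluate.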

  20. Tools Automate Spacecraft Testing, Operation

    NASA Technical Reports Server (NTRS)

    2010-01-01

    NASA began the Small Explorer (SMEX) program to develop spacecraft to advance astrophysics and space physics. As one of the entities supporting software development at Goddard Space Flight Center, the Hammers Company Inc. (tHC Inc.), of Greenbelt, Maryland, developed the Integrated Test and Operations System to support SMEX. Later, the company received additional Small Business Innovation Research (SBIR) funding from Goddard for VirtualSat, a tool to facilitate the development of flight software. NASA uses the tools to support 15 satellites, and the aerospace industry is using them to develop science instruments, spacecraft computer systems, and navigation and control software.

  1. Be a Mentor and Experience the Excitement of Rediscovery | Poster

    Cancer.gov

    You don’t really know something until you can teach it to someone. Raul Cachau said he believes this is not only true in academia, but in research laboratories as well. He said that being a mentor means rediscovering things long taken for granted. “It really forces you to rethink some of the things you do,” said Cachau, Ph.D., principal scientist, Advanced Biomedical Computing Center (ABCC). “It brings focus to many of the things that happen on a daily basis … There’s a positive impact to taking a fresh look at something.”

  2. Advanced software development workstation. Knowledge base design: Design of knowledge base for flight planning application

    NASA Technical Reports Server (NTRS)

    Izygon, Michel E.

    1992-01-01

    The development process of the knowledge base for the generation of Test Libraries for Mission Operations Computer (MOC) Command Support centered on a series of information-gathering interviews. These knowledge-capture sessions supported the development of a prototype for evaluating the capabilities of INTUIT on such an application. The prototype includes functions related to POCC (Payload Operations Control Center) processing. It prompts the end users for input through a series of panels and then generates the MEDs associated with the initialization and the update of hazardous command tables for a POCC Processing TLIB.

  3. GloVis

    USGS Publications Warehouse

    Houska, Treva R.; Johnson, A.P.

    2012-01-01

    The Global Visualization Viewer (GloVis) trifold provides basic information for online access to a subset of satellite and aerial photography collections from the U.S. Geological Survey Earth Resources Observation and Science (EROS) Center archive. The GloVis (http://glovis.usgs.gov/) browser-based utility allows users to search and download National Aerial Photography Program (NAPP), National High Altitude Photography (NHAP), Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), Earth Observing-1 (EO-1), Global Land Survey, Moderate Resolution Imaging Spectroradiometer (MODIS), and TerraLook data. Minimum computer system requirements and customer service contact information also are included in the brochure.

  4. The Advanced Software Development and Commercialization Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gallopoulos, E.; Canfield, T.R.; Minkoff, M.

    1990-09-01

    This is the first of a series of reports on progress in the Advanced Software Development and Commercialization Project, a joint collaborative effort between the Center for Supercomputing Research and Development of the University of Illinois and the Computing and Telecommunications Division of Argonne National Laboratory. The purpose of this work is to apply techniques of parallel computing that were pioneered by University of Illinois researchers to mature computational fluid dynamics (CFD) and structural dynamics (SD) computer codes developed at Argonne. The collaboration in this project will bring this unique combination of expertise to bear, for the first time, on industrially important problems. By so doing, it will expose the strengths and weaknesses of existing techniques for parallelizing programs and will identify those problems that need to be solved in order to enable widespread production use of parallel computers. Second, the increased efficiency of the CFD and SD codes themselves will enable the simulation of larger, more accurate engineering models that involve fluid and structural dynamics. In order to realize these two goals, we are considering two production codes that have been developed at ANL and are widely used by both industry and universities: COMMIX and WHAMS-3D. The first is a computational fluid dynamics code used for nuclear reactor design and safety and as a design tool for the casting industry. The second is a three-dimensional structural dynamics code used in nuclear reactor safety as well as crashworthiness studies. These codes are currently available for sequential and vector computers only. Our main goal is to port and optimize these two codes on shared-memory multiprocessors. In so doing, we shall establish a process that can be followed in optimizing other sequential or vector engineering codes for parallel processors.

  5. Delivering The Benefits of Chemical-Biological Integration in ...

    EPA Pesticide Factsheets

    Abstract: Researchers at the EPA's National Center for Computational Toxicology integrate advances in biology, chemistry, and computer science to examine the toxicity of chemicals and help prioritize chemicals for further research based on potential human health risks. The intention of this research program is to evaluate thousands of chemicals quickly for potential risk, at much reduced cost relative to historical approaches. This work involves computational and data-driven approaches, including high-throughput screening, modeling, text mining, and the integration of chemistry, exposure, and biological data. We have developed a number of databases and applications that are delivering on the vision of developing a deeper understanding of chemicals and their effects on exposure and biological processes, and that support a large community of scientists in their research efforts. This presentation will provide an overview of our work to bring together diverse large-scale data from the chemical and biological domains, our approaches to integrating and disseminating these data, and the delivery of models supporting computational toxicology. This abstract does not reflect U.S. EPA policy. Presentation at the ACS TOXI session on Computational Chemistry and Toxicology in Chemical Discovery and Assessment (QSARs).

  6. NASA's 3D Flight Computer for Space Applications

    NASA Technical Reports Server (NTRS)

    Alkalai, Leon

    2000-01-01

    The New Millennium Program (NMP) Integrated Product Development Team (IPDT) for Microelectronics Systems planned to validate a newly developed 3D flight computer system on its first deep-space flight, DS1, launched in October 1998. This computer, developed in the 1995-97 time frame, contains many new computer technologies never previously used in deep-space systems. They include: an advanced 3D packaging architecture for future low-mass and low-volume avionics systems; high-density 3D packaged chip stacks for both volatile and non-volatile mass memory (400 Mbytes of local DRAM memory and 128 Mbytes of Flash memory); a high-bandwidth Peripheral Component Interconnect (PCI) local bus with a bridge to VME; a high-bandwidth (20 Mbps) fiber-optic serial bus; and other attributes, such as standard support for Design for Testability (DFT). Even though this computer system was not completed in time for delivery to the DS1 project, it was an important development along a technology roadmap toward highly integrated and highly miniaturized avionics systems for deep-space applications. This technology development is now being continued by NASA's Deep Space System Development Program (also known as X2000) and within JPL's Center for Integrated Space Microsystems (CISM).

  7. Perceptions and Expectations at New York State's Centers for Advanced Technology: Some Implications for Research Management.

    ERIC Educational Resources Information Center

    Bitting, Robert K.

    1989-01-01

    An evaluation of the 10 Centers for Advanced Ceramic Technology in New York State's collaborative research and development program yielded unexpected perceptions held by center, government, and industry personnel. Implications for the research effort, including the role of basic research and the importance of the research center administrators,…

  8. Responding to Industry Demands: Advanced Technology Centers.

    ERIC Educational Resources Information Center

    Smith, Elizabeth Brient

    1991-01-01

    Discusses characteristics identified by the Center for Occupational Research and Development as indicative of fully functioning advanced technology centers, including the provision of training and retraining in such areas as design, manufacturing, materials science, and electro-optics; technology transfer; demonstration sites; needs assessment;…

  9. Computational Aerodynamic Simulations of an 840 ft/sec Tip Speed Advanced Ducted Propulsor Fan System Model for Acoustic Methods Assessment and Development

    NASA Technical Reports Server (NTRS)

    Tweedt, Daniel L.

    2014-01-01

    Computational aerodynamic simulations of an 840 ft/sec tip speed Advanced Ducted Propulsor fan system were performed at five different operating points on the fan operating line, in order to provide detailed internal flow field information for use with fan acoustic prediction methods presently being developed, assessed, and validated. The fan system is a sub-scale, low-noise research fan/nacelle model that has undergone extensive experimental testing in the 9- by 15-foot Low Speed Wind Tunnel at the NASA Glenn Research Center, resulting in quality, detailed aerodynamic and acoustic measurement data. Details of the fan geometry, the computational fluid dynamics methods, the computational grids, and various computational parameters relevant to the numerical simulations are discussed. Flow field results for three of the five operating conditions simulated are presented in order to provide a representative look at the computed solutions. Each of the five fan aerodynamic simulations involved the entire fan system, excluding a long core duct section downstream of the core inlet guide vane. As a result, only fan rotational speed and system bypass ratio, set by specifying static pressure downstream of the core inlet guide vane row, were adjusted in order to set the fan operating point, leading to operating points that lie on a fan operating line and making mass flow rate a fully dependent parameter. The resulting mass flow rates are in good agreement with measured values. The computed blade row flow fields for all five fan operating points are, in general, aerodynamically healthy. Rotor blade and fan exit guide vane flow characteristics are good, including incidence and deviation angles, chordwise static pressure distributions, blade surface boundary layers, secondary flow structures, and blade wakes. Examination of the computed flow fields reveals no excessive boundary layer separations or related secondary-flow problems. 
A few spanwise comparisons between computational and measurement data in the bypass duct show that they are in good agreement, thus providing a partial validation of the computational results.

  10. A PC-based bus monitor program for use with the transport systems research vehicle RS-232 communication interfaces

    NASA Technical Reports Server (NTRS)

    Easley, Wesley C.

    1991-01-01

    Experiment-critical use of RS-232 data buses in the Transport Systems Research Vehicle (TSRV) operated by the Advanced Transport Operating Systems Program Office at the NASA Langley Research Center has recently increased. Each application utilizes a number of nonidentical computer and peripheral configurations and requires task-specific software development. To aid these development tasks, an IBM PC-based RS-232 bus monitoring system was produced. It can simultaneously monitor two communication ports of a PC or clone, including the nonstandard bus expansion of the TSRV GRiD laptop computers. Each port's input is displayed in a separate window, with binary display being selectable. A number of other features, including binary log files, screen capture to files, and a full range of communication parameters, are provided.

  11. NASA/Army Rotorcraft Transmission Research, a Review of Recent Significant Accomplishments

    NASA Technical Reports Server (NTRS)

    Krantz, Timothy L.

    1994-01-01

    A joint helicopter transmission research program between NASA Lewis Research Center and the U.S. Army Research Lab has existed since 1970. Research goals are to reduce weight and noise while increasing life, reliability, and safety. These research goals are achieved by the NASA/Army Mechanical Systems Technology Branch through both in-house research and cooperative research projects with university and industry partners. Some recent significant technical accomplishments produced by this cooperative research are reviewed. The following research projects are reviewed: oil-off survivability of tapered roller bearings, design and evaluation of high contact ratio gearing, finite element analysis of spiral bevel gears, computer numerical control grinding of spiral bevel gears, gear dynamics code validation, computer program for life and reliability of helicopter transmissions, planetary gear train efficiency study, and the Advanced Rotorcraft Transmission (ART) program.

  12. EEGLAB, SIFT, NFT, BCILAB, and ERICA: new tools for advanced EEG processing.

    PubMed

    Delorme, Arnaud; Mullen, Tim; Kothe, Christian; Akalin Acar, Zeynep; Bigdely-Shamlo, Nima; Vankov, Andrey; Makeig, Scott

    2011-01-01

    We describe a set of complementary EEG data collection and processing tools recently developed at the Swartz Center for Computational Neuroscience (SCCN) that connect to and extend the EEGLAB software environment, a freely available and readily extensible processing environment running under Matlab. The new tools include (1) a new and flexible EEGLAB STUDY design facility for framing and performing statistical analyses on data from multiple subjects; (2) a neuroelectromagnetic forward head modeling toolbox (NFT) for building realistic electrical head models from available data; (3) a source information flow toolbox (SIFT) for modeling ongoing or event-related effective connectivity between cortical areas; (4) a BCILAB toolbox for building online brain-computer interface (BCI) models from available data, and (5) an experimental real-time interactive control and analysis (ERICA) environment for real-time production and coordination of interactive, multimodal experiments.

  13. Adjoint-Based Algorithms for Adaptation and Design Optimizations on Unstructured Grids

    NASA Technical Reports Server (NTRS)

    Nielsen, Eric J.

    2006-01-01

    Schemes based on discrete adjoint algorithms present several exciting opportunities for significantly advancing the current state of the art in computational fluid dynamics. Such methods provide an extremely efficient means of obtaining discretely consistent sensitivity information for hundreds of design variables, opening the door to rigorous, automated design optimization of complex aerospace configurations using the Navier-Stokes equations. Moreover, the discrete adjoint formulation provides a mathematically rigorous foundation for mesh adaptation and systematic reduction of spatial discretization error. Error estimates are also an inherent by-product of an adjoint-based approach, valuable information that is virtually nonexistent in today's large-scale CFD simulations. An overview of the adjoint-based algorithm work at NASA Langley Research Center is presented, with examples demonstrating the potential impact on complex computational problems related to design optimization as well as mesh adaptation.
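    The efficiency claim — gradients with respect to many design variables from a single extra linear solve — can be illustrated on a toy discrete problem. This sketch is generic, not NASA Langley's formulation: the residual R(u, d) = A(d)u − b and objective J = c·u are invented for illustration, and the adjoint gradient is checked against central finite differences.

```python
import numpy as np

# Toy discrete problem (invented): residual R(u, d) = A(d) u - b = 0
# defines the state u; the objective is J(d) = c . u(d).
rng = np.random.default_rng(0)
n, m = 6, 3                                   # state size, design variables
A0 = 5.0 * np.eye(n) + rng.normal(size=(n, n))
A1 = [rng.normal(size=(n, n)) for _ in range(m)]
b = rng.normal(size=n)
c = rng.normal(size=n)

def A(d):
    return A0 + sum(di * Ai for di, Ai in zip(d, A1))

def solve_state(d):
    return np.linalg.solve(A(d), b)

def J(d):
    return c @ solve_state(d)

def adjoint_gradient(d):
    """One state solve plus ONE adjoint solve yields dJ/dd for all m variables."""
    u = solve_state(d)
    lam = np.linalg.solve(A(d).T, c)          # adjoint: A^T lam = dJ/du
    # dR/dd_i = A1[i] @ u, so dJ/dd_i = -lam . (A1[i] @ u)
    return np.array([-(lam @ (Ai @ u)) for Ai in A1])

d0 = np.zeros(m)
g_adj = adjoint_gradient(d0)

# Central finite differences need m pairs of state solves; shown for comparison.
eps = 1e-6
g_fd = np.array([(J(d0 + eps * np.eye(m)[i]) - J(d0 - eps * np.eye(m)[i])) / (2 * eps)
                 for i in range(m)])
```

    The cost contrast is the point: the adjoint route is one extra solve regardless of m, while finite differencing scales linearly with the number of design variables.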

  14. Implementing an Affordable High-Performance Computing for Teaching-Oriented Computer Science Curriculum

    ERIC Educational Resources Information Center

    Abuzaghleh, Omar; Goldschmidt, Kathleen; Elleithy, Yasser; Lee, Jeongkyu

    2013-01-01

    With the advances in computing power, high-performance computing (HPC) platforms have had an impact on not only scientific research in advanced organizations but also computer science curriculum in the educational community. For example, multicore programming and parallel systems are highly desired courses in the computer science major. However,…

  15. The application of automated operations at the Institutional Processing Center

    NASA Technical Reports Server (NTRS)

    Barr, Thomas H.

    1993-01-01

    The JPL Institutional and Mission Computing Division, Communications, Computing and Network Services Section, with its mission contractor, OAO Corporation, has for some time been applying automation to the operation of JPL's Information Processing Center (IPC). Automation does not come in one easy-to-use package. Automation for a data processing center is made up of many different software and hardware products supported by trained personnel. The IPC automation effort formally began with console automation, and has since spiraled out to include production scheduling, data entry, report distribution, online reporting, failure reporting and resolution, documentation, library storage, and operator and user education, while requiring the interaction of multi-vendor and locally developed software. To begin the process, automation goals are determined. Then a team including operations personnel is formed to research and evaluate available options. By acquiring knowledge of current products and those in development, taking an active role in industry organizations, and learning of other data centers' experiences, a forecast can be developed as to what direction technology is moving. With IPC management's approval, an implementation plan is developed and resources identified to test or implement new systems. As an example, IPC's new automated data entry system was researched by Data Entry, Production Control, and Advance Planning personnel. A proposal was then submitted to management for review. A determination to implement the new system was made, and the elements/personnel involved with the initial planning performed the implementation. The final steps of the implementation were educating data entry personnel in the areas affected and making the procedural changes necessary for the successful operation of the new system.

  16. Experiences at Langley Research Center in the application of optimization techniques to helicopter airframes for vibration reduction

    NASA Technical Reports Server (NTRS)

    Murthy, T. Sreekanta; Kvaternik, Raymond G.

    1991-01-01

    A NASA/industry rotorcraft structural dynamics program known as Design Analysis Methods for VIBrationS (DAMVIBS) was initiated at Langley Research Center in 1984 with the objective of establishing the technology base needed by the industry for developing an advanced finite-element-based vibrations design analysis capability for airframe structures. As a part of the in-house activities contributing to that program, a study was undertaken to investigate the use of formal, nonlinear programming-based, numerical optimization techniques for airframe vibrations design work. Considerable progress has been made in connection with that study since its inception in 1985. This paper presents a unified summary of the experiences and results of that study. The formulation and solution of airframe optimization problems are discussed. Particular attention is given to describing the implementation of a new computational procedure based on MSC/NASTRAN and CONstrained function MINimization (CONMIN) in a computer program system called DYNOPT for the optimization of airframes subject to strength, frequency, dynamic response, and fatigue constraints. The results from the application of the DYNOPT program to the Bell AH-1G helicopter are presented and discussed.

  17. Automation in the Space Station module power management and distribution Breadboard

    NASA Technical Reports Server (NTRS)

    Walls, Bryan; Lollar, Louis F.

    1990-01-01

    The Space Station Module Power Management and Distribution (SSM/PMAD) Breadboard, located at NASA's Marshall Space Flight Center (MSFC) in Huntsville, Alabama, models the power distribution within a Space Station Freedom Habitation or Laboratory module. Originally designed for 20 kHz ac power, the system is now being converted to high-voltage dc power with power levels on a par with those expected for a space station module. In addition to the power distribution hardware, the system includes computer control through a hierarchy of processes. The lowest level consists of fast, simple (from a computing standpoint) switchgear, capable of quickly safing the system. The next level consists of local load center processors called Lowest Level Processors (LLPs). These LLPs execute load scheduling, perform redundant switching, and shed loads which use more than scheduled power. The level above the LLPs contains a Communication and Algorithmic Controller (CAC) which coordinates communications with the highest level. Finally, at this highest level, three cooperating Artificial Intelligence (AI) systems manage load prioritization, load scheduling, load shedding, and fault recovery and management. The system provides an excellent venue for developing and examining advanced automation techniques. The current system and the plans for its future are examined.
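    The load-shedding behavior described above — drop loads that draw more than their scheduled power, then drop the least-critical loads until the total fits the available budget — can be sketched as a simple priority rule. This is a minimal sketch of that policy, not the SSM/PMAD software; the `Load` fields, names, and numbers are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Load:
    name: str
    priority: int       # lower number = more critical
    scheduled_w: float  # power allocated by the scheduler (W)
    actual_w: float     # power currently drawn (W)

def shed(loads, budget_w):
    """Return names of loads to shed: first any load exceeding its
    schedule, then the least-critical loads until the budget is met."""
    shed_list = [l.name for l in loads if l.actual_w > l.scheduled_w]
    active = [l for l in loads if l.name not in shed_list]
    total = sum(l.actual_w for l in active)
    for l in sorted(active, key=lambda l: l.priority, reverse=True):
        if total <= budget_w:
            break
        shed_list.append(l.name)
        total -= l.actual_w
    return shed_list

# Illustrative scenario: one load over schedule, and a budget shortfall.
demo = [Load("life_support", 0, 500.0, 480.0),
        Load("experiment", 2, 300.0, 350.0),
        Load("lighting", 1, 200.0, 180.0)]
to_shed = shed(demo, budget_w=600.0)
```

    In the breadboard this decision is distributed: the LLPs apply the over-schedule rule locally, while the AI layer at the top sets the priorities and budgets the rule consumes.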

  18. Advanced Computational Methods for Optimization of Non-Periodic Inspection Intervals for Aging Infrastructure

    DTIC Science & Technology

    2017-01-05

    AFRL-AFOSR-JP-TR-2017-0002. Report on the project 'Advanced Computational Methods for Optimization of Non-Periodic Inspection Intervals for Aging Infrastructure' (author Manabu…, grant FA2386…). Distribution unlimited: approved for public release.

  19. Cloud Computing Applications in Support of Earth Science Activities at Marshall Space Flight Center

    NASA Technical Reports Server (NTRS)

    Molthan, Andrew L.; Limaye, Ashutosh S.; Srikishen, Jayanthi

    2011-01-01

    Currently, the NASA Nebula Cloud Computing Platform is available to Agency personnel in a pre-release status as the system undergoes a formal operational readiness review. Over the past year, two projects within the Earth Science Office at NASA Marshall Space Flight Center have been investigating the performance and value of Nebula's "Infrastructure as a Service" (IaaS) concept and applying cloud computing concepts to advance their respective mission goals. The Short-term Prediction Research and Transition (SPoRT) Center focuses on the transition of unique NASA satellite observations and weather forecasting capabilities for use within the operational forecasting community through partnerships with NOAA's National Weather Service (NWS). SPoRT has evaluated the performance of the Weather Research and Forecasting (WRF) model on virtual machines deployed within Nebula and used Nebula instances to simulate local forecasts in support of regional forecast studies of interest to select NWS forecast offices. In addition to weather forecasting applications, rapidly deployable Nebula virtual machines have supported the processing of high-resolution NASA satellite imagery for disaster assessment following the historic severe weather and tornado outbreak of April 27, 2011. Other modeling and satellite analysis activities are underway in support of NASA's SERVIR program, which integrates satellite observations, ground-based data, and forecast models to monitor environmental change and improve disaster response in Central America, the Caribbean, Africa, and the Himalayas. Leveraging SPoRT's experience, SERVIR is working to establish a real-time weather forecasting model for Central America. Other modeling efforts include hydrologic forecasts for Kenya, driven by NASA satellite observations and reanalysis data sets provided by the broader meteorological community. Forecast modeling efforts are supplemented by short-term forecasts of convective initiation, determined from geostationary satellite observations processed on virtual machines powered by Nebula.

  20. Educational NASA Computational and Scientific Studies (enCOMPASS)

    NASA Technical Reports Server (NTRS)

    Memarsadeghi, Nargess

    2013-01-01

    Educational NASA Computational and Scientific Studies (enCOMPASS) is an educational project of NASA Goddard Space Flight Center aimed at bridging the gap between the computational objectives and needs of NASA's scientific research, missions, and projects, and academia's latest advances in applied mathematics and computer science. enCOMPASS achieves this goal via bidirectional collaboration and communication between NASA and academia. Using the developed NASA Computational Case Studies in university computer science, engineering, and applied mathematics classes is one way of addressing NASA's contribution to the Science, Technology, Engineering, and Math (STEM) national objective. The enCOMPASS Web site at http://encompass.gsfc.nasa.gov provides additional information. There are currently nine enCOMPASS case studies developed in the areas of earth sciences, planetary sciences, and astrophysics. Some of these case studies have been published in AIP and IEEE's Computing in Science and Engineering magazines. A few university professors have used enCOMPASS case studies in their computational classes and contributed their findings to NASA scientists. In these case studies, after an introduction to the science area, the specific problem, and related NASA missions, students are first asked to solve a known problem using NASA data and approaches previously used and often published in a scientific paper. Then, after learning about the NASA application and the related computational tools and approaches, students are given a harder problem as a challenge to research and develop solutions for. This project provides a model for NASA scientists and engineers on one side, and university students, faculty, and researchers in computer science and applied mathematics on the other, to learn from each other's areas of work, computational needs and solutions, and the latest advances in research and development. This innovation takes NASA science and engineering applications into computer science and applied mathematics university classes, and makes NASA objectives part of the university curricula. There is great potential for growth and return on investment, to the point where every major university in the U.S. would use at least one of these case studies in a computational course, and where every NASA scientist or engineer facing a computational challenge (without the resources or expertise to solve it) would use enCOMPASS to formulate the problem as a case study, provide it to a university, and get back solutions and ideas.

Top