Science.gov

Sample records for advanced computing center

  1. Center for Advanced Computational Technology

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    2000-01-01

The Center for Advanced Computational Technology (ACT) was established to serve as a focal point for diverse research activities pertaining to application of advanced computational technology to future aerospace systems. These activities include the use of numerical simulations, artificial intelligence methods, multimedia and synthetic environments, and computational intelligence, in the modeling, analysis, sensitivity studies, optimization, design and operation of future aerospace systems. The Center is located at NASA Langley and is an integral part of the School of Engineering and Applied Science of the University of Virginia. The Center has four specific objectives: 1) conduct innovative research on applications of advanced computational technology to aerospace systems; 2) act as pathfinder by demonstrating to the research community what can be done (high-potential, high-risk research); 3) help in identifying future directions of research in support of the aeronautical and space missions of the twenty-first century; and 4) help in the rapid transfer of research results to industry and in broadening awareness among researchers and engineers of the state-of-the-art in applications of advanced computational technology to the analysis, design, prototyping, and operations of aerospace and other high-performance engineering systems. In addition to research, Center activities include helping in the planning and coordination of the activities of a multi-center team of NASA and JPL researchers who are developing an intelligent synthesis environment for future aerospace systems; organizing workshops and national symposia; as well as writing state-of-the-art monographs and NASA special publications on timely topics.

  2. Advanced Biomedical Computing Center (ABCC) | DSITP

    Cancer.gov

The Advanced Biomedical Computing Center (ABCC), located in Frederick, Maryland (MD), provides HPC resources for both NIH/NCI intramural scientists and the extramural biomedical research community. Its mission is to provide HPC support, to engage in collaborative research, and to conduct in-house research in various areas of computational biology and biomedical research.

  3. National Energy Research Scientific Computing Center (NERSC): Advancing the frontiers of computational science and technology

    SciTech Connect

    Hules, J.

    1996-11-01

National Energy Research Scientific Computing Center (NERSC) provides researchers with high-performance computing tools to tackle science's biggest and most challenging problems. Founded in 1974 by DOE/ER, the Controlled Thermonuclear Research Computer Center was the first unclassified supercomputer center and was the model for those that followed. Over the years the center's name was changed to the National Magnetic Fusion Energy Computer Center and then to NERSC; it was relocated to LBNL. NERSC, one of the largest unclassified scientific computing resources in the world, is the principal provider of general-purpose computing services to DOE/ER programs: Magnetic Fusion Energy, High Energy and Nuclear Physics, Basic Energy Sciences, Health and Environmental Research, and the Office of Computational and Technology Research. NERSC users are a diverse community located throughout the US and in several foreign countries. This brochure describes: the NERSC advantage, its computational resources and services, future technologies, scientific resources, and computational science of scale (interdisciplinary research over a decade or longer; examples: combustion in engines, waste management chemistry, global climate change modeling).

  4. CNC Turning Center Advanced Operations. Computer Numerical Control Operator/Programmer. 444-332.

    ERIC Educational Resources Information Center

    Skowronski, Steven D.; Tatum, Kenneth

    This student guide provides materials for a course designed to introduce the student to the operations and functions of a two-axis computer numerical control (CNC) turning center. The course consists of seven units. Unit 1 presents course expectations and syllabus, covers safety precautions, and describes the CNC turning center components, CNC…

  5. Development of Advanced Computational Aeroelasticity Tools at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Bartels, R. E.

    2008-01-01

    NASA Langley Research Center has continued to develop its long standing computational tools to address new challenges in aircraft and launch vehicle design. This paper discusses the application and development of those computational aeroelastic tools. Four topic areas will be discussed: 1) Modeling structural and flow field nonlinearities; 2) Integrated and modular approaches to nonlinear multidisciplinary analysis; 3) Simulating flight dynamics of flexible vehicles; and 4) Applications that support both aeronautics and space exploration.

  6. COMPUTATIONAL SCIENCE CENTER

    SciTech Connect

    DAVENPORT, J.

    2006-11-01

Computational Science is an integral component of Brookhaven's multi science mission, and is a reflection of the increased role of computation across all of science. Brookhaven currently has major efforts in data storage and analysis for the Relativistic Heavy Ion Collider (RHIC) and the ATLAS detector at CERN, and in quantum chromodynamics. The Laboratory is host for the QCDOC machines (quantum chromodynamics on a chip), 10 teraflop/s computers which boast 12,288 processors each. There are two here, one for the Riken/BNL Research Center and the other supported by DOE for the US Lattice Gauge Community and other scientific users. A 100 teraflop/s supercomputer will be installed at Brookhaven in the coming year, managed jointly by Brookhaven and Stony Brook, and funded by a grant from New York State. This machine will be used for computational science across Brookhaven's entire research program, and also by researchers at Stony Brook and across New York State. With Stony Brook, Brookhaven has formed the New York Center for Computational Science (NYCCS) as a focal point for interdisciplinary computational science, which is closely linked to Brookhaven's Computational Science Center (CSC). The CSC has established a strong program in computational science, with an emphasis on nanoscale electronic structure and molecular dynamics, accelerator design, computational fluid dynamics, medical imaging, parallel computing and numerical algorithms. We have been an active participant in DOE's SciDAC (Scientific Discovery through Advanced Computing) program. We are also planning a major expansion in computational biology in keeping with Laboratory initiatives. Additional laboratory initiatives with a dependence on a high level of computation include the development of hydrodynamics models for the interpretation of RHIC data, computational models for the atmospheric transport of aerosols, and models for combustion and for energy utilization. The CSC was formed to bring together

  7. COMPUTATIONAL SCIENCE CENTER

    SciTech Connect

    DAVENPORT,J.

    2004-11-01

The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include, for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security.

  8. COMPUTATIONAL SCIENCE CENTER

    SciTech Connect

    DAVENPORT, J.

    2005-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include, for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security. To achieve our goals we have established a close alliance with applied mathematicians and computer scientists at Stony Brook and Columbia Universities.

  9. Swiftly Computing Center Strings

    PubMed Central

    2011-01-01

Background: The center string (or closest string) problem is a classic computer science problem with important applications in computational biology. Given k input strings and a distance threshold d, we search for a string within Hamming distance at most d to each input string. This problem is NP-complete. Results: In this paper, we focus on exact methods for the problem that are also swift in application. We first introduce data reduction techniques that allow us to infer that certain instances have no solution, or that a center string must satisfy certain conditions. We describe how to use this information to speed up two previously published search tree algorithms. Then, we describe a novel iterative search strategy that is efficient in practice, where some of our reduction techniques can also be applied. Finally, we present results of an evaluation study for two different data sets from a biological application. Conclusions: We find that the running time for computing the optimal center string is dominated by the subroutine calls for d = d_opt - 1 and d = d_opt. Our data reduction is very effective for both, either rejecting unsolvable instances or solving trivial positions. We find that this speeds up computations considerably. PMID:21504573
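    The search-tree approach mentioned in this abstract can be illustrated with a minimal sketch of the classic fixed-parameter branching algorithm for the closest string problem. This is an illustrative sketch of the standard technique (function names are hypothetical), not the authors' implementation, and it omits the paper's data reduction rules:

    ```python
    def hamming(a, b):
        """Hamming distance between two equal-length strings."""
        return sum(x != y for x, y in zip(a, b))

    def center_string(strings, d):
        """Search-tree algorithm for the closest string problem:
        return a string within Hamming distance d of every input
        string, or None if no such string exists.

        Start from the first input string and repair at most d
        positions. Whenever some input string is still too far away,
        branch on positions where the candidate and that string
        disagree; by a pigeonhole argument, trying the first d+1
        such positions suffices.
        """
        def search(candidate, budget):
            # Find an input string that still violates the distance bound.
            far = next((s for s in strings if hamming(candidate, s) > d), None)
            if far is None:
                return candidate          # all strings within distance d
            if budget == 0:
                return None               # no repairs left on this branch
            diff = [i for i in range(len(candidate)) if candidate[i] != far[i]]
            for i in diff[:d + 1]:        # d+1 branching positions suffice
                trial = candidate[:i] + far[i] + candidate[i + 1:]
                result = search(trial, budget - 1)
                if result is not None:
                    return result
            return None

        return search(strings[0], d)
    ```

    For instance, `center_string(["ACGT", "ACGA", "TCGT"], 1)` returns "ACGT", which is within Hamming distance 1 of every input, while `center_string(["AAAA", "TTTT"], 1)` returns None. The optimal distance d_opt discussed in the Conclusions can be found by calling such a routine with increasing d until a solution appears.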

  10. Center for Computational Structures Technology

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Perry, Ferman W.

    1995-01-01

    The Center for Computational Structures Technology (CST) is intended to serve as a focal point for the diverse CST research activities. The CST activities include the use of numerical simulation and artificial intelligence methods in modeling, analysis, sensitivity studies, and optimization of flight-vehicle structures. The Center is located at NASA Langley and is an integral part of the School of Engineering and Applied Science of the University of Virginia. The key elements of the Center are: (1) conducting innovative research on advanced topics of CST; (2) acting as pathfinder by demonstrating to the research community what can be done (high-potential, high-risk research); (3) strong collaboration with NASA scientists and researchers from universities and other government laboratories; and (4) rapid dissemination of CST to industry, through integration of industrial personnel into the ongoing research efforts.

  11. Center for Advanced Space Propulsion

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The Center for Advanced Space Propulsion (CASP) is part of the University of Tennessee-Calspan Center for Aerospace Research (CAR). It was formed in 1985 to take advantage of the extensive research faculty and staff of the University of Tennessee and Calspan Corporation. It is also one of sixteen NASA sponsored Centers established to facilitate the Commercial Development of Space. Based on investigators' qualifications in propulsion system development, and matching industries' strong intent, the Center focused its efforts in the following technical areas: advanced chemical propulsion, electric propulsion, AI/Expert systems, fluids management in microgravity, and propulsion materials processing. This annual report focuses its discussion in these technical areas.

  12. Advanced Computer Typography.

    DTIC Science & Technology

    1981-12-01

Advanced Computer Typography, by A. V. Hershey. Naval Postgraduate School, Monterey, California. Report NPS012-81-005, December 1981; final report covering December 1979 - December 1981. Unclassified; approved for public release. [Remainder of record is unrecoverable OCR text.]

  13. Computer Center: CIBE Systems.

    ERIC Educational Resources Information Center

    Crovello, Theodore J.

    1982-01-01

    Differentiates between computer systems and Computers in Biological Education (CIBE) systems (computer system intended for use in biological education). Describes several CIBE stand alone systems: single-user microcomputer; single-user microcomputer/video-disc; multiuser microcomputers; multiuser maxicomputer; and local and long distance computer…

  14. Ohio Advanced Energy Manufacturing Center

    SciTech Connect

    Kimberly Gibson; Mark Norfolk

    2012-07-30

The program goal of the Ohio Advanced Energy Manufacturing Center (OAEMC) is to support advanced energy manufacturing and to create responsive manufacturing clusters that will support the production of advanced energy and energy-efficient products to help ensure the nation's energy and environmental security. This goal cuts across a number of existing industry segments critical to the nation's future. Many of the advanced energy businesses are starting to make the transition from technology development to commercial production. Historically, this transition from laboratory prototypes through initial production for early adopters to full production for mass markets has taken several years. Developing and implementing manufacturing technology to enable production at a price point the market will accept is a key step. Since these start-up operations are configured to advance the technology readiness of the core energy technology, they have neither the expertise nor the resources to address manufacturing readiness issues they encounter as the technology advances toward market entry. Given the economic realities of today's business environment, finding ways to accelerate this transition can make the difference between success and failure for a new product or business. The advanced energy industry touches a wide range of industry segments that are not accustomed to working together in complex supply chains to serve large markets such as automotive and construction. During its first three years, the Center has catalyzed the communication between companies and industry groups that serve the wide range of advanced energy markets. The Center has also found areas of common concern, and worked to help companies address these concerns on a segment or industry basis rather than having each company work to solve common problems individually. EWI worked with three industries through public-private partnerships to sew together disparate segments, helping to promote overall industry

  15. Computer Center: Software Review.

    ERIC Educational Resources Information Center

    Duhrkopf, Richard, Ed.; Belshe, John F., Ed.

    1988-01-01

    Reviews a software package, "Mitosis-Meiosis," available for Apple II or IBM computers with colorgraphics capabilities. Describes the documentation, presentation and flexibility of the program. Rates the program based on graphics and usability in a biology classroom. (CW)

  16. Computer Center Reference Manual

    DTIC Science & Technology

    1988-06-20

An introduction to the operating systems of the Cray X-MP (COS), the DEC VAXcluster (VMS), and the CDC (NOS) for applications programmers. Contents cover the Cray X-MP/24, the CDC CYBER 180 model 860A, the DEC VAXcluster, the DEC Remote Mini, the Integrated Supercomputer Network, the user interface, the ADP Control Center, available software, and the Cray X-MP under COS version 1.16, including access and Cray datasets.

  17. Advanced laptop and small personal computer technology

    NASA Technical Reports Server (NTRS)

    Johnson, Roger L.

    1991-01-01

    Advanced laptop and small personal computer technology is presented in the form of the viewgraphs. The following areas of hand carried computers and mobile workstation technology are covered: background, applications, high end products, technology trends, requirements for the Control Center application, and recommendations for the future.

  18. Center for Advanced Separation Technology

    SciTech Connect

    Honaker, Rick

    2013-09-30

The U.S. is the largest producer of mining products in the world. In 2011, U.S. mining operations contributed a total of $232 billion to the nation's GDP plus $138 billion in labor income. Of this, the coal mining industry contributed a total of $97.5 billion to GDP plus $53 billion in labor income. Despite these contributions, the industry has not been well supported with research and development funds as compared to mining industries in other countries. To overcome this problem, the Center for Advanced Separation Technologies (CAST) was established to develop technologies that can be used by the U.S. mining industry to create new products, reduce production costs, and meet environmental regulations. Originally set up by Virginia Tech and West Virginia University, CAST is now a five-university consortium (Virginia Tech, West Virginia University, University of Kentucky, University of Utah, and Montana Tech) that is supported through U.S. DOE Cooperative Agreement No. DE-FE0000699, Center for Advanced Separation Technology. Much of the research to be conducted with Cooperative Agreement funds will be longer term, high-risk, basic research and will be carried out in two broad areas: Advanced Pre-Combustion Clean Coal Technologies and Gas-Gas Separations. Distribution of funds is handled via competitive solicitation of research proposals through Site Coordinators at the five member universities. These were reviewed, and the selected proposals were forwarded to the DOE/NETL Project Officer for final review and approval. The successful projects are listed below by category, along with abstracts from their final reports.

  19. Advances in Computational Astrophysics

    SciTech Connect

    Calder, Alan C.; Kouzes, Richard T.

    2009-03-01

    I was invited to be the guest editor for a special issue of Computing in Science and Engineering along with a colleague from Stony Brook. This is the guest editors' introduction to a special issue of Computing in Science and Engineering. Alan and I have written this introduction and have been the editors for the 4 papers to be published in this special edition.

  20. Center for advanced microstructures and devices (CAMD)

    NASA Astrophysics Data System (ADS)

    Craft, B. C.; Feldman, M.; Morikawa, E.; Poliakoff, E. D.; Saile, V.; Scott, J. D.; Stockbauer, R. L.

    1992-01-01

    The new synchrotron-radiation facility, Center for Advanced Microstructures and Devices, at Louisiana State University is described with regard to the status of installation of the storage ring, implementation of the various programs, and construction of the first beamlines.

  1. Responding to Industry Demands: Advanced Technology Centers.

    ERIC Educational Resources Information Center

    Smith, Elizabeth Brient

    1991-01-01

    Discusses characteristics identified by the Center for Occupational Research and Development as indicative of fully functioning advanced technology centers, including the provision of training and retraining in such areas as design, manufacturing, materials science, and electro-optics; technology transfer; demonstration sites; needs assessment;…

  2. Center for Advanced Space Propulsion (CASP)

    NASA Technical Reports Server (NTRS)

    1988-01-01

With a mission to initiate and conduct advanced propulsion research in partnership with industry, and a goal to strengthen U.S. national capability in propulsion technology, the Center for Advanced Space Propulsion (CASP) is the only NASA Center for Commercial Development of Space (CCDS) which focuses on propulsion and associated technologies. Meetings with industrial partners and NASA Headquarters personnel provided an assessment of the constraints placed on, and opportunities afforded to, commercialization projects. Proprietary information, data rights, and patent rights were some of the areas where well defined information is crucial to project success and follow-on efforts. There were five initial CASP projects. At the end of the first year there are six active, two of which are approaching the ground test phase in their development. Progress in the current six projects has met all milestones and is detailed. Working closely with the industrial counterparts it was found that the endeavors in expert systems development, computational fluid dynamics, fluid management in microgravity, and electric propulsion were well received. One project with the Saturn Corporation, which dealt with expert systems application in the assembly process, was placed on hold pending further direction from Saturn. The Contamination Measurement and Analysis project was not implemented since CASP was unable to identify an industrial participant. Additional propulsion and related projects were investigated during the year. A subcontract was let to a small business, MicroCraft, Inc., to study rocket engine certification standards. The study produced valuable results; however, based on a number of factors it was decided not to pursue this project further.

  3. Computational mechanics and physics at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    South, Jerry C., Jr.

    1987-01-01

    An overview is given of computational mechanics and physics at NASA Langley Research Center. Computational analysis is a major component and tool in many of Langley's diverse research disciplines, as well as in the interdisciplinary research. Examples are given for algorithm development and advanced applications in aerodynamics, transition to turbulence and turbulence simulation, hypersonics, structures, and interdisciplinary optimization.

  4. Advances in Computational Capabilities for Hypersonic Flows

    NASA Technical Reports Server (NTRS)

    Kumar, Ajay; Gnoffo, Peter A.; Moss, James N.; Drummond, J. Philip

    1997-01-01

    The paper reviews the growth and advances in computational capabilities for hypersonic applications over the period from the mid-1980's to the present day. The current status of the code development issues such as surface and field grid generation, algorithms, physical and chemical modeling, and validation is provided. A brief description of some of the major codes being used at NASA Langley Research Center for hypersonic continuum and rarefied flows is provided, along with their capabilities and deficiencies. A number of application examples are presented, and future areas of research to enhance accuracy, reliability, efficiency, and robustness of computational codes are discussed.

  5. Advanced computations in plasma physics

    NASA Astrophysics Data System (ADS)

    Tang, W. M.

    2002-05-01

    Scientific simulation in tandem with theory and experiment is an essential tool for understanding complex plasma behavior. In this paper we review recent progress and future directions for advanced simulations in magnetically confined plasmas with illustrative examples chosen from magnetic confinement research areas such as microturbulence, magnetohydrodynamics, magnetic reconnection, and others. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales together with access to powerful new computational resources. In particular, the fusion energy science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPP's). A good example is the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPP's to produce three-dimensional, general geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of turbulence self-regulation by zonal flows. It should be emphasized that these calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. The associated scientific excitement should serve to

  6. A multi-purpose computing center: FNAL

    SciTech Connect

    Wolbers, Stephen; /Fermilab

    2009-01-01

    The Fermilab Computing Center is described with a special emphasis given to the scientific computing systems and the data storage and archiving systems. The scope and focus of this paper is the Fermilab scientific computing facility. It does not cover, or does not cover very well, related issues such as data Grids, cloud computing and storage, commercial storage, data integrity, authorization, access rates, and novel storage technologies. These are all important considerations in discussing data centers and should be kept in mind when one explores issues related to computing centers and long-term data storage.

  7. Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2000-01-01

The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center. It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. Ames has been designated NASA's Center of Excellence in Information Technology. In this capacity, Ames is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA Ames and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth; (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities; (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply information technology research to a

  8. Advanced Computation in Plasma Physics

    NASA Astrophysics Data System (ADS)

    Tang, William

    2001-10-01

Scientific simulation in tandem with theory and experiment is an essential tool for understanding complex plasma behavior. This talk will review recent progress and future directions for advanced simulations in magnetically-confined plasmas with illustrative examples chosen from areas such as microturbulence, magnetohydrodynamics, magnetic reconnection, and others. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales together with access to powerful new computational resources. In particular, the fusion energy science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPP's). A good example is the effective usage of the full power of multi-teraflop MPP's to produce 3-dimensional, general geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of turbulence self-regulation by zonal flows. It should be emphasized that these calculations, which typically utilized billions of particles for tens of thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. The associated scientific excitement should serve to stimulate improved cross-cutting collaborations with other fields and also to help attract

  9. User-Centered Computer Aided Language Learning

    ERIC Educational Resources Information Center

    Zaphiris, Panayiotis, Ed.; Zacharia, Giorgos, Ed.

    2006-01-01

    In the field of computer aided language learning (CALL), there is a need for emphasizing the importance of the user. "User-Centered Computer Aided Language Learning" presents methodologies, strategies, and design approaches for building interfaces for a user-centered CALL environment, creating a deeper understanding of the opportunities and…

  10. Advanced flight computer. Special study

    NASA Technical Reports Server (NTRS)

    Coo, Dennis

    1995-01-01

This report documents a special study to define a 32-bit radiation hardened, SEU tolerant flight computer architecture, and to investigate current or near-term technologies and development efforts that contribute to the Advanced Flight Computer (AFC) design and development. An AFC processing node architecture is defined. Each node may consist of a multi-chip processor as needed. The modular, building block approach uses VLSI technology and packaging methods that demonstrate a feasible AFC module in 1998 that meets the AFC goals. The defined architecture and approach demonstrate a clear low-risk, low-cost path to the 1998 production goal, with intermediate prototypes in 1996.

  11. Center for Technology for Advanced Scientific Component Software (TASCS)

    SciTech Connect

    Damevski, Kostadin

    2009-03-30

A resounding success of the Scientific Discovery through Advanced Computing (SciDAC) program is that high-performance computational science is now universally recognized as a critical aspect of scientific discovery [71], complementing both theoretical and experimental research. As scientific communities prepare to exploit unprecedented computing capabilities of emerging leadership-class machines for multi-model simulations at the extreme scale [72], it is more important than ever to address the technical and social challenges of geographically distributed teams that combine expertise in domain science, applied mathematics, and computer science to build robust and flexible codes that can incorporate changes over time. The Center for Technology for Advanced Scientific Component Software (TASCS) tackles these issues by exploiting component-based software development to facilitate collaborative high-performance scientific computing.

  12. Center for Advanced Modeling and Simulation Intern

    SciTech Connect

    Gertman, Vanessa

    2010-01-01

    Some interns just copy papers and seal envelopes. Not at INL! Check out how Vanessa Gertman, an INL intern working at the Center for Advanced Modeling and Simulation, spent her summer working with some intense visualization software. Lots more content like this is available at INL's facebook page http://www.facebook.com/idahonationallaboratory.

  13. National Center for Advanced Manufacturing Overview

    NASA Technical Reports Server (NTRS)

    Vickers, John H.

    2000-01-01

    This paper presents a general overview of the National Center for Advanced Manufacturing, with an emphasis on Aerospace Materials, Processes and Environmental Technology. The topics include: 1) Background; 2) Mission; 3) Technology Development Approach; 4) Space Transportation Significance; 5) Partnering; 6) NCAM MAF Project; 7) NASA & Calhoun Community College; 8) Educational Development; and 9) Intelligent Synthesis Environment. This paper is presented in viewgraph form.

  14. Center for Advanced Modeling and Simulation Intern

    ScienceCinema

    Gertman, Vanessa

    2016-07-12

    Some interns just copy papers and seal envelopes. Not at INL! Check out how Vanessa Gertman, an INL intern working at the Center for Advanced Modeling and Simulation, spent her summer working with some intense visualization software. Lots more content like this is available at INL's facebook page http://www.facebook.com/idahonationallaboratory.

  15. Advanced technologies for Mission Control Centers

    NASA Technical Reports Server (NTRS)

    Dalton, John T.; Hughes, Peter M.

    1991-01-01

    Advanced technologies for Mission Control Centers are presented in the form of viewgraphs. The following subject areas are covered: technology needs; current technology efforts at GSFC (human-machine interface development, object oriented software development, expert systems, knowledge-based software engineering environments, and high performance VLSI telemetry systems); and test beds.

  16. Computers in Information Data Centers.

    ERIC Educational Resources Information Center

    Clifton, Joe Ann, Ed.; Helgeson, Duane, Ed.

    This collection of ten conference reports begins with "Historical Impact of Computers on the Operation of Libraries and Information Systems: A Personal Perspective" and is followed by "Tips on Computer Software: Advantages and Methods of Program Dissemination" and "BLOCS--A Unique Multi-Dimensional Approach to On-Line Circulation". Next, libraries…

  17. Center for Technology for Advanced Scientific Component Software (TASCS)

    SciTech Connect

    Kostadin, Damevski

    2015-01-25

    A resounding success of the Scientific Discovery through Advanced Computing (SciDAC) program is that high-performance computational science is now universally recognized as a critical aspect of scientific discovery [71], complementing both theoretical and experimental research. As scientific communities prepare to exploit unprecedented computing capabilities of emerging leadership-class machines for multi-model simulations at the extreme scale [72], it is more important than ever to address the technical and social challenges of geographically distributed teams that combine expertise in domain science, applied mathematics, and computer science to build robust and flexible codes that can incorporate changes over time. The Center for Technology for Advanced Scientific Component Software (TASCS) tackles these issues by exploiting component-based software development to facilitate collaborative high-performance scientific computing.

  18. Computer Simulation of Community Mental Health Centers.

    ERIC Educational Resources Information Center

    Cox, Gary B.; And Others

    1985-01-01

    Describes an ongoing research project designed to develop a computer model capable of simulating the service delivery activities of community mental health care centers and human service agencies. The goal and methodology of the project are described. (NB)

  19. Management Controls in Navy Computing Centers.

    DTIC Science & Technology

    1984-03-01

    Naval Postgraduate School, Monterey, California. Master's thesis: Management Controls in Navy Computing Centers, by Dewey R. Collier, March 1984.

  20. Plug Pulled on Chemistry Computer Center.

    ERIC Educational Resources Information Center

    Robinson, Arthur L.

    1980-01-01

    Discusses the controversy surrounding the initial decision to establish, and the current decision to phase out, the National Resource for Computation in Chemistry (NRCC), a computational chemistry center jointly sponsored by the National Science Foundation and the Department of Energy. (CS)

  1. Center for Advanced Energy Studies Program Plan

    SciTech Connect

    Kevin Kostelnik

    2005-09-01

    The world is facing critical energy-related challenges regarding world and national energy demands, advanced science and energy technology delivery, nuclear engineering educational shortfalls, and adequately trained technical staff. Resolution of these issues is important for the United States to ensure a secure and affordable energy supply, which is essential for maintaining U.S. national security, continued economic prosperity, and future sustainable development. One way that the U.S. Department of Energy (DOE) is addressing these challenges is by tasking the Battelle Energy Alliance, LLC (BEA) with developing the Center for Advanced Energy Studies (CAES) at the Idaho National Laboratory (INL). By 2015, CAES will be a self-sustaining, world-class, academic and research institution where the INL; DOE; Idaho, regional, and other national universities; and the international community will cooperate to conduct critical energy-related research, classroom instruction, technical training, policy conceptualization, public dialogue, and other events.

  2. Computer Maintenance Operations Center (CMOC), additional computer support equipment ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Computer Maintenance Operations Center (CMOC), additional computer support equipment - Beale Air Force Base, Perimeter Acquisition Vehicle Entry Phased-Array Warning System, Technical Equipment Building, End of Spencer Paul Road, north of Warren Shingle Road (14th Street), Marysville, Yuba County, CA

  3. Computer Maintenance Operations Center (CMOC), showing duplexed cyber 170-174 computers ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Computer Maintenance Operations Center (CMOC), showing duplexed cyber 170-174 computers - Beale Air Force Base, Perimeter Acquisition Vehicle Entry Phased-Array Warning System, Technical Equipment Building, End of Spencer Paul Road, north of Warren Shingle Road (14th Street), Marysville, Yuba County, CA

  4. Center for Advanced Gas Turbine Systems Research

    SciTech Connect

    Golan, L.P.

    1992-12-31

    An unregulated conventional power station based on the Rankine Cycle typically burns pulverized coal in a boiler that exports steam for expansion through a steam turbine, which ultimately drives an electric generator. The flue gases are normally cleaned of particulates by an electrostatic precipitator or bag house. A basic cycle such as this will have an efficiency of approximately 35%, with 10% of the energy released through the stack and 55% to cooling water. Advanced gas turbine based combustion systems have the potential to be environmentally and commercially superior to existing conventional technology. However, to date, industry, academic, and government groups have not coordinated their efforts to commercialize these technologies. The Center for Advanced Gas Turbine Systems Research will provide the medium to support effective commercialization of this technology. Several cycles or concepts for advanced gas turbine systems that could be fired on natural gas or could be adapted into coal based systems have been proposed (for examples, see Figures 4, 5, 6, and 7) (2), all with varying degrees of complexity, research needs, and system potential. Natural gas fired power systems are now available with 52% efficiency ratings; however, with a focused base technology program, it is expected that the efficiency levels can be increased to the 60% level and beyond. This increase in efficiency will significantly reduce the environmental burden and reduce the cost of power generation.
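    The 35% / 10% / 55% split quoted above is a simple first-law energy balance on the fuel heat input. A minimal sketch of that balance (illustrative only; the function and parameter names here are our own, not from the report):

    ```python
    def rankine_balance(fuel_mw, eta_elec=0.35, stack_frac=0.10, cooling_frac=0.55):
        """Split a fuel heat input (MW thermal) into electric output,
        stack loss, and cooling-water loss, per the fractions quoted
        in the abstract. The three fractions must sum to 1."""
        assert abs(eta_elec + stack_frac + cooling_frac - 1.0) < 1e-9
        return {
            "electric_mw": fuel_mw * eta_elec,
            "stack_mw": fuel_mw * stack_frac,
            "cooling_mw": fuel_mw * cooling_frac,
        }

    # For a 1000 MW(thermal) fuel input at the quoted 35% efficiency:
    out = rankine_balance(1000.0)
    ```

    Raising electrical efficiency from 35% toward the 60% cited for advanced gas turbine cycles shrinks the cooling and stack terms correspondingly, which is the environmental argument the abstract makes.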

  5. Center for Advanced Gas Turbine Systems Research

    SciTech Connect

    Golan, L.P.

    1992-01-01

    An unregulated conventional power station based on the Rankine Cycle typically burns pulverized coal in a boiler that exports steam for expansion through a steam turbine, which ultimately drives an electric generator. The flue gases are normally cleaned of particulates by an electrostatic precipitator or bag house. A basic cycle such as this will have an efficiency of approximately 35%, with 10% of the energy released through the stack and 55% to cooling water. Advanced gas turbine based combustion systems have the potential to be environmentally and commercially superior to existing conventional technology. However, to date, industry, academic, and government groups have not coordinated their efforts to commercialize these technologies. The Center for Advanced Gas Turbine Systems Research will provide the medium to support effective commercialization of this technology. Several cycles or concepts for advanced gas turbine systems that could be fired on natural gas or could be adapted into coal based systems have been proposed (for examples, see Figures 4, 5, 6, and 7) (2), all with varying degrees of complexity, research needs, and system potential. Natural gas fired power systems are now available with 52% efficiency ratings; however, with a focused base technology program, it is expected that the efficiency levels can be increased to the 60% level and beyond. This increase in efficiency will significantly reduce the environmental burden and reduce the cost of power generation.

  6. Application of advanced computational technology to propulsion CFD

    NASA Astrophysics Data System (ADS)

    Szuch, John R.

    The Internal Fluid Mechanics Division of the NASA Lewis Research Center is combining the key elements of computational fluid dynamics, aerothermodynamic experiments, and advanced computational technology to bring internal computational fluid dynamics (ICFM) to a state of practical application for aerospace propulsion system design. This paper presents an overview of efforts underway at NASA Lewis to advance and apply computational technology to ICFM. These efforts include the use of modern, software engineering principles for code development, the development of an AI-based user-interface for large codes, the establishment of a high-performance, data communications network to link ICFM researchers and facilities, and the application of parallel processing to speed up computationally intensive and/or time-critical ICFM problems. A multistage compressor flow physics program is cited as an example of efforts to use advanced computational technology to enhance a current NASA Lewis ICFM research program.

  7. Digital optical computers at the optoelectronic computing systems center

    NASA Technical Reports Server (NTRS)

    Jordan, Harry F.

    1991-01-01

    The Digital Optical Computing Program within the National Science Foundation Engineering Research Center for Opto-electronic Computing Systems has as its specific goal research on optical computing architectures suitable for use at the highest possible speeds. The program can be targeted toward exploiting the time domain because other programs in the Center are pursuing research on parallel optical systems, exploiting optical interconnection and optical devices and materials. Using a general purpose computing architecture as the focus, we are developing design techniques, tools, and architectures for operation at the speed-of-light limit. Experimental work is being done with the somewhat low speed components currently available, but with architectures which will scale up in speed as faster devices are developed. The design algorithms and tools developed for a general purpose, stored program computer are being applied to other systems such as optimally controlled optical communication networks.

  8. Argonne's Laboratory computing center - 2007 annual report.

    SciTech Connect

    Bair, R.; Pieper, G. W.

    2008-05-28

    Argonne National Laboratory founded the Laboratory Computing Resource Center (LCRC) in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. In September 2002 the LCRC deployed a 350-node computing cluster from Linux NetworX to address Laboratory needs for mid-range supercomputing. This cluster, named 'Jazz', achieved over a teraflop of computing power (10^12 floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the 50 fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2007, there were over 60 active projects representing a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to foster growth in the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure providers to offer more scientific data management capabilities, expanding Argonne staff use of national computing facilities, and improving the scientific reach and
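    The figures quoted for Jazz imply a modest per-node rate, typical of 2002-era commodity clusters. A back-of-the-envelope check (illustrative only, using the round numbers from the abstract):

    ```python
    # Sanity check on the Jazz cluster figures quoted above.
    nodes = 350
    cluster_flops = 1.0e12            # "over a teraflop" on standard tests
    per_node_flops = cluster_flops / nodes
    print(f"{per_node_flops / 1e9:.2f} GFLOP/s per node")  # roughly 2.86
    ```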

  9. Re-Centering the Research Computing Enterprise

    ERIC Educational Resources Information Center

    McRobbie, Michael A.

    2006-01-01

    The insatiable institutional demands for computing cycles, network bandwidth, and storage clearly demonstrate that IT is a mission-critical function in nearly all areas of higher education. Not too long ago, the important issue for the central data center was physical size and floor space. As IT leaders struggle to meet relentlessly increasing…

  10. Hibbing Community College's Community Computer Center.

    ERIC Educational Resources Information Center

    Regional Technology Strategies, Inc., Carrboro, NC.

    This paper reports on the development of the Community Computer Center (CCC) at Hibbing Community College (HCC) in Minnesota. HCC is located in the largest iron mining area in the United States. Closures of steel-producing plants are affecting the Hibbing area. Outmigration, particularly of younger workers and their families, has been…

  11. Computer Access Centers for the Disabled.

    ERIC Educational Resources Information Center

    Hawk, Doug

    1995-01-01

    Describes Computer Access Centers (CACs), created by the Colorado Community College and Occupational Education System to help students with disabilities learn job skills and improve their lives. Indicates that, despite fears about program costs, CACs are in place at 11 sites and have expanded access for people with disabilities. (MAB)

  12. CENTER FOR ADVANCED SEPARATION TECHNOLOGY (CAST) PROGRAM

    SciTech Connect

    Yoon, Roe-Hoan; Hull, Christopher

    2014-09-30

    The U.S. is the largest producer of mining products in the world. In 2011, U.S. mining operations contributed a total of $232 billion to the nation’s GDP plus $138 billion in labor income. Of this, the coal mining industry contributed $97.5 billion to GDP plus $53 billion in labor income. Despite these contributions, the industry has not been well supported with research and development funds as compared to mining industries in other countries. To overcome this problem, the Center for Advanced Separation Technologies (CAST) was established to develop technologies that can be used by the U.S. mining industry to create new products, reduce production costs, and meet environmental regulations.

  13. The New Center for Advanced Energy Studies

    SciTech Connect

    L.J. Bond; K. Kostelnik; R.A. Wharton; A. Kadak

    2006-06-01

    A secure and affordable energy supply is essential for achieving U.S. national security, continuing U.S. prosperity, and laying the foundation for future economic growth. The next generation energy workforce in the U.S. is a critical element in meeting both national and global energy needs. The Center for Advanced Energy Studies (CAES) was established in 2005 in response to U.S. Department of Energy (DOE) requirements. CAES, located at the new Idaho National Laboratory (INL), will address critical energy education, research, policy study and training needs. CAES is a unique joint partnership between the Battelle Energy Alliance (BEA), the State of Idaho, an Idaho University Consortium (IUC), and a National University Consortium (NUC). CAES will be based in a new facility that will foster collaborative academic and research efforts among participating institutions.

  14. Telemetry Computer System at Wallops Flight Center

    NASA Technical Reports Server (NTRS)

    Bell, H.; Strock, J.

    1980-01-01

    This paper describes the Telemetry Computer System in operation at NASA's Wallops Flight Center for real-time or off-line processing, storage, and display of telemetry data from rockets and aircraft. The system accepts one or two PCM data streams and one FM multiplex, converting each type of data into computer format and merging time-of-day information. A data compressor merges the active streams, and removes redundant data if desired. Dual minicomputers process data for display, while storing information on computer tape for further processing. Real-time displays are located at the station, at the rocket launch control center, and in the aircraft control tower. The system is set up and run by standard telemetry software under control of engineers and technicians. Expansion capability is built into the system to take care of possible future requirements.
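    The abstract does not specify the compressor's algorithm, only that it merges the active streams and "removes redundant data if desired." One common scheme for that kind of telemetry redundancy removal is deadband compression: drop a sample unless it differs from the last kept value by more than a threshold. A hedged sketch of that generic technique (our illustration, not the actual Wallops implementation):

    ```python
    def deadband_compress(samples, deadband=0.0):
        """Keep only samples whose value differs from the last kept
        value by more than `deadband`. `samples` is a sequence of
        (timestamp, value) pairs; the first sample is always kept.
        Illustrative only -- the real Wallops compressor algorithm
        is not described in the abstract."""
        kept = []
        last = None
        for t, v in samples:
            if last is None or abs(v - last) > deadband:
                kept.append((t, v))
                last = v
        return kept

    # A flat-lining channel compresses to its transitions:
    result = deadband_compress([(0, 1.0), (1, 1.0), (2, 1.0), (3, 2.0)])
    ```

    With `deadband=0.0` this reduces to simple repeat suppression; a nonzero deadband also discards samples within the sensor's noise floor, at the cost of losing small real changes.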

  15. The Advanced Technology Development Center (ATDC)

    NASA Technical Reports Server (NTRS)

    Clements, G. R.; Willcoxon, R. (Technical Monitor)

    2001-01-01

    NASA is building the Advanced Technology Development Center (ATDC) to provide a 'national resource' for the research, development, demonstration, testing, and qualification of Spaceport and Range Technologies. The ATDC will be located at Space Launch Complex 20 (SLC-20) at Cape Canaveral Air Force Station (CCAFS) in Florida. SLC-20 currently provides a processing and launch capability for small-scale rockets; this capability will be augmented with additional ATDC facilities to provide a comprehensive and integrated in situ environment. Examples of Spaceport Technologies that will be supported by ATDC infrastructure include densified cryogenic systems, intelligent automated umbilicals, integrated vehicle health management systems, next-generation safety systems, and advanced range systems. The ATDC can be thought of as a prototype spaceport where industry, government, and academia, in partnership, can work together to improve safety of future space initiatives. The ATDC is being deployed in five separate phases. Major ATDC facilities will include a Liquid Oxygen Area; a Liquid Hydrogen Area; a Liquid Nitrogen Area; a multipurpose Launch Mount; an 'Iron Rocket' Test Demonstrator; a Processing Facility with a Checkout and Control System; and Future Infrastructure Developments. Initial ATDC development will be completed in 2006.

  16. Center for Advanced Signal and Imaging Sciences Workshop 2004

    SciTech Connect

    McClellan, J H; Carrano, C; Poyneer, L; Palmer, D; Baker, K; Chen, D; London, R; Weinert, G; Brase, J; Paglieroni, D; Lopez, A; Grant, C W; Wright, W; Burke, M; Miller, W O; DeTeresa, S; White, D; Toeppen, J; Haugen, P; Kamath, C; Nguyen, T; Manay, S; Newsam, S; Cantu-Paz, E; Pao, H; Chang, J; Chambers, D; Leach, R; Paulson, C; Romero, C E; Spiridon, A; Vigars, M; Welsh, P; Zumstein, J; Romero, K; Oppenheim, A; Harris, D B; Dowla, F; Brown, C G; Clark, G A; Ong, M M; Clance, T J; Kegelmeyer, l M; Benzuijen, M; Bliss, E; Burkhart, S; Conder, A; Daveler, S; Ferguson, W; Glenn, S; Liebman, J; Norton, M; Prasad, R; Salmon, T; Kegelmeyer, L M; Hafiz, O; Cheung, S; Fodor, I; Aufderheide, M B; Bary, A; Martz, Jr., H E; Burke, M W; Benson, S; Fisher, K A; Quarry, M J

    2004-11-15

    Welcome to the Eleventh Annual C.A.S.I.S. Workshop, a yearly event at the Lawrence Livermore National Laboratory, presented by the Center for Advanced Signal & Image Sciences, or CASIS, and sponsored by the LLNL Engineering Directorate. Every November for the last 10 years we have convened a diverse set of engineering and scientific talent to share their work in signal processing, imaging, communications, controls, along with associated fields of mathematics, statistics, and computing sciences. This year is no exception, with sessions in Adaptive Optics, Applied Imaging, Scientific Data Mining, Electromagnetic Image and Signal Processing, Applied Signal Processing, National Ignition Facility (NIF) Imaging, and Nondestructive Characterization.

  17. NASA's National Center for Advanced Manufacturing

    NASA Technical Reports Server (NTRS)

    Vickers, John

    2003-01-01

    NASA has designated the Principal Center Assignment to the Marshall Space Flight Center (MSFC) for implementation of the National Center for Advanced Manufacturing (NCAM). NCAM is NASA's leading resource for the aerospace manufacturing research, development, and innovation needs that are critical to the goals of the Agency. Through this initiative NCAM's people work together with government, industry, and academia to ensure the technology base and national infrastructure are available to develop innovative manufacturing technologies with broad application to NASA Enterprise programs and U.S. industry. Educational enhancements are ever-present within the NCAM focus to promote research, to inspire participation, and to support education and training in manufacturing. Many important accomplishments took place during 2002. Through NCAM, NASA was among five federal agencies involved in manufacturing research and development (R&D) to launch a major effort to exchange information and cooperate directly to enhance the payoffs from federal investments. The Government Agencies Technology Exchange in Manufacturing (GATE-M) is the only active effort to specifically and comprehensively address manufacturing R&D across the federal government. Participating agencies include the departments of Commerce (represented by the National Institute of Standards and Technology), Defense, and Energy, as well as the National Science Foundation and NASA. MSFC's ongoing partnership with the State of Louisiana, the University of New Orleans, and Lockheed Martin Corporation at the Michoud Assembly Facility (MAF) progressed significantly. Major capital investments were initiated for world-class equipment additions including a universal friction stir welding system, composite fiber placement machine, five-axis machining center, and ten-axis laser ultrasonic nondestructive test system. The NCAM consortium of five universities led by University of New Orleans with Mississippi State University

  18. Computer Bits: The Ideal Computer System for Your Center.

    ERIC Educational Resources Information Center

    Brown, Dennis; Neugebauer, Roger

    1986-01-01

    Reviews five computer systems that can address the needs of a child care center: (1) Sperry PC IT with Bernoulli Box, (2) Compaq DeskPro 286, (3) Macintosh Plus, (4) Epson Equity II, and (5) Leading Edge Model "D." (HOD)

  19. THE CENTER FOR DATA INTENSIVE COMPUTING

    SciTech Connect

    GLIMM,J.

    2001-11-01

    CDIC will provide state-of-the-art computational and computer science for the Laboratory and for the broader DOE and scientific community. We achieve this goal by performing advanced scientific computing research in the Laboratory's mission areas of High Energy and Nuclear Physics, Biological and Environmental Research, and Basic Energy Sciences. We also assist other groups at the Laboratory to reach new levels of achievement in computing. We are "data intensive" because the production and manipulation of large quantities of data are hallmarks of scientific research in the 21st century and are intrinsic features of major programs at Brookhaven. An integral part of our activity to accomplish this mission will be a close collaboration with the University at Stony Brook.

  20. THE CENTER FOR DATA INTENSIVE COMPUTING

    SciTech Connect

    GLIMM,J.

    2002-11-01

    CDIC will provide state-of-the-art computational and computer science for the Laboratory and for the broader DOE and scientific community. We achieve this goal by performing advanced scientific computing research in the Laboratory's mission areas of High Energy and Nuclear Physics, Biological and Environmental Research, and Basic Energy Sciences. We also assist other groups at the Laboratory to reach new levels of achievement in computing. We are "data intensive" because the production and manipulation of large quantities of data are hallmarks of scientific research in the 21st century and are intrinsic features of major programs at Brookhaven. An integral part of our activity to accomplish this mission will be a close collaboration with the University at Stony Brook.

  1. THE CENTER FOR DATA INTENSIVE COMPUTING

    SciTech Connect

    GLIMM,J.

    2003-11-01

    CDIC will provide state-of-the-art computational and computer science for the Laboratory and for the broader DOE and scientific community. We achieve this goal by performing advanced scientific computing research in the Laboratory's mission areas of High Energy and Nuclear Physics, Biological and Environmental Research, and Basic Energy Sciences. We also assist other groups at the Laboratory to reach new levels of achievement in computing. We are "data intensive" because the production and manipulation of large quantities of data are hallmarks of scientific research in the 21st century and are intrinsic features of major programs at Brookhaven. An integral part of our activity to accomplish this mission will be a close collaboration with the University at Stony Brook.

  2. Human Centered Computing for Mars Exploration

    NASA Technical Reports Server (NTRS)

    Trimble, Jay

    2005-01-01

    The science objectives are to determine the aqueous, climatic, and geologic history of a site on Mars where conditions may have been favorable to the preservation of evidence of prebiotic or biotic processes. Human Centered Computing is a development process that starts with users and their needs, rather than with technology. The goal is a system design that serves the user, where the technology fits the task and the complexity is that of the task not of the tool.

  3. Aerodynamic Analyses Requiring Advanced Computers, part 2

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Papers given at the conference present the results of theoretical research on aerodynamic flow problems requiring the use of advanced computers. Topics discussed include two-dimensional configurations, three-dimensional configurations, transonic aircraft, and the space shuttle.

  4. Aerodynamic Analyses Requiring Advanced Computers, Part 1

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Papers are presented which deal with results of theoretical research on aerodynamic flow problems requiring the use of advanced computers. Topics discussed include: viscous flows, boundary layer equations, turbulence modeling and Navier-Stokes equations, and internal flows.

  5. Bringing Advanced Computational Techniques to Energy Research

    SciTech Connect

    Mitchell, Julie C

    2012-11-17

    Please find attached our final technical report for the BACTER Institute award. BACTER was created as a graduate and postdoctoral training program for the advancement of computational biology applied to questions of relevance to bioenergy research.

  6. Quantum chromodynamics with advanced computing

    SciTech Connect

    Kronfeld, Andreas S.; /Fermilab

    2008-07-01

    We survey results in lattice quantum chromodynamics from groups in the USQCD Collaboration. The main focus is on physics, but many aspects of the discussion are aimed at an audience of computational physicists.

  7. A Computer Learning Center for Environmental Sciences

    NASA Technical Reports Server (NTRS)

    Mustard, John F.

    2000-01-01

    In the fall of 1998, MacMillan Hall opened at Brown University to students. In MacMillan Hall was the new Computer Learning Center, since named the EarthLab, which was outfitted with high-end workstations and peripherals primarily focused on the use of remotely sensed and other spatial data in the environmental sciences. The NASA grant we received as part of the "Centers of Excellence in Applications of Remote Sensing to Regional and Global Integrated Environmental Assessments" was the primary source of funds to outfit this learning and research center. Since opening, we have expanded the range of learning and research opportunities and integrated a cross-campus network of disciplines who have come together to learn and use spatial data of all kinds. The EarthLab also forms a core of undergraduate, graduate, and faculty research on environmental problems that draw upon the unique perspective of remotely sensed data. Over the last two years, the EarthLab has been a center for research on the environmental impact of water resource use in arid regions, impact of the green revolution on forest cover in India, the design of forest preserves in Vietnam, and detailed assessments of the utility of thermal and hyperspectral data for water quality analysis. It has also been used extensively for local environmental activities, in particular studies on the impact of lead on the health of urban children in Rhode Island. Finally, the EarthLab has also served as a key educational and analysis center for activities related to the Brown University Affiliated Research Center that is devoted to transferring university research to the private sector.

  8. National Center for Advancing Translational Sciences

    MedlinePlus

    Research Team Advances Evatar Female Reproductive System: through its Tissue Chip for Drug Screening program, NCATS supports a research team advancing Evatar, a model of the female reproductive system.

  9. Developing an Advanced Environment for Collaborative Computing

    NASA Technical Reports Server (NTRS)

    Becerra-Fernandez, Irma; Stewart, Helen; DelAlto, Martha; Knight, Chris

    1999-01-01

    Knowledge management, in general, tries to organize and make available important know-how whenever and wherever it is needed. Today, organizations rely on decision-makers to produce "mission critical" decisions that are based on inputs from multiple domains. The ideal decision-maker has a profound understanding of the specific domains that influence the decision-making process, coupled with the experience that allows them to act quickly and decisively on the information. In addition, learning companies benefit by not repeating costly mistakes and by reducing time-to-market in research and development projects. Group decision-making tools can help companies make better decisions by capturing the knowledge from groups of experts. Furthermore, companies that capture their customers' preferences can improve their customer service, which translates to larger profits. Therefore, collaborative computing provides a common communication space, improves sharing of knowledge, provides a mechanism for real-time feedback on the tasks being performed, helps to optimize processes, and results in a centralized knowledge warehouse. This paper presents the research directions of a project which seeks to augment an advanced collaborative web-based environment called Postdoc with workflow capabilities. Postdoc is a "government-off-the-shelf" document management software package developed at NASA Ames Research Center (ARC).

  10. Center for the Advancement of Health

    MedlinePlus

    ... YouTube CFAH PARTNERS Alliance for Quality Psychosocial Cancer Care Kellogg Health Scholars Program KP Burch Leadership Program Diversity Data Place, Migration & Health Network * The Center for ...

  11. Center for Space Power and Advanced Electronics, Auburn University

    NASA Technical Reports Server (NTRS)

    Deis, Dan W.; Hopkins, Richard H.

    1991-01-01

    The union of Auburn University's Center for Space Power and Advanced Electronics and the Westinghouse Science and Technology Center to form a Center for the Commercial Development of Space (CCDS) is discussed. An area of focus for the CCDS will be the development of silicon carbide electronics technology, in terms of semiconductors and crystal growth. The discussion is presented in viewgraph form.

  12. Opportunities in computational mechanics: Advances in parallel computing

    SciTech Connect

    Lesar, R.A.

    1999-02-01

    In this paper, the authors will discuss recent advances in computing power and the prospects for using these new capabilities for studying plasticity and failure. They will first review the new capabilities made available with parallel computing. They will discuss how these machines perform and how well their architecture might work on materials issues. Finally, they will give some estimates on the size of problems possible using these computers.

  13. Advances and challenges in computational plasma science

    NASA Astrophysics Data System (ADS)

    Tang, W. M.

    2005-02-01

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behaviour. Recent advances in simulations of magnetically confined plasmas are reviewed in this paper, with illustrative examples, chosen from associated research areas such as microturbulence, magnetohydrodynamics and other topics. Progress has been stimulated, in particular, by the exponential growth of computer speed along with significant improvements in computer technology. The advances in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics have produced increasingly good agreement between experimental observations and computational modelling. This was enabled by two key factors: (a) innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales and (b) access to powerful new computational resources. Excellent progress has been made in developing codes for which computer run-time and problem-size scale well with the number of processors on massively parallel processors (MPPs). Examples include the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPPs to produce three-dimensional, general geometry, nonlinear particle simulations that have accelerated advances in understanding the nature of turbulence self-regulation by zonal flows. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In looking towards the future, the current results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. 

  14. [Activities of Research Institute for Advanced Computer Science]

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2001-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center, Moffett Field, California. RIACS research focuses on the three cornerstones of IT research necessary to meet the future challenges of NASA missions: 1. Automated Reasoning for Autonomous Systems: Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. 2. Human-Centered Computing: Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities. 3. High Performance Computing and Networking: Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to analysis of large scientific datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply IT research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, visiting scientist programs and student summer programs, designed to encourage and facilitate collaboration between the university and NASA IT research communities.

  15. Role of HPC in Advancing Computational Aeroelasticity

    NASA Technical Reports Server (NTRS)

    Guruswamy, Guru P.

    2004-01-01

    On behalf of the High Performance Computing Modernization Program (HPCMP) and the NASA Advanced Supercomputing Division (NAS), a study was conducted to assess the role of supercomputers in the computational aeroelasticity of aerospace vehicles. The study is based mostly on responses to a web-based questionnaire designed to capture the nuances of high performance computational aeroelasticity, particularly on parallel computers. A procedure is presented to assign a fidelity-complexity index to each application. Case studies based on major applications using HPCMP resources are presented.

  16. Advances and trends in computational structural mechanics

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Atluri, Satya N.

    1987-01-01

    The development status and range of application of techniques in computational structural mechanics (CSM) are evaluated with a view to advances in computational models for material behavior, discrete-element technology, quality assessment, the control of numerical simulations of structural response, hybrid analysis techniques, techniques for large-scale optimization, and the impact of new computing systems on CSM. The primary pacers of CSM development encompass the prediction and analysis of novel materials for structural components, computational strategies for large-scale structural calculations, and the assessment of response-prediction reliability together with its adaptive improvement.

  17. Advances and Challenges in Computational Plasma Science

    SciTech Connect

    W.M. Tang; V.S. Chan

    2005-01-03

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behavior. Recent advances in simulations of magnetically-confined plasmas are reviewed in this paper with illustrative examples chosen from associated research areas such as microturbulence, magnetohydrodynamics, and other topics. Progress has been stimulated in particular by the exponential growth of computer speed along with significant improvements in computer technology.

  18. Advanced computational tools for 3-D seismic analysis

    SciTech Connect

    Barhen, J.; Glover, C.W.; Protopopescu, V.A.

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis and test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY 1993-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the computational innovations that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  19. Center for Advanced Space Propulsion Second Annual Technical Symposium Proceedings

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The proceedings for the Center for Advanced Space Propulsion Second Annual Technical Symposium are divided as follows: Chemical Propulsion, CFD; Space Propulsion; Electric Propulsion; Artificial Intelligence; Low-G Fluid Management; and Rocket Engine Materials.

  20. Cosmos, an international center for advanced studies

    NASA Technical Reports Server (NTRS)

    Ryzhov, Iurii; Alifanov, Oleg; Sadin, Stanley; Coleman, Paul

    1990-01-01

    The concept of Cosmos, a Soviet operating center for aerospace activities, is presented. The main Cosmos participants are the Institute for Aerospace Education, the Institute for Research and Commercial Development, and the Department of Space Policy and Socio-Economic Studies. Cosmos sponsors a number of educational programs, basic research, and studies of the social impact of space-related technologies.

  1. Center For Advanced Energy Studies Overview

    ScienceCinema

    Blackman, Harold

    2016-07-12

    A collaboration between Idaho National Laboratory, Boise State University, Idaho State University and the University of Idaho. Conducts research in nuclear energy, advanced materials, carbon management, bioenergy, energy policy, modeling and simulation, and energy efficiency. Educates next generation of energy workforce. Visit us at www.caesenergy.org.

  2. Center For Advanced Energy Studies Overview

    SciTech Connect

    Blackman, Harold

    2011-01-01

    A collaboration between Idaho National Laboratory, Boise State University, Idaho State University and the University of Idaho. Conducts research in nuclear energy, advanced materials, carbon management, bioenergy, energy policy, modeling and simulation, and energy efficiency. Educates next generation of energy workforce. Visit us at www.caesenergy.org.

  3. Computational Nanotechnology at NASA Ames Research Center, 1996

    NASA Technical Reports Server (NTRS)

    Globus, Al; Bailey, David; Langhoff, Steve; Pohorille, Andrew; Levit, Creon; Chancellor, Marisa K. (Technical Monitor)

    1996-01-01

    Some forms of nanotechnology appear to have enormous potential to improve aerospace and computer systems; computational nanotechnology, the design and simulation of programmable molecular machines, is crucial to progress. NASA Ames Research Center has begun a computational nanotechnology program including in-house work, external research grants, and grants of supercomputer time. Four goals have been established: (1) simulate a hypothetical programmable molecular machine replicating itself and building other products; (2) develop molecular manufacturing CAD (computer-aided design) software and use it to design molecular manufacturing systems and products of aerospace interest, including computer components; (3) characterize nanotechnologically accessible materials of aerospace interest (such materials may have excellent strength and thermal properties); and (4) collaborate with experimentalists. Current in-house activities include: (1) development of NanoDesign, software to design and simulate a nanotechnology based on functionalized fullerenes, with early work focused on gears; (2) a design for high-density atomically precise memory; (3) design of nanotechnology systems based on biology; (4) characterization of diamondoid mechanosynthetic pathways; (5) studies of the Laplacian of the electronic charge density to understand molecular structure and reactivity; (6) studies of entropic effects during self-assembly; and (7) characterization of properties of matter for clusters up to sizes exhibiting bulk properties. In addition, the NAS (NASA Advanced Supercomputing) division sponsored a workshop on computational molecular nanotechnology, held at NASA Ames Research Center on March 4-5, 1996. Finally, collaborations with Bill Goddard at Caltech, Ralph Merkle at Xerox PARC, Don Brenner at NCSU (North Carolina State University), Tom McKendree at Hughes, and Todd Wipke at UCSC are underway.

  4. Advanced Computing Architectures for Cognitive Processing

    DTIC Science & Technology

    2009-07-01


  5. Predictive Dynamic Security Assessment through Advanced Computing

    SciTech Connect

    Huang, Zhenyu; Diao, Ruisheng; Jin, Shuangshuang; Chen, Yousu

    2014-11-30

    Traditional dynamic security assessment is limited by several factors and thus falls short in providing real-time information that is predictive for power system operation. These factors include the steady-state assumption of current operating points, static transfer limits, and low computational speed. This paper addresses these factors and frames predictive dynamic security assessment. The primary objective of predictive dynamic security assessment is to enhance the functionality and computational process of dynamic security assessment through the use of high-speed phasor measurements and the application of advanced computing technologies for faster-than-real-time simulation. This paper presents algorithms, computing platforms, and simulation frameworks that constitute the predictive dynamic security assessment capability. Examples of phasor application and fast computation for dynamic security assessment are included to demonstrate the feasibility and speed enhancement for real-time applications.

  6. Simulation methods for advanced scientific computing

    SciTech Connect

    Booth, T.E.; Carlson, J.A.; Forster, R.A.

    1998-11-01

    This is the final report of a three-year, Laboratory Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). The objective of the project was to create effective new algorithms for solving N-body problems by computer simulation. The authors concentrated on developing advanced classical and quantum Monte Carlo techniques. For simulations of phase transitions in classical systems, they produced a framework generalizing the famous Swendsen-Wang cluster algorithms for Ising and Potts models. For spin-glass-like problems, they demonstrated the effectiveness of an extension of the multicanonical method for the two-dimensional, random bond Ising model. For quantum mechanical systems, they generated a new method to compute the ground-state energy of systems of interacting electrons. They also improved methods to compute excited states when the diffusion quantum Monte Carlo method is used and to compute longer time dynamics when the stationary phase quantum Monte Carlo method is used.
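The Swendsen-Wang cluster algorithm mentioned in this abstract can be illustrated on the standard 2D Ising model. The sketch below is a minimal textbook version (the lattice size, seed, and union-find bookkeeping are illustrative choices, not the LANL implementation): aligned neighboring spins are bonded with probability 1 - exp(-2*beta*J), the bonded clusters are found with union-find, and each cluster is flipped with probability 1/2.

```python
import math
import random

def swendsen_wang_step(spins, L, beta, J=1.0):
    """One Swendsen-Wang cluster update for a 2D Ising model on an
    L x L lattice with periodic boundaries. spins is a flat list of
    +1/-1 values of length L*L, modified in place."""
    # Union-find structure over lattice sites.
    parent = list(range(L * L))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    def union(i, j):
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj

    # Activate a bond between aligned neighbors with prob 1 - exp(-2*beta*J).
    p_bond = 1.0 - math.exp(-2.0 * beta * J)
    for x in range(L):
        for y in range(L):
            i = x * L + y
            right = ((x + 1) % L) * L + y
            down = x * L + (y + 1) % L
            for j in (right, down):
                if spins[i] == spins[j] and random.random() < p_bond:
                    union(i, j)

    # Flip each cluster independently with probability 1/2.
    flip = {}
    for i in range(L * L):
        r = find(i)
        if r not in flip:
            flip[r] = random.random() < 0.5
        if flip[r]:
            spins[i] = -spins[i]

# Usage: equilibrate a small lattice below the critical temperature.
random.seed(0)
L = 8
spins = [random.choice((-1, 1)) for _ in range(L * L)]
for _ in range(100):
    swendsen_wang_step(spins, L, beta=0.5)
```

Because entire clusters flip at once, this update decorrelates configurations near the critical point far faster than single-spin Metropolis moves, which is the property the cluster-algorithm framework described above generalizes.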

  7. NASA Center for Computational Sciences: History and Resources

    NASA Technical Reports Server (NTRS)

    2000-01-01

    The NASA Center for Computational Sciences (NCCS) has been a leading capacity computing facility, providing a production environment and support resources to address the challenges facing the Earth and space sciences research community.

  8. Center for Computing Research Summer Research Proceedings 2015.

    SciTech Connect

    Bradley, Andrew Michael; Parks, Michael L.

    2015-12-18

    The Center for Computing Research (CCR) at Sandia National Laboratories organizes a summer student program each summer, in coordination with the Computer Science Research Institute (CSRI) and Cyber Engineering Research Institute (CERI).

  9. Soft computing in design and manufacturing of advanced materials

    NASA Technical Reports Server (NTRS)

    Cios, Krzysztof J.; Baaklini, George Y.; Vary, Alex

    1993-01-01

    The potential of fuzzy sets and neural networks, often referred to as soft computing, for aiding in all aspects of manufacturing of advanced materials like ceramics is addressed. In design and manufacturing of advanced materials, it is desirable to find which of the many processing variables contribute most to the desired properties of the material. There is also interest in real time quality control of parameters that govern material properties during processing stages. The concepts of fuzzy sets and neural networks are briefly introduced and it is shown how they can be used in the design and manufacturing processes. These two computational methods are alternatives to other methods such as the Taguchi method. The two methods are demonstrated by using data collected at NASA Lewis Research Center. Future research directions are also discussed.
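The fuzzy-set side of the approach described above can be sketched with membership functions that map a processing variable to degrees of set membership. The example below is hypothetical: the variable (sintering temperature), the set names, and the breakpoints are invented for illustration and are not taken from the NASA Lewis data.

```python
def tri(x, a, b, c):
    """Triangular fuzzy membership function: 0 outside [a, c],
    rising linearly to 1 at the peak b, then falling back to 0."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def classify_temperature(t):
    """Hypothetical example: degree of membership of a sintering
    temperature (deg C) in the fuzzy sets 'low', 'ideal', 'high'."""
    return {
        "low":   tri(t, 1000, 1200, 1400),
        "ideal": tri(t, 1300, 1500, 1700),
        "high":  tri(t, 1600, 1800, 2000),
    }

# A reading of 1350 deg C is partly "low" and partly "ideal";
# unlike crisp sets, the degrees need not sum to 1.
m = classify_temperature(1350)
```

Overlapping memberships like these are what let fuzzy rules grade how strongly each processing variable contributes to a desired material property, rather than forcing a hard threshold.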

  10. Computational Fluid Dynamics Program at NASA Ames Research Center

    NASA Technical Reports Server (NTRS)

    Holst, Terry L.

    1989-01-01

    The Computational Fluid Dynamics (CFD) Program at NASA Ames Research Center is reviewed and discussed. The technical elements of the CFD Program are listed and briefly discussed. These elements include algorithm research, research and pilot code development, scientific visualization, advanced surface representation, volume grid generation, and numerical optimization. Next, the discipline of CFD is briefly discussed and related to other areas of research at NASA Ames, including experimental fluid dynamics, computer science research, computational chemistry, and numerical aerodynamic simulation. These areas combine with CFD to form a larger area of research, which might collectively be called computational technology. The ultimate goal of computational technology research at NASA Ames is to increase the physical understanding of the world in which we live, solve problems of national importance, and increase the technical capabilities of the aerospace community. Next, the major programs at NASA Ames that either use CFD technology or perform research in CFD are listed and discussed. Briefly, this list includes turbulent/transition physics and modeling, high-speed real gas flows, interdisciplinary research, turbomachinery demonstration computations, complete aircraft aerodynamics, rotorcraft applications, powered lift flows, high alpha flows, multiple body aerodynamics, and incompressible flow applications. Some of the individual problems actively being worked in each of these areas are listed to help define the breadth or extent of CFD involvement in each of these major programs. State-of-the-art examples of various CFD applications are presented to highlight most of these areas. The main emphasis of this portion of the presentation is on examples which will not otherwise be treated at this conference by the individual presentations. Finally, a list of principal current limitations and expected future directions is given.

  11. Applied human factors research at the NASA Johnson Space Center Human-Computer Interaction Laboratory

    NASA Technical Reports Server (NTRS)

    Rudisill, Marianne; Mckay, Timothy D.

    1990-01-01

    The applied human factors research program performed at the NASA Johnson Space Center's Human-Computer Interaction Laboratory is discussed. Research is conducted to advance knowledge in human interaction with computer systems during space crew tasks. In addition, the Laboratory is directly involved in the specification of the human-computer interface (HCI) for space systems in development (e.g., Space Station Freedom) and is providing guidelines and support for HCI design to current and future space missions.

  12. Airborne Advanced Reconfigurable Computer System (ARCS)

    NASA Technical Reports Server (NTRS)

    Bjurman, B. E.; Jenkins, G. M.; Masreliez, C. J.; Mcclellan, K. L.; Templeman, J. E.

    1976-01-01

    A digital computer subsystem fault-tolerant concept was defined, and the potential benefits and costs of such a subsystem were assessed when used as the central element of a new transport's flight control system. The derived advanced reconfigurable computer system (ARCS) is a triple-redundant computer subsystem that automatically reconfigures, under multiple fault conditions, from triplex to duplex to simplex operation, with redundancy recovery if the fault condition is transient. The study included criteria development covering factors at the aircraft's operation level that would influence the design of a fault-tolerant system for commercial airline use. A new reliability analysis tool was developed for evaluating redundant, fault-tolerant system availability and survivability; and a stringent digital system software design methodology was used to achieve design/implementation visibility.
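The triplex-to-duplex-to-simplex reconfiguration described above can be illustrated with a toy majority voter. This is a hedged sketch under stated assumptions: the channel names, the miscompare policy (drop any channel that disagrees with the majority), and the lack of transient-recovery logic are all simplifications invented for illustration, not the actual ARCS design.

```python
from collections import Counter

def vote(channels):
    """Majority-vote the outputs of redundant computer channels.
    Returns (voted_value, faulty_channels); with no majority
    (e.g. duplex disagreement) returns (None, empty set), since a
    fault can then be detected but not masked."""
    counts = Counter(channels.values())
    value, n = counts.most_common(1)[0]
    if n * 2 <= len(channels):
        return None, set()
    faulty = {ch for ch, out in channels.items() if out != value}
    return value, faulty

class RedundantComputer:
    """Toy model of triplex -> duplex -> simplex reconfiguration:
    channels that miscompare against the majority are dropped from
    the active set on the next cycle."""
    def __init__(self, channels=("A", "B", "C")):
        self.active = set(channels)

    def step(self, outputs):
        live = {ch: outputs[ch] for ch in self.active}
        if len(live) == 1:            # simplex: nothing left to vote
            return next(iter(live.values()))
        value, faulty = vote(live)
        self.active -= faulty         # reconfigure on miscompare
        return value

# Usage: channel C miscompares once and is voted out of the set.
tmr = RedundantComputer()
v1 = tmr.step({"A": 1, "B": 1, "C": 2})   # triplex masks the fault
v2 = tmr.step({"A": 5, "B": 5, "C": 9})   # now duplex; C is ignored
```

A real voter of this kind also distinguishes transient from permanent faults (the "redundancy recovery" the study mentions), typically by readmitting a channel after it agrees with the majority for some number of cycles.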

  13. Activities of the Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Oliger, Joseph

    1994-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under contract with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. Research at RIACS is currently being done in the following areas: (1) parallel computing; (2) advanced methods for scientific computing; (3) high performance networks; and (4) learning systems. RIACS technical reports are usually preprints of manuscripts that have been submitted to research journals or conference proceedings. A list of these reports for the period January 1, 1994 through December 31, 1994 is in the Reports and Abstracts section of this report.

  14. 75 FR 64720 - DOE/Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-20

    .../Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing... Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U. S. Department...

  15. 75 FR 9887 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-04

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing... Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U.S. Department...

  16. 78 FR 6087 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-29

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing..., Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U. S. Department of...

  17. 75 FR 43518 - Advanced Scientific Computing Advisory Committee; Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-26

    ... Advanced Scientific Computing Advisory Committee; Meeting AGENCY: Office of Science, DOE. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing Advisory..., Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U. S. Department of...

  18. 76 FR 41234 - Advanced Scientific Computing Advisory Committee Charter Renewal

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-13

    ... Advanced Scientific Computing Advisory Committee Charter Renewal AGENCY: Department of Energy, Office of... Administration, notice is hereby given that the Advanced Scientific Computing Advisory Committee will be renewed... concerning the Advanced Scientific Computing program in response only to charges from the Director of...

  19. 76 FR 9765 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-22

    ... Advanced Scientific Computing Advisory Committee AGENCY: Office of Science, Department of Energy. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing..., Office of Advanced Scientific Computing Research, SC-21/Germantown Building, U.S. Department of...

  20. 77 FR 45345 - DOE/Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-31

    .../Advanced Scientific Computing Advisory Committee AGENCY: Office of Science, Department of Energy. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing... Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U.S. Department...

  1. 78 FR 41046 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-09

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION... hereby given that the Advanced Scientific Computing Advisory Committee will be renewed for a two-year... (DOE), on the Advanced Scientific Computing Research Program managed by the Office of...

  2. Computer Center: It's Time to Take Inventory.

    ERIC Educational Resources Information Center

    Spain, James D.

    1984-01-01

    Describes typical instructional applications of computers. Areas considered include: (1) instructional simulations and animations; (2) data analysis; (3) drill and practice; (4) student evaluation; (5) development of computer models and simulations; (6) biometrics or biostatistics; and (7) direct data acquisition and analysis. (JN)

  3. Center for Technology for Advanced Scientific Component Software (TASCS) Consolidated Progress Report July 2006 - March 2009

    SciTech Connect

    Bernholdt, D E; McInnes, L C; Govindaraju, M; Bramley, R; Epperly, T; Kohl, J A; Nieplocha, J; Armstrong, R; Shasharina, S; Sussman, A L; Sottile, M; Damevski, K

    2009-04-14

    A resounding success of the Scientific Discovery through Advanced Computing (SciDAC) program is that high-performance computational science is now universally recognized as a critical aspect of scientific discovery [71], complementing both theoretical and experimental research. As scientific communities prepare to exploit unprecedented computing capabilities of emerging leadership-class machines for multi-model simulations at the extreme scale [72], it is more important than ever to address the technical and social challenges of geographically distributed teams that combine expertise in domain science, applied mathematics, and computer science to build robust and flexible codes that can incorporate changes over time. The Center for Technology for Advanced Scientific Component Software (TASCS) tackles these issues by exploiting component-based software development to facilitate collaborative high-performance scientific computing.

  4. CROSSCUTTING TECHNOLOGY DEVELOPMENT AT THE CENTER FOR ADVANCED SEPARATION TECHNOLOGIES

    SciTech Connect

    Christopher E. Hull

    2006-05-15

    This Technical Progress Report describes progress made on the twenty nine subprojects awarded in the second year of Cooperative Agreement DE-FC26-02NT41607: Crosscutting Technology Development at the Center for Advanced Separation Technologies. This work is summarized in the body of the main report: the individual sub-project Technical Progress Reports are attached as Appendices.

  5. CROSSCUTTING TECHNOLOGY DEVELOPMENT AT THE CENTER FOR ADVANCED SEPARATION TECHNOLOGIES

    SciTech Connect

    Christopher E. Hull

    2005-11-04

    This Technical Progress Report describes progress made on the twenty nine subprojects awarded in the second year of Cooperative Agreement DE-FC26-02NT41607: Crosscutting Technology Development at the Center for Advanced Separation Technologies. This work is summarized in the body of the main report: the individual sub-project Technical Progress Reports are attached as Appendices.

  6. Crosscutting Technology Development at the Center for Advanced Separation Technologies

    SciTech Connect

    Christopher E. Hull

    2006-09-30

    This Technical Progress Report describes progress made on the twenty nine subprojects awarded in the second year of Cooperative Agreement DE-FC26-02NT41607: Crosscutting Technology Development at the Center for Advanced Separation Technologies. This work is summarized in the body of the main report: the individual sub-project Technical Progress Reports are attached as Appendices.

  7. Center for Advanced Technology Training (CATT) Feasibility Study.

    ERIC Educational Resources Information Center

    Albuquerque Technical Vocational Inst., NM.

    A study of the feasibility of establishing a Center for Advanced Technology Training (CATT) at the Albuquerque Technical Vocational Institute (TVI Community College, New Mexico) was conducted by members of the Albuquerque business community, government representatives, and college administrators. Phase 1 of the study was an examination of the…

  8. The Advanced Technology Environmental Education Center Summer Fellows Institute.

    ERIC Educational Resources Information Center

    Depken, Diane E.; Zeman, Catherine L.; Lensch, Ellen Kabat; Brown, Edward J.

    2002-01-01

    Describes the background, activities, and outcomes of the Advanced Technology Environmental Education Center (ATEEC) and its Summer Fellows Institutes as a model for disciplinary and cross-disciplinary infusion of environmental science and technology content, curriculum, and methods into the classroom. Presents experiences, themes, and activities…

  9. Computational Design of Advanced Nuclear Fuels

    SciTech Connect

    Savrasov, Sergey; Kotliar, Gabriel; Haule, Kristjan

    2014-06-03

    The objective of the project was to develop a method for theoretical understanding of nuclear fuel materials whose physical and thermophysical properties can be predicted from first principles using a novel dynamical mean field method for electronic structure calculations. We concentrated our study on uranium, plutonium, their oxides, nitrides, carbides, as well as some rare earth materials whose 4f electrons provide a simplified framework for understanding complex behavior of the f electrons. We addressed the issues connected to the electronic structure, lattice instabilities, phonon and magnon dynamics as well as thermal conductivity. This allowed us to evaluate characteristics of advanced nuclear fuel systems using computer-based simulations and avoid costly experiments.

  10. ATCA for Machines-- Advanced Telecommunications Computing Architecture

    SciTech Connect

    Larsen, R.S.; /SLAC

    2008-04-22

    The Advanced Telecommunications Computing Architecture is a new industry open standard for electronics instrument modules and shelves being evaluated for the International Linear Collider (ILC). It is the first industrial standard designed for High Availability (HA). ILC availability simulations have shown clearly that the capabilities of ATCA are needed in order to achieve acceptable integrated luminosity. The ATCA architecture looks attractive for beam instruments and detector applications as well. This paper provides an overview of ongoing R&D including application of HA principles to power electronics systems.

  11. Computer Software Management and Information Center

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Computer programs for passive anti-roll tank, earth resources laboratory applications, the NIMBUS-7 coastal zone color scanner derived products, transportable applications executive, plastic and failure analysis of composites, velocity gradient method for calculating velocities in an axisymmetric annular duct, an integrated procurement management system, data I/O PRON for the Motorola exorcisor, aerodynamic shock-layer shape, kinematic modeling, hardware library for a graphics computer, and a file archival system are documented.

  12. AHPCRC - Army High Performance Computing Research Center

    DTIC Science & Technology

    2008-01-01

    materials “from the atoms up” or to model biological systems at the molecular level. The speed and capacity of massively parallel computers are key...Streamlined, massively parallel high performance computing structural codes allow researchers to examine many relevant physical factors simultaneously...expenditure of energy, so that the drones can carry their load of sensors, communications devices, and fuel. AHPCRC researchers are using massively

  13. Evaluation of Computer Center Professional Personnel.

    ERIC Educational Resources Information Center

    LeDuc, Albert L., Jr.

    1985-01-01

    Managers within the computing services environment face interesting management challenges, including technological change. A manager is required to exercise judgment to ensure that people understand organizational performance standards. Correct evaluation is the most crucial part of personnel management. (Author/MLW)

  14. Simulations for Complex Fluid Flow Problems from Berkeley Lab's Center for Computational Sciences and Engineering (CCSE)

    DOE Data Explorer

    The Center for Computational Sciences and Engineering (CCSE) develops and applies advanced computational methodologies to solve large-scale scientific and engineering problems arising in the Department of Energy (DOE) mission areas involving energy, environmental, and industrial technology. The primary focus is in the application of structured-grid finite difference methods on adaptive grid hierarchies for compressible, incompressible, and low Mach number flows. The diverse range of scientific applications that drive the research typically involve a large range of spatial and temporal scales (e.g. turbulent reacting flows) and require the use of extremely large computing hardware, such as the 153,000-core computer, Hopper, at NERSC. The CCSE approach to these problems centers on the development and application of advanced algorithms that exploit known separations in scale; for many of the application areas this results in algorithms that are several orders of magnitude more efficient than traditional simulation approaches.
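    The structured-grid finite-difference idea underlying this work can be illustrated with a deliberately tiny sketch: a single explicit update of the 1-D diffusion equation on a uniform grid. This is illustrative only; CCSE's actual solvers use adaptive grid hierarchies in multiple dimensions, and all names below are hypothetical.

    ```python
    # Illustrative sketch only: one explicit finite-difference step of the
    # 1-D diffusion equation u_t = nu * u_xx on a uniform grid. This toy
    # update merely shows the structured-grid stencil idea; CCSE's codes
    # are adaptive, multidimensional, and far more sophisticated.
    def diffuse_step(u, nu, dx, dt):
        """Advance u one time step; boundary values are held fixed."""
        un = list(u)
        coeff = nu * dt / dx**2  # explicit scheme is stable only if coeff <= 0.5
        for i in range(1, len(u) - 1):
            un[i] = u[i] + coeff * (u[i + 1] - 2 * u[i] + u[i - 1])
        return un
    ```

    With this stencil, a unit spike spreads to its neighbors while the discrete sum (the conserved quantity) is preserved, which is the basic property a more elaborate adaptive solver must also maintain across refinement levels.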

  15. Computer Search Center Statistics on Users and Data Bases

    ERIC Educational Resources Information Center

    Schipma, Peter B.

    1974-01-01

    Statistics gathered over five years of operation by the IIT Research Institute's Computer Search Center are summarized for profile terms and lists, use of truncation modes, use of logic operators, some characteristics of CA Condensates, etc. (Author/JB)

  16. A Center on Communications, Control and Computation.

    DTIC Science & Technology

    1988-04-06

    Information Processing Letters, November 1986; also LIDS Report LIDS-P-1616, October 1986. Marroquin, J.L., "Optimal Bayesian Estimators...for Image Segmentation and Surface Reconstruction," LIDS Report LIDS-P-1456, 5/1985. Marroquin, J.L., "Probabilistic Solution of Inverse Problems..." LIDS Report LIDS-TH-1500, Ph.D. thesis, Department of Electrical Engineering and Computer Science, September 1985. Marroquin, J.L., "Bayesian

  17. AHPCRC - Army High Performance Computing Research Center

    DTIC Science & Technology

    2010-01-01

    treatments and reconstructive surgeries. High performance computer simulation allows designers to try out numerous mechanical and material...investigating the effect of techniques for simplifying the calculations (sending the projectile through a pre-existing hole, for example) on the accuracy of...semiconductor particles are size-dependent. These properties, including yield strength and resistance to fatigue, are not well predicted by macroscopic

  18. Center for the Integration of Optical Computing

    DTIC Science & Technology

    1993-10-15

    completed, the largest gains in III-V semiconductors, which we achieved in GaAs, required moving gratings, a difficult-to-achieve technology in which a...wavelength-division-multiplexing (WDM). However, transmitting many WDM amplified channels is difficult to implement since the EDFA gain is wavelength...optical computing system due to the EDFA's non-uniform gain. (1) Passive equalization of non-uniform EDFA gain by optical filtering for transmission

  19. Computer based human-centered display system

    NASA Technical Reports Server (NTRS)

    Still, David L. (Inventor); Temme, Leonard A. (Inventor)

    2002-01-01

    A human-centered informational display is disclosed that can be used with vehicles (e.g. aircraft) and in other operational environments where rapid human-centered comprehension of an operational environment is required. The informational display integrates all cockpit information into a single display in such a way that the pilot can clearly understand, at a glance, his or her spatial orientation, flight performance, engine status and power management issues, radio aids, and the location of other air traffic, runways, weather, and terrain features. With OZ the information is presented as an integrated whole; the pilot instantaneously recognizes flight path deviations and is instinctively drawn to the corrective maneuvers. Our laboratory studies indicate that OZ transfers to the pilot all of the integrated display information in less than 200 milliseconds. The reacquisition of scan can be accomplished just as quickly. Thus, the time constants for forming a mental model are near instantaneous. The pilot's ability to keep up with rapidly changing and threatening environments is tremendously enhanced. OZ is most easily compatible with aircraft that have flight path information coded electronically. With the correct sensors (which are currently available) OZ can be installed in essentially all current aircraft.

  20. Defense Science Board Report on Advanced Computing

    DTIC Science & Technology

    2009-03-01

    complex computational issues are pursued, and that several vendors remain at the leading edge of supercomputing capability in the U.S. In...pursuing the ASC program to help assure that HPC advances are available to the broad national security community. As in the past, many...apply HPC to technical problems related to weapons physics, but that are entirely unclassified. Examples include explosive astrophysical

  1. Advanced high-performance computer system architectures

    NASA Astrophysics Data System (ADS)

    Vinogradov, V. I.

    2007-02-01

    The convergence of computer systems and communication technologies is moving toward switched high-performance modular system architectures built on high-speed switched interconnections. Multi-core processors are a promising path to high-performance systems, and traditional parallel-bus system architectures (VME/VXI, cPCI/PXI) are giving way to newer, higher-speed serial switched interconnections. The fundamentals of this system-architecture development are a compact modular component strategy, low-power processors, new serial high-speed interface chips on the board, and high-speed switched fabrics for SAN architectures. An overview of advanced modular concepts and new international standards for developing high-performance embedded and compact modular systems for real-time applications is presented.

  2. Computer Center CDC Libraries/NSRDC (Subprograms).

    DTIC Science & Technology

    1981-02-01

    USAGE: CALL JULIAN (JG, JD, IGY, IGM, IGD). DESCRIPTION OF PARAMETERS: JG - direction of conversion (1 - Gregorian to relative Julian; 2 - relative Julian to Gregorian); JD - relative Julian date (out if JG=1, in if JG=2); IGY - Gregorian year (e.g. 1975) (in if JG=1, out if JG=2); IGM - Gregorian month (1-12) (in if JG=1, out if JG=2); IGD - Gregorian day (in if JG=1, out if JG=2). ...YEAR COMPUTATIONS.
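    The two conversion directions of the JULIAN routine can be sketched in modern terms. This is a hypothetical re-implementation: the record does not state which epoch the library uses for "relative Julian" day numbers, so an arbitrary one is assumed, and the function names are illustrative.

    ```python
    from datetime import date

    # Hypothetical modern sketch of the CDC library's JULIAN routine.
    # The epoch defining "relative Julian" day numbers is an assumption;
    # the record does not specify it.
    EPOCH = date(1900, 1, 1)

    def gregorian_to_relative_julian(igy, igm, igd):
        """JG = 1: Gregorian (year, month, day) -> relative Julian day number."""
        return (date(igy, igm, igd) - EPOCH).days

    def relative_julian_to_gregorian(jd):
        """JG = 2: relative Julian day number -> Gregorian (year, month, day)."""
        g = date.fromordinal(EPOCH.toordinal() + jd)
        return g.year, g.month, g.day
    ```

    A round trip through both directions returns the original calendar date, mirroring the JG=1/JG=2 behavior the parameter listing describes.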

  3. Establishment of the Center for Advanced Separation Technologies

    SciTech Connect

    Christopher E. Hull

    2006-09-30

    This Final Technical Report covers the eight sub-projects awarded in the first year and the five projects awarded in the second year of Cooperative Agreement DE-FC26-01NT41091: Establishment of the Center for Advanced Separation Technologies. This work is summarized in the body of the main report; the individual sub-project Technical Progress Reports are attached as Appendices.

  4. ESTABLISHMENT OF THE CENTER FOR ADVANCED SEPARATION TECHNOLOGIES

    SciTech Connect

    Hugh W. Rimmer

    2003-07-01

    This Technical Progress Report describes progress made on the eight sub-projects awarded in the first year of Cooperative Agreement DE-FC26-01NT41091: Establishment of the Center for Advanced Separation Technologies. This work is summarized in the body of the main report; the individual sub-project Technical Progress Reports are attached as Appendices. Due to the time taken up by the solicitation/selection process, these cover the initial 6-month period of activity only.

  5. Final Technical Report - Center for Technology for Advanced Scientific Component Software (TASCS)

    SciTech Connect

    Sussman, Alan

    2014-10-21

    This is a final technical report for the University of Maryland work in the SciDAC Center for Technology for Advanced Scientific Component Software (TASCS). The Maryland work focused on software tools for coupling parallel software components built using the Common Component Architecture (CCA) APIs. Those tools are based on the Maryland InterComm software framework that has been used in multiple computational science applications to build large-scale simulations of complex physical systems that employ multiple separately developed codes.

  6. Final Report for "Center for Technology for Advanced Scientific Component Software"

    SciTech Connect

    Svetlana Shasharina

    2010-12-01

    The goal of the Center for Technology for Advanced Scientific Component Software is to fundamentally change the way scientific software is developed and used by bringing component-based software development technologies to high-performance scientific and engineering computing. The role of Tech-X's work in the TASCS project is to provide outreach to accelerator physics and fusion applications by introducing TASCS tools into those applications, testing the tools in the applications, and modifying the tools to be more usable.

  7. Computational-physics program of the National MFE Computer Center

    SciTech Connect

    Mirin, A.A.

    1982-02-01

    The computational physics group is involved in several areas of fusion research. One main area is the application of multidimensional Fokker-Planck, transport and combined Fokker-Planck/transport codes to both toroidal and mirror devices. Another major area is the investigation of linear and nonlinear resistive magnetohydrodynamics in two and three dimensions, with applications to all types of fusion devices. The MHD work is often coupled with the task of numerically generating equilibria which model experimental devices. In addition to these computational physics studies, investigations of more efficient numerical algorithms are being carried out.

  8. Making Advanced Computer Science Topics More Accessible through Interactive Technologies

    ERIC Educational Resources Information Center

    Shao, Kun; Maher, Peter

    2012-01-01

    Purpose: Teaching advanced technical concepts in a computer science program to students of different technical backgrounds presents many challenges. The purpose of this paper is to present a detailed experimental pedagogy in teaching advanced computer science topics, such as computer networking, telecommunications and data structures using…

  9. 75 FR 57742 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-22

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION... Scientific Computing Advisory Committee (ASCAC). Federal Advisory Committee Act (Pub. L. 92-463, 86 Stat. 770...: Melea Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building;...

  10. 76 FR 45786 - Advanced Scientific Computing Advisory Committee; Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-01

    ... Advanced Scientific Computing Advisory Committee; Meeting AGENCY: Office of Science, Department of Energy... Computing Advisory Committee (ASCAC). Federal Advisory Committee Act (Pub. L. 92-463, 86 Stat. 770) requires... INFORMATION CONTACT: Melea Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown...

  11. Computer Center (VAXcluster Libraries/DTNSRDC (Commands and General Information).

    DTIC Science & Technology

    1986-05-01

    SURFACE EFFECTS DEPARTMENT ... STRUCTURES DEPARTMENT ... COMPUTATION, MATHEMATICS AND LOGISTICS DEPARTMENT ... 18. SUPPLEMENTARY NOTES ... 19. KEY WORDS (Continue on reverse side if necessary and identify by block number): Computer...documentation ... 20. ABSTRACT (Continue on reverse side if necessary and identify by block number): The Computer Center DEC VAXcluster Libraries

  12. OPENING REMARKS: Scientific Discovery through Advanced Computing

    NASA Astrophysics Data System (ADS)

    Strayer, Michael

    2006-01-01

    Good morning. Welcome to SciDAC 2006 and Denver. I share greetings from the new Undersecretary for Energy, Ray Orbach. Five years ago SciDAC was launched as an experiment in computational science. The goal was to form partnerships among science applications, computer scientists, and applied mathematicians to take advantage of the potential of emerging terascale computers. This experiment has been a resounding success. SciDAC has emerged as a powerful concept for addressing some of the biggest challenges facing our world. As significant as these successes were, I believe there is also significance in the teams that achieved them. In addition to their scientific aims these teams have advanced the overall field of computational science and set the stage for even larger accomplishments as we look ahead to SciDAC-2. I am sure that many of you are expecting to hear about the results of our current solicitation for SciDAC-2. I’m afraid we are not quite ready to make that announcement. Decisions are still being made and we will announce the results later this summer. Nearly 250 unique proposals were received and evaluated, involving literally thousands of researchers, postdocs, and students. These collectively requested more than five times our expected budget. This response is a testament to the success of SciDAC in the community. In SciDAC-2 our budget has been increased to about 70 million for FY 2007 and our partnerships have expanded to include the Environment and National Security missions of the Department. The National Science Foundation has also joined as a partner. These new partnerships are expected to expand the application space of SciDAC, and broaden the impact and visibility of the program. We have, with our recent solicitation, expanded to turbulence, computational biology, and groundwater reactive modeling and simulation. We are currently talking with the Department’s applied energy programs about risk assessment, optimization of complex systems - such

  13. Center for Technology for Advanced Scientific Component Software (TASCS)

    SciTech Connect

    Govindaraju, Madhusudhan

    2010-10-31

    Advanced Scientific Computing Research Computer Science FY 2010 Report. Center for Technology for Advanced Scientific Component Software: Distributed CCA. State University of New York, Binghamton, NY, 13902. Summary: The overall objective of Binghamton's involvement is to work on enhancements of the CCA environment, motivated by the applications and research initiatives discussed in the proposal. This year we are re-focusing our design and development efforts to develop proof-of-concept implementations that have the potential to significantly impact scientific components. We worked on developing parallel implementations for non-hydrostatic code and on a model coupling interface for biogeochemical computations coded in MATLAB. We also worked on the design and implementation of modules that will be required for the emerging MapReduce model to be effective for scientific applications. Finally, we focused on optimizing the processing of scientific datasets on multi-core processors. Research Details: We worked on the following research projects, which we are applying to CCA-based scientific applications. 1. Non-Hydrostatic Hydrodynamics: Non-hydrostatic hydrodynamics are significantly more accurate at modeling internal waves that may be important in lake ecosystems. Non-hydrostatic codes, however, are significantly more computationally expensive, often prohibitively so. We have worked with Chin Wu at the University of Wisconsin to parallelize non-hydrostatic code. We have obtained a speed-up of about 26 times at maximum. Although this is significant progress, we hope to improve the performance further, such that it becomes a practical alternative to hydrostatic codes. 2. Model coupling for water-based ecosystems: To answer pressing questions about water resources requires that physical models (hydrodynamics) be coupled with biological and chemical models. Most hydrodynamics codes are written in Fortran, however, while most ecologists work in MATLAB. This

  14. Current state and future direction of computer systems at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Rogers, James L. (Editor); Tucker, Jerry H. (Editor)

    1992-01-01

    Computer systems have advanced at a rate unmatched by any other area of technology. As performance has dramatically increased there has been an equally dramatic reduction in cost. This constant cost-performance improvement has precipitated the pervasiveness of computer systems into virtually all areas of technology. This improvement is due primarily to advances in microelectronics. Most people are now convinced that the new generation of supercomputers will be built using a large number (possibly thousands) of high performance microprocessors. Although the spectacular improvements in computer systems have come about because of these hardware advances, there has also been a steady improvement in software techniques. In an effort to understand how these hardware and software advances will affect research at NASA LaRC, the Computer Systems Technical Committee drafted this white paper to examine the current state and possible future directions of computer systems at the Center. This paper discusses selected important areas of computer systems including real-time systems, embedded systems, high performance computing, distributed computing networks, data acquisition systems, artificial intelligence, and visualization.

  15. Computational physics program of the National MFE Computer Center

    SciTech Connect

    Mirin, A.A.

    1980-08-01

    The computational physics group is involved in several areas of fusion research. One main area is the application of multidimensional Fokker-Planck, transport and combined Fokker-Planck/transport codes to both toroidal and mirror devices. Another major area is the investigation of linear and nonlinear resistive magnetohydrodynamics in two and three dimensions, with applications to all types of fusion devices. In addition to these computational physics studies, investigations of more efficient numerical algorithms are being carried out.

  16. Vanderbilt University Institute of Imaging Science Center for Computational Imaging XNAT: A multimodal data archive and processing environment.

    PubMed

    Harrigan, Robert L; Yvernault, Benjamin C; Boyd, Brian D; Damon, Stephen M; Gibney, Kyla David; Conrad, Benjamin N; Phillips, Nicholas S; Rogers, Baxter P; Gao, Yurui; Landman, Bennett A

    2016-01-01

    The Vanderbilt University Institute for Imaging Science (VUIIS) Center for Computational Imaging (CCI) has developed a database built on XNAT housing over a quarter of a million scans. The database provides a framework for (1) rapid prototyping, (2) large-scale batch processing of images and (3) scalable project management. The system uses the web-based interfaces of XNAT and REDCap to allow for graphical interaction. A Python middleware layer, the Distributed Automation for XNAT (DAX) package, distributes computation across the Vanderbilt Advanced Computing Center for Research and Education high performance computing center. All software is made available as open source for use in combining Portable Batch System (PBS) grids and XNAT servers.
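    As a rough illustration of the kind of batch distribution such a middleware layer performs, one processing task can be wrapped as a PBS job script and handed to the scheduler. This is a minimal sketch under assumptions: the real DAX API and its resource directives differ, and every name below (the function, the job name, the command) is hypothetical.

    ```python
    # Hypothetical sketch of wrapping one image-processing task as a PBS
    # batch script, in the spirit of the DAX middleware described above.
    # The directives and the command are illustrative, not DAX's real API.
    def pbs_script(job_name, walltime, command):
        """Return the text of a single-core PBS job script."""
        return "\n".join([
            "#!/bin/bash",
            f"#PBS -N {job_name}",           # job name shown by qstat
            f"#PBS -l walltime={walltime}",  # wall-clock limit
            "#PBS -l nodes=1:ppn=1",         # one core on one node
            command,
            "",
        ])

    # Submission would then be something like:
    #     subprocess.run(["qsub", script_path])
    ```

    Generating one such script per scan and submitting them all is what lets a system like this fan a quarter-million-scan archive out across a shared cluster.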

  17. Application of advanced electronics to a future spacecraft computer design

    NASA Technical Reports Server (NTRS)

    Carney, P. C.

    1980-01-01

    Advancements in hardware and software technology are summarized with specific emphasis on spacecraft computer capabilities. Available state of the art technology is reviewed and candidate architectures are defined.

  18. 77 FR 12823 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-02

    ... final report; Advanced Networking update; status from Computer Science COV; Early Career technical talks; summary of Applied Math and Computer Science Workshops; ASCR's new SBIR awards; Data-intensive...

  19. Computational and design methods for advanced imaging

    NASA Astrophysics Data System (ADS)

    Birch, Gabriel C.

    This dissertation merges the optical design and computational aspects of imaging systems to create novel devices that solve engineering problems in optical science and attempts to expand the solution space available to the optical designer. This dissertation is divided into two parts: the first discusses a new active illumination depth sensing modality, while the second part discusses a passive illumination system called plenoptic, or lightfield, imaging. The new depth sensing modality introduced in part one is called depth through controlled aberration. This technique illuminates a target with a known, aberrated projected pattern and takes an image using a traditional, unmodified imaging system. Knowing how the added aberration in the projected pattern changes as a function of depth, we are able to quantitatively determine the depth of a series of points from the camera. A major advantage this method permits is the ability for illumination and imaging axes to be coincident. Plenoptic cameras capture both spatial and angular data simultaneously. This dissertation presents a new set of parameters that permit the design and comparison of plenoptic devices outside the traditionally published plenoptic 1.0 and plenoptic 2.0 configurations. Additionally, a series of engineering advancements are presented, including full system raytraces of raw plenoptic images, Zernike compression techniques of raw image files, and non-uniform lenslet arrays to compensate for plenoptic system aberrations. Finally, a new snapshot imaging spectrometer is proposed based on the plenoptic configuration.

  20. CROSSCUTTING TECHNOLOGY DEVELOPMENT AT THE CENTER FOR ADVANCED SEPARATION TECHNOLOGIES

    SciTech Connect

    Hugh W. Rimmer

    2004-05-12

    This Technical Progress Report describes progress made on the seventeen sub-projects awarded in the first year of Cooperative Agreement DE-FC26-02NT41607: Crosscutting Technology Development at the Center for Advanced Separation Technologies. This work is summarized in the body of the main report; the individual sub-project Technical Progress Reports are attached as Appendices. Due to the time taken up by the solicitation/selection process, these cover the initial 6-month period of project activity only. The U.S. is the largest producer of mining products in the world. In 1999, U.S. mining operations produced $66.7 billion worth of raw materials that contributed a total of $533 billion to the nation's wealth. Despite these contributions, the mining industry has not been well supported with research and development funds as compared to mining industries in other countries. To overcome this problem, the Center for Advanced Separation Technologies (CAST) was established to develop technologies that can be used by the U.S. mining industry to create new products, reduce production costs, and meet environmental regulations. Originally set up by Virginia Tech and West Virginia University, this endeavor has been expanded into a seven-university consortium--Virginia Tech, West Virginia University, University of Kentucky, University of Utah, Montana Tech, New Mexico Tech and University of Nevada, Reno--that is supported through U.S. DOE Cooperative Agreement No. DE-FC26-02NT41607: Crosscutting Technology Development at the Center for Advanced Separation Technologies. Much of the research to be conducted with Cooperative Agreement funds will be longer-term, high-risk, basic research and will be carried out in five broad areas: (1) Solid-solid separation (2) Solid-liquid separation (3) Chemical/Biological Extraction (4) Modeling and Control, and (5) Environmental Control.

  1. Oklahoma State University proposed Advanced Technology Research Center. Environmental Assessment

    SciTech Connect

    1995-06-01

    The Department of Energy (DOE) has prepared an Environmental Assessment (EA) evaluating the construction and equipping of the proposed Advanced Technology Research Center (ATRC) at Oklahoma State University (OSU) in Stillwater, Oklahoma. Based on the analysis in the EA, the DOE has determined that the proposed action does not constitute a major federal action significantly affecting the quality of the human environment within the meaning of the National Environmental Policy Act (NEPA) of 1969. Therefore, the preparation of an Environmental Impact Statement is not required.

  2. Argonne Laboratory Computing Resource Center - FY2004 Report.

    SciTech Connect

    Bair, R.

    2005-04-14

    In the spring of 2002, Argonne National Laboratory founded the Laboratory Computing Resource Center, and in April 2003 LCRC began full operations with Argonne's first teraflops computing cluster. The LCRC's driving mission is to enable and promote computational science and engineering across the Laboratory, primarily by operating computing facilities and supporting application use and development. This report describes the scientific activities, computing facilities, and usage in the first eighteen months of LCRC operation. In this short time LCRC has had broad impact on programs across the Laboratory. The LCRC computing facility, Jazz, is available to the entire Laboratory community. In addition, the LCRC staff provides training in high-performance computing and guidance on application usage, code porting, and algorithm development. All Argonne personnel and collaborators are encouraged to take advantage of this computing resource and to provide input into the vision and plans for computing and computational analysis at Argonne. Steering for LCRC comes from the Computational Science Advisory Committee, composed of computing experts from many Laboratory divisions. The CSAC Allocations Committee makes decisions on individual project allocations for Jazz.

  3. Extending Medical Center Computer Application to Rural Health Clinics

    PubMed Central

    Gottfredson, Douglas K.

    1983-01-01

    A paper entitled “A COMPUTER DATA BASE FOR CLINICIANS, MANAGERS AND RESEARCHERS,” presented during the 1981 SCAMC, described the Salt Lake VA Medical Center computer system. Since that time, two Rural Health Clinics, each about 150 miles from Salt Lake City, were established by the SL VAMC to reduce traveling distances and improve services for Veterans. Although many existing computer applications were available with no modifications, additional software was needed to support unique needs of the clinics. The Rural Health package of software was designed to gather and store demographic and clinical information on each Veteran, determine the types of services provided, track services over time, monitor services provided by local hospitals and clinical laboratories which are paid for by the VA, determine total clinic costs, etc. These computer applications may be of interest to Medical Centers with separate clinics or outreach programs and individuals or groups in private practice with programs similar to the VA Rural Health Clinics.

  4. Advanced flight computers for planetary exploration

    NASA Technical Reports Server (NTRS)

    Stephenson, R. Rhoads

    1988-01-01

    Research concerning flight computers for use on interplanetary probes is reviewed. The history of these computers from the Viking mission to the present is outlined. The differences between ground commercial computers and computers for planetary exploration are listed. The development of a computer for the Mariner Mark II comet rendezvous asteroid flyby mission is described. Various aspects of recently developed computer systems are examined, including the Max real-time embedded computer, a hypercube distributed supercomputer, a SAR data processor, a processor for the High Resolution IR Imaging Spectrometer, and a robotic vision multiresolution pyramid machine for processing images obtained by a Mars Rover.

  5. Argonne's Laboratory computing resource center : 2006 annual report.

    SciTech Connect

    Bair, R. B.; Kaushik, D. K.; Riley, K. R.; Valdes, J. V.; Drugan, C. D.; Pieper, G. P.

    2007-05-31

Argonne National Laboratory founded the Laboratory Computing Resource Center (LCRC) in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. In September 2002 the LCRC deployed a 350-node computing cluster from Linux NetworX to address Laboratory needs for mid-range supercomputing. This cluster, named 'Jazz', achieved over a teraflop of computing power (10^12 floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the 50 fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2006, there were 76 active projects on Jazz involving over 380 scientists and engineers. These projects represent a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to foster growth in the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure providers to offer more scientific data management capabilities, expanding Argonne staff use of national

  6. Factors Affecting Teachers' Student-Centered Classroom Computer Use

    ERIC Educational Resources Information Center

    Friedrich, Helmut Felix; Hron, Aemilian

    2011-01-01

    The present study aims at investigating which factors are relevant to induce teachers' student-centered classroom computer use. Survey data were collected from 361 teachers at comprehensive schools. Based on a systemic view of technology use in schools, different individual teacher characteristics and school contextual factors were examined.…

  7. Funding Public Computing Centers: Balancing Broadband Availability and Expected Demand

    ERIC Educational Resources Information Center

    Jayakar, Krishna; Park, Eun-A

    2012-01-01

    The National Broadband Plan (NBP) recently announced by the Federal Communication Commission visualizes a significantly enhanced commitment to public computing centers (PCCs) as an element of the Commission's plans for promoting broadband availability. In parallel, the National Telecommunications and Information Administration (NTIA) has…

  8. Computer Center Introductory Reference Manual for CDC 6000.

    DTIC Science & Technology

    1983-05-01

List files with ... Print one copy of the Computer Center Reference Manual (CCRM) on narrow paper at the central site. jobname...Simscript, Snobol, Sort/Merge, SPSS, System 2000, and text processors. See CCRM, Chapter 11. Graphics: several graphics software packages are available

  9. Advanced Stirling Technology Development at NASA Glenn Research Center

    NASA Technical Reports Server (NTRS)

    Shaltens, Richard K.; Wong, Wayne A.

    2007-01-01

    The NASA Glenn Research Center has been developing advanced energy-conversion technologies for use with both radioisotope power systems and fission surface power systems for many decades. Under NASA's Science Mission Directorate, Planetary Science Theme, Technology Program, Glenn is developing the next generation of advanced Stirling convertors (ASCs) for use in the Department of Energy/Lockheed Martin Advanced Stirling Radioisotope Generator (ASRG). The next-generation power-conversion technologies require high efficiency and high specific power (watts electric per kilogram) to meet future mission requirements to use less of the Department of Energy's plutonium-fueled general-purpose heat source modules and reduce system mass. Important goals include long-life (greater than 14-yr) reliability and scalability so that these systems can be considered for a variety of future applications and missions including outer-planet missions and continual operation on the surface of Mars. This paper provides an update of the history and status of the ASC being developed for Glenn by Sunpower Inc. of Athens, Ohio.

  10. Some Recent Advances in Computer Graphics.

    ERIC Educational Resources Information Center

    Whitted, Turner

    1982-01-01

    General principles of computer graphics are reviewed, including discussions of display hardware, geometric modeling, algorithms, and applications in science, computer-aided design, flight training, communications, business, art, and entertainment. (JN)

  11. Computing Advances in the Teaching of Chemistry.

    ERIC Educational Resources Information Center

    Baskett, W. P.; Matthews, G. P.

    1984-01-01

    Discusses three trends in computer-oriented chemistry instruction: (1) availability of interfaces to integrate computers with experiments; (2) impact of the development of higher resolution graphics and greater memory capacity; and (3) role of videodisc technology on computer assisted instruction. Includes program listings for auto-titration and…

  12. ADVANCED COMPUTATIONAL METHODS IN DOSE MODELING: APPLICATION OF COMPUTATIONAL BIOPHYSICAL TRANSPORT, COMPUTATIONAL CHEMISTRY, AND COMPUTATIONAL BIOLOGY

    EPA Science Inventory

    Computational toxicology (CompTox) leverages the significant gains in computing power and computational techniques (e.g., numerical approaches, structure-activity relationships, bioinformatics) realized over the last few years, thereby reducing costs and increasing efficiency i...

  13. White paper: A plan for cooperation between NASA and DARPA to establish a center for advanced architectures

    NASA Technical Reports Server (NTRS)

    Denning, P. J.; Adams, G. B., III; Brown, R. L.; Kanerva, P.; Leiner, B. M.; Raugh, M. R.

    1986-01-01

    Large, complex computer systems require many years of development. It is recognized that large scale systems are unlikely to be delivered in useful condition unless users are intimately involved throughout the design process. A mechanism is described that will involve users in the design of advanced computing systems and will accelerate the insertion of new systems into scientific research. This mechanism is embodied in a facility called the Center for Advanced Architectures (CAA). CAA would be a division of RIACS (Research Institute for Advanced Computer Science) and would receive its technical direction from a Scientific Advisory Board established by RIACS. The CAA described here is a possible implementation of a center envisaged in a proposed cooperation between NASA and DARPA.

  14. The Role of Computers in Research and Development at Langley Research Center

    NASA Technical Reports Server (NTRS)

    Wieseman, Carol D. (Compiler)

    1994-01-01

This document is a compilation of presentations given at a workshop on the role of computers in research and development at the Langley Research Center. The objectives of the workshop were to inform the Langley Research Center community of the current software systems and software practices in use at Langley. The workshop was organized in 10 sessions: Software Engineering; Software Engineering Standards, Methods, and CASE Tools; Solutions of Equations; Automatic Differentiation; Mosaic and the World Wide Web; Graphics and Image Processing; System Design Integration; CAE Tools; Languages; and Advanced Topics.

  15. Advancing crime scene computer forensics techniques

    NASA Astrophysics Data System (ADS)

    Hosmer, Chet; Feldman, John; Giordano, Joe

    1999-02-01

Computers and network technology have become inexpensive and powerful tools that can be applied to a wide range of criminal activity. Computers have changed the world's view of evidence because computers are used more and more as tools in committing 'traditional crimes' such as embezzlement, theft, extortion and murder. This paper will focus on reviewing the current state-of-the-art of the data recovery and evidence construction tools used in both the field and laboratory for prosecution purposes.

  16. Computer-aided analysis at NASA Langley Research Center - Looking toward the 1990's

    NASA Technical Reports Server (NTRS)

    Petersen, Richard H.

    1985-01-01

Aerospace research is inextricably intertwined with the programmable digital computer. Engineers and scientists at NASA Langley Research Center are requiring ever-increasing computing resources to carry out basic and applied research on problems and complex systems that would have been unthinkable just ten years ago. The rapid changes in computer technology make planning for the future especially difficult, even five years in advance. In this paper, the evolution of computer resources and usage in research at Langley is briefly considered over the past thirty years, followed by a snapshot of the present. Finally, an extrapolation to the 1990's computer environment is made, with some thoughts on the tasks that engineers might face, and the background they will probably need.

  17. CROSSCUTTING TECHNOLOGY DEVELOPMENT AT THE CENTER FOR ADVANCED SEPARATION TECHNOLOGIES

    SciTech Connect

    Christopher E. Hull

    2005-01-20

    The U.S. is the largest producer of mining products in the world. In 2003, U.S. mining operations produced $57 billion worth of raw materials that contributed a total of $564 billion to the nation's wealth. Despite these contributions, the mining industry has not been well supported with research and development funds as compared to mining industries in other countries. To overcome this problem, the Center for Advanced Separation Technologies (CAST) was established to develop technologies that can be used by the U.S. mining industry to create new products, reduce production costs, and meet environmental regulations. Much of the research to be conducted with Cooperative Agreement funds will be longer-term, high-risk, basic research and will be carried out in five broad areas: (1) Solid-solid separation; (2) Solid-liquid separation; (3) Chemical/Biological Extraction; (4) Modeling and Control; and (5) Environmental Control.

  18. Latest Development in Advanced Sensors at Kennedy Space Center (KSC)

    NASA Technical Reports Server (NTRS)

    Perotti, Jose M.; Eckhoff, Anthony J.; Voska, N. (Technical Monitor)

    2002-01-01

Inexpensive space transportation systems must be developed in order to make spaceflight more affordable. To achieve this goal, there is a need to develop inexpensive smart sensors to allow autonomous checking of the health of the vehicle and associated ground support equipment, warn technicians or operators of an impending problem, and facilitate rapid vehicle pre-launch operations. The Transducers and Data Acquisition group at Kennedy Space Center has initiated an effort to study, research, develop, and prototype inexpensive smart sensors to accomplish these goals. Several technological challenges are being investigated and integrated in this project: multi-discipline sensors; self-calibration and health self-diagnosis capabilities embedded in sensors; and advanced data acquisition systems with failure prediction algorithms and failure correction (self-healing) capabilities.

  19. UC Merced Center for Computational Biology Final Report

    SciTech Connect

    Colvin, Michael; Watanabe, Masakatsu

    2010-11-30

Final report for the UC Merced Center for Computational Biology. The Center for Computational Biology (CCB) was established to support multidisciplinary scientific research and academic programs in computational biology at the new University of California campus in Merced. In 2003, the growing gap between biology research and education was documented in a report from the National Academy of Sciences, Bio2010: Transforming Undergraduate Education for Future Research Biologists. We believed that a new type of biological sciences undergraduate and graduate program that emphasized biological concepts and considered biology as an information science would have a dramatic impact in enabling the transformation of biology. UC Merced, as the newest UC campus and the first new U.S. research university of the 21st century, was ideally suited to adopt an alternate strategy: to create new Biological Sciences majors and a graduate group that incorporated the strong computational and mathematical vision articulated in the Bio2010 report. CCB aimed to leverage this strong commitment at UC Merced to develop a new educational program based on the principle of biology as a quantitative, model-driven science. We also expected that the center would enable the dissemination of computational biology course materials to other universities and feeder institutions, and foster research projects that exemplify a mathematical and computation-based approach to the life sciences. As this report describes, the CCB has been successful in achieving these goals, and multidisciplinary computational biology is now an integral part of UC Merced undergraduate, graduate, and research programs in the life sciences. The CCB began in fall 2004 with the aid of an award from the U.S. Department of Energy (DOE), under its Genomes to Life program of support for the development of research and educational infrastructure in the modern biological sciences. This report to DOE describes the research and academic programs

  20. Crosscutting Technology Development at the Center for Advanced Separation Technologies

    SciTech Connect

    Christopher Hull

    2009-10-31

The U.S. is the largest producer of mining products in the world. In 2003, U.S. mining operations produced $57 billion worth of raw materials that contributed a total of $564 billion to the nation's wealth. Despite these contributions, the mining industry has not been well supported with research and development funds as compared to mining industries in other countries. To overcome this problem, the Center for Advanced Separation Technologies (CAST) was established to develop technologies that can be used by the U.S. mining industry to create new products, reduce production costs, and meet environmental regulations. Originally set up by Virginia Tech and West Virginia University, this endeavor has been expanded into a seven-university consortium -- Virginia Tech, West Virginia University, University of Kentucky, University of Utah, Montana Tech, New Mexico Tech and University of Nevada, Reno -- that is supported through U.S. DOE Cooperative Agreement No. DE-FC26-02NT41607: Crosscutting Technology Development at the Center for Advanced Separation Technologies. Much of the research to be conducted with Cooperative Agreement funds will be longer-term, high-risk, basic research and will be carried out in five broad areas: (1) Solid-solid separation; (2) Solid-liquid separation; (3) Chemical/biological extraction; (4) Modeling and control; and (5) Environmental control. Distribution of funds is handled via competitive solicitation of research proposals through Site Coordinators at the seven member universities. These were first reviewed and ranked by a group of technical reviewers (selected primarily from industry). Based on these reviews, and an assessment of overall program requirements, the CAST Technical Committee made an initial selection/ranking of proposals and forwarded these to the DOE/NETL Project Officer for final review and approval. The successful projects are listed by category, along with brief abstracts of their aims and objectives.

  1. Computation of Viscous Flow about Advanced Projectiles.

    DTIC Science & Technology

    1983-09-09

Domain". Journal of Comp. Physics, Vol. 8, 1971, pp. 392-408. 10. Thompson, J. F., Thames, F. C., and Mastin, C. M., "Automatic Numerical Generation of...computations, USSR Comput. Math. Math. Phys., 12, 2 (1972), 182-195. 18. Thompson, J. F., F. C. Thames, and C. M. Mastin, Automatic

  2. Advanced Stirling Convertor Testing at NASA Glenn Research Center

    NASA Technical Reports Server (NTRS)

    Oriti, Salvatore M.; Blaze, Gina M.

    2007-01-01

    The U.S. Department of Energy (DOE), Lockheed Martin Space Systems (LMSS), Sunpower Inc., and NASA Glenn Research Center (GRC) have been developing an Advanced Stirling Radioisotope Generator (ASRG) for use as a power system on space science and exploration missions. This generator will make use of the free-piston Stirling convertors to achieve higher conversion efficiency than currently available alternatives. The ASRG will utilize two Advanced Stirling Convertors (ASC) to convert thermal energy from a radioisotope heat source to electricity. NASA GRC has initiated several experiments to demonstrate the functionality of the ASC, including: in-air extended operation, thermal vacuum extended operation, and ASRG simulation for mobile applications. The in-air and thermal vacuum test articles are intended to provide convertor performance data over an extended operating time. These test articles mimic some features of the ASRG without the requirement of low system mass. Operation in thermal vacuum adds the element of simulating deep space. This test article is being used to gather convertor performance and thermal data in a relevant environment. The ASRG simulator was designed to incorporate a minimum amount of support equipment, allowing integration onto devices powered directly by the convertors, such as a rover. This paper discusses the design, fabrication, and implementation of these experiments.

  3. Center for Advanced Energy Studies (CAES) Strategic Plan

    SciTech Connect

    Kevin Kostelnik; Keith Perry

    2007-07-01

    Twenty-first century energy challenges include demand growth, national energy security, and global climate protection. The Center for Advanced Energy Studies (CAES) is a public/private partnership between the State of Idaho and its academic research institutions, the federal government through the U.S. Department of Energy (DOE) and the Idaho National Laboratory (INL) managed by the Battelle Energy Alliance (BEA). CAES serves to advance energy security for our nation by expanding the educational opportunities at the Idaho universities in energy-related areas, creating new capabilities within its member institutions, and delivering technological innovations leading to technology-based economic development for the intermountain region. CAES has developed this strategic plan based on the Balanced Scorecard approach. A Strategy Map (Section 7) summarizes the CAES vision, mission, customers, and strategic objectives. Identified strategic objectives encompass specific outcomes related to three main areas: Research, Education, and Policy. Technical capabilities and critical enablers needed to support these objectives are also identified. This CAES strategic plan aligns with and supports the strategic objectives of the four CAES institutions. Implementation actions are also presented which will be used to monitor progress towards fulfilling these objectives.

  4. Argonne's Laboratory Computing Resource Center 2009 annual report.

    SciTech Connect

    Bair, R. B.

    2011-05-13

Now in its seventh year of operation, the Laboratory Computing Resource Center (LCRC) continues to be an integral component of science and engineering research at Argonne, supporting a diverse portfolio of projects for the U.S. Department of Energy and other sponsors. The LCRC's ongoing mission is to enable and promote computational science and engineering across the Laboratory, primarily by operating computing facilities and supporting high-performance computing application use and development. This report describes scientific activities carried out with LCRC resources in 2009 and the broad impact on programs across the Laboratory. The LCRC computing facility, Jazz, is available to the entire Laboratory community. In addition, the LCRC staff provides training in high-performance computing and guidance on application usage, code porting, and algorithm development. All Argonne personnel and collaborators are encouraged to take advantage of this computing resource and to provide input into the vision and plans for computing and computational analysis at Argonne. The LCRC Allocations Committee makes decisions on individual project allocations for Jazz. Committee members are appointed by the Associate Laboratory Directors and span a range of computational disciplines. The 350-node LCRC cluster, Jazz, began production service in April 2003 and has been a research workhorse ever since. Jazz hosts a wealth of software tools and applications and achieves high availability year after year, so researchers can count on it to achieve project milestones and enable breakthroughs. Over the years, many projects have achieved results that would have been unobtainable without such a computing resource. In fiscal year 2009, there were 49 active projects representing a wide cross-section of Laboratory research and almost all research divisions.

  5. The Center for Computational Biology: resources, achievements, and challenges.

    PubMed

    Toga, Arthur W; Dinov, Ivo D; Thompson, Paul M; Woods, Roger P; Van Horn, John D; Shattuck, David W; Parker, D Stott

    2012-01-01

The Center for Computational Biology (CCB) is a multidisciplinary program where biomedical scientists, engineers, and clinicians work jointly to combine modern mathematical and computational techniques, to perform phenotypic and genotypic studies of biological structure, function, and physiology in health and disease. CCB has developed a computational framework built around the Manifold Atlas, an integrated biomedical computing environment that enables statistical inference on biological manifolds. These manifolds model biological structures, features, shapes, and flows, and support sophisticated morphometric and statistical analyses. The Manifold Atlas includes tools, workflows, and services for multimodal population-based modeling and analysis of biological manifolds. The broad spectrum of biomedical topics explored by CCB investigators includes the study of normal and pathological brain development, maturation and aging, discovery of associations between neuroimaging and genetic biomarkers, and the modeling, analysis, and visualization of biological shape, form, and size. CCB supports a wide range of short-term and long-term collaborations with outside investigators, which drive the center's computational developments and focus the validation and dissemination of CCB resources to new areas and scientific domains.

  6. Argonne's Laboratory Computing Resource Center : 2005 annual report.

    SciTech Connect

    Bair, R. B.; Coghlan, S. C; Kaushik, D. K.; Riley, K. R.; Valdes, J. V.; Pieper, G. P.

    2007-06-30

Argonne National Laboratory founded the Laboratory Computing Resource Center in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. The first goal of the LCRC was to deploy a mid-range supercomputing facility to support the unmet computational needs of the Laboratory. To this end, in September 2002, the Laboratory purchased a 350-node computing cluster from Linux NetworX. This cluster, named 'Jazz', achieved over a teraflop of computing power (10^12 floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the fifty fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2005, there were 62 active projects on Jazz involving over 320 scientists and engineers. These projects represent a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to improve the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure providers to develop

  7. An Analysis of Collaborative Technology Advancements Achieved through the Center for Network Innovation and Experimentation

    DTIC Science & Technology

    2008-12-01

Collaborative Technology Advancements Achieved through the Center for Network Innovation and Experimentation, by Eric L. Quarles, December 2008. ... experimentation cycles in which the members of the Naval Postgraduate School Center for Network Innovation and Experimentation (CENETIX) participate. These experiments

  8. [AERA. Dream machines and computing practices at the Mathematical Center].

    PubMed

    Alberts, Gerard; De Beer, Huub T

    2008-01-01

Dream machines may be just as effective as the ones materialised. Their symbolic thrust can be quite powerful. The Amsterdam 'Mathematisch Centrum' (Mathematical Center), founded February 11, 1946, created a Computing Department in an effort to realise its goal of serving society. When Aad van Wijngaarden was appointed as head of the Computing Department, however, he claimed space for scientific research and computer construction, next to computing as a service. Still, the computing service following the five-stage style of Hartree's numerical analysis remained a dominant characteristic of the work of the Computing Department. The high level of ambition held by Aad van Wijngaarden led to ever renewed projections of big automatic computers, symbolised by the never-built AERA. Even a machine that was actually constructed, the ARRA, which followed A.D. Booth's design of the ARC, never made it into real operation. It did serve Van Wijngaarden to bluff his way into the computer age by midsummer 1952. Not until January 1954 did the computing department have a working stored-program computer, which for reasons of policy went under the same name: ARRA. After just one other machine, the ARMAC, had been produced, a separate company, Electrologica, was set up for the manufacture of computers, which produced the rather successful X1 computer. The combination of ambition and absence of a working machine led to a high level of work on programming, way beyond the usual ideas of libraries of subroutines. Edsger W. Dijkstra in particular led the way to an emphasis on the duties of the programmer within the pattern of numerical analysis. Programs generating programs, known elsewhere as autocoding systems, were at the 'Mathematisch Centrum' called 'superprograms'. Practical examples were usually called a 'complex', in Dutch, where in English one might say 'system'. Historically, this is where software begins. Dekker's matrix complex, Dijkstra's interrupt system, Dijkstra and

  9. Computing Algorithms for Nuffield Advanced Physics.

    ERIC Educational Resources Information Center

    Summers, M. K.

    1978-01-01

    Defines all recurrence relations used in the Nuffield course, to solve first- and second-order differential equations, and describes a typical algorithm for computer generation of solutions. (Author/GA)
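    The kind of recurrence relation the abstract describes can be illustrated with a small sketch (my own illustration, not taken from the Nuffield materials): a central-difference recurrence that generates the solution of a second-order differential equation step by step, here x'' = -omega^2 x for simple harmonic motion.

    ```python
    # Illustrative central-difference recurrence for a second-order ODE,
    # x'' = a(x) with a(x) = -omega^2 * x (simple harmonic motion).
    # The recurrence is x_{n+1} = 2*x_n - x_{n-1} + h**2 * a(x_n).

    def solve_shm(omega=1.0, h=0.01, steps=1000, x0=1.0, v0=0.0):
        """Return the list [x_0, x_1, ..., x_steps] generated by the recurrence."""
        a = lambda x: -omega**2 * x
        x_prev = x0
        # Start-up step from the initial velocity (Taylor expansion to O(h^2)).
        x_curr = x0 + h * v0 + 0.5 * h**2 * a(x0)
        xs = [x_prev, x_curr]
        for _ in range(steps - 1):
            x_next = 2 * x_curr - x_prev + h**2 * a(x_curr)
            x_prev, x_curr = x_curr, x_next
            xs.append(x_curr)
        return xs

    xs = solve_shm()  # closely tracks cos(t) on t in [0, 10]
    ```

    With x(0) = 1 and x'(0) = 0 the exact solution is cos(t), and the recurrence reproduces it to a few parts in 10^4 over ten time units, which is the essence of "computer generation of solutions" from a recurrence.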

  10. Aerodynamic optimization studies on advanced architecture computers

    NASA Technical Reports Server (NTRS)

    Chawla, Kalpana

    1995-01-01

The approach to carrying out multi-discipline aerospace design studies in the future, especially in massively parallel computing environments, comprises choosing (1) suitable solvers to compute solutions to equations characterizing a discipline, and (2) efficient optimization methods. In addition, for aerodynamic optimization problems, (3) smart methodologies must be selected to modify the surface shape. In this research effort, a 'direct' optimization method is implemented on the Cray C-90 to improve aerodynamic design. It is coupled with an existing implicit Navier-Stokes solver, OVERFLOW, to compute flow solutions. The optimization method is chosen such that it can accommodate multi-discipline optimization in future computations. In this work, however, only single-discipline aerodynamic optimization will be included.
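    A minimal sketch of a 'direct' optimization loop of the kind described, treating the flow solver as a black box that returns an objective value. This is an assumption-laden illustration: the function names are hypothetical, and an analytic quadratic stands in for an actual OVERFLOW flow solution.

    ```python
    # Sketch of a direct (finite-difference gradient) optimization loop.
    # `drag` is a stand-in for one "flow solve"; a real coupling would call
    # the Navier-Stokes solver here instead of evaluating a formula.

    def drag(design):
        # Hypothetical objective with its minimum at design = (1.0, -2.0).
        x, y = design
        return (x - 1.0)**2 + 2.0 * (y + 2.0)**2

    def fd_gradient(f, design, eps=1e-6):
        """One-sided finite differences: one extra objective evaluation
        ("flow solve") per design variable."""
        base = f(design)
        grad = []
        for i in range(len(design)):
            probe = list(design)
            probe[i] += eps
            grad.append((f(probe) - base) / eps)
        return grad

    def optimize(f, design, step=0.1, iters=200):
        """Steepest-descent updates driven by finite-difference gradients."""
        design = list(design)
        for _ in range(iters):
            g = fd_gradient(f, design)
            design = [d - step * gi for d, gi in zip(design, g)]
        return design

    best = optimize(drag, [0.0, 0.0])  # converges toward (1.0, -2.0)
    ```

    The cost pattern is the relevant point: each gradient needs one objective evaluation per design variable, which is why such direct methods become expensive when every evaluation is a full flow solution.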

  11. Advanced Computational Techniques for Power Tube Design.

    DTIC Science & Technology

    1986-07-01

fixturing applications, in addition to the existing computer-aided engineering capabilities. Helix TWT Manufacturing has implemented a tooling and fixturing...illustrates the major features of this computer network. The backbone of our system is a Sytek broadband network (LAN) which interconnects terminals and...automatic network analyzer (FANA) which electrically characterizes the slow-wave helices of traveling-wave tubes (TWTs), both for engineering design

  12. Advanced Crew Personal Support Computer (CPSC) task

    NASA Technical Reports Server (NTRS)

    Muratore, Debra

    1991-01-01

    The topics are presented in view graph form and include: background; objectives of task; benefits to the Space Station Freedom (SSF) Program; technical approach; baseline integration; and growth and evolution options. The objective is to: (1) introduce new computer technology into the SSF Program; (2) augment core computer capabilities to meet additional mission requirements; (3) minimize risk in upgrading technology; and (4) provide a low cost way to enhance crew and ground operations support.

  13. Frontiers of research in advanced computations

    SciTech Connect

    1996-07-01

    The principal mission of the Institute for Scientific Computing Research is to foster interactions among LLNL researchers, universities, and industry on selected topics in scientific computing. In the area of computational physics, the Institute has developed a new algorithm, GaPH, to help scientists understand the chemistry of turbulent and driven plasmas or gases at far less cost than other methods. New low-frequency electromagnetic models better describe the plasma etching and deposition characteristics of a computer chip in the making. A new method for modeling realistic curved boundaries within an orthogonal mesh is resulting in a better understanding of the physics associated with such boundaries and much quicker solutions. All these capabilities are being developed for massively parallel implementation, which is an ongoing focus of Institute researchers. Other groups within the Institute are developing novel computational methods to address a range of other problems. Examples include feature detection and motion recognition by computer, improved monitoring of blood oxygen levels, and entirely new models of human joint mechanics and prosthetic devices.

  14. Advanced Computational Techniques in Regional Wave Studies

    DTIC Science & Technology

    1990-01-03

    the new GERESS data. The dissertation work emphasized the development and use of advanced computational techniques for studying regional seismic...hand, the possibility of new data sources at regional distances permits using previously ignored signals. Unfortunately, these regional signals will...the Green's function Gnk(x,t;r,t) around this new reference point contains the propagation effects (2), and V is the source volume where fJk

  15. Advanced Stirling Convertor Testing at NASA Glenn Research Center

    NASA Technical Reports Server (NTRS)

    Wilson, Scott D.; Poriti, Sal

    2010-01-01

    The NASA Glenn Research Center (GRC) has been testing high-efficiency free-piston Stirling convertors for potential use in radioisotope power systems (RPSs) since 1999. The current effort is in support of the Advanced Stirling Radioisotope Generator (ASRG), which is being developed by the U.S. Department of Energy (DOE), Lockheed Martin Space Systems Company (LMSSC), Sunpower, Inc., and the NASA GRC. This generator would use two high-efficiency Advanced Stirling Convertors (ASCs) to convert thermal energy from a radioisotope heat source into electricity. As reliability is paramount to an RPS capable of providing spacecraft power for potential multi-year missions, GRC provides direct technology support to the ASRG flight project in the areas of reliability, convertor and generator testing, high-temperature materials, structures, modeling and analysis, organics, structural dynamics, electromagnetic interference (EMI), and permanent magnets to reduce risk and enhance reliability of the convertor as this technology transitions toward flight status. Convertor and generator testing is carried out in short- and long-duration tests designed to characterize convertor performance when subjected to environments intended to simulate launch and space conditions. Long-duration testing is intended to baseline performance and observe any performance degradation over the life of the test. Testing involves developing support hardware that enables 24/7 unattended operation and data collection. GRC currently has 14 Stirling convertors under unattended extended operation testing, including two operating in the ASRG Engineering Unit (ASRG-EU). Test data and high-temperature support hardware are discussed for ongoing and future ASC tests with emphasis on the ASC-E and ASC-E2.

  16. Center for Advanced Biofuel Systems (CABS) Final Report

    SciTech Connect

    Kutchan, Toni M.

    2015-12-02

    One of the great challenges facing current and future generations is how to meet growing energy demands in an environmentally sustainable manner. Renewable energy sources, including wind, geothermal, solar, hydroelectric, and biofuel energy systems, are rapidly being developed as sustainable alternatives to fossil fuels. Biofuels are particularly attractive to the U.S., given its vast agricultural resources. The first generation of biofuel systems was based on fermentation of sugars to produce ethanol, typically from food crops. Subsequent generations of biofuel systems, including those included in the CABS project, will build upon the experiences learned from those early research results and will have improved production efficiencies, reduced environmental impacts and decreased reliance on food crops. Thermodynamic models predict that the next generations of biofuel systems will yield three- to five-fold more recoverable energy products. To address the technological challenges necessary to develop enhanced biofuel systems, greater understanding of the non-equilibrium processes involved in solar energy conversion and the channeling of reduced carbon into biofuel products must be developed. The objective of the proposed Center for Advanced Biofuel Systems (CABS) was to increase the thermodynamic and kinetic efficiency of select plant- and algal-based fuel production systems using rational metabolic engineering approaches grounded in modern systems biology. The overall strategy was to increase the efficiency of solar energy conversion into oils and other specialty biofuel components by channeling metabolic flux toward products using advanced catalysts and sensible design: (1) employing novel protein catalysts that increase the thermodynamic and kinetic efficiencies of photosynthesis and oil biosynthesis; (2) engineering metabolic networks to enhance acetyl-CoA production and its channeling towards lipid synthesis; and (3) engineering new metabolic networks for the

  17. Advances in Computer-Supported Learning

    ERIC Educational Resources Information Center

    Neto, Francisco; Brasileiro, Francisco

    2007-01-01

    The Internet and growth of computer networks have eliminated geographic barriers, creating an environment where education can be brought to a student no matter where that student may be. The success of distance learning programs and the availability of many Web-supported applications and multimedia resources have increased the effectiveness of…

  18. Advanced Computing Tools and Models for Accelerator Physics

    SciTech Connect

    Ryne, Robert; Ryne, Robert D.

    2008-06-11

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics.

  19. Evaluation of Advanced Computing Techniques and Technologies: Reconfigurable Computing

    NASA Technical Reports Server (NTRS)

    Wells, B. Earl

    2003-01-01

    The focus of this project was to survey the technology of reconfigurable computing to determine its level of maturity and suitability for NASA applications; to better understand and assess the effectiveness of the reconfigurable design paradigm utilized within the HAL-15 reconfigurable computer system, which was made available to NASA MSFC for this purpose by Star Bridge Systems, Inc.; and to implement at least one application that would benefit from the performance levels possible with reconfigurable hardware. It was originally proposed that experiments in fault tolerance and dynamic reconfigurability would be performed, but time constraints mandated that these be pursued as future research.

  20. National Center for Advanced Information Components Manufacturing. Program summary report, Volume 1

    SciTech Connect

    1996-10-01

    The National Center for Advanced Information Components Manufacturing focused on manufacturing research and development for flat panel displays, advanced lithography, microelectronics, and optoelectronics. This report provides an overview of the program, summaries of the technical projects, and key program accomplishments.

  1. 77 FR 37422 - National Center for Advancing Translational Sciences; Notice of Closed Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-21

    ... Panel; Division of Comparative Medicine Peer Review Meeting; Office of Research Infrastructure Programs... Center for Advancing Translational Sciences, National Institutes of Health, 6701 Democracy Blvd., Dem....

  2. ASDA - Advanced Suit Design Analyzer computer program

    NASA Technical Reports Server (NTRS)

    Bue, Grant C.; Conger, Bruce C.; Iovine, John V.; Chang, Chi-Min

    1992-01-01

    An ASDA model developed to evaluate the heat and mass transfer characteristics of advanced pressurized suit design concepts for low pressure or vacuum planetary applications is presented. The model is based on a generalized 3-layer suit that uses the Systems Integrated Numerical Differencing Analyzer '85 in conjunction with a 41-node FORTRAN routine. The latter simulates the transient heat transfer and respiratory processes of a human body in a suited environment. The user options for the suit encompass a liquid cooled garment, a removable jacket, a CO2/H2O permeable layer, and a phase change layer.
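    The 41-node routine described above is a lumped-parameter thermal network: each node has a capacitance, nodes are linked by conductances, and temperatures are marched through time. A minimal sketch of that idea, with invented node values and a three-node body/suit/environment chain rather than ASDA's actual suit parameters:

    ```python
    # Hedged sketch: a minimal lumped-node transient heat-transfer model in the
    # spirit of a numerical-differencing analyzer. All capacitances,
    # conductances, loads, and the time step are illustrative values only.

    def step_temperatures(T, C, G, Q, dt):
        """Advance node temperatures one explicit-Euler step.

        T: node temperatures (K); C: node capacitances (J/K)
        G: {(i, j): conductance W/K} for linked node pairs
        Q: applied heat load per node (W); dt: time step (s)
        """
        n = len(T)
        dTdt = [Q[i] / C[i] for i in range(n)]
        for (i, j), g in G.items():
            flow = g * (T[j] - T[i])      # heat flowing from node j into node i (W)
            dTdt[i] += flow / C[i]
            dTdt[j] -= flow / C[j]
        return [T[i] + dt * dTdt[i] for i in range(n)]

    # Three-node chain: body core -> suit layer -> environment sink.
    T = [310.0, 290.0, 250.0]             # K
    C = [1e5, 5e3, 1e9]                   # near-infinite sink capacitance (J/K)
    G = {(0, 1): 20.0, (1, 2): 5.0}       # W/K
    Q = [100.0, 0.0, 0.0]                 # metabolic heat input at the core node
    for _ in range(1000):                 # simulate 1000 s at 1 s steps
        T = step_temperatures(T, C, G, Q, 1.0)
    ```

    With these numbers the core node cools slowly toward its steady state while the suit layer warms; the huge sink capacitance keeps the environment node essentially fixed.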

  3. Exposure science and the U.S. EPA National Center for Computational Toxicology.

    PubMed

    Cohen Hubal, Elaine A; Richard, Ann M; Shah, Imran; Gallagher, Jane; Kavlock, Robert; Blancato, Jerry; Edwards, Stephen W

    2010-05-01

    The emerging field of computational toxicology applies mathematical and computer models and molecular biological and chemical approaches to explore both qualitative and quantitative relationships between sources of environmental pollutant exposure and adverse health outcomes. The integration of modern computing with molecular biology and chemistry will allow scientists to better prioritize data, inform decision makers on chemical risk assessments and understand a chemical's progression from the environment to the target tissue within an organism and ultimately to the key steps that trigger an adverse health effect. In this paper, several of the major research activities being sponsored by the Environmental Protection Agency's National Center for Computational Toxicology are highlighted. Potential links between research in computational toxicology and human exposure science are identified. As with the traditional approaches for toxicity testing and hazard assessment, exposure science is required to inform design and interpretation of high-throughput assays. In addition, common themes inherent throughout the National Center for Computational Toxicology research activities are highlighted for emphasis as exposure science advances into the 21st century.

  4. Intelligent Software Tools for Advanced Computing

    SciTech Connect

    Baumgart, C.W.

    2001-04-03

    Feature extraction and evaluation are two procedures common to the development of any pattern recognition application. These features are the primary pieces of information which are used to train the pattern recognition tool, whether that tool is a neural network, a fuzzy logic rulebase, or a genetic algorithm. Careful selection of the features to be used by the pattern recognition tool can significantly streamline the overall development and training of the solution for the pattern recognition application. This report summarizes the development of an integrated, computer-based software package called the Feature Extraction Toolbox (FET), which can be used for the development and deployment of solutions to generic pattern recognition problems. This toolbox integrates a number of software techniques for signal processing, feature extraction and evaluation, and pattern recognition, all under a single, user-friendly development environment. The toolbox has been developed to run on a laptop computer, so that it may be taken to a site and used to develop pattern recognition applications in the field. A prototype version of this toolbox has been completed and is currently being used for applications development on several projects in support of the Department of Energy.
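    The abstract does not specify the FET's evaluation criteria, so as a hedged illustration the sketch below ranks candidate features with a Fisher-style separability score, one common criterion for deciding which features to feed a pattern recognition tool. The feature names and sample values are invented.

    ```python
    # Hedged sketch: a Fisher-style class-separability score for evaluating
    # candidate features, shown only to illustrate the kind of ranking a
    # feature-extraction toolbox performs; it is not the FET's documented method.

    def fisher_score(values_a, values_b):
        """Score one feature by between-class vs within-class spread."""
        def mean(xs): return sum(xs) / len(xs)
        def var(xs):
            m = mean(xs)
            return sum((x - m) ** 2 for x in xs) / len(xs)
        num = (mean(values_a) - mean(values_b)) ** 2
        den = var(values_a) + var(values_b) or 1e-12  # guard zero variance
        return num / den

    # Two hypothetical features for two classes: f1 separates them well,
    # f2 barely at all.
    f1_a, f1_b = [0.9, 1.1, 1.0], [3.0, 2.9, 3.1]
    f2_a, f2_b = [5.0, 5.2, 4.8], [5.1, 4.9, 5.0]
    scores = {"f1": fisher_score(f1_a, f1_b), "f2": fisher_score(f2_a, f2_b)}
    best = max(scores, key=scores.get)    # feature to keep for training
    ```

    Ranking features this way before training can shrink the input to the neural network, fuzzy rulebase, or genetic algorithm, which is the streamlining the report describes.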

  5. The Nuclear Energy Advanced Modeling and Simulation Enabling Computational Technologies FY09 Report

    SciTech Connect

    Diachin, L F; Garaizar, F X; Henson, V E; Pope, G

    2009-10-12

    In this document we report on the status of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Enabling Computational Technologies (ECT) effort. In particular, we provide the context for ECT in the broader NEAMS program and describe the three pillars of the ECT effort, namely, (1) tools and libraries, (2) software quality assurance, and (3) computational facility (computers, storage, etc.) needs. We report on our FY09 deliverables to determine the needs of the integrated performance and safety codes (IPSCs) in these three areas and lay out the general plan for software quality assurance to meet the requirements of DOE and the DOE Advanced Fuel Cycle Initiative (AFCI). We conclude with a brief description of our interactions with the Idaho National Laboratory computer center to determine what is needed to expand their role as a NEAMS user facility.

  6. Advances in computer imaging/applications in facial plastic surgery.

    PubMed

    Papel, I D; Jiannetto, D F

    1999-01-01

    Rapidly progressing computer technology, ever-increasing expectations of patients, and a confusing medicolegal environment require clarification of the role of computer imaging/applications. Advances in computer technology and its applications are reviewed. A brief historical discussion is included for perspective. Improvements in both hardware and software with the advent of digital imaging have allowed great increases in speed and accuracy in patient imaging. This facilitates doctor-patient communication and possibly realistic patient expectations. Patients seeking cosmetic surgery now often expect preoperative imaging. Although society in general has become more litigious, a literature search up to 1998 reveals no lawsuits directly involving computer imaging. It appears that conservative utilization of computer imaging by the facial plastic surgeon may actually reduce liability and promote communication. Recent advances have significantly enhanced the value of computer imaging in the practice of facial plastic surgery. These technological advances in computer imaging appear to contribute a useful technique for the practice of facial plastic surgery. Inclusion of computer imaging should be given serious consideration as an adjunct to clinical practice.

  7. Final Report: Center for Programming Models for Scalable Parallel Computing

    SciTech Connect

    Mellor-Crummey, John

    2011-09-13

    As part of the Center for Programming Models for Scalable Parallel Computing, Rice University collaborated with project partners in the design, development and deployment of language, compiler, and runtime support for parallel programming models to support application development for the “leadership-class” computer systems at DOE national laboratories. Work over the course of this project has focused on the design, implementation, and evaluation of a second-generation version of Coarray Fortran. Research and development efforts of the project have focused on the CAF 2.0 language, compiler, runtime system, and supporting infrastructure. This has involved working with the teams that provide the infrastructure CAF relies on; implementing new language and runtime features; producing an open-source compiler that enabled us to evaluate our ideas; and evaluating our design and implementation through the use of benchmarks. The report details the research, development, findings, and conclusions from this work.

  8. An academic medical center's response to widespread computer failure.

    PubMed

    Genes, Nicholas; Chary, Michael; Chason, Kevin W

    2013-01-01

    As hospitals incorporate information technology (IT), their operations become increasingly vulnerable to technological breakdowns and attacks. Proper emergency management and business continuity planning require an approach to identify, mitigate, and work through IT downtime. Hospitals can prepare for these disasters by reviewing case studies. This case study details the disruption of computer operations at Mount Sinai Medical Center (MSMC), an urban academic teaching hospital. The events, and MSMC's response, are narrated and the impact on hospital operations is analyzed. MSMC's disaster management strategy prevented computer failure from compromising patient care, although walkouts and time-to-disposition in the emergency department (ED) notably increased. This incident highlights the importance of disaster preparedness and mitigation. It also demonstrates the value of using operational data to evaluate hospital responses to disasters. Quantifying normal hospital functions, just as with a patient's vital signs, may help quantitatively evaluate and improve disaster management and business continuity planning.

  9. TOPICAL REVIEW: Advances and challenges in computational plasma science

    NASA Astrophysics Data System (ADS)

    Tang, W. M.; Chan, V. S.

    2005-02-01

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behaviour. Recent advances in simulations of magnetically confined plasmas are reviewed in this paper, with illustrative examples, chosen from associated research areas such as microturbulence, magnetohydrodynamics and other topics. Progress has been stimulated, in particular, by the exponential growth of computer speed along with significant improvements in computer technology. The advances in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics have produced increasingly good agreement between experimental observations and computational modelling. This was enabled by two key factors: (a) innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales and (b) access to powerful new computational resources. Excellent progress has been made in developing codes for which computer run-time and problem-size scale well with the number of processors on massively parallel processors (MPPs). Examples include the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPPs to produce three-dimensional, general geometry, nonlinear particle simulations that have accelerated advances in understanding the nature of turbulence self-regulation by zonal flows. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In looking towards the future, the current results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. 

  10. 78 FR 76634 - National Center for Advancing Translational Sciences; Notice of Meetings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-18

    ... 20892. Contact Person: Danilo A Tagle, Ph.D., Executive Secretary, National Center for Advancing Translational Sciences, 1 Democracy Plaza, Room 992, Bethesda, MD 20892, 301-594-8064, Danilo.Tagle@nih.gov.... Contact Person: Danilo A Tagle, Ph.D., Executive Secretary, National Center for Advancing...

  11. Women's Center Volunteer Intern Program: Building Community While Advancing Social and Gender Justice

    ERIC Educational Resources Information Center

    Murray, Margaret A.; Vlasnik, Amber L.

    2015-01-01

    This program description explores the purpose, structure, activities, and outcomes of the volunteer intern program at the Wright State University Women's Center. Designed to create meaningful, hands-on learning experiences for students and to advance the center's mission, the volunteer intern program builds community while advancing social and…

  12. Vanderbilt University Institute of Imaging Science Center for Computational Imaging XNAT: A multimodal data archive and processing environment

    PubMed Central

    Harrigan, Robert L.; Yvernault, Benjamin C.; Boyd, Brian D.; Damon, Stephen M.; Gibney, Kyla David; Conrad, Benjamin N.; Phillips, Nicholas S.; Rogers, Baxter P.; Gao, Yurui; Landman, Bennett A.

    2015-01-01

    The Vanderbilt University Institute for Imaging Science (VUIIS) Center for Computational Imaging (CCI) has developed a database built on XNAT housing over a quarter of a million scans. The database provides a framework for (1) rapid prototyping, (2) large-scale batch processing of images and (3) scalable project management. The system uses the web-based interfaces of XNAT and REDCap to allow for graphical interaction. A Python middleware layer, the Distributed Automation for XNAT (DAX) package, distributes computation across the Vanderbilt Advanced Computing Center for Research and Education high-performance computing center. All software is made available as open source for use in combining portable batch scripting (PBS) grids and XNAT servers. PMID:25988229
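    As a rough sketch of the batch-dispatch pattern described above (not the actual DAX API), the snippet below renders one minimal PBS job script per XNAT session label; the directive values, pipeline script name, and session labels are all invented for illustration.

    ```python
    # Hedged sketch: generating one PBS batch script per imaging session, the
    # general pattern a middleware layer like DAX uses to fan work out to an
    # HPC grid. Walltime, resource line, and commands are illustrative only.

    PBS_TEMPLATE = """#!/bin/bash
    #PBS -N {job_name}
    #PBS -l walltime={walltime}
    #PBS -l nodes=1:ppn=1
    {command}
    """

    def make_pbs_script(job_name, command, walltime="02:00:00"):
        """Render a minimal PBS batch script for one session's processing."""
        return PBS_TEMPLATE.format(job_name=job_name, command=command,
                                   walltime=walltime)

    # One hypothetical job per XNAT session label; a real system would submit
    # each rendered script with qsub.
    sessions = ["subj01_scan1", "subj02_scan1"]
    scripts = {s: make_pbs_script("proc_" + s, "run_pipeline.sh " + s)
               for s in sessions}
    ```

    Keeping the script a template makes the same dispatch code reusable across pipelines, which is the kind of scalable project management the abstract describes.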

  13. CROSSCUTTING TECHNOLOGY DEVELOPMENT AT THE CENTER FOR ADVANCED SEPARATION TECHNOLOGIES

    SciTech Connect

    Hugh W. Rimmer

    2003-11-15

    The U.S. is the largest producer of mining products in the world. In 1999, U.S. mining operations produced $66.7 billion worth of raw materials that contributed a total of $533 billion to the nation's wealth. Despite these contributions, the mining industry has not been well supported with research and development funds as compared to mining industries in other countries. To overcome this problem, the Center for Advanced Separation Technologies (CAST) was established to develop technologies that can be used by the U.S. mining industry to create new products, reduce production costs, and meet environmental regulations. Much of the research to be conducted with Cooperative Agreement funds will be longer-term, high-risk, basic research and will be carried out in five broad areas: (a) Solid-solid separation, (b) Solid-liquid separation, (c) Chemical/Biological Extraction, (d) Modeling and Control, and (e) Environmental Control. Distribution of funds is being handled via competitive solicitation of research proposals through Site Coordinators at the seven member universities. The first of these solicitations, referred to as the CAST II-Round 1 RFP, was issued on October 28, 2002. Thirty-eight proposals were received by the December 10, 2002 deadline for this RFP: eleven (11) Solid-Solid Separation, seven (7) Solid-Liquid Separation, ten (10) Chemical/Biological Extraction, six (6) Modeling & Control and four (4) Environmental Control. These were first reviewed and ranked by a group of technical reviewers (selected primarily from industry). Based on these reviews, and an assessment of overall program requirements, the CAST Technical Committee made an initial selection/ranking of proposals and forwarded these to the DOE/NETL Project Officer for final review and approval.
This process took some 7 months to complete, but 17 projects (one joint) were in place at the constituent universities (three at Virginia Tech, two at West Virginia University, three at University of Kentucky

  14. Computer-Assisted Foreign Language Teaching and Learning: Technological Advances

    ERIC Educational Resources Information Center

    Zou, Bin; Xing, Minjie; Wang, Yuping; Sun, Mingyu; Xiang, Catherine H.

    2013-01-01

    Computer-Assisted Foreign Language Teaching and Learning: Technological Advances highlights new research and an original framework that brings together foreign language teaching, experiments and testing practices that utilize the most recent and widely used e-learning resources. This comprehensive collection of research will offer linguistic…

  15. 76 FR 64330 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-18

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION... Reliability, Diffusion on Complex Networks, and Reversible Software Execution Systems Report from Applied Math... at: (301) 903-7486 or by email at: Melea.Baker@science.doe.gov . You must make your request for...

  16. 78 FR 56871 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-16

    ... Advanced Scientific Computing Advisory Committee AGENCY: Office of Science, Department of Energy. ACTION... Exascale technical approaches subcommittee Facilities update Report from Applied Math Committee of Visitors...: ( Melea.Baker@science.doe.gov ). You must make your request for an oral statement at least five...

  17. The Federal Government's Role in Advancing Computer Technology

    ERIC Educational Resources Information Center

    Information Hotline, 1978

    1978-01-01

    As part of the Federal Data Processing Reorganization Study submitted by the Science and Technology Team, the Federal Government's role in advancing and diffusing computer technology is discussed. Findings and conclusions assess the state-of-the-art in government and in industry, and five recommendations provide directions for government policy…

  18. Advanced computational research in materials processing for design and manufacturing

    SciTech Connect

    Zacharia, T.

    1994-12-31

    The computational requirements for design and manufacture of automotive components have increased dramatically with the push to produce automobiles with three times the mileage. Automotive component design systems are becoming increasingly reliant on structural analysis, requiring larger and more complex analyses: more three-dimensional analyses, larger model sizes, and routine consideration of transient and non-linear effects. Such analyses must be performed rapidly to minimize delays in the design and development process, which drives the need for parallel computing. This paper briefly describes advanced computational research in superplastic forming and automotive crashworthiness.

  19. Center for Programming Models for Scalable Parallel Computing

    SciTech Connect

    John Mellor-Crummey

    2008-02-29

    Rice University's achievements as part of the Center for Programming Models for Scalable Parallel Computing include: (1) design and implementation of cafc, the first multi-platform CAF compiler for distributed and shared-memory machines, (2) performance studies of the efficiency of programs written using the CAF and UPC programming models, (3) a novel technique to analyze explicitly-parallel SPMD programs that facilitates optimization, (4) design, implementation, and evaluation of new language features for CAF, including communication topologies, multi-version variables, and distributed multithreading to simplify development of high-performance codes in CAF, and (5) a synchronization strength reduction transformation for automatically replacing barrier-based synchronization with more efficient point-to-point synchronization. The prototype Co-array Fortran compiler cafc developed in this project is available as open source software from http://www.hipersoft.rice.edu/caf.

  20. Military trauma training at civilian centers: a decade of advancements.

    PubMed

    Thorson, Chad M; Dubose, Joseph J; Rhee, Peter; Knuth, Thomas E; Dorlac, Warren C; Bailey, Jeffrey A; Garcia, George D; Ryan, Mark L; Van Haren, Robert M; Proctor, Kenneth G

    2012-12-01

    In the late 1990s, a Department of Defense subcommittee screened more than 100 civilian trauma centers according to the number of admissions, percentage of penetrating trauma, and institutional interest in relation to the specific training missions of each of the three service branches. By the end of 2001, the Army started a program at University of Miami/Ryder Trauma Center, the Navy began a similar program at University of Southern California/Los Angeles County Medical Center, and the Air Force initiated three Centers for the Sustainment of Trauma and Readiness Skills (C-STARS) at busy academic medical centers: R. Adams Cowley Shock Trauma Center at the University of Maryland (C-STARS Baltimore), Saint Louis University (C-STARS St. Louis), and The University Hospital/University of Cincinnati (C-STARS Cincinnati). Each center focuses on three key areas: didactic training; state-of-the-art simulation and expeditionary equipment training; and actual clinical experience in the acute management of trauma patients. Each is integral to delivering lifesaving combat casualty care in theater. Initially, there were growing pains and a struggle to develop an effective curriculum in a short period. With the foresight of each trauma training center director and a dynamic exchange of information with civilian trauma leaders and frontline war fighters, there has been a continuous evolution and improvement of each center's curriculum. Now, it is clear that the longest military conflict in US history and the first of the 21st century has led to numerous innovations in cutting-edge trauma training on a comprehensive array of topics. This report provides an overview of the decade-long evolutionary process in providing the highest-quality medical care for our injured heroes.

  1. Advances in Numerical Boundary Conditions for Computational Aeroacoustics

    NASA Technical Reports Server (NTRS)

    Tam, Christopher K. W.

    1997-01-01

    Advances in Computational Aeroacoustics (CAA) depend critically on the availability of accurate, nondispersive, minimally dissipative computational algorithms as well as high-quality numerical boundary treatments. This paper focuses on recent developments in numerical boundary conditions. In a typical CAA problem, one often encounters two types of boundaries. Because a finite computation domain is used, there are external boundaries. On the external boundaries, boundary conditions simulating the solution outside the computation domain are to be imposed. Inside the computation domain, there may be internal boundaries. On these internal boundaries, boundary conditions simulating the presence of an object or surface with specific acoustic characteristics are to be applied. Numerical boundary conditions, both external and internal, developed for simple model problems are reviewed and examined. Numerical boundary conditions for real aeroacoustic problems are also discussed through specific examples. The paper concludes with a description of some much needed research in numerical boundary conditions for CAA.
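    As a toy illustration of the external-boundary issue (not a method from the paper), consider 1D advection discretized with first-order upwind differences: the one-sided stencil doubles as an outflow boundary treatment, letting a pulse leave the finite domain without reflection.

    ```python
    # Hedged sketch: the simplest external boundary treatment, shown for the
    # 1D advection equation u_t + c u_x = 0 with c > 0. The first-order upwind
    # stencil only looks upstream, so the last grid point needs no special
    # condition and the pulse exits the domain cleanly.

    def advect(u, c, dx, dt, steps):
        """March u_t + c u_x = 0 with first-order upwind differences."""
        nu = c * dt / dx                   # Courant number, must be <= 1
        for _ in range(steps):
            new = u[:]
            for i in range(1, len(u)):     # one-sided stencil covers the
                new[i] = u[i] - nu * (u[i] - u[i - 1])  # outflow point too
            new[0] = 0.0                   # inflow boundary: nothing entering
            u = new
        return u

    n, dx, c, dt = 100, 1.0, 1.0, 1.0     # nu = 1: exact transport for upwind
    u0 = [1.0 if 10 <= i < 20 else 0.0 for i in range(n)]
    u = advect(u0, c, dx, dt, 200)        # pulse has fully exited the domain
    ```

    Real CAA boundary conditions must do this for waves hitting the boundary at any angle and without the dissipation a low-order scheme introduces, which is why the paper treats them as a research topic in their own right.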

  2. Abstracts of digital computer code packages assembled by the Radiation Shielding Information Center

    SciTech Connect

    Carter, B.J.; Maskewitz, B.F.

    1985-04-01

    This publication, ORNL/RSIC-13, Volumes I to III Revised, has resulted from an internal audit of the first 168 packages of computing technology in the Computer Codes Collection (CCC) of the Radiation Shielding Information Center (RSIC). It replaces the earlier three documents published as single volumes between 1966 and 1972. A significant number of the early code packages were considered to be obsolete and were removed from the collection in the audit process, and the CCC numbers were not reassigned. Others not currently being used by the nuclear R and D community were retained in the collection to preserve technology not replaced by newer methods, or were considered of potential value for reference purposes. Much of the early technology, however, has improved through developer/RSIC/user interaction and continues at the forefront of the advancing state-of-the-art.

  3. Research Institute for Advanced Computer Science: Annual Report October 1998 through September 1999

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.; Gross, Anthony R. (Technical Monitor)

    1999-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center (ARC). It currently operates under a multiple-year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. ARC has been designated NASA's Center of Excellence in Information Technology. In this capacity, ARC is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA ARC and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities. (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution.
In addition, RIACS collaborates with NASA scientists to apply information technology research to

  4. Advanced sensor-computer technology for urban runoff monitoring

    NASA Astrophysics Data System (ADS)

    Yu, Byunggu; Behera, Pradeep K.; Ramirez Rochac, Juan F.

    2011-04-01

The paper presents the project team's advanced sensor-computer sphere technology for real-time and continuous monitoring of wastewater runoff at the sewer discharge outfalls along the receiving water. This research significantly enhances and extends the previously proposed sensor-computer technology. The advanced technology offers new computation models for an innovative use of the sensor-computer sphere, comprising an accelerometer, a programmable in-situ computer, solar power, and wireless communication, for real-time and online monitoring of runoff quantity. This innovation can enable more effective planning and decision-making in civil infrastructure, natural environment protection, and water pollution-related emergencies. The paper presents the following: (i) the sensor-computer sphere technology; (ii) a significant enhancement to the previously proposed discrete runoff quantity model of this technology; (iii) a new continuous runoff quantity model. A comparative study of the two models is presented. Based on this study, the paper further investigates the following: (1) energy-, memory-, and communication-efficient use of the technology for runoff monitoring; (2) possible sensor extensions for runoff quality monitoring.
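To illustrate the distinction between a discrete and a continuous runoff quantity model (the abstract does not give the models' equations; the sampling scheme and the trapezoidal treatment below are illustrative assumptions, not the paper's formulation):

```python
# Illustrative only: two ways to estimate runoff volume from flow rates
# q(t) [m^3/s] sampled at times t [s]. This merely contrasts a discrete
# (step-function) summation with a continuous (piecewise-linear) one.

def discrete_volume(times, flows):
    """Treat each sample as constant until the next reading (step function)."""
    return sum(q * (t2 - t1)
               for (t1, t2), q in zip(zip(times, times[1:]), flows))

def continuous_volume(times, flows):
    """Assume piecewise-linear flow between samples (trapezoidal rule)."""
    return sum((q1 + q2) / 2 * (t2 - t1)
               for (t1, q1), (t2, q2) in zip(zip(times, flows),
                                            zip(times[1:], flows[1:])))

t = [0, 60, 120, 180]      # seconds
q = [0.0, 0.4, 0.6, 0.2]   # m^3/s
print(discrete_volume(t, q), continuous_volume(t, q))
```

The continuous model credits the flow ramping between readings, so the two estimates diverge most when the hydrograph changes quickly relative to the sampling interval.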

  5. Economizer Based Data Center Liquid Cooling with Advanced Metal Interfaces

    SciTech Connect

    Timothy Chainer

    2012-11-30

A new chiller-less data center liquid cooling system utilizing the outside air environment has been shown to achieve up to a 90% reduction in cooling energy compared to traditional chiller-based data center cooling systems. The system removes heat from volume servers inside a sealed rack and transports the heat through a liquid loop to an outdoor heat exchanger, which rejects the heat to the ambient environment. The servers in the rack are cooled by a hybrid system: the majority of the heat, generated by the processors and memory, is removed by direct thermal conduction using cold plates, and the heat generated by the remaining components is removed by forced-air convection to an air-to-liquid heat exchanger inside the sealed rack. The anticipated benefit of such energy-centric configurations is significant energy savings at the data center level. When compared to a traditional 10 MW data center, which typically uses 25% of its total energy consumption for cooling, this technology could potentially enable cost savings of up to $800,000-$2,200,000/year (assuming electricity costs of 4 to 11 cents per kilowatt-hour) through the reduction in electrical energy usage.
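The quoted savings range can be sanity-checked with a back-of-the-envelope calculation (a sketch; the steady cooling load and 8,760-hour year are our simplifying assumptions, not the report's):

```python
# Rough check of the quoted cooling-cost savings: 10 MW total load,
# 25% of it spent on cooling, 90% of that eliminated.
HOURS_PER_YEAR = 24 * 365  # 8,760

def annual_savings_usd(total_load_mw, cooling_fraction, reduction, price_per_kwh):
    """Annual dollar savings from cutting the cooling load by `reduction`."""
    cooling_mw = total_load_mw * cooling_fraction        # 10 MW * 25% = 2.5 MW
    saved_kwh = cooling_mw * reduction * 1000 * HOURS_PER_YEAR
    return saved_kwh * price_per_kwh

low = annual_savings_usd(10, 0.25, 0.90, 0.04)   # ~ $788,400
high = annual_savings_usd(10, 0.25, 0.90, 0.11)  # ~ $2,168,100
print(f"${low:,.0f} - ${high:,.0f}")
```

The result brackets the quoted $800,000-$2,200,000/year figure, which suggests the report's range comes from essentially this arithmetic applied at the two electricity-price endpoints.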

  6. The Design and Implementation of NASA's Advanced Flight Computing Module

    NASA Technical Reports Server (NTRS)

    Alkakaj, Leon; Straedy, Richard; Jarvis, Bruce

    1995-01-01

This paper describes a working flight computer multichip module (MCM) developed jointly by JPL and TRW under their respective research programs. The MCM is fabricated by nCHIP and is packaged within a 2 by 4 inch aluminum package from Coors. This flight computer module is one of three modules under development by NASA's Advanced Flight Computer (AFC) program; development of the Mass Memory and programmable I/O MCM modules will follow. The three building-block modules will then be stacked into a 3D MCM configuration. The mass and volume achieved for the flight computer MCM, 89 grams and 1.5 cubic inches respectively, represent a major enabling technology for future deep space as well as commercial remote sensing applications.

  7. Center for Technology for Advanced Scientific Component Software (TASCS)

    SciTech Connect

    Bramley, Randall B

    2012-08-02

Indiana University's SWIM activities have primarily been in three areas. All are complete, but we continue to work on two of them because the refinements are useful both to DOE laboratories and to the high-performance computing community.

  8. Advanced Simulation and Computing FY17 Implementation Plan, Version 0

    SciTech Connect

    McCoy, Michel; Archer, Bill; Hendrickson, Bruce; Wade, Doug; Hoang, Thuc

    2016-08-29

The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance; experimental research, development and engineering programs; and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support them. The Advanced Simulation and Computing (ASC) Program is a cornerstone of the SSP, providing the simulation capabilities and computational resources that support annual stockpile assessment and certification, support studies of advanced nuclear weapons design and manufacturing processes, enable analysis of accident scenarios and weapons aging, and provide the tools for stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (with sufficient resolution, dimensionality, and scientific detail) and for quantifying critical margins and uncertainties. Resolving each issue requires increasingly difficult analyses because the aging process has progressively moved the stockpile further from the original test base. Where possible, the program also enables the use of high performance computing (HPC) and simulation tools to address broader national security needs, such as foreign nuclear weapon assessments and counter nuclear terrorism.

  9. ADVANCED COMPOSITES TECHNOLOGY CASE STUDY AT NASA LANGLEY RESEARCH CENTER

    EPA Science Inventory

    This report summarizes work conducted at the National Aeronautics and Space Administration's Langley Research Center (NASA-LaRC) in Hampton, VA, under the U.S. Environmental Protection Agency’s (EPA) Waste Reduction Evaluations at Federal Sites (WREAFS) Program. Support for...

  10. Center for Advanced Power and Energy Research (CAPEC)

    DTIC Science & Technology

    2015-01-01

University structured through a cooperative research agreement. Our organizational focuses include: 1. Modeling of plasma physics; 2. Modeling fuel cells; 3. Testing new innovations and ideas for advanced fuel cells; 4. Development of energy-related issues for micro air vehicles (MAVs).

  11. Advanced Composite Structures At NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Eldred, Lloyd B.

    2015-01-01

    Dr. Eldred's presentation will discuss several NASA efforts to improve and expand the use of composite structures within aerospace vehicles. Topics will include an overview of NASA's Advanced Composites Project (ACP), Space Launch System (SLS) applications, and Langley's ISAAC robotic composites research tool.

  12. Computers in aeronautics and space research at the Lewis Research Center

    NASA Technical Reports Server (NTRS)

    1991-01-01

    This brochure presents a general discussion of the role of computers in aerospace research at NASA's Lewis Research Center (LeRC). Four particular areas of computer applications are addressed: computer modeling and simulation, computer assisted engineering, data acquisition and analysis, and computer controlled testing.

  13. Data-Base Systems Pace Growth of Computer Center at Major Midwestern University.

    ERIC Educational Resources Information Center

    Technological Horizons in Education, 1982

    1982-01-01

    Describes Iowa State University's Computation Center. Center's responsibilities include computer science research and providing computing services to the university. System Industries' Trade and Exchange program (permitting used Digital Equipment Corporation disk drives to be exchanged for new SI equipment at substantial cost savings) was used to…

  14. Advanced Measurement Technology at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Antcliff, Richard R.

    1998-01-01

Instrumentation systems have always been essential components of world-class wind tunnels and laboratories. Langley continues to be at the forefront of the development of advanced systems for aerospace applications. This paper will describe recent advances in selected measurement systems which have had significant impact on aerospace testing. To fully understand the aerodynamics and aerothermodynamics influencing aerospace vehicles, highly accurate and repeatable measurements need to be made of critical phenomena. However, to maintain leadership in a highly competitive world market, productivity enhancement and the development of new capabilities must also be addressed aggressively. Accomplishing these sometimes conflicting requirements has been the challenge of advanced measurement developers. However, several new technologies have recently matured to the point where they have enabled the achievement of these goals. One of the critical areas where advanced measurement systems are required is flow field velocity measurements. These measurements are required to correctly characterize the flow field under study, to quantify the aerodynamic performance of test articles, and to assess the effect of aerodynamic vehicles on their environment. Advanced measurement systems are also making great strides in obtaining planar measurements of other important thermodynamic quantities, including species concentration, temperature, pressure, and the speed of sound. Langley has been at the forefront of applying these technologies to practical wind tunnel environments. New capabilities in Projection Moire Interferometry and Acoustics Array Measurement systems have extended our capabilities into the model deformation, vibration, and noise measurement arenas. An overview of the status of these techniques and recent applications in practical environments will be presented in this paper.

  15. Advanced computer modeling techniques expand belt conveyor technology

    SciTech Connect

    Alspaugh, M.

    1998-07-01

Increased mining production continues to challenge engineers and manufacturers, and the pressure to produce larger and more versatile equipment is increasing. This paper will show some recent major projects in the belt conveyor industry that have pushed the limits of design and engineering technology. It will also discuss the systems engineering discipline and the advanced computer modeling tools that have helped make these achievements possible. Several examples of technologically advanced designs will be reviewed. However, new technology can sometimes produce increased problems with equipment availability and reliability if not carefully developed. Computer modeling techniques that help one design larger equipment can also compound operational headaches if engineering processes and algorithms are not carefully analyzed every step of the way.

  16. Advances in Cross-Cutting Ideas for Computational Climate Science

    SciTech Connect

    Ng, Esmond; Evans, Katherine J.; Caldwell, Peter; Hoffman, Forrest M.; Jackson, Charles; Kerstin, Van Dam; Leung, Ruby; Martin, Daniel F.; Ostrouchov, George; Tuminaro, Raymond; Ullrich, Paul; Wild, S.; Williams, Samuel

    2017-01-01

This report presents results from the DOE-sponsored workshop titled ``Advancing X-Cutting Ideas for Computational Climate Science Workshop,'' known as AXICCS, held on September 12--13, 2016 in Rockville, MD. The workshop brought together experts in climate science, computational climate science, computer science, and mathematics to discuss interesting but unsolved science questions regarding climate modeling and simulation, promoted collaboration among the diverse scientists in attendance, and brainstormed about possible tools and capabilities that could be developed to help address them. Several research opportunities emerged from the workshop discussions that the group felt could advance climate science significantly. These include (1) process-resolving models to provide insight into important processes and features of interest and to inform the development of advanced physical parameterizations, (2) a community effort to develop and provide integrated model credibility, (3) approaches for including, organizing, and managing increasingly connected model components that increase model fidelity but also complexity, and (4) treating Earth system models as one interconnected organism without numerical or data-based boundaries that limit interactions. The group also identified several cross-cutting advances in mathematics, computer science, and computational science that would be needed to enable one or more of these big ideas. It is critical to address the need for organized, verified, and optimized software, which enables the models to grow and continue to provide solutions in which the community can have confidence. Effectively utilizing the newest computer hardware enables simulation efficiency and the ability to handle output from increasingly complex and detailed models. This will be accomplished through hierarchical multiscale algorithms in tandem with new strategies for data handling, analysis, and storage. These big ideas and cross-cutting technologies for enabling

  17. Personal Computers, Weather Observations, and the National Climatic Data Center.

    NASA Astrophysics Data System (ADS)

    Heim, Richard, Jr.

    1988-05-01

The personal computer (PC) has become an important part of meteorological observing, telecommunications, forecasting, research, and data-management systems. The National Climatic Data Center (NCDC) is the nation's quality-control and archival facility for weather data. NCDC's digital archive consists of more than 200 data sets which are stored on over 50 000 reels of high-density magnetic tape. Its size and complexity make on-line access to the complete archive via PC and modem impractical. However, NCDC recognizes the growing importance of PCs in climatic applications and, since 1984, has made selected data sets available in a PC-readable format. The data sets available on diskette fall into the following broad categories: hourly observations, daily observations, derived quantities, and summary statistics. The period of record varies with each data set and with each station. In the digital archive, daily observations generally begin in the late 1800's to the early 1900's, and hourly observations generally begin in the mid 1900's. A review of NCDC data operations and products puts the digital archive into an operational perspective. The two formats (BASIC sequential element, and fixed-position fields) in which data-set diskettes are available are summarized. BASIC-sequential-element files can be "imported" into a LOTUS-type spreadsheet. NCDC is also responsible for describing the nation's climate. These functions have been condensed into a climatological data-management and analysis software package, called CLICOM, which can be run on a PC.
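As a sketch of what reading a fixed-position-field record might look like on a PC (the column layout below is hypothetical, invented purely for illustration; it does not reproduce any actual NCDC diskette format, which is documented separately):

```python
# Hypothetical fixed-position daily-observation record. The field layout
# here is invented for illustration and does NOT match a real NCDC format.
FIELDS = {                     # name: (start, end) as 0-based slice bounds
    "station_id": (0, 6),
    "date":       (6, 14),    # YYYYMMDD
    "tmax_f":     (14, 18),   # right-justified whole degrees F
    "tmin_f":     (18, 22),
    "precip_in":  (22, 27),   # hundredths of an inch
}

def parse_record(line):
    """Slice one fixed-width record into a dict of typed values."""
    rec = {name: line[a:b].strip() for name, (a, b) in FIELDS.items()}
    rec["tmax_f"] = int(rec["tmax_f"])
    rec["tmin_f"] = int(rec["tmin_f"])
    rec["precip_in"] = int(rec["precip_in"]) / 100.0
    return rec

print(parse_record("72406019880501  72  55  012"))
```

Fixed-position formats like this are why such files load cleanly into spreadsheet columns: every field sits at a known character offset in every record.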

  18. Advanced Life Support Project: Crop Experiments at Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Sager, John C.; Stutte, Gary W.; Wheeler, Raymond M.; Yorio, Neil

    2004-01-01

Crop production systems provide bioregenerative technologies to complement human crew life support requirements on long duration space missions. Kennedy Space Center has led NASA's research on crop production systems that produce high value fresh foods, provide atmospheric regeneration, and perform water processing. As the emphasis on early missions to Mars has developed, our research has focused on modular, scalable systems for transit missions, which can be developed into larger autonomous, bioregenerative systems for subsequent surface missions. Components of these scalable systems will include efficient light generating or collecting technologies, low mass plant growth chambers, and the capability to operate in the high energy background radiation and reduced atmospheric pressures of space. These systems will be integrated with air, water, and thermal subsystems in an operational system. Extensive crop testing has been done for both staple and salad crops, but limited data are available on specific cultivar selection and breadboard testing to meet nominal Mars mission profiles of a 500-600 day surface mission. The recent research emphasis at Kennedy Space Center has shifted from staple crops, such as wheat, soybean, and rice, toward short cycle salad crops such as lettuce, onion, radish, tomato, pepper, and strawberry. This paper will review the results of crop experiments to support the Exploration Initiative and the ongoing development of supporting technologies, and give an overview of the capabilities of the newly opened Space Life Science (SLS) Lab at Kennedy Space Center. The 9662 square m (104,000 square ft) SLS Lab was built by the State of Florida and supports all NASA research that had been performed in Hangar-L.
In addition to NASA research, the SLS Lab houses the Florida Space Research Institute (FSRI), responsible for co-managing the facility, and the University of Florida (UF) has established the Space Agriculture and Biotechnology Research and

  19. Advanced Material Intelligent Processing Center: Next Generation Scalable Lean Manufacturing

    DTIC Science & Technology

    2012-09-04

machines and have made significant advances to automated tape laying (ATL) and automated fiber placement (AFP) technologies. Companies are moving...beyond standard thermoplastic and thermoset prepregs and are looking at placing out-of-autoclave (OoA) prepregs as well as dry fabrics. Today, automated tape laying (ATL...References: [1] Michael N. Grimshaw, "Automated Tape Laying," in ASM Handbook Vol. 21: Composites, ASM International, 2001. [2] Obaid Younossi, Michael

  20. 78 FR 26377 - National Center for Advancing Translational Sciences; Notice of Meetings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-06

..., Bethesda, MD 20892. Contact Person: Danilo A. Tagle, Ph.D., Executive Secretary, National Center for Advancing Translational Sciences; ...Tagle@nih.gov. This notice is being published less than 15 days prior to the meeting due to scheduling...

  1. NASA University Research Centers Technical Advances in Education, Aeronautics, Space, Autonomy, Earth and Environment

    NASA Technical Reports Server (NTRS)

    Jamshidi, M. (Editor); Lumia, R. (Editor); Tunstel, E., Jr. (Editor); White, B. (Editor); Malone, J. (Editor); Sakimoto, P. (Editor)

    1997-01-01

This first volume of the Autonomous Control Engineering (ACE) Center Press Series on NASA University Research Centers' (URCs') Advanced Technologies on Space Exploration and National Service constitutes a report on the research papers and presentations delivered by NASA installations, industry, and NASA's fourteen URCs at the First National Conference in Albuquerque, New Mexico, February 16-19, 1997.

  2. National Center for Advanced Information Components Manufacturing. Program summary report, Volume II

    SciTech Connect

    1996-10-01

    The National Center for Advanced Information Components Manufacturing focused on manufacturing research and development for flat panel displays, advanced lithography, microelectronics, and optoelectronics. This report provides an overview of the program, program history, summaries of the technical projects, and key program accomplishments.

  3. The advanced computational testing and simulation toolkit (ACTS)

    SciTech Connect

    Drummond, L.A.; Marques, O.

    2002-05-21

During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Notably, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips, but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. 
The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

  4. High-Performance Computing for Advanced Smart Grid Applications

    SciTech Connect

    Huang, Zhenyu; Chen, Yousu

    2012-07-06

The power grid is becoming far more complex as the grid evolution meets an information revolution. With the penetration of smart grid technologies, the grid is evolving at an unprecedented speed, and the information infrastructure is fundamentally improved by a large number of smart meters and sensors that produce orders of magnitude more data. How to pull data in, perform analysis, and put information out in a real-time manner is a fundamental challenge in smart grid operation and planning. The future power grid requires high performance computing to be one of the foundational technologies for developing the algorithms and tools to handle the significantly increased complexity. New techniques and computational capabilities are required to meet the demands for higher reliability and better asset utilization, including advanced algorithms and computing hardware for large-scale modeling, simulation, and analysis. This chapter summarizes the computational challenges in the smart grid and the need for high performance computing, and presents examples of how high performance computing might be used for future smart grid operation and planning.
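A toy illustration of the "pull data in, perform analysis, put information out" pattern the chapter describes (the data layout and the thread-pool parallelism are our own simplifications for exposition; the authors' HPC work targets far larger scales and different hardware):

```python
# Toy sketch: aggregate simulated smart-meter readings in parallel chunks.
# Real smart-grid HPC uses MPI clusters and accelerators; a thread pool
# merely illustrates the partition-then-combine structure.
from concurrent.futures import ThreadPoolExecutor

def chunk_stats(readings):
    """Per-chunk partial aggregates: (count, total kWh, peak kWh)."""
    return len(readings), sum(readings), max(readings)

def grid_summary(meter_readings, n_workers=4):
    # Partition readings across workers, then combine the partial results.
    chunks = [meter_readings[i::n_workers] for i in range(n_workers)]
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        partials = list(pool.map(chunk_stats, chunks))
    return {
        "meters": sum(c for c, _, _ in partials),
        "total_kwh": sum(t for _, t, _ in partials),
        "peak_kwh": max(p for _, _, p in partials),
    }

readings = [1.2, 0.8, 2.5, 1.1, 0.9, 3.0, 1.7, 0.4]  # kWh per meter
print(grid_summary(readings))
```

The essential point is that the per-chunk aggregates are associative, so the combine step is cheap; that same structure is what lets the real analyses scale across many nodes.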

  5. The Center for Advanced Systems and Engineering (CASE)

    DTIC Science & Technology

    2012-01-01

Multiple Criteria Workflow Robustness and Resiliency Modeling with Petri Nets. The increasing complexity and tight coupling between people and computer...management framework capable of both modeling structure and providing a wide range of quantitative analysis with high-level Petri nets (PNs). Their...Infosphere in net-centric environments. By supporting real-time information dissemination in a context-sensitive fashion, our research is expected to

  6. Computer architectures for computational physics work done by Computational Research and Technology Branch and Advanced Computational Concepts Group

    NASA Technical Reports Server (NTRS)

    1985-01-01

Slides are reproduced that describe the importance of having high performance number crunching and graphics capability. They also indicate the types of research and development underway at Ames Research Center to ensure that, in the near term, Ames is a smart buyer and user, and, in the long term, that Ames knows the best possible solutions for number crunching and graphics needs. The drivers for this research are real computational physics applications of interest to Ames and NASA. They are concerned with how to map the applications, and how to maximize the physics learned from the results of the calculations. The computer graphics activities are aimed at getting maximum information from the three-dimensional calculations by using real-time manipulation of three-dimensional data on the Silicon Graphics workstation. Work is underway on new algorithms that will permit the display of experimental results that are sparse and random, in the same way that the dense and regular computed results are displayed.

  7. Advanced interactive displays for deployable command and control centers

    NASA Astrophysics Data System (ADS)

    Jedrysik, Peter A.; Parada, Francisco E.; Stedman, Terrance A.; Zhang, Jingyuan

    2003-09-01

    Command and control in today's battlefield environment requires efficient and effective control of massive amounts of constantly changing information from a variety of databases and real-time sensors. Using advanced information technology for presentation and interactive control enables more extensive data fusion and correlation to present an accurate picture of the battlespace to commanders and their staffs. The Interactive DataWall being developed by the Advanced Displays and Intelligent Interfaces (ADII) technology team of the Air Force Research Laboratory's Information Directorate (AFRL/IF) is a strong contender for solving the information management problems facing the 21st century military commander. It provides an ultra high-resolution large screen display with multi-modal, wireless interaction. Commercial off-the-shelf (COTS) technology has been combined with specialized hardware and software developed in-house to provide a unique capability for multimedia data display and control. The technology once isolated to a laboratory environment has been packaged into deployable systems that have been successfully transitioned to support the warfighter in the field.

  8. Advances and trends in the development of computational models for tires

    NASA Technical Reports Server (NTRS)

    Noor, A. K.; Tanner, J. A.

    1985-01-01

    Status and some recent developments of computational models for tires are summarized. Discussion focuses on a number of aspects of tire modeling and analysis including: tire materials and their characterization; evolution of tire models; characteristics of effective finite element models for analyzing tires; analysis needs for tires; and impact of the advances made in finite element technology, computational algorithms, and new computing systems on tire modeling and analysis. An initial set of benchmark problems has been proposed in concert with the U.S. tire industry. Extensive sets of experimental data will be collected for these problems and used for evaluating and validating different tire models. Also, the new Aircraft Landing Dynamics Facility (ALDF) at NASA Langley Research Center is described.

  9. Computation of the tip vortex flowfield for advanced aircraft propellers

    NASA Technical Reports Server (NTRS)

    Tsai, Tommy M.; Dejong, Frederick J.; Levy, Ralph

    1988-01-01

    The tip vortex flowfield plays a significant role in the performance of advanced aircraft propellers. The flowfield in the tip region is complex, three-dimensional and viscous with large secondary velocities. An analysis is presented using an approximate set of equations which contains the physics required by the tip vortex flowfield, but which does not require the resources of the full Navier-Stokes equations. A computer code was developed to predict the tip vortex flowfield of advanced aircraft propellers. A grid generation package was developed to allow specification of a variety of advanced aircraft propeller shapes. Calculations of the tip vortex generation on an SR3 type blade at high Reynolds numbers were made using this code and a parametric study was performed to show the effect of tip thickness on tip vortex intensity. In addition, calculations of the tip vortex generation on a NACA 0012 type blade were made, including the flowfield downstream of the blade trailing edge. Comparison of flowfield calculations with experimental data from an F4 blade was made. A user's manual was also prepared for the computer code (NASA CR-182178).

  10. User participation in the development of the human/computer interface for control centers

    NASA Technical Reports Server (NTRS)

    Broome, Richard; Quick-Campbell, Marlene; Creegan, James; Dutilly, Robert

    1996-01-01

    Technological advances coupled with the requirements to reduce operations staffing costs led to the demand for efficient, technologically-sophisticated mission operations control centers. The control center under development for the earth observing system (EOS) is considered. The users are involved in the development of a control center in order to ensure that it is cost-efficient and flexible. A number of measures were implemented in the EOS program in order to encourage user involvement in the area of human-computer interface development. The following user participation exercises carried out in relation to the system analysis and design are described: the shadow participation of the programmers during a day of operations; the flight operations personnel interviews; and the analysis of the flight operations team tasks. The user participation in the interface prototype development, the prototype evaluation, and the system implementation are reported on. The involvement of the users early in the development process enables the requirements to be better understood and the cost to be reduced.

  11. Report on Advanced Life Support Activities at Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Wheeler, Raymond M.

    2004-01-01

Plant studies at Kennedy Space Center last year focused on selecting cultivars of lettuce, tomato, and pepper for further testing as crops for near-term space flight applications. Other testing continued with lettuce, onion, and radish plants grown at different combinations of light (PPF), temperature, and CO2 concentration. In addition, mixed versus mono-culture approaches for vegetable production were compared. Water processing testing focused on the development and testing of a rotating membrane bioreactor to increase oxygen diffusion levels for reducing total organic carbon levels and promoting nitrification. Other work continued on composting of food wastes (NRA grant) and on the use of supplemental green light with red/blue LED lighting systems for plant production (NRC fellowship).

  12. XII Advanced Computing and Analysis Techniques in Physics Research

    NASA Astrophysics Data System (ADS)

    Speer, Thomas; Carminati, Federico; Werlen, Monique

November 2008 will be a few months after the official start of the LHC, when the highest quantum energy ever produced by mankind will be observed by the most complex piece of scientific equipment ever built. The LHC will open a new era in physics research and push further the frontier of knowledge. This achievement has been made possible by new technological developments in many fields, but computing is certainly the technology that has made this whole enterprise possible. Accelerator and detector design, construction management, data acquisition, detector monitoring, data analysis, event simulation, and theoretical interpretation are all computing-based HEP activities, but they also occur in many other research fields. Computing is everywhere and forms the common link between all involved scientists and engineers. The ACAT workshop series, created back in 1990 as AIHENP (Artificial Intelligence in High Energy and Nuclear Physics), has been covering the tremendous evolution of computing in its most advanced topics, trying to set up bridges between computer science and experimental and theoretical physics. Conference web-site: http://acat2008.cern.ch/ Programme and presentations: http://indico.cern.ch/conferenceDisplay.py?confId=34666

  13. High resolution computed tomography of advanced composite and ceramic materials

    NASA Technical Reports Server (NTRS)

    Yancey, R. N.; Klima, S. J.

    1991-01-01

    Advanced composite and ceramic materials are being developed for use in many new defense and commercial applications. In order to achieve the desired mechanical properties of these materials, the structural elements must be carefully analyzed and engineered. A study was conducted to evaluate the use of high resolution computed tomography (CT) as a macrostructural analysis tool for advanced composite and ceramic materials. Several samples were scanned using a laboratory high resolution CT scanner. Samples were also destructively analyzed at the locations of the scans and the nondestructive and destructive results were compared. The study provides useful information outlining the strengths and limitations of this technique and the prospects for further research in this area.

  14. A microtomography beamline at the Louisiana State University Center for Advanced Microstructures and Devices synchrotron

    NASA Astrophysics Data System (ADS)

    Ham, Kyungmin; Jin, Hua; Butler, Leslie G.; Kurtz, Richard L.

    2002-03-01

    A microtomography beamline has been recently assembled and is currently operating at the Louisiana State University's Center for Advanced Microstructures and Devices synchrotron (CAMD). It has been installed on a bending magnet white-light beamline at port 7A. With the storage ring operating at 1.5 GeV, this beamline has a maximum usable x-ray energy of ˜15 keV. The instrumentation consists of computer-controlled positioning stages for alignment and rotation, a CsI(Tl) phosphor screen, a reflecting mirror, a microscope objective (1:1, 1:4), and a Linux/LabVIEW-controlled charge-coupled device. With the 1:4 objective, the maximum spatial resolution is 2.25 μm. The positioning and image acquisition computers communicate via transmission control protocol/internet protocol (TCP/IP). A small G4/Linux cluster has been installed for the purpose of on-site reconstruction. Instrument, alignment, and reconstruction programs are written in MATLAB, IDL, and C. The applications to date are many and we present several examples. Several biological samples have been studied as part of an effort on biological visualization and computation. Future improvements to this microtomography station include the addition of a double-multilayer monochromator, allowing one to evaluate the three-dimensional elemental composition of materials. Plans also include eventual installation at the CAMD 7 T wiggler beamline, providing x rays in excess of 50 keV for better penetration of higher mass-density materials.

  15. Advances in Materials Research: An Internship at Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Barrios, Elizabeth A.; Roberson, Luke B.

    2011-01-01

    My time at Kennedy Space Center was spent immersing myself in research performed in the Materials Science Division of the Engineering Directorate. My Chemical Engineering background provided me the ability to assist in many different projects ranging from tensile testing of composite materials to making tape via an extrusion process. However, I spent the majority of my time on the following three projects: (1) testing three different materials to determine antimicrobial properties; (2) fabricating and analyzing hydrogen sensing tapes that were placed at the launch pad for the STS-133 launch; and (3) researching molten regolith electrolysis at KSC to prepare me for my summer internship at MSFC on a closely related topic. This paper aims to explain, in detail, what I have learned about these three main projects. It will explain why this research is happening and what we are currently doing to resolve the issues. This paper will also explain how the hard work and experiences that I have gained as an intern have provided me with the next big step towards my career at NASA.

  16. Development of Advanced Hydrocarbon Fuels at Marshall Space Flight Center

    NASA Technical Reports Server (NTRS)

    Bai, S. D.; Dumbacher, P.; Cole, J. W.

    2002-01-01

    This was a small-scale, hot-fire test series to make initial measurements of performance differences of five new liquid fuels relative to rocket propellant-1 (RP-1). The program was part of a high-energy-density materials development at Marshall Space Flight Center (MSFC), and the fuels tested were quadricyclane, 1,7-octadiyne, AFRL-1, bicyclopropylidene, and competitive impulse noncarcinogenic hypergol (CINCH, dimethylaminoethyl azide). All tests were conducted at MSFC. The first four fuels were provided by the U.S. Air Force Research Laboratory (AFRL), Edwards Air Force Base, CA. The U.S. Army, Redstone Arsenal, Huntsville, AL, provided the CINCH. The data recorded in all hot-fire tests were used to calculate specific impulse and characteristic exhaust velocity for each fuel, which were then compared to RP-1 at the same conditions. This was not an exhaustive study comparing each fuel to RP-1 at an array of mixture ratios, nor did it address important fuel parameters such as fuel handling or long-term storage. The test hardware was designed for liquid oxygen (lox)/RP-1, then modified for gaseous oxygen/RP-1 to avoid two-phase lox at very small flow rates. All fuels were tested using the same thruster/injector combination designed for RP-1. The results of this test series will be used to determine which fuels will be tested in future programs.

  17. Advances in Computational Stability Analysis of Composite Aerospace Structures

    SciTech Connect

    Degenhardt, R.; Araujo, F. C. de

    2010-09-30

    The European aircraft industry demands reduced development and operating costs. Structural weight reduction by exploitation of structural reserves in composite aerospace structures contributes to this aim; however, it requires accurate and experimentally validated stability analysis of real structures under realistic loading conditions. This paper presents several advances in the computational stability analysis of composite aerospace structures that contribute to this field. For stringer-stiffened panels, the main results of the completed EU project COCOMAT are given; it investigated the exploitation of reserves in primary fibre composite fuselage structures through accurate and reliable simulation of postbuckling and collapse. For unstiffened cylindrical composite shells, a proposal for a new design method is presented.

  18. Recent Advances in Computed Tomographic Technology: Cardiopulmonary Imaging Applications.

    PubMed

    Tabari, Azadeh; Lo Gullo, Roberto; Murugan, Venkatesh; Otrakji, Alexi; Digumarthy, Subba; Kalra, Mannudeep

    2017-03-01

    Cardiothoracic diseases result in substantial morbidity and mortality. Chest computed tomography (CT) has been an imaging modality of choice for assessing a host of chest diseases, and technologic advances have enabled the emergence of coronary CT angiography as a robust noninvasive test for cardiac imaging. Technologic developments in CT have also enabled the application of dual-energy CT scanning for assessing pulmonary vascular and neoplastic processes. Concerns over increasing radiation dose from CT scanning are being addressed with introduction of more dose-efficient wide-area detector arrays and iterative reconstruction techniques. This review article discusses the technologic innovations in CT and their effect on cardiothoracic applications.

  19. Advanced Computer Science on Internal Ballistics of Solid Rocket Motors

    NASA Astrophysics Data System (ADS)

    Shimada, Toru; Kato, Kazushige; Sekino, Nobuhiro; Tsuboi, Nobuyuki; Seike, Yoshio; Fukunaga, Mihoko; Daimon, Yu; Hasegawa, Hiroshi; Asakawa, Hiroya

    This paper describes the development of a numerical simulation system, which we call “Advanced Computer Science on SRM Internal Ballistics (ACSSIB)”, for improving the performance and reliability of solid rocket motors (SRM). The ACSSIB system consists of a casting simulation code for solid propellant slurry, a correlation database relating the local burning rate of cured propellant to local slurry flow characteristics, and a numerical code for SRM internal ballistics, as well as relevant hardware. This paper mainly describes the objectives, the contents of this R&D, and the output for fiscal year 2008.

  20. Advances in Electromagnetic Modelling through High Performance Computing

    SciTech Connect

    Ko, K.; Folwell, N.; Ge, L.; Guetz, A.; Lee, L.; Li, Z.; Ng, C.; Prudencio, E.; Schussman, G.; Uplenchwar, R.; Xiao, L.; /SLAC

    2006-03-29

    Under the DOE SciDAC project on Accelerator Science and Technology, a suite of electromagnetic codes has been under development at SLAC that are based on unstructured grids for higher accuracy, and use parallel processing to enable large-scale simulation. The new modeling capability is supported by SciDAC collaborations on meshing, solvers, refinement, optimization and visualization. These advances in computational science are described and the application of the parallel eigensolver Omega3P to the cavity design for the International Linear Collider is discussed.

  1. Computational methods of the Advanced Fluid Dynamics Model

    SciTech Connect

    Bohl, W.R.; Wilhelm, D.; Parker, F.R.; Berthier, J.; Maudlin, P.J.; Schmuck, P.; Goutagny, L.; Ichikawa, S.; Ninokata, H.; Luck, L.B.

    1987-01-01

    To more accurately treat severe accidents in fast reactors, a program has been set up to investigate new computational models and approaches. The product of this effort is a computer code, the Advanced Fluid Dynamics Model (AFDM). This paper describes some of the basic features of the numerical algorithm used in AFDM. Aspects receiving particular emphasis are the fractional-step method of time integration, the semi-implicit pressure iteration, the virtual mass inertial terms, the use of three velocity fields, higher order differencing, convection of interfacial area with source and sink terms, multicomponent diffusion processes in heat and mass transfer, the SESAME equation of state, and vectorized programming. A calculated comparison with an isothermal tetralin/ammonia experiment is performed. We conclude that significant improvements are possible in reliably calculating the progression of severe accidents with further development.
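The abstract names a fractional-step time integration as a core feature of the AFDM algorithm. The following is a minimal sketch of the general fractional-step (operator-splitting) idea only, not the AFDM code itself: each time step is split into sub-steps, with each physics operator applied in sequence. Here a 1-D scalar is advected (first-order upwind) and then diffused (explicit central differences) on a periodic grid.

```python
# Illustrative sketch of fractional-step (operator-splitting) time integration,
# the general idea named in the AFDM abstract -- NOT the AFDM code itself.
# Each step applies the physics operators sequentially on a periodic 1-D grid.

def advect(u, c, dt, dx):
    """Sub-step 1: first-order upwind advection with speed c > 0."""
    n = len(u)
    return [u[i] - c * dt / dx * (u[i] - u[i - 1]) for i in range(n)]

def diffuse(u, nu, dt, dx):
    """Sub-step 2: explicit central-difference diffusion."""
    n = len(u)
    return [u[i] + nu * dt / dx**2 * (u[(i + 1) % n] - 2 * u[i] + u[i - 1])
            for i in range(n)]

def step(u, c, nu, dt, dx):
    """One fractional step: advection followed by diffusion."""
    return diffuse(advect(u, c, dt, dx), nu, dt, dx)

if __name__ == "__main__":
    dx, dt, c, nu = 0.1, 0.01, 1.0, 0.05
    u = [1.0 if 4 <= i <= 6 else 0.0 for i in range(20)]  # square pulse
    for _ in range(10):
        u = step(u, c, nu, dt, dx)
    # Both sub-steps conserve total mass on the periodic grid.
    print(round(sum(u) * dx, 6))
```

Splitting lets each operator use its own best-suited discretization, which is part of the appeal of the fractional-step approach for multiphysics codes such as AFDM.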

  2. Center for computation and visualization of geometric structures. [Annual], Progress report

    SciTech Connect

    Not Available

    1993-02-12

    The mission of the Center is to establish a unified environment promoting research, education, and software and tool development. The work is centered on computing, interpreted in a broad sense to include the relevant theory, development of algorithms, and actual implementation. The research aspects of the Center are focused on geometry; correspondingly the computational aspects are focused on three (and higher) dimensional visualization. The educational aspects are likewise centered on computing and focused on geometry. A broader term than education is 'communication', which encompasses the challenge of explaining to the world current research in mathematics, and specifically geometry.

  3. Computational ocean acoustics: Advances in 3D ocean acoustic modeling

    NASA Astrophysics Data System (ADS)

    Schmidt, Henrik; Jensen, Finn B.

    2012-11-01

    The numerical models of ocean acoustic propagation developed in the 1980s are still in widespread use today, and the field of computational ocean acoustics is often considered mature. However, the explosive increase in computational power available to the community has created opportunities for modeling phenomena that were previously beyond reach. Most notably, three-dimensional propagation and scattering problems were computationally prohibitive, but are now addressed routinely using brute-force numerical approaches such as the finite element method, particularly for target scattering problems, where they are combined with traditional wave-theory propagation models in hybrid modeling frameworks. Recent years have also seen the development of hybrid approaches coupling oceanographic circulation models with acoustic propagation models, enabling the forecasting of sonar performance uncertainty in dynamic ocean environments. These and other advances made over the last couple of decades support the notion that the field of computational ocean acoustics is far from mature. [Work supported by the Office of Naval Research, Code 321OA.]

  4. Fluid dynamics parallel computer development at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Townsend, James C.; Zang, Thomas A.; Dwoyer, Douglas L.

    1987-01-01

    To accomplish more detailed simulations of highly complex flows, such as the transition to turbulence, fluid dynamics research requires computers much more powerful than any available today. Only parallel processing on multiple-processor computers offers hope for achieving the required effective speeds. Looking ahead to the use of these machines, the fluid dynamicist faces three issues: algorithm development for near-term parallel computers, architecture development for future computer power increases, and assessment of possible advantages of special purpose designs. Two projects at NASA Langley address these issues. Software development and algorithm exploration is being done on the FLEX/32 Parallel Processing Research Computer. New architecture features are being explored in the special purpose hardware design of the Navier-Stokes Computer. These projects are complementary and are producing promising results.

  5. The ADVANCE network: accelerating data value across a national community health center network

    PubMed Central

    DeVoe, Jennifer E; Gold, Rachel; Cottrell, Erika; Bauer, Vance; Brickman, Andrew; Puro, Jon; Nelson, Christine; Mayer, Kenneth H; Sears, Abigail; Burdick, Tim; Merrell, Jonathan; Matthews, Paul; Fields, Scott

    2014-01-01

    The ADVANCE (Accelerating Data Value Across a National Community Health Center Network) clinical data research network (CDRN) is led by the OCHIN Community Health Information Network in partnership with Health Choice Network and Fenway Health. The ADVANCE CDRN will ‘horizontally’ integrate outpatient electronic health record data for over one million federally qualified health center patients, and ‘vertically’ integrate hospital, health plan, and community data for these patients, often under-represented in research studies. Patient investigators, community investigators, and academic investigators with diverse expertise will work together to meet project goals related to data integration, patient engagement and recruitment, and the development of streamlined regulatory policies. By enhancing the data and research infrastructure of participating organizations, the ADVANCE CDRN will serve as a ‘community laboratory’ for including disadvantaged and vulnerable patients in patient-centered outcomes research that is aligned with the priorities of patients, clinics, and communities in our network. PMID:24821740

  6. Computational cardiology: how computer simulations could be used to develop new therapies and advance existing ones

    PubMed Central

    Trayanova, Natalia A.; O'Hara, Thomas; Bayer, Jason D.; Boyle, Patrick M.; McDowell, Kathleen S.; Constantino, Jason; Arevalo, Hermenegild J.; Hu, Yuxuan; Vadakkumpadan, Fijoy

    2012-01-01

    This article reviews the latest developments in computational cardiology. It focuses on the contribution of cardiac modelling to the development of new therapies as well as the advancement of existing ones for cardiac arrhythmias and pump dysfunction. Reviewed are cardiac modelling efforts aimed at advancing and optimizing existent therapies for cardiac disease (defibrillation, ablation of ventricular tachycardia, and cardiac resynchronization therapy) and at suggesting novel treatments, including novel molecular targets, as well as efforts to use cardiac models in stratification of patients likely to benefit from a given therapy, and the use of models in diagnostic procedures. PMID:23104919

  7. Operation of the Computer Software Management and Information Center (COSMIC)

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The major operational areas of the COSMIC center are described. Quantitative data on the software submittals, program verification, and evaluation are presented. The dissemination activities are summarized. Customer services and marketing activities of the center for the calendar year are described. Those activities devoted to the maintenance and support of selected programs are described. A Customer Information system, the COSMIC Abstract Recording System Project, and the COSMIC Microfiche Project are summarized. Operational cost data are summarized.

  8. Radionuclide Emission Estimation for the Center for Advanced Energy Studies (CAES)

    SciTech Connect

    Bradley J Schrader

    2010-02-01

    A Radiological Safety Analysis Computer Program (RSAC)-7 model dose assessment was performed to evaluate the maximum Center for Advanced Energy Studies (CAES) boundary effective dose equivalent (EDE, in mrem/yr) for potential individual releases of radionuclides from the facility. The CAES is a public/private partnership between the State of Idaho and its academic research institutions, the federal government through the U.S. Department of Energy (DOE), and the Idaho National Laboratory (INL) managed by the Battelle Energy Alliance (BEA). CAES serves to advance energy security for our nation by expanding educational opportunities at Idaho universities in energy-related areas, creating new capabilities within its member institutions, and delivering technological innovations leading to technology-based economic development for the intermountain region. CAES has developed a strategic plan (INL/EXT-07-12950) based on the balanced scorecard approach. At the present time it is unknown exactly what processes will be used in the facility in support of this strategic plan. What is known is that the Idaho State University (ISU) Radioactive Materials License (Nuclear Regulatory Commission [NRC] license 11-27380-01) is the basis for handling radioactive material in the facility. The material in this license is shared between the ISU campus and the CAES facility. There currently are no agreements in place to limit the amount of radioactive material at the CAES facility or what is done to the material in the facility. The scope of this analysis is a summary look at the basis dose for each radionuclide included under the license at distances of 100, 500, and 1,000 m. Inhalation, ingestion, and ground surface doses were evaluated using the NRC design basis guidelines. The results can be used to determine a sum-of-the-fractions approach to facility safety. This sum of the fractions allows a facility threshold value (TV) to be established and potential activities to be evaluated against it.
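The "sum of the fractions" screen mentioned in the abstract divides each radionuclide's estimated dose by its allowable limit and requires the fractions to sum to less than one. A minimal illustration follows; the nuclide names and numbers are hypothetical placeholders, not CAES data.

```python
# Minimal illustration of a "sum of the fractions" screening check:
# each radionuclide's estimated dose is divided by its allowable limit,
# and the facility passes the screen if the fractions sum to < 1.
# The nuclides and numbers below are hypothetical, not CAES data.

def sum_of_fractions(doses_mrem, limits_mrem):
    """Return the summed dose fraction over all nuclides in the dose table."""
    return sum(doses_mrem[n] / limits_mrem[n] for n in doses_mrem)

doses = {"Co-60": 0.2, "Cs-137": 0.5, "H-3": 1.0}       # estimated mrem/yr
limits = {"Co-60": 10.0, "Cs-137": 10.0, "H-3": 100.0}  # allowable mrem/yr

total = sum_of_fractions(doses, limits)
print(total < 1.0)  # True -> below the facility threshold value
```

The attraction of this formulation is that a proposed new activity can be screened by simply adding its fraction to the running total and re-checking against the threshold.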

  9. Realizing the potential of the CUAHSI Water Data Center to advance Earth Science

    NASA Astrophysics Data System (ADS)

    Hooper, R. P.; Seul, M.; Pollak, J.; Couch, A.

    2015-12-01

    The CUAHSI Water Data Center has developed a cloud-based system for data publication, discovery, and access. Key features of this system are a semantically enabled catalog to discover data across more than 100 different services and delivery of data and metadata in a standard format. While this represents a significant technical achievement, the purpose of this system is to support data reanalysis for advancing science. A new web-based client, HydroClient, improves data access over previous clients. This client is envisioned as the first step in a workflow that can involve visualization and analysis using web-processing services, followed by download to local computers for further analysis. The release of the WaterML library in the CRAN R package repository is an initial attempt at linking the WDC services into a larger analysis workflow. We are seeking community input on other resources required to make the WDC services more valuable in scientific research and education.
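The "standard format" the abstract refers to is WaterML, an XML encoding for hydrologic time series. As a rough sketch of what consuming such a delivery looks like, the snippet below parses a WaterML-style fragment with the Python standard library; the element names are a simplified stand-in for illustration, not the full WaterML schema.

```python
# Hedged sketch: parsing a WaterML-style time series with the standard library.
# The fragment and element names below are a simplified stand-in for
# illustration, not the full WaterML schema used by the WDC services.
import xml.etree.ElementTree as ET

doc = """<timeSeriesResponse>
  <timeSeries>
    <variable><variableName>Discharge</variableName></variable>
    <values>
      <value dateTime="2015-06-01T00:00:00">12.4</value>
      <value dateTime="2015-06-01T01:00:00">12.9</value>
    </values>
  </timeSeries>
</timeSeriesResponse>"""

root = ET.fromstring(doc)
name = root.findtext(".//variableName")  # the variable being observed
series = [(v.get("dateTime"), float(v.text)) for v in root.iter("value")]
print(name, len(series))
```

Because the catalog delivers every service's data in one such format, a single parser like this can feed time series from any of the 100+ services into a downstream analysis workflow.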

  10. Microgravity polymer and crystal growth at the Advanced Materials Center for the Commercial Development of Space

    NASA Technical Reports Server (NTRS)

    Mccauley, Lisa A.

    1990-01-01

    The microgravity research programs currently conducted by the Advanced Materials Center for the Commercial Development of Space (CCDS) are briefly reviewed. Polymer processing in space, which constitutes the most active microgravity program at the Advanced Materials CCDS, is conducted in three areas: membrane processing, multiphase composite behavior, and plasma polymerization. Current work in microgravity crystal growth is discussed with particular reference to the development of the Zeolite Crystal Growth facility.

  11. DOE Advanced Scientific Computing Advisory Committee (ASCAC) Report: Exascale Computing Initiative Review

    SciTech Connect

    Reed, Daniel; Berzins, Martin; Pennington, Robert; Sarkar, Vivek; Taylor, Valerie

    2015-08-01

    On November 19, 2014, the Advanced Scientific Computing Advisory Committee (ASCAC) was charged with reviewing the Department of Energy’s conceptual design for the Exascale Computing Initiative (ECI). In particular, this included assessing whether there are significant gaps in the ECI plan or areas that need to be given priority or extra management attention. Given the breadth and depth of previous reviews of the technical challenges inherent in exascale system design and deployment, the subcommittee focused its assessment on organizational and management issues, considering technical issues only as they informed organizational or management priorities and structures. This report presents the observations and recommendations of the subcommittee.

  12. Recent advances in computational mechanics of the human knee joint.

    PubMed

    Kazemi, M; Dabiri, Y; Li, L P

    2013-01-01

    Computational mechanics has been advanced in every area of orthopedic biomechanics. The objective of this paper is to provide a general review of the computational models used in the analysis of the mechanical function of the knee joint in different loading and pathological conditions. Major review articles published in related areas are summarized first. The constitutive models for soft tissues of the knee are briefly discussed to facilitate understanding the joint modeling. A detailed review of the tibiofemoral joint models is presented thereafter. The geometry reconstruction procedures as well as some critical issues in finite element modeling are also discussed. Computational modeling can be a reliable and effective method for the study of mechanical behavior of the knee joint, if the model is constructed correctly. Single-phase material models have been used to predict the instantaneous load response for the healthy knees and repaired joints, such as total and partial meniscectomies, ACL and PCL reconstructions, and joint replacements. Recently, poromechanical models accounting for fluid pressurization in soft tissues have been proposed to study the viscoelastic response of the healthy and impaired knee joints. While the constitutive modeling has been considerably advanced at the tissue level, many challenges still exist in applying a good material model to three-dimensional joint simulations. A complete model validation at the joint level seems impossible presently, because only simple data can be obtained experimentally. Therefore, model validation may be concentrated on the constitutive laws using multiple mechanical tests of the tissues. Extensive model verifications at the joint level are still crucial for the accuracy of the modeling.

  13. Recent Advances in Computational Mechanics of the Human Knee Joint

    PubMed Central

    Kazemi, M.; Dabiri, Y.; Li, L. P.

    2013-01-01

    Computational mechanics has been advanced in every area of orthopedic biomechanics. The objective of this paper is to provide a general review of the computational models used in the analysis of the mechanical function of the knee joint in different loading and pathological conditions. Major review articles published in related areas are summarized first. The constitutive models for soft tissues of the knee are briefly discussed to facilitate understanding the joint modeling. A detailed review of the tibiofemoral joint models is presented thereafter. The geometry reconstruction procedures as well as some critical issues in finite element modeling are also discussed. Computational modeling can be a reliable and effective method for the study of mechanical behavior of the knee joint, if the model is constructed correctly. Single-phase material models have been used to predict the instantaneous load response for the healthy knees and repaired joints, such as total and partial meniscectomies, ACL and PCL reconstructions, and joint replacements. Recently, poromechanical models accounting for fluid pressurization in soft tissues have been proposed to study the viscoelastic response of the healthy and impaired knee joints. While the constitutive modeling has been considerably advanced at the tissue level, many challenges still exist in applying a good material model to three-dimensional joint simulations. A complete model validation at the joint level seems impossible presently, because only simple data can be obtained experimentally. Therefore, model validation may be concentrated on the constitutive laws using multiple mechanical tests of the tissues. Extensive model verifications at the joint level are still crucial for the accuracy of the modeling. PMID:23509602

  14. NASA Trapezoidal Wing Computations Including Transition and Advanced Turbulence Modeling

    NASA Technical Reports Server (NTRS)

    Rumsey, C. L.; Lee-Rausch, E. M.

    2012-01-01

    Flow about the NASA Trapezoidal Wing is computed with several turbulence models by using grids from the first High Lift Prediction Workshop in an effort to advance understanding of computational fluid dynamics modeling for this type of flowfield. Transition is accounted for in many of the computations. In particular, a recently-developed 4-equation transition model is utilized and works well overall. Accounting for transition tends to increase lift and decrease moment, which improves the agreement with experiment. Upper surface flap separation is reduced, and agreement with experimental surface pressures and velocity profiles is improved. The predicted shape of wakes from upstream elements is strongly influenced by grid resolution in regions above the main and flap elements. Turbulence model enhancements to account for rotation and curvature have the general effect of increasing lift and improving the resolution of the wing tip vortex as it convects downstream. However, none of the models improve the prediction of surface pressures near the wing tip, where more grid resolution is needed.

  15. Computer support for cooperative tasks in Mission Operations Centers

    NASA Technical Reports Server (NTRS)

    Fox, Jeffrey; Moore, Mike

    1994-01-01

    Traditionally, spacecraft management has been performed by fixed teams of operators in Mission Operations Centers. The team cooperatively: (1) ensures that payload(s) on spacecraft perform their work; and (2) maintains the health and safety of the spacecraft through commanding and monitoring the spacecraft's subsystems. In the future, the task demands will increase and overload the operators. This paper describes the traditional spacecraft management environment and describes a new concept in which groupware will be used to create a Virtual Mission Operations Center. Groupware tools will be used to better utilize available resources through increased automation and dynamic sharing of personnel among missions.

  16. Computer support for cooperative tasks in Mission Operations Centers

    SciTech Connect

    Fox, J.; Moore, M.

    1994-10-01

    Traditionally, spacecraft management has been performed by fixed teams of operators in Mission Operations Centers. The team cooperatively (1) ensures that payload(s) on spacecraft perform their work and (2) maintains the health and safety of the spacecraft through commanding and monitoring the spacecraft's subsystems. In the future, the task demands will increase and overload the operators. This paper describes the traditional spacecraft management environment and describes a new concept in which groupware will be used to create a Virtual Mission Operations Center. Groupware tools will be used to better utilize available resources through increased automation and dynamic sharing of personnel among missions.

  17. Advanced Technologies for Future Spacecraft Cockpits and Space-based Control Centers

    NASA Technical Reports Server (NTRS)

    Garcia-Galan, Carlos; Uckun, Serdar; Gregory, William; Williams, Kerry

    2006-01-01

    The National Aeronautics and Space Administration (NASA) is embarking on a new era of Space Exploration, aimed at sending crewed spacecraft beyond Low Earth Orbit (LEO), in medium and long duration missions to the Lunar surface, Mars and beyond. The challenges of such missions are significant and will require new technologies and paradigms in vehicle design and mission operations. Current roles and responsibilities of spacecraft systems, crew and the flight control team, for example, may not be sustainable when real-time support is not assured due to distance-induced communication lags, radio blackouts, equipment failures, or other unexpected factors. Therefore, technologies and applications that enable greater Systems and Mission Management capabilities on-board the space-based system will be necessary to reduce the dependency on real-time critical Earth-based support. The focus of this paper is on such technologies, which will be required to bring advanced Systems and Mission Management capabilities to space-based environments where the crew will be required to manage both systems performance and mission execution without dependence on the ground. We refer to this concept as autonomy. Environments that require high levels of autonomy include the cockpits of future spacecraft such as the Mars Exploration Vehicle, and space-based control centers such as a Lunar Base Command and Control Center. Furthermore, this paper will evaluate the requirements, available technology, and roadmap to enable full operational implementation of onboard System Health Management, Mission Planning/re-planning, Autonomous Task/Command Execution, and Human Computer Interface applications. The technology topics covered by the paper include enabling technology to perform Intelligent Caution and Warning, where the system provides directly actionable data for human understanding of and response to failures, and task automation applications that automate nominal and off-nominal task execution based

  18. National Energy Research Scientific Computing Center 2007 Annual Report

    SciTech Connect

    Hules, John A.; Bashor, Jon; Wang, Ucilia; Yarris, Lynn; Preuss, Paul

    2008-10-23

    This report presents highlights of the research conducted on NERSC computers in a variety of scientific disciplines during the year 2007. It also reports on changes and upgrades to NERSC's systems and services, as well as activities of NERSC staff.

  19. Computer Center: BASIC String Models of Genetic Information Transfer.

    ERIC Educational Resources Information Center

    Spain, James D., Ed.

    1984-01-01

    Discusses some of the major genetic information processes which may be modeled by computer program string manipulation, focusing on replication and transcription. Also discusses instructional applications of using string models. (JN)
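The record's models are written in BASIC and are not reproduced here, but the same string-manipulation idea is easy to sketch in Python: replication and transcription become character-by-character complementation of a DNA template strand.

```python
# Python analogue of the string-manipulation idea in the record (the original
# programs are in BASIC): modeling replication and transcription as
# character-by-character complementation of a DNA template strand.

DNA_PAIRS = str.maketrans("ATGC", "TACG")  # replication pairing: A-T, G-C
RNA_PAIRS = str.maketrans("ATGC", "UACG")  # transcription: T pairs with A, A with U

def replicate(template):
    """Complementary DNA strand of the template."""
    return template.translate(DNA_PAIRS)

def transcribe(template):
    """mRNA transcribed from the DNA template strand."""
    return template.translate(RNA_PAIRS)

template = "TACGGTA"
print(replicate(template))   # ATGCCAT
print(transcribe(template))  # AUGCCAU
```

Note that replicating a replicated strand returns the original string, which mirrors the complementary-pairing rule the classroom models are built to demonstrate.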

  20. Computers & Technology in School Library Media Centers. Professional Growth Series.

    ERIC Educational Resources Information Center

    Bucher, Katherine Toth

    Technology is arriving in school libraries in unprecedented quantities, resulting in many changes in the school library media center. While most librarians agree that technology is wonderful, many are feeling the stress of rapid change and coping with the decisions made by educational policy makers. This looseleaf notebook, written for the novice,…

  1. OPENING REMARKS: SciDAC: Scientific Discovery through Advanced Computing

    NASA Astrophysics Data System (ADS)

    Strayer, Michael

    2005-01-01

    with industry and virtual prototyping. New instruments of collaboration will include institutes and centers, while summer schools, workshops and outreach will invite new talent and expertise. Computational science adds new dimensions to science and its practice. Disciplines of fusion, accelerator science, and combustion are poised to blur the boundaries between pure and applied science. As we open the door into FY2006 we shall see a landscape of new scientific challenges: in biology, chemistry, materials, and astrophysics, to name a few. The enabling technologies of SciDAC have been transformational as drivers of change. Planning for major new software systems assumes a baseline employing Common Component Architectures, and this has become a household word for new software projects. While grid algorithms and mesh refinement software have transformed applications software, data management and visualization have transformed our understanding of science from data. The Gordon Bell prize now seems to be dominated by computational science and solvers developed by the TOPS ISIC. The priorities of the Office of Science in the Department of Energy are clear. The 20-year facilities plan is driven by new science. High performance computing is placed among the two highest priorities. Moore's law says that by the end of the next cycle of SciDAC we shall have petaflop computers. The challenges of petascale computing are enormous. These and the associated computational science are the highest priorities for computing within the Office of Science. Our effort in Leadership Class computing is just a first step towards this goal. Clearly, computational science at this scale will face enormous challenges and possibilities. Performance evaluation and prediction will be critical to unraveling the needed software technologies. We must not lose sight of our overarching goal: that of scientific discovery.
Science does not stand still and the landscape of science discovery and computing holds

  2. The next frontier: stem cells and the Center for the Advancement of Science in Space.

    PubMed

    Ratliff, Duane

    2013-12-01

    The Center for the Advancement of Science in Space (CASIS) manages the International Space Station U.S. National Laboratory, supporting space-based research that seeks to improve life on Earth. The National Laboratory is now open for use by the broad scientific community--and CASIS is the gateway to this powerful in-orbit research platform.

  3. Community College Advanced Technology Centers: Meeting America's Need for Integrated, Comprehensive Economic Development.

    ERIC Educational Resources Information Center

    Hinckley, Richard; And Others

    By entering into partnerships with business and industry, community colleges are able to offset the high cost of remaining current with training techniques, job market skill requirements, and state-of-the-art hardware. The construction of advanced technology centers (ATCs) located on community college campuses is one key element supporting these…

  4. 78 FR 21131 - National Center for Advancing Translational Sciences; Notice of Closed Meetings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-09

    ... Democracy Plaza, 6701 Democracy Boulevard, Bethesda, MD 20892, (Telephone Conference Call). Contact Person... Center for Advancing Translational Sciences (NCATS), National Institutes of Health, 6701 Democracy Blvd., Democracy 1, Room 1084, Bethesda, MD 20892-4874, 301-435-0829, mv10f@nih.gov . Name of Committee:...

  5. High performance parallel computers for science: New developments at the Fermilab advanced computer program

    SciTech Connect

    Nash, T.; Areti, H.; Atac, R.; Biel, J.; Cook, A.; Deppe, J.; Edel, M.; Fischler, M.; Gaines, I.; Hance, R.

    1988-08-01

    Fermilab's Advanced Computer Program (ACP) has been developing highly cost effective, yet practical, parallel computers for high energy physics since 1984. The ACP's latest developments are proceeding in two directions. A Second Generation ACP Multiprocessor System for experiments will include $3500 RISC processors each with performance over 15 VAX MIPS. To support such high performance, the new system allows parallel I/O, parallel interprocess communication, and parallel host processes. The ACP Multi-Array Processor has been developed for theoretical physics. Each $4000 node is a FORTRAN or C programmable pipelined 20 MFlops (peak), 10 MByte single board computer. These are plugged into a 16 port crossbar switch crate which handles both inter- and intra-crate communication. The crates are connected in a hypercube. Site-oriented applications like lattice gauge theory are supported by system software called CANOPY, which makes the hardware virtually transparent to users. A 256 node, 5 GFlop system is under construction. 10 refs., 7 figs.

  6. Advanced information processing system: Inter-computer communication services

    NASA Technical Reports Server (NTRS)

    Burkhardt, Laura; Masotto, Tom; Sims, J. Terry; Whittredge, Roy; Alger, Linda S.

    1991-01-01

    The purpose is to document the functional requirements and detailed specifications for the Inter-Computer Communications Services (ICCS) of the Advanced Information Processing System (AIPS). An introductory section is provided to outline the overall architecture and functional requirements of the AIPS and to present an overview of the ICCS. An overview of the AIPS architecture as well as a brief description of the AIPS software is given. The guarantees of the ICCS are provided, and the ICCS is described as a seven-layered International Standards Organization (ISO) Model. The ICCS functional requirements, functional design, and detailed specifications as well as each layer of the ICCS are also described. A summary of results and suggestions for future work are presented.

  7. Cloud Computing Applications in Support of Earth Science Activities at Marshall Space Flight Center

    NASA Technical Reports Server (NTRS)

    Molthan, Andrew L.; Limaye, Ashutosh S.; Srikishen, Jayanthi

    2011-01-01

    Currently, the NASA Nebula Cloud Computing Platform is available to Agency personnel in a pre-release status as the system undergoes a formal operational readiness review. Over the past year, two projects within the Earth Science Office at NASA Marshall Space Flight Center have been investigating the performance and value of Nebula's "Infrastructure as a Service", or "IaaS" concept and applying cloud computing concepts to advance their respective mission goals. The Short-term Prediction Research and Transition (SPoRT) Center focuses on the transition of unique NASA satellite observations and weather forecasting capabilities for use within the operational forecasting community through partnerships with NOAA's National Weather Service (NWS). SPoRT has evaluated the performance of the Weather Research and Forecasting (WRF) model on virtual machines deployed within Nebula and used Nebula instances to simulate local forecasts in support of regional forecast studies of interest to select NWS forecast offices. In addition to weather forecasting applications, rapidly deployable Nebula virtual machines have supported the processing of high resolution NASA satellite imagery to support disaster assessment following the historic severe weather and tornado outbreak of April 27, 2011. Other modeling and satellite analysis activities are underway in support of NASA's SERVIR program, which integrates satellite observations, ground-based data and forecast models to monitor environmental change and improve disaster response in Central America, the Caribbean, Africa, and the Himalayas. Leveraging SPoRT's experience, SERVIR is working to establish a real-time weather forecasting model for Central America. Other modeling efforts include hydrologic forecasts for Kenya, driven by NASA satellite observations and reanalysis data sets provided by the broader meteorological community.
Forecast modeling efforts are supplemented by short-term forecasts of convective initiation, determined by

  8. An expanded framework for the advanced computational testing and simulation toolkit

    SciTech Connect

    Marques, Osni A.; Drummond, Leroy A.

    2003-11-09

    The Advanced Computational Testing and Simulation (ACTS) Toolkit is a set of computational tools developed primarily at DOE laboratories and is aimed at simplifying the solution of common and important computational problems. The use of the tools reduces the development time for new codes and the tools provide functionality that might not otherwise be available. This document outlines an agenda for expanding the scope of the ACTS Project based on lessons learned from current activities. Highlights of this agenda include peer-reviewed certification of new tools; finding tools to solve problems that are not currently addressed by the Toolkit; working in collaboration with other software initiatives and DOE computer facilities; expanding outreach efforts; promoting interoperability and further development of the tools; and improving functionality of the ACTS Information Center, among other tasks. The ultimate goal is to make the ACTS tools more widely used and more effective in solving DOE's and the nation's scientific problems through the creation of a reliable software infrastructure for scientific computing.

  9. Recent advances in computational structural reliability analysis methods

    NASA Technical Reports Server (NTRS)

    Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.

    1993-01-01

    The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given the fact that many different modes of failure are usually possible, achievement of this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by the application of safety factors and conservative assumptions. However, the safety factor approach lacks a quantitative basis in that the level of reliability is never known and usually results in overly conservative designs because of compounding conservatisms. Furthermore, problem parameters that control the reliability are not identified, nor is their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. A significant level of activity in the research and development community was seen recently, much of which was directed towards the prediction of failure probabilities for single-mode failures. The focus is to present some early results and demonstrations of advanced reliability methods applied to structural system problems. This includes structures that can fail as a result of multiple component failures (e.g., a redundant truss), or structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonant vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.
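The probabilistic approach the abstract contrasts with safety factors can be illustrated with a minimal Monte Carlo estimate for a single-mode limit state g = R - S. The distributions and parameters below are invented for illustration and are not taken from the paper:

```python
import random

def failure_probability(n_samples: int = 100_000, seed: int = 42) -> float:
    """Monte Carlo estimate of P(failure) for the limit state g = R - S,
    with resistance R ~ N(10, 1) and load S ~ N(7, 1); failure when g < 0.
    (Hypothetical distributions chosen only to demonstrate the method.)"""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        R = rng.gauss(10.0, 1.0)  # sampled structural resistance
        S = rng.gauss(7.0, 1.0)   # sampled applied load
        if R - S < 0.0:           # limit state violated
            failures += 1
    return failures / n_samples

# Analytic value for comparison: Phi(-3 / sqrt(2)) ~= 0.017
print(f"estimated P(failure) ~= {failure_probability():.4f}")
```

For multiple interacting failure modes, the same sampling loop would evaluate several limit-state functions per sample and count a failure when any of them is violated.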

  10. Reliability of an interactive computer program for advance care planning.

    PubMed

    Schubart, Jane R; Levi, Benjamin H; Camacho, Fabian; Whitehead, Megan; Farace, Elana; Green, Michael J

    2012-06-01

    Despite widespread efforts to promote advance directives (ADs), completion rates remain low. Making Your Wishes Known: Planning Your Medical Future (MYWK) is an interactive computer program that guides individuals through the process of advance care planning, explains health conditions and interventions that commonly involve life or death decisions, helps them articulate their values/goals, and translates users' preferences into a detailed AD document. The purpose of this study was to demonstrate that (in the absence of major life changes) the AD generated by MYWK reliably reflects an individual's values/preferences. English speakers ≥30 years old completed MYWK twice, 4 to 6 weeks apart. Reliability indices were assessed for three AD components: General Wishes; Specific Wishes for treatment; and Quality-of-Life values (QoL). Twenty-four participants completed the study. Both the Specific Wishes and QoL scales had high internal consistency in both time periods (Kuder-Richardson formula 20 [KR-20] = 0.83-0.95 and 0.86-0.89). Test-retest reliability was perfect for General Wishes (κ=1), high for QoL (Pearson's correlation coefficient=0.83), but lower for Specific Wishes (Pearson's correlation coefficient=0.57). MYWK generates an AD in which General Wishes and QoL (but not Specific Wishes) statements remain consistent over time.
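The KR-20 internal-consistency index reported above is computed directly from dichotomous item responses. The sketch below is a generic implementation of the formula, not the study's analysis code:

```python
def kr20(responses):
    """Kuder-Richardson formula 20 for dichotomous (0/1) item responses.
    `responses` is a list of respondents, each a list of 0/1 item scores."""
    k = len(responses[0])  # number of items
    n = len(responses)     # number of respondents
    # Item difficulties p_i, and the summed item variances p_i * (1 - p_i).
    p = [sum(r[i] for r in responses) / n for i in range(k)]
    pq = sum(pi * (1.0 - pi) for pi in p)
    # Population variance of the respondents' total scores.
    totals = [sum(r) for r in responses]
    mean = sum(totals) / n
    var = sum((t - mean) ** 2 for t in totals) / n
    return (k / (k - 1)) * (1.0 - pq / var)

# A perfectly Guttman-ordered toy response matrix (hypothetical data).
print(kr20([[1, 1, 1], [1, 1, 0], [1, 0, 0], [0, 0, 0]]))  # -> 0.75
```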

  11. A Computer Model for Determining Operational Centers of Gravity

    DTIC Science & Technology

    2002-05-31

    rest of joint and service doctrine by equating centers of gravity with critical vulnerabilities. Despite this convergence, the preconception persists...Leyte, Malaya, Okinawa, Panama, Philippines, Sicily, and Somalia. Among these USAWC studies, the US invasion of Okinawa (1945) and Operation Just...leaving only "conclusions" that represent the end points of the various lines of reasoning. A graphical notation was used to record the task reduction

  12. Design Model for Learner-Centered, Computer-Based Simulations.

    ERIC Educational Resources Information Center

    Hawley, Chandra L.; Duffy, Thomas M.

    This paper presents a model for designing computer-based simulation environments within a constructivist framework for the K-12 school setting. The following primary criteria for the development of simulations are proposed: (1) the problem needs to be authentic; (2) the cognitive demand in learning should be authentic; (3) scaffolding supports a…

  13. Using advanced computer vision algorithms on small mobile robots

    NASA Astrophysics Data System (ADS)

    Kogut, G.; Birchmore, F.; Biagtan Pacis, E.; Everett, H. R.

    2006-05-01

    The Technology Transfer project employs a spiral development process to enhance the functionality and autonomy of mobile robot systems in the Joint Robotics Program (JRP) Robotic Systems Pool by converging existing component technologies onto a transition platform for optimization. An example of this approach is the implementation of advanced computer vision algorithms on small mobile robots. We demonstrate the implementation and testing of the following two algorithms useful on mobile robots: 1) object classification using a boosted cascade of classifiers trained with the AdaBoost algorithm, and 2) human presence detection from a moving platform. Object classification is performed with an AdaBoost training system developed at the University of California, San Diego (UCSD) Computer Vision Lab. This classification algorithm has been used to successfully detect the license plates of automobiles in motion in real-time. While working towards a solution to increase the robustness of this system to perform generic object recognition, this paper demonstrates an extension to this application by detecting soda cans in a cluttered indoor environment. The human presence detection from a moving platform system uses a data fusion algorithm which combines results from a scanning laser and a thermal imager. The system is able to detect the presence of humans while both the humans and the robot are moving simultaneously. In both systems, the two aforementioned algorithms were implemented on embedded hardware and optimized for use in real-time. Test results are shown for a variety of environments.
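The boosting idea underlying cascade classifiers can be shown with a minimal AdaBoost over one-dimensional threshold stumps. This is a toy sketch of the reweighting scheme only, on invented data; it is not the UCSD cascade implementation:

```python
import math

def train_adaboost(X, y, n_rounds=3):
    """Minimal AdaBoost with 1-D threshold stumps.
    X is a list of floats; y is a list of +1/-1 labels."""
    n = len(X)
    w = [1.0 / n] * n              # per-example weights
    ensemble = []                  # list of (alpha, threshold, polarity)
    for _ in range(n_rounds):
        best = None                # (weighted error, threshold, polarity)
        for t in sorted(set(X)):
            for pol in (1, -1):    # stump predicts pol if x >= t else -pol
                err = sum(wi for wi, xi, yi in zip(w, X, y)
                          if (pol if xi >= t else -pol) != yi)
                if best is None or err < best[0]:
                    best = (err, t, pol)
        err, t, pol = best
        alpha = 0.5 * math.log((1.0 - err) / max(err, 1e-10))
        ensemble.append((alpha, t, pol))
        # Reweight: emphasize the examples this stump got wrong.
        w = [wi * math.exp(-alpha * yi * (pol if xi >= t else -pol))
             for wi, xi, yi in zip(w, X, y)]
        total = sum(w)
        w = [wi / total for wi in w]
    return ensemble

def predict(ensemble, x):
    """Weighted vote of the trained stumps."""
    score = sum(alpha * (pol if x >= t else -pol) for alpha, t, pol in ensemble)
    return 1 if score >= 0 else -1
```

A cascade, as in the license-plate detector, would chain several such boosted classifiers so that easy negatives are rejected early by the cheapest stage.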

  14. Advances in the computational study of language acquisition.

    PubMed

    Brent, M R

    1996-01-01

    This paper provides a tutorial introduction to computational studies of how children learn their native languages. Its aim is to make recent advances accessible to the broader research community, and to place them in the context of current theoretical issues. The first section locates computational studies and behavioral studies within a common theoretical framework. The next two sections review two papers that appear in this volume: one on learning the meanings of words and one on learning the sounds of words. The following section highlights an idea which emerges independently in these two papers and which I have dubbed autonomous bootstrapping. Classical bootstrapping hypotheses propose that children begin to get a toehold in a particular linguistic domain, such as syntax, by exploiting information from another domain, such as semantics. Autonomous bootstrapping complements the cross-domain acquisition strategies of classical bootstrapping with strategies that apply within a single domain. Autonomous bootstrapping strategies work by representing partial and/or uncertain linguistic knowledge and using it to analyze the input. The next two sections review two more contributions to this special issue: one on learning word meanings via selectional preferences and one on algorithms for setting grammatical parameters. The final section suggests directions for future research.

  15. Mary S. Easton Center of Alzheimer's Disease Research at UCLA: advancing the therapeutic imperative.

    PubMed

    Cummings, Jeffrey L; Ringman, John; Metz, Karen

    2010-01-01

    The Mary S. Easton Center for Alzheimer's Disease Research (UCLA-Easton Alzheimer's Center) is committed to the "therapeutic imperative" and is devoted to finding new treatments for Alzheimer's disease (AD) and to developing technologies (biomarkers) to advance that goal. The UCLA-Easton Alzheimer's Center has a continuum of research and research-related activities including basic/foundational studies of peptide interactions; translational studies in transgenic animals and other animal models of AD; clinical research to define the phenotype of AD, characterize familial AD, develop biomarkers, and advance clinical trials; health services and outcomes research; and active education, dissemination, and recruitment activities. The UCLA-Easton Alzheimer's Center is supported by the National Institute on Aging, the State of California, and generous donors who share our commitment to developing new therapies for AD. The naming donor (Jim Easton) provided substantial funds to endow the center and to support projects in AD drug discovery and biomarker development. The Sidell-Kagan Foundation supports the Katherine and Benjamin Kagan Alzheimer's Treatment Development Program, and the Deane F. Johnson Alzheimer's Research Foundation supports the Deane F. Johnson Center for Neurotherapeutics at UCLA. The John Douglas French Alzheimer's Research Foundation provides grants to junior investigators in critical periods of their academic development. The UCLA-Easton Alzheimer's Center partners with community organizations including the Alzheimer's Association California Southland Chapter and the Leeza Gibbons Memory Foundation. Collaboration with pharmaceutical companies, biotechnology companies, and device companies is critical to developing new therapeutics for AD, and these collaborations are embraced in the mission of the UCLA-Easton Alzheimer's Center.
The Center supports excellent senior investigators and serves as an incubator for new scientists, agents, models, technologies

  16. Web application for simplifying access to computer center resources and information.

    SciTech Connect

    Long, J. W.

    2013-05-01

    Lorenz is a product of the ASC Scientific Data Management effort. Lorenz is a web-based application designed to help computer centers make information and resources more easily available to their users.

  17. Information and psychomotor skills knowledge acquisition: A student-customer-centered and computer-supported approach.

    PubMed

    Nicholson, Anita; Tobin, Mary

    2006-01-01

    This presentation will discuss coupling commercial and customized computer-supported teaching aids to provide BSN nursing students with a friendly customer-centered self-study approach to psychomotor skill acquisition.

  18. For operation of the Computer Software Management and Information Center (COSMIC)

    NASA Technical Reports Server (NTRS)

    Carmon, J. L.

    1983-01-01

    Progress report on current status of computer software management and information center (COSMIC) includes the following areas: inventory, evaluation and publication, marketing, customer service, maintenance and support, and budget summary.

  19. High End Computer Network Testbedding at NASA Goddard Space Flight Center

    NASA Technical Reports Server (NTRS)

    Gary, James Patrick

    1998-01-01

    The Earth & Space Data Computing (ESDC) Division at the Goddard Space Flight Center is involved in developing and demonstrating various high-end computer networking capabilities. The ESDC has several high-end supercomputers. These are used (1) to run computer simulations of the climate system; (2) to support the Earth and Space Sciences (ESS) project; and (3) to support the Grand Challenge (GC) Science, which is aimed at understanding the turbulent convection and dynamos in stars. GC research occurs at many sites throughout the country, and this research is enabled, in part, by the multiple high performance network interconnections. The application drivers for High End Computer Networking use distributed supercomputing to support virtual reality applications, such as TerraVision (a three-dimensional browser of remotely accessed data) and Cave Automatic Virtual Environments (CAVE). Workstations can access and display data from multiple CAVEs with video servers, which allows group/project collaborations using a combination of video, data, voice and shared whiteboarding. The ESDC is also developing and demonstrating a high degree of interoperability between satellite and terrestrial-based networks. To this end, the ESDC is conducting research and evaluations of new computer networking protocols and related technologies which improve the interoperability of satellite and terrestrial networks. The ESDC is also involved in the Security Proof of Concept Keystone (SPOCK) program sponsored by the National Security Agency (NSA). The SPOCK activity provides a forum for government users and security technology providers to share information on security requirements, emerging technologies and new product developments.
Also, the ESDC is involved in the Trans-Pacific Digital Library Experiment, which aims to demonstrate and evaluate the use of high performance satellite communications and advanced data communications protocols to enable interactive digital library data

  20. Computational Fluid Dynamics at NASA Ames Research Center

    NASA Technical Reports Server (NTRS)

    Kutler, Paul

    1994-01-01

    Computational fluid dynamics (CFD) is beginning to play a major role in the aircraft industry of the United States because of the realization that CFD can be a new and effective design tool and thus could provide a company with a competitive advantage. It is also playing a significant role in research institutions, both governmental and academic, as a tool for researching new fluid physics, as well as supplementing and complementing experimental testing. In this presentation, some of the progress made to date in CFD at NASA Ames will be reviewed. The presentation addresses the status of CFD in terms of methods, examples of CFD solutions, and computer technology. In addition, the role CFD will play in supporting the revolutionary goals set forth by the Aeronautical Policy Review Committee established by the Office of Science and Technology Policy is noted. The need for validated CFD tools is also briefly discussed.

  1. Computer vision research at Marshall Space Flight Center

    NASA Technical Reports Server (NTRS)

    Vinz, Frank L.

    1990-01-01

    Orbital docking, inspection, and servicing are operations which have the potential for capability enhancement as well as cost reduction for space operations by the application of computer vision technology. Research at MSFC has been a natural outgrowth of orbital docking simulations for remote manually controlled vehicles such as the Teleoperator Retrieval System and the Orbital Maneuvering Vehicle (OMV). Baseline design of the OMV dictates teleoperator control from a ground station. This necessitates a high data-rate communication network and results in several seconds of time delay. Operational costs and vehicle control difficulties could be alleviated by an autonomous or semi-autonomous control system onboard the OMV which would be based on a computer vision system having capability to recognize video images in real time. A concept under development at MSFC with these attributes is based on syntactic pattern recognition. It uses tree graphs for rapid recognition of binary images of known orbiting target vehicles. This technique and others being investigated at MSFC will be evaluated in realistic conditions by the use of MSFC orbital docking simulators. Computer vision is also being applied at MSFC as part of the supporting development for Work Package One of Space Station Freedom.

  2. Intention and Usage of Computer Based Information Systems in Primary Health Centers

    ERIC Educational Resources Information Center

    Hosizah; Kuntoro; Basuki N., Hari

    2016-01-01

    Computer-based information systems (CBIS) have been adopted in almost all health care settings, including primary health centers in East Java Province, Indonesia. Among the software packages available were SIMPUS, SIMPUSTRONIK, SIKDA Generik, and e-puskesmas. Unfortunately, most of the primary health centers did not implement them successfully. This…

  3. A Librarian's Internship in a Campus Computing Center: Lessons Learned and Implications for Community College Libraries

    ERIC Educational Resources Information Center

    Blummer, Barbara; Martin, Amy Chase; Kenton, Jeffrey

    2009-01-01

    This research reports on the conduct of a needs assessment and the development of a training and evaluation program for a student associate position at the campus computing center. The center provides faculty support for a range of software and technologies. At the project's onset, the majority of students performed basic office duties such as…

  4. For operation of the Computer Software Management and Information Center (COSMIC)

    NASA Technical Reports Server (NTRS)

    Carmon, J. L.

    1983-01-01

    During the month of June, the Survey Research Center (SRC) at the University of Georgia designed new benefits questionnaires for computer software management and information center (COSMIC). As a test of their utility, these questionnaires are now used in the benefits identification process.

  5. NASA Center for Climate Simulation (NCCS) Advanced Technology AT5 Virtualized Infiniband Report

    NASA Technical Reports Server (NTRS)

    Thompson, John H.; Bledsoe, Benjamin C.; Wagner, Mark; Shakshober, John; Fromkin, Russ

    2013-01-01

    The NCCS is part of the Computational and Information Sciences and Technology Office (CISTO) of Goddard Space Flight Center's (GSFC) Sciences and Exploration Directorate. The NCCS's mission is to enable scientists to increase their understanding of the Earth, the solar system, and the universe by supplying state-of-the-art high performance computing (HPC) solutions. To accomplish this mission, the NCCS (https://www.nccs.nasa.gov) provides high performance compute engines, mass storage, and network solutions to meet the specialized needs of the Earth and space science user communities.

  6. Recovery Act: Advanced Direct Methanol Fuel Cell for Mobile Computing

    SciTech Connect

    Fletcher, James H.; Cox, Philip; Harrington, William J; Campbell, Joseph L

    2013-09-03

    Project Title: Recovery Act: Advanced Direct Methanol Fuel Cell for Mobile Computing

    PROJECT OBJECTIVE: The objective of the project was to advance portable fuel cell system technology towards the commercial targets of power density, energy density and lifetime. These targets were laid out in the DOE's R&D roadmap to develop an advanced direct methanol fuel cell power supply that meets commercial entry requirements. Such a power supply will enable mobile computers to operate non-stop, unplugged from the wall power outlet, by using the high energy density of methanol fuel contained in a replaceable fuel cartridge. Specifically, this project focused on balance-of-plant component integration and miniaturization, as well as extensive component, subassembly and integrated system durability and validation testing. This design has resulted in a pre-production power supply design and a prototype that meet the rigorous demands of consumer electronic applications.

    PROJECT TASKS: The proposed work plan was designed to meet the project objectives, which corresponded directly with the objectives outlined in the Funding Opportunity Announcement: to engineer the fuel cell balance-of-plant and packaging to meet the needs of consumer electronic systems, specifically at power levels required for mobile computing. UNF used existing balance-of-plant component technologies developed under its current US Army CERDEC project, as well as a previous DOE project completed by PolyFuel, to further refine them to both miniaturize and integrate their functionality to increase the system power density and energy density. Benefits of UNF's novel passive water recycling MEA (membrane electrode assembly) and the simplified system architecture it enabled formed the foundation of the design approach. The package design was hardened to address orientation independence, shock, vibration, and environmental requirements. Fuel cartridge and fuel subsystems were improved to ensure effective fuel

  7. Books, Bytes, and Bridges: Libraries and Computer Centers in Academic Institutions.

    ERIC Educational Resources Information Center

    Hardesty, Larry, Ed.

    This book about the relationship between computer centers and libraries at academic institutions contains the following chapters: (1) "A History of the Rhetoric and Reality of Library and Computing Relationships" (Peggy Seiden and Michael D. Kathman); (2) "An Issue in Search of a Metaphor: Readings on the Marriageability of…

  8. Center for computation and visualization of geometric structures. Final report, 1992 - 1995

    SciTech Connect

    1995-11-01

    This report describes the overall goals and the accomplishments of the Geometry Center of the University of Minnesota, whose mission is to develop, support, and promote computational tools for visualizing geometric structures, for facilitating communication among mathematical and computer scientists and between these scientists and the public at large, and for stimulating research in geometry.

  9. Cloud Computing Applications in Support of Earth Science Activities at Marshall Space Flight Center

    NASA Astrophysics Data System (ADS)

    Molthan, A.; Limaye, A. S.

    2011-12-01

    Currently, the NASA Nebula Cloud Computing Platform is available to Agency personnel in a pre-release status as the system undergoes a formal operational readiness review. Over the past year, two projects within the Earth Science Office at NASA Marshall Space Flight Center have been investigating the performance and value of Nebula's "Infrastructure as a Service", or "IaaS" concept and applying cloud computing concepts to advance their respective mission goals. The Short-term Prediction Research and Transition (SPoRT) Center focuses on the transition of unique NASA satellite observations and weather forecasting capabilities for use within the operational forecasting community through partnerships with NOAA's National Weather Service (NWS). SPoRT has evaluated the performance of the Weather Research and Forecasting (WRF) model on virtual machines deployed within Nebula and used Nebula instances to simulate local forecasts in support of regional forecast studies of interest to select NWS forecast offices. In addition to weather forecasting applications, rapidly deployable Nebula virtual machines have supported the processing of high resolution NASA satellite imagery to support disaster assessment following the historic severe weather and tornado outbreak of April 27, 2011. Other modeling and satellite analysis activities are underway in support of NASA's SERVIR program, which integrates satellite observations, ground-based data and forecast models to monitor environmental change and improve disaster response in Central America, the Caribbean, Africa, and the Himalayas. Leveraging SPoRT's experience, SERVIR is working to establish a real-time weather forecasting model for Central America. Other modeling efforts include hydrologic forecasts for Kenya, driven by NASA satellite observations and reanalysis data sets provided by the broader meteorological community. Forecast modeling efforts are supplemented by short-term forecasts of convective initiation, determined by

  10. Computational fluid dynamics at NASA Ames Research Center

    NASA Technical Reports Server (NTRS)

    Kutler, Paul

    1989-01-01

    Computational fluid dynamics (CFD) has made great strides in the detailed simulation of complex fluid flows, including the fluid physics of flows heretofore not understood. It is now being routinely applied to some rather complicated problems, and starting to impact the design cycle of aerospace flight vehicles and their components. In addition, it is being used to complement, and is being complemented by, experimental studies. In the present paper, some major elements of contemporary CFD research, such as code validation, turbulence physics, and hypersonic flows are discussed, along with a review of the principal pacing items that currently govern CFD. Several examples of pioneering CFD research are presented to illustrate the current state of the art. Finally, prospects for the future development and application of CFD are suggested.

  11. Quantitative Computed Tomography and Image Analysis for Advanced Muscle Assessment

    PubMed Central

    Edmunds, Kyle Joseph; Gíslason, Magnus K.; Arnadottir, Iris D.; Marcante, Andrea; Piccione, Francesco; Gargiulo, Paolo

    2016-01-01

    Medical imaging is of particular interest in the field of translational myology, as the extant literature describes the utilization of a wide variety of techniques to non-invasively recapitulate and quantify various internal and external tissue morphologies. In the clinical context, medical imaging remains a vital tool for diagnostics and investigative assessment. This review outlines the results from several investigations on the use of computed tomography (CT) and image analysis techniques to assess muscle conditions and degenerative processes due to aging or pathological conditions. Herein, we detail the acquisition of spiral CT images and the use of advanced image analysis tools to characterize muscles in 2D and 3D. Results from these studies recapitulate changes in tissue composition within muscles, as visualized by the association of tissue types to specified Hounsfield Unit (HU) values for fat, loose connective tissue or atrophic muscle, and normal muscle, including fascia and tendon. We show how results from these analyses can be presented as both average HU values and compositions with respect to total muscle volumes, demonstrating the reliability of these tools to monitor, assess and characterize muscle degeneration. PMID:27478562
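
    The HU-based composition analysis described above can be sketched as a simple voxel-classification step. The HU bands below are illustrative assumptions for the sketch, not the thresholds used in the review:

    ```python
    # Sketch: classify muscle CT voxels by Hounsfield Unit (HU) bands and report
    # tissue composition as a percentage of total volume, plus the mean HU.
    # The HU thresholds are illustrative assumptions, not the paper's values.
    import numpy as np

    HU_RANGES = {
        "fat":              (-200, -10),
        "loose_connective": (-9,   40),   # incl. atrophic muscle (assumed band)
        "normal_muscle":    (41,   100),  # incl. fascia/tendon (assumed band)
    }

    def tissue_composition(hu_volume):
        """Return % of total volume per tissue class and the mean HU."""
        total = hu_volume.size
        comp = {}
        for tissue, (lo, hi) in HU_RANGES.items():
            mask = (hu_volume >= lo) & (hu_volume <= hi)
            comp[tissue] = 100.0 * mask.sum() / total
        return comp, float(hu_volume.mean())

    # Toy 3-D "scan": random HU values in a plausible soft-tissue range.
    rng = np.random.default_rng(0)
    volume = rng.integers(-200, 101, size=(16, 16, 16))
    composition, mean_hu = tissue_composition(volume)
    ```

    Because the three bands partition the sampled HU range, the reported compositions sum to 100 percent; on real scans, voxels outside all bands (e.g., bone) would need an explicit "other" class.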

  12. Graduate Automotive Technology Education (GATE) Program: Center of Automotive Technology Excellence in Advanced Hybrid Vehicle Technology at West Virginia University

    SciTech Connect

    Nigel N. Clark

    2006-12-31

    This report summarizes the technical and educational achievements of the Graduate Automotive Technology Education (GATE) Center at West Virginia University (WVU), which was created to emphasize Advanced Hybrid Vehicle Technology. The Center has supported the graduate studies of 17 students in the Department of Mechanical and Aerospace Engineering and the Lane Department of Computer Science and Electrical Engineering. These students have addressed topics such as hybrid modeling, construction of a hybrid sport utility vehicle (in conjunction with the FutureTruck program), a MEMS-based sensor, on-board data acquisition for hybrid design optimization, linear engine design and engine emissions. Courses have been developed in Hybrid Vehicle Design, Mobile Source Powerplants, Advanced Vehicle Propulsion, Power Electronics for Automotive Applications and Sensors for Automotive Applications, and have been responsible for 396 hours of graduate student coursework. The GATE program also enhanced the WVU participation in the U.S. Department of Energy Student Design Competitions, in particular FutureTruck and Challenge X. The GATE support for hybrid vehicle technology enhanced understanding of hybrid vehicle design and testing at WVU and encouraged the development of a research agenda in heavy-duty hybrid vehicles. As a result, WVU has now completed three programs in hybrid transit bus emissions characterization, and WVU faculty are leading the Transportation Research Board effort to define life cycle costs for hybrid transit buses. Research and enrollment records show that approximately 100 graduate students have benefited substantially from the hybrid vehicle GATE program at WVU.

  13. Patterns of treatment and costs of intermediate and advanced hepatocellular carcinoma management in four Italian centers

    PubMed Central

    Colombo, Giorgio Lorenzo; Cammà, Calogero; Attili, Adolfo Francesco; Ganga, Roberto; Gaeta, Giovanni Battista; Brancaccio, Giuseppina; Franzini, Jean Marie; Volpe, Marco; Turchetti, Giuseppe

    2015-01-01

    Background Hepatocellular carcinoma (HCC) is a severe health condition associated with high hospitalizations and mortality rates, which also imposes a relevant economic burden. Purpose The aim of the present survey is to investigate treatment strategies and related costs for HCC in the intermediate and advanced stages of the disease. Patients and methods The survey was conducted in four Italian centers through structured interviews with physicians. Information regarding the stage of disease, treatments performed, and related health care resource consumption was included in the questionnaire. Direct health care cost per patient associated with the most relevant treatments such as sorafenib, transarterial chemoembolization (TACE), and transarterial radioembolization (TARE) was evaluated. Results Between 2013 and 2014, 285 patients with HCC were treated in the four participating centers; of these, 80 were in intermediate stage HCC (Barcelona Clinic Liver Cancer Classification [BCLC] B), and 57 were in the advanced stage of the disease (BCLC C). In intermediate stage HCC, the most frequent first-line treatment was TACE (63%) followed by sorafenib (15%), radiofrequency ablation (14%), and TARE (1.3%). In the advanced stage of HCC, the most frequently used first-line therapy was sorafenib (56%), followed by best supportive care (21%), TACE (18%), and TARE (3.5%). The total costs of treatment per patient amounted to €12,214.54 with sorafenib, €13,418.49 with TACE, and €26,106.08 with TARE. Both in the intermediate and in the advanced stage of the disease, variability in treatment patterns among centers was observed. Conclusion The present analysis raises for the first time the awareness of the overall costs incurred by the Italian National Healthcare System for different treatments used in intermediate and advanced HCC. Further investigations would be important to better understand the effective health care resource usage. PMID:26527877
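
    The reported treatment shares and per-patient costs can be combined into a rough expected first-line cost for the advanced stage. This is an illustrative aggregation, not a figure from the survey: best supportive care is omitted because the abstract reports no cost for it, so the shares are renormalized over the three priced therapies (an assumption):

    ```python
    # Illustrative aggregation of per-patient costs (EUR) and first-line
    # treatment shares reported for advanced-stage (BCLC C) HCC.
    # Best supportive care (21%) is excluded: no cost is given for it, so the
    # remaining shares are renormalized over the priced therapies (assumption).
    cost = {"sorafenib": 12214.54, "TACE": 13418.49, "TARE": 26106.08}
    share = {"sorafenib": 0.56, "TACE": 0.18, "TARE": 0.035}

    priced = sum(share.values())  # fraction of patients on a priced therapy
    expected_cost = sum(cost[t] * share[t] / priced for t in cost)
    ```

    The share-weighted figure sits close to the TACE cost, since sorafenib and TACE dominate the mix while the costlier TARE is used in only a few percent of patients.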

  14. Investigating Impact Metrics for Performance for the US EPA National Center for Computational Toxicology (ACS Fall meeting)

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Computational Toxicology Program integrates advances in biology, chemistry, and computer science to help prioritize chemicals for further research based on potential human health risks. This work involves computational and data drive...

  15. Embrittlement Database from the Radiation Safety Information Computational Center

    DOE Data Explorer

    The Embrittlement Data Base (EDB) is a comprehensive collection of data from surveillance capsules of U.S. commercial nuclear power reactors and from experiments in material test reactors. The collected data are contained in either the Power Reactor Embrittlement Data Base (PR-EDB) or the Test Reactor Embrittlement Data Base (TR-EDB). The EDB work includes verification of the quality of the EDB, provision for user-friendly software to access and process the data, exploration and/or confirmation of embrittlement prediction models, provision for rapid investigation of regulatory issues, and provision for the technical bases for voluntary consensus standards or regulatory guides. The EDB is designed for use with a personal computer. The data are collected into "raw data files." Traceability of all data is maintained by including complete references along with the page numbers. External data verification of the PR-EDB is the responsibility of the vendors, who were responsible for the insertion and testing of the materials in the surveillance capsules. Internal verification is accomplished by checking against references and checking for inconsistencies. Examples of information contained in the EDBs are: Charpy data, tensile data, reactor type, irradiation environments, fracture toughness data, instrumented Charpy data, pressure-temperature (P-T) data, chemistry data, and material history. The TR-EDB additionally has annealing Charpy data. The current version of the PR-EDB contains the test results from 269 Charpy capsules irradiated in 101 reactors. These results include 320 plate data points, 123 forging data points, 113 standard reference materials (SRMS) or correlation monitor (CM) points, 244 weld material data points, and 220 heat-affected-zone (HAZ) material data points. Similarly, the TR-EDB contains information for 290 SRM or CM points, 342 plate data points, 165 forging data points, 378 welds, and 55 HAZ materials. 
[copied from http://rsicc.ornl.gov/RelatedLinks.aspx?t=edb]

  16. Recent advances in data assimilation in computational geodynamic models

    NASA Astrophysics Data System (ADS)

    Ismail-Zadeh, Alik

    2010-05-01

    The QRV method was most recently introduced in geodynamic modelling (Ismail-Zadeh et al., 2007, 2008; Tantsyrev, 2008; Glisovic et al., 2009). The advances in computational geodynamics and in data assimilation attract the interest of the communities dealing with lithosphere, mantle and core dynamics.

  17. Removing the center from computing: biology's new mode of digital knowledge production.

    PubMed

    November, Joseph

    2011-06-01

    This article shows how the USA's National Institutes of Health (NIH) helped to bring about a major shift in the way computers are used to produce knowledge and in the design of computers themselves as a consequence of its early 1960s efforts to introduce information technology to biologists. Starting in 1960 the NIH sought to reform the life sciences by encouraging researchers to make use of digital electronic computers, but despite generous federal support biologists generally did not embrace the new technology. Initially the blame fell on biologists' lack of appropriate (i.e. digital) data for computers to process. However, when the NIH consulted MIT computer architect Wesley Clark about this problem, he argued that the computer's quality as a device that was centralized posed an even greater challenge to potential biologist users than did the computer's need for digital data. Clark convinced the NIH that if the agency hoped to effectively computerize biology, it would need to satisfy biologists' experimental and institutional needs by providing them the means to use a computer without going to a computing center. With NIH support, Clark developed the 1963 Laboratory Instrument Computer (LINC), a small, real-time interactive computer intended to be used inside the laboratory and controlled entirely by its biologist users. Once built, the LINC provided a viable alternative to the 1960s norm of large computers housed in computing centers. As such, the LINC not only became popular among biologists, but also served in later decades as an important precursor of today's computing norm in the sciences and far beyond, the personal computer.

  18. The Astromaterials X-Ray Computed Tomography Laboratory at Johnson Space Center

    NASA Technical Reports Server (NTRS)

    Zeigler, R. A.; Coleff, D. M.; McCubbin, F. M.

    2017-01-01

    The Astromaterials Acquisition and Curation Office at NASA's Johnson Space Center (hereafter JSC curation) is the past, present, and future home of all of NASA's astromaterials sample collections. JSC curation currently houses all or part of nine different sample collections: (1) Apollo samples (1969), (2) Luna samples (1972), (3) Antarctic meteorites (1976), (4) Cosmic Dust particles (1981), (5) Microparticle Impact Collection (1985), (6) Genesis solar wind atoms (2004); (7) Stardust comet Wild-2 particles (2006), (8) Stardust interstellar particles (2006), and (9) Hayabusa asteroid Itokawa particles (2010). Each sample collection is housed in a dedicated clean room, or suite of clean rooms, that is tailored to the requirements of that sample collection. Our primary goals are to maintain the long-term integrity of the samples and ensure that the samples are distributed for scientific study in a fair, timely, and responsible manner, thus maximizing the return on each sample. Part of the curation process is planning for the future, and we also perform fundamental research in advanced curation initiatives. Advanced Curation is tasked with developing procedures, technology, and data sets necessary for curating new types of sample collections, or getting new results from existing sample collections [2]. We are (and have been) planning for future curation, including cold curation, extended curation of ices and volatiles, curation of samples with special chemical considerations such as perchlorate-rich samples, and curation of organically- and biologically-sensitive samples. As part of these advanced curation efforts we are augmenting our analytical facilities as well. A micro X-Ray computed tomography (micro-XCT) laboratory dedicated to the study of astromaterials will be coming online this spring within the JSC Curation office, and we plan to add additional facilities that will enable nondestructive (or minimally-destructive) analyses of astromaterials in the near

  19. SANs and Large Scale Data Migration at the NASA Center for Computational Sciences

    NASA Technical Reports Server (NTRS)

    Salmon, Ellen M.

    2004-01-01

    Evolution and migration are a way of life for provisioners of high-performance mass storage systems that serve high-end computers used by climate and Earth and space science researchers: the compute engines come and go, but the data remains. At the NASA Center for Computational Sciences (NCCS), disk and tape SANs are deployed to provide high-speed I/O for the compute engines and the hierarchical storage management systems. Along with gigabit Ethernet, they also enable the NCCS's latest significant migration: the transparent transfer of 300 TB of legacy HSM data into the new Sun SAM-QFS cluster.

  20. C-130 Advanced Technology Center wing box conceptual design/cost study

    NASA Technical Reports Server (NTRS)

    Whitehead, R. S.; Foreman, C. R.; Silva, K.

    1992-01-01

    A conceptual design was developed by Northrop/LTV for an advanced C-130 Center Wing Box (CWB) which could meet the severe mission requirements of the SOF C-130 aircraft. The goals for the advanced technology CWB relative to the current C-130H CWB were: (1) the same acquisition cost; (2) lower operating support costs; (3) equal or lower weight; (4) a 30,000 hour service life for the SOF mission; and (5) minimum impact on the current maintenance concept. Initially, the structural arrangement, weight, external and internal loads, fatigue spectrum, flutter envelope and design criteria for the SOF C-130 aircraft CWB were developed. An advanced materials assessment was then conducted to determine the suitability of advanced materials for a 1994 production availability and detailed trade studies were performed on candidate CWB conceptual designs. Finally, a life-cycle cost analysis was performed on the advanced CWB. The study results showed that a hybrid composite/metallic CWB could meet the severe SOF design requirements, reduce the CWB weight by 14 pct., and was cost effective relative to an all metal beefed up C-130H CWB.

  1. The Design and Transfer of Advanced Command and Control (C2) Computer-Based Systems

    DTIC Science & Technology

    1980-03-31

    TECHNICAL REPORT 80-02 QUARTERLY TECHNICAL REPORT: THE DESIGN AND TRANSFER OF ADVANCED COMMAND AND CONTROL (C2) COMPUTER-BASED SYSTEMS ARPA...The Tasks/Objectives and/or Purposes of the overall project are connected with the design, development, demonstration and transfer of advanced...command and control (C2) computer-based systems; this report covers work in the computer-based design and transfer areas only. The Technical Problems thus

  2. First 3 years of operation of RIACS (Research Institute for Advanced Computer Science) (1983-1985)

    NASA Technical Reports Server (NTRS)

    Denning, P. J.

    1986-01-01

    The focus of the Research Institute for Advanced Computer Science (RIACS) is to explore matches between advanced computing architectures and the processes of scientific research. An architecture evaluation of the MIT static dataflow machine, specification of a graphical language for expressing distributed computations, and specification of an expert system for aiding in grid generation for two-dimensional flow problems were initiated. Research projects for 1984 and 1985 are summarized.

  3. Implementing RECONSIDER, a diagnostic prompting computer system, at the Georgetown University Medical Center.

    PubMed

    Broering, N C; Corn, M; Ayers, W R; Mistry, P

    1988-04-01

    RECONSIDER, a computer program for diagnostic prompting developed at the University of California, San Francisco, has been implemented at the Georgetown University Medical Center as part of the Integrated Academic Information Management System Model Development grant project supported by the National Library of Medicine. The system is available for student use in the Biomedical Information Resources Center of the Dahlgren Memorial Library. Instruction on use of the computer system is provided by the library and instruction on medical use of the knowledge base is directed by the faculty. The implementation, capabilities, enhancements such as the addition of Current Medical Information and Terminology (5th ed.), and evaluation of the system are reported.

  4. A Queue Simulation Tool for a High Performance Scientific Computing Center

    NASA Technical Reports Server (NTRS)

    Spear, Carrie; McGalliard, James

    2007-01-01

    The NASA Center for Computational Sciences (NCCS) at the Goddard Space Flight Center provides high performance highly parallel processors, mass storage, and supporting infrastructure to a community of computational Earth and space scientists. Long running (days) and highly parallel (hundreds of CPUs) jobs are common in the workload. NCCS management structures batch queues and allocates resources to optimize system use and prioritize workloads. NCCS technical staff use a locally developed discrete event simulation tool to model the impacts of evolving workloads, potential system upgrades, alternative queue structures and resource allocation policies.
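
    The kind of batch-queue modeling described above can be sketched as a minimal discrete-event simulation. This toy assumes a single FCFS queue over a fixed CPU pool; the job sizes and durations are invented, and the NCCS tool itself is far richer (multiple queues, priorities, allocation policies):

    ```python
    # Sketch: minimal discrete-event simulation of an FCFS batch queue on a
    # fixed pool of CPUs. Events are job completions, kept in a min-heap.
    import heapq

    def simulate(jobs, total_cpus):
        """jobs: list of (submit_time, cpus, runtime). Returns mean wait time."""
        jobs = sorted(jobs)               # FCFS by submit time
        free = total_cpus
        completions = []                  # min-heap of (finish_time, cpus)
        clock = 0.0
        waits = []
        for submit, cpus, runtime in jobs:
            clock = max(clock, submit)
            while True:
                # Release CPUs from any jobs that have finished by now.
                while completions and completions[0][0] <= clock:
                    _, c = heapq.heappop(completions)
                    free += c
                if free >= cpus:
                    break
                clock = completions[0][0]  # advance to the next completion
            free -= cpus
            heapq.heappush(completions, (clock + runtime, cpus))
            waits.append(clock - submit)
        return sum(waits) / len(waits)

    # Three jobs on a 128-CPU system: the 50-CPU job must wait for CPUs.
    mean_wait = simulate([(0, 100, 10.0), (0, 20, 5.0), (1, 50, 2.0)], 128)
    ```

    Swapping the FCFS loop for a backfilling or priority policy is where such a simulator starts to answer the "what if we restructured the queues" questions mentioned in the abstract.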

  5. Prediction of helicopter rotor discrete frequency noise: A computer program incorporating realistic blade motions and advanced acoustic formulation

    NASA Technical Reports Server (NTRS)

    Brentner, K. S.

    1986-01-01

    A computer program has been developed at the Langley Research Center to predict the discrete frequency noise of conventional and advanced helicopter rotors. The program, called WOPWOP, uses the most advanced subsonic formulation of Farassat that is less sensitive to errors and is valid for nearly all helicopter rotor geometries and flight conditions. A brief derivation of the acoustic formulation is presented along with a discussion of the numerical implementation of the formulation. The computer program uses realistic helicopter blade motion and aerodynamic loadings, input by the user, for noise calculation in the time domain. A detailed definition of all the input variables, default values, and output data is included. A comparison with experimental data shows good agreement between prediction and experiment; however, accurate aerodynamic loading is needed.

  6. Advanced Overset Grid Methods For Massively Parallel Rotary Wing Computations

    DTIC Science & Technology

    2014-08-18

    Results in 2-D for cell-centered unstructured grids show feasibility of this approach and improvements in predictions when compared with conventional overset...

  7. Generalized Advanced Propeller Analysis System (GAPAS). Volume 2: Computer program user manual

    NASA Technical Reports Server (NTRS)

    Glatt, L.; Crawford, D. R.; Kosmatka, J. B.; Swigart, R. J.; Wong, E. W.

    1986-01-01

    The Generalized Advanced Propeller Analysis System (GAPAS) computer code is described. GAPAS was developed to analyze advanced technology multi-bladed propellers which operate on aircraft with speeds up to Mach 0.8 and altitudes up to 40,000 feet. GAPAS includes technology for analyzing aerodynamic, structural, and acoustic performance of propellers. The computer code was developed for the CDC 7600 computer and is currently available for industrial use on the NASA Langley computer. A description of all the analytical models incorporated in GAPAS is included. Sample calculations are also described as well as users requirements for modifying the analysis system. Computer system core requirements and running times are also discussed.

  8. Recent advances in computational methods for nuclear magnetic resonance data processing.

    PubMed

    Gao, Xin

    2013-02-01

    Although three-dimensional protein structure determination using nuclear magnetic resonance (NMR) spectroscopy is a computationally costly and tedious process that would benefit from advanced computational techniques, it has not garnered much research attention from specialists in bioinformatics and computational biology. In this paper, we review recent advances in computational methods for NMR protein structure determination. We summarize the advantages of and bottlenecks in the existing methods and outline some open problems in the field. We also discuss current trends in NMR technology development and suggest directions for research on future computational methods for NMR.

  9. Final Report -- Center for Programming Models for Scalable Parallel Computing (UIUC subgroup)

    SciTech Connect

    Marianne Winslett; Michael Folk

    2007-03-31

    The mission of the Center for Scalable Programming Models (Pmodels) was to create new ways of programming parallel computers that are much easier for humans to conceptualize, that allow programs to be written, updated and debugged quickly, and that run extremely efficiently---even on computers with thousands or even millions of processors. At UIUC, our work for Pmodels focused on support for I/O in a massively parallel environment, and included both research and technology transfer activities.

  10. Computer Center.

    ERIC Educational Resources Information Center

    Kramer, David W.

    1991-01-01

    Discusses the use of videodisk technology to teach biology. Presents videodisk selection criteria for biology teachers, lists 13 biology videodisks and their suppliers, and lists videodisks offered with biology textbooks. (MDH)

  11. Deep Space Network (DSN), Network Operations Control Center (NOCC) computer-human interfaces

    NASA Technical Reports Server (NTRS)

    Ellman, Alvin; Carlton, Magdi

    1993-01-01

    The Network Operations Control Center (NOCC) of the DSN is responsible for scheduling the resources of DSN, and monitoring all multi-mission spacecraft tracking activities in real-time. Operations performs this job with computer systems at JPL connected to over 100 computers at Goldstone, Australia and Spain. The old computer system became obsolete, and the first version of the new system was installed in 1991. Significant improvements for the computer-human interfaces became the dominant theme for the replacement project. Major issues required innovative problem solving. Among these issues were: How to present several thousand data elements on displays without overloading the operator? What is the best graphical representation of DSN end-to-end data flow? How to operate the system without memorizing mnemonics of hundreds of operator directives? Which computing environment will meet the competing performance requirements? This paper presents the technical challenges, engineering solutions, and results of the NOCC computer-human interface design.

  12. Advanced computed tomography inspection system (ACTIS): an overview of the technology and its applications

    NASA Astrophysics Data System (ADS)

    Beshears, Ronald D.; Hediger, Lisa H.

    1994-10-01

    The Advanced Computed Tomography Inspection System (ACTIS) was developed by the Marshall Space Flight Center to support in-house solid propulsion test programs. ACTIS represents a significant advance in state-of-the-art inspection systems. Its flexibility and superior technical performance have made ACTIS very popular, both within and outside the aerospace community. Through Technology Utilization efforts, ACTIS has been applied to inspection problems in commercial aerospace, lumber, automotive, and nuclear waste disposal industries. ACTIS has even been used to inspect items of historical interest. ACTIS has consistently produced valuable results, providing information which was unattainable through conventional inspection methods. Although many successes have already been demonstrated, the full potential of ACTIS has not yet been realized. It is currently being applied in the commercial aerospace industry by Boeing Aerospace Company. Smaller systems, based on ACTIS technology are becoming increasingly available. This technology has much to offer small businesses and industry, especially in identifying design and process problems early in the product development cycle to prevent defects. Several options are available to businesses interested in pursuing this technology.

  13. Computational studies of horizontal axis wind turbines in high wind speed condition using advanced turbulence models

    NASA Astrophysics Data System (ADS)

    Benjanirat, Sarun

    Next generation horizontal-axis wind turbines (HAWTs) will operate at very high wind speeds. Existing engineering approaches for modeling the flow phenomena are based on blade element theory, and cannot adequately account for 3-D separated, unsteady flow effects. Therefore, researchers around the world are beginning to model these flows using first principles-based computational fluid dynamics (CFD) approaches. In this study, an existing first principles-based Navier-Stokes approach is being enhanced to model HAWTs at high wind speeds. The enhancements include improved grid topology, implicit time-marching algorithms, and advanced turbulence models. The advanced turbulence models include the Spalart-Allmaras one-equation model, k-epsilon, k-omega and Shear Stress Transport (k-omega SST) models. These models are also integrated with detached eddy simulation (DES) models. Results are presented for a range of wind speeds, for a configuration termed National Renewable Energy Laboratory Phase VI rotor, tested at NASA Ames Research Center. Grid sensitivity studies are also presented. Additionally, effects of existing transition models on the predictions are assessed. Data presented include power/torque production, radial distribution of normal and tangential pressure forces, root bending moments, and surface pressure fields. Good agreement was obtained between the predictions and experiments for most of the conditions, particularly with the Spalart-Allmaras-DES model.

  14. [Prospect of the Advanced Life Support Program Breadboard Project at Kennedy Space Center in USA].

    PubMed

    Guo, S S; Ai, W D

    2001-04-01

    The Breadboard Project at Kennedy Space Center in NASA of USA was focused on the development of the bioregenerative life support components, crop plants for water, air, and food production and bioreactors for recycling of wastes. The keystone of the Breadboard Project was the Biomass Production Chamber (BPC), which was supported by 15 environmentally controlled chambers and several laboratory facilities covering a total area of 2150 m2. In supporting the Advanced Life Support Program (ALS Program), the Project utilizes these facilities for large-scale testing of components and development of required technologies for human-rated test-beds at Johnson Space Center in NASA, ultimately to enable lunar and Mars missions.

  15. A new synchrotron light source at Louisiana State University's Center for Advanced Microstructures and Devices

    NASA Astrophysics Data System (ADS)

    Stockbauer, Roger L.; Ajmera, Pratul; Poliakoff, Erwin D.; Craft, Ben C.; Saile, Volker

    1990-05-01

    A 1.2-GeV synchrotron light source is being constructed at the Center for Advanced Microstructures and Devices (CAMD) at Louisiana State University. The expressed purpose of the center, which has been funded by a grant from the US Department of Energy, is to develop X-ray lithography techniques for manufacturing microcircuits, although basic science programs are also being established. The storage ring will be optimized for the soft-X-ray region and will be the first commercially manufactured electron storage ring in the United States. The magnetic lattice is based on a design developed by Chasman and Green and will allow up to three insertion devices to be installed for higher-energy and higher-intensity radiation. In addition to the lithography effort, experimental programs are being established in physics, chemistry, and related areas.

  16. Advanced technology needs for a global change science program: Perspective of the Langley Research Center

    NASA Technical Reports Server (NTRS)

    Rowell, Lawrence F.; Swissler, Thomas J.

    1991-01-01

    The focus of the NASA program in remote sensing is primarily the Earth system science and the monitoring of the Earth global changes. One of NASA's roles is the identification and development of advanced sensing techniques, operational spacecraft, and the many supporting technologies necessary to meet the stringent science requirements. Langley Research Center has identified the elements of its current and proposed advanced technology development program that are relevant to global change science according to three categories: sensors, spacecraft, and information system technologies. These technology proposals are presented as one-page synopses covering scope, objective, approach, readiness timeline, deliverables, and estimated funding. In addition, the global change science requirements and their measurement histories are briefly discussed.

  17. Advancing LGBT Health at an Academic Medical Center: A Case Study.

    PubMed

    Yehia, Baligh R; Calder, Daniel; Flesch, Judd D; Hirsh, Rebecca L; Higginbotham, Eve; Tkacs, Nancy; Crawford, Beverley; Fishman, Neil

    2015-12-01

    Academic health centers are strategically positioned to impact the health of lesbian, gay, bisexual and transgender (LGBT) populations by advancing science, educating future generations of providers, and delivering integrated care that addresses the unique health needs of the LGBT community. This report describes the early experiences of the Penn Medicine Program for LGBT Health, highlighting the favorable environment that led to its creation, the mission and structure of the Program, strategic planning process used to set priorities and establish collaborations, and the reception and early successes of the Program.

  18. Development and Validation of a Fast, Accurate and Cost-Effective Aeroservoelastic Method on Advanced Parallel Computing Systems

    NASA Technical Reports Server (NTRS)

    Goodwin, Sabine A.; Raj, P.

    1999-01-01

    Progress to date towards the development and validation of a fast, accurate and cost-effective aeroelastic method for advanced parallel computing platforms such as the IBM SP2 and the SGI Origin 2000 is presented in this paper. The ENSAERO code, developed at the NASA Ames Research Center, has been selected for this effort. The code allows for the computation of aeroelastic responses by simultaneously integrating the Euler or Navier-Stokes equations and the modal structural equations of motion. To assess the computational performance and accuracy of the ENSAERO code, this paper reports the results of Navier-Stokes simulations of the transonic flow over a flexible aeroelastic wing-body configuration. In addition, a forced harmonic oscillation analysis in the frequency domain and an analysis in the time domain are done on a wing undergoing a rigid pitch and plunge motion. Finally, to demonstrate the ENSAERO flutter-analysis capability, aeroelastic Euler and Navier-Stokes computations on an L-1011 wind tunnel model including pylon, nacelle and empennage are underway. All computational solutions are compared with experimental data to assess the level of accuracy of ENSAERO. As the computations described above are performed, a meticulous log of computational performance in terms of wall clock time, execution speed, memory and disk storage is kept. Code scalability is also demonstrated by studying the impact of varying the number of processors on computational performance on the IBM SP2 and the Origin 2000 systems.

  19. CNC Turning Center Operations and Prove Out. Computer Numerical Control Operator/Programmer. 444-334.

    ERIC Educational Resources Information Center

    Skowronski, Steven D.

    This student guide provides materials for a course designed to instruct the student in the recommended procedures used when setting up tooling and verifying part programs for a two-axis computer numerical control (CNC) turning center. The course consists of seven units. Unit 1 discusses course content and reviews and demonstrates set-up procedures…

  20. The Utility of Computer Tracking Tools for User-Centered Design.

    ERIC Educational Resources Information Center

    Gay, Geri; Mazur, Joan

    1993-01-01

    Describes tracking tools used by designers and users to evaluate the efficacy of hypermedia systems. Highlights include human-computer interaction research; tracking tools and user-centered design; and three examples from the Interactive Multimedia Group at Cornell University that illustrate uses of various tracking tools. (27 references) (LRW)

  1. A Report on the Design and Construction of the University of Massachusetts Computer Science Center.

    ERIC Educational Resources Information Center

    Massachusetts State Office of the Inspector General, Boston.

    This report describes a review conducted by the Massachusetts Office of the Inspector General on the construction of the Computer Science and Development Center at the University of Massachusetts, Amherst. The office initiated the review after hearing concerns about the management of the project, including its delayed completion and substantial…

  2. Merging Libraries and Computer Centers: Manifest Destiny or Manifestly Deranged? An Academic Services Perspective.

    ERIC Educational Resources Information Center

    Neff, Raymond K.

    1985-01-01

    Details trends in information access, services, packaging, dissemination, and networking, service fees, archival storage devices, and electronic information packaging that could lead to complete mergers of academic libraries and computing centers with shared responsibilities. University of California at Berkeley's comprehensive strategy for…

  3. Computer Center-Library Relations at Smaller Institutions: A Look from Both Sides.

    ERIC Educational Resources Information Center

    Hardesty, Larry

    1998-01-01

    Interviews with 51 librarians and 40 computer-center administrators at smaller colleges found that they face similar challenges in providing services and seeking economies but are uneasy about formal structural changes that bring their operations closer together. Reasons for their concern and implications for college administration are discussed,…

  4. CENTER CONDITIONS AND CYCLICITY FOR A FAMILY OF CUBIC SYSTEMS: COMPUTER ALGEBRA APPROACH.

    PubMed

    Ferčec, Brigita; Mahdi, Adam

    2013-01-01

    Using methods of computational algebra we obtain an upper bound for the cyclicity of a family of cubic systems. We overcame the problem of nonradicality of the associated Bautin ideal by moving from the ring of polynomials to a coordinate ring. Finally, we determine the number of limit cycles bifurcating from each component of the center variety.

  5. CENTER CONDITIONS AND CYCLICITY FOR A FAMILY OF CUBIC SYSTEMS: COMPUTER ALGEBRA APPROACH

    PubMed Central

    Ferčec, Brigita; Mahdi, Adam

    2013-01-01

    Using methods of computational algebra we obtain an upper bound for the cyclicity of a family of cubic systems. We overcame the problem of nonradicality of the associated Bautin ideal by moving from the ring of polynomials to a coordinate ring. Finally, we determine the number of limit cycles bifurcating from each component of the center variety. PMID:24223469

  6. SAM: The "Search and Match" Computer Program of the Escherichia coli Genetic Stock Center

    ERIC Educational Resources Information Center

    Bachmann, B. J.; And Others

    1973-01-01

    Describes a computer program used at a genetic stock center to locate particular strains of bacteria. The program can match up to 30 strain descriptions requested by a researcher with the records on file. Uses of this particular program can be made in many fields. (PS)

  7. Using a small/low cost computer in an information center

    NASA Technical Reports Server (NTRS)

    Wilde, D. U.

    1972-01-01

    Small/low cost computers are available with I/O capacities that make them suitable for SDI and retrospective searching on any of the many commercially available data bases. A small two-tape computer system is assumed, and an analysis of its run-time equations leads to a three-step search procedure. Run times and costs are shown as a function of file size, number of search terms, and input transmission rates. Actual examples verify that it is economically feasible for an information center to consider its own small, dedicated computer system.
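
    The kind of run-time analysis described can be sketched as a simple cost model. Every parameter name and rate below is an assumption chosen for illustration, not a figure from the paper:

```python
# Illustrative cost model for a batch search on a small dedicated
# computer. All rates are assumed placeholder values, not data from
# the paper.

def search_run_time(file_size_records, terms, records_per_sec=500.0,
                    match_cost_per_term=1e-4):
    """Estimated wall-clock seconds to scan the whole file once.

    Each record is read from tape and compared against every search term.
    """
    read_time = file_size_records / records_per_sec
    match_time = file_size_records * terms * match_cost_per_term
    return read_time + match_time

def search_cost(run_time_sec, machine_rate_per_hour=10.0):
    """Dollar cost at an assumed hourly machine rate."""
    return run_time_sec / 3600.0 * machine_rate_per_hour

t = search_run_time(file_size_records=100_000, terms=20)
print(f"run time: {t:.0f} s, cost: ${search_cost(t):.2f}")
```

    Run time grows linearly in both file size and number of search terms, which is the qualitative behavior the paper's run-time equations capture.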

  8. A Computational Methodology for Simulating Thermal Loss Testing of the Advanced Stirling Convertor

    NASA Technical Reports Server (NTRS)

    Reid, Terry V.; Wilson, Scott D.; Schifer, Nicholas A.; Briggs, Maxwell H.

    2012-01-01

    The U.S. Department of Energy (DOE) and Lockheed Martin Space Systems Company (LMSSC) have been developing the Advanced Stirling Radioisotope Generator (ASRG) for use as a power system for space science missions. This generator would use two high-efficiency Advanced Stirling Convertors (ASCs), developed by Sunpower Inc. and NASA Glenn Research Center (GRC). The ASCs convert thermal energy from a radioisotope heat source into electricity. As part of ground testing of these ASCs, different operating conditions are used to simulate expected mission conditions. These conditions require achieving a particular operating frequency, hot end and cold end temperatures, and specified electrical power output for a given net heat input. In an effort to improve net heat input predictions, numerous tasks have been performed that provided a more accurate value for net heat input into the ASCs, including the use of multidimensional numerical models. Validation test hardware has also been used to provide a direct comparison of numerical results and to validate the multidimensional numerical models used to predict convertor net heat input and efficiency. These validation tests were designed to simulate the temperature profile of an operating Stirling convertor and resulted in a measured net heat input of 244.4 W. The methodology was applied to the multidimensional numerical model, which resulted in a net heat input of 240.3 W. The computational methodology resulted in a value of net heat input that was 1.7 percent less than that measured during laboratory testing. The resulting computational methodology and results are discussed.
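
    The quoted 1.7 percent figure can be reproduced directly from the two reported heat-input values:

```python
# Percent difference between measured and computed net heat input
# (values quoted in the abstract above).
measured_w = 244.4   # laboratory measurement, W
computed_w = 240.3   # multidimensional numerical model, W

pct_below = (measured_w - computed_w) / measured_w * 100.0
print(f"computed value is {pct_below:.1f}% below the measured value")  # → 1.7%
```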

  9. The Advance of Computing from the Ground to the Cloud

    ERIC Educational Resources Information Center

    Breeding, Marshall

    2009-01-01

    A trend toward the abstraction of computing platforms that has been developing in the broader IT arena over the last few years is just beginning to make inroads into the library technology scene. Cloud computing offers for libraries many interesting possibilities that may help reduce technology costs and increase capacity, reliability, and…

  10. Computational Enzyme Design: Advances, hurdles and possible ways forward

    PubMed Central

    Linder, Mats

    2012-01-01

    This mini review addresses recent developments in computational enzyme design. Successful protocols as well as known issues and limitations are discussed from an energetic perspective. It will be argued that improved results can be obtained by including a dynamic treatment in the design protocol. Finally, a molecular dynamics-based approach for evaluating and refining computational designs is presented. PMID:24688650

  11. Identification of Restrictive Computer and Software Variables among Preoperational Users of a Computer Learning Center.

    ERIC Educational Resources Information Center

    Kozubal, Diane K.

    While manufacturers have produced a wide variety of software said to be easy for even the youngest child to use, there are conflicting perspectives on computer issues such as ease of use, influence on meeting educational objectives, effects on procedural learning, and rationale for use with young children. Addressing these concerns, this practicum…

  12. South Carolina Center for the Advancement of Teaching and School Leadership: Professional Development Schools. Policy Paper Series 1.3.

    ERIC Educational Resources Information Center

    Gottesman, Barbara; And Others

    In 1990, the South Carolina Center for the Advancement of Teaching and School Leadership was established by the state's legislature to provide support to schools undergoing or planning restructuring. The Center assists schools to analyze needs, establish goals, and implement those goals. Technical assistance and college and school faculty training…

  13. The role of university and college counseling centers in advancing the professionalization of psychology.

    PubMed

    Bingham, Rosie Phillips

    2015-11-01

    Psychologists in university and college counseling centers (UCCCs) have helped to shape and advance the professionalization of psychology. Most definitions of a profession contain at least 5 components. A profession has (1) systematic theories and underlying principles; (2) authority to practice provided by the client; (3) a long educational process, including training and mentoring; (4) standards and a code of ethics; and (5) a culture of service and accountability to the public. UCCC professionals have evolved in a manner that demonstrates all 5 components of a profession. They advance the discipline of psychology as a profession through their counseling interventions because such interventions are based on scientific theories and principles. While their practice rests on scientific principles, their work helps to confirm and modify that science. Authority to practice is evidenced by the continuous growth of counseling centers since World War II. UCCCs aid the extended educational process for psychology graduate students as evidenced by their providing more internship training sites than any other category of training agencies. The majority of UCCC professionals are licensed and must abide by their state code of ethics. Such codes hold psychologists accountable to the public because they regularly deliver counseling service to at least 10% of the campus student population and offer outreach services to many more in their communities.

  14. A Study into Advanced Guidance Laws Using Computational Methods

    DTIC Science & Technology

    2011-12-01

    The record excerpt contains only fragments of MATLAB source comments (a routine for computing aerodynamic forces and moments, with all dimensions in the MKS system except where noted) and bibliography entries (R. L. Shaw, Fighter Combat: Tactics and Maneuvering, Naval Institute Press, 1988; U. S. Shukla and P. R. Mahapatra).

  15. RECENT ADVANCES IN COMPUTATIONAL MECHANICS FOR CIVIL ENGINEERING

    NASA Astrophysics Data System (ADS)

    Applied Mechanics Committee, Computational Mechanics Subcommittee,

    To clarify mechanical phenomena in civil engineering, it is necessary to improve computational theory and techniques in light of the particular character of the objects being analyzed, and to update computational mechanics with a focus on practical use. Beyond the analysis of infrastructure, damage prediction for natural disasters such as earthquakes, tsunamis and floods must reflect the broad ranges of space and time inherent to civil engineering, as well as material properties, so it is important to develop new computational methods suited to the particularities of the field. In this context, this paper reviews research trends in computational mechanics that are noteworthy for resolving complex mechanics problems in civil engineering.

  16. Advances in Domain Mapping of Massively Parallel Scientific Computations

    SciTech Connect

    Leland, Robert W.; Hendrickson, Bruce A.

    2015-10-01

    One of the most important concerns in parallel computing is the proper distribution of workload across processors. For most scientific applications on massively parallel machines, the best approach to this distribution is to employ data parallelism; that is, to break the data structures supporting a computation into pieces and then to assign those pieces to different processors. Collectively, these partitioning and assignment tasks comprise the domain mapping problem.
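
    A minimal sketch of the partition-and-assign step described above, splitting a 1-D data structure into near-equal contiguous blocks, one per processor (real domain mappers also balance work and minimize inter-processor communication):

```python
# Split n_items into n_procs contiguous blocks whose sizes differ by
# at most one; returns half-open (start, end) index ranges.

def block_partition(n_items, n_procs):
    base, extra = divmod(n_items, n_procs)
    ranges, start = [], 0
    for p in range(n_procs):
        size = base + (1 if p < extra else 0)
        ranges.append((start, start + size))
        start += size
    return ranges

print(block_partition(10, 3))  # → [(0, 4), (4, 7), (7, 10)]
```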

  17. Advances in computational design and analysis of airbreathing propulsion systems

    NASA Technical Reports Server (NTRS)

    Klineberg, John M.

    1989-01-01

    The development of commercial and military aircraft depends, to a large extent, on engine manufacturers being able to achieve significant increases in propulsion capability through improved component aerodynamics, materials, and structures. The recent history of propulsion has been marked by efforts to develop computational techniques that can speed up the propulsion design process and produce superior designs. The availability of powerful supercomputers, such as the NASA Numerical Aerodynamic Simulator, and the potential for even higher performance offered by parallel computer architectures, have opened the door to the use of multi-dimensional simulations to study complex physical phenomena in propulsion systems that have previously defied analysis or experimental observation. An overview of several NASA Lewis research efforts is provided that are contributing toward the long-range goal of a numerical test-cell for the integrated, multidisciplinary design, analysis, and optimization of propulsion systems. Specific examples in Internal Computational Fluid Mechanics, Computational Structural Mechanics, Computational Materials Science, and High Performance Computing are cited and described in terms of current capabilities, technical challenges, and future research directions.

  18. The GridKa Tier-1 Computing Center within the ALICE Grid Framework

    NASA Astrophysics Data System (ADS)

    Park, WooJin J.; Christopher, Jung; Heiss, Andreas; Petzold, Andreas; Schwarz, Kilian

    2014-06-01

    The GridKa computing center, hosted by the Steinbuch Centre for Computing at the Karlsruhe Institute of Technology (KIT) in Germany, is serving as the largest Tier-1 center used by the ALICE collaboration at the LHC. In 2013, GridKa provides 30k HEPSPEC06, 2.7 PB of disk space, and 5.25 PB of tape storage to ALICE. The 10 Gbit/s network connections from GridKa to CERN, several Tier-1 centers and the general purpose network are used by ALICE intensively. In 2012 a total amount of ~1 PB was transferred to and from GridKa. As Grid framework, AliEn (ALICE Environment) is used to access the resources, and various monitoring tools, including MonALISA (MONitoring Agent using a Large Integrated Services Architecture), are always running to alert in case of any problem. GridKa on-call engineers provide 24/7 support to guarantee minimal loss of availability of computing and storage resources in case of hardware or software problems. We introduce the GridKa Tier-1 center from the viewpoint of ALICE services.
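
    As a back-of-envelope check on the figures quoted above, moving ~1 PB over a single 10 Gbit/s link would take roughly nine days even at full line rate (an idealized assumption that ignores protocol overhead and contention):

```python
# Idealized transfer time for ~1 PB over one 10 Gbit/s link,
# ignoring protocol overhead and contention.
PB_BITS = 1e15 * 8        # 1 petabyte (decimal prefix) in bits
LINK_BPS = 10e9           # 10 Gbit/s

seconds = PB_BITS / LINK_BPS
days = seconds / 86400
print(f"{seconds:.0f} s ≈ {days:.1f} days at full line rate")
```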

  19. Accomplishments of the Advanced Reusable Technologies (ART) RBCC Project at NASA/Marshall Space Flight Center

    NASA Technical Reports Server (NTRS)

    Nelson, Karl W.; McArthur, J. Craig (Technical Monitor)

    2001-01-01

    The focus of the NASA / Marshall Space Flight Center (MSFC) Advanced Reusable Technologies (ART) project is to advance and develop Rocket-Based Combined-Cycle (RBCC) technologies. The ART project began in 1996 as part of the Advanced Space Transportation Program (ASTP). The project is composed of several activities including RBCC engine ground testing, tool development, vehicle / mission studies, and component testing / development. The major contractors involved in the ART project are Aerojet and Rocketdyne. A large database of RBCC ground test data was generated for the air-augmented rocket (AAR), ramjet, scramjet, and ascent rocket modes of operation for both the Aerojet and Rocketdyne concepts. Transition between consecutive modes was also demonstrated as well as trajectory simulation. The Rocketdyne freejet tests were conducted at GASL in the Flight Acceleration Simulation Test (FAST) facility. During a single test, the FAST facility is capable of simulating both the enthalpy and aerodynamic conditions over a range of Mach numbers in a flight trajectory. Aerojet performed freejet testing in the Pebble Bed facility at GASL as well as direct-connect testing at GASL. Aerojet also performed sea-level static (SLS) testing at the Aerojet A-Zone facility in Sacramento, CA. Several flight-type flowpath components were developed under the ART project. Aerojet designed and fabricated ceramic scramjet injectors. The structural design of the injectors will be tested in a simulated scramjet environment where thermal effects and performance will be assessed. Rocketdyne will be replacing the cooled combustor in the A5 rig with a flight-weight combustor that is near completion. Aerojet's formed duct panel is currently being fabricated and will be tested in the SLS rig in Aerojet's A-Zone facility. Aerojet has already successfully tested a cooled cowl panel in the same facility. In addition to MSFC, other NASA centers have contributed to the ART project as well. 
Inlet testing…

  20. Computational structural mechanics: A new activity at the NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Knight, N. F., Jr.; Stroud, W. J.

    1985-01-01

    Complex structures considered for the late 1980's and early 1990's include composite primary aircraft structures and the space station. These structures are much more difficult to analyze than today's structures and necessitate a major upgrade in computerized structural analysis technology. A major research activity in computational structural mechanics (CSM) was initiated. The objective of the CSM activity is to develop advanced structural analysis technology that will exploit modern and emerging computers, such as computers with vector and/or parallel processing capabilities. The three main research activities underway in CSM are: (1) structural analysis methods development; (2) a software testbed for evaluating the methods; and (3) numerical techniques for parallel processing computers. The motivation and objectives of the CSM activity are presented, the activity itself is described, and the current, near-term, and long-term CSM research thrusts are outlined.

  1. Advanced Simulation and Computing Co-Design Strategy

    SciTech Connect

    Ang, James A.; Hoang, Thuc T.; Kelly, Suzanne M.; McPherson, Allen; Neely, Rob

    2015-11-01

    This ASC Co-design Strategy lays out the full continuum and components of the co-design process, based on what we have experienced thus far and what we wish to do more in the future to meet the program’s mission of providing high performance computing (HPC) and simulation capabilities for NNSA to carry out its stockpile stewardship responsibility.

  2. Connecting Performance Analysis and Visualization to Advance Extreme Scale Computing

    SciTech Connect

    Bremer, Peer-Timo; Mohr, Bernd; Schulz, Martin; Pasccci, Valerio; Gamblin, Todd; Brunst, Holger

    2015-07-29

    The characterization, modeling, analysis, and tuning of software performance has been a central topic in High Performance Computing (HPC) since its early beginnings. The overall goal is to make HPC software run faster on particular hardware, either through better scheduling, on-node resource utilization, or more efficient distributed communication.

  3. Advanced Computer Image Generation Techniques Exploiting Perceptual Characteristics. Final Report.

    ERIC Educational Resources Information Center

    Stenger, Anthony J.; And Others

    This study suggests and identifies computer image generation (CIG) algorithms for visual simulation that improve the training effectiveness of CIG simulators and identifies areas of basic research in visual perception that are significant for improving CIG technology. The first phase of the project entailed observing three existing CIG simulators.…

  4. MAX - An advanced parallel computer for space applications

    NASA Technical Reports Server (NTRS)

    Lewis, Blair F.; Bunker, Robert L.

    1991-01-01

    MAX is a fault-tolerant multicomputer hardware and software architecture designed to meet the needs of NASA spacecraft systems. It consists of conventional computing modules (computers) connected via a dual network topology. One network is used to transfer data among the computers and between computers and I/O devices. This network's topology is arbitrary. The second network operates as a broadcast medium for operating system synchronization messages and supports the operating system's Byzantine resilience. A fully distributed operating system supports multitasking in an asynchronous event- and data-driven environment. A large grain dataflow paradigm is used to coordinate the multitasking and provide easy control of concurrency. It is the basis of the system's fault tolerance and allows both static and dynamic location of tasks. Redundant execution of tasks with software voting of results may be specified for critical tasks. The dataflow paradigm also supports simplified software design, test and maintenance. A unique feature is a method for reliably patching code in an executing dataflow application.
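
    A minimal sketch of one idea described above, software voting over redundantly executed tasks; the function name and structure are illustrative, not the actual MAX operating-system interface:

```python
# Majority voting over the outputs of redundantly executed tasks.
from collections import Counter

def voted_result(replica_outputs):
    """Return the strict-majority value among redundant task outputs.

    Raises RuntimeError when no strict majority exists, i.e. a detected
    but uncorrectable fault.
    """
    value, count = Counter(replica_outputs).most_common(1)[0]
    if count * 2 <= len(replica_outputs):
        raise RuntimeError("no majority: uncorrectable fault detected")
    return value

# One replica returns a faulty value; the voter masks it.
print(voted_result([42, 42, 41]))  # → 42
```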

  5. Advanced Computational Aeroacoustics Methods for Fan Noise Prediction

    NASA Technical Reports Server (NTRS)

    Envia, Edmane (Technical Monitor); Tam, Christopher

    2003-01-01

    Direct computation of fan noise is presently not possible. One of the major difficulties is the geometrical complexity of the problem. In the case of fan noise, the blade geometry is critical to the loading on the blade and hence the intensity of the radiated noise. The precise geometry must be incorporated into the computation. In computational fluid dynamics (CFD), there are two general ways to handle problems with complex geometry. One way is to use unstructured grids. The other is to use body-fitted overset grids. In the overset grid method, accurate data transfer is of utmost importance. For acoustic computation, it is not clear that the currently used data transfer methods are sufficiently accurate so as not to contaminate the very small amplitude acoustic disturbances. In CFD, low-order schemes are, invariably, used in conjunction with unstructured grids. However, low-order schemes are known to be numerically dispersive and dissipative, and dispersive and dissipative errors are extremely undesirable for acoustic wave problems. The objective of this project is to develop a high-order unstructured-grid Dispersion-Relation-Preserving (DRP) scheme that would minimize numerical dispersion and dissipation errors. This report contains the results of the funded portion of the project. A DRP scheme on an unstructured grid has been developed; the scheme is constructed in the wave number space, and its characteristics can be improved by the inclusion of additional constraints. Stability of the scheme has been investigated and can be improved by adopting an upwinding strategy.
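
    For reference, the standard DRP construction (due to Tam and Webb) chooses finite-difference stencil coefficients in wave-number space; the notation below is generic rather than taken from this report:

```latex
% Stencil approximation of the spatial derivative:
\[
  \frac{\partial u}{\partial x}(x) \;\approx\;
  \frac{1}{\Delta x} \sum_{j=-N}^{N} a_j\, u(x + j\,\Delta x)
\]
% Its effective (numerical) wavenumber:
\[
  \bar{\alpha}\,\Delta x \;=\; -\,i \sum_{j=-N}^{N} a_j\, e^{\,i j \alpha \Delta x}
\]
% The coefficients $a_j$ are chosen to minimize the integrated
% dispersion error over a band of wavenumbers:
\[
  E \;=\; \int_{-\eta}^{\eta}
    \left| \bar{\alpha}\,\Delta x - \alpha\,\Delta x \right|^2
    \, d(\alpha\,\Delta x)
\]
% so that $\bar{\alpha} \approx \alpha$ over that band, i.e. the scheme
% preserves the dispersion relation of the underlying equations.
```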

  6. Storage and network bandwidth requirements through the year 2000 for the NASA Center for Computational Sciences

    NASA Technical Reports Server (NTRS)

    Salmon, Ellen

    1996-01-01

    The data storage and retrieval demands of space and Earth sciences researchers have made the NASA Center for Computational Sciences (NCCS) Mass Data Storage and Delivery System (MDSDS) one of the world's most active Convex UniTree systems. Science researchers formed the NCCS's Computer Environments and Research Requirements Committee (CERRC) to relate their projected supercomputing and mass storage requirements through the year 2000. Using the CERRC guidelines and observations of current usage, some detailed projections of requirements for MDSDS network bandwidth and mass storage capacity and performance are presented.

  7. Project T.E.A.M. (Technical Education Advancement Modules). Introduction to Computers.

    ERIC Educational Resources Information Center

    Ellis, Brenda

    This instructional guide, one of a series developed by the Technical Education Advancement Modules (TEAM) project, is a 3-hour introduction to computers. The purpose is to develop the following competencies: (1) orientation to data processing; (2) use of data entry devices; (3) use of computer menus; and (4) entry of data with accuracy and…

  8. Advanced Computational Thermal Studies and their Assessment for Supercritical-Pressure Reactors (SCRs)

    SciTech Connect

    D. M. McEligot; J. Y. Yoo; J. S. Lee; S. T. Ro; E. Lurien; S. O. Park; R. H. Pletcher; B. L. Smith; P. Vukoslavcevic; J. M. Wallace

    2009-04-01

    The goal of this laboratory / university collaboration of coupled computational and experimental studies is the improvement of predictive methods for supercritical-pressure reactors. The general objective is to develop the supporting knowledge of advanced computational techniques needed for technology development of the concepts and their safety systems.

  9. Teaching Advanced Concepts in Computer Networks: VNUML-UM Virtualization Tool

    ERIC Educational Resources Information Center

    Ruiz-Martinez, A.; Pereniguez-Garcia, F.; Marin-Lopez, R.; Ruiz-Martinez, P. M.; Skarmeta-Gomez, A. F.

    2013-01-01

    In the teaching of computer networks the main problem that arises is the high price and limited number of network devices the students can work with in the laboratories. Nowadays, with virtualization we can overcome this limitation. In this paper, we present a methodology that allows students to learn advanced computer network concepts through…

  10. Advances in Engine Test Capabilities at the NASA Glenn Research Center's Propulsion Systems Laboratory

    NASA Technical Reports Server (NTRS)

    Pachlhofer, Peter M.; Panek, Joseph W.; Dicki, Dennis J.; Piendl, Barry R.; Lizanich, Paul J.; Klann, Gary A.

    2006-01-01

    The Propulsion Systems Laboratory at the National Aeronautics and Space Administration (NASA) Glenn Research Center is one of the premier U.S. facilities for research on advanced aeropropulsion systems. The facility can simulate a wide range of altitude and Mach number conditions while supplying the aeropropulsion system with all the support services necessary to operate at those conditions. Test data are recorded on a combination of steady-state and high-speed data-acquisition systems. Recently a number of upgrades were made to the facility to meet demanding new requirements for the latest aeropropulsion concepts and to improve operational efficiency. Improvements were made to data-acquisition systems, facility and engine-control systems, test-condition simulation systems, video capture and display capabilities, and personnel training procedures. This paper discusses the facility's capabilities, recent upgrades, and planned future improvements.

  11. The rehabilitation engineering research center for the advancement of cognitive technologies.

    PubMed

    Heyn, Patricia Cristine; Cassidy, Joy Lucille; Bodine, Cathy

    2015-02-01

    With few exceptions, allied health professionals, engineers, manufacturers of assistive technologies (ATs), and consumer product manufacturers have developed few technologies for individuals with cognitive impairments (CIs). In 2004, the National Institute on Disability and Rehabilitation Research (NIDRR) recognized the need to support research in this emergent field. It funded the first Rehabilitation Engineering Research Center for the Advancement of Cognitive Technologies (RERC-ACT). The RERC-ACT has since designed and evaluated existing and emerging technologies through rigorous research, improving upon existing AT devices and creating new technologies for individuals with CIs. The RERC-ACT has contributed to the development and testing of AT products that assist persons with CIs to actively engage in tasks of daily living at home, school, work, and in the community. This article highlights the RERC-ACT's engineering development and research projects and discusses how current research may impact the quality of life for an aging population.

  12. Testing of the Advanced Stirling Radioisotope Generator Engineering Unit at NASA Glenn Research Center

    NASA Technical Reports Server (NTRS)

    Lewandowski, Edward J.

    2013-01-01

    The Advanced Stirling Radioisotope Generator (ASRG) is a high-efficiency generator being developed for potential use on a Discovery 12 space mission. Lockheed Martin designed and fabricated the ASRG Engineering Unit (EU) under contract to the Department of Energy. This unit was delivered to NASA Glenn Research Center in 2008 and has been undergoing extended operation testing to generate long-term performance data for an integrated system. It has also been used for tests to characterize generator operation while varying control parameters and system inputs, both when controlled with an alternating current (AC) bus and with a digital controller. The ASRG EU currently has over 27,000 hours of operation. This paper summarizes all of the tests that have been conducted on the ASRG EU over the past 3 years and provides an overview of the test results and what was learned.

  13. Examining the Role of Gender in Career Advancement at the Centers for Disease Control and Prevention

    PubMed Central

    Roy, Kakoli; Gotway Crawford, Carol A.

    2010-01-01

    During the past decade, efforts to promote gender parity in the healing and public health professions have met with only partial success. We provide a critical update regarding the status of women in the public health profession by exploring gender-related differences in promotion rates at the nation's leading public health agency, the Centers for Disease Control and Prevention (CDC). Using personnel data drawn from CDC, we found that the gender gap in promotion has diminished across time and that this reduction can be attributed to changes in individual characteristics (e.g., higher educational levels and more federal work experience). However, a substantial gap in promotion that cannot be explained by such characteristics has persisted, indicating continuing barriers in women's career advancement. PMID:20075327

  14. Examining the role of gender in career advancement at the Centers for Disease Control and Prevention.

    PubMed

    Chen, Zhuo; Roy, Kakoli; Gotway Crawford, Carol A

    2010-03-01

    During the past decade, efforts to promote gender parity in the healing and public health professions have met with only partial success. We provide a critical update regarding the status of women in the public health profession by exploring gender-related differences in promotion rates at the nation's leading public health agency, the Centers for Disease Control and Prevention (CDC). Using personnel data drawn from CDC, we found that the gender gap in promotion has diminished across time and that this reduction can be attributed to changes in individual characteristics (e.g., higher educational levels and more federal work experience). However, a substantial gap in promotion that cannot be explained by such characteristics has persisted, indicating continuing barriers in women's career advancement.

  15. Bayesian Research at the NASA Ames Research Center, Computational Sciences Division

    NASA Technical Reports Server (NTRS)

    Morris, Robin D.

    2003-01-01

    NASA Ames Research Center is one of NASA's oldest centers, having started out as part of the National Advisory Committee for Aeronautics (NACA). The site, about 40 miles south of San Francisco, still houses many wind tunnels and other aviation-related departments. In recent years, with the growing realization that space exploration is heavily dependent on computing and data analysis, its focus has turned more towards Information Technology. The Computational Sciences Division has expanded rapidly as a result. In this article, I will give a brief overview of some of the past and present projects with a Bayesian content. Much more than is described here goes on within the Division. The web pages at http://ic.arc.nasa.gov give more information on these and the other Division projects.

  16. Development of computational methods for unsteady aerodynamics at the NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Yates, E. Carson, Jr.; Whitlow, Woodrow, Jr.

    1987-01-01

    The current scope, recent progress, and plans for research and development of computational methods for unsteady aerodynamics at the NASA Langley Research Center are reviewed. Both integral equations and finite difference methods for inviscid and viscous flows are discussed. Although the great bulk of the effort has focused on finite difference solution of the transonic small perturbation equation, the integral equation program is given primary emphasis here because it is less well known.

  17. Development of computational methods for unsteady aerodynamics at the NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Yates, E. Carson, Jr.; Whitlow, Woodrow, Jr.

    1987-01-01

    The current scope, recent progress, and plans for research and development of computational methods for unsteady aerodynamics at the NASA Langley Research Center are reviewed. Both integral-equation and finite-difference methods for inviscid and viscous flows are discussed. Although the great bulk of the effort has focused on finite-difference solution of the transonic small-perturbation equation, the integral-equation program is given primary emphasis here because it is less well known.

  18. The psychology of computer displays in the modern mission control center

    NASA Technical Reports Server (NTRS)

    Granaas, Michael M.; Rhea, Donald C.

    1988-01-01

    Work at NASA's Western Aeronautical Test Range (WATR) has demonstrated the need for increased consideration of psychological factors in the design of computer displays for the WATR mission control center. These factors include color perception, memory load, and cognitive processing abilities. A review of relevant work in the human factors psychology area is provided to demonstrate the need for this awareness. The information provided should be relevant in control room settings where computerized displays are being used.

  19. Advances on modelling of ITER scenarios: physics and computational challenges

    NASA Astrophysics Data System (ADS)

    Giruzzi, G.; Garcia, J.; Artaud, J. F.; Basiuk, V.; Decker, J.; Imbeaux, F.; Peysson, Y.; Schneider, M.

    2011-12-01

    Methods and tools for design and modelling of tokamak operation scenarios are discussed with particular application to ITER advanced scenarios. Simulations of hybrid and steady-state scenarios performed with the integrated tokamak modelling suite of codes CRONOS are presented. The advantages of a possible steady-state scenario based on cyclic operations, alternating phases of positive and negative loop voltage, with no magnetic flux consumption on average, are discussed. For regimes in which current alignment is an issue, a general method for scenario design is presented, based on the characteristics of the poloidal current density profile.

  20. National facility for advanced computational science: A sustainable path to scientific discovery

    SciTech Connect

    Simon, Horst; Kramer, William; Saphir, William; Shalf, John; Bailey, David; Oliker, Leonid; Banda, Michael; McCurdy, C. William; Hules, John; Canning, Andrew; Day, Marc; Colella, Philip; Serafini, David; Wehner, Michael; Nugent, Peter

    2004-04-02

    Lawrence Berkeley National Laboratory (Berkeley Lab) proposes to create a National Facility for Advanced Computational Science (NFACS) and to establish a new partnership between the American computer industry and a national consortium of laboratories, universities, and computing facilities. NFACS will provide leadership-class scientific computing capability to scientists and engineers nationwide, independent of their institutional affiliation or source of funding. This partnership will bring into existence a new class of computational capability in the United States that is optimal for science and will create a sustainable path towards petaflops performance.

  1. Using Advanced Computer Vision Algorithms on Small Mobile Robots

    DTIC Science & Technology

    2006-04-20

    Lab. This classification algorithm has been used to successfully detect the license plates of automobiles in motion in real-time. While working...use in real-time. Test results are shown for a variety of environments. KEYWORDS: robotics, computer vision, car/license plate detection, SIFT...when detecting the make and model of automobiles, SIFT can be used to achieve very high detection rates at the expense of a hefty performance cost when

  2. Cogeneration computer model assessment: Advanced cogeneration research study

    NASA Technical Reports Server (NTRS)

    Rosenberg, L.

    1983-01-01

    Cogeneration computer simulation models were assessed to recommend the most desirable models or their components for use by the Southern California Edison Company (SCE) in evaluating potential cogeneration projects. Existing cogeneration modeling capabilities are described, preferred models are identified, and an approach to the development of a code which will best satisfy SCE requirements is recommended. Five models (CELCAP, COGEN 2, CPA, DEUS, and OASIS) are recommended for further consideration.

  3. Advanced Computational Methods for Security Constrained Financial Transmission Rights

    SciTech Connect

    Kalsi, Karanjit; Elbert, Stephen T.; Vlachopoulou, Maria; Zhou, Ning; Huang, Zhenyu

    2012-07-26

    Financial Transmission Rights (FTRs) are financial insurance tools to help power market participants reduce price risks associated with transmission congestion. FTRs are issued based on a process of solving a constrained optimization problem with the objective to maximize the FTR social welfare under power flow security constraints. Security constraints for different FTR categories (monthly, seasonal or annual) are usually coupled and the number of constraints increases exponentially with the number of categories. Commercial software for FTR calculation can only provide limited categories of FTRs due to the inherent computational challenges mentioned above. In this paper, first an innovative mathematical reformulation of the FTR problem is presented which dramatically improves the computational efficiency of the optimization problem. After having re-formulated the problem, a novel non-linear dynamic system (NDS) approach is proposed to solve the optimization problem. The new formulation and the performance of the NDS solver are benchmarked against widely used linear programming (LP) solvers like CPLEX™ and tested on both standard IEEE test systems and large-scale systems using data from the Western Electricity Coordinating Council (WECC). The performance of the NDS is demonstrated to be comparable to, and in some cases to outperform, the widely used CPLEX algorithms. The proposed formulation and NDS-based solver are also easily parallelizable, enabling further computational improvement.
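The auction structure sketched in the abstract (maximize bid-weighted FTR social welfare subject to line-flow security limits) can be illustrated as a toy linear program. The bid data, PTDF sensitivities, and line limits below are invented for illustration only, and SciPy's `linprog` stands in for the commercial LP solvers (e.g., CPLEX) the paper uses as benchmarks:

```python
# Hypothetical toy FTR auction: award MW to bids so that total bid value
# (social welfare) is maximized without violating line-flow limits.
import numpy as np
from scipy.optimize import linprog

bids = np.array([30.0, 25.0, 20.0])        # $/MW bid price per FTR bid
max_award = np.array([100.0, 80.0, 50.0])  # MW requested per bid

# Power-transfer distribution factors: flow impact of each award on two lines
ptdf = np.array([[0.5, 0.3, 0.2],
                 [0.2, 0.6, 0.4]])
line_limits = np.array([60.0, 70.0])       # MW security limits

# linprog minimizes, so negate the bid vector to maximize welfare
res = linprog(c=-bids,
              A_ub=ptdf, b_ub=line_limits,
              bounds=[(0.0, m) for m in max_award])

print("awarded MW:", np.round(res.x, 2))   # -> [62.5, 62.5, 50.0]
print("welfare: $", round(-res.fun, 2))    # -> 4437.5
```

Both lines end up binding at the optimum, so the LP duals on the two security constraints can be read as congestion prices; the real problem differs mainly in scale, with coupled constraints across the monthly/seasonal/annual categories driving the computational burden described above.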

  4. Parallel computing in genomic research: advances and applications

    PubMed Central

    Ocaña, Kary; de Oliveira, Daniel

    2015-01-01

    Today’s genomic experiments have to process the so-called “biological big data” that is now reaching the size of terabytes and petabytes. To process this huge amount of data, scientists may require weeks or months if they use their own workstations. Parallelism techniques and high-performance computing (HPC) environments can be applied to reduce the total processing time and to ease the management, treatment, and analysis of this data. However, running bioinformatics experiments in HPC environments such as clouds, grids, clusters, and graphics processing units requires expertise from scientists to integrate computational, biological, and mathematical techniques and technologies. Several solutions have already been proposed to allow scientists to process their genomic experiments using HPC capabilities and parallelism techniques. This article presents a systematic review of the literature, surveying the most recently published research involving genomics and parallel computing. Our objective is to gather the main characteristics, benefits, and challenges that can be considered by scientists when running their genomic experiments to benefit from parallelism techniques and HPC capabilities. PMID:26604801

  5. Parallel computing in genomic research: advances and applications.

    PubMed

    Ocaña, Kary; de Oliveira, Daniel

    2015-01-01

    Today's genomic experiments have to process the so-called "biological big data" that is now reaching the size of terabytes and petabytes. To process this huge amount of data, scientists may require weeks or months if they use their own workstations. Parallelism techniques and high-performance computing (HPC) environments can be applied to reduce the total processing time and to ease the management, treatment, and analysis of this data. However, running bioinformatics experiments in HPC environments such as clouds, grids, clusters, and graphics processing units requires expertise from scientists to integrate computational, biological, and mathematical techniques and technologies. Several solutions have already been proposed to allow scientists to process their genomic experiments using HPC capabilities and parallelism techniques. This article presents a systematic review of the literature, surveying the most recently published research involving genomics and parallel computing. Our objective is to gather the main characteristics, benefits, and challenges that can be considered by scientists when running their genomic experiments to benefit from parallelism techniques and HPC capabilities.
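The basic parallelism pattern the review surveys can be sketched in a few lines: many genomic workloads are embarrassingly parallel per sequence, so independent tasks can be fanned out over worker processes. The toy GC-content computation below is an illustration of that pattern (it does not come from the article, and a real pipeline would distribute far larger chunks over cluster nodes rather than local cores):

```python
# Minimal sketch of per-sequence parallelism using the Python standard library.
from multiprocessing import Pool

def gc_content(seq):
    """Fraction of G/C bases in a single DNA sequence."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

if __name__ == "__main__":
    sequences = ["ATGCGC", "AATT", "GGGCCC", "ATGCAT"]  # toy input data
    with Pool(processes=2) as pool:       # 2 workers; scale to the core count
        results = pool.map(gc_content, sequences)
    print(results)  # one GC fraction per input sequence
```

Because each task is independent, the same code scales by raising the worker count, which is the core idea behind the cluster-, grid-, and cloud-based solutions the review compares.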

  6. Vision 20/20: Automation and advanced computing in clinical radiation oncology

    SciTech Connect

    Moore, Kevin L. Moiseenko, Vitali; Kagadis, George C.; McNutt, Todd R.; Mutic, Sasa

    2014-01-15

    This Vision 20/20 paper considers what computational advances are likely to be implemented in clinical radiation oncology in the coming years and how the adoption of these changes might alter the practice of radiotherapy. Four main areas of likely advancement are explored: cloud computing, aggregate data analyses, parallel computation, and automation. As these developments promise both new opportunities and new risks to clinicians and patients alike, the potential benefits are weighed against the hazards associated with each advance, with special considerations regarding patient safety under new computational platforms and methodologies. While the concerns of patient safety are legitimate, the authors contend that progress toward next-generation clinical informatics systems will bring about extremely valuable developments in quality improvement initiatives, clinical efficiency, outcomes analyses, data sharing, and adaptive radiotherapy.

  7. Crop Production for Advanced Life Support Systems - Observations From the Kennedy Space Center Breadboard Project

    NASA Technical Reports Server (NTRS)

    Wheeler, R. M.; Sager, J. C.; Prince, R. P.; Knott, W. M.; Mackowiak, C. L.; Stutte, G. W.; Yorio, N. C.; Ruffe, L. M.; Peterson, B. V.; Goins, G. D.

    2003-01-01

    The use of plants for bioregenerative life support for space missions was first studied by the US Air Force in the 1950s and 1960s. Extensive testing was also conducted from the 1960s through the 1980s by Russian researchers located at the Institute of Biophysics in Krasnoyarsk, Siberia, and the Institute for Biomedical Problems in Moscow. NASA initiated bioregenerative research in the 1960s (e.g., Hydrogenomonas) but this research did not include testing with plants until about 1980, with the start of the Controlled Ecological Life Support System (CELSS) Program. The NASA CELSS research was carried out at universities, private corporations, and NASA field centers, including Kennedy Space Center (KSC). The project at KSC began in 1985 and was called the CELSS Breadboard Project to indicate the capability for plugging in and testing various life support technologies; this name has since been dropped but bioregenerative testing at KSC has continued to the present under NASA's Advanced Life Support (ALS) Program. A primary objective of the KSC testing was to conduct pre-integration tests with plants (crops) in a large, atmospherically closed test chamber called the Biomass Production Chamber (BPC). Test protocols for the BPC were based on observations and growing procedures developed by university investigators, as well as procedures developed in plant growth chamber studies at KSC. Growth chamber studies to support BPC testing focused on plant responses to different carbon dioxide (CO2) concentrations, different spectral qualities from various electric lamps, and nutrient film hydroponic culture techniques.

  8. Center to Advance Palliative Care palliative care clinical care and customer satisfaction metrics consensus recommendations.

    PubMed

    Weissman, David E; Morrison, R Sean; Meier, Diane E

    2010-02-01

    Data collection and analysis are vital for strategic planning, quality improvement, and demonstration of palliative care program impact to hospital administrators, private funders and policymakers. Since 2000, the Center to Advance Palliative Care (CAPC) has provided technical assistance to hospitals, health systems and hospices working to start, sustain, and grow nonhospice palliative care programs. CAPC convened a consensus panel in 2008 to develop recommendations for specific clinical and customer metrics that programs should track. The panel agreed on four key domains of clinical metrics and two domains of customer metrics. Clinical metrics include: daily assessment of physical/psychological/spiritual symptoms by a symptom assessment tool; establishment of patient-centered goals of care; support to patient/family caregivers; and management of transitions across care sites. For customer metrics, consensus was reached on two domains that should be tracked to assess satisfaction: patient/family satisfaction, and referring clinician satisfaction. In an effort to ensure access to reliably high-quality palliative care data throughout the nation, hospital palliative care programs are encouraged to collect and report outcomes for each of the metric domains described here.

  9. Fulvestrant in advanced breast cancer following tamoxifen and aromatase inhibition: a single center experience.

    PubMed

    Wang, Jayson; Jain, Sandeep; Coombes, Charles R; Palmieri, Carlo

    2009-01-01

    Fulvestrant is a pure estrogen receptor (ER) antagonist with no agonist effects. We describe the experience of a single center involving 45 postmenopausal women with advanced breast cancer where fulvestrant was utilized following progression on tamoxifen and a third-generation aromatase inhibitor. Patients received fulvestrant as first-line (one patient, 2%), second-line (18 patients, 40%), third-line (13 patients, 29%), fourth-line (10 patients, 22%), and fifth-line (three patients, 7%) treatment. Median duration of treatment with fulvestrant was 4 months (range 1-20 months). One patient had a partial response; 14 others (31%) experienced clinical benefit (CB) (defined as response or stable disease for at least 6 months). The median time to progression (TTP) from initiation of fulvestrant was 4 months (range 1-20 months) and the median survival was 10 months (range 1-55 months). In those patients who experienced CB, the median TTP was 10 months (range 6-20) and median survival was 21 months (range 7-55). Fulvestrant was well tolerated; two patients experienced side effects severe enough to stop therapy. Despite the fact that fulvestrant was used late in the treatment sequence in the majority of cases, CB was seen in a number of patients. These data suggest fulvestrant is well tolerated and is a useful treatment option in patients with advanced breast cancer who progress on prior endocrine treatment.

  10. An Overview of Advanced Elastomeric Seal Development and Testing Capabilities at NASA Glenn Research Center

    NASA Technical Reports Server (NTRS)

    Dunlap, Patrick H.

    2014-01-01

    NASA is developing advanced space-rated elastomeric seals to support future space exploration missions to low Earth orbit, the Moon, near Earth asteroids, and other destinations. This includes seals for a new docking system and vehicle hatches. These seals must exhibit extremely low leak rates to ensure that astronauts have sufficient breathable air for extended missions. Seal compression loads must be below prescribed limits so as not to overload the mechanisms that compress them, and seal adhesion forces must be low to allow the sealed interface to be separated when required (e.g., during undocking or hatch opening). NASA Glenn Research Center has developed a number of unique test fixtures to measure the leak rates and compression and adhesion loads of candidate seal designs under simulated thermal, vacuum, and engagement conditions. Tests can be performed on full-scale seals with diameters on the order of 50 in., subscale seals that are about 12 in. in diameter, and smaller specimens such as O-rings. Test conditions include temperatures ranging from -238 to 662 F (-150 to 350 C), operational pressure gradients, and seal-on-seal or seal-on-flange mating configurations. Nominal and off-nominal conditions (e.g., incomplete seal compression) can also be simulated. This paper describes the main design features and capabilities of each type of test apparatus and provides an overview of advanced seal development activities at NASA Glenn.

  11. NASA Glenn Research Center Support of the Advanced Stirling Radioisotope Generator Project

    NASA Technical Reports Server (NTRS)

    Wilson, Scott D.; Wong, Wayne A.

    2015-01-01

    A high-efficiency radioisotope power system was being developed for long-duration NASA space science missions. The U.S. Department of Energy (DOE) managed a flight contract with Lockheed Martin Space Systems Company to build Advanced Stirling Radioisotope Generators (ASRGs), with support from NASA Glenn Research Center. DOE initiated termination of that contract in late 2013, primarily due to budget constraints. Sunpower, Inc., held two parallel contracts to produce Advanced Stirling Convertors (ASCs), one with Lockheed Martin to produce ASC-F flight units, and one with Glenn for the production of ASC-E3 engineering unit "pathfinders" that are built to the flight design. In support of those contracts, Glenn provided testing, materials expertise, Government-furnished equipment, inspection capabilities, and related data products to Lockheed Martin and Sunpower. The technical support included material evaluations, component tests, convertor characterization, and technology transfer. Material evaluations and component tests were performed on various ASC components in order to assess potential life-limiting mechanisms and provide data for reliability models. Convertor level tests were conducted to characterize performance under operating conditions that are representative of various mission conditions. Despite termination of the ASRG flight development contract, NASA continues to recognize the importance of high-efficiency ASC power conversion for Radioisotope Power Systems (RPS) and continues investment in the technology, including the continuation of the ASC-E3 contract. This paper describes key Government support for the ASRG project and future tests to be used to provide data for ongoing reliability assessments.

  12. Test Rack Development for Extended Operation of Advanced Stirling Convertors at NASA Glenn Research Center

    NASA Technical Reports Server (NTRS)

    Dugala, Gina M.

    2010-01-01

    The U.S. Department of Energy, Lockheed Martin Space Systems Company, Sunpower Inc., and NASA Glenn Research Center (GRC) have been developing an Advanced Stirling Radioisotope Generator (ASRG) for use as a power system on space science missions. This generator will make use of free-piston Stirling convertors to achieve higher conversion efficiency than with currently available alternatives. One part of NASA GRC's support of ASRG development includes extended operation testing of Advanced Stirling Convertors (ASCs) developed by Sunpower Inc. and GRC. The ASC consists of a free-piston Stirling engine integrated with a linear alternator. NASA GRC has been building test facilities to support extended operation of the ASCs for several years. Operation of the convertors in the test facility provides convertor performance data over an extended period of time. One part of the test facility is the test rack, which provides a means for data collection, convertor control, and safe operation. Over the years, the test rack requirements have changed. The initial ASC test rack utilized an alternating-current (AC) bus for convertor control; the ASRG Engineering Unit (EU) test rack can operate with AC bus control or with an ASC Control Unit (ACU). A new test rack is being developed to support extended operation of the ASC-E2s with higher standards of documentation, component selection, and assembly practices. This paper discusses the differences among the ASC, ASRG EU, and ASC-E2 test racks.

  13. Test Rack Development for Extended Operation of Advanced Stirling Convertors at NASA Glenn Research Center

    NASA Technical Reports Server (NTRS)

    Dugala, Gina M.

    2009-01-01

    The U.S. Department of Energy (DOE), Lockheed Martin Space Company (LMSC), Sunpower Inc., and NASA Glenn Research Center (GRC) have been developing an Advanced Stirling Radioisotope Generator (ASRG) for use as a power system on space science missions. This generator will make use of free-piston Stirling convertors to achieve higher conversion efficiency than currently available alternatives. NASA GRC's support of ASRG development includes extended operation testing of Advanced Stirling Convertors (ASCs) developed by Sunpower Inc. In the past year, NASA GRC has been building a test facility to support extended operation of a pair of engineering-level ASCs. Operation of the convertors in the test facility provides convertor performance data over an extended period of time. Mechanical support hardware, data acquisition software, and an instrumentation rack were developed to prepare the pair of convertors for continuous extended operation. Short-term tests were performed to gather baseline performance data before extended operation was initiated. These tests included workmanship vibration, insulation thermal loss characterization, low-temperature checkout, and full-power operation. Hardware and software features are implemented to ensure reliability of support systems. This paper discusses the mechanical support hardware, instrumentation rack, data acquisition software, short-term tests, and safety features designed to support continuous unattended operation of a pair of ASCs.

  14. An Overview of Advanced Elastomeric Seal Development and Testing Capabilities at NASA Glenn Research Center

    NASA Technical Reports Server (NTRS)

    Dunlap, Patrick H., Jr.

    2014-01-01

    NASA is developing advanced space-rated elastomeric seals to support future space exploration missions to low Earth orbit, the Moon, near Earth asteroids, and other destinations. This includes seals for a new docking system and vehicle hatches. These seals must exhibit extremely low leak rates to ensure that astronauts have sufficient breathable air for extended missions. Seal compression loads must be below prescribed limits so as not to overload the mechanisms that compress them, and seal adhesion forces must be low to allow the sealed interface to be separated when required (e.g., during undocking or hatch opening). NASA Glenn Research Center has developed a number of unique test fixtures to measure the leak rates and compression and adhesion loads of candidate seal designs under simulated thermal, vacuum, and engagement conditions. Tests can be performed on full-scale seals with diameters on the order of 50 in., subscale seals that are about 12 in. in diameter, and smaller specimens such as O-rings. Test conditions include temperatures ranging from -238 to 662 F (-150 to 350 C), operational pressure gradients, and seal-on-seal or seal-on-flange mating configurations. Nominal and off-nominal conditions (e.g., incomplete seal compression) can also be simulated. This paper describes the main design features and capabilities of each test apparatus and provides an overview of advanced seal development activities at NASA Glenn.

  15. An Overview of Advanced Elastomeric Seal Development and Testing Capabilities at NASA Glenn Research Center

    NASA Technical Reports Server (NTRS)

    Dunlap, Patrick H., Jr.

    2014-01-01

    NASA is developing advanced space-rated elastomeric seals to support future space exploration missions to low Earth orbit, the Moon, near Earth asteroids, and other destinations. This includes seals for a new docking system and vehicle hatches. These seals must exhibit extremely low leak rates to ensure that astronauts have sufficient breathable air for extended missions. Seal compression loads must be below prescribed limits so as not to overload the mechanisms that compress them, and seal adhesion forces must be low to allow the sealed interface to be separated when required (e.g., during undocking or hatch opening). NASA Glenn Research Center has developed a number of unique test fixtures to measure the leak rates and compression and adhesion loads of candidate seal designs under simulated thermal, vacuum, and engagement conditions. Tests can be performed on full-scale seals with diameters on the order of 50 in., subscale seals that are about 12 in. in diameter, and smaller specimens such as O-rings. Test conditions include temperatures ranging from -238 to 662 F (-150 to 350 C), operational pressure gradients, and seal-on-seal or seal-on-flange mating configurations. Nominal and off-nominal conditions (e.g., incomplete seal compression) can also be simulated. This paper describes the main design features and capabilities of each type of test apparatus and provides an overview of advanced seal development activities at NASA Glenn.

  16. Test Platform for Advanced Digital Control of Brushless DC Motors (MSFC Center Director's Discretionary Fund)

    NASA Technical Reports Server (NTRS)

    Gwaltney, D. A.

    2002-01-01

    A FY 2001 Center Director's Discretionary Fund task to develop a test platform for the development, implementation, and evaluation of adaptive and other advanced control techniques for brushless DC (BLDC) motor-driven mechanisms is described. Important applications for BLDC motor-driven mechanisms are the translation of specimens in microgravity experiments and electromechanical actuation of nozzle and fuel valves in propulsion systems. Motor-driven aerocontrol surfaces are also being utilized in developmental X vehicles. The experimental test platform employs a linear translation stage that is mounted vertically and driven by a BLDC motor. Control approaches are implemented on a digital signal processor-based controller for real-time, closed-loop control of the stage carriage position. The goal of the effort is to explore the application of advanced control approaches that can enhance the performance of a motor-driven actuator over the performance obtained using linear control approaches with fixed gains. Adaptive controllers, including an exact-model-knowledge controller and a self-tuning controller, are implemented, and the control system performance is illustrated through the presentation of experimental results.
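The fixed-gain linear control that the adaptive approaches are compared against can be sketched in simulation. The carriage model, sample rate, and PID gains below are illustrative assumptions, not the MSFC hardware values; the sketch shows a discrete closed-loop position controller of the kind run on the DSP-based controller:

```python
# Toy fixed-gain digital PID position loop on a simulated carriage
# (a mass driven by motor force). All numeric values are hypothetical.
dt = 0.001                      # 1 kHz control-loop period, s
mass = 0.5                      # carriage mass, kg
kp, ki, kd = 400.0, 50.0, 40.0  # fixed PID gains

pos, vel = 0.0, 0.0             # plant state: position (m), velocity (m/s)
target = 0.1                    # commanded carriage position, m
integ = 0.0
prev_err = target - pos         # avoids a derivative kick on the first step

for _ in range(5000):           # simulate 5 s of closed-loop operation
    err = target - pos
    integ += err * dt
    deriv = (err - prev_err) / dt
    force = kp * err + ki * integ + kd * deriv  # PID control law
    prev_err = err
    vel += (force / mass) * dt  # F = m*a, Euler-integrated plant dynamics
    pos += vel * dt

print(round(pos, 4))            # settles near the 0.1 m target
```

An adaptive or self-tuning controller would instead update `kp`, `ki`, and `kd` online from an identified plant model, which is what lets it track changes in load or friction that degrade a fixed-gain design.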

  17. Processing and Preparation of Advanced Stirling Convertors for Extended Operation at NASA Glenn Research Center

    NASA Technical Reports Server (NTRS)

    Oriti, Salvatore M.; Cornell, Peggy A.

    2008-01-01

    The U.S. Department of Energy (DOE), Lockheed Martin Space Company (LMSC), Sunpower Inc., and NASA Glenn Research Center (GRC) have been developing an Advanced Stirling Radioisotope Generator (ASRG) for use as a power system on space science missions. This generator will make use of free-piston Stirling convertors to achieve higher conversion efficiency than currently available alternatives. NASA GRC is supporting the development of the ASRG by providing extended operation of several Sunpower Inc. Advanced Stirling Convertors (ASCs). In the past year and a half, eight ASCs have operated in continuous, unattended mode in both air and thermal vacuum environments. Hardware, software, and procedures were developed to prepare each convertor for extended operation with intended durations on the order of tens of thousands of hours. Steps taken to prepare a convertor for long-term operation included geometry measurements, thermocouple instrumentation, evaluation of working fluid purity, evacuation with bakeout, and high purity charge. Actions were also taken to ensure the reliability of support systems, such as data acquisition and automated shutdown checkouts. Once a convertor completed these steps, it underwent short-term testing to gather baseline performance data before initiating extended operation. These tests included insulation thermal loss characterization, low-temperature checkout, and full-temperature and power demonstration. This paper discusses the facilities developed to support continuous, unattended operation, and the processing results of the eight ASCs currently on test.

  18. Meaning-centered group psychotherapy for patients with advanced cancer: a pilot randomized controlled trial

    PubMed Central

    Breitbart, William; Rosenfeld, Barry; Gibson, Christopher; Pessin, Hayley; Poppito, Shannon; Nelson, Christian; Tomarken, Alexis; Timm, Anne Kosinski; Berg, Amy; Jacobson, Colleen; Sorger, Brooke; Abbey, Jennifer; Olden, Megan

    2013-01-01

    Objectives An increasingly important concern for clinicians who care for patients at the end of life is their spiritual well-being and sense of meaning and purpose in life. In response to the need for short-term interventions to address spiritual well-being, we developed Meaning-Centered Group Psychotherapy (MCGP) to help patients with advanced cancer sustain or enhance a sense of meaning, peace and purpose in their lives, even as they approach the end of life. Methods Patients with advanced (stage III or IV) solid tumor cancers (N = 90) were randomly assigned to either MCGP or a supportive group psychotherapy (SGP). Patients were assessed before and after completing the 8-week intervention, and again 2 months after completion. Outcome assessment included measures of spiritual well-being, meaning, hopelessness, desire for death, optimism/pessimism, anxiety, depression and overall quality of life. Results MCGP resulted in significantly greater improvements in spiritual well-being and a sense of meaning. Treatment gains were even more substantial (based on effect size estimates) at the second follow-up assessment. Improvements in anxiety and desire for death were also significant (and increased over time). There was no significant improvement on any of these variables for patients participating in SGP. Conclusions MCGP appears to be a potentially beneficial intervention for patients’ emotional and spiritual suffering at the end of life. Further research, with larger samples, is clearly needed to better understand the potential benefits of this novel intervention. PMID:19274623

  19. Pilot Randomized Controlled Trial of Individual Meaning-Centered Psychotherapy for Patients With Advanced Cancer

    PubMed Central

    Breitbart, William; Poppito, Shannon; Rosenfeld, Barry; Vickers, Andrew J.; Li, Yuelin; Abbey, Jennifer; Olden, Megan; Pessin, Hayley; Lichtenthal, Wendy; Sjoberg, Daniel; Cassileth, Barrie R.

    2012-01-01

    Purpose Spiritual well-being and sense of meaning are important concerns for clinicians who care for patients with cancer. We developed Individual Meaning-Centered Psychotherapy (IMCP) to address the need for brief interventions targeting spiritual well-being and meaning for patients with advanced cancer. Patients and Methods Patients with stage III or IV cancer (N = 120) were randomly assigned to seven sessions of either IMCP or therapeutic massage (TM). Patients were assessed before and after completing the intervention and 2 months postintervention. Primary outcome measures assessed spiritual well-being and quality of life; secondary outcomes included anxiety, depression, hopelessness, symptom burden, and symptom-related distress. Results Of the 120 participants randomly assigned, 78 (65%) completed the post-treatment assessment and 67 (56%) completed the 2-month follow-up. At the post-treatment assessment, IMCP participants demonstrated significantly greater improvement than the control condition for the primary outcomes of spiritual well-being (b = 0.39; P < .001), including both components of spiritual well-being (sense of meaning: b = 0.34; P = .003; faith: b = 0.42; P = .03), and quality of life (b = 0.76; P = .013). Significantly greater improvements for IMCP patients were also observed for the secondary outcomes of symptom burden (b = −6.56; P < .001) and symptom-related distress (b = −0.47; P < .001) but not for anxiety, depression, or hopelessness. At the 2-month follow-up assessment, the improvements observed for the IMCP group were no longer significantly greater than those observed for the TM group. Conclusion IMCP has clear short-term benefits for spiritual suffering and quality of life in patients with advanced cancer. Clinicians working with patients who have advanced cancer should consider IMCP as an approach to enhance quality of life and spiritual well-being. PMID:22370330

  20. Eye center localization and gaze gesture recognition for human-computer interaction.

    PubMed

    Zhang, Wenhao; Smith, Melvyn L; Smith, Lyndon N; Farooq, Abdul

    2016-03-01

    This paper introduces an unsupervised modular approach for accurate and real-time eye center localization in images and videos, thus allowing a coarse-to-fine, global-to-regional scheme. The trajectories of eye centers in consecutive frames, i.e., gaze gestures, are further analyzed, recognized, and employed to boost the human-computer interaction (HCI) experience. This modular approach makes use of isophote and gradient features to estimate the eye center locations. A selective oriented gradient filter has been specifically designed to remove strong gradients from eyebrows, eye corners, and shadows, which sabotage most eye center localization methods. A real-world implementation utilizing these algorithms has been designed in the form of an interactive advertising billboard to demonstrate the effectiveness of our method for HCI. The eye center localization algorithm has been compared with 10 other algorithms on the BioID database and six other algorithms on the GI4E database. It outperforms all the other algorithms in comparison in terms of localization accuracy. Further tests on the Extended Yale Face Database B and self-collected data have proved this algorithm to be robust against moderate head poses and poor illumination conditions. The interactive advertising billboard has manifested outstanding usability and effectiveness in our tests and shows great potential for benefiting a wide range of real-world HCI applications.
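
As a rough illustration of gradient-based eye-center estimation (a simplified voting scheme in the same spirit, not the paper's isophote method or its selective oriented gradient filter), each candidate center can be scored by how well the displacement vectors to edge pixels align with the image gradients there; for a dark pupil on a bright surround, both point radially outward from the true center:

```python
import math

def make_pupil_image(size=21, cy=10, cx=10, r=5.0, slope=60.0):
    """Synthetic eye patch: dark disc (pupil) with a smooth bright surround."""
    return [[min(255.0, max(0.0, (math.hypot(y - cy, x - cx) - r) * slope))
             for x in range(size)] for y in range(size)]

def locate_eye_center(img):
    """Score candidates by alignment of displacement vectors with gradients,
    weighted toward dark pixels (pupils are dark). Simplified sketch only."""
    h, w = len(img), len(img[0])
    # Central-difference gradients, kept only where the magnitude is nonzero.
    grads = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = (img[y][x + 1] - img[y][x - 1]) / 2.0
            gy = (img[y + 1][x] - img[y - 1][x]) / 2.0
            m = math.hypot(gx, gy)
            if m > 1e-6:
                grads.append((y, x, gy / m, gx / m))
    best, best_score = None, -1.0
    for cy in range(1, h - 1):
        for cx in range(1, w - 1):
            score = 0.0
            for y, x, ny, nx in grads:
                dy, dx = y - cy, x - cx
                d = math.hypot(dy, dx)
                if d < 1e-6:
                    continue
                dot = (dy / d) * ny + (dx / d) * nx
                if dot > 0:
                    score += dot * dot
            score *= (255.0 - img[cy][cx])  # prefer dark candidate pixels
            if score > best_score:
                best, best_score = (cy, cx), score
    return best

img = make_pupil_image()
center = locate_eye_center(img)
print(center)  # at or near the true pupil center (10, 10)
```

Real eye images add eyebrows, corners, and shadows, which is exactly why the paper introduces its filtering stage; this toy version works only because the synthetic patch is clean.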

  1. Computational Approaches to Enhance Nanosafety and Advance Nanomedicine

    NASA Astrophysics Data System (ADS)

    Mendoza, Eduardo R.

    With the increasing use of nanoparticles in food processing, filtration/purification and consumer products, as well as the huge potential of their use in nanomedicine, a quantitative understanding of the effects of nanoparticle uptake and transport is needed. We provide examples of novel methods for modeling complex bio-nano interactions which are based on stochastic process algebras. Since model construction presumes sufficient availability of experimental data, recent developments in "nanoinformatics", an emerging discipline analogous to bioinformatics, in building an accessible information infrastructure are subsequently discussed. Both computational areas offer opportunities for Filipinos to engage in collaborative, cutting-edge research in this impactful field.

  2. First Responders Guide to Computer Forensics: Advanced Topics

    DTIC Science & Technology

    2005-09-01

    server of the sender, the mail server of the receiver, and the computer that receives the email. Assume that Alice wants to send an email to her friend... pleased to meet you
        MAIL FROM: alice.price@alphanet.com
        250 alice.price@alphanet.com... Sender ok
        RCPT TO: bob.doe@betanet.com
        250 bob.doe...betanet.com... Sender ok
        DATA
        354 Please start mail input
        From: alice.price@alphanet.com
        To: bob.doe@betanet.com
        Subject: Lunch
        Bob, It was good
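
The transcript above is the standard SMTP envelope exchange. Python's standard library can rebuild the same message, and the envelope commands a client issues look like this (a sketch; `smtplib` performs the full dialogue automatically when actually sending, and the addresses are taken from the example above):

```python
from email.message import EmailMessage

# Rebuild the message from the transcript above.
msg = EmailMessage()
msg["From"] = "alice.price@alphanet.com"
msg["To"] = "bob.doe@betanet.com"
msg["Subject"] = "Lunch"
msg.set_content("Bob,\nIt was good ...")

# Envelope commands an SMTP client sends for this message; each is answered
# by a server status line (250 = OK, 354 = start mail input).
commands = [
    f"MAIL FROM:<{msg['From']}>",
    f"RCPT TO:<{msg['To']}>",
    "DATA",
]
print("\n".join(commands))

# smtplib.SMTP("mail.alphanet.com").send_message(msg) would carry out the
# dialogue against a real server (hostname hypothetical).
```

Forensically, the point of the original report is that each hop in this exchange (sender's client, both mail servers, recipient's client) can retain its own record of these commands and headers.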

  3. Advanced Computational Methods for Thermal Radiative Heat Transfer

    SciTech Connect

    Tencer, John; Carlberg, Kevin Thomas; Larsen, Marvin E.; Hogan, Roy E.

    2016-10-01

    Participating media radiation (PMR) calculations in weapon safety analyses for abnormal thermal environments are too costly to perform routinely. This cost may be substantially reduced by applying reduced order modeling (ROM) techniques. The application of ROM to PMR is a new and unique approach for this class of problems. This approach was investigated by the authors and shown to provide significant reductions in the computational expense associated with typical PMR simulations. Once this technology is migrated into production heat transfer analysis codes, this capability will enable the routine use of PMR heat transfer in higher-fidelity simulations of weapon response in fire environments.
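
Reduced order modeling of this kind typically projects the full field onto a few dominant modes extracted from snapshot data. A minimal proper-orthogonal-decomposition (POD) sketch on idealized rank-one snapshots (a generic illustration of the technique, not the authors' production implementation):

```python
import math

def dominant_mode(snapshots, iters=50):
    """Leading POD mode via power iteration on the correlation matrix A A^T."""
    m = len(snapshots[0])
    v = [1.0] * m
    for _ in range(iters):
        # w = A (A^T v), computed snapshot by snapshot
        coeffs = [sum(s[j] * v[j] for j in range(m)) for s in snapshots]
        w = [sum(c * s[j] for c, s in zip(coeffs, snapshots)) for j in range(m)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return v

# Idealized snapshot set: scaled copies of one spatial profile phi(x).
m = 50
phi = [math.sin(math.pi * j / (m - 1)) for j in range(m)]
snapshots = [[a * p for p in phi] for a in (1.0, 2.0, 3.0)]

mode = dominant_mode(snapshots)

# Reduced-order reconstruction of a new field: project onto the single mode.
u_new = [2.5 * p for p in phi]
c = sum(u * v for u, v in zip(u_new, mode))
u_rom = [c * v for v in mode]

err = math.sqrt(sum((a - b) ** 2 for a, b in zip(u_new, u_rom)))
print(err < 1e-8)  # exact here because the snapshots are rank one
```

In a realistic PMR setting the snapshots would come from expensive full-order radiation solves, several modes would be retained, and the payoff is that subsequent evaluations work in the low-dimensional coefficient space.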

  4. Advanced Computational Framework for Environmental Management ZEM, Version 1.x

    SciTech Connect

    Vesselinov, Velimir V.; O'Malley, Daniel; Pandey, Sachin

    2016-11-04

    Typically, environmental management problems require analysis of large and complex data sets originating from concurrent data streams with different data collection frequencies and pedigree. These big data sets require on-the-fly integration into a series of models with different complexity for various types of model analyses where the data are applied as soft and hard model constraints. This is needed to provide fast iterative model analyses based on the latest available data to guide decision-making. Furthermore, the data and models are associated with uncertainties. The uncertainties are probabilistic (e.g., measurement errors) and non-probabilistic (unknowns, e.g., alternative conceptual models characterizing site conditions). To address all of these issues, we have developed an integrated framework for real-time data and model analyses for environmental decision-making called ZEM. The framework allows for seamless and on-the-fly integration of data and modeling results for robust and scientifically defensible decision-making, applying advanced decision analysis tools such as Bayesian Information-Gap Decision Theory (BIG-DT). The framework also includes advanced optimization methods capable of dealing with a large number of unknown model parameters, and surrogate (reduced order) modeling capabilities based on support vector regression techniques. The framework is coded in Julia, a state-of-the-art high-performance programming language (http://julialang.org). The ZEM framework is open-source, released under the GPL v3 license, and can be applied to any environmental management site.
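
To illustrate the info-gap style of robustness analysis mentioned above (a toy model, not ZEM's BIG-DT implementation): suppose a predicted contaminant concentration is c = q·k for a decision variable q and an uncertain coefficient k with nominal value k0. The robustness of a decision is the largest uncertainty horizon α for which the worst case over k ∈ [k0(1−α), k0(1+α)] still satisfies a regulatory limit:

```python
def robustness(q, k0, limit):
    """Largest horizon alpha such that c = q * k stays below `limit`
    for every k in [k0*(1-alpha), k0*(1+alpha)].
    The worst case is k = k0*(1+alpha), so alpha_hat = limit/(q*k0) - 1."""
    alpha_hat = limit / (q * k0) - 1.0
    return max(0.0, alpha_hat)

# Toy numbers (hypothetical): nominal coefficient, regulatory limit,
# and three candidate decisions q.
k0, limit = 2.0, 10.0
for q in (1.0, 2.0, 4.0):
    print(q, robustness(q, k0, limit))
```

Smaller q tolerates more model uncertainty, so the most robust decision is the least aggressive one that still meets operational needs; real info-gap analyses apply this trade-off to full simulation models rather than a one-line formula.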

  5. Exploring Effective Decision Making through Human-Centered and Computational Intelligence Methods

    SciTech Connect

    Han, Kyungsik; Cook, Kristin A.; Shih, Patrick C.

    2016-06-13

    Decision-making has long been studied to understand the psychological, cognitive, and social process of selecting an effective choice from alternative options. Such studies have been extended from the personal level to the group and collaborative level, and many computer-aided decision-making systems have been developed to help people make sound decisions. There has been significant research growth in computational aspects of decision-making systems, yet comparatively little effort has gone into identifying and articulating user needs and requirements in assessing system outputs and the extent to which human judgments could be utilized for making accurate and reliable decisions. Our research focuses on decision-making through human-centered and computational intelligence methods in a collaborative environment; the objective of this position paper is to bring our research ideas to the workshop and to share and discuss them.

  6. Computational Efforts in Support of Advanced Coal Research

    SciTech Connect

    Suljo Linic

    2006-08-17

    The focus in this project was to employ first principles computational methods to study the underlying molecular elementary processes that govern hydrogen diffusion through Pd membranes as well as the elementary processes that govern the CO- and S-poisoning of these membranes. Our computational methodology integrated a multiscale hierarchical modeling approach, wherein a molecular understanding of the interactions between various species is gained from ab-initio quantum chemical Density Functional Theory (DFT) calculations, while a mesoscopic statistical mechanical model like Kinetic Monte Carlo is employed to predict the key macroscopic membrane properties such as permeability. The key developments are: (1) We have coupled systematically the ab initio calculations with Kinetic Monte Carlo (KMC) simulations to model hydrogen diffusion through the Pd-based membranes. The predicted tracer diffusivity of hydrogen atoms through the bulk of the Pd lattice from KMC simulations is in excellent agreement with experiments. (2) The KMC simulations of dissociative adsorption of H₂ over the Pd(111) surface indicate that for thin membranes (less than 10 μm thick), the diffusion of hydrogen from the surface to the first subsurface layer is rate limiting. (3) Sulfur poisons the Pd surface by altering the electronic structure of the Pd atoms in the vicinity of the S atom. The KMC simulations indicate that increasing sulfur coverage drastically reduces the hydrogen coverage on the Pd surface and hence the driving force for diffusion through the membrane.
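
The coupling described above feeds ab initio hop rates into a lattice KMC simulation. A minimal one-dimensional KMC sketch (generic, with made-up rates, not the project's Pd model) that recovers the tracer diffusivity D = k·a² for symmetric hops of length a at rate k in each direction:

```python
import math
import random

def kmc_diffusivity(k, a, t_max, n_walkers, seed=7):
    """Estimate tracer diffusivity from <x^2> = 2 D t for 1-D random hops."""
    rng = random.Random(seed)
    total_sq = 0.0
    for _ in range(n_walkers):
        x, t = 0.0, 0.0
        while True:
            # Two possible events (hop left, hop right), each at rate k;
            # waiting time to the next event is exponential with rate 2k.
            t += -math.log(rng.random()) / (2.0 * k)
            if t > t_max:
                break
            x += a if rng.random() < 0.5 else -a
        total_sq += x * x
    return total_sq / n_walkers / (2.0 * t_max)

D = kmc_diffusivity(k=1.0, a=1.0, t_max=50.0, n_walkers=2000)
print(round(D, 2))  # close to the analytic value D = k * a^2 = 1.0
```

A production study replaces the fixed rate k with configuration-dependent rates from DFT barriers (via transition state theory) and runs on a 3-D lattice, but the event-selection and waiting-time logic is the same.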

  7. Supplemental final environmental impact statement for advanced solid rocket motor testing at Stennis Space Center

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Since the issuance of the Final Environmental Impact Statement (FEIS) and the Record of Decision on the FEIS, which described the potential impacts to human health and the environment associated with the program, three factors have caused NASA to initiate additional studies regarding these issues. These factors are: (1) the U.S. Army Corps of Engineers and the Environmental Protection Agency (EPA) agreed to use the same comprehensive procedures to identify and delineate wetlands; (2) EPA has given NASA further guidance on how best to simulate the exhaust plume from the Advanced Solid Rocket Motor (ASRM) testing through computer modeling, enabling more realistic analysis of emission impacts; and (3) public concerns have been raised concerning short- and long-term impacts on human health and the environment from ASRM testing.

  8. Supporting Development for the Stirling Radioisotope Generator and Advanced Stirling Technology Development at NASA Glenn Research Center

    NASA Technical Reports Server (NTRS)

    Thieme, Lanny G.; Schreiber, Jeffrey G.

    2005-01-01

    A high-efficiency, 110-We (watts electric) Stirling Radioisotope Generator (SRG110) for possible use on future NASA Space Science missions is being developed by the Department of Energy, Lockheed Martin, Stirling Technology Company (STC), and NASA Glenn Research Center (GRC). Potential mission use includes providing spacecraft onboard electric power for deep space missions and power for unmanned Mars rovers. GRC is conducting an in-house supporting technology project to assist in SRG110 development. One-, three-, and six-month heater head structural benchmark tests have been completed in support of a heater head life assessment. Testing is underway to evaluate the key epoxy bond of the permanent magnets to the linear alternator stator lamination stack. GRC has completed over 10,000 hours of extended duration testing of the Stirling convertors for the SRG110, and a three-year test of two Stirling convertors in a thermal vacuum environment will be starting shortly. GRC is also developing advanced technology for Stirling convertors, aimed at substantially improving the specific power and efficiency of the convertor and the overall generator. Sunpower, Inc. has begun the development of a lightweight Stirling convertor, under a NASA Research Announcement (NRA) award, that has the potential to double the system specific power to about 8 We/kg. GRC has performed random vibration testing of a lower-power version of this convertor to evaluate robustness for surviving launch vibrations. STC has also completed the initial design of a lightweight convertor. Status of the development of a multi-dimensional computational fluid dynamics code and high-temperature materials work on advanced superalloys, refractory metal alloys, and ceramics are also discussed.

  9. Activities and operations of the Advanced Computing Research Facility, July-October 1986

    SciTech Connect

    Pieper, G.W.

    1986-01-01

    Research activities and operations of the Advanced Computing Research Facility (ACRF) at Argonne National Laboratory are discussed for the period from July 1986 through October 1986. The facility is currently supported by the Department of Energy, and is operated by the Mathematics and Computer Science Division at Argonne. Over the past four-month period, a new commercial multiprocessor, the Intel iPSC-VX/d4 hypercube, was installed. In addition, four other commercial multiprocessors continue to be available for research - an Encore Multimax, a Sequent Balance 21000, an Alliant FX/8, and an Intel iPSC/d5 - as well as a locally designed multiprocessor, the Lemur. These machines are being actively used by scientists at Argonne and throughout the nation in a wide variety of projects concerning computer systems with parallel and vector architectures. A variety of classes, workshops, and seminars have been sponsored to train researchers on computing techniques for the advanced computer systems at the facility. For example, courses were offered on writing programs for parallel computer systems, and the facility hosted the first annual Alliant users group meeting. A Sequent users group meeting and a two-day workshop on performance evaluation of parallel computers and programs are being organized.

  10. Advancing Efficient All-Electron Electronic Structure Methods Based on Numeric Atom-Centered Orbitals for Energy Related Materials

    NASA Astrophysics Data System (ADS)

    Blum, Volker

    This talk describes recent advances of a general, efficient, accurate all-electron electronic theory approach based on numeric atom-centered orbitals; emphasis is placed on developments related to materials for energy conversion and their discovery. For total energies and electron band structures, we show that the overall accuracy is on par with the best benchmark quality codes for materials, but scalable to large system sizes (1,000s of atoms) and amenable to both periodic and non-periodic simulations. A recent localized resolution-of-identity approach for the Coulomb operator enables O (N) hybrid functional based descriptions of the electronic structure of non-periodic and periodic systems, shown for supercell sizes up to 1,000 atoms; the same approach yields accurate results for many-body perturbation theory as well. For molecular systems, we also show how many-body perturbation theory for charged and neutral quasiparticle excitation energies can be efficiently yet accurately applied using basis sets of computationally manageable size. Finally, the talk highlights applications to the electronic structure of hybrid organic-inorganic perovskite materials, as well as to graphene-based substrates for possible future transition metal compound based electrocatalyst materials. All methods described here are part of the FHI-aims code. VB gratefully acknowledges contributions by numerous collaborators at Duke University, Fritz Haber Institute Berlin, TU Munich, USTC Hefei, Aalto University, and many others around the globe.

  11. The Joint Space Operations Center Mission System and the Advanced Research, Collaboration, and Application Development Environment Status Update 2016

    NASA Astrophysics Data System (ADS)

    Murray-Krezan, Jeremy; Howard, Samantha; Sabol, Chris; Kim, Richard; Echeverry, Juan

    2016-05-01

    The Joint Space Operations Center (JSpOC) Mission System (JMS) is a service-oriented architecture (SOA) infrastructure with increased process automation and improved tools to enhance Space Situational Awareness (SSA) performed at the US-led JSpOC. The Advanced Research, Collaboration, and Application Development Environment (ARCADE) is a test-bed maintained and operated by the Air Force to (1) serve as a centralized test-bed for all research and development activities related to JMS applications, including algorithm development, data source exposure, service orchestration, and software services, and provide developers reciprocal access to relevant tools and data to accelerate technology development, (2) allow the JMS program to communicate user capability priorities and requirements to developers, (3) provide the JMS program with access to state-of-the-art research, development, and computing capabilities, and (4) support JMS Program Office-led market research efforts by identifying outstanding performers that are available to shepherd into the formal transition process. In this paper we will share with the international remote sensing community some of the recent JMS and ARCADE developments that may contribute to greater SSA at the JSpOC in the future, and highlight technical areas still in great need of development.

  12. 78 FR 69926 - Privacy Act of 1974, as Amended; Computer Matching Program (SSA/Centers for Medicare & Medicaid...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-21

    ... ADMINISTRATION Privacy Act of 1974, as Amended; Computer Matching Program (SSA/ Centers for Medicare & Medicaid... renewal of an existing computer matching program that will expire on October 16, 2013. SUMMARY: In... computer matching program that we conduct with CMS. DATES: We will file a report of the subject...

  13. 77 FR 33547 - Privacy Act of 1974, as Amended; Computer Matching Program (SSA/Centers for Medicare and Medicaid...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-06

    ... ADMINISTRATION Privacy Act of 1974, as Amended; Computer Matching Program (SSA/ Centers for Medicare and Medicaid... of the Privacy Act, as amended, this notice announces a new computer matching program that we will... Privacy Act We have taken action to ensure that all of our computer matching programs comply with...

  14. Yahoo! Compute Coop (YCC). A Next-Generation Passive Cooling Design for Data Centers

    SciTech Connect

    Robison, AD; Page, Christina; Lytle, Bob

    2011-07-20

    The purpose of the Yahoo! Compute Coop (YCC) project is to research, design, build and implement a greenfield "efficient data factory" and to specifically demonstrate that the YCC concept is feasible for large facilities housing tens of thousands of heat-producing computing servers. The project scope for the Yahoo! Compute Coop technology includes: - Analyzing and implementing ways in which to drastically decrease energy consumption and waste output. - Analyzing the laws of thermodynamics and implementing naturally occurring environmental effects in order to maximize "free cooling" for large data center facilities. "Free cooling" is the direct usage of outside air to cool the servers vs. traditional "mechanical cooling" supplied by chillers or other DX units. - Redesigning and simplifying building materials and methods. - Shortening and simplifying build-to-operate schedules while at the same time reducing initial build and operating costs. Selected for its favorable climate, the greenfield project site is located in Lockport, NY. Construction on the 9.0 MW critical load data center facility began in May 2009, with the fully operational facility deployed in September 2010. The relatively low initial build cost, compatibility with current server and network models, and the efficient use of power and water are all key features that make it a highly compatible and globally implementable design innovation for the data center industry. Yahoo! Compute Coop technology is designed to achieve 99.98% uptime availability. This integrated building design allows for free cooling 99% of the year via the building's unique shape and orientation, as well as server physical configuration.

  15. The transition to massively parallel computing within a production environment at a DOE access center

    SciTech Connect

    McCoy, M.G.

    1993-04-01

    In contemplating the transition from sequential to MP computing, the National Energy Research Supercomputer Center (NERSC) is faced with the frictions inherent in the duality of its mission. There have been two goals: the first has been to provide a stable, serviceable production environment to the user base; the second, to bring the most capable early serial supercomputers to the Center to make possible leading-edge simulations. This seeming conundrum has in reality been a source of strength. The task of meeting both goals was faced before with the CRAY 1 which, as delivered, was all iron; so the problems associated with the advent of parallel computers are not entirely new, but they are serious. Current vector supercomputers, such as the C90, offer mature production environments, including software tools, a large applications base, and generality; these machines can be used to attack the spectrum of scientific applications by a large user base knowledgeable in programming techniques for this architecture. Parallel computers to date have offered less developed, even rudimentary, working environments, a sparse applications base, and forced specialization. They have been specialized in terms of programming models, and specialized in terms of the kinds of applications which would do well on the machines. Given this context, why do many service computer centers feel that now is the time to cease or slow the procurement of traditional vector supercomputers in favor of MP systems? What are some of the issues that NERSC must face to engineer a smooth transition? The answers to these questions are multifaceted and by no means completely clear. However, a route exists as a result of early efforts at the Laboratories combined with research within the HPCC Program. One can begin with an analysis of why the hardware and software appearing shortly should be made available to the mainstream, and then address what would be required in an initial production environment.

  16. Advances in x-ray computed microtomography at the NSLS

    SciTech Connect

    Dowd, B.A.; Andrews, A.B.; Marr, R.B.; Siddons, D.P.; Jones, K.W.; Peskin, A.M.

    1998-08-01

    The X-Ray Computed Microtomography workstation at beamline X27A at the NSLS has been utilized by scientists from a broad range of disciplines from industrial materials processing to environmental science. The most recent applications are presented here as well as a description of the facility that has evolved to accommodate a wide variety of materials and sample sizes. One of the most exciting new developments reported here resulted from a pursuit of faster reconstruction techniques. A Fast Filtered Back Transform (FFBT) reconstruction program has been developed and implemented, that is based on a refinement of the gridding algorithm first developed for use with radio astronomical data. This program has reduced the reconstruction time to 8.5 sec for a 929 × 929 pixel² slice on an R10,000 CPU, more than an 8× reduction compared with the Filtered Back-Projection method.

  17. ADVANCES IN X-RAY COMPUTED MICROTOMOGRAPHY AT THE NSLS.

    SciTech Connect

    DOWD,B.A.

    1998-08-07

    The X-Ray Computed Microtomography workstation at beamline X27A at the NSLS has been utilized by scientists from a broad range of disciplines from industrial materials processing to environmental science. The most recent applications are presented here as well as a description of the facility that has evolved to accommodate a wide variety of materials and sample sizes. One of the most exciting new developments reported here resulted from a pursuit of faster reconstruction techniques. A Fast Filtered Back Transform (FFBT) reconstruction program has been developed and implemented, that is based on a refinement of the "gridding" algorithm first developed for use with radio astronomical data. This program has reduced the reconstruction time to 8.5 sec for a 929 × 929 pixel² slice on an R10,000 CPU, more than an 8× reduction compared with the Filtered Back-Projection method.

  18. Experimental and computing strategies in advanced material characterization problems

    SciTech Connect

    Bolzon, G.

    2015-10-28

    The mechanical characterization of materials relies more and more often on sophisticated experimental methods that make it possible to acquire a large amount of data and, at the same time, to reduce the invasiveness of the tests. This evolution accompanies the growing demand for non-destructive diagnostic tools that assess the safety level of components in use in structures and infrastructures, for instance in the strategic energy sector. Advanced material systems and properties that are not amenable to traditional techniques, for instance thin layered structures and their adhesion to the relevant substrates, can also be characterized by means of combined experimental-numerical tools elaborating data acquired by full-field measurement techniques. In this context, parameter identification procedures involve the repeated simulation of the laboratory or in situ tests by sophisticated and usually expensive non-linear analyses while, in some situations, reliable and accurate results would be required in real time. The effectiveness and the filtering capabilities of reduced models based on decomposition and interpolation techniques can be profitably used to meet these conflicting requirements. This communication summarizes some results recently achieved in this field by the author and her co-workers. The aim is to foster further interaction between the engineering and mathematical communities.

  19. An Analysis of Cloud Computing with Amazon Web Services for the Atmospheric Science Data Center

    NASA Astrophysics Data System (ADS)

    Gleason, J. L.; Little, M. M.

    2013-12-01

    NASA science and engineering efforts rely heavily on compute and data handling systems. The nature of NASA science data is such that it is not restricted to NASA users; instead, it is widely shared across a globally distributed user community including scientists, educators, policy decision makers, and the public. Therefore NASA science computing is a candidate use case for cloud computing, where compute resources are outsourced to an external vendor. Amazon Web Services (AWS) is a commercial cloud computing service developed to use excess computing capacity at Amazon, and potentially provides an alternative to costly and potentially underutilized dedicated acquisitions whenever NASA scientists or engineers require additional data processing. AWS aims to provide a simplified avenue for NASA scientists and researchers to share large, complex data sets with external partners and the public. AWS has been extensively used by JPL for a wide range of computing needs and was previously tested on a NASA Agency basis during the Nebula testing program. Its ability to support Langley Science Directorate needs must be evaluated by integrating it with real-world operational needs across NASA, along with the maturity that would come with that experience. The strengths and weaknesses of this architecture and its ability to support general science and engineering applications have been demonstrated during the previous testing. The Langley Office of the Chief Information Officer, in partnership with the Atmospheric Sciences Data Center (ASDC), has established a pilot business interface to utilize AWS cloud computing resources on an organization- and project-level, pay-per-use model. This poster discusses an effort to evaluate the feasibility of the pilot business interface from a project-level perspective by specifically using a processing scenario involving the Clouds and Earth's Radiant Energy System (CERES) project.

  20. The mobilize center: an NIH big data to knowledge center to advance human movement research and improve mobility.

    PubMed

    Ku, Joy P; Hicks, Jennifer L; Hastie, Trevor; Leskovec, Jure; Ré, Christopher; Delp, Scott L

    2015-11-01

    Regular physical activity helps prevent heart disease, stroke, diabetes, and other chronic diseases, yet a broad range of conditions impair mobility at great personal and societal cost. Vast amounts of data characterizing human movement are available from research labs, clinics, and millions of smartphones and wearable sensors, but integration and analysis of this large quantity of mobility data are extremely challenging. The authors have established the Mobilize Center (http://mobilize.stanford.edu) to harness these data to improve human mobility and help lay the foundation for using data science methods in biomedicine. The Center is organized around 4 data science research cores: biomechanical modeling, statistical learning, behavioral and social modeling, and integrative modeling. Important biomedical applications, such as osteoarthritis and weight management, will focus the development of new data science methods. By developing these new approaches, sharing data and validated software tools, and training thousands of researchers, the Mobilize Center will transform human movement research.

  1. The mobilize center: an NIH big data to knowledge center to advance human movement research and improve mobility

    PubMed Central

    Ku, Joy P; Hicks, Jennifer L; Hastie, Trevor; Leskovec, Jure; Ré, Christopher

    2015-01-01

    Regular physical activity helps prevent heart disease, stroke, diabetes, and other chronic diseases, yet a broad range of conditions impair mobility at great personal and societal cost. Vast amounts of data characterizing human movement are available from research labs, clinics, and millions of smartphones and wearable sensors, but integration and analysis of this large quantity of mobility data are extremely challenging. The authors have established the Mobilize Center (http://mobilize.stanford.edu) to harness these data to improve human mobility and help lay the foundation for using data science methods in biomedicine. The Center is organized around 4 data science research cores: biomechanical modeling, statistical learning, behavioral and social modeling, and integrative modeling. Important biomedical applications, such as osteoarthritis and weight management, will focus the development of new data science methods. By developing these new approaches, sharing data and validated software tools, and training thousands of researchers, the Mobilize Center will transform human movement research. PMID:26272077

  2. Human-Centered Software Engineering: Software Engineering Architectures, Patterns, and Models for Human Computer Interaction

    NASA Astrophysics Data System (ADS)

    Seffah, Ahmed; Vanderdonckt, Jean; Desmarais, Michel C.

    The Computer-Human Interaction and Software Engineering (CHISE) series of edited volumes originated from a number of workshops and discussions over the latest research and developments in the field of Human Computer Interaction (HCI) and Software Engineering (SE) integration, convergence and cross-pollination. A first volume in this series (CHISE Volume I - Human-Centered Software Engineering: Integrating Usability in the Development Lifecycle) aims at bridging the gap between the field of SE and HCI, and addresses specifically the concerns of integrating usability and user-centered systems design methods and tools into the software development lifecycle and practices. This has been done by defining techniques, tools and practices that can fit into the entire software engineering lifecycle as well as by defining ways of addressing the knowledge and skills needed, and the attitudes and basic values that a user-centered development methodology requires. The first volume has been edited as Vol. 8 in the Springer HCI Series (Seffah, Gulliksen and Desmarais, 2005).

  3. Spectrum of tablet computer use by medical students and residents at an academic medical center.

    PubMed

    Robinson, Robert

    2015-01-01

    Introduction. The value of tablet computer use in medical education is an area of considerable interest, with preliminary investigations showing that the majority of medical trainees feel that tablet computers added value to the curriculum. This study investigated potential differences in tablet computer use between medical students and resident physicians. Materials & Methods. Data collection for this survey was accomplished with an anonymous online questionnaire shared with the medical students and residents at Southern Illinois University School of Medicine (SIU-SOM) in July and August of 2012. Results. There were 76 medical student responses (26% response rate) and 66 resident/fellow responses to this survey (21% response rate). Residents/fellows were more likely to use tablet computers several times daily than medical students (32% vs. 20%, p = 0.035). The most common reported uses were for accessing medical reference applications (46%), e-Books (45%), and board study (32%). Residents were more likely than students to use a tablet computer to access an electronic medical record (41% vs. 21%, p = 0.010), review radiology images (27% vs. 12%, p = 0.019), and enter patient care orders (26% vs. 3%, p < 0.001). Discussion. This study shows a high prevalence and frequency of tablet computer use among physicians in training at this academic medical center. Most residents and students use tablet computers to access medical references, e-Books, and to study for board exams. Residents were more likely to use tablet computers to complete clinical tasks. Conclusions. Tablet computer use among medical students and resident physicians was common in this survey. All learners used tablet computers for point of care references and board study. Resident physicians were more likely to use tablet computers to access the EMR, enter patient care orders, and review radiology studies. This difference is likely due to the differing educational and professional demands placed on
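    The paper does not state which test produced its p-values; a common choice for comparing two proportions like the 32% vs. 20% daily-use figures is a two-proportion z-test. A minimal sketch (the counts are back-calculated from the reported percentages and response totals, so they are assumptions):

    ```python
    import math

    def two_proportion_z_test(x1, n1, x2, n2):
        """Two-sided two-proportion z-test using the pooled normal approximation."""
        p1, p2 = x1 / n1, x2 / n2
        p_pool = (x1 + x2) / (n1 + n2)            # pooled proportion under H0
        se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
        z = (p1 - p2) / se
        # two-sided p-value from the standard normal CDF (via erf)
        p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
        return z, p_value

    # assumed counts: ~32% of 66 residents vs. ~20% of 76 students
    z, p = two_proportion_z_test(21, 66, 15, 76)
    ```

    The study may have used a different test (e.g. chi-square with continuity correction), which would give a somewhat different p-value.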

  5. Strategies for casualty mitigation programs by using advanced tsunami computation

    NASA Astrophysics Data System (ADS)

    IMAI, K.; Imamura, F.

    2012-12-01

    1. Purpose of the study. In this study, based on scenarios of great earthquakes along the Nankai trough, we aim to estimate tsunami run-up and the inundation process in coastal areas, including rivers, with high accuracy. Using a practical analytical tsunami model that accounts for detailed topography, land use, and climate change in both the present and an expected future environment, we examined the run-up and tsunami inundation process. From these results we estimated the damage due to tsunami and obtained information for the mitigation of human casualties. Considering the time series from the occurrence of the earthquake and the risk of tsunami damage, we provide disaster risk information in a tsunami hazard and risk map to help mitigate casualties. 2. Creating a tsunami hazard and risk map. From the analytical and practical tsunami model (a long-wave approximation) and high-resolution topography (5 m) including detailed data on shorelines, rivers, buildings, and houses, we present an advanced analysis of tsunami inundation that considers land use. Based on the inundation results and their analysis, it is possible to draw a tsunami hazard and risk map with information on human casualties, building damage estimates, drifting vehicles, etc. 3. Contents of disaster prevention information. To improve the distribution of hazard, risk, and evacuation information, three steps are necessary. (1) Provide basic information such as tsunami attack information, evacuation areas and routes, and the locations of tsunami evacuation facilities. (2) Provide additional information: the time at which inundation starts, past inundation records, the locations of facilities with hazardous materials, and the presence or absence of public facilities and underground areas that require evacuation. (3) Provide information to support disaster response such as infrastructure and traffic network damage prediction
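    The "long wave approximated model" mentioned in the abstract refers to the shallow-water equations. A minimal 1-D linearized sketch (all grid values and the Gaussian initial hump are illustrative assumptions, not the study's actual setup):

    ```python
    import math

    def long_wave_step(eta, q, h, dx, dt, g=9.81):
        """One forward-backward step of the linear 1-D long-wave (shallow-water)
        equations on a staggered grid: d(eta)/dt = -dq/dx, dq/dt = -g*h*d(eta)/dx.
        Closed (reflective) boundaries: the end fluxes q[0] and q[-1] stay zero."""
        n = len(eta)
        for i in range(1, n):                      # update interior face fluxes
            q[i] -= g * h * dt / dx * (eta[i] - eta[i - 1])
        for i in range(n):                         # update surface elevation
            eta[i] -= dt / dx * (q[i + 1] - q[i])
        return eta, q

    # toy basin: a Gaussian hump in 4000 m of water
    dx, h = 1000.0, 4000.0
    c = math.sqrt(9.81 * h)                        # long-wave speed, ~198 m/s
    dt = 0.5 * dx / c                              # CFL-stable time step
    eta = [math.exp(-(((i - 50) * dx / 5e3) ** 2)) for i in range(100)]
    q = [0.0] * 101                                # fluxes at the 101 cell faces
    for _ in range(50):
        eta, q = long_wave_step(eta, q, h, dx, dt)
    ```

    Production tsunami codes add nonlinear advection, bottom friction, and wetting/drying for run-up; this sketch only shows the propagation core.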

  6. CART V: recent advancements in computer-aided camouflage assessment

    NASA Astrophysics Data System (ADS)

    Müller, Thomas; Müller, Markus

    2011-05-01

    In order to facilitate systematic, computer-aided improvements of camouflage and concealment assessment methods, the software system CART (Camouflage Assessment in Real-Time) was developed for the camouflage assessment of objects in multispectral image sequences (see contributions to SPIE 2007-2010 [1], [2], [3], [4]). It comprises semi-automatic marking of target objects (ground truth generation), including their propagation over the image sequence, evaluation via user-defined feature extractors, and methods to assess an object's movement conspicuity. In this fifth part of an annual series at the SPIE conference in Orlando, this paper presents the enhancements of the past year and addresses the camouflage assessment of static and moving objects in multispectral image data that may show noise or image artefacts. The presented methods explore the correlations between image processing and camouflage assessment. A novel algorithm based on template matching is presented to assess the structural inconspicuity of an object objectively and quantitatively. The results can easily be combined with an MTI (moving target indication) based movement conspicuity assessment function in order to explore the influence of object movement on the camouflage effect in different environments. As the results show, the presented methods provide a significant benefit in the field of camouflage assessment.
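    CART's template-matching algorithm itself is not given in the abstract; the underlying idea, normalized cross-correlation template matching, can be sketched as follows (pure Python, toy data — a sharp correlation peak means the structure is easy to find, i.e. poorly camouflaged):

    ```python
    import math

    def ncc(patch, template):
        """Normalized cross-correlation between two equally sized 2-D patches."""
        a = [v for row in patch for v in row]
        b = [v for row in template for v in row]
        ma, mb = sum(a) / len(a), sum(b) / len(b)
        num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
        den = math.sqrt(sum((x - ma) ** 2 for x in a) *
                        sum((y - mb) ** 2 for y in b))
        return num / den if den else 0.0

    def match_template(image, template):
        """Slide the template over the image; return (best score, row, col)."""
        th, tw = len(template), len(template[0])
        best = (-2.0, 0, 0)
        for r in range(len(image) - th + 1):
            for c in range(len(image[0]) - tw + 1):
                patch = [row[c:c + tw] for row in image[r:r + th]]
                best = max(best, (ncc(patch, template), r, c))
        return best

    img = [[0, 0, 0, 0], [0, 9, 1, 0], [0, 1, 9, 0], [0, 0, 0, 0]]
    tpl = [[9, 1], [1, 9]]
    score, r, c = match_template(img, tpl)   # exact match at (1, 1)
    ```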

  7. Block sparse Cholesky algorithms on advanced uniprocessor computers

    SciTech Connect

    Ng, E.G.; Peyton, B.W.

    1991-12-01

    As with many other linear algebra algorithms, devising a portable implementation of sparse Cholesky factorization that performs well on the broad range of computer architectures currently available is a formidable challenge. Even after limiting our attention to machines with only one processor, as we have done in this report, there are still several interesting issues to consider. For dense matrices, it is well known that block factorization algorithms are the best means of achieving this goal. We take this approach for sparse factorization as well. This paper has two primary goals. First, we examine two sparse Cholesky factorization algorithms, the multifrontal method and a blocked left-looking sparse Cholesky method, in a systematic and consistent fashion, both to illustrate the strengths of the blocking techniques in general and to obtain a fair evaluation of the two approaches. Second, we assess the impact of various implementation techniques on time and storage efficiency, paying particularly close attention to the work-storage requirement of the two methods and their variants.
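    The report's sparse multifrontal and block left-looking codes are not reproducible from the abstract; as a minimal dense sketch of the left-looking column ordering that the blocking builds on (block size and the small SPD matrix are illustrative assumptions):

    ```python
    import math

    def cholesky_blocked(A, nb=2):
        """Left-looking Cholesky A = L*L^T for a dense SPD matrix (list of
        lists), with columns processed in blocks of width nb. The block loop
        only shows the ordering; real blocked codes replace the inner sums
        with matrix-matrix kernels, and sparse variants exploit zeros."""
        n = len(A)
        L = [[0.0] * n for _ in range(n)]
        for k in range(0, n, nb):
            for j in range(k, min(k + nb, n)):   # columns of current block
                # diagonal: subtract contributions of all earlier columns
                d = A[j][j] - sum(L[j][p] ** 2 for p in range(j))
                L[j][j] = math.sqrt(d)
                for i in range(j + 1, n):        # sub-diagonal of column j
                    s = A[i][j] - sum(L[i][p] * L[j][p] for p in range(j))
                    L[i][j] = s / L[j][j]
        return L

    A = [[4.0, 2.0, 0.0, 0.0],
         [2.0, 5.0, 2.0, 0.0],
         [0.0, 2.0, 5.0, 2.0],
         [0.0, 0.0, 2.0, 5.0]]
    L = cholesky_blocked(A)
    ```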

  8. Advances in computed radiography systems and their physical imaging characteristics.

    PubMed

    Cowen, A R; Davies, A G; Kengyelics, S M

    2007-12-01

    Radiological imaging is progressing towards an all-digital future, across the spectrum of medical imaging techniques. Computed radiography (CR) has provided a ready pathway from screen film to digital radiography and a convenient entry point to PACS. This review briefly revisits the principles of modern CR systems and their physical imaging characteristics. Wide dynamic range and digital image enhancement are well-established benefits of CR, which lend themselves to improved image presentation and reduced rates of repeat exposures. However, in its original form CR offered limited scope for reducing the radiation dose per radiographic exposure, compared with screen film. Recent innovations in CR, including the use of dual-sided image readout and channelled storage phosphor, have eased these concerns. For example, introduction of these technologies has improved detective quantum efficiency (DQE) by approximately 50 and 100%, respectively, compared with standard CR. As a result CR currently affords greater scope for reducing patient dose, and provides a more substantive challenge to the new solid-state, flat-panel, digital radiography detectors.
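    Under a quantum-noise-limited assumption, the reported DQE gains translate directly into dose at fixed image quality, since the dose needed for a given signal-to-noise ratio scales as 1/DQE. A trivial sketch of that first-order scaling:

    ```python
    def relative_dose(dqe_ratio):
        """At fixed image SNR in a quantum-noise-limited system,
        dose_new / dose_old = 1 / (DQE_new / DQE_old)."""
        return 1.0 / dqe_ratio

    # DQE gains reported above: +50% (dual-sided readout), +100% (channelled phosphor)
    dose_dual = relative_dose(1.5)   # ~0.67 of the original dose
    dose_chan = relative_dose(2.0)   # 0.5 of the original dose
    ```

    Real dose reductions also depend on scatter, electronic noise, and the task, so this is an upper-bound estimate.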

  9. A brief overview of computational structures technology related activities at NASA Lewis Research Center

    NASA Astrophysics Data System (ADS)

    Hopkins, Dale A.

    1992-05-01

    The presentation gives a partial overview of research and development underway in the Structures Division of LeRC, which collectively is referred to as the Computational Structures Technology Program. The activities in the program are diverse and encompass four major categories: (1) composite materials and structures; (2) probabilistic analysis and reliability; (3) design optimization and expert systems; and (4) computational methods and simulation. The approach of the program is comprehensive and entails: exploration of fundamental theories of structural mechanics to accurately represent the complex physics governing engine structural performance; formulation and implementation of computational techniques and integrated simulation strategies to provide accurate and efficient solutions of the governing theoretical models by exploiting emerging advances in computer technology; and validation and verification through numerical and experimental tests to establish confidence and define the qualities and limitations of the resulting theoretical models and computational solutions. The program comprises both in-house and sponsored research activities. The remainder of the presentation provides a sample of activities to illustrate the breadth and depth of the program and to demonstrate the accomplishments and benefits that have resulted.

  10. Instrumentation for synchrotron based micromachining at the Center for Advanced Microstructures and Devices (abstract)

    NASA Astrophysics Data System (ADS)

    Aigeldinger, G.; Goettert, J.; Desta, Y.; Ling, Z. L.; Rupp, L.

    2002-03-01

    The J. Bennett Johnston Sr. Center for Advanced Microstructures and Devices (CAMD) is a synchrotron radiation facility owned by Louisiana State University and operated with financial support from the State of Louisiana (for information on how to submit a project proposal, go to http://www.camd.lsu.edu). The centerpiece of CAMD is a 1.3-1.5 GeV electron storage ring. CAMD supports a strong program in x-ray lithography micromachining (XRLM), or LIGA. A total of four beamlines equipped with different scanners are available for exposures. A 2,500 sq. ft class 100 clean room provides basic processing capability for MEMS, including optical lithography, thin film deposition, electroplating, and metrology. Three micromachining beamlines are connected to bending magnets. All beamlines are "white light" beamlines, terminated with a beryllium window. The typical source-point-to-scanner distance is 10 m and the horizontal acceptance ranges from 6.5 to 10 mrad. A number of low-Z filters can be inserted into the beam to adapt the exposure spectrum to the resist thickness. Two beamlines are equipped with commercial scanners from Jenoptik GmbH (for details see Jenoptik's webpage at www.jo-mikrotechnik.com/) and one beamline with a "vacuum" scanner designed in house. The latest model of Jenoptik's DEX02 scanner was installed at CAMD's XRLM1 beamline in December 2000 and allows advanced exposures using overlay as well as tilt and rotate functions. In addition to these beamlines, CAMD has installed a "white light" beamline at its 7 T wiggler source. Preliminary exposure tests in ultrathick samples (1 mm and thicker) have been conducted using an "air scanner." Currently this beamline is dismantled and will be reinstalled together with a PX beamline. In the article, further details of the beamlines and scanners as well as some examples of applications of LIGA microstructures fabricated at CAMD are discussed.

  11. Advanced Stirling Convertor (ASC-E2) Performance Testing at NASA Glenn Research Center

    NASA Technical Reports Server (NTRS)

    Oriti, Salvatore; Wilson, Scott

    2011-01-01

    The National Aeronautics and Space Administration (NASA) Glenn Research Center (GRC) has been supporting development of the Advanced Stirling Radioisotope Generator (ASRG) since 2006. A key element of the ASRG Project is providing life, reliability, and performance testing of the Advanced Stirling Convertor (ASC). For this purpose, four pairs of ASCs, capable of operating to 850 °C and designated with the model number ASC-E2, were delivered by Sunpower of Athens, Ohio, to GRC in 2010. The ASC-E2s underwent a series of tests that included workmanship vibration testing, performance mapping, and extended operation. Workmanship vibration testing was performed following fabrication of each convertor to verify proper hardware build. Performance mapping consisted of operating each convertor at various conditions representing the range expected during a mission. Included were conditions representing beginning-of-mission (BOM), end-of-mission (EOM), and fueling. This same series of tests was performed by Sunpower prior to ASC-E2 delivery. The data generated during the GRC test were compared to performance before delivery. Extended operation consisted of a 500-hr period of operation with conditions maintained at the BOM point. This was performed to demonstrate steady convertor performance following performance mapping. Following this initial 500-hr period, the ASC-E2s will continue extended operation, controller development and special durability testing, during which the goal is to accumulate tens of thousands of hours of operation. Data collected during extended operation will support reliability analysis. Performance data from these tests is summarized in this paper.

  13. WAATS: A computer program for Weights Analysis of Advanced Transportation Systems

    NASA Technical Reports Server (NTRS)

    Glatt, C. R.

    1974-01-01

    A historical weight estimating technique for advanced transportation systems is presented. The classical approach to weight estimation is discussed, and sufficient data are presented to estimate weights for a large spectrum of flight vehicles, including horizontal and vertical takeoff aircraft, boosters, and reentry vehicles. A computer program, WAATS (Weights Analysis for Advanced Transportation Systems), embracing the techniques discussed, has been written, and user instructions are presented. The program was developed for use in the ODIN (Optimal Design Integration) system.
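    Historical weight-estimating relationships of the kind the abstract describes are typically power laws fit to past-vehicle data in log-log space. A hedged sketch (the function and the data are illustrative assumptions, not WAATS's actual equations):

    ```python
    import math

    def fit_power_law(xs, ys):
        """Least-squares fit of W = a * x**b in log-log space, the classical
        form of a historical weight-estimating relationship."""
        lx = [math.log(x) for x in xs]
        ly = [math.log(y) for y in ys]
        n = len(xs)
        mx, my = sum(lx) / n, sum(ly) / n
        b = (sum((u - mx) * (v - my) for u, v in zip(lx, ly)) /
             sum((u - mx) ** 2 for u in lx))
        a = math.exp(my - b * mx)
        return a, b

    # hypothetical data: component weight vs. vehicle gross weight
    gross = [1e4, 5e4, 1e5, 5e5]
    comp = [2.0 * g ** 0.9 for g in gross]
    a, b = fit_power_law(gross, comp)   # recovers a ~ 2.0, b ~ 0.9
    ```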

  14. ADVANCED METHODS FOR THE COMPUTATION OF PARTICLE BEAM TRANSPORT AND THE COMPUTATION OF ELECTROMAGNETIC FIELDS AND MULTIPARTICLE PHENOMENA

    SciTech Connect

    Alex J. Dragt

    2012-08-31

    Since 1980, under the grant DEFG02-96ER40949, the Department of Energy has supported the educational and research work of the University of Maryland Dynamical Systems and Accelerator Theory (DSAT) Group. The primary focus of this educational/research group has been on the computation and analysis of charged-particle beam transport using Lie algebraic methods, and on advanced methods for the computation of electromagnetic fields and multiparticle phenomena. This Final Report summarizes the accomplishments of the DSAT Group from its inception in 1980 through its end in 2011.

  15. ADVANCED COMPUTATIONAL MODEL FOR THREE-PHASE SLURRY REACTORS

    SciTech Connect

    Goodarz Ahmadi

    2004-10-01

    In this project, an Eulerian-Lagrangian formulation for analyzing three-phase slurry flows in a bubble column was developed. The approach used an Eulerian analysis of liquid flows in the bubble column, and made use of Lagrangian trajectory analysis for the bubble and particle motions. Bubble-bubble and particle-particle collisions are included in the model. The model predictions were compared with the experimental data and good agreement was found. An experimental setup for studying two-dimensional bubble columns was developed. The multiphase flow conditions in the bubble column were measured using optical image processing and particle image velocimetry (PIV) techniques. A simple shear flow device for studying bubble motion in a constant shear flow field was also developed. The flow conditions in the simple shear flow device were studied using the PIV method. Concentrations and velocities of particles of different sizes near a wall in a duct flow were also measured; the technique of phase-Doppler anemometry was used in these studies. An Eulerian volume-of-fluid (VOF) computational model for the flow conditions in the two-dimensional bubble column was also developed. The liquid and bubble motions were analyzed and the results were compared with observed flow patterns in the experimental setup. Solid-fluid mixture flows in ducts and passages at different angles of orientation were also analyzed. The model predictions were compared with the experimental data and good agreement was found. Gravity chute flows of solid-liquid mixtures were also studied, and the simulation results were compared with the experimental data and discussed. A thermodynamically consistent model for multiphase slurry flows, with and without chemical reaction, in a state of turbulent motion was developed. The balance laws were obtained and the constitutive laws established.
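    The Lagrangian trajectory analysis is not detailed in the abstract; a deliberately simplified 1-D sketch of a bubble velocity update with Stokes drag and buoyancy (all property values are assumptions, and real bubble dynamics also involve added mass, lift, and non-Stokes drag):

    ```python
    def bubble_step(v, u_liquid, d=1e-3, dt=2e-5,
                    rho_l=1000.0, rho_g=1.2, mu=1e-3, g=9.81):
        """One explicit-Euler step of a 1-D Lagrangian bubble velocity update
        with Stokes drag toward the local liquid velocity plus buoyancy."""
        tau = rho_g * d ** 2 / (18.0 * mu)      # response (relaxation) time
        drag = (u_liquid - v) / tau             # Stokes drag acceleration
        buoyancy = (rho_l - rho_g) / rho_g * g  # buoyancy acceleration
        return v + dt * (drag + buoyancy)

    # rise velocity relaxes toward its terminal value in quiescent liquid
    v = 0.0
    for _ in range(2000):
        v = bubble_step(v, 0.0)
    tau = 1.2 * 1e-3 ** 2 / (18.0 * 1e-3)
    v_terminal = (1000.0 - 1.2) / 1.2 * 9.81 * tau   # drag balances buoyancy
    ```

    A full Eulerian-Lagrangian solver would interpolate `u_liquid` from the Eulerian liquid field at each bubble position and add collision handling.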

  16. Advances and perspectives in lung cancer imaging using multidetector row computed tomography.

    PubMed

    Coche, Emmanuel

    2012-10-01

    The introduction of multidetector row computed tomography (CT) into clinical practice has revolutionized many aspects of the clinical work-up. Lung cancer imaging has benefited from various breakthroughs in computing technology, with advances in the field of lung cancer detection, tissue characterization, lung cancer staging and response to therapy. Our paper discusses the problems of radiation, image visualization and CT examination comparison. It also reviews the most significant advances in lung cancer imaging and highlights the emerging clinical applications that use state of the art CT technology in the field of lung cancer diagnosis and follow-up.

  17. Advanced entry guidance algorithm with landing footprint computation

    NASA Astrophysics Data System (ADS)

    Leavitt, James Aaron

    -determined angle of attack profile. The method is also capable of producing orbital footprints using an automatically-generated set of angle of attack profiles of varying range, with the lowest profile designed for near-maximum range in the absence of an active heat load constraint. The accuracy of the footprint method is demonstrated by direct comparison with footprints computed independently by an optimization program.

  18. Comparison of nonmesonic hypernuclear decay rates computed in laboratory and center-of-mass coordinates

    SciTech Connect

    De Conti, C.; Barbero, C.; Galeão, A. P.; Krmpotić, F.

    2014-11-11

    In this work we compute the one-nucleon-induced nonmesonic hypernuclear decay rates of ⁵ΛHe, ¹²ΛC and ¹³ΛC using a formalism based on the independent particle shell model in terms of laboratory coordinates. To ascertain the correctness and precision of the method, these results are compared with those obtained using a formalism in terms of center-of-mass coordinates, which has been previously reported in the literature. The formalism in terms of laboratory coordinates will be useful in the shell-model approach to two-nucleon-induced transitions.

  19. Computational fluid dynamics research and applications at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    South, Jerry C., Jr.

    1989-01-01

    Information on computational fluid dynamics (CFD) research and applications carried out at the NASA Langley Research Center is given in viewgraph form. The Langley CFD strategy, the five-year plan in CFD and flow physics, 3-block grid topology, the effect of a patching algorithm, F-18 surface flow, entropy and vorticity effects that improve accuracy of unsteady transonic small disturbance theory, and the effects of reduced frequency on first harmonic components of unsteady pressures due to airfoil pitching are among the topics covered.

  20. Mass Storage System Upgrades at the NASA Center for Computational Sciences

    NASA Technical Reports Server (NTRS)

    Tarshish, Adina; Salmon, Ellen; Macie, Medora; Saletta, Marty

    2000-01-01

    The NASA Center for Computational Sciences (NCCS) provides supercomputing and mass storage services to over 1200 Earth and space scientists. During the past two years, the mass storage system at the NCCS went through a great many changes, both major and minor. Tape drives, silo control software, and the mass storage software itself were upgraded, and the mass storage platform was upgraded twice. Some of these upgrades were aimed at achieving year-2000 compliance, while others were simply upgrades to newer and better technologies. In this paper we will describe these upgrades.

  1. Defense Information Systems Agency Controls Over the Center for Computing Services

    DTIC Science & Technology

    2007-04-09

    [Report documentation page only; no abstract was extracted. Performing organization: ODIG-AUD, Department of Defense Inspector General, 400 Army Navy Drive, Suite 801, Arlington, VA 22202-4704.]

  2. Some propulsion system noise data handling conventions and computer programs used at the Lewis Research Center

    NASA Technical Reports Server (NTRS)

    Montegani, F. J.

    1974-01-01

    Methods of handling one-third-octave band noise data originating from the outdoor full-scale fan noise facility and the engine acoustic facility at the Lewis Research Center are presented. Procedures for standardizing, retrieving, extrapolating, and reporting these data are explained. Computer programs are given which are used to accomplish these and other noise data analysis tasks. This information is useful as background for interpretation of data from these facilities appearing in NASA reports and can aid data exchange by promoting standardization.
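    The report's exact conventions are not reproduced here, but one-third-octave band processing starts from the standard preferred center-frequency series; a sketch using the base-2 form (as in ANSI S1.6 / IEC 61260):

    ```python
    def third_octave_centers(n_lo=-16, n_hi=13, ref=1000.0):
        """Exact one-third-octave band center frequencies, base-2 series:
        fc(n) = 1000 * 2**(n/3) Hz. n_lo=-16..n_hi=13 spans roughly the
        25 Hz to 20 kHz bands used in acoustic reporting."""
        return [ref * 2 ** (n / 3) for n in range(n_lo, n_hi + 1)]

    centers = third_octave_centers()
    # band edges lie a factor of 2**(1/6) below and above each center
    ```

    In practice these exact values are rounded to the familiar nominal labels (25, 31.5, 40, ... Hz) when data are tabulated.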

  3. Advanced Study Center: Proceedings of the National Faculty Plenary Conference (Columbus, Ohio, October 30-November 1, 1978).

    ERIC Educational Resources Information Center

    Jackson, Elise B., Ed.; Russell, Earl B., Ed.

    These proceedings contain presentations made at the National Faculty Plenary Conference, whose theme, Nurturing Vocational Education's Leadership and Intellectual Capital, involved these topics: planning, evaluation, recruitment, and policy implications as they relate to the development and implementation of an Advanced Study Center. Introductory…

  4. 75 FR 69468 - Dentek.com, D/B/A Nsequence Center for Advanced Dentistry; Reno, NV; Notice of Affirmative...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-12

    ... Employment and Training Administration Dentek.com, D/B/A Nsequence Center for Advanced Dentistry; Reno, NV... engaged in employment related to the production of dental prosthetics. The initial determination was based... directly competitive with dental prosthetics or a shift/acquisition of these articles to a foreign...

  5. System Analysis for the Huntsville Operation Support Center, Distributed Computer System

    NASA Technical Reports Server (NTRS)

    Ingels, F. M.; Massey, D.

    1985-01-01

    HOSC, as a distributed computing system, is responsible for data acquisition and analysis during Space Shuttle operations. HOSC also provides computing services for Marshall Space Flight Center's nonmission activities. As mission and nonmission activities change, so do the support functions of HOSC, demonstrating the need for some method of simulating activity at HOSC in various configurations. The simulation developed in this work primarily models the HYPERchannel network. The model simulates the activity of a steady-state network, reporting statistics such as transmitted bits, collision statistics, frame sequences transmitted, and average message delay. These statistics are used to evaluate such performance indicators as throughput, utilization, and delay. Thus the overall performance of the network is evaluated, and possible overload conditions can be predicted.
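    The HYPERchannel model's internals are not given in the abstract; as an illustration of the performance indicators it reports (utilization, delay, overload onset), here is an analytic M/M/1 baseline of the kind often used to sanity-check steady-state network simulations (the frame rates are hypothetical):

    ```python
    def mm1_stats(arrival_rate, service_rate):
        """Steady-state M/M/1 queue metrics: utilization, mean number of
        messages in the system, and mean delay via Little's law (W = L/λ).
        Delay blows up as utilization approaches 1 -- the overload condition."""
        rho = arrival_rate / service_rate            # utilization
        assert rho < 1, "queue is unstable (overload)"
        n_mean = rho / (1 - rho)                     # mean messages in system
        delay = n_mean / arrival_rate                # mean delay, Little's law
        return rho, n_mean, delay

    # e.g. 400 frames/s offered to a link that can serve 500 frames/s
    rho, n, d = mm1_stats(400.0, 500.0)   # rho = 0.8, n = 4.0, d = 0.01 s
    ```

    A discrete-event simulation like the one described would be expected to approach these values for Poisson traffic.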

  6. Initial Flight Test of the Production Support Flight Control Computers at NASA Dryden Flight Research Center

    NASA Technical Reports Server (NTRS)

    Carter, John; Stephenson, Mark

    1999-01-01

    The NASA Dryden Flight Research Center has completed the initial flight test of a modified set of F/A-18 flight control computers that gives the aircraft a research control law capability. The production support flight control computers (PSFCC) provide an increased capability for flight research in the control law, handling qualities, and flight systems areas. The PSFCC feature a research flight control processor that is "piggybacked" onto the baseline F/A-18 flight control system. This research processor allows for pilot selection of research control law operation in flight. To validate flight operation, a replication of a standard F/A-18 control law was programmed into the research processor and flight-tested over a limited envelope. This paper provides a brief description of the system, summarizes the initial flight test of the PSFCC, and describes future experiments for the PSFCC.

  7. Computational Modeling and High Performance Computing in Advanced Materials Processing, Synthesis, and Design

    DTIC Science & Technology

    2014-12-07

    Research efforts in this project focused on the synergistic coupling of computational material science and mechanics of hybrid and lightweight polymeric composite structures, including atomistic modeling in polymer nanocomposite systems.

  8. Projected role of advanced computational aerodynamic methods at the Lockheed-Georgia company

    NASA Technical Reports Server (NTRS)

    Lores, M. E.

    1978-01-01

    Experience with advanced computational methods being used at the Lockheed-Georgia Company to aid in the evaluation and design of new and modified aircraft indicates that large and specialized computers will be needed to make advanced three-dimensional viscous aerodynamic computations practical. The Numerical Aerodynamic Simulation Facility should provide a tool for designing better aerospace vehicles while reducing development costs, by running Navier-Stokes solution algorithms and by permitting less sophisticated but nevertheless complex calculations to be made efficiently. Configuration definition procedures and data output formats can probably best be defined in cooperation with industry; therefore, the computer should handle many remote terminals efficiently. The capability of transferring data to and from other computers needs to be provided. Because of the significant amount of input and output associated with 3-D viscous flow calculations, and because of the exceedingly fast computation speed envisioned for the computer, special attention should be paid to providing rapid, diversified, and efficient input and output.

  9. Computational fluid dynamics research at the United Technologies Research Center requiring supercomputers

    NASA Technical Reports Server (NTRS)

    Landgrebe, Anton J.

    1987-01-01

    An overview of research activities at the United Technologies Research Center (UTRC) in the area of Computational Fluid Dynamics (CFD) is presented. The requirement and use of various levels of computers, including supercomputers, for the CFD activities is described. Examples of CFD directed toward applications to helicopters, turbomachinery, heat exchangers, and the National Aerospace Plane are included. Helicopter rotor codes for the prediction of rotor and fuselage flow fields and airloads were developed with emphasis on rotor wake modeling. Airflow and airload predictions and comparisons with experimental data are presented. Examples are presented of recent parabolized Navier-Stokes and full Navier-Stokes solutions for hypersonic shock-wave/boundary layer interaction, and hydrogen/air supersonic combustion. In addition, other examples of CFD efforts in turbomachinery Navier-Stokes methodology and separated flow modeling are presented. A brief discussion of the 3-tier scientific computing environment is also presented, in which the researcher has access to workstations, mid-size computers, and supercomputers.

  10. The Operation of a Specialized Scientific Information and Data Analysis Center With Computer Base and Associated Communications Network.

    ERIC Educational Resources Information Center

    Cottrell, William B.; And Others

    The Nuclear Safety Information Center (NSIC) is a highly sophisticated scientific information center operated at Oak Ridge National Laboratory (ORNL) for the U.S. Atomic Energy Commission. Its information file, which consists of both data and bibliographic information, is computer stored and numerous programs have been developed to facilitate the…

  11. Examining the Fundamental Obstructs of Adopting Cloud Computing for 9-1-1 Dispatch Centers in the USA

    ERIC Educational Resources Information Center

    Osman, Abdulaziz

    2016-01-01

    The purpose of this research study was to examine the unknown fears of embracing cloud computing, which stretch across dimensions such as fear of change among leaders and the complexity of the technology, in 9-1-1 dispatch centers in the USA. The problem addressed in the study was that many 9-1-1 dispatch centers in the USA are still using old…

  12. Volumes to learn: advancing therapeutics with innovative computed tomography image data analysis.

    PubMed

    Maitland, Michael L

    2010-09-15

    Semi-automated methods for calculating tumor volumes from computed tomography images are a new tool for advancing the development of cancer therapeutics. Volumetric measurements, relying on already widely available standard clinical imaging techniques, could shorten the observation intervals needed to identify cohorts of patients sensitive or resistant to treatment.
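    The core arithmetic behind such volumetric measurements is straightforward: a segmented tumor volume is the count of tumor voxels times the physical volume of one voxel (in-plane pixel spacing times slice thickness). A minimal sketch; the mask and spacings below are illustrative and not from any clinical protocol:

```python
# Hedged sketch of voxel-based tumor volumetry: volume = voxel count
# times the physical size of one voxel. Inputs are illustrative.

def tumor_volume_mm3(mask, spacing_mm):
    """mask: nested lists of 0/1 per slice; spacing_mm: (dx, dy, dz) in mm."""
    dx, dy, dz = spacing_mm
    voxels = sum(v for sl in mask for row in sl for v in row)
    return voxels * dx * dy * dz

mask = [[[0, 1], [1, 1]],   # slice 1: 3 tumor voxels
        [[0, 1], [0, 0]]]   # slice 2: 1 tumor voxel
print(tumor_volume_mm3(mask, (0.5, 0.5, 2.0)))  # 4 voxels * 0.5 mm^3 = 2.0
```

    In practice the mask comes from a semi-automated segmentation step; the volume computation itself is this simple summation.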

  13. Next Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational, and Systems Biology (Final Report)

    EPA Science Inventory

    EPA announced the release of the final report, Next Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational, and Systems Biology. This report describes new approaches that are faster, less resource intensive, and more robust, and that can help ...

  14. Response to House Joint Resolution No. 118 [To Advance Computer-Assisted Instruction].

    ERIC Educational Resources Information Center

    Virginia State General Assembly, Richmond.

    This response by the Virginia Department of Education to House Joint Resolution No. 118 of the General Assembly of Virginia, which requested the Department of Education to study initiatives to advance computer-assisted instruction, is based on input from state and national task forces and on a 1986 survey of 80 Virginia school divisions. The…

  15. PARTNERING WITH DOE TO APPLY ADVANCED BIOLOGICAL, ENVIRONMENTAL, AND COMPUTATIONAL SCIENCE TO ENVIRONMENTAL ISSUES

    EPA Science Inventory

    On February 18, 2004, the U.S. Environmental Protection Agency and Department of Energy signed a Memorandum of Understanding to expand the research collaboration of both agencies to advance biological, environmental, and computational sciences for protecting human health and the ...

  16. COMPUTATIONAL TOXICOLOGY ADVANCES: EMERGING CAPABILITIES FOR DATA EXPLORATION AND SAR MODEL DEVELOPMENT

    EPA Science Inventory

    Computational Toxicology Advances: Emerging capabilities for data exploration and SAR model development
    Ann M. Richard and ClarLynda R. Williams, National Health & Environmental Effects Research Laboratory, US EPA, Research Triangle Park, NC, USA; email: richard.ann@epa.gov

  17. Computers-for-edu: An Advanced Business Application Programming (ABAP) Teaching Case

    ERIC Educational Resources Information Center

    Boyle, Todd A.

    2007-01-01

    The "Computers-for-edu" case is designed to provide students with hands-on exposure to creating Advanced Business Application Programming (ABAP) reports and dialogue programs, as well as navigating various mySAP Enterprise Resource Planning (ERP) transactions needed by ABAP developers. The case requires students to apply a wide variety…

  18. Scientific Discovery through Advanced Computing (SciDAC-3) Partnership Project Annual Report

    SciTech Connect

    Hoffman, Forest M.; Bochev, Pavel B.; Cameron-Smith, Philip J.; Easter, Richard C.; Elliott, Scott M.; Ghan, Steven J.; Liu, Xiaohong; Lowrie, Robert B.; Lucas, Donald D.; Ma, Po-lun; Sacks, William J.; Shrivastava, Manish; Singh, Balwinder; Tautges, Timothy J.; Taylor, Mark A.; Vertenstein, Mariana; Worley, Patrick H.

    2014-01-15

    The Applying Computationally Efficient Schemes for BioGeochemical Cycles (ACES4BGC) Project is advancing the predictive capabilities of Earth System Models (ESMs) by reducing two of the largest sources of uncertainty, aerosols and biospheric feedbacks, with a highly efficient computational approach. In particular, this project is implementing and optimizing new computationally efficient tracer advection algorithms for large numbers of tracer species; adding important biogeochemical interactions between the atmosphere, land, and ocean models; and applying uncertainty quantification (UQ) techniques to constrain process parameters and evaluate uncertainties in feedbacks between biogeochemical cycles and the climate system.

  19. Research in Computational Aeroscience Applications Implemented on Advanced Parallel Computing Systems

    NASA Technical Reports Server (NTRS)

    Wigton, Larry

    1996-01-01

    Improvements to the numerical linear algebra routines for use in new Navier-Stokes codes, specifically Tim Barth's unstructured grid code, with spin-offs to TRANAIR, are reported. A fast distance calculation routine for Navier-Stokes codes using the new one-equation turbulence models was written. The primary focus of this work was improving matrix-iterative methods. New algorithms have been developed which activate the full potential of classical Cray-class computers as well as distributed-memory parallel computers.

  20. Impact of computer advances on future finite elements computations. [for aircraft and spacecraft design

    NASA Technical Reports Server (NTRS)

    Fulton, Robert E.

    1985-01-01

    Research performed over the past 10 years in engineering data base management and parallel computing is discussed, and certain opportunities for research toward the next generation of structural analysis capability are proposed. Particular attention is given to data base management associated with the IPAD project and parallel processing associated with the Finite Element Machine project, both sponsored by NASA, and a near term strategy for a distributed structural analysis capability based on relational data base management software and parallel computers for a future structural analysis system.

  1. Changing the batch system in a Tier 1 computing center: why and how

    NASA Astrophysics Data System (ADS)

    Chierici, Andrea; Dal Pra, Stefano

    2014-06-01

    At the Italian Tier 1 Center at CNAF we are evaluating the possibility of changing the current production batch system. This activity is motivated mainly by the search for a more flexible licensing model and the desire to avoid vendor lock-in. We performed a technology tracking exercise and, among many possible solutions, chose to evaluate Grid Engine as an alternative because its adoption is increasing in the HEPiX community and because it is supported by the EMI middleware that we currently use on our computing farm. Another INFN site evaluated Slurm, and we will compare our results in order to understand the pros and cons of the two solutions. We will present the results of our evaluation of Grid Engine, in order to understand whether it can fit the requirements of a Tier 1 center, compared to the solution we adopted long ago. We performed a survey and a critical re-evaluation of our farming infrastructure: much production software (accounting and monitoring above all) relies on our current solution, and changing it required us to write new wrappers and adapt the infrastructure to the new system. We believe the results of this investigation can be very useful to other Tier-1 and Tier-2 centers in a similar situation, where the effort of switching may appear too hard to sustain. We will provide guidelines to help understand how difficult this operation can be and how long the change may take.

  2. A first attempt to bring computational biology into advanced high school biology classrooms.

    PubMed

    Gallagher, Suzanne Renick; Coon, William; Donley, Kristin; Scott, Abby; Goldberg, Debra S

    2011-10-01

    Computer science has become ubiquitous in many areas of biological research, yet most high school and even college students are unaware of this. As a result, many college biology majors graduate without adequate computational skills for contemporary fields of biology. The absence of a computational element in secondary school biology classrooms is of growing concern to the computational biology community and to biology teachers who would like to acquaint their students with updated approaches in the discipline. We present a first attempt to correct this absence by introducing a computational biology element on genetic evolution into advanced biology classes in two local high schools. Our primary goal was to show students how computation is used in biology and why a basic understanding of computation is necessary for research in many fields of biology. This curriculum is intended to be taught by a computational biologist who has worked with a high school advanced biology teacher to adapt the unit for his/her classroom, but a motivated high school teacher comfortable with mathematics and computing may be able to teach this alone. In this paper, we present our curriculum, which takes into consideration the constraints of the required curriculum, and discuss our experiences teaching it. We describe the successes and challenges we encountered while bringing this unit to high school students, discuss how we addressed these challenges, and make suggestions for future versions of this curriculum. We believe that our curriculum can be a valuable seed for further development of computational activities aimed at high school biology students. Further, our experiences may be of value to others teaching computational biology at this level. Our curriculum can be obtained at http://ecsite.cs.colorado.edu/?page_id=149#biology or by contacting the authors.

  3. Building an advanced climate model: Program plan for the CHAMMP (Computer Hardware, Advanced Mathematics, and Model Physics) Climate Modeling Program

    SciTech Connect

    Not Available

    1990-12-01

    The issue of global warming and related climatic changes from increasing concentrations of greenhouse gases in the atmosphere has received prominent attention during the past few years. The Computer Hardware, Advanced Mathematics, and Model Physics (CHAMMP) Climate Modeling Program is designed to contribute directly to the rapid improvement of climate models. The goal of the CHAMMP Climate Modeling Program is to develop, verify, and apply a new generation of climate models within a coordinated framework that incorporates the best available scientific and numerical approaches to represent physical, biogeochemical, and ecological processes; that fully utilizes the hardware and software capabilities of new computer architectures; that probes the limits of climate predictability; and finally that can be used to address the challenging problem of understanding the greenhouse climate issue through the ability of the models to simulate time-dependent climatic changes over extended times and with regional resolution.

  4. Computed tomography evaluation of rotary systems on the root canal transportation and centering ability.

    PubMed

    Pagliosa, André; Sousa-Neto, Manoel Damião; Versiani, Marco Aurélio; Raucci-Neto, Walter; Silva-Sousa, Yara Teresinha Corrêa; Alfredo, Edson

    2015-01-01

    The endodontic preparation of curved and narrow root canals is challenging, with a tendency for the prepared canal to deviate away from its natural axis. The aim of this study was to evaluate, by cone-beam computed tomography, the transportation and centering ability of curved mesiobuccal canals in maxillary molars after biomechanical preparation with different nickel-titanium (NiTi) rotary systems. Forty teeth with angles of curvature ranging from 20° to 40° and radii between 5.0 mm and 10.0 mm were selected and assigned into four groups (n = 10), according to the biomechanical preparative system used: Hero 642 (HR), Liberator (LB), ProTaper (PT), and Twisted File (TF). The specimens were inserted into an acrylic device and scanned with computed tomography prior to, and following, instrumentation at 3, 6 and 9 mm from the root apex. The canal degree of transportation and centering ability were calculated and analyzed using one-way ANOVA and Tukey's tests (α = 0.05). The results demonstrated no significant difference (p > 0.05) in shaping ability among the rotary systems. The mean canal transportation was: -0.049 ± 0.083 mm (HR); -0.004 ± 0.044 mm (LB); -0.003 ± 0.064 mm (PT); -0.021 ± 0.064 mm (TF). The mean canal centering ability was: -0.093 ± 0.147 mm (HR); -0.001 ± 0.100 mm (LB); -0.002 ± 0.134 mm (PT); -0.033 ± 0.133 mm (TF). Also, there was no significant difference among the root segments (p > 0.05). It was concluded that the Hero 642, Liberator, ProTaper, and Twisted File rotary systems could be safely used in curved canal instrumentation, resulting in satisfactory preservation of the original canal shape.
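    The transportation and centering-ability values reported here follow the form commonly used in this literature (differences in mesial and distal dentin thickness measured before and after preparation); the abstract does not spell out its exact computation, so the formulas below are an illustrative assumption:

```python
# Hedged sketch of the canal transportation and centering-ratio formulas
# often used with CT measurements of pre-/post-instrumentation dentin
# thickness (all values in mm; inputs are illustrative).

def transportation(m1, m2, d1, d2):
    """(m1 - m2) - (d1 - d2): mesial and distal dentin thickness before
    (m1, d1) and after (m2, d2) preparation. 0 means no transportation;
    the sign indicates the direction of deviation."""
    return (m1 - m2) - (d1 - d2)

def centering_ratio(m1, m2, d1, d2):
    """Ratio of the smaller to the larger dentin removal; 1.0 indicates a
    perfectly centered preparation."""
    a, b = m1 - m2, d1 - d2
    if a == b == 0:
        return 1.0
    return min(a, b) / max(a, b)

print(transportation(1.20, 1.05, 1.20, 1.10))   # ~0.05 mm toward mesial
print(centering_ratio(1.20, 1.05, 1.20, 1.10))  # ~0.67
```

    Values near zero transportation and a centering ratio near 1.0 correspond to the "satisfactory preservation of the original canal shape" the study concludes.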

  5. Computational fluid dynamics assessment: Volume 1, Computer simulations of the METC (Morgantown Energy Technology Center) entrained-flow gasifier: Final report

    SciTech Connect

    Celik, I.; Chattree, M.

    1988-07-01

    An assessment of the theoretical and numerical aspects of the computer code, PCGC-2, is made, and the results of the application of this code to the Morgantown Energy Technology Center (METC) advanced gasification facility entrained-flow reactor, "the gasifier," are presented. PCGC-2 is a code suitable for simulating pulverized coal combustion or gasification under axisymmetric (two-dimensional) flow conditions. The governing equations for the gas and particulate phase have been reviewed. The numerical procedure and the related programming difficulties have been elucidated. A single-particle model similar to the one used in PCGC-2 has been developed, programmed, and applied to some simple situations in order to gain insight into the physics of coal particle heat-up, devolatilization, and char oxidation processes. PCGC-2 was applied to the METC entrained-flow gasifier to study numerically the flash pyrolysis of coal, and gasification of coal with steam or carbon dioxide. The results from the simulations are compared with measurements. The gas and particle residence times, particle temperature, and mass component history were also calculated and the results were analyzed. The results provide useful information for understanding the fundamentals of coal gasification and for assessment of experimental results performed using the reactor considered. 69 refs., 35 figs., 23 tabs.

  6. FY05-FY06 Advanced Simulation and Computing Implementation Plan, Volume 2

    SciTech Connect

    Baron, A L

    2004-07-19

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program will require the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapon design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile life extension programs and the resolution of significant finding investigations (SFIs). This requires a balanced system of technical staff, hardware, simulation software, and computer science solutions.

  7. Proposed center for advanced industrial processes. Washington State University, College of Engineering and Architecture

    SciTech Connect

    1995-03-01

    The DOE proposes to authorize Washington State University (WSU) to proceed with the detailed design, construction, and equipping of the proposed Center for Advanced Industrial Processes (CAIP). The proposed project would involve construction of a three-story building containing laboratories, classrooms, seminar rooms, and graduate student and administrative office space. Existing buildings would be demolished. The proposed facility would house research in thermal/fluid sciences, bioengineering, manufacturing processes, and materials processing. Under the "no-action" alternative, DOE would not authorize WSU to proceed with construction under the grant. WSU would then need to consider alternatives for proceeding without DOE funds. Such alternatives (including delaying or scaling back the project) would result in a postponement or slight reduction of the minor adverse environmental, safety, and health impacts of the project evaluated in this assessment. More importantly, these alternatives would affect the important environmental, safety, health, and programmatic benefits of the project. The surrounding area is fully urbanized and the campus is intensely developed around the proposed site. The buildings scheduled for demolition do not meet State energy codes, are not air conditioned, and lack handicapped access. Sensitive resources (historical/archeological sites, protected species/critical habitats, wetlands/floodplains, national forests/parks/trails, prime farmland, and special sources of water) would not be affected, as they do not occur on or near the proposed site. Cumulative impacts would be small. The proposed action is not related to other actions being considered under other NEPA reviews. There is no conflict between the proposed action and any applicable Federal, State, regional, or local land use plans and policies.

  8. Advances in surveillance of periodontitis: the Centers for Disease Control and Prevention periodontal disease surveillance project.

    PubMed

    Eke, Paul I; Thornton-Evans, Gina; Dye, Bruce; Genco, Robert

    2012-11-01

    The Centers for Disease Control and Prevention (CDC) has as one of its strategic goals to support and improve surveillance of periodontal disease. In 2003, the CDC initiated the CDC Periodontal Disease Surveillance Project in collaboration with the American Academy of Periodontology to address population-based surveillance of periodontal disease at the local, state, and national levels. This initiative has made significant advancements toward the goal of improved surveillance, including developing valid self-reported measures that can be obtained from interview-based surveys to predict prevalence of periodontitis in populations. This will allow surveillance of periodontitis at the state and local levels and in countries where clinical resources for surveillance are scarce. This work has produced standard case definitions for surveillance of periodontitis that are now widely recognized and applied in population studies and research. At the national level, this initiative has evaluated the validity of previous clinical examination protocols and tested new protocols on the National Health and Nutrition Examination Survey (NHANES), recommending and supporting funding for the gold-standard full-mouth periodontal examination in NHANES 2009 to 2012. These examinations will generate accurate estimates of the prevalence of periodontitis in the US adult population and provide a superior dataset for surveillance and research. Also, this data will be used to generate the necessary coefficients for our self-report questions for use in subsets of the total US population. The impact of these findings on population-based surveillance of periodontitis and future directions of the project are discussed along with plans for dissemination and translation efforts for broader public health use.

  9. Advancing the Culture of Teaching on Campus: How a Teaching Center Can Make a Difference

    ERIC Educational Resources Information Center

    Cook, Constance, Ed.; Kaplan, Matthew, Ed.

    2011-01-01

    Written by the director and staff of the first, and one of the largest, teaching centers in American higher education--the University of Michigan's Center for Research on Learning and Teaching (CRLT)--this book offers a unique perspective on the strategies for making a teaching center integral to an institution's educational mission. It presents a…

  10. Investigation of advanced counterrotation blade configuration concepts for high speed turboprop systems. Task 4: Advanced fan section aerodynamic analysis computer program user's manual

    NASA Technical Reports Server (NTRS)

    Crook, Andrew J.; Delaney, Robert A.

    1992-01-01

    The computer program user's manual for the ADPACAPES (Advanced Ducted Propfan Analysis Code-Average Passage Engine Simulation) program is included. The objective of the computer program is development of a three-dimensional Euler/Navier-Stokes flow analysis for fan section/engine geometries containing multiple blade rows and multiple spanwise flow splitters. An existing procedure developed by Dr. J. J. Adamczyk and associates at the NASA Lewis Research Center was modified to accept multiple spanwise splitter geometries and simulate engine core conditions. The numerical solution is based upon a finite volume technique with a four stage Runge-Kutta time marching procedure. Multiple blade row solutions are based upon the average-passage system of equations. The numerical solutions are performed on an H-type grid system, with meshes meeting the requirement of maintaining a common axisymmetric mesh for each blade row grid. The analysis was run on several geometry configurations ranging from one to five blade rows and from one to four radial flow splitters. The efficiency of the solution procedure was shown to be the same as the original analysis.
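    The four-stage Runge-Kutta time marching mentioned above is often written, in finite-volume CFD codes, in a low-storage form where each stage re-evaluates the residual from the start-of-step state. A minimal sketch on a scalar model problem; the stage coefficients 1/4, 1/3, 1/2, 1 are the classic choice in this family of schemes and are an assumption here, not the documented ADPACAPES coefficients:

```python
# Hedged sketch of low-storage four-stage Runge-Kutta time marching, as
# commonly used in finite-volume flow solvers. Coefficients are assumed.

def rk4_step(u, dt, residual):
    """Advance u by one time step, where residual(u) approximates du/dt.
    Each stage restarts from the start-of-step state u0."""
    u0 = u
    for alpha in (0.25, 1.0 / 3.0, 0.5, 1.0):
        u = u0 + alpha * dt * residual(u)
    return u

# Model problem du/dt = -u, whose exact solution is exp(-t).
u, dt = 1.0, 0.01
for _ in range(100):                 # march to t = 1
    u = rk4_step(u, dt, lambda x: -x)
print(u)                             # close to exp(-1) = 0.36788...
```

    In a real solver, `residual` would be the finite-volume flux balance over each cell rather than a scalar function.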

  11. 75 FR 71463 - Dentek.Com, Inc. D/B/A Nsequence Center for Advanced Dentistry Reno, NV; Notice of Negative...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-23

    ... Employment and Training Administration Dentek.Com, Inc. D/B/A Nsequence Center for Advanced Dentistry Reno... production of dental prosthetics (such as crowns and the bridges). Pursuant to 29 CFR 90.18(c... workers at Dentek.com , Inc., d/b/a nSequence Center for Advanced Dentistry, Reno, Nevada (the...

  12. Deep Space Network (DSN), Network Operations Control Center (NOCC) computer-human interfaces

    NASA Technical Reports Server (NTRS)

    Ellman, Alvin; Carlton, Magdi

    1993-01-01

    The technical challenges, engineering solutions, and results of the NOCC computer-human interface design are presented. The user-centered design process was as follows: determine the design criteria for user concerns; assess the impact of design decisions on the users; and determine the technical aspects of the implementation (tools, platforms, etc.). The NOCC hardware architecture is illustrated. A graphical model of the DSN that represented the hierarchical structure of the data was constructed. The DSN spacecraft summary display is shown. Navigation from top to bottom is accomplished by clicking the appropriate button for the element about which the user desires more detail. The telemetry summary display and the antenna color decision table are also shown.

  13. Effects of a computer-assisted language intervention in a rural Nevada center.

    PubMed

    Krumpe, Jo Anne; Harlow, Steven

    2008-06-01

    A computer-assisted language intervention, Fast ForWord-Language (FFW-L), was tested at a rural Nevada center in a group of children (Grades 2-12) referred by parents and teachers to assess enhancement of language skills. Given conflicting results from previous studies, language scores were measured using Clinical Evaluation of Language Fundamentals, Third Edition (CELF-3) before and after the FFW-L intervention. 58 children's CELF-3 postintervention scores were adjusted for age-specific expected changes and compared with pretest scores. Adjusted scores increased in both receptive and expressive domains of the CELF-3. Children with prior diagnoses of language and/or learning impairment did not differ from other referrals on adjusted CELF-3 adjusted gain scores after treatment. Thus the Fast ForWord-Language intervention may benefit a much broader group of children referred by parents and teachers for language or reading problems.
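    The score adjustment described here amounts to subtracting the change expected from age and maturation alone before computing a gain. A minimal sketch; the numbers below are made-up illustrations, not CELF-3 normative data:

```python
# Hedged sketch of an age-adjusted gain score: the raw pre/post difference
# is corrected for the change expected from maturation alone.
# expected_change is a hypothetical normative value, not CELF-3 data.

def adjusted_gain(pre, post, expected_change):
    """Gain beyond what maturation alone would predict."""
    return (post - pre) - expected_change

print(adjusted_gain(pre=85, post=97, expected_change=4))  # 8
```

    A positive adjusted gain, as reported in both receptive and expressive domains, indicates improvement beyond the age-expected trajectory.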

  14. System analysis for the Huntsville Operational Support Center distributed computer system

    NASA Technical Reports Server (NTRS)

    Ingels, F. M.; Mauldin, J.

    1984-01-01

    The Huntsville Operations Support Center (HOSC) is a distributed computer system used to provide real-time data acquisition, analysis, and display during NASA space missions and to perform simulation and study activities during non-mission times. The primary purpose is to provide a HOSC system simulation model that is used to investigate the effects of various HOSC system configurations. Such a model would be valuable in planning the future growth of HOSC and in ascertaining the effects of data rate variations, update table broadcasting, and smart display terminal data requirements on the HOSC HYPERchannel network system. A simulation model was developed in PASCAL, and results of the simulation model for various system configurations were obtained. A tutorial of the model is presented along with the results of simulation runs. Some very high data rate situations were simulated to observe the effects of the HYPERchannel switchover from contention to priority mode under high channel loading.

  15. ENHANCING THE ATOMIC-LEVEL UNDERSTANDING OF CO2 MINERAL SEQUESTRATION MECHANISMS VIA ADVANCED COMPUTATIONAL MODELING

    SciTech Connect

    A.V.G. Chizmeshya; M.J. McKelvy; G.H. Wolf; R.W. Carpenter; D.A. Gormley; J.R. Diefenbacher; R. Marzke

    2006-03-01

    significantly improved our understanding of mineral carbonation. Group members at the Albany Research Center have recently shown that carbonation of olivine and serpentine, which naturally occurs over geological time (i.e., 100,000s of years), can be accelerated to near completion in hours. Further process refinement will require a synergetic science/engineering approach that emphasizes simultaneous investigation of both thermodynamic processes and the detailed microscopic, atomic-level mechanisms that govern carbonation kinetics. Our previously funded Phase I Innovative Concepts project demonstrated the value of advanced quantum-mechanical modeling as a complementary tool in bridging important gaps in our understanding of the atomic/molecular structure and reaction mechanisms that govern CO2 mineral sequestration reaction processes for the model Mg-rich lamellar hydroxide feedstock material Mg(OH)2. In the present simulation project, improved techniques and more efficient computational schemes have allowed us to expand and augment these capabilities and explore more complex Mg-rich, lamellar hydroxide-based feedstock materials, including the serpentine-based minerals. These feedstock materials are being actively investigated due to their wide availability, and low-cost CO2 mineral sequestration potential. Cutting-edge first principles quantum chemical, computational solid-state and materials simulation methodology studies proposed herein, have been strategically integrated with our new DOE supported (ASU-Argonne National Laboratory) project to investigate the mechanisms that govern mineral feedstock heat-treatment and aqueous/fluid-phase serpentine mineral carbonation in situ. This unified, synergetic theoretical and experimental approach has provided a deeper understanding of the key reaction mechanisms than either individual approach can alone. 
We used ab initio techniques to significantly advance our understanding of atomic-level processes at the solid/solution interface by

  16. ENHANCING THE ATOMIC-LEVEL UNDERSTANDING OF CO2 MINERAL SEQUESTRATION MECHANISMS VIA ADVANCED COMPUTATIONAL MODELING

    SciTech Connect

    A.V.G. Chizmeshya

    2003-12-19

    /NETL managed National Mineral Sequestration Working Group we have already significantly improved our understanding of mineral carbonation. Group members at the Albany Research Center have recently shown that carbonation of olivine and serpentine, which naturally occurs over geological time (i.e., 100,000s of years), can be accelerated to near completion in hours. Further process refinement will require a synergetic science/engineering approach that emphasizes simultaneous investigation of both thermodynamic processes and the detailed microscopic, atomic-level mechanisms that govern carbonation kinetics. Our previously funded Phase I Innovative Concepts project demonstrated the value of advanced quantum-mechanical modeling as a complementary tool in bridging important gaps in our understanding of the atomic/molecular structure and reaction mechanisms that govern CO2 mineral sequestration reaction processes for the model Mg-rich lamellar hydroxide feedstock material Mg(OH)2. In the present simulation project, improved techniques and more efficient computational schemes have allowed us to expand and augment these capabilities and explore more complex Mg-rich, lamellar hydroxide-based feedstock materials, including the serpentine-based minerals. These feedstock materials are being actively investigated due to their wide availability, and low-cost CO2 mineral sequestration potential. Cutting-edge first principles quantum chemical, computational solid-state and materials simulation methodology studies proposed herein, have been strategically integrated with our new DOE supported (ASU-Argonne National Laboratory) project to investigate the mechanisms that govern mineral feedstock heat-treatment and aqueous/fluid-phase serpentine mineral carbonation in situ. This unified, synergetic theoretical and experimental approach will provide a deeper understanding of the key reaction mechanisms than either individual approach can alone. 
Ab initio techniques will also

  17. ENHANCING THE ATOMIC-LEVEL UNDERSTANDING OF CO2 MINERAL SEQUESTRATION MECHANISMS VIA ADVANCED COMPUTATIONAL MODELING

    SciTech Connect

    A.V.G. Chizmeshya

    2002-12-19

/NETL managed National Mineral Sequestration Working Group we have already significantly improved our understanding of mineral carbonation. Group members at the Albany Research Center have recently shown that carbonation of olivine and serpentine, which naturally occurs over geological time (i.e., 100,000s of years), can be accelerated to near completion in hours. Further process refinement will require a synergetic science/engineering approach that emphasizes simultaneous investigation of both thermodynamic processes and the detailed microscopic, atomic-level mechanisms that govern carbonation kinetics. Our previously funded Phase I Innovative Concepts project demonstrated the value of advanced quantum-mechanical modeling as a complementary tool in bridging important gaps in our understanding of the atomic/molecular structure and reaction mechanisms that govern CO{sub 2} mineral sequestration reaction processes for the model Mg-rich lamellar hydroxide feedstock material Mg(OH){sub 2}. In the present simulation project, improved techniques and more efficient computational schemes have allowed us to expand and augment these capabilities and explore more complex Mg-rich, lamellar hydroxide-based feedstock materials, including the serpentine-based minerals. These feedstock materials are being actively investigated due to their wide availability and low-cost CO{sub 2} mineral sequestration potential. The cutting-edge first-principles quantum chemical, computational solid-state, and materials simulation methodology studies proposed herein have been strategically integrated with our new DOE-supported (ASU-Argonne National Laboratory) project to investigate the mechanisms that govern mineral feedstock heat-treatment and aqueous/fluid-phase serpentine mineral carbonation in situ. This unified, synergetic theoretical and experimental approach will provide a deeper understanding of the key reaction mechanisms than either individual approach can alone. Ab initio techniques will also

  18. Programming in C at NMFECC (National Magnetic Fusion Energy Computing Center): A practical guide

    SciTech Connect

    Haney, S.W.; Crotinger, J.A.

    1989-07-26

    Despite its popularity elsewhere, C has not been extensively used for scientific programming on supercomputers. There are a number of reasons for this, but perhaps the most compelling has been the lack of C compilers. However, this situation has recently begun to change at the National Magnetic Fusion Energy Computing Center (NMFECC), where two C development platforms --- the Hybrid C Compiler (HCC) written at the Livermore Computer Center and the Portable C Compiler (CC version 4.1) distributed by Cray Research, Inc. (CRI) --- have become available for use. These compilers produce object code for all of the Cray models at NMFECC and, in addition, possess good scalar optimization capabilities along with rudimentary vectorization capabilities. With the advent of the Cray C compilers, it is possible to consider physics code development in C at NMFECC. However, when one actually attempts to pursue this goal, one is quickly faced with a number of practical problems. For instance: How do I compile, link, and debug C codes? What special features of C are useful to me as a scientific programmer? Are there things I currently can't do in C programs? How do I interface my C program to existing Fortran code? Can I make use of the Basis code development system from C? Over the last three years we have incorporated C into numerous physics codes written at NMFECC and, in the course of this work, we have had to develop solutions to all of the above problems. This turned out to be a surprisingly frustrating and time-consuming venture requiring some rather subtle techniques and hacks. This guide is an attempt to document these techniques.

  19. Earth resources programs at the Langley Research Center. Part 1: Advanced Applications Flight Experiments (AAFE) and microwave remote sensing program

    NASA Technical Reports Server (NTRS)

    Parker, R. N.

    1972-01-01

    The earth resources activity comprises two basic programs: Advanced Applications Flight Experiments (AAFE) and microwave remote sensing. The two programs are in various stages of implementation, extending from experimental investigations within both the AAFE program and the microwave remote sensing program to multidisciplinary studies and planning. The purpose of this paper is simply to identify the main thrust of the Langley Research Center activity in earth resources.

  20. Computation of Loads on the McDonnell Douglas Advanced Bearingless Rotor

    NASA Technical Reports Server (NTRS)

    Nguyen, Khanh; Lauzon, Dan; Anand, Vaidyanathan

    1994-01-01

    Computed results from UMARC and DART analyses are compared with the blade bending moments and vibratory hub loads data obtained from a full-scale wind tunnel test of the McDonnell Douglas five-bladed advanced bearingless rotor. The 5 per-rev vibratory hub loads data are corrected using results from a dynamic calibration of the rotor balance. The comparison between UMARC computed blade bending moments at different flight conditions are poor to fair, while DART results are fair to good. Using the free wake module, UMARC adequately computes the 5P vibratory hub loads for this rotor, capturing both magnitude and variations with forward speed. DART employs a uniform inflow wake model and does not adequately compute the 5P vibratory hub loads for this rotor.
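The 5 per-rev (5P) hub-load content compared above is a Fourier harmonic of a periodic time history. A minimal sketch of extracting it, using synthetic data and hypothetical names (not the UMARC, DART, or rotor-balance code):

```python
import math

def per_rev_harmonic(signal, samples_per_rev, n_revs, harmonic=5):
    """Amplitude of the n/rev harmonic of a hub-load time history.

    Assumes `signal` covers an integer number of rotor revolutions,
    sampled uniformly in azimuth.
    """
    n = samples_per_rev * n_revs
    k = harmonic * n_revs            # DFT bin of the requested per-rev harmonic
    re = sum(signal[i] * math.cos(2 * math.pi * k * i / n) for i in range(n))
    im = sum(signal[i] * math.sin(2 * math.pi * k * i / n) for i in range(n))
    return 2.0 * math.hypot(re, im) / n

# Synthetic hub load: steady component plus a 12-unit 5/rev oscillation.
spr, revs = 64, 8
load = [100.0 + 12.0 * math.sin(2 * math.pi * 5 * i / spr) for i in range(spr * revs)]
amp5 = per_rev_harmonic(load, spr, revs, harmonic=5)
```

For an N-bladed rotor the N/rev harmonic is the one transmitted to the fixed frame, which is why the 5P component is the quantity of interest for this five-bladed rotor.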

  1. Advanced Simulation & Computing FY15 Implementation Plan Volume 2, Rev. 0.5

    SciTech Connect

    McCoy, Michel; Archer, Bill; Matzen, M. Keith

    2014-09-16

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, enable studies of advanced nuclear weapons design and manufacturing processes, support analysis of accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. As the program approaches the end of its second decade, ASC is intently focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), quantify critical margins and uncertainties, and resolve increasingly difficult analyses needed for the SSP. Where possible, the program also enables the use of high-performance simulation and computing tools to address broader national security needs, such as foreign nuclear weapon assessments and counternuclear terrorism.

  2. Artificial intelligence in medicine and cardiac imaging: harnessing big data and advanced computing to provide personalized medical diagnosis and treatment.

    PubMed

    Dilsizian, Steven E; Siegel, Eliot L

    2014-01-01

    Although advances in information technology in the past decade have come in quantum leaps in nearly every aspect of our lives, they seem to be coming at a slower pace in the field of medicine. However, the implementation of electronic health records (EHR) in hospitals is increasing rapidly, accelerated by the meaningful use initiatives associated with the Centers for Medicare & Medicaid Services EHR Incentive Programs. The transition to electronic medical records and availability of patient data has been associated with increases in the volume and complexity of patient information, as well as an increase in medical alerts, with resulting "alert fatigue" and increased expectations for rapid and accurate diagnosis and treatment. Unfortunately, these increased demands on health care providers create greater risk for diagnostic and therapeutic errors. In the near future, artificial intelligence (AI)/machine learning will likely assist physicians with differential diagnosis of disease, treatment option suggestions and recommendations, and, in the case of medical imaging, with cues in image interpretation. Mining and advanced analysis of "big data" in health care provide the potential not only to perform "in silico" research but also to provide "real time" diagnostic and (potentially) therapeutic recommendations based on empirical data. "On demand" access to high-performance computing and large health care databases will support and sustain our ability to achieve personalized medicine. The IBM Jeopardy! Challenge, which pitted the best all-time human players against the Watson computer, captured the imagination of millions of people across the world and demonstrated the potential to apply AI approaches to a wide variety of subject matter, including medicine. The combination of AI, big data, and massively parallel computing offers the potential to create a revolutionary way of practicing evidence-based, personalized medicine.

  3. PREFACE: 16th International workshop on Advanced Computing and Analysis Techniques in physics research (ACAT2014)

    NASA Astrophysics Data System (ADS)

    Fiala, L.; Lokajicek, M.; Tumova, N.

    2015-05-01

    This volume of the IOP Conference Series is dedicated to scientific contributions presented at the 16th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2014), whose motto this year was "bridging disciplines". The conference took place on September 1-5, 2014, at the Faculty of Civil Engineering, Czech Technical University in Prague, Czech Republic. The 16th edition of ACAT explored the boundaries of computing system architectures, data analysis algorithmics, automatic calculations, and theoretical calculation technologies. It provided a forum for confronting and exchanging ideas among these fields, where new approaches in computing technologies for scientific research were explored and promoted. This year's edition of the workshop brought together over 140 participants from all over the world. The workshop's 16 invited speakers presented key topics on advanced computing and analysis techniques in physics. During the workshop, 60 talks and 40 posters were presented in three tracks: Computing Technology for Physics Research, Data Analysis - Algorithms and Tools, and Computations in Theoretical Physics: Techniques and Methods. The round table enabled discussions on expanding software, knowledge sharing and scientific collaboration in the respective areas. ACAT 2014 was generously sponsored by Western Digital, Brookhaven National Laboratory, Hewlett Packard, DataDirect Networks, M Computers, Bright Computing, Huawei and PDV-Systemhaus. Special appreciation goes to the track liaisons Lorenzo Moneta, Axel Naumann and Grigory Rubtsov for their work on the scientific program and the publication preparation. ACAT's IACC would also like to express its gratitude to all referees for their work on making sure the contributions are published in the proceedings. Our thanks extend to the conference liaisons Andrei Kataev and Jerome Lauret who worked with the local contacts and made this conference possible as well as to the program

  4. Advances in computer-aided design and computer-aided manufacture technology.

    PubMed

    Calamia, J R

    1994-01-01

    Although the development of computer-aided design (CAD) and computer-aided manufacture (CAM) technology and the benefits of increased productivity became obvious in the automobile and aerospace industries in the 1970s, investigations of this technology's application in the field of dentistry did not begin until the 1980s. Only now are we beginning to see the fruits of this work with the commercial availability of some systems; the potential for this technology seems boundless. This article reviews the recent literature with emphasis on the period from June 1992 to May 1993. This review should familiarize the reader with some of the latest developments in this technology, including a brief description of some systems currently available and the clinical and economical rationale for their acceptance into the dental mainstream. This article concentrates on a particular system, the Cerec (Siemens/Pelton and Crane, Charlotte, NC) system, for three reasons: first, this system has been available since 1985 and, as a result, has a track record of almost 7 years of data. Most of the data have just recently been released and consequently, much of this year's literature on CAD-CAM is monopolized by studies using this system. Second, this system was developed as a mobile, affordable, direct chairside CAD-CAM restorative method. As such, it is of special interest to the dentist who will offer this new technology directly to the patient, providing a one-visit restoration. Third, the author is currently engaged in research using this particular system and has a working knowledge of this system's capabilities.

  5. DOE Advanced Scientific Computing Advisory Subcommittee (ASCAC) Report: Top Ten Exascale Research Challenges

    SciTech Connect

    Lucas, Robert; Ang, James; Bergman, Keren; Borkar, Shekhar; Carlson, William; Carrington, Laura; Chiu, George; Colwell, Robert; Dally, William; Dongarra, Jack; Geist, Al; Haring, Rud; Hittinger, Jeffrey; Hoisie, Adolfy; Klein, Dean Micron; Kogge, Peter; Lethin, Richard; Sarkar, Vivek; Schreiber, Robert; Shalf, John; Sterling, Thomas; Stevens, Rick; Bashor, Jon; Brightwell, Ron; Coteus, Paul; Debenedictus, Erik; Hiller, Jon; Kim, K. H.; Langston, Harper; Murphy, Richard Micron; Webster, Clayton; Wild, Stefan; Grider, Gary; Ross, Rob; Leyffer, Sven; Laros III, James

    2014-02-10

    Exascale computing systems are essential for the scientific fields that will transform the 21st century global economy, including energy, biotechnology, nanotechnology, and materials science. Progress in these fields is predicated on the ability to perform advanced scientific and engineering simulations, and analyze the deluge of data. On July 29, 2013, ASCAC was charged by Patricia Dehmer, the Acting Director of the Office of Science, to assemble a subcommittee to provide advice on exascale computing. This subcommittee was directed to return a list of no more than ten technical approaches (hardware and software) that will enable the development of a system that achieves the Department's goals for exascale computing. Numerous reports over the past few years have documented the technical challenges and the non-viability of simply scaling existing computer designs to reach exascale. The technical challenges revolve around energy consumption, memory performance, resilience, extreme concurrency, and big data. Drawing from these reports and more recent experience, this ASCAC subcommittee has identified the top ten computing technology advancements that are critical to making a capable, economically viable, exascale system.

  6. Recovery Act: Advanced Interaction, Computation, and Visualization Tools for Sustainable Building Design

    SciTech Connect

    Greenberg, Donald P.; Hencey, Brandon M.

    2013-08-20

    Current building energy simulation technology requires excessive labor, time and expertise to create building energy models, excessive computational time for accurate simulations and difficulties with the interpretation of the results. These deficiencies can be ameliorated using modern graphical user interfaces and algorithms which take advantage of modern computer architectures and display capabilities. To prove this hypothesis, we developed an experimental test bed for building energy simulation. This novel test bed environment offers an easy-to-use interactive graphical interface, provides access to innovative simulation modules that run at accelerated computational speeds, and presents new graphics visualization methods to interpret simulation results. Our system offers the promise of dramatic ease of use in comparison with currently available building energy simulation tools. Its modular structure makes it suitable for early stage building design, as a research platform for the investigation of new simulation methods, and as a tool for teaching concepts of sustainable design. Improvements in the accuracy and execution speed of many of the simulation modules are based on the modification of advanced computer graphics rendering algorithms. Significant performance improvements are demonstrated in several computationally expensive energy simulation modules. The incorporation of these modern graphical techniques should advance the state of the art in the domain of whole building energy analysis and building performance simulation, particularly at the conceptual design stage when decisions have the greatest impact. More importantly, these better simulation tools will enable the transition from prescriptive to performative energy codes, resulting in better, more efficient designs for our future built environment.
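At their core, the simulation modules described above time-step a thermal model of the building. A minimal single-zone sketch (a lumped 1R1C model with hypothetical parameter values, not the project's test-bed code):

```python
def simulate_zone(T0, T_out, R, C, Q, dt, steps):
    """Single-zone lumped-capacitance (1R1C) thermal model, explicit Euler.

    dT/dt = ((T_out - T)/R + Q) / C
    T0: initial zone temperature [C], T_out: outdoor temperature [C],
    R: envelope resistance [K/W], C: thermal capacitance [J/K],
    Q: internal gains [W], dt: time step [s].
    """
    T = T0
    history = [T]
    for _ in range(steps):
        T += dt * ((T_out - T) / R + Q) / C
        history.append(T)
    return history

# Zone starting at 20 C, outdoor air at 0 C, constant 1 kW internal gains.
temps = simulate_zone(T0=20.0, T_out=0.0, R=0.005, C=1.0e7, Q=1000.0,
                      dt=60.0, steps=5000)
```

The zone relaxes toward the steady state T_out + Q*R (here 5 C); whole-building tools chain thousands of such balances, which is where the computational cost the abstract targets comes from.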

  7. Advancing Mental Health Research: Washington University's Center for Mental Health Services Research

    ERIC Educational Resources Information Center

    Proctor, Enola K.; McMillen, Curtis; Haywood, Sally; Dore, Peter

    2008-01-01

    Research centers have become a key component of the research infrastructure in schools of social work, including the George Warren Brown School of Social Work at Washington University. In 1993, that school's Center for Mental Health Services Research (CMHSR) received funding from the National Institute of Mental Health (NIMH) as a Social Work…

  8. 78 FR 50069 - National Center for Advancing Translational Sciences; Notice of Meetings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-16

    ... clearly unwarranted invasion of personal privacy. ] Name of Committee: Cures Acceleration Network Review.... Place: National Institutes of Health, Building 31, Conference Room 6, 31 Center Drive, Bethesda, MD.... Place: National Institutes of Health, Building 31, Conference Room 6, 31 Center Drive, Bethesda,...

  9. A Wish List for the Advancement of University and College Counseling Centers

    ERIC Educational Resources Information Center

    Bishop, John B.

    2016-01-01

    University and college counseling centers continue to meet emerging challenges in higher education. This article addresses three issues: the need for a more unified organizational structure to represent the profession, the potential value for counseling centers in seeking accreditation, and the importance of specialized training for those entering…

  10. Teaching Advanced Skills to Educationally Disadvantaged Students. Data Analysis Support Center (DASC) Task 4. Final Report.

    ERIC Educational Resources Information Center

    Means, Barbara, Ed.; Knapp, Michael S., Ed.

    This document comprises six papers that discuss teaching advanced skills to educationally disadvantaged students. An introductory paper, "Models for Teaching Advanced Skills to Educationally Disadvantaged Children" (B. Means and M. S. Knapp), synthesizes the themes that characterize the collection of papers as a whole, and discusses general issues…

  11. High Performance Computing Modeling Advances Accelerator Science for High-Energy Physics

    SciTech Connect

    Amundson, James; Macridin, Alexandru; Spentzouris, Panagiotis

    2014-07-28

    The development and optimization of particle accelerators are essential for advancing our understanding of the properties of matter, energy, space, and time. Particle accelerators are complex devices whose behavior involves many physical effects on multiple scales. Therefore, advanced computational tools utilizing high-performance computing are essential for accurately modeling them. In the past decade, the US Department of Energy's SciDAC program has produced accelerator-modeling tools that have been employed to tackle some of the most difficult accelerator science problems. The authors discuss the Synergia framework and its applications to high-intensity particle accelerator physics. Synergia is an accelerator simulation package capable of handling the entire spectrum of beam dynamics simulations. The authors present Synergia's design principles and its performance on HPC platforms.
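Beam-dynamics frameworks such as Synergia ultimately compose maps for individual lattice elements into the map of a full machine. A toy sketch of that idea, using linear 2x2 transfer matrices for a thin-lens FODO cell (illustrative only; these are not Synergia's actual API or data structures):

```python
def mat_mul(a, b):
    """Multiply two 2x2 matrices stored as nested lists."""
    return [[a[0][0] * b[0][0] + a[0][1] * b[1][0], a[0][0] * b[0][1] + a[0][1] * b[1][1]],
            [a[1][0] * b[0][0] + a[1][1] * b[1][0], a[1][0] * b[0][1] + a[1][1] * b[1][1]]]

def drift(L):
    """Field-free drift of length L."""
    return [[1.0, L], [0.0, 1.0]]

def thin_quad(f):
    """Thin-lens quadrupole of focal length f (negative f defocuses)."""
    return [[1.0, 0.0], [-1.0 / f, 1.0]]

# One FODO cell: focusing quad, drift, defocusing quad, drift.
# The one-cell matrix is the product of the element maps, applied right-to-left.
f, L = 2.0, 1.0
cell = mat_mul(drift(L), mat_mul(thin_quad(-f), mat_mul(drift(L), thin_quad(f))))

det = cell[0][0] * cell[1][1] - cell[0][1] * cell[1][0]
trace = cell[0][0] + cell[1][1]
```

The determinant staying at 1 reflects the symplecticity of the linear maps, and |trace| < 2 is the usual stability condition for the one-turn (here, one-cell) matrix.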

  12. High Performance Computing Modeling Advances Accelerator Science for High-Energy Physics

    DOE PAGES

    Amundson, James; Macridin, Alexandru; Spentzouris, Panagiotis

    2014-07-28

    The development and optimization of particle accelerators are essential for advancing our understanding of the properties of matter, energy, space, and time. Particle accelerators are complex devices whose behavior involves many physical effects on multiple scales. Therefore, advanced computational tools utilizing high-performance computing are essential for accurately modeling them. In the past decade, the US Department of Energy's SciDAC program has produced accelerator-modeling tools that have been employed to tackle some of the most difficult accelerator science problems. The authors discuss the Synergia framework and its applications to high-intensity particle accelerator physics. Synergia is an accelerator simulation package capable of handling the entire spectrum of beam dynamics simulations. The authors present Synergia's design principles and its performance on HPC platforms.

  13. Advanced Practice Registered Nurses and Physician Assistants in Sleep Centers and Clinics: A Survey of Current Roles and Educational Background

    PubMed Central

    Colvin, Loretta; Cartwright, Ann; Collop, Nancy; Freedman, Neil; McLeod, Don; Weaver, Terri E.; Rogers, Ann E.

    2014-01-01

    Study Objectives: To survey Advanced Practice Registered Nurse (APRN) and Physician Assistant (PA) utilization, roles and educational background within the field of sleep medicine. Methods: Electronic surveys distributed to American Academy of Sleep Medicine (AASM) member centers and APRNs and PAs working within sleep centers and clinics. Results: Approximately 40% of responding AASM sleep centers reported utilizing APRNs or PAs in predominantly clinical roles. Of the APRNs and PAs surveyed, 95% reported responsibilities in sleep disordered breathing and more than 50% in insomnia and movement disorders. Most APRNs and PAs were prepared at the graduate level (89%), with sleep-specific education primarily through “on the job” training (86%). All APRNs surveyed were Nurse Practitioners (NPs), with approximately double the number of NPs compared to PAs. Conclusions: APRNs and PAs were reported in sleep centers at proportions similar to national estimates of NPs and PAs in physicians' offices. They report predominantly clinical roles, involving common sleep disorders. Given current predictions that the outpatient healthcare structure will change and the number of APRNs and PAs will increase, understanding the role and utilization of these professionals is necessary to plan for the future care of patients with sleep disorders. Surveyed APRNs and PAs reported a significant deficiency in formal and standardized sleep-specific education. Efforts to provide formal and standardized educational opportunities for APRNs and PAs that focus on their clinical roles within sleep centers could help fill a current educational gap. Citation: Colvin L, Cartwright Ann, Collop N, Freedman N, McLeod D, Weaver TE, Rogers AE. Advanced practice registered nurses and physician assistants in sleep centers and clinics: a survey of current roles and educational background. J Clin Sleep Med 2014;10(5):581-587. PMID:24812545

  14. Recent Advances in Cardiac Computed Tomography: Dual Energy, Spectral and Molecular CT Imaging

    PubMed Central

    Danad, Ibrahim; Fayad, Zahi A.; Willemink, Martin J.; Min, James K.

    2015-01-01

    Computed tomography (CT) has evolved into a powerful diagnostic tool, and it is impossible to imagine current clinical practice without CT imaging. Due to its widespread availability, ease of clinical application, superb sensitivity for detection of CAD, and non-invasive nature, CT has become a valuable tool within the armamentarium of the cardiologist. In the last few years, numerous technological advances in CT have occurred, including dual energy CT (DECT), spectral CT, and CT-based molecular imaging. By harnessing these advances in technology, cardiac CT has advanced beyond the mere evaluation of coronary stenosis to an imaging modality that permits accurate plaque characterization, assessment of myocardial perfusion, and even probing of molecular processes that are involved in coronary atherosclerosis. Novel innovations in CT contrast agents and pre-clinical spectral CT devices have paved the way for CT-based molecular imaging. PMID:26068288
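Dual-energy material characterization rests on solving a small linear system per voxel: the measured attenuation at two tube energies is modeled as a linear combination of two basis materials. A minimal sketch with made-up coefficients (illustrative values, not tabulated attenuation data):

```python
def decompose(mu_low, mu_high, basis):
    """Solve the 2x2 system  mu(E) = a1*mu1(E) + a2*mu2(E)  for (a1, a2).

    `basis` holds the attenuation coefficients of the two basis materials
    at the low and high energies: ((mu1_low, mu2_low), (mu1_high, mu2_high)).
    """
    (m1l, m2l), (m1h, m2h) = basis
    det = m1l * m2h - m2l * m1h
    a1 = (mu_low * m2h - m2l * mu_high) / det
    a2 = (m1l * mu_high - mu_low * m1h) / det
    return a1, a2

# Hypothetical basis pair: material 1 (soft-tissue-like), material 2 (iodine-like).
basis = ((0.20, 2.0), (0.18, 0.8))

# Synthetic voxel: 0.9 parts material 1 plus 0.05 parts material 2.
mu_low = 0.9 * 0.20 + 0.05 * 2.0    # attenuation seen at the low energy
mu_high = 0.9 * 0.18 + 0.05 * 0.8   # attenuation seen at the high energy
a1, a2 = decompose(mu_low, mu_high, basis)
```

With noiseless, consistent inputs the solve recovers the material fractions exactly; real scanners add noise, beam-hardening, and calibration terms on top of this core step.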

  15. Advances in Single-Photon Emission Computed Tomography Hardware and Software.

    PubMed

    Piccinelli, Marina; Garcia, Ernest V

    2016-02-01

    Nuclear imaging techniques remain today's most reliable modality for the assessment and quantification of myocardial perfusion. In recent years, the field has experienced tremendous progress both in terms of dedicated cameras for cardiac applications and software techniques for image reconstruction. The most recent advances in single-photon emission computed tomography hardware and software are reviewed, focusing on how these improvements have resulted in an even more powerful diagnostic tool with reduced injected radiation dose and acquisition time.
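The iterative reconstruction software advances alluded to above are typified by the MLEM algorithm for emission tomography. A tiny textbook sketch on a made-up 3x2 system matrix (not a production SPECT reconstruction code):

```python
def mlem(A, y, iterations=1000):
    """Maximum-likelihood EM reconstruction for emission tomography.

    Update rule:  x_j <- (x_j / s_j) * sum_i A_ij * y_i / (A x)_i,
    where s_j = sum_i A_ij is the sensitivity of pixel j.
    """
    n_det, n_pix = len(A), len(A[0])
    x = [1.0] * n_pix                                     # uniform initial estimate
    sens = [sum(A[i][j] for i in range(n_det)) for j in range(n_pix)]
    for _ in range(iterations):
        proj = [sum(A[i][j] * x[j] for j in range(n_pix)) for i in range(n_det)]
        for j in range(n_pix):
            back = sum(A[i][j] * y[i] / proj[i] for i in range(n_det))
            x[j] *= back / sens[j]
    return x

# 2-pixel phantom viewed through 3 detector bins (system matrix is made up).
A = [[1.0, 0.2],
     [0.3, 1.0],
     [0.5, 0.5]]
x_true = [4.0, 2.0]
y = [sum(A[i][j] * x_true[j] for j in range(2)) for i in range(3)]  # noiseless data
x_rec = mlem(A, y)
```

Because the synthetic data are noiseless and consistent, the iterates converge toward the true activity; with noisy counts, MLEM is normally stopped early or regularized.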

  16. Condition monitoring through advanced sensor and computational technology : final report (January 2002 to May 2005).

    SciTech Connect

    Kim, Jung-Taek; Luk, Vincent K.

    2005-05-01

    The overall goal of this joint research project was to develop and demonstrate advanced sensors and computational technology for continuous monitoring of the condition of components, structures, and systems in advanced and next-generation nuclear power plants (NPPs). This project included investigating and adapting several advanced sensor technologies from Korean and US national laboratory research communities, some of which were developed and applied in non-nuclear industries. The project team investigated and developed sophisticated signal processing, noise reduction, and pattern recognition techniques and algorithms. The researchers installed sensors and conducted condition monitoring tests on two test loops, a check valve (an active component) and a piping elbow (a passive component), to demonstrate the feasibility of using advanced sensors and computational technology to achieve the project goal. Acoustic emission (AE) devices, optical fiber sensors, accelerometers, and ultrasonic transducers (UTs) were used to detect mechanical vibratory response of check valve and piping elbow in normal and degraded configurations. Chemical sensors were also installed to monitor the water chemistry in the piping elbow test loop. Analysis results of processed sensor data indicate that it is feasible to differentiate between the normal and degraded (with selected degradation mechanisms) configurations of these two components from the acquired sensor signals, but it is questionable that these methods can reliably identify the level and type of degradation. Additional research and development efforts are needed to refine the differentiation techniques and to reduce the level of uncertainties.
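Simple statistical features of a vibration trace are one way such monitoring can separate normal from degraded behavior: impulsive faults raise the kurtosis of an accelerometer signal well above that of smooth vibration. A sketch on synthetic data (illustrative only; not the project's test-loop signals or algorithms):

```python
import math

def rms(x):
    """Root-mean-square level of a signal."""
    return math.sqrt(sum(v * v for v in x) / len(x))

def kurtosis(x):
    """Sample kurtosis (a pure sinusoid gives 1.5; impulsive signals give much more)."""
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x) / n
    return sum((v - mean) ** 4 for v in x) / (n * var * var)

# Synthetic accelerometer traces: a smooth sinusoid, and the same signal
# with periodic impacts superimposed (a crude stand-in for a degraded component).
n = 2048
normal = [math.sin(2 * math.pi * 50 * i / n) for i in range(n)]
degraded = [v + (4.0 if i % 256 == 0 else 0.0) for i, v in enumerate(normal)]
```

Thresholding such features can flag that something changed; as the abstract notes, identifying the level and type of degradation reliably requires more than this.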

  17. Computer-assisted virtual planning and surgical template fabrication for frontoorbital advancement.

    PubMed

    Soleman, Jehuda; Thieringer, Florian; Beinemann, Joerg; Kunz, Christoph; Guzman, Raphael

    2015-05-01

    OBJECT The authors describe a novel technique using computer-assisted design (CAD) and computer-assisted manufacturing (CAM) for the fabrication of individualized 3D printed surgical templates for frontoorbital advancement surgery. METHODS Two patients underwent frontoorbital advancement surgery for unilateral coronal synostosis. Virtual surgical planning (SurgiCase-CMF, version 5.0, Materialise) was done by virtual mirroring techniques and superposition of an age-matched normative 3D pediatric skull model. Based on these measurements, surgical templates were fabricated using a 3D printer. Bifrontal craniotomy and the osteotomies for the orbital bandeau were performed based on the sterilized 3D templates. The remodeling was then done placing the bone plates within the negative 3D templates and fixing them using absorbable poly-dl-lactic acid plates and screws. RESULTS Both patients exhibited a satisfying head shape postoperatively and at follow-up. No surgery-related complications occurred. The cutting and positioning of the 3D surgical templates proved to be very accurate and easy to use as well as reproducible and efficient. CONCLUSIONS Computer-assisted virtual planning and 3D template fabrication for frontoorbital advancement surgery leads to reconstructions based on standardized measurements, precludes subjective remodeling, and seems to be overall safe and feasible. A larger series of patients with long-term follow-up is needed for further evaluation of this novel technique.

  18. Recent advances in 3D computed tomography techniques for simulation and navigation in hepatobiliary pancreatic surgery.

    PubMed

    Uchida, Masafumi

    2014-04-01

    A few years ago it could take several hours to complete a 3D image using a 3D workstation. Thanks to advances in computer science, obtaining results of interest now requires only a few minutes. Many recent 3D workstations or multimedia computers are equipped with onboard 3D virtual patient modeling software, which enables patient-specific preoperative assessment and virtual planning, navigation, and tool positioning. Although medical 3D imaging can now be conducted using various modalities, including computed tomography (CT), magnetic resonance imaging (MRI), positron emission tomography (PET), and ultrasonography (US) among others, the highest quality images are obtained using CT data, and CT images are now the most commonly used source of data for 3D simulation and navigation imaging. If the 2D source image is bad, no amount of 3D image manipulation in software will provide a quality 3D image. In this exhibition, the recent advances in CT imaging techniques and 3D visualization of the hepatobiliary and pancreatic abnormalities are featured, including scan and image reconstruction techniques, contrast-enhanced techniques, new applications of advanced CT scan techniques, and new virtual reality simulation and navigation imaging.

  19. Computer-generated formulas for three-center nuclear-attraction integrals (electrostatic potential) for Slater-type orbitals

    NASA Technical Reports Server (NTRS)

    Jones, H. W.

    1984-01-01

    The computer-assisted C-matrix, Loewdin-alpha-function, single-center expansion method in spherical harmonics has been applied to the three-center nuclear-attraction integral (potential due to the product of separated Slater-type orbitals). Exact formulas are produced for 13 terms of an infinite series that permits evaluation to ten decimal digits of an example using 1s orbitals.
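For reference, the integral in question and the normalized Slater-type orbitals it involves can be written as follows; the expansion form is schematic, with the paper's C-matrix/Löwdin-α-function machinery supplying the actual coefficients:

```latex
% Three-center nuclear-attraction integral over STOs on centers A and B,
% with the attracting nucleus at center C:
I = \int \chi_{n_a l_a m_a}^{*}(\zeta_a,\mathbf{r}-\mathbf{A})\,
    \frac{1}{|\mathbf{r}-\mathbf{C}|}\,
    \chi_{n_b l_b m_b}(\zeta_b,\mathbf{r}-\mathbf{B})\, d^3r,
\qquad
\chi_{nlm}(\zeta,\mathbf{r}) = \frac{(2\zeta)^{\,n+1/2}}{\sqrt{(2n)!}}\,
    r^{\,n-1} e^{-\zeta r}\, Y_{lm}(\hat{\mathbf{r}}).

% Single-center expansion, schematically: an STO displaced along the z axis
% from the expansion center is rewritten as a sum over spherical harmonics
% about that center (m is preserved for an axial displacement a),
\chi_{nlm}(\zeta,\mathbf{r}-a\hat{\mathbf{z}}) =
    \sum_{l'=|m|}^{\infty} \alpha_{l'}^{\,nl}(\zeta; a, r)\, Y_{l'm}(\hat{\mathbf{r}}).
```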

  20. Computer-Assisted Instruction in the Context of the Advanced Instructional System: Authoring Support Software. Final Report.

    ERIC Educational Resources Information Center

    Montgomery, Ann D.; Judd, Wilson A.

    This report details the design, development, and implementation of computer software to support the cost-effective production of computer assisted instruction (CAI) within the context of the Advanced Instructional System (AIS) located at Lowry Air Force Base. The report supplements the computer managed Air Force technical training that is…

  1. ADVANCING THE FUNDAMENTAL UNDERSTANDING AND SCALE-UP OF TRISO FUEL COATERS VIA ADVANCED MEASUREMENT AND COMPUTATIONAL TECHNIQUES

    SciTech Connect

    Biswas, Pratim; Al-Dahhan, Muthanna

    2012-11-01

    The objectives of this project are to advance the fundamental understanding of the hydrodynamics by systematically investigating the effect of design and operating variables, to evaluate the reported dimensionless groups as scaling factors, and to establish a reliable scale-up methodology for TRISO fuel particle spouted bed coaters based on hydrodynamic similarity via advanced measurement and computational techniques. An additional objective is to develop an on-line, non-invasive measurement technique based on gamma-ray densitometry (i.e., nuclear gauge densitometry) that can be installed and used for coater process monitoring to ensure proper performance and operation and to facilitate the developed scale-up methodology. To achieve these objectives, the work will use optical probes and gamma-ray computed tomography (CT) (for measurements of solids/voidage holdup cross-sectional distribution and radial profiles along the bed height, spout diameter, and fountain height) and radioactive particle tracking (RPT) (for measurements of the 3D solids flow field, velocity, turbulence parameters, circulation time, solids Lagrangian trajectories, and many other spouted-bed hydrodynamic parameters). In addition, gas dynamic measurement techniques and pressure transducers will be utilized to complement the obtained information. The measurements obtained by these techniques will be used as benchmark data to evaluate and validate computational fluid dynamics (CFD) models (two-fluid model or discrete particle model) and their closures. The validated CFD models and closures will be used to facilitate the developed methodology for scale-up, design, and hydrodynamic similarity. Successful execution of this work and the proposed tasks will advance the fundamental understanding of the coater flow field and quantify it for proper and safe design, scale-up, and performance. Such achievements will overcome the barriers to AGR applications and will help assure that the US maintains
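
The gamma-ray densitometry mentioned above rests on Beer-Lambert attenuation: with calibration count rates measured through a gas-filled bed and through a packed bed of solids, the chord-averaged solids holdup follows from the logarithm of the measured count rate. A minimal sketch, assuming linear mixing of attenuation coefficients; all count-rate values are invented for illustration.

```python
import math

def solids_holdup(count, count_gas, count_solid):
    """Chord-averaged solids holdup from gamma-ray count rates.

    Beer-Lambert gives count = count_gas * (count_solid/count_gas)**holdup
    under the linear-mixing assumption, so holdup is recovered from logs.
    """
    return math.log(count_gas / count) / math.log(count_gas / count_solid)

# Invented calibration and measurement values (counts per second).
holdup = solids_holdup(count=8000.0, count_gas=10000.0, count_solid=4000.0)
print(round(holdup, 3))  # 0.244
```

A scan of such chordal measurements across the bed is what the CT reconstruction above turns into a cross-sectional holdup distribution.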

  2. 77 FR 75180 - National Center for Advancing Translational Sciences; Notice of Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-19

    ... Room 6, Bethesda, MD 20892. Contact Person: Danilo A Tagle, Ph.D., Executive Secretary, National Center..., Danilo.Tagle@nih.gov . Any interested person may file written comments with the committee by...

  3. Center for Advancing Systemic Heliophysics Education (CAHEd): Outreach through Community Building

    NASA Astrophysics Data System (ADS)

    Whitman, K.; Kadooka, M.

    2012-12-01

    In 2010, the Center for Advancing Systemic Heliophysics Education (CAHEd) was established at the University of Hawaii Institute for Astronomy to promote public outreach and education in solar astronomy and heliophysics. The primary objectives of CAHEd are to increase public awareness of the significance of heliophysics and space weather through lectures, open houses, and online resources. In addition, CAHEd works to educate secondary teachers and students on the physics concepts essential for understanding heliophysics. For the first two years of the NASA-sponsored grant, CAHEd focused its efforts on teachers and students in Hawaii. Approaching its third year, CAHEd has begun to expand to a national level, partnering with teachers in locations across the United States. Two core goals of CAHEd are discussed here: collaboration with a select group of Master Teachers and student mentoring in research projects. CAHEd has built a partnership with over a dozen Master Teachers who work with scientists to develop curriculum for the middle and high school classroom. These teachers come from diverse backgrounds with a variety of scientific experiences. Master Teachers play the important role of assessing and improving CAHEd curriculum and provide support for CAHEd activities. All Master Teachers participate in in-depth multi-day workshops that allow them to develop a deeper understanding of the science behind heliophysics. After building a strong background, Master Teachers organize workshops, growing a community of teachers who incorporate heliophysics into their curriculum. Scientists also work closely with middle school and high school students who wish to pursue study in heliophysics. Student research is a fundamental goal of CAHEd, and scientists work with students to complete projects for school and state science fairs. Four students have completed award-winning heliophysics projects to date and three of the four students have gone on to pursue a second

  4. NASA. Lewis Research Center Advanced Modulation and Coding Project: Introduction and overview

    NASA Technical Reports Server (NTRS)

    Budinger, James M.

    1992-01-01

    The Advanced Modulation and Coding Project at LeRC is sponsored by the Office of Space Science and Applications, Communications Division, Code EC, at NASA Headquarters and conducted by the Digital Systems Technology Branch of the Space Electronics Division. Advanced Modulation and Coding is one of three focused technology development projects within the branch's overall Processing and Switching Program. The program consists of industry contracts for developing proof-of-concept (POC) and demonstration model hardware, university grants for analyzing advanced techniques, and in-house integration and testing for performance verification and systems evaluation. The Advanced Modulation and Coding Project is broken into five elements: (1) bandwidth- and power-efficient modems; (2) high-speed codecs; (3) digital modems; (4) multichannel demodulators; and (5) very high-data-rate modems. At least one contract and one grant were awarded for each element.

  5. Center for space microelectronics technology

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The 1992 Technical Report of the Jet Propulsion Laboratory Center for Space Microelectronics Technology summarizes the technical accomplishments, publications, presentations, and patents of the center during the past year. The report lists 187 publications, 253 presentations, and 111 new technology reports and patents in the areas of solid-state devices, photonics, advanced computing, and custom microcircuits.

  6. NREL - Advanced Vehicles and Fuels Basics - Center for Transportation Technologies and Systems 2010

    SciTech Connect

    2010-01-01

    We can improve the fuel economy of our cars, trucks, and buses by designing them to use the energy in fuels more efficiently. Researchers at the National Renewable Energy Laboratory (NREL) are helping the nation achieve these goals by developing transportation technologies like: advanced vehicle systems and components; alternative fuels; as well as fuel cells, hybrid electric, and plug-in hybrid vehicles. For a text version of this video visit http://www.nrel.gov/learning/advanced_vehicles_fuels.html

  7. NREL - Advanced Vehicles and Fuels Basics - Center for Transportation Technologies and Systems 2010

    ScienceCinema

    None

    2016-07-12

    We can improve the fuel economy of our cars, trucks, and buses by designing them to use the energy in fuels more efficiently. Researchers at the National Renewable Energy Laboratory (NREL) are helping the nation achieve these goals by developing transportation technologies like: advanced vehicle systems and components; alternative fuels; as well as fuel cells, hybrid electric, and plug-in hybrid vehicles. For a text version of this video visit http://www.nrel.gov/learning/advanced_vehicles_fuels.html

  8. PREFACE: 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT2013)

    NASA Astrophysics Data System (ADS)

    Wang, Jianxiong

    2014-06-01

    This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2013), which took place on 16-21 May 2013 at the Institute of High Energy Physics, Chinese Academy of Sciences, Beijing, China. The workshop series brings together computer science researchers and practitioners, and researchers from particle physics and related fields, to explore and confront the boundaries of computing, automatic data analysis, and theoretical calculation techniques. This year's edition of the workshop brought together over 120 participants from all over the world. Eighteen invited speakers presented key topics on the universe in computers, computing in the Earth sciences, multivariate data analysis, and automated computation in quantum field theory, as well as computing and data analysis challenges in many fields. Over 70 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. The round table discussions on open source, knowledge sharing, and scientific collaboration stimulated fresh thinking on these issues in the respective areas. ACAT 2013 was generously sponsored by the Chinese Academy of Sciences (CAS), the National Natural Science Foundation of China (NSFC), Brookhaven National Laboratory in the USA (BNL), Peking University (PKU), the Theoretical Physics Center for Science Facilities of CAS (TPCSF-CAS), and Sugon. We would like to thank all the participants for their scientific contributions and for their enthusiastic participation in all the activities of the workshop. Further information on ACAT 2013 can be found at http://acat2013.ihep.ac.cn. Professor Jianxiong Wang, Institute of High Energy Physics, Chinese Academy of Sciences. Details of committees and sponsors are available in the PDF.

  9. PREFACE: 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011)

    NASA Astrophysics Data System (ADS)

    Teodorescu, Liliana; Britton, David; Glover, Nigel; Heinrich, Gudrun; Lauret, Jérôme; Naumann, Axel; Speer, Thomas; Teixeira-Dias, Pedro

    2012-06-01

    This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011), which took place on 5-7 September 2011 at Brunel University, UK. The workshop series, which began in 1990 in Lyon, France, brings together computer science researchers and practitioners, and researchers from particle physics and related fields, in order to explore and confront the boundaries of computing, automatic data analysis, and theoretical calculation techniques. It is a forum for the exchange of ideas among the fields, exploring and promoting cutting-edge computing, data analysis, and theoretical calculation techniques in fundamental physics research. This year's edition of the workshop brought together over 100 participants from all over the world. Fourteen invited speakers presented key topics on computing ecosystems, cloud computing, multivariate data analysis, and symbolic and automatic theoretical calculations, as well as computing and data analysis challenges in astrophysics, bioinformatics, and musicology. Over 80 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. Panel and round table discussions on data management and multivariate data analysis uncovered new ideas and collaboration opportunities in the respective areas. This edition of ACAT was generously sponsored by the Science and Technology Facilities Council (STFC), the Institute for Particle Physics Phenomenology (IPPP) at Durham University, Brookhaven National Laboratory in the USA, and Dell. We would like to thank all the participants of the workshop for the high level of their scientific contributions and for their enthusiastic participation in all its activities, which were, ultimately, the key factors in the

  10. Rethinking Human-Centered Computing: Finding the Customer and Negotiated Interactions at the Airport

    NASA Technical Reports Server (NTRS)

    Wales, Roxana; O'Neill, John; Mirmalek, Zara

    2003-01-01

    The breakdown in the air transportation system over the past several years raises an interesting question for researchers: How can we help improve the reliability of airline operations? In offering some answers to this question, we make a statement about Human-Centered Computing (HCC). First we offer the definition that HCC is a multi-disciplinary research and design methodology focused on supporting humans as they use technology, by including cognitive and social systems, computational tools, and the physical environment in the analysis of organizational systems. We suggest that a key element in understanding organizational systems is that there are external cognitive and social systems (customers) as well as internal cognitive and social systems (employees), and that they interact dynamically to impact the organization and its work. The design of human-centered intelligent systems must take this outside-inside dynamic into account. In the past, the design of intelligent systems has focused on supporting the work and improvisation requirements of employees but has often assumed that customer requirements are implicitly satisfied by employee requirements. Taking a customer-centric perspective provides a different lens for understanding this outside-inside dynamic, the work of the organization, and the requirements of both customers and employees. In this article we will: 1) demonstrate how the use of ethnographic methods revealed the important outside-inside dynamic in an airline, specifically the consequential relationship between external customer requirements and perspectives and internal organizational processes and perspectives as they came together in a changing environment; 2) describe how taking a customer-centric perspective identifies places where the impact of the outside-inside dynamic is most critical and requires technology that can be adaptive; 3) define and discuss the place of negotiated interactions in airline operations, identifying how these

  11. Stable and highly persistent quinoxaline-centered metalloorganic radical anions: preparation, structural, spectroscopic, and computational investigations.

    PubMed

    Choua, Sylvie; Djukic, Jean-Pierre; Dalléry, Jérôme; Bieber, André; Welter, Richard; Gisselbrecht, Jean-Paul; Turek, Philippe; Ricard, Louis

    2009-01-05

    Coordination of diazines such as quinoxaline to transition metals stabilizes radical anions generated by chemical or electrochemical cathodic reduction. However, even though various sorts of radical anionic diazines have been subjected to spectroscopic investigations in the recent past, reports combining structural, solid-state electron paramagnetic resonance (EPR), and computational investigations of kinetically stable species are still missing. In this study, four radical anions derived from tricarbonylmanganese- and tricarbonylrhenium-bound quinoxaline chelates, embedded within a triple-decker architecture, were prepared from neutral substrates by chemical reduction over alkali metals (K, Rb); the electronic structure of these metalloorganic paramagnetic salts was investigated by means of structural X-ray diffraction analysis, electrochemistry, solution and crystal EPR spectroscopy, and density functional theory (DFT). Unprecedented structures of three manganese-bound and one rhenium-bound quinoxaline-derived paramagnetic salts were obtained from solutions of the corresponding radical anions crystallized in the presence of cryptand 222. A comparative study of the structures of the anionic and neutral quinoxaline complexes shows that reduction has no significant impact on the coordination mode of the metal centers or on the overall geometry of the triple-decker architecture. The most notable changes in the radical-anionic metalloorganic species, as compared to the neutral parent molecules, comprise a slight hapticity shift of the metal-bound benzyl moiety and a weak intraannular distortion of the quinoxalyl core. Single-crystal EPR experiments carried out with the rhenium and manganese compounds produced the respective anisotropic g tensors, which were found in each case to be essentially located at the quinoxalyl fragment. Computations, carried out using DFT methods (B3LYP-LANL2DZ and Becke-Perdew-TZP), corroborated the

  12. Advanced Simulation and Computing Fiscal Year 2011-2012 Implementation Plan, Revision 0

    SciTech Connect

    McCoy, M; Phillips, J; Hopson, J; Meisner, R

    2010-04-22

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering (D&E) programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model
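
The QMU methodology cited in these implementation plans is often summarized by a confidence ratio M/U, where the margin M is the distance between a best-estimate performance metric and its requirement threshold, and U is the uncertainty attached to that estimate. A minimal sketch with invented numbers; real QMU assessments aggregate many metrics with far more careful statistics.

```python
def confidence_ratio(best_estimate, requirement, uncertainty):
    """QMU-style confidence ratio.

    Margin M is the distance from the best estimate to the requirement
    threshold; a ratio M/U greater than 1 indicates the margin exceeds
    the quantified uncertainty.
    """
    margin = best_estimate - requirement
    return margin / uncertainty

# Invented illustrative values: a metric estimated at 12.0 against a
# requirement of 9.0, with an uncertainty of 2.0 in the same units.
ratio = confidence_ratio(best_estimate=12.0, requirement=9.0, uncertainty=2.0)
print(ratio)        # 1.5
print(ratio > 1.0)  # True: margin comfortably exceeds uncertainty
```

The plans' emphasis on quantifying "critical margins and uncertainties" amounts to computing and defending both inputs to this ratio with high-fidelity 3D simulation rather than nuclear testing.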

  13. Advanced Simulation and Computing FY08-09 Implementation Plan Volume 2 Revision 0

    SciTech Connect

    McCoy, M; Kusnezov, D; Bickel, T; Hopson, J

    2007-04-25

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future nonnuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. 
Moreover, ASC has restructured its business model from one

  14. Advanced Simulation and Computing FY07-08 Implementation Plan Volume 2

    SciTech Connect

    Kusnezov, D; Hale, A; McCoy, M; Hopson, J

    2006-06-22

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future nonnuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program will require the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. 
Moreover, ASC has restructured its business model from

  15. Advanced Simulation & Computing FY09-FY10 Implementation Plan Volume 2, Rev. 0

    SciTech Connect

    Meisner, R; Perry, J; McCoy, M; Hopson, J

    2008-04-30

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future nonnuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC)1 is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. 
Moreover, ASC has restructured its business model from one

  16. Advanced Simulation and Computing FY09-FY10 Implementation Plan, Volume 2, Revision 0.5

    SciTech Connect

    Meisner, R; Hopson, J; Peery, J; McCoy, M

    2008-10-07

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC)1 is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one

  17. Advanced Simulation and Computing FY08-09 Implementation Plan, Volume 2, Revision 0.5

    SciTech Connect

    Kusnezov, D; Bickel, T; McCoy, M; Hopson, J

    2007-09-13

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC)1 is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from

  18. Advanced Simulation and Computing FY09-FY10 Implementation Plan Volume 2, Rev. 1

    SciTech Connect

    Kissel, L

    2009-04-01

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one that

  19. Advanced Simulation and Computing FY10-FY11 Implementation Plan Volume 2, Rev. 0.5

    SciTech Connect

    Meisner, R; Peery, J; McCoy, M; Hopson, J

    2009-09-08

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering (D&E) programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model

  20. Center for Programming Models for Scalable Parallel Computing: Future Programming Models

    SciTech Connect

    Gao, Guang, R.

    2008-07-24

    The mission of the pmodel center project is to develop software technology to support scalable parallel programming models for terascale systems. The goal of the UD subproject is to develop an efficient and robust methodology and tools for HPC programming; more specifically, the focus is on developing new programming models that help programmers port their applications onto parallel high-performance computing systems. Over the five years of this research, the landscape of microprocessor chip architecture has undergone a fundamental change: multi-core/many-core chip architectures have emerged as the mainstream technology and will have a major impact on future generations of parallel machines. The programming model for shared-address-space machines is becoming critical to such multi-core architectures. Our research highlight is an in-depth study of proposed fine-grain parallelism/multithreading support on such future-generation multi-core architectures. Our research has demonstrated the significant impact such a fine-grain multithreading model can have on the productivity of parallel programming models and their efficient implementation.
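    The shared-address-space, fine-grain multithreading model discussed above can be illustrated with a minimal sketch (not from the project itself; the function name, chunk size, and worker count are illustrative assumptions): a reduction is split into many small tasks that worker threads pick up over shared memory.

    ```python
    # Minimal sketch of fine-grained task parallelism on a shared-address-space
    # machine: the dot product is split into many small chunks, each a
    # lightweight task scheduled onto a pool of worker threads that all see
    # the same arrays in shared memory.
    from concurrent.futures import ThreadPoolExecutor

    def dot_product(a, b, workers=4, grain=256):
        """Compute sum(a[i]*b[i]) by dividing the index range into
        fine-grained chunks of size `grain` and reducing the partial sums."""
        chunks = [(i, min(i + grain, len(a))) for i in range(0, len(a), grain)]
        with ThreadPoolExecutor(max_workers=workers) as pool:
            partials = pool.map(
                lambda span: sum(a[i] * b[i] for i in range(*span)), chunks)
        return sum(partials)

    print(dot_product(list(range(1000)), [1] * 1000))  # 499500
    ```

    Shrinking `grain` creates more (finer) tasks and better load balance at the cost of scheduling overhead, which is exactly the trade-off fine-grain multithreading runtimes aim to make cheap.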