Science.gov

Sample records for advanced computations department

  1. Advanced Computer Typography.

    DTIC Science & Technology

    1981-12-01

ADVANCED COMPUTER TYPOGRAPHY by A. V. Hershey. Naval Postgraduate School, Monterey, California, December 1981. Final report covering Dec 1979 - Dec 1981; report number NPS012-81-005; unclassified; approved for public release.

  2. 75 FR 64720 - DOE/Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-20

.../Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing... Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U.S. Department...

  3. 75 FR 9887 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-04

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing... Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U.S. Department...

  4. 78 FR 6087 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-29

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing..., Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U.S. Department of...

  5. 76 FR 9765 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-22

    ... Advanced Scientific Computing Advisory Committee AGENCY: Office of Science, Department of Energy. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing..., Office of Advanced Scientific Computing Research, SC-21/Germantown Building, U.S. Department of...

  6. 77 FR 45345 - DOE/Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-31

    .../Advanced Scientific Computing Advisory Committee AGENCY: Office of Science, Department of Energy. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing... Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U.S. Department...

  7. A Visit to the Computer Science Department,

    DTIC Science & Technology

    1983-01-11

A VISIT TO THE COMPUTER SCIENCE DEPARTMENT by Zhong Qing. Edited translation FTD-ID(RS)T-1722-82, 11 January 1983; approved for public release, distribution unlimited. The article notes that the Beijing Aeronautics Institute and other institutes all have computer science departments, and asks why computer science departments are needed.

  8. 75 FR 43518 - Advanced Scientific Computing Advisory Committee; Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-26

    ... Advanced Scientific Computing Advisory Committee; Meeting AGENCY: Office of Science, DOE. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing Advisory..., Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U.S. Department of...

  9. 76 FR 41234 - Advanced Scientific Computing Advisory Committee Charter Renewal

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-13

    ... Advanced Scientific Computing Advisory Committee Charter Renewal AGENCY: Department of Energy, Office of... Administration, notice is hereby given that the Advanced Scientific Computing Advisory Committee will be renewed... concerning the Advanced Scientific Computing program in response only to charges from the Director of...

  10. 78 FR 41046 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-09

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION... hereby given that the Advanced Scientific Computing Advisory Committee will be renewed for a two-year... (DOE), on the Advanced Scientific Computing Research Program managed by the Office of...

  11. Advances in Computational Astrophysics

    SciTech Connect

    Calder, Alan C.; Kouzes, Richard T.

    2009-03-01

I was invited, along with a colleague from Stony Brook, to be the guest editor for a special issue of Computing in Science and Engineering. This is the guest editors' introduction to that special issue; Alan and I wrote the introduction and served as editors for the four papers published in the special edition.

  12. On teaching computer ethics within a computer science department.

    PubMed

    Quinn, Michael J

    2006-04-01

    The author has surveyed a quarter of the accredited undergraduate computer science programs in the United States. More than half of these programs offer a 'social and ethical implications of computing' course taught by a computer science faculty member, and there appears to be a trend toward teaching ethics classes within computer science departments. Although the decision to create an 'in house' computer ethics course may sometimes be a pragmatic response to pressure from the accreditation agency, this paper argues that teaching ethics within a computer science department can provide students and faculty members with numerous benefits. The paper lists topics that can be covered in a computer ethics course and offers some practical suggestions for making the course successful.

  13. 75 FR 57742 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-22

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION... Scientific Computing Advisory Committee (ASCAC). Federal Advisory Committee Act (Pub. L. 92-463, 86 Stat. 770...: Melea Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building;...

  14. 76 FR 45786 - Advanced Scientific Computing Advisory Committee; Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-01

    ... Advanced Scientific Computing Advisory Committee; Meeting AGENCY: Office of Science, Department of Energy... Computing Advisory Committee (ASCAC). Federal Advisory Committee Act (Pub. L. 92-463, 86 Stat. 770) requires... INFORMATION CONTACT: Melea Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown...

  15. Advanced computations in plasma physics

    NASA Astrophysics Data System (ADS)

    Tang, W. M.

    2002-05-01

Scientific simulation in tandem with theory and experiment is an essential tool for understanding complex plasma behavior. In this paper we review recent progress and future directions for advanced simulations in magnetically confined plasmas with illustrative examples chosen from magnetic confinement research areas such as microturbulence, magnetohydrodynamics, magnetic reconnection, and others. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales together with access to powerful new computational resources. In particular, the fusion energy science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPP's). A good example is the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPP's to produce three-dimensional, general geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of turbulence self-regulation by zonal flows. It should be emphasized that these calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. The associated scientific excitement should serve to stimulate improved cross-cutting collaborations with other fields.

  16. Center for Advanced Computational Technology

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    2000-01-01

The Center for Advanced Computational Technology (ACT) was established to serve as a focal point for diverse research activities pertaining to application of advanced computational technology to future aerospace systems. These activities include the use of numerical simulations, artificial intelligence methods, multimedia and synthetic environments, and computational intelligence, in the modeling, analysis, sensitivity studies, optimization, design and operation of future aerospace systems. The Center is located at NASA Langley and is an integral part of the School of Engineering and Applied Science of the University of Virginia. The Center has four specific objectives: 1) conduct innovative research on applications of advanced computational technology to aerospace systems; 2) act as pathfinder by demonstrating to the research community what can be done (high-potential, high-risk research); 3) help in identifying future directions of research in support of the aeronautical and space missions of the twenty-first century; and 4) help in the rapid transfer of research results to industry and in broadening awareness among researchers and engineers of the state-of-the-art in applications of advanced computational technology to the analysis, design, prototyping and operations of aerospace and other high-performance engineering systems. In addition to research, Center activities include helping in the planning and coordination of the activities of a multi-center team of NASA and JPL researchers who are developing an intelligent synthesis environment for future aerospace systems; organizing workshops and national symposia; as well as writing state-of-the-art monographs and NASA special publications on timely topics.

  17. Advanced Computation in Plasma Physics

    NASA Astrophysics Data System (ADS)

    Tang, William

    2001-10-01

    Scientific simulation in tandem with theory and experiment is an essential tool for understanding complex plasma behavior. This talk will review recent progress and future directions for advanced simulations in magnetically-confined plasmas with illustrative examples chosen from areas such as microturbulence, magnetohydrodynamics, magnetic reconnection, and others. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales together with access to powerful new computational resources. In particular, the fusion energy science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPP's). A good example is the effective usage of the full power of multi-teraflop MPP's to produce 3-dimensional, general geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of turbulence self-regulation by zonal flows. It should be emphasized that these calculations, which typically utilized billions of particles for tens of thousands time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. 
The associated scientific excitement should serve to stimulate improved cross-cutting collaborations with other fields.

  18. Advanced flight computer. Special study

    NASA Technical Reports Server (NTRS)

    Coo, Dennis

    1995-01-01

This report documents a special study to define a 32-bit radiation hardened, SEU tolerant flight computer architecture, and to investigate current or near-term technologies and development efforts that contribute to the Advanced Flight Computer (AFC) design and development. An AFC processing node architecture is defined. Each node may consist of a multi-chip processor as needed. The modular, building block approach uses VLSI technology and packaging methods that demonstrate a feasible AFC module in 1998 that meets the AFC goals. The defined architecture and approach demonstrate a clear low-risk, low-cost path to the 1998 production goal, with intermediate prototypes in 1996.

  19. 76 FR 64330 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-18

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION... Reliability, Diffusion on Complex Networks, and Reversible Software Execution Systems Report from Applied Math... at: (301) 903-7486 or by email at: Melea.Baker@science.doe.gov . You must make your request for...

  20. 78 FR 56871 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-16

    ... Advanced Scientific Computing Advisory Committee AGENCY: Office of Science, Department of Energy. ACTION... Exascale technical approaches subcommittee Facilities update Report from Applied Math Committee of Visitors...: ( Melea.Baker@science.doe.gov ). You must make your request for an oral statement at least five...

  1. Aerodynamic Analyses Requiring Advanced Computers, part 2

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Papers given at the conference present the results of theoretical research on aerodynamic flow problems requiring the use of advanced computers. Topics discussed include two-dimensional configurations, three-dimensional configurations, transonic aircraft, and the space shuttle.

  2. Aerodynamic Analyses Requiring Advanced Computers, Part 1

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Papers are presented which deal with results of theoretical research on aerodynamic flow problems requiring the use of advanced computers. Topics discussed include: viscous flows, boundary layer equations, turbulence modeling and Navier-Stokes equations, and internal flows.

  3. Bringing Advanced Computational Techniques to Energy Research

    SciTech Connect

    Mitchell, Julie C

    2012-11-17

    Please find attached our final technical report for the BACTER Institute award. BACTER was created as a graduate and postdoctoral training program for the advancement of computational biology applied to questions of relevance to bioenergy research.

  4. Quantum chromodynamics with advanced computing

    SciTech Connect

    Kronfeld, Andreas S.; /Fermilab

    2008-07-01

    We survey results in lattice quantum chromodynamics from groups in the USQCD Collaboration. The main focus is on physics, but many aspects of the discussion are aimed at an audience of computational physicists.

  5. Identifying the Computer Competency Levels of Recreation Department Undergraduates

    ERIC Educational Resources Information Center

    Zorba, Erdal

    2011-01-01

Computer-based and web-based applications are major instructional tools for increasing undergraduates' motivation at school. In the recreation field, usage of computer- and internet-based recreational applications has become more prevalent in order to present visual and interactive entertainment activities. Recreation department undergraduates…

  6. Advanced Biomedical Computing Center (ABCC) | DSITP

    Cancer.gov

    The Advanced Biomedical Computing Center (ABCC), located in Frederick Maryland (MD), provides HPC resources for both NIH/NCI intramural scientists and the extramural biomedical research community. Its mission is to provide HPC support, to provide collaborative research, and to conduct in-house research in various areas of computational biology and biomedical research.

  7. Advanced laptop and small personal computer technology

    NASA Technical Reports Server (NTRS)

    Johnson, Roger L.

    1991-01-01

    Advanced laptop and small personal computer technology is presented in the form of the viewgraphs. The following areas of hand carried computers and mobile workstation technology are covered: background, applications, high end products, technology trends, requirements for the Control Center application, and recommendations for the future.

  8. Opportunities in computational mechanics: Advances in parallel computing

    SciTech Connect

    Lesar, R.A.

    1999-02-01

    In this paper, the authors will discuss recent advances in computing power and the prospects for using these new capabilities for studying plasticity and failure. They will first review the new capabilities made available with parallel computing. They will discuss how these machines perform and how well their architecture might work on materials issues. Finally, they will give some estimates on the size of problems possible using these computers.

  9. Advances and challenges in computational plasma science

    NASA Astrophysics Data System (ADS)

    Tang, W. M.

    2005-02-01

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behaviour. Recent advances in simulations of magnetically confined plasmas are reviewed in this paper, with illustrative examples, chosen from associated research areas such as microturbulence, magnetohydrodynamics and other topics. Progress has been stimulated, in particular, by the exponential growth of computer speed along with significant improvements in computer technology. The advances in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics have produced increasingly good agreement between experimental observations and computational modelling. This was enabled by two key factors: (a) innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales and (b) access to powerful new computational resources. Excellent progress has been made in developing codes for which computer run-time and problem-size scale well with the number of processors on massively parallel processors (MPPs). Examples include the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPPs to produce three-dimensional, general geometry, nonlinear particle simulations that have accelerated advances in understanding the nature of turbulence self-regulation by zonal flows. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In looking towards the future, the current results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. 

  10. Role of HPC in Advancing Computational Aeroelasticity

    NASA Technical Reports Server (NTRS)

    Guruswamy, Guru P.

    2004-01-01

On behalf of the High Performance Computing and Modernization Program (HPCMP) and the NASA Advanced Supercomputing Division (NAS), a study is conducted to assess the role of supercomputers in computational aeroelasticity of aerospace vehicles. The study is mostly based on responses to a web-based questionnaire that was designed to capture the nuances of high performance computational aeroelasticity, particularly on parallel computers. A procedure is presented to assign a fidelity-complexity index to each application. Case studies based on major applications using HPCMP resources are presented.

  11. Advances and trends in computational structural mechanics

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Atluri, Satya N.

    1987-01-01

    The development status and applicational range of techniques in computational structural mechanics (CSM) are evaluated with a view to advances in computational models for material behavior, discrete-element technology, quality assessment, the control of numerical simulations of structural response, hybrid analysis techniques, techniques for large-scale optimization, and the impact of new computing systems on CSM. Primary pacers of CSM development encompass prediction and analysis of novel materials for structural components, computational strategies for large-scale structural calculations, and the assessment of response prediction reliability together with its adaptive improvement.

  12. OPENING REMARKS: Scientific Discovery through Advanced Computing

    NASA Astrophysics Data System (ADS)

    Strayer, Michael

    2006-01-01

Good morning. Welcome to SciDAC 2006 and Denver. I share greetings from the new Undersecretary for Energy, Ray Orbach. Five years ago SciDAC was launched as an experiment in computational science. The goal was to form partnerships among science applications, computer scientists, and applied mathematicians to take advantage of the potential of emerging terascale computers. This experiment has been a resounding success. SciDAC has emerged as a powerful concept for addressing some of the biggest challenges facing our world. As significant as these successes were, I believe there is also significance in the teams that achieved them. In addition to their scientific aims these teams have advanced the overall field of computational science and set the stage for even larger accomplishments as we look ahead to SciDAC-2. I am sure that many of you are expecting to hear about the results of our current solicitation for SciDAC-2. I'm afraid we are not quite ready to make that announcement. Decisions are still being made and we will announce the results later this summer. Nearly 250 unique proposals were received and evaluated, involving literally thousands of researchers, postdocs, and students. These collectively requested more than five times our expected budget. This response is a testament to the success of SciDAC in the community. In SciDAC-2 our budget has been increased to about $70 million for FY 2007 and our partnerships have expanded to include the Environment and National Security missions of the Department. The National Science Foundation has also joined as a partner. These new partnerships are expected to expand the application space of SciDAC, and broaden the impact and visibility of the program. We have, with our recent solicitation, expanded to turbulence, computational biology, and groundwater reactive modeling and simulation. We are currently talking with the Department's applied energy programs about risk assessment and optimization of complex systems.

  13. Advances and Challenges in Computational Plasma Science

    SciTech Connect

    W.M. Tang; V.S. Chan

    2005-01-03

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behavior. Recent advances in simulations of magnetically-confined plasmas are reviewed in this paper with illustrative examples chosen from associated research areas such as microturbulence, magnetohydrodynamics, and other topics. Progress has been stimulated in particular by the exponential growth of computer speed along with significant improvements in computer technology.

  14. Advanced computational tools for 3-D seismic analysis

    SciTech Connect

    Barhen, J.; Glover, C.W.; Protopopescu, V.A.

    1996-06-01

The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis, and test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY 93-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the innovative computational techniques that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  15. Advanced Computing Architectures for Cognitive Processing

    DTIC Science & Technology

    2009-07-01

Approved for publication in accordance with assigned distribution statement. For the Director: Lok Yan; Edward J. Jones, Deputy Chief, Work Unit Manager, Advanced Computing Division. Author: Gregory D. Peterson. Program element number 62702F; project number 459T; task number AC; work unit number CP.

  16. Predictive Dynamic Security Assessment through Advanced Computing

    SciTech Connect

    Huang, Zhenyu; Diao, Ruisheng; Jin, Shuangshuang; Chen, Yousu

    2014-11-30

Traditional dynamic security assessment is limited by several factors and thus falls short in providing real-time information to be predictive for power system operation. These factors include the steady-state assumption of current operating points, static transfer limits, and low computational speed. This paper addresses these factors and frames predictive dynamic security assessment. The primary objective of predictive dynamic security assessment is to enhance the functionality and computational process of dynamic security assessment through the use of high-speed phasor measurements and the application of advanced computing technologies for faster-than-real-time simulation. This paper presents algorithms, computing platforms, and simulation frameworks that constitute the predictive dynamic security assessment capability. Examples of phasor application and fast computation for dynamic security assessment are included to demonstrate the feasibility and speed enhancement for real-time applications.

  17. 10 CFR 719.36 - Who at the Department must give advance approval?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 4 2011-01-01 2011-01-01 false Who at the Department must give advance approval? 719.36 Section 719.36 Energy DEPARTMENT OF ENERGY CONTRACTOR LEGAL MANAGEMENT REQUIREMENTS Reimbursement of Costs Subject to This Part § 719.36 Who at the Department must give advance approval? If advance approval...

  18. 10 CFR 719.36 - Who at the Department must give advance approval?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 4 2012-01-01 2012-01-01 false Who at the Department must give advance approval? 719.36 Section 719.36 Energy DEPARTMENT OF ENERGY CONTRACTOR LEGAL MANAGEMENT REQUIREMENTS Reimbursement of Costs Subject to This Part § 719.36 Who at the Department must give advance approval? If advance approval...

  19. 10 CFR 719.36 - Who at the Department must give advance approval?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Who at the Department must give advance approval? 719.36 Section 719.36 Energy DEPARTMENT OF ENERGY CONTRACTOR LEGAL MANAGEMENT REQUIREMENTS Reimbursement of Costs Subject to This Part § 719.36 Who at the Department must give advance approval? If advance approval...

  20. Simulation methods for advanced scientific computing

    SciTech Connect

    Booth, T.E.; Carlson, J.A.; Forster, R.A.

    1998-11-01

    This is the final report of a three-year, Laboratory Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). The objective of the project was to create effective new algorithms for solving N-body problems by computer simulation. The authors concentrated on developing advanced classical and quantum Monte Carlo techniques. For simulations of phase transitions in classical systems, they produced a framework generalizing the famous Swendsen-Wang cluster algorithms for Ising and Potts models. For spin-glass-like problems, they demonstrated the effectiveness of an extension of the multicanonical method for the two-dimensional, random bond Ising model. For quantum mechanical systems, they generated a new method to compute the ground-state energy of systems of interacting electrons. They also improved methods to compute excited states when the diffusion quantum Monte Carlo method is used and to compute longer time dynamics when the stationary phase quantum Monte Carlo method is used.
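The Swendsen-Wang cluster algorithm generalized in the report above can be illustrated for the simplest case it covers, the 2-D ferromagnetic Ising model. The sketch below is not from the report; the function name, lattice size, and the choices J = 1 and periodic boundaries are illustrative assumptions. Aligned neighbouring spins are frozen into the same cluster with probability 1 - exp(-2*beta*J), and each cluster is then flipped independently with probability 1/2.

```python
import numpy as np

def swendsen_wang_step(spins, beta, J=1.0, rng=None):
    """One Swendsen-Wang cluster update for a 2-D Ising model (+1/-1 spins)
    on an L x L lattice with periodic boundaries. Modifies spins in place."""
    rng = rng or np.random.default_rng()
    L = spins.shape[0]
    p_bond = 1.0 - np.exp(-2.0 * beta * J)  # freezing prob. for aligned pairs

    # Union-find over lattice sites, with path halving.
    parent = np.arange(L * L)
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    def union(i, j):
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj

    # Activate bonds between aligned nearest neighbours with prob p_bond.
    for x in range(L):
        for y in range(L):
            i = x * L + y
            for dx, dy in ((1, 0), (0, 1)):  # right and down neighbours
                xn, yn = (x + dx) % L, (y + dy) % L
                if spins[x, y] == spins[xn, yn] and rng.random() < p_bond:
                    union(i, xn * L + yn)

    # Flip each cluster independently with probability 1/2.
    roots = np.array([find(i) for i in range(L * L)])
    flip = {r: rng.random() < 0.5 for r in np.unique(roots)}
    for x in range(L):
        for y in range(L):
            if flip[roots[x * L + y]]:
                spins[x, y] *= -1
    return spins
```

Because entire correlated clusters flip at once, updates of this kind largely avoid the critical slowing down that afflicts single-spin Metropolis sweeps near the phase transition, which is what makes the cluster framework worth generalizing.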

  1. Advances in Computational Capabilities for Hypersonic Flows

    NASA Technical Reports Server (NTRS)

    Kumar, Ajay; Gnoffo, Peter A.; Moss, James N.; Drummond, J. Philip

    1997-01-01

    The paper reviews the growth and advances in computational capabilities for hypersonic applications over the period from the mid-1980's to the present day. The current status of the code development issues such as surface and field grid generation, algorithms, physical and chemical modeling, and validation is provided. A brief description of some of the major codes being used at NASA Langley Research Center for hypersonic continuum and rarefied flows is provided, along with their capabilities and deficiencies. A number of application examples are presented, and future areas of research to enhance accuracy, reliability, efficiency, and robustness of computational codes are discussed.

  2. Computer Support to Navy Public Works Departments for Their Utilities Function.

    DTIC Science & Technology

    1980-12-01

This thesis examines computer support to Navy Public Works Departments for their utilities function. The proposed system can use menus and be self-instructing, in the English language and without abbreviations; even some of the programming can now be done by users. The thesis observes that computer technology has advanced tremendously and has become available at dramatically reduced costs.

  3. Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2000-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center. It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. Ames has been designated NASA's Center of Excellence in Information Technology. In this capacity, Ames is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA Ames and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth; (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities; (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. 
In addition, RIACS collaborates with NASA scientists to apply information technology research to a

  4. Airborne Advanced Reconfigurable Computer System (ARCS)

    NASA Technical Reports Server (NTRS)

    Bjurman, B. E.; Jenkins, G. M.; Masreliez, C. J.; Mcclellan, K. L.; Templeman, J. E.

    1976-01-01

    A digital computer subsystem fault-tolerant concept was defined, and the potential benefits and costs of such a subsystem were assessed when used as the central element of a new transport's flight control system. The derived advanced reconfigurable computer system (ARCS) is a triple-redundant computer subsystem that automatically reconfigures, under multiple fault conditions, from triplex to duplex to simplex operation, with redundancy recovery if the fault condition is transient. The study included criteria development covering factors at the aircraft's operation level that would influence the design of a fault-tolerant system for commercial airline use. A new reliability analysis tool was developed for evaluating redundant, fault-tolerant system availability and survivability; and a stringent digital system software design methodology was used to achieve design/implementation visibility.
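
    The triplex-to-duplex-to-simplex degradation and transient-fault recovery described above can be sketched as a toy majority-voting model (the channel names and interfaces below are hypothetical, not taken from the ARCS design):

    ```python
    class ReconfigurableComputer:
        """Toy model of an ARCS-style degradation chain: triplex -> duplex ->
        simplex, with re-admission of a channel whose fault proves transient.
        Purely illustrative; a real system resolves duplex disagreement with
        built-in self-test, which is omitted here."""

        def __init__(self):
            self.active = {"A", "B", "C"}   # hypothetical channel names
            self.suspect = set()

        def step(self, outputs):
            """outputs: dict mapping each active channel to its computed value.
            Returns the voted value and drops any disagreeing channel(s)."""
            votes = {}
            for ch in self.active:
                votes.setdefault(outputs[ch], set()).add(ch)
            majority_value, majority_set = max(votes.items(), key=lambda kv: len(kv[1]))
            faulty = self.active - majority_set
            if faulty and len(self.active) >= 2:
                self.active -= faulty       # reconfigure: triplex -> duplex -> simplex
                self.suspect |= faulty
            return majority_value

        def recover(self, channel):
            """Redundancy recovery: re-admit a channel whose fault was transient."""
            if channel in self.suspect:
                self.suspect.discard(channel)
                self.active.add(channel)
    ```

    A single disagreement demotes the disagreeing channel while the voted output stays correct, which is the property that lets such a subsystem ride through multiple sequential faults.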

  5. First Year Preservice Teachers' Attitudes toward Computers from Computer Education and Instructional Technology Department

    ERIC Educational Resources Information Center

    Yakin, Ilker; Sumuer, Evren

    2007-01-01

    The purpose of the study is to explore the attitudes of first-year university students towards computers. The study focuses on preservice teachers (N=46), including 33 males and 12 females, from the Computer Education and Instructional Technology (CEIT) department of Middle East Technical University. The study is delimited to first-year preservice teachers…

  6. Computational Design of Advanced Nuclear Fuels

    SciTech Connect

    Savrasov, Sergey; Kotliar, Gabriel; Haule, Kristjan

    2014-06-03

    The objective of the project was to develop a method for the theoretical understanding of nuclear fuel materials whose physical and thermophysical properties can be predicted from first principles using a novel dynamical mean field method for electronic structure calculations. We concentrated our study on uranium, plutonium, their oxides, nitrides, and carbides, as well as some rare-earth materials whose 4f electrons provide a simplified framework for understanding the complex behavior of the f electrons. We addressed issues connected to the electronic structure, lattice instabilities, phonon and magnon dynamics, as well as thermal conductivity. This allowed us to evaluate characteristics of advanced nuclear fuel systems using computer-based simulations and avoid costly experiments.

  7. ATCA for Machines-- Advanced Telecommunications Computing Architecture

    SciTech Connect

    Larsen, R.S.; /SLAC

    2008-04-22

    The Advanced Telecommunications Computing Architecture is a new industry open standard for electronics instrument modules and shelves being evaluated for the International Linear Collider (ILC). It is the first industrial standard designed for High Availability (HA). ILC availability simulations have shown clearly that the capabilities of ATCA are needed in order to achieve acceptable integrated luminosity. The ATCA architecture looks attractive for beam instruments and detector applications as well. This paper provides an overview of ongoing R&D including application of HA principles to power electronics systems.

  8. DOE Advanced Scientific Computing Advisory Committee (ASCAC) Report: Exascale Computing Initiative Review

    SciTech Connect

    Reed, Daniel; Berzins, Martin; Pennington, Robert; Sarkar, Vivek; Taylor, Valerie

    2015-08-01

    On November 19, 2014, the Advanced Scientific Computing Advisory Committee (ASCAC) was charged with reviewing the Department of Energy’s conceptual design for the Exascale Computing Initiative (ECI). In particular, this included assessing whether there are significant gaps in the ECI plan or areas that need to be given priority or extra management attention. Given the breadth and depth of previous reviews of the technical challenges inherent in exascale system design and deployment, the subcommittee focused its assessment on organizational and management issues, considering technical issues only as they informed organizational or management priorities and structures. This report presents the observations and recommendations of the subcommittee.

  9. Response to House Joint Resolution No. 118 [To Advance Computer-Assisted Instruction].

    ERIC Educational Resources Information Center

    Virginia State General Assembly, Richmond.

    This response by the Virginia Department of Education to House Joint Resolution No. 118 of the General Assembly of Virginia, which requested the Department of Education to study initiatives to advance computer-assisted instruction, is based on input from state and national task forces and on a 1986 survey of 80 Virginia school divisions. The…

  10. Defense Science Board Report on Advanced Computing

    DTIC Science & Technology

    2009-03-01

    complex computational issues are pursued, and that several vendors remain at the leading edge of supercomputing capability in the U.S. In... pursuing the ASC program to help assure that HPC advances are available to the broad national security community. As in the past, many... apply HPC to technical problems related to weapons physics, but that are entirely unclassified. Examples include explosive astrophysical

  11. Advanced high-performance computer system architectures

    NASA Astrophysics Data System (ADS)

    Vinogradov, V. I.

    2007-02-01

    The convergence of computer systems and communication technologies is moving toward switched, high-performance, modular system architectures based on high-speed switched interconnections. Multi-core processors are becoming a more promising route to high-performance systems, and traditional parallel-bus system architectures (VME/VXI, cPCI/PXI) are giving way to newer, higher-speed serial switched interconnections. The fundamentals of system-architecture development are a compact modular component strategy, low-power processors, new serial high-speed interface chips on the board, and a high-speed switched fabric for SAN architectures. An overview of advanced modular concepts and new international standards for developing high-performance embedded and compact modular systems for real-time applications is given.

  12. The advanced computational testing and simulation toolkit (ACTS)

    SciTech Connect

    Drummond, L.A.; Marques, O.

    2002-05-21

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. 
The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

  13. A Suggested Syllabus for Advanced Writing Skills at English Language Teaching Departments

    ERIC Educational Resources Information Center

    Altay, Ismail Firat

    2010-01-01

    As is known, writing is an indispensable part of language education. As far as English Language Teaching Departments are concerned, writing courses, especially Advanced Writing Skills, are taken as a course of higher importance. However, forming a syllabus for Advanced Writing Course for English Language Teaching Departments is not an easy matter.…

  14. Making Advanced Computer Science Topics More Accessible through Interactive Technologies

    ERIC Educational Resources Information Center

    Shao, Kun; Maher, Peter

    2012-01-01

    Purpose: Teaching advanced technical concepts in a computer science program to students of different technical backgrounds presents many challenges. The purpose of this paper is to present a detailed experimental pedagogy in teaching advanced computer science topics, such as computer networking, telecommunications and data structures using…

  15. Intelligent Software Tools for Advanced Computing

    SciTech Connect

    Baumgart, C.W.

    2001-04-03

    Feature extraction and evaluation are two procedures common to the development of any pattern recognition application. These features are the primary pieces of information which are used to train the pattern recognition tool, whether that tool is a neural network, a fuzzy logic rulebase, or a genetic algorithm. Careful selection of the features to be used by the pattern recognition tool can significantly streamline the overall development and training of the solution for the pattern recognition application. This report summarizes the development of an integrated, computer-based software package called the Feature Extraction Toolbox (FET), which can be used for the development and deployment of solutions to generic pattern recognition problems. This toolbox integrates a number of software techniques for signal processing, feature extraction and evaluation, and pattern recognition, all under a single, user-friendly development environment. The toolbox has been developed to run on a laptop computer, so that it may be taken to a site and used to develop pattern recognition applications in the field. A prototype version of this toolbox has been completed and is currently being used for applications development on several projects in support of the Department of Energy.
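
    The feature-evaluation step described above can be illustrated by ranking a candidate feature with a simple class-separability measure (a Fisher-criterion sketch; the FET's actual metrics are not documented in this summary):

    ```python
    import statistics

    def fisher_score(feature_a, feature_b):
        """Score a candidate feature by class separability using the Fisher
        criterion: (difference of class means)^2 / (sum of class variances).
        Higher scores indicate features that better discriminate the two
        classes and are therefore better inputs to a pattern recognizer."""
        mean_a, mean_b = statistics.mean(feature_a), statistics.mean(feature_b)
        var_a, var_b = statistics.variance(feature_a), statistics.variance(feature_b)
        return (mean_a - mean_b) ** 2 / (var_a + var_b)
    ```

    Ranking features this way before training is one concrete form of the "careful selection" the report credits with streamlining development, since weak features can be discarded before they ever reach the neural network, rulebase, or genetic algorithm.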

  16. Application of advanced electronics to a future spacecraft computer design

    NASA Technical Reports Server (NTRS)

    Carney, P. C.

    1980-01-01

    Advancements in hardware and software technology are summarized with specific emphasis on spacecraft computer capabilities. Available state of the art technology is reviewed and candidate architectures are defined.

  17. 77 FR 12823 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-02

    ... final report, Advanced Networking update Status from Computer Science COV Early Career technical talks Summary of Applied Math and Computer Science Workshops ASCR's new SBIR awards Data-intensive...

  18. Energy Department Helps Advance Island Clean Energy Goals (Fact Sheet)

    SciTech Connect

    Not Available

    2012-10-01

    This U.S. Department of Energy (DOE) Office of Energy Efficiency and Renewable Energy (EERE) fact sheet highlights a June 2012 solar power purchase agreement between the Virgin Islands Water and Power Authority and three corporations. The fact sheet describes how financial support from DOE and technical assistance from DOE's National Renewable Energy Laboratory enabled the U.S. Virgin Islands to realistically assess its clean energy resources and identify the most viable and cost-effective solutions to its energy challenges--resulting in a $65 million investment in solar energy in the territory.

  19. Developing an Advanced Environment for Collaborative Computing

    NASA Technical Reports Server (NTRS)

    Becerra-Fernandez, Irma; Stewart, Helen; DelAlto, Martha; DelAlto, Martha; Knight, Chris

    1999-01-01

    Knowledge management, in general, tries to organize and make available important know-how whenever and wherever it is needed. Today, organizations rely on decision-makers to produce "mission critical" decisions that are based on inputs from multiple domains. The ideal decision-maker has a profound understanding of specific domains that influence the decision-making process, coupled with the experience that allows them to act quickly and decisively on the information. In addition, learning companies benefit by not repeating costly mistakes and by reducing time-to-market in research and development projects. Group decision-making tools can help companies make better decisions by capturing the knowledge from groups of experts. Furthermore, companies that capture their customers' preferences can improve their customer service, which translates to larger profits. Collaborative computing therefore provides a common communication space, improves the sharing of knowledge, provides a mechanism for real-time feedback on the tasks being performed, helps to optimize processes, and results in a centralized knowledge warehouse. This paper presents the research directions of a project which seeks to augment an advanced collaborative web-based environment called Postdoc with workflow capabilities. Postdoc is a "government-off-the-shelf" document management software package developed at NASA Ames Research Center (ARC).

  20. Computational and design methods for advanced imaging

    NASA Astrophysics Data System (ADS)

    Birch, Gabriel C.

    This dissertation merges the optical-design and computational aspects of imaging systems to create novel devices that solve engineering problems in optical science, and it attempts to expand the solution space available to the optical designer. The dissertation is divided into two parts: the first discusses a new active-illumination depth-sensing modality, while the second discusses a passive-illumination system called plenoptic, or lightfield, imaging. The new depth-sensing modality introduced in part one is called depth through controlled aberration. This technique illuminates a target with a known, aberrated projected pattern and takes an image using a traditional, unmodified imaging system. Knowing how the added aberration in the projected pattern changes as a function of depth, we are able to quantitatively determine the depth of a series of points from the camera. A major advantage of this method is that it permits the illumination and imaging axes to be coincident. Plenoptic cameras capture both spatial and angular data simultaneously. This dissertation presents a new set of parameters that permit the design and comparison of plenoptic devices outside the traditionally published plenoptic 1.0 and plenoptic 2.0 configurations. Additionally, a series of engineering advancements are presented, including full system raytraces of raw plenoptic images, Zernike compression techniques for raw image files, and non-uniform lenslet arrays to compensate for plenoptic system aberrations. Finally, a new snapshot imaging spectrometer is proposed based on the plenoptic configuration.

  1. Advanced flight computers for planetary exploration

    NASA Technical Reports Server (NTRS)

    Stephenson, R. Rhoads

    1988-01-01

    Research concerning flight computers for use on interplanetary probes is reviewed. The history of these computers from the Viking mission to the present is outlined. The differences between commercial ground computers and computers for planetary exploration are listed. The development of a computer for the Mariner Mark II comet rendezvous/asteroid flyby mission is described. Various aspects of recently developed computer systems are examined, including the Max real-time embedded computer, a hypercube distributed supercomputer, a SAR data processor, a processor for the High Resolution IR Imaging Spectrometer, and a robotic-vision multiresolution pyramid machine for processing images obtained by a Mars rover.

  2. Some Recent Advances in Computer Graphics.

    ERIC Educational Resources Information Center

    Whitted, Turner

    1982-01-01

    General principles of computer graphics are reviewed, including discussions of display hardware, geometric modeling, algorithms, and applications in science, computer-aided design, flight training, communications, business, art, and entertainment. (JN)

  3. Computing Advances in the Teaching of Chemistry.

    ERIC Educational Resources Information Center

    Baskett, W. P.; Matthews, G. P.

    1984-01-01

    Discusses three trends in computer-oriented chemistry instruction: (1) availability of interfaces to integrate computers with experiments; (2) impact of the development of higher resolution graphics and greater memory capacity; and (3) role of videodisc technology on computer assisted instruction. Includes program listings for auto-titration and…

  4. ADVANCED COMPUTATIONAL METHODS IN DOSE MODELING: APPLICATION OF COMPUTATIONAL BIOPHYSICAL TRANSPORT, COMPUTATIONAL CHEMISTRY, AND COMPUTATIONAL BIOLOGY

    EPA Science Inventory

    Computational toxicology (CompTox) leverages the significant gains in computing power and computational techniques (e.g., numerical approaches, structure-activity relationships, bioinformatics) realized over the last few years, thereby reducing costs and increasing efficiency i...

  5. Department of Energy Mathematical, Information, and Computational Sciences Division: High Performance Computing and Communications Program

    SciTech Connect

    1996-11-01

    This document is intended to serve two purposes. Its first purpose is that of a program status report of the considerable progress that the Department of Energy (DOE) has made since 1993, the time of the last such report (DOE/ER-0536, The DOE Program in HPCC), toward achieving the goals of the High Performance Computing and Communications (HPCC) Program. The second purpose is that of a summary report of the many research programs administered by the Mathematical, Information, and Computational Sciences (MICS) Division of the Office of Energy Research under the auspices of the HPCC Program and to provide, wherever relevant, easy access to pertinent information about MICS-Division activities via universal resource locators (URLs) on the World Wide Web (WWW).

  6. Department of Energy: MICS (Mathematical Information, and Computational Sciences Division). High performance computing and communications program

    SciTech Connect

    1996-06-01

    This document is intended to serve two purposes. Its first purpose is that of a program status report of the considerable progress that the Department of Energy (DOE) has made since 1993, the time of the last such report (DOE/ER-0536, "The DOE Program in HPCC"), toward achieving the goals of the High Performance Computing and Communications (HPCC) Program. The second purpose is that of a summary report of the many research programs administered by the Mathematical, Information, and Computational Sciences (MICS) Division of the Office of Energy Research under the auspices of the HPCC Program and to provide, wherever relevant, easy access to pertinent information about MICS-Division activities via universal resource locators (URLs) on the World Wide Web (WWW). The information pointed to by the URLs is updated frequently, and the interested reader is urged to access the WWW for the latest information.

  7. Advancing crime scene computer forensics techniques

    NASA Astrophysics Data System (ADS)

    Hosmer, Chet; Feldman, John; Giordano, Joe

    1999-02-01

    Computers and network technology have become inexpensive and powerful tools that can be applied to a wide range of criminal activity. Computers have changed the world's view of evidence because computers are used more and more as tools in committing 'traditional crimes' such as embezzlement, theft, extortion, and murder. This paper focuses on reviewing the current state of the art of the data-recovery and evidence-construction tools used in both the field and the laboratory for prosecution purposes.

  8. PARTNERING WITH DOE TO APPLY ADVANCED BIOLOGICAL, ENVIRONMENTAL, AND COMPUTATIONAL SCIENCE TO ENVIRONMENTAL ISSUES

    EPA Science Inventory

    On February 18, 2004, the U.S. Environmental Protection Agency and Department of Energy signed a Memorandum of Understanding to expand the research collaboration of both agencies to advance biological, environmental, and computational sciences for protecting human health and the ...

  9. Application of advanced computational technology to propulsion CFD

    NASA Astrophysics Data System (ADS)

    Szuch, John R.

    The Internal Fluid Mechanics Division of the NASA Lewis Research Center is combining the key elements of computational fluid dynamics, aerothermodynamic experiments, and advanced computational technology to bring internal computational fluid dynamics (ICFM) to a state of practical application for aerospace propulsion system design. This paper presents an overview of efforts underway at NASA Lewis to advance and apply computational technology to ICFM. These efforts include the use of modern, software engineering principles for code development, the development of an AI-based user-interface for large codes, the establishment of a high-performance, data communications network to link ICFM researchers and facilities, and the application of parallel processing to speed up computationally intensive and/or time-critical ICFM problems. A multistage compressor flow physics program is cited as an example of efforts to use advanced computational technology to enhance a current NASA Lewis ICFM research program.

  10. Computation of Viscous Flow about Advanced Projectiles.

    DTIC Science & Technology

    1983-09-09

    Domain". Journal of Comp. Physics, Vol. 8, 1971, pp. 392-408. 10. Thompson, J. F., Thames, F. C., and Mastin, C. M., "Automatic Numerical Generation of... computations, USSR Comput. Math. Math. Phys., 12, 2 (1972), 182-195. ... 18. Thompson, J. F., F. C. Thames, and C. M. Mastin, Automatic

  11. Using Grid Computing within the Department of Defense

    DTIC Science & Technology

    2008-06-24

    for Extraterrestrial Intelligence Examples in Research (SETI@HOME) - probably one of the most well known - uses Berkeley Open Infrastructure... instructions, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing the collection of information. Send... Computing Within DoD: DoD already owns the largest supercomputer in the world, 75-fold... Premise: Grid Computing Within DoD... Search

  12. Computation, Mathematics and Logistics Department Report for Fiscal Year 1978.

    DTIC Science & Technology

    1980-03-01

    ... NTIPS automation is considered feasible. Prototypes for all the functions can be built. The big payoff areas are in improved TI update capability and

  13. Computing Algorithms for Nuffield Advanced Physics.

    ERIC Educational Resources Information Center

    Summers, M. K.

    1978-01-01

    Defines all recurrence relations used in the Nuffield course, to solve first- and second-order differential equations, and describes a typical algorithm for computer generation of solutions. (Author/GA)
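
    A recurrence of the kind referred to, applied to a second-order equation such as simple harmonic motion x'' = -ω²x, might be sketched as follows (a symplectic-Euler variant; the exact relations defined in the Nuffield course may differ):

    ```python
    def shm_recurrence(x0, v0, omega2, dt, steps):
        """Generate a numerical solution of x'' = -omega2 * x by iterating
        the pair of recurrence relations
            v_{n+1} = v_n - omega2 * x_n * dt
            x_{n+1} = x_n + v_{n+1} * dt
        starting from displacement x0 and velocity v0. Returns the list of
        successive displacements [x_0, x_1, ..., x_steps]."""
        xs = [x0]
        x, v = x0, v0
        for _ in range(steps):
            v -= omega2 * x * dt   # update velocity from the current displacement
            x += v * dt            # update displacement from the new velocity
            xs.append(x)
        return xs
    ```

    Updating the displacement with the already-updated velocity (rather than the old one) keeps the generated oscillation from growing in amplitude, which is why recurrences of this form are preferred for classroom simulation over naive Euler stepping.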

  14. Aerodynamic optimization studies on advanced architecture computers

    NASA Technical Reports Server (NTRS)

    Chawla, Kalpana

    1995-01-01

    The approach to carrying out multi-discipline aerospace design studies in the future, especially in massively parallel computing environments, comprises choosing (1) suitable solvers to compute solutions to the equations characterizing a discipline, and (2) efficient optimization methods. In addition, for aerodynamic optimization problems, (3) smart methodologies must be selected to modify the surface shape. In this research effort, a 'direct' optimization method is implemented on the Cray C-90 to improve aerodynamic design. It is coupled with an existing implicit Navier-Stokes solver, OVERFLOW, to compute flow solutions. The optimization method is chosen such that it can accommodate multi-discipline optimization in future computations. In this work, however, only single-discipline aerodynamic optimization is included.
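
    A 'direct' optimization of this kind, in which each gradient component costs one extra solver evaluation, can be sketched with a stand-in objective (the real coupling to a Navier-Stokes solver such as OVERFLOW is far more involved; the function names and parameters here are illustrative only):

    ```python
    def direct_optimize(objective, x0, h=1e-6, lr=0.1, iters=50):
        """Minimal finite-difference gradient-descent sketch of a 'direct'
        optimizer. objective(x) stands in for a full flow-solver evaluation,
        so each gradient component below corresponds to one extra solver run."""
        x = list(x0)
        for _ in range(iters):
            f0 = objective(x)
            grad = []
            for i in range(len(x)):
                perturbed = list(x)
                perturbed[i] += h                      # perturb one design variable
                grad.append((objective(perturbed) - f0) / h)
            x = [xi - lr * gi for xi, gi in zip(x, grad)]  # descend along the gradient
        return x
    ```

    The per-variable cost of the finite-difference gradient is exactly why such studies care about efficient solvers and parallel environments: with hundreds of shape variables, every optimization step multiplies the flow-solution cost accordingly.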

  15. Advanced Computational Techniques for Power Tube Design.

    DTIC Science & Technology

    1986-07-01

    fixturing applications, in addition to the existing computer-aided engineering capabilities. Helix TWT Manufacturing has implemented a tooling and fixturing... illustrates the major features of this computer network. The backbone of our system is a Sytek Broadband Network (LAN) which interconnects terminals and... automatic network analyzer (FANA) which electrically characterizes the slow-wave helices of traveling-wave tubes (TWTs), both for engineering design

  16. Advanced Crew Personal Support Computer (CPSC) task

    NASA Technical Reports Server (NTRS)

    Muratore, Debra

    1991-01-01

    The topics are presented in view graph form and include: background; objectives of task; benefits to the Space Station Freedom (SSF) Program; technical approach; baseline integration; and growth and evolution options. The objective is to: (1) introduce new computer technology into the SSF Program; (2) augment core computer capabilities to meet additional mission requirements; (3) minimize risk in upgrading technology; and (4) provide a low cost way to enhance crew and ground operations support.

  17. Frontiers of research in advanced computations

    SciTech Connect

    1996-07-01

    The principal mission of the Institute for Scientific Computing Research is to foster interactions among LLNL researchers, universities, and industry on selected topics in scientific computing. In the area of computational physics, the Institute has developed a new algorithm, GaPH, to help scientists understand the chemistry of turbulent and driven plasmas or gases at far less cost than other methods. New low-frequency electromagnetic models better describe the plasma etching and deposition characteristics of a computer chip in the making. A new method for modeling realistic curved boundaries within an orthogonal mesh is resulting in a better understanding of the physics associated with such boundaries and much quicker solutions. All these capabilities are being developed for massively parallel implementation, which is an ongoing focus of Institute researchers. Other groups within the Institute are developing novel computational methods to address a range of other problems. Examples include feature detection and motion recognition by computer, improved monitoring of blood oxygen levels, and entirely new models of human joint mechanics and prosthetic devices.

  18. Summary of Research 2000: Department of Electrical and Computer Engineering

    DTIC Science & Technology

    2001-12-01

    Communications, Signal Processing, Computers, Network Security. NRL Support for NPS Student Thesis Research on VSAT Exploration, H. H. Loomis, Jr.... Thesis, Naval Postgraduate School, March 2000. McCabe, E.D. and Stone, C.D., "Development of the Beartrap Post Mission Processing System 2000 (S2K)"... manufacturer could then build radiation-tolerant commercial devices on these wafer substrates with little or no changes in the manufacturing process. This thesis

  19. Advanced Nursing Directives: Integrating Validated Clinical Scoring Systems into Nursing Care in the Pediatric Emergency Department

    PubMed Central

    deForest, Erin Kate; Thompson, Graham Cameron

    2012-01-01

    In an effort to improve the quality and flow of care provided to children presenting to the emergency department, the implementation of nurse-initiated protocols is on the rise. We review the current literature on nurse-initiated protocols, validated emergency department clinical scoring systems, and the merging of the two to create Advanced Nursing Directives (ANDs). The process of developing a clinical pathway for children presenting to our pediatric emergency department (PED) with suspected appendicitis will be used to demonstrate the successful integration of validated clinical scoring systems into practice through the use of Advanced Nursing Directives. Finally, examples of two other Advanced Nursing Directives for common clinical PED presentations will be provided. PMID:22778944

  20. Advanced Computational Techniques in Regional Wave Studies

    DTIC Science & Technology

    1990-01-03

    the new GERESS data. The dissertation work emphasized the development and use of advanced computational techniques for studying regional seismic... hand, the possibility of new data sources at regional distances permits using previously ignored signals. Unfortunately, these regional signals will... the Green's function around this new reference point, G_nk(x,t;r,t) (2), contains the propagation effects, and V is the source volume where f_jk

  1. Advanced earthquake monitoring system for U.S. Department of Veterans Affairs medical buildings--instrumentation

    USGS Publications Warehouse

    Kalkan, Erol; Banga, Krishna; Ulusoy, Hasan S.; Fletcher, Jon Peter B.; Leith, William S.; Reza, Shahneam; Cheng, Timothy

    2012-01-01

    In collaboration with the U.S. Department of Veterans Affairs (VA), the National Strong Motion Project (NSMP; http://nsmp.wr.usgs.gov/) of the U.S. Geological Survey has been installing sophisticated seismic systems that will monitor the structural integrity of 28 VA hospital buildings located in seismically active regions of the conterminous United States, Alaska, and Puerto Rico during earthquake shaking. These advanced monitoring systems, which combine the use of sensitive accelerometers and real-time computer calculations, are designed to determine the structural health of each hospital building rapidly after an event, helping the VA to ensure the safety of patients and staff. This report presents the instrumentation component of this project by providing details of each hospital building, including a summary of its structural, geotechnical, and seismic hazard information, as well as instrumentation objectives and design. The structural-health monitoring component of the project, including data retrieval and processing, damage detection and localization, automated alerting system, and finally data dissemination, will be presented in a separate report.

  2. Advances in Computer-Supported Learning

    ERIC Educational Resources Information Center

    Neto, Francisco; Brasileiro, Francisco

    2007-01-01

    The Internet and growth of computer networks have eliminated geographic barriers, creating an environment where education can be brought to a student no matter where that student may be. The success of distance learning programs and the availability of many Web-supported applications and multimedia resources have increased the effectiveness of…

  3. Advanced Computing Tools and Models for Accelerator Physics

    SciTech Connect

    Ryne, Robert D.

    2008-06-11

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics.

  4. Evaluation of Advanced Computing Techniques and Technologies: Reconfigurable Computing

    NASA Technical Reports Server (NTRS)

    Wells, B. Earl

    2003-01-01

    The focus of this project was to survey the technology of reconfigurable computing and determine its level of maturity and suitability for NASA applications; to better understand and assess the effectiveness of the reconfigurable design paradigm utilized within the HAL-15 reconfigurable computer system, which was made available to NASA MSFC for this purpose by Star Bridge Systems, Inc.; and to implement at least one application that would benefit from the performance levels possible with reconfigurable hardware. It was originally proposed that experiments in fault tolerance and dynamic reconfigurability would be performed, but time constraints mandated that these be pursued as future research.

  5. ASDA - Advanced Suit Design Analyzer computer program

    NASA Technical Reports Server (NTRS)

    Bue, Grant C.; Conger, Bruce C.; Iovine, John V.; Chang, Chi-Min

    1992-01-01

    An ASDA model developed to evaluate the heat and mass transfer characteristics of advanced pressurized suit design concepts for low pressure or vacuum planetary applications is presented. The model is based on a generalized 3-layer suit that uses the Systems Integrated Numerical Differencing Analyzer '85 in conjunction with a 41-node FORTRAN routine. The latter simulates the transient heat transfer and respiratory processes of a human body in a suited environment. The user options for the suit encompass a liquid cooled garment, a removable jacket, a CO2/H2O permeable layer, and a phase change layer.

  6. ADVANCED METHODS FOR THE COMPUTATION OF PARTICLE BEAM TRANSPORT AND THE COMPUTATION OF ELECTROMAGNETIC FIELDS AND MULTIPARTICLE PHENOMENA

    SciTech Connect

    Alex J. Dragt

    2012-08-31

    Since 1980, under the grant DEFG02-96ER40949, the Department of Energy has supported the educational and research work of the University of Maryland Dynamical Systems and Accelerator Theory (DSAT) Group. The primary focus of this educational/research group has been on the computation and analysis of charged-particle beam transport using Lie algebraic methods, and on advanced methods for the computation of electromagnetic fields and multiparticle phenomena. This Final Report summarizes the accomplishments of the DSAT Group from its inception in 1980 through its end in 2011.

  7. Advances in computer imaging/applications in facial plastic surgery.

    PubMed

    Papel, I D; Jiannetto, D F

    1999-01-01

    Rapidly progressing computer technology, ever-increasing expectations of patients, and a confusing medicolegal environment require a clarification of the role of computer imaging/applications. Advances in computer technology and its applications are reviewed. A brief historical discussion is included for perspective. Improvements in both hardware and software with the advent of digital imaging have allowed great increases in speed and accuracy in patient imaging. This facilitates doctor-patient communication and possibly realistic patient expectations. Patients seeking cosmetic surgery now often expect preoperative imaging. Although society in general has become more litigious, a literature search up to 1998 reveals no lawsuits directly involving computer imaging. It appears that conservative utilization of computer imaging by the facial plastic surgeon may actually reduce liability and promote communication. Recent advances have significantly enhanced the value of computer imaging in the practice of facial plastic surgery. These technological advances in computer imaging appear to contribute a useful technique for the practice of facial plastic surgery. Inclusion of computer imaging should be given serious consideration as an adjunct to clinical practice.

  8. Essay: Robert H. Siemann As Leader of the Advanced Accelerator Research Department

    SciTech Connect

    Colby, Eric R.; Hogan, Mark J.; /SLAC

    2011-11-14

    Robert H. Siemann originally conceived of the Advanced Accelerator Research Department (AARD) as an academic, experimental group dedicated to probing the technical limitations of accelerators while providing excellent educational opportunities for young scientists. The early years of the Accelerator Research Department B, as it was then known, were dedicated to a wealth of mostly student-led experiments to examine the promise of advanced accelerator techniques. High-gradient techniques including millimeter-wave rf acceleration, beam-driven plasma acceleration, and direct laser acceleration were pursued, including tests of materials under rf pulsed heating and short-pulse laser radiation, to establish the ultimate limitations on gradient. As the department and program grew, so did the motivation to found an accelerator research center that brought experimentalists together in a test facility environment to conduct a broad range of experiments. The Final Focus Test Beam and later the Next Linear Collider Test Accelerator provided unique experimental facilities for AARD staff and collaborators to carry out advanced accelerator experiments. Throughout the evolution of this dynamic program, Bob maintained a department atmosphere and culture more reminiscent of a university research group than a national laboratory department. His exceptional ability to balance multiple roles as scientist, professor, and administrator enabled the creation and preservation of an environment that fostered technical innovation and scholarship.

  9. TOPICAL REVIEW: Advances and challenges in computational plasma science

    NASA Astrophysics Data System (ADS)

    Tang, W. M.; Chan, V. S.

    2005-02-01

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behaviour. Recent advances in simulations of magnetically confined plasmas are reviewed in this paper, with illustrative examples, chosen from associated research areas such as microturbulence, magnetohydrodynamics and other topics. Progress has been stimulated, in particular, by the exponential growth of computer speed along with significant improvements in computer technology. The advances in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics have produced increasingly good agreement between experimental observations and computational modelling. This was enabled by two key factors: (a) innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales and (b) access to powerful new computational resources. Excellent progress has been made in developing codes for which computer run-time and problem-size scale well with the number of processors on massively parallel processors (MPPs). Examples include the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPPs to produce three-dimensional, general geometry, nonlinear particle simulations that have accelerated advances in understanding the nature of turbulence self-regulation by zonal flows. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In looking towards the future, the current results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. 

  10. Activities and operations of the Advanced Computing Research Facility, July-October 1986

    SciTech Connect

    Pieper, G.W.

    1986-01-01

    Research activities and operations of the Advanced Computing Research Facility (ACRF) at Argonne National Laboratory are discussed for the period from July 1986 through October 1986. The facility is currently supported by the Department of Energy, and is operated by the Mathematics and Computer Science Division at Argonne. Over the past four-month period, a new commercial multiprocessor, the Intel iPSC-VX/d4 hypercube, was installed. In addition, four other commercial multiprocessors continue to be available for research - an Encore Multimax, a Sequent Balance 21000, an Alliant FX/8, and an Intel iPSC/d5 - as well as a locally designed multiprocessor, the Lemur. These machines are being actively used by scientists at Argonne and throughout the nation in a wide variety of projects concerning computer systems with parallel and vector architectures. A variety of classes, workshops, and seminars have been sponsored to train researchers on computing techniques for the advanced computer systems at the Advanced Computing Research Facility. For example, courses were offered on writing programs for parallel computer systems, and the facility hosted the first annual Alliant users group meeting. A Sequent users group meeting and a two-day workshop on performance evaluation of parallel computers and programs are being organized.

  11. 78 FR 29748 - Privacy Act of 1974; Computer Matching Program Between the Department of Education and the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-21

    ... Privacy Act of 1974; Computer Matching Program Between the Department of Education and the Department of... program. In accordance with the Privacy Act of 1974 (5 U.S.C. 552a), as amended by the Computer Matching... continuation of a computer matching program between the Department of Education and the Department of...

  12. A Review of Current Applications of Computer Systems in Campus Recreation Departments.

    ERIC Educational Resources Information Center

    Nesbitt, Gordon

    1986-01-01

    The use of computers in campus recreation is a trend that cannot be ignored. This paper reviews some uses of software for tabulating, controlling, and scheduling in college recreation departments. (MT)

  13. Advanced energy design and operation technologies research: Recommendations for a US Department of Energy multiyear program plan

    SciTech Connect

    Brambley, M.R.; Crawley, D.B.; Hostetler, D.D.; Stratton, R.C.; Addision, M.S.; Deringer, J.J.; Hall, J.D.; Selkowitz, S.E.

    1988-12-01

    This document describes recommendations for a multiyear plan developed for the US Department of Energy (DOE) as part of the Advanced Energy Design and Operation Technologies (AEDOT) project. The plan is an outgrowth of earlier planning activities conducted for DOE as part of design process research under the Building System Integration Program (BSIP). The proposed research will produce intelligent computer-based design and operation technologies for commercial buildings. In this document, the concept is explained, the need for these new computer-based environments is discussed, the benefits are described, and a plan for developing the AEDOT technologies is presented for the 9-year period beginning FY 1989. 45 refs., 37 figs., 9 tabs.

  14. 77 FR 27263 - Computer Matching Between the Selective Service System and the Department of Education

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-09

    ... From the Federal Register Online via the Government Publishing Office SELECTIVE SERVICE SYSTEM Computer Matching Between the Selective Service System and the Department of Education AGENCY: Selective... the Computer Matching and Privacy Protection Act of 1988 (Pub. L. 100-503), and the Office...

  15. Computer-Assisted Foreign Language Teaching and Learning: Technological Advances

    ERIC Educational Resources Information Center

    Zou, Bin; Xing, Minjie; Wang, Yuping; Sun, Mingyu; Xiang, Catherine H.

    2013-01-01

    Computer-Assisted Foreign Language Teaching and Learning: Technological Advances highlights new research and an original framework that brings together foreign language teaching, experiments and testing practices that utilize the most recent and widely used e-learning resources. This comprehensive collection of research will offer linguistic…

  16. The Federal Government's Role in Advancing Computer Technology

    ERIC Educational Resources Information Center

    Information Hotline, 1978

    1978-01-01

    As part of the Federal Data Processing Reorganization Study submitted by the Science and Technology Team, the Federal Government's role in advancing and diffusing computer technology is discussed. Findings and conclusions assess the state-of-the-art in government and in industry, and five recommendations provide directions for government policy…

  17. Advanced computational research in materials processing for design and manufacturing

    SciTech Connect

    Zacharia, T.

    1994-12-31

    The computational requirements for design and manufacture of automotive components have seen dramatic increases in pursuit of automobiles with three times the mileage. Automotive component design systems are becoming increasingly reliant on structural analysis, requiring larger and more complex analyses: more three-dimensional analyses, larger model sizes, and routine consideration of transient and non-linear effects. Such analyses must be performed rapidly to minimize delays in the design and development process, which drives the need for parallel computing. This paper briefly describes advanced computational research in superplastic forming and automotive crashworthiness.

  18. Recovery Act: Advanced Direct Methanol Fuel Cell for Mobile Computing

    SciTech Connect

    Fletcher, James H.; Cox, Philip; Harrington, William J; Campbell, Joseph L

    2013-09-03

    ABSTRACT Project Title: Recovery Act: Advanced Direct Methanol Fuel Cell for Mobile Computing

    PROJECT OBJECTIVE: The objective of the project was to advance portable fuel cell system technology towards the commercial targets of power density, energy density and lifetime. These targets were laid out in the DOE’s R&D roadmap to develop an advanced direct methanol fuel cell power supply that meets commercial entry requirements. Such a power supply will enable mobile computers to operate non-stop, unplugged from the wall power outlet, by using the high energy density of methanol fuel contained in a replaceable fuel cartridge. Specifically this project focused on balance-of-plant component integration and miniaturization, as well as extensive component, subassembly and integrated system durability and validation testing. This design has resulted in a pre-production power supply design and a prototype that meet the rigorous demands of consumer electronic applications.

    PROJECT TASKS: The proposed work plan was designed to meet the project objectives, which corresponded directly with the objectives outlined in the Funding Opportunity Announcement: to engineer the fuel cell balance-of-plant and packaging to meet the needs of consumer electronic systems, specifically at power levels required for mobile computing. UNF used existing balance-of-plant component technologies developed under its current US Army CERDEC project, as well as a previous DOE project completed by PolyFuel, to further refine them to both miniaturize and integrate their functionality to increase the system power density and energy density. Benefits of UNF’s novel passive water recycling MEA (membrane electrode assembly) and the simplified system architecture it enabled formed the foundation of the design approach. The package design was hardened to address orientation independence, shock, vibration, and environmental requirements. Fuel cartridge and fuel subsystems were improved to ensure effective fuel

  19. [Activities of Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2001-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center, Moffett Field, California. RIACS research focuses on the three cornerstones of IT research necessary to meet the future challenges of NASA missions: 1. Automated Reasoning for Autonomous Systems: Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. 2. Human-Centered Computing: Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities. 3. High Performance Computing and Networking: Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to analysis of large scientific datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply IT research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, visiting scientist programs and student summer programs, designed to encourage and facilitate collaboration between the university and NASA IT research communities.

  20. Advances in Numerical Boundary Conditions for Computational Aeroacoustics

    NASA Technical Reports Server (NTRS)

    Tam, Christopher K. W.

    1997-01-01

    Advances in Computational Aeroacoustics (CAA) depend critically on the availability of accurate, nondispersive, minimally dissipative computational algorithms, as well as high-quality numerical boundary treatments. This paper focuses on recent developments in numerical boundary conditions. In a typical CAA problem, one often encounters two types of boundaries. Because a finite computation domain is used, there are external boundaries. On the external boundaries, boundary conditions simulating the solution outside the computation domain are to be imposed. Inside the computation domain, there may be internal boundaries. On these internal boundaries, boundary conditions simulating the presence of an object or surface with specific acoustic characteristics are to be applied. Numerical boundary conditions, both external and internal, developed for simple model problems are reviewed and examined. Numerical boundary conditions for real aeroacoustic problems are also discussed through specific examples. The paper concludes with a description of some much-needed research in numerical boundary conditions for CAA.
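    The external-boundary issue described in this abstract can be illustrated with a generic textbook example (this is not one of the schemes surveyed in the paper; all names and parameters below are illustrative): a first-order upwind solve of the 1D advection equation on a finite domain, with a one-sided radiation condition at the outflow boundary so that a pulse leaves the computation domain with minimal spurious reflection.

```python
import numpy as np

def advect(nx=200, nt=500, c=1.0, cfl=0.5):
    """Advect a Gaussian pulse out of the finite domain [0, 1]."""
    dx = 1.0 / (nx - 1)
    dt = cfl * dx / c
    x = np.linspace(0.0, 1.0, nx)
    u = np.exp(-300.0 * (x - 0.3) ** 2)   # initial Gaussian pulse
    for _ in range(nt):
        un = u.copy()
        # first-order upwind update in the interior
        u[1:-1] = un[1:-1] - c * dt / dx * (un[1:-1] - un[:-2])
        # external boundary treatments:
        u[0] = 0.0                         # inflow: nothing entering
        # outflow: one-sided radiation condition u_t + c*u_x = 0
        u[-1] = un[-1] - c * dt / dx * (un[-1] - un[-2])
    return x, u

x, u = advect()
# after nt steps the pulse has left the domain; the residual amplitude
# measures how much the boundary treatment reflected back
residual = float(np.max(np.abs(u)))
```

    With 500 steps the pulse centre has convected well past x = 1, so `residual` is essentially the reflection left behind by the boundary treatment; for first-order upwind it is near zero, because the one-sided radiation stencil coincides with the interior stencil (higher-order interior schemes need a genuinely separate boundary closure).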

  1. Advanced sensor-computer technology for urban runoff monitoring

    NASA Astrophysics Data System (ADS)

    Yu, Byunggu; Behera, Pradeep K.; Ramirez Rochac, Juan F.

    2011-04-01

    The paper presents the project team's advanced sensor-computer sphere technology for real-time and continuous monitoring of wastewater runoff at the sewer discharge outfalls along the receiving water. This research significantly enhances and extends the previously proposed novel sensor-computer technology. This advanced technology offers new computation models for an innovative use of the sensor-computer sphere comprising accelerometer, programmable in-situ computer, solar power, and wireless communication for real-time and online monitoring of runoff quantity. This innovation can enable more effective planning and decision-making in civil infrastructure, natural environment protection, and water pollution related emergencies. The paper presents the following: (i) the sensor-computer sphere technology; (ii) a significant enhancement to the previously proposed discrete runoff quantity model of this technology; (iii) a new continuous runoff quantity model. Our comparative study on the two distinct models is presented. Based on this study, the paper further investigates the following: (1) energy-, memory-, and communication-efficient use of the technology for runoff monitoring; (2) possible sensor extensions for runoff quality monitoring.
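    The abstract's distinction between a discrete and a continuous runoff-quantity model can be sketched in a generic way (this is an illustration of the idea only, not the authors' actual models; the synthetic flow signal and function names are invented): a discrete model sums per-interval volumes from sampled flow rates, while a continuous model integrates a reconstructed signal, here via the trapezoidal rule.

```python
import numpy as np

def discrete_volume(q, dt):
    """Discrete model: treat each sample as constant over its interval."""
    return float(np.sum(q[:-1]) * dt)

def continuous_volume(q, dt):
    """Continuous model: integrate a piecewise-linear reconstruction
    (trapezoidal rule, written out to avoid version-specific helpers)."""
    return float(dt * (0.5 * q[0] + np.sum(q[1:-1]) + 0.5 * q[-1]))

t = np.linspace(0.0, 10.0, 101)           # 10-minute event, 6 s sampling
q = np.maximum(0.0, np.sin(0.4 * t))      # synthetic flow rate, m^3/min
dt = t[1] - t[0]
v_d = discrete_volume(q, dt)              # discrete estimate of total volume
v_c = continuous_volume(q, dt)            # continuous estimate of total volume
```

    For this smooth signal the two estimates agree closely; the practical difference shows up in how each model trades memory, communication, and accuracy on an in-situ sensor, which is the trade-off the paper studies.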

  2. Taking the High Ground: A Case for Department of Defense Application of Public Cloud Computing

    DTIC Science & Technology

    2011-06-01

    maintain pace with advances in commercial IT. As Waxer observes, private infrastructures can rarely match the service levels offered by public cloud...hypervisor (See Figure 3) runs directly on the host’s hardware and guest operating systems are installed one layer above it. VMWare ESXi and Microsoft Hyper ...cio.gov/documents/Federal-Cloud-Computing-Strategy.pdf. 8. D. Linthicum, Cloud Computing and SOA Convergence in Your Enterprise: A Step-by-Step Guide

  3. The Design and Implementation of NASA's Advanced Flight Computing Module

    NASA Technical Reports Server (NTRS)

    Alkakaj, Leon; Straedy, Richard; Jarvis, Bruce

    1995-01-01

    This paper describes a working flight computer Multichip Module developed jointly by JPL and TRW under their respective research programs in a collaborative fashion. The MCM is fabricated by nCHIP and is packaged within a 2 by 4 inch Al package from Coors. This flight computer module is one of three modules under development by NASA's Advanced Flight Computer (AFC) program. Further development of the Mass Memory and the programmable I/O MCM modules will follow. The three building block modules will then be stacked into a 3D MCM configuration. The mass and volume of the flight computer MCM, at 89 grams and 1.5 cubic inches respectively, represent a major enabling technology for future deep space as well as commercial remote sensing applications.

  4. Advanced Simulation and Computing FY17 Implementation Plan, Version 0

    SciTech Connect

    McCoy, Michel; Archer, Bill; Hendrickson, Bruce; Wade, Doug; Hoang, Thuc

    2016-08-29

    The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), and quantifying critical margins and uncertainties. Resolving each issue requires increasingly difficult analyses because the aging process has progressively moved the stockpile further away from the original test base. Where possible, the program also enables the use of high performance computing (HPC) and simulation tools to address broader national security needs, such as foreign nuclear weapon assessments and counter nuclear terrorism.

  5. Activities of the Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Oliger, Joseph

    1994-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under contract with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. Research at RIACS is currently being done in the following areas: (1) parallel computing; (2) advanced methods for scientific computing; (3) high performance networks; and (4) learning systems. RIACS technical reports are usually preprints of manuscripts that have been submitted to research journals or conference proceedings. A list of these reports for the period January 1, 1994 through December 31, 1994 is in the Reports and Abstracts section of this report.

  6. Soft computing in design and manufacturing of advanced materials

    NASA Technical Reports Server (NTRS)

    Cios, Krzysztof J.; Baaklini, George Y.; Vary, Alex

    1993-01-01

    The potential of fuzzy sets and neural networks, often referred to as soft computing, for aiding in all aspects of manufacturing of advanced materials like ceramics is addressed. In design and manufacturing of advanced materials, it is desirable to find which of the many processing variables contribute most to the desired properties of the material. There is also interest in real time quality control of parameters that govern material properties during processing stages. The concepts of fuzzy sets and neural networks are briefly introduced and it is shown how they can be used in the design and manufacturing processes. These two computational methods are alternatives to other methods such as the Taguchi method. The two methods are demonstrated by using data collected at NASA Lewis Research Center. Future research directions are also discussed.
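    As a sketch of the fuzzy-set idea described in this abstract (this is not the NASA Lewis implementation; the variable, breakpoints, and function names are hypothetical, chosen only for illustration), overlapping triangular membership functions can grade a processing variable, here a sintering temperature, against linguistic categories:

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def grade_temperature(t_celsius):
    """Map a crisp temperature to degrees of membership in fuzzy categories.
    Breakpoints are invented for illustration."""
    return {
        "low":     tri(t_celsius, 800.0, 1000.0, 1200.0),
        "optimal": tri(t_celsius, 1100.0, 1300.0, 1500.0),
        "high":    tri(t_celsius, 1400.0, 1600.0, 1800.0),
    }

grades = grade_temperature(1300.0)
```

    In a real-time quality-control loop of the kind the abstract describes, such membership degrees would feed fuzzy rules (or a neural network) that relate processing variables to the desired material properties.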

  7. Advanced computer modeling techniques expand belt conveyor technology

    SciTech Connect

    Alspaugh, M.

    1998-07-01

    Increased mining production is continuing to challenge engineers and manufacturers to keep up. The pressure to produce larger and more versatile equipment is increasing. This paper will show some recent major projects in the belt conveyor industry that have pushed the limits of design and engineering technology. Also, it will discuss the systems engineering discipline and advanced computer modeling tools that have helped make these achievements possible. Several examples of technologically advanced designs will be reviewed. However, new technology can sometimes produce increased problems with equipment availability and reliability if not carefully developed. Computer modeling techniques that help one design larger equipment can also compound operational headaches if engineering processes and algorithms are not carefully analyzed every step of the way.

  8. Advances in Cross-Cutting Ideas for Computational Climate Science

    SciTech Connect

    Ng, Esmond; Evans, Katherine J.; Caldwell, Peter; Hoffman, Forrest M.; Jackson, Charles; Kerstin, Van Dam; Leung, Ruby; Martin, Daniel F.; Ostrouchov, George; Tuminaro, Raymond; Ullrich, Paul; Wild, S.; Williams, Samuel

    2017-01-01

    This report presents results from the DOE-sponsored workshop titled, ``Advancing X-Cutting Ideas for Computational Climate Science Workshop,'' known as AXICCS, held on September 12--13, 2016 in Rockville, MD. The workshop brought together experts in climate science, computational climate science, computer science, and mathematics to discuss interesting but unsolved science questions regarding climate modeling and simulation, promoted collaboration among the diverse scientists in attendance, and brainstormed about possible tools and capabilities that could be developed to help address them. Several research opportunities emerged from the discussions that the group felt could advance climate science significantly. These include (1) process-resolving models to provide insight into important processes and features of interest and to inform the development of advanced physical parameterizations, (2) a community effort to develop and provide integrated model credibility, (3) including, organizing, and managing increasingly connected model components that increase model fidelity yet also complexity, and (4) treating Earth system models as one interconnected organism without numerical or data-based boundaries that limit interactions. The group also identified several cross-cutting advances in mathematics, computer science, and computational science that would be needed to enable one or more of these big ideas. It is critical to address the need for organized, verified, and optimized software, which enables the models to grow and continue to provide solutions in which the community can have confidence. Effectively utilizing the newest computer hardware enables simulation efficiency and the ability to handle output from increasingly complex and detailed models. This will be accomplished through hierarchical multiscale algorithms in tandem with new strategies for data handling, analysis, and storage. These big ideas and cross-cutting technologies for enabling

  9. University Programs of the U.S. Department of Energy Advance Accelerator Applications Program

    SciTech Connect

    Beller, D. E.

    2002-01-01

    The Advanced Accelerator Applications (AAA) Program was initiated in fiscal year 2001 (FY01) by the U.S. Congress, the U.S. Department of Energy (DOE), and the Los Alamos National Laboratory (LANL) in partnership with other national laboratories. The primary goal of this program is to investigate the feasibility of accelerator-driven transmutation of nuclear waste (ATW). Because a large cadre of educated scientists and trained technicians will be needed to conduct the investigations of science and technology for transmutation, the AAA Program Office has begun a multi-year program to involve university faculty and students in various phases of the Project.

  10. High Performance Computing Modeling Advances Accelerator Science for High-Energy Physics

    SciTech Connect

    Amundson, James; Macridin, Alexandru; Spentzouris, Panagiotis

    2014-07-28

    The development and optimization of particle accelerators are essential for advancing our understanding of the properties of matter, energy, space, and time. Particle accelerators are complex devices whose behavior involves many physical effects on multiple scales. Therefore, advanced computational tools utilizing high-performance computing are essential for accurately modeling them. In the past decade, the US Department of Energy's SciDAC program has produced accelerator-modeling tools that have been employed to tackle some of the most difficult accelerator science problems. The authors discuss the Synergia framework and its applications to high-intensity particle accelerator physics. Synergia is an accelerator simulation package capable of handling the entire spectrum of beam dynamics simulations. They present Synergia's design principles and its performance on HPC platforms.

  11. High Performance Computing Modeling Advances Accelerator Science for High-Energy Physics

    DOE PAGES

    Amundson, James; Macridin, Alexandru; Spentzouris, Panagiotis

    2014-07-28

    The development and optimization of particle accelerators are essential for advancing our understanding of the properties of matter, energy, space, and time. Particle accelerators are complex devices whose behavior involves many physical effects on multiple scales. Therefore, advanced computational tools utilizing high-performance computing are essential for accurately modeling them. In the past decade, the US Department of Energy's SciDAC program has produced accelerator-modeling tools that have been employed to tackle some of the most difficult accelerator science problems. The authors discuss the Synergia framework and its applications to high-intensity particle accelerator physics. Synergia is an accelerator simulation package capable of handling the entire spectrum of beam dynamics simulations. They present Synergia's design principles and its performance on HPC platforms.

  12. Grand Challenges of Advanced Computing for Energy Innovation Report from the Workshop Held July 31-August 2, 2012

    SciTech Connect

    Larzelere, Alex R.; Ashby, Steven F.; Christensen, Dana C.; Crawford, Dona L.; Khaleel, Mohammad A.; John, Grosh; Stults, B. Ray; Lee, Steven L.; Hammond, Steven W.; Grover, Benjamin T.; Neely, Rob; Dudney, Lee Ann; Goldstein, Noah C.; Wells, Jack; Peltz, Jim

    2013-03-06

    On July 31-August 2 of 2012, the U.S. Department of Energy (DOE) held a workshop entitled Grand Challenges of Advanced Computing for Energy Innovation. This workshop built on three earlier workshops that clearly identified the potential for the Department and its national laboratories to enable energy innovation. The specific goal of the workshop was to identify the key challenges that the nation must overcome to apply the full benefit of taxpayer-funded advanced computing technologies to U.S. energy innovation in the ways that the country produces, moves, stores, and uses energy. Perhaps more importantly, the workshop also developed a set of recommendations to help the Department overcome those challenges. These recommendations provide an action plan for what the Department can do in the coming years to improve the nation’s energy future.

  13. Federal High Performance Computing and Communications Program. The Department of Energy Component.

    ERIC Educational Resources Information Center

    Department of Energy, Washington, DC. Office of Energy Research.

    This report, profusely illustrated with color photographs and other graphics, elaborates on the Department of Energy (DOE) research program in High Performance Computing and Communications (HPCC). The DOE program is one of seven agency programs within the Federal Research and Development Program working on HPCC. The DOE HPCC program emphasizes research in…

  14. High-Performance Computing for Advanced Smart Grid Applications

    SciTech Connect

    Huang, Zhenyu; Chen, Yousu

    2012-07-06

    The power grid is becoming far more complex as the grid evolution meets an information revolution. With the penetration of smart grid technologies, the grid is evolving at an unprecedented speed, and the information infrastructure is fundamentally improved by a large number of smart meters and sensors that produce several orders of magnitude more data. How to pull data in, perform analysis, and put information out in a real-time manner is a fundamental challenge in smart grid operation and planning. The future power grid requires high performance computing to be one of the foundational technologies for developing the algorithms and tools that can cope with this significantly increased complexity. New techniques and computational capabilities are required to meet the demands for higher reliability and better asset utilization, including advanced algorithms and computing hardware for large-scale modeling, simulation, and analysis. This chapter summarizes the computational challenges in the smart grid and the need for high performance computing, and presents examples of how high performance computing might be used for future smart grid operation and planning.

  15. The U.S. Department of Energy's advanced turbine systems program

    SciTech Connect

    Layne, A.W.; Layne, P.W.

    1998-06-01

    Advanced Turbine Systems (ATS) are poised to capture the majority of new electric power generation capacity well into the next century. US Department of Energy (DOE) programs supporting the development of ATS technology will enable gas turbine manufacturers to provide ATS systems to the commercial marketplace at the turn of the next century. A progress report on the ATS Program will be presented in this paper. The technical challenges, advanced critical technology requirements, and system configurations meeting the goals of the program will be discussed. Progress has been made in the areas of materials, heat transfer, aerodynamics, and combustion. Applied research conducted by universities, industry, and Government has resulted in advanced designs and power cycle configurations to develop an ATS which operates on natural gas, coal, and biomass fuels. Details on the ATS Program research, development, and technology validation and readiness activities will be presented. The future direction of the program and its relationship to other Government programs will be discussed in this paper.

  16. Inspiring engineering minds to advance human health: the Henry Samueli School of Engineering's Department of BME.

    PubMed

    Lee, Abraham; Wirtanen, Erik

    2012-07-01

    The growth of biomedical engineering at The Henry Samueli School of Engineering at the University of California, Irvine (UCI) has been rapid since the Center for Biomedical Engineering was first formed in 1998 [and was later renamed as the Department of Biomedical Engineering (BME) in 2002]. Our current mission statement, “Inspiring Engineering Minds to Advance Human Health,” serves as a reminder of why we exist, what we do, and the core principles that we value and by which we abide. BME exists to advance the state of human health via engineering innovation and practices. To attain our goal, we are empowering our faculty to inspire and mobilize our students to address health problems. We treasure the human being, particularly the human mind and health. We believe that BME is where minds are nurtured, challenged, and disciplined, and it is also where the health of the human is held as a core mission value that deserves our utmost priority (Figure 1). Advancing human health is not a theoretical practice; it requires bridging between disciplines (engineering and medicine) and between communities (academic and industry).

  17. Computation of the tip vortex flowfield for advanced aircraft propellers

    NASA Technical Reports Server (NTRS)

    Tsai, Tommy M.; Dejong, Frederick J.; Levy, Ralph

    1988-01-01

    The tip vortex flowfield plays a significant role in the performance of advanced aircraft propellers. The flowfield in the tip region is complex, three-dimensional and viscous with large secondary velocities. An analysis is presented using an approximate set of equations which contains the physics required by the tip vortex flowfield, but which does not require the resources of the full Navier-Stokes equations. A computer code was developed to predict the tip vortex flowfield of advanced aircraft propellers. A grid generation package was developed to allow specification of a variety of advanced aircraft propeller shapes. Calculations of the tip vortex generation on an SR3 type blade at high Reynolds numbers were made using this code and a parametric study was performed to show the effect of tip thickness on tip vortex intensity. In addition, calculations of the tip vortex generation on a NACA 0012 type blade were made, including the flowfield downstream of the blade trailing edge. Comparison of flowfield calculations with experimental data from an F4 blade was made. A user's manual was also prepared for the computer code (NASA CR-182178).

  18. 78 FR 71591 - Privacy Act of 1974; Computer Matching Program between the U.S. Department of Education (ED) and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-29

    ... Privacy Act of 1974; Computer Matching Program between the U.S. Department of Education (ED) and the U.S..., the Computer Matching and Privacy Protection Act of 1988, 54 FR 25818 (June 19, 1989), and OMB... hereby given of the renewal of the computer matching program between the U.S. Department of Education...

  19. XII Advanced Computing and Analysis Techniques in Physics Research

    NASA Astrophysics Data System (ADS)

    Speer, Thomas; Carminati, Federico; Werlen, Monique

    November 2008 will be a few months after the official start of the LHC, when the highest quantum energy ever produced by mankind will be observed by the most complex piece of scientific equipment ever built. The LHC will open a new era in physics research and push the frontier of knowledge further. This achievement has been made possible by new technological developments in many fields, but computing is certainly the technology that has made this whole enterprise possible. Accelerator and detector design, construction management, data acquisition, detector monitoring, data analysis, event simulation, and theoretical interpretation are all computing-based HEP activities, but they also occur in many other research fields. Computing is everywhere and forms the common link between all the scientists and engineers involved. The ACAT workshop series, created back in 1990 as AIHENP (Artificial Intelligence in High Energy and Nuclear Physics), has been covering the tremendous evolution of computing in its most advanced topics, trying to set up bridges between computer science and experimental and theoretical physics. Conference web-site: http://acat2008.cern.ch/ Programme and presentations: http://indico.cern.ch/conferenceDisplay.py?confId=34666

  20. High resolution computed tomography of advanced composite and ceramic materials

    NASA Technical Reports Server (NTRS)

    Yancey, R. N.; Klima, S. J.

    1991-01-01

    Advanced composite and ceramic materials are being developed for use in many new defense and commercial applications. In order to achieve the desired mechanical properties of these materials, the structural elements must be carefully analyzed and engineered. A study was conducted to evaluate the use of high resolution computed tomography (CT) as a macrostructural analysis tool for advanced composite and ceramic materials. Several samples were scanned using a laboratory high resolution CT scanner. Samples were also destructively analyzed at the locations of the scans and the nondestructive and destructive results were compared. The study provides useful information outlining the strengths and limitations of this technique and the prospects for further research in this area.

  1. DOE Advanced Scientific Computing Advisory Subcommittee (ASCAC) Report: Top Ten Exascale Research Challenges

    SciTech Connect

    Lucas, Robert; Ang, James; Bergman, Keren; Borkar, Shekhar; Carlson, William; Carrington, Laura; Chiu, George; Colwell, Robert; Dally, William; Dongarra, Jack; Geist, Al; Haring, Rud; Hittinger, Jeffrey; Hoisie, Adolfy; Klein, Dean Micron; Kogge, Peter; Lethin, Richard; Sarkar, Vivek; Schreiber, Robert; Shalf, John; Sterling, Thomas; Stevens, Rick; Bashor, Jon; Brightwell, Ron; Coteus, Paul; Debenedictus, Erik; Hiller, Jon; Kim, K. H.; Langston, Harper; Murphy, Richard Micron; Webster, Clayton; Wild, Stefan; Grider, Gary; Ross, Rob; Leyffer, Sven; Laros III, James

    2014-02-10

    Exascale computing systems are essential for the scientific fields that will transform the 21st century global economy, including energy, biotechnology, nanotechnology, and materials science. Progress in these fields is predicated on the ability to perform advanced scientific and engineering simulations, and analyze the deluge of data. On July 29, 2013, ASCAC was charged by Patricia Dehmer, the Acting Director of the Office of Science, to assemble a subcommittee to provide advice on exascale computing. This subcommittee was directed to return a list of no more than ten technical approaches (hardware and software) that will enable the development of a system that achieves the Department's goals for exascale computing. Numerous reports over the past few years have documented the technical challenges and the non-viability of simply scaling existing computer designs to reach exascale. The technical challenges revolve around energy consumption, memory performance, resilience, extreme concurrency, and big data. Drawing from these reports and more recent experience, this ASCAC subcommittee has identified the top ten computing technology advancements that are critical to making a capable, economically viable, exascale system.

  2. Advances in Computational Stability Analysis of Composite Aerospace Structures

    SciTech Connect

    Degenhardt, R.; Araujo, F. C. de

    2010-09-30

    The European aircraft industry demands reduced development and operating costs. Structural weight reduction through the exploitation of structural reserves in composite aerospace structures contributes to this aim; however, it requires accurate and experimentally validated stability analysis of real structures under realistic loading conditions. This paper presents different advances in the computational stability analysis of composite aerospace structures that contribute to that field. For stringer-stiffened panels, the main results of the completed EU project COCOMAT are given. It investigated the exploitation of reserves in primary fibre composite fuselage structures through accurate and reliable simulation of postbuckling and collapse. For unstiffened cylindrical composite shells, a proposal for a new design method is presented.

  3. Recent Advances in Computed Tomographic Technology: Cardiopulmonary Imaging Applications.

    PubMed

    Tabari, Azadeh; Lo Gullo, Roberto; Murugan, Venkatesh; Otrakji, Alexi; Digumarthy, Subba; Kalra, Mannudeep

    2017-03-01

    Cardiothoracic diseases result in substantial morbidity and mortality. Chest computed tomography (CT) has been an imaging modality of choice for assessing a host of chest diseases, and technologic advances have enabled the emergence of coronary CT angiography as a robust noninvasive test for cardiac imaging. Technologic developments in CT have also enabled the application of dual-energy CT scanning for assessing pulmonary vascular and neoplastic processes. Concerns over increasing radiation dose from CT scanning are being addressed with introduction of more dose-efficient wide-area detector arrays and iterative reconstruction techniques. This review article discusses the technologic innovations in CT and their effect on cardiothoracic applications.

  4. Advanced Computer Science on Internal Ballistics of Solid Rocket Motors

    NASA Astrophysics Data System (ADS)

    Shimada, Toru; Kato, Kazushige; Sekino, Nobuhiro; Tsuboi, Nobuyuki; Seike, Yoshio; Fukunaga, Mihoko; Daimon, Yu; Hasegawa, Hiroshi; Asakawa, Hiroya

    This paper describes the development of a numerical simulation system, which we call “Advanced Computer Science on SRM Internal Ballistics (ACSSIB)”, for the purpose of improving the performance and reliability of solid rocket motors (SRM). The ACSSIB system consists of a casting simulation code for solid propellant slurry, a correlation database relating the local burning rate of cured propellant to local slurry flow characteristics, and a numerical code for the internal ballistics of the SRM, as well as relevant hardware. This paper describes mainly the objectives, the contents of this R&D, and the output of the 2008 fiscal year.

  5. Advances in Electromagnetic Modelling through High Performance Computing

    SciTech Connect

    Ko, K.; Folwell, N.; Ge, L.; Guetz, A.; Lee, L.; Li, Z.; Ng, C.; Prudencio, E.; Schussman, G.; Uplenchwar, R.; Xiao, L.; /SLAC

    2006-03-29

    Under the DOE SciDAC project on Accelerator Science and Technology, a suite of electromagnetic codes has been under development at SLAC that are based on unstructured grids for higher accuracy, and use parallel processing to enable large-scale simulation. The new modeling capability is supported by SciDAC collaborations on meshing, solvers, refinement, optimization and visualization. These advances in computational science are described and the application of the parallel eigensolver Omega3P to the cavity design for the International Linear Collider is discussed.

  6. Computational methods of the Advanced Fluid Dynamics Model

    SciTech Connect

    Bohl, W.R.; Wilhelm, D.; Parker, F.R.; Berthier, J.; Maudlin, P.J.; Schmuck, P.; Goutagny, L.; Ichikawa, S.; Ninokata, H.; Luck, L.B.

    1987-01-01

    To more accurately treat severe accidents in fast reactors, a program has been set up to investigate new computational models and approaches. The product of this effort is a computer code, the Advanced Fluid Dynamics Model (AFDM). This paper describes some of the basic features of the numerical algorithm used in AFDM. Aspects receiving particular emphasis are the fractional-step method of time integration, the semi-implicit pressure iteration, the virtual mass inertial terms, the use of three velocity fields, higher order differencing, convection of interfacial area with source and sink terms, multicomponent diffusion processes in heat and mass transfer, the SESAME equation of state, and vectorized programming. A calculated comparison with an isothermal tetralin/ammonia experiment is performed. We conclude that significant improvements are possible in reliably calculating the progression of severe accidents with further development.
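    The fractional-step time integration with a pressure correction mentioned in the abstract is a widely used pattern in fluid dynamics codes. The sketch below illustrates the general idea on a deliberately simple problem: a two-step projection scheme for 2D incompressible flow on a periodic grid, with a spectral Poisson solve for the pressure. The grid size, time step, viscosity, and initial condition are illustrative assumptions; the actual AFDM algorithm (three velocity fields, multicomponent heat and mass transfer, SESAME equation of state) is far more elaborate.

```python
import numpy as np

# Illustrative fractional-step (projection) scheme for 2D incompressible
# flow on a periodic grid -- not AFDM code, just the generic two-step idea:
# (1) advance momentum explicitly, (2) correct with a pressure Poisson solve.
n = 32                       # grid points per dimension (assumption)
L = 2 * np.pi
dx = L / n
dt = 1e-3                    # time step (assumption)
nu = 1e-2                    # kinematic viscosity (assumption)

x = np.arange(n) * dx
X, Y = np.meshgrid(x, x, indexing="ij")
u = np.sin(X) * np.cos(Y)    # Taylor-Green initial condition
v = -np.cos(X) * np.sin(Y)

k = np.fft.fftfreq(n, d=dx) * 2 * np.pi
KX, KY = np.meshgrid(k, k, indexing="ij")
K2 = KX**2 + KY**2
K2[0, 0] = 1.0               # avoid division by zero for the mean mode

def deriv(f, kvec):
    """Spectral first derivative along the axis selected by kvec."""
    return np.real(np.fft.ifft2(1j * kvec * np.fft.fft2(f)))

def laplacian(f):
    return np.real(np.fft.ifft2(-(KX**2 + KY**2) * np.fft.fft2(f)))

for step in range(10):
    # Step 1: explicit advection-diffusion gives a provisional velocity u*
    ustar = u + dt * (-u * deriv(u, KX) - v * deriv(u, KY) + nu * laplacian(u))
    vstar = v + dt * (-u * deriv(v, KX) - v * deriv(v, KY) + nu * laplacian(v))
    # Step 2: pressure Poisson solve, grad(p) removes the divergence of u*:
    # laplacian(p) = div(u*) / dt, solved exactly in Fourier space
    div = deriv(ustar, KX) + deriv(vstar, KY)
    p_hat = -np.fft.fft2(div / dt) / K2
    p_hat[0, 0] = 0.0        # pressure defined up to a constant
    p = np.real(np.fft.ifft2(p_hat))
    u = ustar - dt * deriv(p, KX)
    v = vstar - dt * deriv(p, KY)

# The projected field is divergence-free to machine precision
final_div = deriv(u, KX) + deriv(v, KY)
print(np.abs(final_div).max() < 1e-10)
```

Production codes like AFDM replace the spectral solve with an iterative semi-implicit pressure iteration on general meshes, but the split into a provisional momentum step followed by a pressure correction is the same.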

  7. DOE Advanced Scientific Computing Advisory Committee (ASCAC) Subcommittee Report on Scientific and Technical Information

    SciTech Connect

    Hey, Tony; Agarwal, Deborah; Borgman, Christine; Cartaro, Concetta; Crivelli, Silvia; Van Dam, Kerstin Kleese; Luce, Richard; Arjun, Shankar; Trefethen, Anne; Wade, Alex; Williams, Dean

    2015-09-04

    The Advanced Scientific Computing Advisory Committee (ASCAC) was charged to form a standing subcommittee to review the Department of Energy's Office of Scientific and Technical Information (OSTI) and to begin by assessing the quality and effectiveness of OSTI's recent and current products and services and to comment on its mission and future directions in the rapidly changing environment for scientific publication and data. The Committee met with OSTI staff and reviewed available products, services and other materials. This report summarizes their initial findings and recommendations.

  8. Computational ocean acoustics: Advances in 3D ocean acoustic modeling

    NASA Astrophysics Data System (ADS)

    Schmidt, Henrik; Jensen, Finn B.

    2012-11-01

    The numerical models of ocean acoustic propagation developed in the 1980s are still in widespread use today, and the field of computational ocean acoustics is often considered a mature field. However, the explosive increase in computational power available to the community has created opportunities for modeling phenomena that were earlier beyond reach. Most notably, three-dimensional propagation and scattering problems have been computationally prohibitive, but are now addressed routinely using brute-force numerical approaches such as the Finite Element Method, in particular for target scattering problems, where they are being combined with traditional wave-theory propagation models in hybrid modeling frameworks. Also, recent years have seen the development of hybrid approaches coupling oceanographic circulation models with acoustic propagation models, enabling the forecasting of sonar performance uncertainty in dynamic ocean environments. These and other advances made over the last couple of decades support the notion that the field of computational ocean acoustics is far from mature. [Work supported by the Office of Naval Research, Code 321OA].

  9. Race and emotion in computer-based HIV prevention videos for emergency department patients.

    PubMed

    Aronson, Ian David; Bania, Theodore C

    2011-04-01

    Computer-based video provides a valuable tool for HIV prevention in hospital emergency departments. However, the type of video content and protocol that will be most effective remain underexplored and the subject of debate. This study employs a new and highly replicable methodology that enables comparisons of multiple video segments, each based on conflicting theories of multimedia learning. Patients in the main treatment areas of a large urban hospital's emergency department used handheld computers running custom-designed software to view video segments and respond to pre-intervention and post-intervention data collection items. The videos examine whether participants learn more depending on the race of the person who appears onscreen and whether positive or negative emotional content better facilitates learning. The results indicate important differences by participant race. African American participants responded better to video segments depicting White people. White participants responded better to positive emotional content.

  10. Computational cardiology: how computer simulations could be used to develop new therapies and advance existing ones

    PubMed Central

    Trayanova, Natalia A.; O'Hara, Thomas; Bayer, Jason D.; Boyle, Patrick M.; McDowell, Kathleen S.; Constantino, Jason; Arevalo, Hermenegild J.; Hu, Yuxuan; Vadakkumpadan, Fijoy

    2012-01-01

    This article reviews the latest developments in computational cardiology. It focuses on the contribution of cardiac modelling to the development of new therapies as well as the advancement of existing ones for cardiac arrhythmias and pump dysfunction. Reviewed are cardiac modelling efforts aimed at advancing and optimizing existent therapies for cardiac disease (defibrillation, ablation of ventricular tachycardia, and cardiac resynchronization therapy) and at suggesting novel treatments, including novel molecular targets, as well as efforts to use cardiac models in stratification of patients likely to benefit from a given therapy, and the use of models in diagnostic procedures. PMID:23104919

  11. 78 FR 54875 - Privacy Act of 1974; Computer Matching Program Between the Department of Education (ED) and the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-06

    ... Privacy Act of 1974; Computer Matching Program Between the Department of Education (ED) and the Social... Computer Matching and Privacy Protection Act of 1988, the Computer Matching and Privacy Protections... Computer Matching and Privacy Protection Act of 1988, published in the Federal Register on June 19,...

  12. Recent advances in computational mechanics of the human knee joint.

    PubMed

    Kazemi, M; Dabiri, Y; Li, L P

    2013-01-01

    Computational mechanics has been advanced in every area of orthopedic biomechanics. The objective of this paper is to provide a general review of the computational models used in the analysis of the mechanical function of the knee joint in different loading and pathological conditions. Major review articles published in related areas are summarized first. The constitutive models for soft tissues of the knee are briefly discussed to facilitate understanding the joint modeling. A detailed review of the tibiofemoral joint models is presented thereafter. The geometry reconstruction procedures as well as some critical issues in finite element modeling are also discussed. Computational modeling can be a reliable and effective method for the study of mechanical behavior of the knee joint, if the model is constructed correctly. Single-phase material models have been used to predict the instantaneous load response for the healthy knees and repaired joints, such as total and partial meniscectomies, ACL and PCL reconstructions, and joint replacements. Recently, poromechanical models accounting for fluid pressurization in soft tissues have been proposed to study the viscoelastic response of the healthy and impaired knee joints. While the constitutive modeling has been considerably advanced at the tissue level, many challenges still exist in applying a good material model to three-dimensional joint simulations. A complete model validation at the joint level seems impossible presently, because only simple data can be obtained experimentally. Therefore, model validation may be concentrated on the constitutive laws using multiple mechanical tests of the tissues. Extensive model verifications at the joint level are still crucial for the accuracy of the modeling.

  13. Recent Advances in Computational Mechanics of the Human Knee Joint

    PubMed Central

    Kazemi, M.; Dabiri, Y.; Li, L. P.

    2013-01-01

    Computational mechanics has been advanced in every area of orthopedic biomechanics. The objective of this paper is to provide a general review of the computational models used in the analysis of the mechanical function of the knee joint in different loading and pathological conditions. Major review articles published in related areas are summarized first. The constitutive models for soft tissues of the knee are briefly discussed to facilitate understanding the joint modeling. A detailed review of the tibiofemoral joint models is presented thereafter. The geometry reconstruction procedures as well as some critical issues in finite element modeling are also discussed. Computational modeling can be a reliable and effective method for the study of mechanical behavior of the knee joint, if the model is constructed correctly. Single-phase material models have been used to predict the instantaneous load response for the healthy knees and repaired joints, such as total and partial meniscectomies, ACL and PCL reconstructions, and joint replacements. Recently, poromechanical models accounting for fluid pressurization in soft tissues have been proposed to study the viscoelastic response of the healthy and impaired knee joints. While the constitutive modeling has been considerably advanced at the tissue level, many challenges still exist in applying a good material model to three-dimensional joint simulations. A complete model validation at the joint level seems impossible presently, because only simple data can be obtained experimentally. Therefore, model validation may be concentrated on the constitutive laws using multiple mechanical tests of the tissues. Extensive model verifications at the joint level are still crucial for the accuracy of the modeling. PMID:23509602

  14. Temporal and spatial organization of doctors' computer usage in a UK hospital department.

    PubMed

    Martins, H M G; Nightingale, P; Jones, M R

    2005-06-01

    This paper describes the use of an application accessible via distributed desktop computing and wireless mobile devices in a specialist department of a UK acute hospital. Data (application logs, in-depth interviews, and ethnographic observation) were simultaneously collected to study doctors' work via this application, when and where they accessed different areas of it, and from what computing devices. These show that the application is widely used, but in significantly different ways over time and space. For example, physicians and surgeons differ in how they use the application and in their choice of mobile or desktop computing. Consultants and junior doctors in the same teams also seem to access different sources of patient information, at different times, and from different locations. Mobile technology was used almost exclusively during the morning by groups of clinicians, predominantly for ward rounds.

  15. NASA Trapezoidal Wing Computations Including Transition and Advanced Turbulence Modeling

    NASA Technical Reports Server (NTRS)

    Rumsey, C. L.; Lee-Rausch, E. M.

    2012-01-01

    Flow about the NASA Trapezoidal Wing is computed with several turbulence models by using grids from the first High Lift Prediction Workshop in an effort to advance understanding of computational fluid dynamics modeling for this type of flowfield. Transition is accounted for in many of the computations. In particular, a recently-developed 4-equation transition model is utilized and works well overall. Accounting for transition tends to increase lift and decrease moment, which improves the agreement with experiment. Upper surface flap separation is reduced, and agreement with experimental surface pressures and velocity profiles is improved. The predicted shape of wakes from upstream elements is strongly influenced by grid resolution in regions above the main and flap elements. Turbulence model enhancements to account for rotation and curvature have the general effect of increasing lift and improving the resolution of the wing tip vortex as it convects downstream. However, none of the models improve the prediction of surface pressures near the wing tip, where more grid resolution is needed.

  16. OPENING REMARKS: SciDAC: Scientific Discovery through Advanced Computing

    NASA Astrophysics Data System (ADS)

    Strayer, Michael

    2005-01-01

    with industry and virtual prototyping. New instruments of collaboration will include institutes and centers while summer schools, workshops and outreach will invite new talent and expertise. Computational science adds new dimensions to science and its practice. Disciplines of fusion, accelerator science, and combustion are poised to blur the boundaries between pure and applied science. As we open the door into FY2006 we shall see a landscape of new scientific challenges: in biology, chemistry, materials, and astrophysics to name a few. The enabling technologies of SciDAC have been transformational as drivers of change. Planning for major new software systems assumes a base line employing Common Component Architectures and this has become a household word for new software projects. While grid algorithms and mesh refinement software have transformed applications software, data management and visualization have transformed our understanding of science from data. The Gordon Bell prize now seems to be dominated by computational science and solvers developed by TOPS ISIC. The priorities of the Office of Science in the Department of Energy are clear. The 20 year facilities plan is driven by new science. High performance computing is placed amongst the two highest priorities. Moore's law says that by the end of the next cycle of SciDAC we shall have peta-flop computers. The challenges of petascale computing are enormous. These and the associated computational science are the highest priorities for computing within the Office of Science. Our effort in Leadership Class computing is just a first step towards this goal. Clearly, computational science at this scale will face enormous challenges and possibilities. Performance evaluation and prediction will be critical to unraveling the needed software technologies. We must not lose sight of our overarching goal—that of scientific discovery. 
Science does not stand still and the landscape of science discovery and computing holds

  17. High performance parallel computers for science: New developments at the Fermilab advanced computer program

    SciTech Connect

    Nash, T.; Areti, H.; Atac, R.; Biel, J.; Cook, A.; Deppe, J.; Edel, M.; Fischler, M.; Gaines, I.; Hance, R.

    1988-08-01

    Fermilab's Advanced Computer Program (ACP) has been developing highly cost effective, yet practical, parallel computers for high energy physics since 1984. The ACP's latest developments are proceeding in two directions. A Second Generation ACP Multiprocessor System for experiments will include $3500 RISC processors each with performance over 15 VAX MIPS. To support such high performance, the new system allows parallel I/O, parallel interprocess communication, and parallel host processes. The ACP Multi-Array Processor has been developed for theoretical physics. Each $4000 node is a FORTRAN or C programmable pipelined 20 MFlops (peak), 10 MByte single board computer. These are plugged into a 16 port crossbar switch crate which handles both inter- and intra-crate communication. The crates are connected in a hypercube. Site oriented applications like lattice gauge theory are supported by system software called CANOPY, which makes the hardware virtually transparent to users. A 256 node, 5 GFlop, system is under construction. 10 refs., 7 figs.
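The hypercube interconnect mentioned above has a simple addressing property: each node's neighbors are the nodes whose addresses differ in exactly one bit. A minimal sketch of that addressing rule (illustrative only, not the ACP's actual routing code):

```python
def hypercube_neighbors(node: int, dim: int) -> list[int]:
    """Return the nodes directly connected to `node` in a dim-dimensional
    hypercube, where neighbor addresses differ in exactly one bit."""
    return [node ^ (1 << i) for i in range(dim)]

# In a 3-dimensional hypercube (8 nodes), node 0 connects to nodes 1, 2, and 4.
print(hypercube_neighbors(0, 3))  # [1, 2, 4]
```

Flipping one address bit per hop means any two of the 2^dim nodes are at most dim hops apart, which is why hypercubes were a popular topology for crate-to-crate communication.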

  18. The Role of a Computer Science Department in the Use of the Computer in Undergraduate Curricula at a Small Liberal Arts College.

    ERIC Educational Resources Information Center

    Keller, Mary K.

    There are several ways in which the computer science department at the small liberal arts college can contribute to the richness of the institution's undergraduate curriculum. In addition to providing training for students interested in computer-related careers, the department, by offering courses for non-majors in the field, can broaden the…

  19. Advanced information processing system: Inter-computer communication services

    NASA Technical Reports Server (NTRS)

    Burkhardt, Laura; Masotto, Tom; Sims, J. Terry; Whittredge, Roy; Alger, Linda S.

    1991-01-01

    The purpose is to document the functional requirements and detailed specifications for the Inter-Computer Communications Services (ICCS) of the Advanced Information Processing System (AIPS). An introductory section is provided to outline the overall architecture and functional requirements of the AIPS and to present an overview of the ICCS. An overview of the AIPS architecture as well as a brief description of the AIPS software is given. The guarantees of the ICCS are provided, and the ICCS is described as a seven-layered International Standards Organization (ISO) Model. The ICCS functional requirements, functional design, and detailed specifications as well as each layer of the ICCS are also described. A summary of results and suggestions for future work are presented.

  20. Recent advances in computational structural reliability analysis methods

    NASA Technical Reports Server (NTRS)

    Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.

    1993-01-01

    The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given the fact that many different modes of failure are usually possible, achievement of this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by the application of safety factors and conservative assumptions. However, the safety factor approach lacks a quantitative basis in that the level of reliability is never known and usually results in overly conservative designs because of compounding conservatisms. Furthermore, problem parameters that control the reliability are not identified, nor is their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. A significant level of activity in the research and development community was seen recently, much of which was directed towards the prediction of failure probabilities for single mode failures. The focus is to present some early results and demonstrations of advanced reliability methods applied to structural system problems. This includes structures that can fail as a result of multiple component failures (e.g., a redundant truss), or structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonant vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.
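As a generic illustration of the probabilistic framing (not the methods developed in the paper), the failure probability for a simple limit-state function g(R, S) = R - S, with resistance R and load effect S treated as random variables, can be estimated by Monte Carlo sampling:

```python
import random

def monte_carlo_pf(n_samples: int = 100_000, seed: int = 42) -> float:
    """Estimate P(failure) = P(g <= 0) for the limit state g = R - S,
    with hypothetical distributions R ~ N(10, 1) and S ~ N(7, 1)."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        r = rng.gauss(10.0, 1.0)  # resistance (hypothetical)
        s = rng.gauss(7.0, 1.0)   # load effect (hypothetical)
        if r - s <= 0.0:          # failure event
            failures += 1
    return failures / n_samples
```

For these distributions the exact answer is Phi(-3/sqrt(2)), roughly 0.017; the advanced methods surveyed in the paper exist precisely because brute-force sampling like this becomes prohibitively expensive for small failure probabilities and many interacting failure modes.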

  1. Reliability of an interactive computer program for advance care planning.

    PubMed

    Schubart, Jane R; Levi, Benjamin H; Camacho, Fabian; Whitehead, Megan; Farace, Elana; Green, Michael J

    2012-06-01

    Despite widespread efforts to promote advance directives (ADs), completion rates remain low. Making Your Wishes Known: Planning Your Medical Future (MYWK) is an interactive computer program that guides individuals through the process of advance care planning, explains health conditions and interventions that commonly involve life-or-death decisions, helps them articulate their values/goals, and translates users' preferences into a detailed AD document. The purpose of this study was to demonstrate that (in the absence of major life changes) the AD generated by MYWK reliably reflects an individual's values/preferences. English speakers ≥30 years old completed MYWK twice, 4 to 6 weeks apart. Reliability indices were assessed for three AD components: General Wishes; Specific Wishes for treatment; and Quality-of-Life values (QoL). Twenty-four participants completed the study. Both the Specific Wishes and QoL scales had high internal consistency in both time periods (Kuder-Richardson formula 20 [KR-20]=0.83-0.95, and 0.86-0.89). Test-retest reliability was perfect for General Wishes (κ=1), high for QoL (Pearson's correlation coefficient=0.83), but lower for Specific Wishes (Pearson's correlation coefficient=0.57). MYWK generates an AD where General Wishes and QoL (but not Specific Wishes) statements remain consistent over time.
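The KR-20 statistic reported here measures internal consistency for dichotomous (0/1) items. A minimal computation sketch, using made-up response data rather than the study's:

```python
def kr20(responses):
    """KR-20 for a matrix of 0/1 item responses (rows = respondents):
    KR-20 = k/(k-1) * (1 - sum(p_i * q_i) / var(total scores))."""
    n = len(responses)
    k = len(responses[0])
    totals = [sum(row) for row in responses]
    mean_t = sum(totals) / n
    var_t = sum((t - mean_t) ** 2 for t in totals) / n  # population variance
    pq = sum(
        (p := sum(row[i] for row in responses) / n) * (1 - p)
        for i in range(k)
    )
    return (k / (k - 1)) * (1 - pq / var_t)

# hypothetical responses: 5 respondents, 4 items
data = [
    [1, 1, 1, 1],
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 0, 0],
    [0, 0, 0, 0],
]
print(kr20(data))  # ~0.8
```

Values near 1 indicate that the items consistently measure the same construct, which is how the 0.83-0.95 range above should be read.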

  2. Linear-drive cryocoolers for the Department of Defense standard advanced dewar assembly (SADA)

    NASA Astrophysics Data System (ADS)

    Tate, Garin S.

    2005-05-01

    The Standard Advanced Dewar Assembly (SADA) is the critical module in the Department of Defense (DoD) standardization of scanning second-generation thermal imaging systems. The DoD has established a family of SADAs to fulfill a range of performance requirements for various platforms. The SADA consists of the Infrared Focal Plane Array (IRFPA), Dewar, Command & Control Electronics (C&CE), and the cryogenic cooler, and is used in platforms such as the Apache helicopter, the M1A2 Abrams main battle tank, the M2 Bradley Infantry Fighting Vehicle, and the Javelin Command Launch Unit (CLU). In support of the family of SADAs, the DoD defined a complementary family of tactical linear drive cryocoolers. The Stirling cycle linear drive cryocoolers are utilized to cool the Infrared Focal Plane Arrays (IRFPAs) in the SADAs. These coolers are required to have low input power, a quick cool-down time, low vibration output, low audible noise, and a higher reliability than currently fielded rotary coolers. These coolers must also operate in a military environment with its inherent high vibration level and temperature extremes. This paper will (1) outline the characteristics of each cryocooler, (2) present the status and results of qualification tests, (3) present the status of production efforts, and (4) present the status of efforts to increase linear drive cooler reliability.

  3. Lessons learned from U.S. Department of Defense 911-Bio Advanced Concept Technology Demonstrations.

    SciTech Connect

    Baldwin, T.; Gasper, W.; Lacher, L.; Newsom, D.; Yantosik, G.

    1999-07-06

    The US Department of Defense (DoD), in cooperation with other federal agencies, has taken many initiatives to improve its ability to support civilian response to a domestic biological terrorism incident. This paper discusses one initiative, the 911-Bio Advanced Concept Technology Demonstrations (ACTDs), conducted by the Office of the Secretary of Defense during 1997 to better understand: (1) the capability of newly developed chemical and biological collection and identification technologies in a field environment; (2) the ability of specialized DoD response teams to use these new technologies within the structure of cooperating DoD and civilian consequence management organizations; and (3) the adequacy of current modeling tools for predicting the dispersal of biological hazards. This paper discusses the experience of the ACTDs from the civilian community support perspective. The 911-Bio ACTD project provided a valuable opportunity for DoD and civilian officials to learn how they should use their combined capabilities to manage the aftermath of a domestic biological terrorism incident.

  4. Consensus statement on advancing research in emergency department operations and its impact on patient care.

    PubMed

    Yiadom, Maame Yaa A B; Ward, Michael J; Chang, Anna Marie; Pines, Jesse M; Jouriles, Nick; Yealy, Donald M

    2015-06-01

    The consensus conference on "Advancing Research in Emergency Department (ED) Operations and Its Impact on Patient Care," hosted by The ED Operations Study Group (EDOSG), convened to craft a framework for future investigations in this important but understudied area. The EDOSG is a research consortium dedicated to promoting evidence-based clinical practice in emergency medicine. The consensus process format was a modified version of the NIH Model for Consensus Conference Development. Recommendations provide an action plan for how to improve ED operations study design, create a facilitating research environment, identify data measures of value for process and outcomes research, and disseminate new knowledge in this area. Specifically, we call for eight key initiatives: 1) the development of universal measures for ED patient care processes; 2) attention to patient outcomes, in addition to process efficiency and best practice compliance; 3) the promotion of multisite clinical operations studies to create more generalizable knowledge; 4) encouraging the use of mixed methods to understand the social community and human behavior factors that influence ED operations; 5) the creation of robust ED operations research registries to drive stronger evidence-based research; 6) prioritizing key clinical questions with the input of patients, clinicians, medical leadership, emergency medicine organizations, payers, and other government stakeholders; 7) more consistently defining the functional components of the ED care system, including observation units, fast tracks, waiting rooms, laboratories, and radiology subunits; and 8) maximizing multidisciplinary knowledge dissemination via emergency medicine, public health, general medicine, operations research, and nontraditional publications.

  5. Using advanced computer vision algorithms on small mobile robots

    NASA Astrophysics Data System (ADS)

    Kogut, G.; Birchmore, F.; Biagtan Pacis, E.; Everett, H. R.

    2006-05-01

    The Technology Transfer project employs a spiral development process to enhance the functionality and autonomy of mobile robot systems in the Joint Robotics Program (JRP) Robotic Systems Pool by converging existing component technologies onto a transition platform for optimization. An example of this approach is the implementation of advanced computer vision algorithms on small mobile robots. We demonstrate the implementation and testing of the following two algorithms useful on mobile robots: 1) object classification using a boosted Cascade of classifiers trained with the Adaboost training algorithm, and 2) human presence detection from a moving platform. Object classification is performed with an Adaboost training system developed at the University of California, San Diego (UCSD) Computer Vision Lab. This classification algorithm has been used to successfully detect the license plates of automobiles in motion in real-time. While working towards a solution to increase the robustness of this system to perform generic object recognition, this paper demonstrates an extension to this application by detecting soda cans in a cluttered indoor environment. The human presence detection from a moving platform system uses a data fusion algorithm which combines results from a scanning laser and a thermal imager. The system is able to detect the presence of humans while both the humans and the robot are moving simultaneously. In both systems, the two aforementioned algorithms were implemented on embedded hardware and optimized for use in real-time. Test results are shown for a variety of environments.
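The AdaBoost training algorithm referenced above reweights training examples so that each new weak learner concentrates on the mistakes of its predecessors. A self-contained sketch with one-dimensional decision stumps (illustrative only; the UCSD system trains a cascade of image-feature classifiers, not stumps on scalars):

```python
import math

def train_stump(X, y, w):
    """Find the (error, threshold, polarity) stump minimizing weighted error."""
    best = None
    for t in sorted(set(X)):
        for pol in (1, -1):
            preds = [pol if x >= t else -pol for x in X]
            err = sum(wi for wi, p, yi in zip(w, preds, y) if p != yi)
            if best is None or err < best[0]:
                best = (err, t, pol)
    return best

def adaboost(X, y, rounds=5):
    """Train an ensemble of weighted stumps on labels in {-1, +1}."""
    n = len(X)
    w = [1.0 / n] * n
    ensemble = []
    for _ in range(rounds):
        err, t, pol = train_stump(X, y, w)
        err = max(err, 1e-10)                     # avoid log(inf)
        alpha = 0.5 * math.log((1 - err) / err)   # learner weight
        ensemble.append((alpha, t, pol))
        # upweight misclassified examples, downweight correct ones
        for i in range(n):
            p = pol if X[i] >= t else -pol
            w[i] *= math.exp(-alpha * y[i] * p)
        s = sum(w)
        w = [wi / s for wi in w]
    return ensemble

def predict(ensemble, x):
    score = sum(a * (pol if x >= t else -pol) for a, t, pol in ensemble)
    return 1 if score >= 0 else -1
```

In the cascade arrangement the paper describes, many such boosted classifiers are chained so that easy negatives are rejected by cheap early stages, which is what makes real-time detection feasible.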

  6. Advances in the computational study of language acquisition.

    PubMed

    Brent, M R

    1996-01-01

    This paper provides a tutorial introduction to computational studies of how children learn their native languages. Its aim is to make recent advances accessible to the broader research community, and to place them in the context of current theoretical issues. The first section locates computational studies and behavioral studies within a common theoretical framework. The next two sections review two papers that appear in this volume: one on learning the meanings of words and one on learning the sounds of words. The following section highlights an idea which emerges independently in these two papers and which I have dubbed autonomous bootstrapping. Classical bootstrapping hypotheses propose that children begin to get a toehold in a particular linguistic domain, such as syntax, by exploiting information from another domain, such as semantics. Autonomous bootstrapping complements the cross-domain acquisition strategies of classical bootstrapping with strategies that apply within a single domain. Autonomous bootstrapping strategies work by representing partial and/or uncertain linguistic knowledge and using it to analyze the input. The next two sections review two more contributions to this special issue: one on learning word meanings via selectional preferences and one on algorithms for setting grammatical parameters. The final section suggests directions for future research.

  7. 75 FR 68396 - Privacy Act of 1974, as Amended; Computer Matching Program (SSA/Department of Labor (DOL))-Match...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-05

    ... ADMINISTRATION Privacy Act of 1974, as Amended; Computer Matching Program (SSA/ Department of Labor (DOL))--Match... of the Privacy Act, as amended, this notice announces a renewal of an existing computer matching... INFORMATION: A. General The Computer Matching and Privacy Protection Act of 1988 (Pub. L. 100-503),...

  8. 77 FR 24756 - Privacy Act of 1974, as Amended; Computer Matching Program (SSA/Department of Labor (DOL))-Match...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-25

    ... ADMINISTRATION Privacy Act of 1974, as Amended; Computer Matching Program (SSA/ Department of Labor (DOL))--Match... Privacy Act, as amended, this notice announces a renewal of an existing computer matching program that we... Counsel, SSA, as shown above. SUPPLEMENTARY INFORMATION: A. General The Computer Matching and...

  9. 76 FR 39119 - Privacy Act of 1974; Notice of a Computer Matching Program Between the Department of Housing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-05

    ... URBAN DEVELOPMENT Privacy Act of 1974; Notice of a Computer Matching Program Between the Department of... ED. SUMMARY: In accordance with the Privacy Act of 1974 (5 U.S.C. 552a), as amended by the Computer... Program In accordance with the Computer Matching and Privacy Protection Act of 1988 (Pub. L. 100-503),...

  10. 77 FR 32709 - Privacy Act of 1974, as Amended; Computer Matching Program (SSA/Department of Homeland Security...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-01

    ... ADMINISTRATION Privacy Act of 1974, as Amended; Computer Matching Program (SSA/ Department of Homeland Security... the provisions of the Privacy Act, as amended, this notice announces a renewal of an existing computer.... SUPPLEMENTARY INFORMATION: A. General The Computer Matching and Privacy Protection Act of 1988 (Public Law...

  11. 77 FR 24757 - Privacy Act of 1974, as Amended; Computer Matching Program (SSA/Department of Labor (DOL))-Match...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-25

    ... ADMINISTRATION Privacy Act of 1974, as Amended; Computer Matching Program (SSA/ Department of Labor (DOL))--Match.... SUPPLEMENTARY INFORMATION: A. General The Computer Matching and Privacy Protection Act of 1988 (Public Law (Pub.... The Privacy Act, as amended, regulates the use of computer matching by Federal agencies when...

  12. 75 FR 67755 - Privacy Act of 1974; Notice of a Computer Matching Program Between the Department of Housing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-03

    ... URBAN DEVELOPMENT Privacy Act of 1974; Notice of a Computer Matching Program Between the Department of... the Computer Matching and Privacy Protection Act of 1988 (Pub. L. 100-503), and the Office of... accordance with the Computer Matching and Privacy Protection Act of 1988 (Pub. L. 100-503), as amended,...

  13. 78 FR 12128 - Privacy Act of 1974; Computer Matching Program (SSA/Department of the Treasury, Internal Revenue...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-21

    ... ADMINISTRATION Privacy Act of 1974; Computer Matching Program (SSA/Department of the Treasury, Internal Revenue... Privacy Act, (5 U.S.C. 552a), this notice announces a renewal of an existing computer matching program..., as shown above. SUPPLEMENTARY INFORMATION: A. General The Computer Matching and Privacy...

  14. 75 FR 9012 - Privacy Act of 1974, as Amended; Computer Matching Program (SSA/U.S. Department of Health and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-26

    ... ADMINISTRATION Privacy Act of 1974, as Amended; Computer Matching Program (SSA/ U.S. Department of Health and... existing computer matching program that is scheduled to expire on March 19, 2010. SUMMARY: In accordance... computer matching program that we are currently conducting with OCSE. DATES: We will file a report of...

  15. 75 FR 51154 - Privacy Act of 1974, as Amended; Computer Matching Program (SSA/Department of the Treasury...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-18

    ... ADMINISTRATION Privacy Act of 1974, as Amended; Computer Matching Program (SSA/ Department of the Treasury... of the Privacy Act, as amended, this notice announces a renewal of an existing computer matching... INFORMATION: A. General The Computer Matching and Privacy Protection Act of 1988 (Pub. L. 100-503),...

  16. 75 FR 7648 - Privacy Act of 1974, as Amended; Computer Matching Program (SSA/Department of Veterans Affairs...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-22

    ... ADMINISTRATION Privacy Act of 1974, as Amended; Computer Matching Program (SSA/ Department of Veterans Affairs... INFORMATION: A. General The Computer Matching and Privacy Protection Act of 1988 (Public Law (Pub. L.) 100-503), amended the Privacy Act (5 U.S.C. 552a) by describing the conditions under which computer...

  17. Feasibility of Tablet Computer Screening for Opioid Abuse in the Emergency Department

    PubMed Central

    Weiner, Scott G.; Horton, Laura C.; Green, Traci C.; Butler, Stephen F.

    2015-01-01

    Introduction Tablet computer-based screening may have the potential for detecting patients at risk for opioid abuse in the emergency department (ED). Study objectives were a) to determine if the revised Screener and Opioid Assessment for Patients with Pain (SOAPP®-R), a 24-question previously paper-based screening tool for opioid abuse potential, could be administered on a tablet computer to an ED patient population; b) to demonstrate that >90% of patients can complete the electronic screener without assistance in <5 minutes and; c) to determine patient ease of use with screening on a tablet computer. Methods This was a cross-sectional convenience sample study of patients seen in an urban academic ED. SOAPP®-R was programmed on a tablet computer by study investigators. Inclusion criteria were patients ages ≥18 years who were being considered for discharge with a prescription for an opioid analgesic. Exclusion criteria included inability to understand English or physical disability preventing use of the tablet. Results 93 patients were approached for inclusion and 82 (88%) provided consent. Fifty-two percent (n=43) of subjects were male; 46% (n=38) of subjects were between 18–35 years, and 54% (n=44) were >35 years. One hundred percent of subjects completed the screener. Median time to completion was 148 (interquartile range 117.5–184.3) seconds, and 95% (n=78) completed in <5 minutes. 93% (n=76) rated ease of completion as very easy. Conclusions It is feasible to administer a screening tool to a cohort of ED patients on a tablet computer. The screener administration time is minimal and patient ease of use with this modality is high. PMID:25671003
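The median and interquartile range reported above are straightforward descriptive statistics; a sketch with Python's statistics module and hypothetical completion times (not the study's raw data):

```python
import statistics

# hypothetical completion times in seconds (not the study's raw data)
times = [100, 120, 148, 180, 200]

median = statistics.median(times)
q1, q2, q3 = statistics.quantiles(times, n=4)  # quartiles, exclusive method
under_5_min = sum(t < 300 for t in times) / len(times)

print(median)       # 148
print(q3 - q1)      # interquartile range: 80.0
print(under_5_min)  # fraction completing in under 5 minutes: 1.0
```

Reporting median with IQR, as the study does, is the conventional choice when completion times are skewed rather than normally distributed.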

  18. University programs of the U.S. Department of Energy advanced accelerator applications program

    SciTech Connect

    Beller, D. E.; Ward, T. E.; Bresee, J. C.

    2001-01-01

    The Advanced Accelerator Applications (AAA) Program was initiated in fiscal year 2001 (FY-01) by the U.S. Congress, the U.S. Department of Energy (DOE), and the Los Alamos National Laboratory (LANL) in partnership with other national laboratories. The primary goal of this program is to investigate the feasibility of transmutation of nuclear waste. An Accelerator-Driven Test Facility (ADTF), which may be built during the first decade of the 21st Century, is a major component of this effort. The ADTF would include a large, state-of-the-art charged-particle accelerator, proton-neutron target systems, and accelerator-driven R&D systems. This new facility and its underlying science and technology will require a large cadre of educated scientists and trained technicians. In addition, other applications of nuclear science and engineering (e.g., proliferation monitoring and defense, nuclear medicine, safety regulation, industrial processes, and many others) require increased academic and national infrastructure and student populations. Thus, the AAA Program Office has begun a multi-year program to involve university faculty and students in various phases of the Project to support the infrastructure requirements of nuclear energy, science and technology fields as well as the special needs of the DOE transmutation program. In this paper we describe university programs that have supported, are supporting, and will support the R&D necessary for the AAA Project. Previous work included research for the Accelerator Transmutation of Waste (ATW) project, current (FY-01) programs include graduate fellowships and research for the AAA Project, and it is expected that future programs will expand and add to the existing programs.

  19. Quantitative Computed Tomography and Image Analysis for Advanced Muscle Assessment

    PubMed Central

    Edmunds, Kyle Joseph; Gíslason, Magnus K.; Arnadottir, Iris D.; Marcante, Andrea; Piccione, Francesco; Gargiulo, Paolo

    2016-01-01

    Medical imaging is of particular interest in the field of translational myology, as extant literature describes the utilization of a wide variety of techniques to non-invasively recapitulate and quantify various internal and external tissue morphologies. In the clinical context, medical imaging remains a vital tool for diagnostics and investigative assessment. This review outlines the results from several investigations on the use of computed tomography (CT) and image analysis techniques to assess muscle conditions and degenerative processes due to aging or pathological conditions. Herein, we detail the acquisition of spiral CT images and the use of advanced image analysis tools to characterize muscles in 2D and 3D. Results from these studies recapitulate changes in tissue composition within muscles, as visualized by the association of tissue types to specified Hounsfield Unit (HU) values for fat, loose connective tissue or atrophic muscle, and normal muscle, including fascia and tendon. We show how results from these analyses can be presented as both average HU values and compositions with respect to total muscle volumes, demonstrating the reliability of these tools to monitor, assess and characterize muscle degeneration. PMID:27478562
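Associating tissue types with HU ranges and reporting composition fractions can be sketched as a simple binning step. The ranges below are illustrative placeholders, not the calibration used in the reviewed studies:

```python
# Illustrative HU ranges only; real studies calibrate these thresholds.
TISSUE_BINS = {
    "fat": (-200, -10),
    "loose_connective_or_atrophic": (-9, 40),
    "normal_muscle": (41, 200),
}

def hu_composition(voxels):
    """Classify each voxel's HU value into a tissue bin and return
    each tissue's fraction of the total muscle volume."""
    counts = {name: 0 for name in TISSUE_BINS}
    for hu in voxels:
        for name, (lo, hi) in TISSUE_BINS.items():
            if lo <= hu <= hi:
                counts[name] += 1
                break
    total = len(voxels)
    return {name: c / total for name, c in counts.items()}

# toy example: 2 fat-range, 3 connective-range, 5 muscle-range voxels
comp = hu_composition([-100, -100, 10, 10, 10, 60, 60, 60, 60, 60])
print(comp)  # fractions: fat 0.2, connective/atrophic 0.3, muscle 0.5
```

The same per-voxel classification, applied slice by slice across a 3D CT volume, yields the composition-versus-total-volume summaries the review describes.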

  20. Attitudes Toward Computer Interventions for Partner Abuse and Drug Use Among Women in the Emergency Department

    PubMed Central

    Choo, Esther; Ranney, Megan; Wetle, Terrie; Morrow, Kathleen; Mello, Michael; Squires, Daniel; Tapé, Chantal; Garro, Aris; Zlotnick, Caron

    2014-01-01

    Background Drug use and partner abuse often coexist among women presenting to the emergency department (ED). Technology offers one solution to the limited time and expertise available to address these problems. Aims The aims of this study were to explore women's attitudes about use of computers for screening and intervening in drug use and partner abuse. Methods Seventeen adult women with recent histories of partner abuse and drug use were recruited from an urban ED to participate in one-on-one semi-structured interviews. A coding classification scheme was developed and applied to the transcripts by two independent coders. The research team collaboratively decided upon a thematic framework and selected illustrative quotes. Results Most participants used computers and/or mobile phones frequently and reported high self-efficacy with them. Women described emotional difficulty and shame around partner abuse experiences and drug use; however, they felt that reporting drug use and partner abuse was easier and safer through a computer than face-to-face with a person, and that advice from a computer about drug use or partner abuse was acceptable and accessible. Some had very positive experiences completing screening assessments. However, participants were skeptical of a computer’s ability to give empathy, emotional support or meaningful feedback. The ED was felt to be an appropriate venue for such programs, as long as they were private and did not supersede clinical care. Conclusions Women with partner abuse and drug use histories were receptive to computerized screening and advice, while still expressing a need for the empathy and compassion of a human interaction within an intervention. PMID:26167133

  1. Advanced computational sensors technology: testing and evaluation in visible, SWIR, and LWIR imaging

    NASA Astrophysics Data System (ADS)

    Rizk, Charbel G.; Wilson, John P.; Pouliquen, Philippe

    2015-05-01

    The Advanced Computational Sensors Team at the Johns Hopkins University Applied Physics Laboratory and the Johns Hopkins University Department of Electrical and Computer Engineering has been developing advanced readout integrated circuit (ROIC) technology for more than 10 years with a particular focus on the key challenges of dynamic range, sampling rate, system interface and bandwidth, and detector materials or band dependencies. Because the pixel array offers parallel sampling by default, the team successfully demonstrated that adding smarts in the pixel and the chip can increase performance significantly. Each pixel becomes a smart sensor and can operate independently in collecting, processing, and sharing data. In addition, building on the digital circuit revolution, the effective well size can be increased by orders of magnitude within the same pixel pitch over analog designs. This research has yielded an innovative class of a system-on-chip concept: the Flexible Readout and Integration Sensor (FRIS) architecture. All key parameters are programmable and/or can be adjusted dynamically, and this architecture can potentially be sensor and application agnostic. This paper reports on the testing and evaluation of one prototype that can support either detector polarity and includes sample results with visible, short-wavelength infrared (SWIR), and long-wavelength infrared (LWIR) imaging.

  2. Annual Report and Abstracts of Research of the Department of Computer and Information Science, July 1976-June 1977.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. Computer and Information Science Research Center.

    The annual report of the Department of Computer and Information Science includes abstracts of research carried out during the 1976-77 academic year with support from grants by governmental agencies and industry, as well as The Ohio State University. The report covers the department's organizational structure, objectives, highlights of department…

  3. 78 FR 47336 - Privacy Act of 1974; Computer Matching Program Between the Department of Housing and Urban...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-05

    ... URBAN DEVELOPMENT Privacy Act of 1974; Computer Matching Program Between the Department of Housing and Urban Development (HUD) and the Department of Health and Human Services (HHS): Matching Tenant Data in... attained by HUD through the matching program. The most recent renewal of the current matching...

  4. Advanced Scientific Computing Research Network Requirements: ASCR Network Requirements Review Final Report

    SciTech Connect

    Bacon, Charles; Bell, Greg; Canon, Shane; Dart, Eli; Dattoria, Vince; Goodwin, Dave; Lee, Jason; Hicks, Susan; Holohan, Ed; Klasky, Scott; Lauzon, Carolyn; Rogers, Jim; Shipman, Galen; Skinner, David; Tierney, Brian

    2013-03-08

    The Energy Sciences Network (ESnet) is the primary provider of network connectivity for the U.S. Department of Energy (DOE) Office of Science (SC), the single largest supporter of basic research in the physical sciences in the United States. In support of SC programs, ESnet regularly updates and refreshes its understanding of the networking requirements of the instruments, facilities, scientists, and science programs that it serves. This focus has helped ESnet to be a highly successful enabler of scientific discovery for over 25 years. In October 2012, ESnet and the Office of Advanced Scientific Computing Research (ASCR) of the DOE SC organized a review to characterize the networking requirements of the programs funded by the ASCR program office. The requirements identified at the review are summarized in the Findings section, and are described in more detail in the body of the report.

  5. Recent advances in data assimilation in computational geodynamic models

    NASA Astrophysics Data System (ADS)

    Ismail-Zadeh, Alik

    2010-05-01

    The QRV method was most recently introduced in geodynamic modelling (Ismail-Zadeh et al., 2007, 2008; Tantsyrev, 2008; Glisovic et al., 2009). The advances in computational geodynamics and in data assimilation are attracting the interest of the community dealing with lithosphere, mantle and core dynamics.

  6. Usability Characteristics of Self-Administered Computer-Assisted Interviewing in the Emergency Department

    PubMed Central

    Herrick, D. B.; Nakhasi, A.; Nelson, B.; Rice, S.; Abbott, P. A.; Saber Tehrani, A. S.; Rothman, R. E.; Lehmann, H. P.; Newman-Toker, D. E.

    2013-01-01

    Objective Self-administered computer-assisted interviewing (SACAI) gathers accurate information from patients and could facilitate Emergency Department (ED) diagnosis. As part of an ongoing research effort whose long-range goal is to develop automated medical interviewing for diagnostic decision support, we explored usability attributes of SACAI in the ED. Methods Cross-sectional study at two urban, academic EDs. Convenience sample recruited daily over six weeks. Adult, non-level I trauma patients were eligible. We collected data on ease of use (self-reported difficulty, researcher documented need for help), efficiency (mean time-per-click on a standardized interview segment), and error (self-report age mismatched with age derived from electronic health records) when using SACAI on three different instruments: Elo TouchSystems ESY15A2 (finger touch), Toshiba M200 (with digitizer pen), and Motion C5 (with digitizer pen). We calculated descriptive statistics and used regression analysis to evaluate the impact of patient and computer factors on time-per-click. Results 841 participants completed all SACAI questions. Few (<1%) thought using the touch computer to ascertain medical information was difficult. Most (86%) required no assistance. Participants needing help were older (54 ± 19 vs. 40 ± 15 years, p<0.001) and more often lacked internet at home (13.4% vs. 7.3%, p = 0.004). On multivariate analysis, female sex (p<0.001), White (p<0.001) and other (p = 0.05) race (vs. Black race), younger age (p<0.001), internet access at home (p<0.001), high school graduation (p = 0.04), and touch screen entry (vs. digitizer pen) (p = 0.01) were independent predictors of decreased time-per-click. Participant misclick errors were infrequent, but, in our sample, occurred only during interviews using a digitizer pen rather than a finger touch-screen interface (1.9% vs. 0%, p = 0.09). Discussion Our results support the facility of interactions between ED patients and SACAI
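
    The study's efficiency outcome (time-per-click) was analyzed with multivariate regression. As an illustration only, the sketch below fits a univariate ordinary-least-squares line; the age and seconds-per-click values are hypothetical, not the study's data, and the real analysis involved several predictors.

```python
def simple_ols(xs, ys):
    # Ordinary least squares fit y = a + b*x -- a univariate stand-in
    # for the study's multivariate time-per-click regression.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

# Hypothetical (age, seconds-per-click) pairs; older age -> slower clicks:
ages = [25, 35, 45, 55, 65]
tpc = [2.0, 2.4, 2.9, 3.3, 3.8]
a, b = simple_ols(ages, tpc)
print(f"intercept = {a:.3f} s, slope = {b:.3f} s/click per year of age")
```

    A positive slope here would mirror the study's finding that younger age independently predicted faster interaction.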

  7. The Design and Transfer of Advanced Command and Control (C2) Computer-Based Systems

    DTIC Science & Technology

    1980-03-31

    TECHNICAL REPORT 80-02 QUARTERLY TECHNICAL REPORT: THE DESIGN AND TRANSFER OF ADVANCED COMMAND AND CONTROL (C2) COMPUTER-BASED SYSTEMS ARPA…The Tasks/Objectives and/or Purposes of the overall project are connected with the design, development, demonstration and transfer of advanced command and control (C2) computer-based systems; this report covers work in the computer-based design and transfer areas only. The Technical Problems thus

  8. First 3 years of operation of RIACS (Research Institute for Advanced Computer Science) (1983-1985)

    NASA Technical Reports Server (NTRS)

    Denning, P. J.

    1986-01-01

    The focus of the Research Institute for Advanced Computer Science (RIACS) is to explore matches between advanced computing architectures and the processes of scientific research. An architecture evaluation of the MIT static dataflow machine, specification of a graphical language for expressing distributed computations, and specification of an expert system for aiding in grid generation for two-dimensional flow problems was initiated. Research projects for 1984 and 1985 are summarized.

  9. Advanced Simulation and Computing: A Summary Report to the Director's Review

    SciTech Connect

    McCoy, M G; Peck, T

    2003-06-01

    It has now been three years since the Advanced Simulation and Computing Program (ASCI), as managed by the Defense and Nuclear Technologies (DNT) Directorate, was reviewed by this Director's Review Committee (DRC). Since that time, there has been considerable progress for all components of the ASCI Program, and these developments will be highlighted in this document and in the presentations planned for June 9 and 10, 2003. There have also been some name changes. Today, the Program is called ''Advanced Simulation and Computing.'' Although it retains the familiar acronym ASCI, the initiative nature of the effort has given way to sustained services as an integral part of the Stockpile Stewardship Program (SSP). All computing efforts at LLNL and the other two Defense Program (DP) laboratories are funded and managed under ASCI. This includes the so-called legacy codes, which remain essential tools in stockpile stewardship. The contract between the Department of Energy (DOE) and the University of California (UC) specifies an independent appraisal of Directorate technical work and programmatic management; that appraisal is the work of this DNT Review Committee. Beginning this year, the Laboratory is implementing a new review system. This process was negotiated between UC, the National Nuclear Security Administration (NNSA), and the Laboratory Directors. Central to this approach are eight performance objectives that focus on key programmatic and administrative goals. Associated with each of these objectives are a number of performance measures to more clearly characterize the attainment of the objectives. Each performance measure has a lead directorate and one or more contributing directorates. Each measure has an evaluation plan and has identified expected documentation to be included in the ''Assessment File''.

  10. Multidetector computed tomography in the evaluation of pediatric acute abdominal pain in the emergency department.

    PubMed

    Lin, Wei-Ching; Lin, Chien-Heng

    2016-06-01

    The accurate diagnosis of pediatric acute abdominal pain is one of the most challenging tasks in the emergency department (ED) due to its unclear clinical presentation and non-specific findings in physical examinations, laboratory data, and plain radiographs. The objective of this study was to evaluate the impact of abdominal multidetector computed tomography (MDCT) performed in the ED on pediatric patients presenting with acute abdominal pain. A retrospective chart review of children aged <18 years with acute abdominal pain who visited the emergency department and underwent MDCT between September 2004 and June 2007 was conducted. Patients with a history of trauma were excluded. A total of 156 patients with acute abdominal pain (85 males and 71 females, age 1-17 years; mean age 10.9 ± 4.6 years) who underwent abdominal MDCT in the pediatric ED during this 3-year period were enrolled in the study. One hundred and eighteen patients with suspected appendicitis underwent abdominal MDCT. Sixty four (54.2%) of them had appendicitis, which was proven by histopathology. The sensitivity of abdominal MDCT for appendicitis was found to be 98.5% and the specificity was 84.9%. In this study, the other two common causes of nontraumatic abdominal emergencies were gastrointestinal tract (GI) infections and ovarian cysts. The most common etiology of abdominal pain in children that requires imaging with abdominal MDCT is appendicitis. MDCT has become a preferred and invaluable imaging modality in evaluating uncertain cases of pediatric acute abdominal pain in ED, in particular for suspected appendicitis, neoplasms, and gastrointestinal abnormalities.
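
    The reported sensitivity and specificity follow from the standard confusion-matrix formulas. The sketch below shows the computation; the 2×2 counts are hypothetical values chosen only to be consistent with the reported 98.5%/84.9% figures and the 118 suspected-appendicitis patients, not the study's raw table.

```python
def sensitivity(tp, fn):
    # True-positive rate: TP / (TP + FN)
    return tp / (tp + fn)

def specificity(tn, fp):
    # True-negative rate: TN / (TN + FP)
    return tn / (tn + fp)

# Hypothetical counts for illustration (not the study's published 2x2 table):
tp, fn, tn, fp = 64, 1, 45, 8   # 64 + 1 + 45 + 8 = 118 scanned patients
print(f"sensitivity = {sensitivity(tp, fn):.1%}")  # → sensitivity = 98.5%
print(f"specificity = {specificity(tn, fp):.1%}")  # → specificity = 84.9%
```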

  11. Generalized Advanced Propeller Analysis System (GAPAS). Volume 2: Computer program user manual

    NASA Technical Reports Server (NTRS)

    Glatt, L.; Crawford, D. R.; Kosmatka, J. B.; Swigart, R. J.; Wong, E. W.

    1986-01-01

    The Generalized Advanced Propeller Analysis System (GAPAS) computer code is described. GAPAS was developed to analyze advanced technology multi-bladed propellers which operate on aircraft with speeds up to Mach 0.8 and altitudes up to 40,000 feet. GAPAS includes technology for analyzing aerodynamic, structural, and acoustic performance of propellers. The computer code was developed for the CDC 7600 computer and is currently available for industrial use on the NASA Langley computer. A description of all the analytical models incorporated in GAPAS is included. Sample calculations are also described as well as users requirements for modifying the analysis system. Computer system core requirements and running times are also discussed.

  12. Recent advances in computational methods for nuclear magnetic resonance data processing.

    PubMed

    Gao, Xin

    2013-02-01

    Although three-dimensional protein structure determination using nuclear magnetic resonance (NMR) spectroscopy is a computationally costly and tedious process that would benefit from advanced computational techniques, it has not garnered much research attention from specialists in bioinformatics and computational biology. In this paper, we review recent advances in computational methods for NMR protein structure determination. We summarize the advantages of and bottlenecks in the existing methods and outline some open problems in the field. We also discuss current trends in NMR technology development and suggest directions for research on future computational methods for NMR.

  13. Usefulness of Nonenhanced Computed Tomography for Diagnosing Urolithiasis without Pyuria in the Emergency Department.

    PubMed

    Lee, Dong Hoon; Chang, In Ho; Kim, Jin Wook; Chi, Byung Hoon; Park, Sung Bin

    2015-01-01

    We compared the clinical utility of nonenhanced computed tomography (NECT) and intravenous urography (IVU) in patients with classic symptoms of renal colic without evidence of a urine infection. This was a retrospective analysis of IVU and NECT performed in adult patients with suspected renal colic at an emergency department between January 2005 and December 2013. The records of all patients in the NECT and IVU groups were reviewed, and the patients were categorized according to the cause of their symptoms. A total of 2218 patients were enrolled. Of these patients, 1525 (68.8%) underwent IVU and 693 (31.2%) underwent NECT. The patients in the NECT group were older (45.48 ± 14.96 versus 42.37 ± 13.68 years, p < 0.001), had less gross hematuria (7.6 versus 2.9%, p < 0.001), and were admitted more often (18.6 versus 12.0%, p < 0.001) than the patients in the IVU group. Urinary stones were detected in 1413 (63.7%) patients. NECT had a higher detection rate of urolithiasis than IVU (74.0 versus 59.0%, p < 0.001). No significant difference was observed in the incidence of urinary stones greater than 4 mm between groups from the radiologic findings (p = 0.79) or the full medical record review (p = 0.87).
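
    Detection-rate comparisons like the 74.0% vs. 59.0% above are commonly tested with a two-proportion z-test (the abstract does not state which test was used). A minimal sketch follows; the stone counts are back-derived from the reported percentages and group sizes, so they are approximate, not the study's raw figures.

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    # Two-sided two-proportion z-test with a pooled variance estimate.
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF via erf.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Approximate counts reconstructed from 74.0% of 693 and 59.0% of 1525:
z, p = two_proportion_z(513, 693, 900, 1525)
print(f"z = {z:.2f}, p = {p:.3g}")
```

    A z statistic this large yields p < 0.001, consistent with the significance level the study reports.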

  14. Computer architectures for computational physics work done by Computational Research and Technology Branch and Advanced Computational Concepts Group

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Slides are reproduced that describe the importance of having high performance number crunching and graphics capability. They also indicate the types of research and development underway at Ames Research Center to ensure that, in the near term, Ames is a smart buyer and user and, in the long term, that Ames knows the best possible solutions for number crunching and graphics needs. The drivers for this research are real computational physics applications of interest to Ames and NASA. They are concerned with how to map the applications, and how to maximize the physics learned from the results of the calculations. The computer graphics activities are aimed at getting maximum information from the three-dimensional calculations by using the real time manipulation of three-dimensional data on the Silicon Graphics workstation. Work is underway on new algorithms that will permit the display of experimental results that are sparse and random, the same way that the dense and regular computed results are displayed.

  15. U.S. Department of Energy facilities needed to advance nuclear power.

    PubMed

    Ahearne, John F

    2011-01-01

    This talk is based upon a November 2008 report by the U.S. Department of Energy (DOE) Nuclear Energy Advisory Committee (NEAC). The report has two parts, a policy section and a technology section. Here extensive material from the Technical Subcommittee section of the NEAC report is used.

  16. The Advance of Computing from the Ground to the Cloud

    ERIC Educational Resources Information Center

    Breeding, Marshall

    2009-01-01

    A trend toward the abstraction of computing platforms that has been developing in the broader IT arena over the last few years is just beginning to make inroads into the library technology scene. Cloud computing offers for libraries many interesting possibilities that may help reduce technology costs and increase capacity, reliability, and…

  17. Computational Enzyme Design: Advances, hurdles and possible ways forward

    PubMed Central

    Linder, Mats

    2012-01-01

    This mini review addresses recent developments in computational enzyme design. Successful protocols as well as known issues and limitations are discussed from an energetic perspective. It will be argued that improved results can be obtained by including a dynamic treatment in the design protocol. Finally, a molecular dynamics-based approach for evaluating and refining computational designs is presented. PMID:24688650

  18. Advanced building energy management system demonstration for Department of Defense buildings.

    PubMed

    O'Neill, Zheng; Bailey, Trevor; Dong, Bing; Shashanka, Madhusudana; Luo, Dong

    2013-08-01

    This paper presents an advanced building energy management system (aBEMS) that employs advanced methods of whole-building performance monitoring combined with statistical methods of learning and data analysis to enable identification of both gradual and discrete performance erosion and faults. This system assimilated data collected from multiple sources, including blueprints, reduced-order models (ROM) and measurements, and employed advanced statistical learning algorithms to identify patterns of anomalies. The results were presented graphically in a manner understandable to facilities managers. A demonstration of aBEMS was conducted in buildings at Naval Station Great Lakes. The facility building management systems were extended to incorporate the energy diagnostics and analysis algorithms, producing systematic identification of more efficient operation strategies. At Naval Station Great Lakes, greater than 20% savings were demonstrated for building energy consumption by improving facility manager decision support to diagnose energy faults and prioritize alternative, energy-efficient operation strategies. The paper concludes with recommendations for widespread aBEMS success.
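
    The abstract does not specify which statistical learning algorithms aBEMS uses. As a hedged illustration of the simplest kind of discrete-fault detection such a system could build on, the sketch below flags meter readings far from the mean (a z-score check; the 2-standard-deviation threshold and the kWh values are illustrative choices, not the paper's method or data).

```python
def zscore_anomalies(readings, threshold=2.0):
    # Flag readings more than `threshold` standard deviations from the
    # mean. Threshold of 2.0 chosen here because a single large outlier
    # inflates the standard deviation in a small sample.
    n = len(readings)
    mean = sum(readings) / n
    var = sum((x - mean) ** 2 for x in readings) / n
    std = var ** 0.5
    return [i for i, x in enumerate(readings)
            if abs(x - mean) > threshold * std]

# Hypothetical hourly kWh readings with one discrete fault at index 5:
readings = [10.1, 9.8, 10.3, 10.0, 9.9, 25.0, 10.2, 10.1]
print(zscore_anomalies(readings))  # → [5]
```

    Gradual performance erosion, also mentioned above, would instead require trend analysis against a baseline model rather than pointwise outlier checks.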

  19. A Computational Methodology for Simulating Thermal Loss Testing of the Advanced Stirling Convertor

    NASA Technical Reports Server (NTRS)

    Reid, Terry V.; Wilson, Scott D.; Schifer, Nicholas A.; Briggs, Maxwell H.

    2012-01-01

    The U.S. Department of Energy (DOE) and Lockheed Martin Space Systems Company (LMSSC) have been developing the Advanced Stirling Radioisotope Generator (ASRG) for use as a power system for space science missions. This generator would use two highefficiency Advanced Stirling Convertors (ASCs), developed by Sunpower Inc. and NASA Glenn Research Center (GRC). The ASCs convert thermal energy from a radioisotope heat source into electricity. As part of ground testing of these ASCs, different operating conditions are used to simulate expected mission conditions. These conditions require achieving a particular operating frequency, hot end and cold end temperatures, and specified electrical power output for a given net heat input. In an effort to improve net heat input predictions, numerous tasks have been performed which provided a more accurate value for net heat input into the ASCs, including the use of multidimensional numerical models. Validation test hardware has also been used to provide a direct comparison of numerical results and validate the multi-dimensional numerical models used to predict convertor net heat input and efficiency. These validation tests were designed to simulate the temperature profile of an operating Stirling convertor and resulted in a measured net heat input of 244.4 W. The methodology was applied to the multi-dimensional numerical model which resulted in a net heat input of 240.3 W. The computational methodology resulted in a value of net heat input that was 1.7 percent less than that measured during laboratory testing. The resulting computational methodology and results are discussed.

  20. Perception of radiation dose and potential risks of computed tomography in emergency department medical personnel

    PubMed Central

    Lee, Jin Hee; Kim, Kyuseok; Lee, Kyoung Ho; Kim, Kwang Pyo; Kim, Yu Jin; Park, Chanjong; Kang, Changwoo; Lee, Soo Hoon; Jeong, Jin Hee; Rhee, Joong Eui

    2015-01-01

    Objective Use of computed tomography (CT) continues to increase, but the relatively high radiation doses associated with CT have raised health concerns such as future risk of cancer. We investigated the level of awareness regarding radiation doses and possible risks associated with CT in medical personnel (MP). Methods This study was conducted from April to May 2012 and included physicians and nurses who worked in the emergency department of 17 training hospitals. The questionnaire included measurement of the effect of CT or radiography on health using a 10-point numerical rating scale, estimation of the radiation dose of one abdominal CT scan compared with one chest radiograph, and perception of the increased lifetime risk of cancer associated with CT. Results A total of 354 MP participated in this study: 142 nurses, 87 interns, 86 residents, and 39 specialists. Interns were less aware of the effects of CT or radiography on health than other physicians or nurses (mean±SD of 4.8±2.7, 5.9±2.7, 6.1±2.7, and 6.0±2.2 for interns, residents, specialists, and nurses, respectively; P<0.05). There was a significant difference in knowledge about the relative radiation dose of one abdominal CT scan compared with one chest radiograph between physicians and nurses (48.6% vs. 28.9% for physicians vs. nurses, P<0.05). MP perceived an increased risk of cancer from radiation associated with CT. Conclusion MP perceive the risk of radiation associated with CT, but their level of knowledge seems to be insufficient. PMID:27752583

  1. Benefit or burden? A sociotechnical analysis of diagnostic computer kiosks in four California hospital emergency departments.

    PubMed

    Ackerman, Sara L; Tebb, Kathleen; Stein, John C; Frazee, Bradley W; Hendey, Gregory W; Schmidt, Laura A; Gonzales, Ralph

    2012-12-01

    High expectations for new technologies coexist with wide variability in the actual adoption and impact of information technology (IT) projects in clinical settings, and the frequent failure to incorporate otherwise "successful" projects into routine practice. This paper draws on actor-network theory to present an in-depth, sociotechnical analysis of one such project--a computer kiosk designed to diagnose and expedite treatment of urinary tract infections (UTI) in adult women. Research at a hospital urgent care clinic demonstrated the kiosk program's effectiveness at diagnosing UTI and reducing patient wait times, and the kiosk was subsequently adopted by the clinic for routine patient care. However, a study promoting the adoption of the device at emergency departments (ED) was characterized by persistent staff resistance and lower-than-expected patient eligibility for kiosk-assisted care. The device was ultimately abandoned at all but one of the new sites. Observations and interviews with ED staff and the design/research team were conducted at four California EDs between April and July 2011 and point to conflicting understandings of evidence for the device's usefulness and reasons for its (non)adoption. The kiosk program's designers had attempted to "rationalize" medical work by embedding a formal representation of triage practices in the kiosk's software. However, the kiosk's "network" failed to stabilize as it encountered different patient populations, institutional politics, and the complex, pragmatic aspects of ED work at each site. The results of this evaluation challenge the persistent myth that a priori qualities and meanings inhere in technology regardless of context. The design and deployment of new IT projects in complex medical settings would benefit from empirically informed understandings of, and responses to, the contingent properties of human-technology relations.

  2. A Study into Advanced Guidance Laws Using Computational Methods

    DTIC Science & Technology

    2011-12-01

    computing aerodynamic forces and moments. Except where noted, all dimensions in MKS system. Inputs… [9] R. L. Shaw, Fighter Combat: Tactics and Maneuvering. Annapolis, MD: Naval Institute Press, 1988. [10] U. S. Shukla and P. R. Mahapatra

  3. RECENT ADVANCES IN COMPUTATIONAL MECHANICS FOR CIVIL ENGINEERING

    NASA Astrophysics Data System (ADS)

    Applied Mechanics Committee, Computational Mechanics Subcommittee,

    In order to clarify mechanical phenomena in civil engineering, it is necessary to improve computational theory and techniques in light of the particular characteristics of the objects being analyzed, and to update computational mechanics with a focus on practical use. Beyond the analysis of infrastructure, damage prediction for natural disasters such as earthquakes, tsunamis and floods must reflect the broad ranges of space and time scales inherent in civil engineering, as well as material properties, so it is important to develop new computational methods suited to these fields. In this context, this paper reviews research trends in computational mechanics that are noteworthy for resolving complex mechanics problems in civil engineering.

  4. Advances in Domain Mapping of Massively Parallel Scientific Computations

    SciTech Connect

    Leland, Robert W.; Hendrickson, Bruce A.

    2015-10-01

    One of the most important concerns in parallel computing is the proper distribution of workload across processors. For most scientific applications on massively parallel machines, the best approach to this distribution is to employ data parallelism; that is, to break the data structures supporting a computation into pieces and then to assign those pieces to different processors. Collectively, these partitioning and assignment tasks comprise the domain mapping problem.
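
    A minimal illustration of the partitioning-plus-assignment idea is a contiguous block decomposition of an index space. Production domain mappers use graph partitioning to minimize communication, but the interface (items in, per-processor pieces out) is the same; this sketch is an assumption-level simplification, not the authors' algorithm.

```python
def block_partition(n_items, n_procs):
    # Assign item indices 0..n_items-1 to processors in contiguous
    # blocks, spreading any remainder over the first `extra` ranks.
    base, extra = divmod(n_items, n_procs)
    parts, start = [], 0
    for rank in range(n_procs):
        size = base + (1 if rank < extra else 0)
        parts.append(range(start, start + size))
        start += size
    return parts

parts = block_partition(10, 3)
print([list(p) for p in parts])  # → [[0, 1, 2, 3], [4, 5, 6], [7, 8, 9]]
```

    Balanced block sizes address the load-balance half of the problem; minimizing the data exchanged across block boundaries is the harder half that motivates graph-based methods.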

  5. Advances in computational design and analysis of airbreathing propulsion systems

    NASA Technical Reports Server (NTRS)

    Klineberg, John M.

    1989-01-01

    The development of commercial and military aircraft depends, to a large extent, on engine manufacturers being able to achieve significant increases in propulsion capability through improved component aerodynamics, materials, and structures. The recent history of propulsion has been marked by efforts to develop computational techniques that can speed up the propulsion design process and produce superior designs. The availability of powerful supercomputers, such as the NASA Numerical Aerodynamic Simulator, and the potential for even higher performance offered by parallel computer architectures, have opened the door to the use of multi-dimensional simulations to study complex physical phenomena in propulsion systems that have previously defied analysis or experimental observation. An overview of several NASA Lewis research efforts is provided that are contributing toward the long-range goal of a numerical test-cell for the integrated, multidisciplinary design, analysis, and optimization of propulsion systems. Specific examples in Internal Computational Fluid Mechanics, Computational Structural Mechanics, Computational Materials Science, and High Performance Computing are cited and described in terms of current capabilities, technical challenges, and future research directions.

  6. 77 FR 62231 - DOE/Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-12

    ...: Office of Science, Department of Energy. ACTION: Notice of Open Meeting. SUMMARY: This notice announces a...: The meeting is open to the public. A webcast of this meeting will be available. Please check the Web... R. Butler, Acting Deputy Committee Management Officer. [FR Doc. 2012-25144 Filed 10-11-12; 8:45...

  7. The Use of Computer Competencies of Students in the Departments of Physical Education and Sport Teaching, and School Teaching

    ERIC Educational Resources Information Center

    Okan, Ilyas

    2016-01-01

    This study aims to reveal the levels of use of the computer, nowadays one of the most important technologies, among teacher candidates studying in the departments of Physical Education and Sport Teaching, and School Teaching; it also aims to investigate whether there are differences according to various criteria. In the research, data were…

  8. Advanced Simulation and Computing Co-Design Strategy

    SciTech Connect

    Ang, James A.; Hoang, Thuc T.; Kelly, Suzanne M.; McPherson, Allen; Neely, Rob

    2015-11-01

    This ASC Co-design Strategy lays out the full continuum and components of the co-design process, based on what we have experienced thus far and what we wish to do more in the future to meet the program’s mission of providing high performance computing (HPC) and simulation capabilities for NNSA to carry out its stockpile stewardship responsibility.

  9. Connecting Performance Analysis and Visualization to Advance Extreme Scale Computing

    SciTech Connect

    Bremer, Peer-Timo; Mohr, Bernd; Schulz, Martin; Pascucci, Valerio; Gamblin, Todd; Brunst, Holger

    2015-07-29

    The characterization, modeling, analysis, and tuning of software performance has been a central topic in High Performance Computing (HPC) since its early beginnings. The overall goal is to make HPC software run faster on particular hardware, either through better scheduling, on-node resource utilization, or more efficient distributed communication.

  10. Advanced Computer Image Generation Techniques Exploiting Perceptual Characteristics. Final Report.

    ERIC Educational Resources Information Center

    Stenger, Anthony J.; And Others

    This study suggests and identifies computer image generation (CIG) algorithms for visual simulation that improve the training effectiveness of CIG simulators and identifies areas of basic research in visual perception that are significant for improving CIG technology. The first phase of the project entailed observing three existing CIG simulators.…

  11. MAX - An advanced parallel computer for space applications

    NASA Technical Reports Server (NTRS)

    Lewis, Blair F.; Bunker, Robert L.

    1991-01-01

    MAX is a fault-tolerant multicomputer hardware and software architecture designed to meet the needs of NASA spacecraft systems. It consists of conventional computing modules (computers) connected via a dual network topology. One network is used to transfer data among the computers and between computers and I/O devices. This network's topology is arbitrary. The second network operates as a broadcast medium for operating system synchronization messages and supports the operating system's Byzantine resilience. A fully distributed operating system supports multitasking in an asynchronous event- and data-driven environment. A large-grain dataflow paradigm is used to coordinate the multitasking and provide easy control of concurrency. It is the basis of the system's fault tolerance and allows both static and dynamic location of tasks. Redundant execution of tasks with software voting of results may be specified for critical tasks. The dataflow paradigm also supports simplified software design, test and maintenance. A unique feature is a method for reliably patching code in an executing dataflow application.
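
    The redundant execution with software voting described above can be sketched as a majority vote over replica results. This is a simplified stand-in for illustration; MAX's actual voting protocol is not described in the abstract, and a real Byzantine-resilient vote operates over message exchanges, not a local list.

```python
from collections import Counter

def vote(results):
    # Majority-vote the outputs of redundantly executed task replicas.
    # A strict majority is required; otherwise a fault is signaled.
    if not results:
        raise ValueError("no results to vote on")
    value, count = Counter(results).most_common(1)[0]
    if count <= len(results) // 2:
        raise RuntimeError("no majority: possible Byzantine fault")
    return value

print(vote([42, 42, 41]))  # two of three replicas agree → 42
```

    With triple redundancy this masks any single faulty replica, which is why critical tasks are the ones the abstract says may be executed redundantly.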

  12. Advanced Computational Aeroacoustics Methods for Fan Noise Prediction

    NASA Technical Reports Server (NTRS)

    Envia, Edmane (Technical Monitor); Tam, Christopher

    2003-01-01

    Direct computation of fan noise is presently not possible. One of the major difficulties is the geometrical complexity of the problem. In the case of fan noise, the blade geometry is critical to the loading on the blade and hence the intensity of the radiated noise. The precise geometry must be incorporated into the computation. In computational fluid dynamics (CFD), there are two general ways to handle problems with complex geometry. One way is to use unstructured grids. The other is to use body-fitted overset grids. In the overset grid method, accurate data transfer is of utmost importance. For acoustic computation, it is not clear that the currently used data transfer methods are sufficiently accurate as not to contaminate the very small amplitude acoustic disturbances. In CFD, low order schemes are, invariably, used in conjunction with unstructured grids. However, low order schemes are known to be numerically dispersive and dissipative. Dispersive and dissipative errors are extremely undesirable for acoustic wave problems. The objective of this project is to develop a high order unstructured grid Dispersion-Relation-Preserving (DRP) scheme that would minimize numerical dispersion and dissipation errors. This report contains the results of the funded portion of the project. A DRP scheme on an unstructured grid has been developed. The scheme is constructed in the wave number space. The characteristics of the scheme can be improved by the inclusion of additional constraints. Stability of the scheme has been investigated. Stability can be improved by adopting an upwinding strategy.
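
    A DRP scheme chooses stencil coefficients so that the numerical wavenumber tracks the exact one over a wide range, rather than maximizing formal order of accuracy. The sketch below compares the commonly cited Tam–Webb 7-point DRP coefficients (assumed values, quoted from the structured-grid literature, not this project's unstructured scheme) against the standard 6th-order central stencil; at a moderately short wave the DRP dispersion error is markedly smaller.

```python
import math

# Antisymmetric 7-point first-derivative stencil coefficients a1, a2, a3.
STD6 = (3 / 4, -3 / 20, 1 / 60)                  # standard 6th-order central
DRP = (0.79926643, -0.18941314, 0.02651995)      # Tam-Webb DRP (assumed values)

def numerical_wavenumber(coeffs, k_dx):
    # Effective wavenumber of the stencil: k̄Δx = 2 Σ_j a_j sin(j kΔx).
    # A perfect (dispersion-free) scheme would return k_dx itself.
    return 2 * sum(a * math.sin((j + 1) * k_dx) for j, a in enumerate(coeffs))

k_dx = 1.5  # a short wave, roughly 4 grid points per wavelength
for name, c in (("6th-order", STD6), ("DRP", DRP)):
    err = abs(numerical_wavenumber(c, k_dx) - k_dx)
    print(f"{name:10s} dispersion error at kΔx = {k_dx}: {err:.4f}")
```

    At long waves both stencils are essentially exact; the optimization in wavenumber space buys accuracy precisely in the short-wave range that matters for propagating acoustic disturbances.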

  13. Project T.E.A.M. (Technical Education Advancement Modules). Introduction to Computers.

    ERIC Educational Resources Information Center

    Ellis, Brenda

    This instructional guide, one of a series developed by the Technical Education Advancement Modules (TEAM) project, is a 3-hour introduction to computers. The purpose is to develop the following competencies: (1) orientation to data processing; (2) use of data entry devices; (3) use of computer menus; and (4) entry of data with accuracy and…

  14. Advanced Computational Thermal Studies and their Assessment for Supercritical-Pressure Reactors (SCRs)

    SciTech Connect

    D. M. McEligot; J. Y. Yoo; J. S. Lee; S. T. Ro; E. Lurien; S. O. Park; R. H. Pletcher; B. L. Smith; P. Vukoslavcevic; J. M. Wallace

    2009-04-01

    The goal of this laboratory / university collaboration of coupled computational and experimental studies is the improvement of predictive methods for supercritical-pressure reactors. The general objective is to develop the supporting knowledge of advanced computational techniques needed for technology development of these concepts and their safety systems.

  15. Teaching Advanced Concepts in Computer Networks: VNUML-UM Virtualization Tool

    ERIC Educational Resources Information Center

    Ruiz-Martinez, A.; Pereniguez-Garcia, F.; Marin-Lopez, R.; Ruiz-Martinez, P. M.; Skarmeta-Gomez, A. F.

    2013-01-01

    In the teaching of computer networks the main problem that arises is the high price and limited number of network devices the students can work with in the laboratories. Nowadays, with virtualization we can overcome this limitation. In this paper, we present a methodology that allows students to learn advanced computer network concepts through…

  16. Advanced Technology Vehicle Program of the Maryland Department of Transportation and Metropolitan Washington Council of Governments

    SciTech Connect

    Freudberg, Stuart A.

    2001-03-31

    A multi-year Clean Alternative program is designed to integrate low-emission advanced technology vehicles into high mileage/high-fuel-use public and private fleets, which are major contributors to high pollution levels. The primary goal of the program is reduced emissions of nitrogen oxides (NO{sub x}) from on-road vehicles in the Maryland counties surrounding Washington, DC. The program is targeted at fleets operating in Calvert, Charles, Frederick, Montgomery and Prince George's counties. Eligible types of vehicle applications include taxicabs, shuttles, buses, and delivery vans and trucks. Other types may qualify if they meet certain annual fuel-use or mileage criteria. Minimum requirements have been established for participating companies, including size of fleet and age of firm. The first vehicles under this program were placed in service in 1999. The Clean Alternative provides financial incentives to selected qualified firms that purchase original equipment manufacturer (OEM) vehicles or heavy-duty engines that have been certified to Low Emission Vehicle (LEV) emission levels or lower. This program is intended to be flexible and to evolve over time. For instance, in coming years the standards for acceptable emission levels may be tightened. The level of financial incentive will be determined on a case-by-case basis and other types of incentives may be provided in some cases. The range of counties included may be extended in the future or criteria for participation changed to help meet the air quality goals of the region.

  17. Advances on modelling of ITER scenarios: physics and computational challenges

    NASA Astrophysics Data System (ADS)

    Giruzzi, G.; Garcia, J.; Artaud, J. F.; Basiuk, V.; Decker, J.; Imbeaux, F.; Peysson, Y.; Schneider, M.

    2011-12-01

    Methods and tools for design and modelling of tokamak operation scenarios are discussed with particular application to ITER advanced scenarios. Simulations of hybrid and steady-state scenarios performed with the integrated tokamak modelling suite of codes CRONOS are presented. The advantages of a possible steady-state scenario based on cyclic operations, alternating phases of positive and negative loop voltage, with no magnetic flux consumption on average, are discussed. For regimes in which current alignment is an issue, a general method for scenario design is presented, based on the characteristics of the poloidal current density profile.

  18. National facility for advanced computational science: A sustainable path to scientific discovery

    SciTech Connect

    Simon, Horst; Kramer, William; Saphir, William; Shalf, John; Bailey, David; Oliker, Leonid; Banda, Michael; McCurdy, C. William; Hules, John; Canning, Andrew; Day, Marc; Colella, Philip; Serafini, David; Wehner, Michael; Nugent, Peter

    2004-04-02

    Lawrence Berkeley National Laboratory (Berkeley Lab) proposes to create a National Facility for Advanced Computational Science (NFACS) and to establish a new partnership between the American computer industry and a national consortium of laboratories, universities, and computing facilities. NFACS will provide leadership-class scientific computing capability to scientists and engineers nationwide, independent of their institutional affiliation or source of funding. This partnership will bring into existence a new class of computational capability in the United States that is optimal for science and will create a sustainable path towards petaflops performance.

  19. Using Advanced Computer Vision Algorithms on Small Mobile Robots

    DTIC Science & Technology

    2006-04-20

    Lab. This classification algorithm has been used to successfully detect the license plates of automobiles in motion in real-time. While working...use in real-time. Test results are shown for a variety of environments. KEYWORDS: robotics, computer vision, car/license plate detection, SIFT...when detecting the make and model of automobiles, SIFT can be used to achieve very high detection rates at the expense of a hefty performance cost when

  20. Cogeneration computer model assessment: Advanced cogeneration research study

    NASA Technical Reports Server (NTRS)

    Rosenberg, L.

    1983-01-01

    Cogeneration computer simulation models were assessed to recommend the most desirable models or their components for use by the Southern California Edison Company (SCE) in evaluating potential cogeneration projects. Existing cogeneration modeling capabilities are described, preferred models are identified, and an approach to the development of a code which will best satisfy SCE requirements is recommended. Five models (CELCAP, COGEN 2, CPA, DEUS, and OASIS) are recommended for further consideration.

  1. Advanced Computational Methods for Security Constrained Financial Transmission Rights

    SciTech Connect

    Kalsi, Karanjit; Elbert, Stephen T.; Vlachopoulou, Maria; Zhou, Ning; Huang, Zhenyu

    2012-07-26

    Financial Transmission Rights (FTRs) are financial insurance tools that help power market participants reduce price risks associated with transmission congestion. FTRs are issued by solving a constrained optimization problem with the objective of maximizing the FTR social welfare under power flow security constraints. Security constraints for different FTR categories (monthly, seasonal or annual) are usually coupled, and the number of constraints increases exponentially with the number of categories. Commercial software for FTR calculation can only provide limited categories of FTRs due to these inherent computational challenges. In this paper, an innovative mathematical reformulation of the FTR problem is first presented which dramatically improves the computational efficiency of the optimization problem. After re-formulating the problem, a novel non-linear dynamic system (NDS) approach is proposed to solve it. The new formulation and the performance of the NDS solver are benchmarked against widely used linear programming (LP) solvers such as CPLEX™ and tested on both standard IEEE test systems and large-scale systems using data from the Western Electricity Coordinating Council (WECC). The performance of the NDS is demonstrated to be comparable to, and in some cases better than, the widely used CPLEX algorithms. The proposed formulation and NDS-based solver are also easily parallelizable, enabling further computational improvement.
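
    The optimization the abstract describes, maximizing FTR social welfare under flow security constraints, can be sketched with a toy single-constraint auction. The bids, PTDF factors, and line limit below are invented for illustration, and a brute-force search over discrete awards stands in for the LP and NDS solvers the paper actually benchmarks:

```python
from itertools import product

# Toy FTR auction: award MW to bidders to maximize social welfare
# (sum of bid price x awarded MW) subject to a single transmission
# line flow limit. Real FTR markets solve an LP with thousands of
# coupled security constraints; all numbers here are made up.

bids = [                      # (bid price $/MW, PTDF on the line)
    (12.0, 0.75),
    (10.0, 0.5),
    (7.0, -0.25),             # counter-flow bid relieves the line
]
LINE_LIMIT = 100.0            # MW flow limit on the single constraint
STEPS = range(0, 101, 10)     # candidate awards: 0, 10, ..., 100 MW

def solve():
    best_welfare, best_award = 0.0, (0, 0, 0)
    for award in product(STEPS, repeat=len(bids)):
        flow = sum(mw * ptdf for mw, (_, ptdf) in zip(award, bids))
        if abs(flow) > LINE_LIMIT:
            continue          # security constraint violated
        welfare = sum(mw * price for mw, (price, _) in zip(award, bids))
        if welfare > best_welfare:
            best_welfare, best_award = welfare, award
    return best_welfare, best_award

welfare, award = solve()      # here: welfare 2900.0, all bids fully awarded
```

    A production solver faces thousands of coupled constraints across monthly, seasonal and annual categories, which is exactly where the reformulation and the NDS approach claim their advantage.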

  2. Parallel computing in genomic research: advances and applications

    PubMed Central

    Ocaña, Kary; de Oliveira, Daniel

    2015-01-01

    Today’s genomic experiments have to process so-called “biological big data,” now reaching the size of terabytes and petabytes. Processing this huge amount of data on their own workstations may take scientists weeks or months. Parallelism techniques and high-performance computing (HPC) environments can be applied to reduce the total processing time and to ease the management, treatment, and analysis of this data. However, running bioinformatics experiments in HPC environments such as clouds, grids, clusters, and graphics processing units requires expertise in integrating computational, biological, and mathematical techniques and technologies. Several solutions have already been proposed to allow scientists to process their genomic experiments using HPC capabilities and parallelism techniques. This article presents a systematic review of the literature surveying the most recently published research involving genomics and parallel computing. Our objective is to gather the main characteristics, benefits, and challenges that scientists can consider when running their genomic experiments to benefit from parallelism techniques and HPC capabilities. PMID:26604801

  3. Parallel computing in genomic research: advances and applications.

    PubMed

    Ocaña, Kary; de Oliveira, Daniel

    2015-01-01

    Today's genomic experiments have to process so-called "biological big data," now reaching the size of terabytes and petabytes. Processing this huge amount of data on their own workstations may take scientists weeks or months. Parallelism techniques and high-performance computing (HPC) environments can be applied to reduce the total processing time and to ease the management, treatment, and analysis of this data. However, running bioinformatics experiments in HPC environments such as clouds, grids, clusters, and graphics processing units requires expertise in integrating computational, biological, and mathematical techniques and technologies. Several solutions have already been proposed to allow scientists to process their genomic experiments using HPC capabilities and parallelism techniques. This article presents a systematic review of the literature surveying the most recently published research involving genomics and parallel computing. Our objective is to gather the main characteristics, benefits, and challenges that scientists can consider when running their genomic experiments to benefit from parallelism techniques and HPC capabilities.
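
    The "split the data, map the work, gather the results" pattern that such HPC solutions build on can be sketched on a single machine with Python's multiprocessing module; the short DNA reads and the GC-content kernel are invented stand-ins for a real per-read analysis:

```python
from multiprocessing import Pool

# Sketch of the parallelism the review surveys: a CPU-bound
# per-sequence computation (here, GC content of DNA reads) spread
# over worker processes. Cluster and cloud pipelines apply the same
# split/map/gather pattern across nodes; the reads are made up.

def gc_content(seq):
    """Fraction of G/C bases in one read."""
    return (seq.count("G") + seq.count("C")) / len(seq)

def parallel_gc(reads, workers=4):
    with Pool(workers) as pool:
        return pool.map(gc_content, reads)   # one read per task

if __name__ == "__main__":
    reads = ["ACGT", "GGCC", "ATAT", "GCGCGC"]
    print(parallel_gc(reads))
```

    On a cluster the same map step is distributed by a batch scheduler or a scientific workflow system instead of a local process pool.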

  4. Vision 20/20: Automation and advanced computing in clinical radiation oncology

    SciTech Connect

    Moore, Kevin L. Moiseenko, Vitali; Kagadis, George C.; McNutt, Todd R.; Mutic, Sasa

    2014-01-15

    This Vision 20/20 paper considers what computational advances are likely to be implemented in clinical radiation oncology in the coming years and how the adoption of these changes might alter the practice of radiotherapy. Four main areas of likely advancement are explored: cloud computing, aggregate data analyses, parallel computation, and automation. As these developments promise both new opportunities and new risks to clinicians and patients alike, the potential benefits are weighed against the hazards associated with each advance, with special considerations regarding patient safety under new computational platforms and methodologies. While the concerns of patient safety are legitimate, the authors contend that progress toward next-generation clinical informatics systems will bring about extremely valuable developments in quality improvement initiatives, clinical efficiency, outcomes analyses, data sharing, and adaptive radiotherapy.

  5. 16th Department of Energy Computer Security Group Training Conference: Proceedings

    SciTech Connect

    Not Available

    1994-04-01

    Various topics on computer security are presented. Integrity standards, smartcard systems, network firewalls, encryption systems, cryptography, computer security programs, multilevel security guards, electronic mail privacy, the Central Intelligence Agency, internet security, and high-speed ATM networking are typical examples of the topics discussed. Individual papers are indexed separately.

  6. A self-configuring control system for storage and computing departments at INFN-CNAF Tier1

    NASA Astrophysics Data System (ADS)

    Gregori, Daniele; Dal Pra, Stefano; Ricci, Pier Paolo; Pezzi, Michele; Prosperini, Andrea; Sapunenko, Vladimir

    2015-05-01

    The storage and farming departments at the INFN-CNAF Tier1[1] manage thousands of computing nodes and several hundred servers that provide access to the disk and tape storage. In particular, the storage servers should provide the following services: efficient access to about 15 petabytes of disk space across different GPFS file system clusters, data transfers between LHC Tier sites (Tier0, Tier1 and Tier2) via a GridFTP cluster and the Xrootd protocol, and finally write and read operations on the magnetic tape backend. One of the most important and essential requirements for a reliable service is a control system that can warn if problems arise and that can perform automatic recovery operations in case of service interruptions or major failures. Moreover, configurations change during daily operations: for instance, the roles of GPFS cluster nodes can be modified, so obsolete nodes must be removed from the control system and new servers added to those already present. Manual management of all these changes is difficult when many occur, can take a long time, and is easily subject to human error or misconfiguration. For these reasons we have developed a control system with the ability to configure itself whenever a change occurs. This system has been in production for about a year at the INFN-CNAF Tier1 with good results and hardly any major drawbacks. There are three major key points in this system. The first is a software configuration service (e.g. Quattor or Puppet) for the server machines that we want to monitor with the control system; this service must ensure the presence of appropriate sensors and custom scripts on the monitored nodes and should be able to install and update software packages on them. The second key element is a database containing information, according to a suitable format, on
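
    The self-configuration step described above, dropping obsolete nodes and picking up new ones, amounts to reconciling the currently monitored host set against the configuration database. A minimal sketch, with an invented record format standing in for the real DB schema:

```python
# Toy reconciliation between the hosts a control system monitors and
# the authoritative list in the configuration database (populated by
# Quattor/Puppet). Hostnames, roles and the dict layout are invented
# for illustration only.

def reconcile(monitored, config_db):
    """Return (to_add, to_remove) so monitoring matches the config DB."""
    wanted = {rec["host"] for rec in config_db if rec["role"] != "retired"}
    current = set(monitored)
    return sorted(wanted - current), sorted(current - wanted)

config_db = [
    {"host": "gpfs-01", "role": "nsd-server"},
    {"host": "gridftp-02", "role": "transfer"},
    {"host": "gpfs-old", "role": "retired"},   # obsolete node
]
to_add, to_remove = reconcile(["gpfs-01", "gpfs-old"], config_db)
# gridftp-02 should start being monitored; gpfs-old should be dropped
```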

  7. Computational Approaches to Enhance Nanosafety and Advance Nanomedicine

    NASA Astrophysics Data System (ADS)

    Mendoza, Eduardo R.

    With the increasing use of nanoparticles in food processing, filtration/purification and consumer products, as well as the huge potential of their use in nanomedicine, a quantitative understanding of the effects of nanoparticle uptake and transport is needed. We provide examples of novel methods for modeling complex bio-nano interactions which are based on stochastic process algebras. Since model construction presumes sufficient availability of experimental data, recent developments in "nanoinformatics", an emerging discipline analogous to bioinformatics that builds an accessible information infrastructure, are subsequently discussed. Both computational areas offer opportunities for Filipinos to engage in collaborative, cutting-edge research in this impactful field.

  8. First Responders Guide to Computer Forensics: Advanced Topics

    DTIC Science & Technology

    2005-09-01

    server of the sender, the mail server of the receiver, and the computer that receives the email. Assume that Alice wants to send an email to her friend...pleased to meet you MAIL FROM: alice.price@alphanet.com 250 alice.price@alphanet.com... Sender ok RCPT TO: bob.doe@betanet.com 250 bob.doe...betanet.com... Sender ok DATA 354 Please start mail input From: alice.price@alphanet.com To: bob.doe@betanet.com Subject: Lunch Bob, It was good
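
    The excerpt above preserves fragments of a raw SMTP dialogue. The toy responder below, using the example addresses from the excerpt, mirrors the server replies (250, 354) so the MAIL FROM, RCPT TO, DATA sequence can be traced; it is a protocol illustration, not a mail server:

```python
# Minimal state machine for the server side of the SMTP exchange
# quoted above. Only the happy path of MAIL FROM -> RCPT TO -> DATA
# is modelled; real servers implement the full RFC 5321 grammar.

def smtp_reply(command, state):
    """Return (reply, new_state) for one client command."""
    verb = command.split(":")[0].strip().upper()
    if verb == "MAIL FROM" and state == "start":
        return "250 Sender ok", "mail"
    if verb == "RCPT TO" and state == "mail":
        return "250 Recipient ok", "rcpt"
    if verb == "DATA" and state == "rcpt":
        return "354 Please start mail input", "data"
    return "503 Bad sequence of commands", state

state = "start"
transcript = []
for cmd in ["MAIL FROM: alice.price@alphanet.com",
            "RCPT TO: bob.doe@betanet.com",
            "DATA"]:
    reply, state = smtp_reply(cmd, state)
    transcript.append(reply)
# transcript now holds the 250/250/354 replies seen in the excerpt
```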

  9. Advanced Computational Methods for Thermal Radiative Heat Transfer

    SciTech Connect

    Tencer, John; Carlberg, Kevin Thomas; Larsen, Marvin E.; Hogan, Roy E.

    2016-10-01

    Participating media radiation (PMR) in weapon safety calculations for abnormal thermal environments is too costly to compute routinely. This cost may be substantially reduced by applying reduced order modeling (ROM) techniques. The application of ROM to PMR is a new and unique approach for this class of problems. This approach was investigated by the authors and shown to provide significant reductions in the computational expense associated with typical PMR simulations. Once this technology is migrated into production heat transfer analysis codes, this capability will enable the routine use of PMR heat transfer in higher-fidelity simulations of weapon response in fire environments.

  10. Advanced Computational Framework for Environmental Management ZEM, Version 1.x

    SciTech Connect

    Vesselinov, Velimir V.; O'Malley, Daniel; Pandey, Sachin

    2016-11-04

    Environmental management problems typically require analysis of large and complex data sets originating from concurrent data streams with different data collection frequencies and pedigree. These big data sets require on-the-fly integration into a series of models of varying complexity for model analyses in which the data are applied as soft and hard model constraints. This is needed to provide fast iterative model analyses based on the latest available data to guide decision-making. Furthermore, the data and models are associated with uncertainties. The uncertainties are probabilistic (e.g. measurement errors) and non-probabilistic (unknowns, e.g. alternative conceptual models characterizing site conditions). To address all of these issues, we have developed ZEM, an integrated framework for real-time data and model analyses for environmental decision-making. The framework allows for seamless, on-the-fly integration of data and modeling results for robust and scientifically defensible decision-making, applying advanced decision analysis tools such as Bayesian Information-Gap Decision Theory (BIG-DT). The framework also includes advanced optimization methods capable of dealing with a large number of unknown model parameters, and surrogate (reduced order) modeling capabilities based on support vector regression techniques. The framework is coded in Julia, a state-of-the-art high-performance programming language (http://julialang.org). The ZEM framework is open-source, released under the GPL v3 license, and can be applied to any environmental management site.

  11. Computational Efforts in Support of Advanced Coal Research

    SciTech Connect

    Suljo Linic

    2006-08-17

    The focus of this project was to employ first-principles computational methods to study the underlying molecular elementary processes that govern hydrogen diffusion through Pd membranes as well as the elementary processes that govern the CO- and S-poisoning of these membranes. Our computational methodology integrated a multiscale hierarchical modeling approach, wherein a molecular understanding of the interactions between various species is gained from ab-initio quantum chemical Density Functional Theory (DFT) calculations, while a mesoscopic statistical mechanical model such as Kinetic Monte Carlo is employed to predict key macroscopic membrane properties such as permeability. The key developments are: (1) We have systematically coupled the ab initio calculations with Kinetic Monte Carlo (KMC) simulations to model hydrogen diffusion through the Pd-based membranes. The tracer diffusivity of hydrogen atoms through the bulk of the Pd lattice predicted by the KMC simulations is in excellent agreement with experiments. (2) The KMC simulations of dissociative adsorption of H{sub 2} over the Pd(111) surface indicate that for thin membranes (less than 10{micro} thick), the diffusion of hydrogen from the surface to the first subsurface layer is rate limiting. (3) Sulfur poisons the Pd surface by altering the electronic structure of the Pd atoms in the vicinity of the S atom. The KMC simulations indicate that increasing sulfur coverage drastically reduces the hydrogen coverage on the Pd surface and hence the driving force for diffusion through the membrane.
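
    The KMC side of such a multiscale scheme can be illustrated with a toy one-dimensional hop model: a hydrogen atom jumps between neighboring lattice sites at a fixed rate, and the tracer diffusivity is read off the mean-square displacement via MSD = 2Dt. The rate and spacing are arbitrary here; in the actual study the hop rates derive from DFT barriers, and a full KMC would draw exponential waiting times rather than the fixed hop count used for simplicity:

```python
import random

# Toy lattice random walk standing in for a kinetic Monte Carlo run:
# each walker makes N_HOPS nearest-neighbor hops, and the tracer
# diffusivity D is estimated from MSD = 2*D*t (one dimension).
# Expected D = RATE * A**2 / 2 = 0.5 in these arbitrary units.

random.seed(0)
RATE = 1.0        # hops per unit time (would come from DFT barriers)
A = 1.0           # lattice spacing (arbitrary units)
N_WALKERS, N_HOPS = 2000, 400

def msd_after_hops():
    total = 0.0
    for _ in range(N_WALKERS):
        x = 0
        for _ in range(N_HOPS):
            x += random.choice((-1, 1))     # hop left or right
        total += (x * A) ** 2
    return total / N_WALKERS

t = N_HOPS / RATE                 # elapsed time after N_HOPS events
D = msd_after_hops() / (2 * t)    # fluctuates around 0.5
```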

  12. The Department of Defense and the Power of Cloud Computing: Weighing Acceptable Cost Versus Acceptable Risk

    DTIC Science & Technology

    2016-04-01

    DISA is leading the way for the development of a private DOD cloud computing environment in conjunction with the Army. Operational in 2008, DISA...significant opportunities and security challenges when implementing a cloud computing environment. The transformation of DOD information technology...is this shared pool of resources, especially shared resources in a commercial environment, that also creates numerous risks not usually seen in

  13. Advances in x-ray computed microtomography at the NSLS

    SciTech Connect

    Dowd, B.A.; Andrews, A.B.; Marr, R.B.; Siddons, D.P.; Jones, K.W.; Peskin, A.M.

    1998-08-01

    The X-Ray Computed Microtomography workstation at beamline X27A at the NSLS has been utilized by scientists from a broad range of disciplines from industrial materials processing to environmental science. The most recent applications are presented here as well as a description of the facility that has evolved to accommodate a wide variety of materials and sample sizes. One of the most exciting new developments reported here resulted from a pursuit of faster reconstruction techniques. A Fast Filtered Back Transform (FFBT) reconstruction program has been developed and implemented, that is based on a refinement of the gridding algorithm first developed for use with radio astronomical data. This program has reduced the reconstruction time to 8.5 sec for a 929 x 929 pixel{sup 2} slice on an R10,000 CPU, more than 8x reduction compared with the Filtered Back-Projection method.
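
    The details of the FFBT implementation are not given here, but the projection/back-projection geometry that both it and filtered back-projection start from can be sketched in a few lines: forward-project a single bright point on a toy grid, then smear each view back across the image. Without the ramp filter the point comes back blurred, yet its location is recovered at the intensity peak; the grid size and angles below are arbitrary:

```python
import math

# Toy parallel-beam projection and unfiltered back-projection of a
# single-point phantom. Real CT reconstruction adds the ramp filter
# (FBP) or Fourier gridding (as in the FFBT described above).

N = 15                       # image is N x N pixels
PX, PY = 4, 10               # location of the point object
ANGLES = [k * math.pi / 12 for k in range(12)]   # 12 views over 180 deg

def detector_bin(x, y, theta):
    # signed distance of the pixel center from the rotation center,
    # rounded to the nearest detector bin
    c = (N - 1) / 2
    s = (x - c) * math.cos(theta) + (y - c) * math.sin(theta)
    return int(round(s + c))

# Forward projection of the point phantom: one hit per angle.
sinogram = {}
for th in ANGLES:
    proj = [0.0] * N
    proj[detector_bin(PX, PY, th)] += 1.0
    sinogram[th] = proj

# Unfiltered back-projection: smear each projection across the image.
recon = [[0.0] * N for _ in range(N)]
for th, proj in sinogram.items():
    for x in range(N):
        for y in range(N):
            b = detector_bin(x, y, th)
            if 0 <= b < N:
                recon[x][y] += proj[b]

peak_val = max(max(row) for row in recon)
# recon[PX][PY] collects one hit per view, attaining the peak of 12
```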

  14. ADVANCES IN X-RAY COMPUTED MICROTOMOGRAPHY AT THE NSLS.

    SciTech Connect

    DOWD,B.A.

    1998-08-07

    The X-Ray Computed Microtomography workstation at beamline X27A at the NSLS has been utilized by scientists from a broad range of disciplines from industrial materials processing to environmental science. The most recent applications are presented here as well as a description of the facility that has evolved to accommodate a wide variety of materials and sample sizes. One of the most exciting new developments reported here resulted from a pursuit of faster reconstruction techniques. A Fast Filtered Back Transform (FFBT) reconstruction program has been developed and implemented, that is based on a refinement of the "gridding" algorithm first developed for use with radio astronomical data. This program has reduced the reconstruction time to 8.5 sec for a 929 x 929 pixel{sup 2} slice on an R10,000 CPU, more than 8x reduction compared with the Filtered Back-Projection method.

  15. Experimental and computing strategies in advanced material characterization problems

    SciTech Connect

    Bolzon, G.

    2015-10-28

    The mechanical characterization of materials relies more and more often on sophisticated experimental methods that permit the acquisition of a large amount of data and, at the same time, reduce the invasiveness of the tests. This evolution accompanies the growing demand for non-destructive diagnostic tools that assess the safety level of components in use in structures and infrastructures, for instance in the strategic energy sector. Advanced material systems and properties that are not amenable to traditional techniques, for instance thin layered structures and their adhesion to the relevant substrates, can also be characterized by means of combined experimental-numerical tools elaborating data acquired by full-field measurement techniques. In this context, parameter identification procedures involve the repeated simulation of laboratory or in situ tests by sophisticated and usually expensive non-linear analyses while, in some situations, reliable and accurate results would be required in real time. The effectiveness and the filtering capabilities of reduced models based on decomposition and interpolation techniques can be profitably used to meet these conflicting requirements. This communication summarizes some results recently achieved in this field by the author and her co-workers. The aim is to foster further interaction between the engineering and mathematical communities.

  16. National Energy Research Scientific Computing Center (NERSC): Advancing the frontiers of computational science and technology

    SciTech Connect

    Hules, J.

    1996-11-01

    National Energy Research Scientific Computing Center (NERSC) provides researchers with high-performance computing tools to tackle science's biggest and most challenging problems. Founded in 1974 by DOE/ER, the Controlled Thermonuclear Research Computer Center was the first unclassified supercomputer center and was the model for those that followed. Over the years the center's name was changed to the National Magnetic Fusion Energy Computer Center and then to NERSC; it was relocated to LBNL. NERSC, one of the largest unclassified scientific computing resources in the world, is the principal provider of general-purpose computing services to DOE/ER programs: Magnetic Fusion Energy, High Energy and Nuclear Physics, Basic Energy Sciences, Health and Environmental Research, and the Office of Computational and Technology Research. NERSC users are a diverse community located throughout the US and in several foreign countries. This brochure describes: the NERSC advantage, its computational resources and services, future technologies, scientific resources, and computational science of scale (interdisciplinary research over a decade or longer; examples: combustion in engines, waste management chemistry, global climate change modeling).

  17. A Magnetic Tape Library System for the Computer Science Department NPGS (Naval Postgraduate School); Requirements Analysis, Design, and Implementation.

    DTIC Science & Technology

    1985-12-01

    Willis R. Greer, Jr., Chairman, Department of Administrative Sciences; Kneale T. Marshall, Information and Policy Sciences...DATABASE...be developed by the thesis student... The current tape library is a combination of a casual computer listing of approximately 600 tapes, and opened

  18. Sunoikisis: Computer-Mediated Communication in the Creation of a Virtual Department.

    ERIC Educational Resources Information Center

    Morrell, Kenneth Scott

    2001-01-01

    Chronicles efforts of faculty members at the institutions of the Associated Colleges of the South to create "Sunoikisis," a virtual department of Greek and Roman Studies. The long-term goal of the project is to expand the academic opportunities available to undergraduates at small liberal arts colleges for studying the languages, literatures, and…

  19. Using Computers To Write Comprehensive Examinations: A Study of Doctoral Level Examinations in Educational Administration Departments.

    ERIC Educational Resources Information Center

    Fansler, A. Gigi; And Others

    Comprehensive examinations, long a bastion in many doctoral programs, are one of many customs under scrutiny for possible change in a movement towards more authentic means of educational assessment. This preliminary study surveyed chairs of departments of educational administration from universities across the United States to learn how computers…

  20. Strategies for casualty mitigation programs by using advanced tsunami computation

    NASA Astrophysics Data System (ADS)

    IMAI, K.; Imamura, F.

    2012-12-01

    1. Purpose of the study. In this study, based on scenarios of great earthquakes along the Nankai trough, we aim to estimate tsunami run-up and the inundation process in coastal areas, including rivers, with high accuracy. Using a practical tsunami analytical model that takes into account detailed topography, land use, and climate change under realistic present and expected future environments, we examined the run-up and tsunami inundation process. From these results we estimated tsunami damage and obtained information for the mitigation of human casualties. Considering the time series from the occurrence of the earthquake and the risk of tsunami damage, we provide disaster-risk information displayed in a tsunami hazard and risk map in order to mitigate casualties. 2. Creating a tsunami hazard and risk map. From the analytical and practical tsunami model (a long-wave approximation) and high-resolution topography (5 m) including detailed data on shorelines, rivers, buildings and houses, we present an advanced analysis of tsunami inundation that considers land use. Based on the inundation results and their analysis, it is possible to draw a tsunami hazard and risk map with estimates of human casualties, building damage, drifting vehicles, etc. 3. Contents of disaster prevention information. To improve the distribution of hazard, risk and evacuation information, three steps are necessary. (1) Provide basic information such as tsunami attack information, evacuation areas and routes, and the location of tsunami evacuation facilities. (2) Provide, as additional information, the time when inundation starts, past inundation records, the location of facilities with hazardous materials, and the presence or absence of public facilities and underground areas that require evacuation. (3) Provide information to support disaster response such as infrastructure and traffic network damage prediction
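
    In its simplest linear one-dimensional form, the long-wave approximation the abstract relies on reduces to the shallow-water equations. A staggered finite-difference sketch with toy depth, grid, and timestep shows the expected behaviour, an initial hump splitting into two waves travelling at c = sqrt(g*h); real tsunami codes add nonlinearity, variable bathymetry, and run-up:

```python
import math

# Linear 1-D long-wave (shallow-water) equations on a staggered grid:
#   d(eta)/dt = -h du/dx,   du/dt = -g d(eta)/dx
# stepped with a simple forward-backward scheme. All parameters are
# toy values chosen for a stable Courant number of 0.5.

G, H = 9.81, 100.0            # gravity (m/s^2), uniform depth (m)
DX = 1000.0                   # grid spacing (m)
C = math.sqrt(G * H)          # long-wave speed, about 31.3 m/s
DT = 0.5 * DX / C             # CFL-stable timestep
NX, NSTEP = 200, 100

# Gaussian hump of unit height centred at cell 100; fluid at rest.
eta = [math.exp(-(((i - 100) * DX) / 5000.0) ** 2) for i in range(NX)]
u = [0.0] * (NX + 1)          # velocities on staggered interfaces
mass0 = sum(eta)              # conserved by the closed boundaries

for _ in range(NSTEP):
    for i in range(1, NX):    # momentum: du/dt = -g d(eta)/dx
        u[i] -= DT * G * (eta[i] - eta[i - 1]) / DX
    for i in range(NX):       # continuity: d(eta)/dt = -h du/dx
        eta[i] -= DT * H * (u[i + 1] - u[i]) / DX

# The right-going crest travels C*DT/DX = 0.5 cells per step, so after
# NSTEP steps it sits about 50 cells right of the initial centre, with
# roughly half the initial amplitude.
crest = max(range(110, NX), key=lambda i: eta[i])
```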

  1. CART V: recent advancements in computer-aided camouflage assessment

    NASA Astrophysics Data System (ADS)

    Müller, Thomas; Müller, Markus

    2011-05-01

    In order to facilitate systematic, computer-aided improvements of camouflage and concealment assessment methods, the software system CART (Camouflage Assessment in Real-Time) was built for the camouflage assessment of objects in multispectral image sequences (see contributions to SPIE 2007-2010 [1], [2], [3], [4]). It comprises semi-automatic marking of target objects (ground truth generation), including their propagation over the image sequence, evaluation via user-defined feature extractors, and methods to assess an object's movement conspicuity. In this fifth part of an annual series at the SPIE conference in Orlando, this paper presents the enhancements of the past year and addresses the camouflage assessment of static and moving objects in multispectral image data that may show noise or image artefacts. The presented methods explore the correlations between image processing and camouflage assessment. A novel algorithm based on template matching is presented to assess the structural inconspicuity of an object objectively and quantitatively. The results can easily be combined with an MTI (moving target indication) based movement conspicuity assessment function in order to explore the influence of object movement on the camouflage effect in different environments. As the results show, the presented methods provide a significant benefit in the field of camouflage assessment.
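
    The abstract does not spell out its template-matching variant, so the sketch below uses plain zero-mean normalized cross-correlation (NCC), a standard choice for comparing local structure; the 2-D toy "image" and template are invented:

```python
import math

# Template matching by zero-mean normalized cross-correlation: slide
# a small template over a grayscale image and score each position.
# A score near 1.0 means the local structure matches the template.

def ncc(patch, template):
    """Zero-mean NCC of two equal-size 2-D patches (lists of lists)."""
    p = [v for row in patch for v in row]
    t = [v for row in template for v in row]
    mp, mt = sum(p) / len(p), sum(t) / len(t)
    num = sum((a - mp) * (b - mt) for a, b in zip(p, t))
    den = math.sqrt(sum((a - mp) ** 2 for a in p) *
                    sum((b - mt) ** 2 for b in t))
    return num / den if den else 0.0

def match(image, template):
    """Return (best_score, row, col) over all template positions."""
    th, tw = len(template), len(template[0])
    best = (-2.0, 0, 0)
    for r in range(len(image) - th + 1):
        for c in range(len(image[0]) - tw + 1):
            patch = [row[c:c + tw] for row in image[r:r + th]]
            best = max(best, (ncc(patch, template), r, c))
    return best

image = [[0, 0, 0, 0, 0],
         [0, 9, 1, 0, 0],
         [0, 1, 9, 0, 0],
         [0, 0, 0, 0, 0]]
template = [[9, 1],
            [1, 9]]
score, r, c = match(image, template)   # exact match at row 1, col 1
```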

  2. Block sparse Cholesky algorithms on advanced uniprocessor computers

    SciTech Connect

    Ng, E.G.; Peyton, B.W.

    1991-12-01

    As with many other linear algebra algorithms, devising a portable implementation of sparse Cholesky factorization that performs well on the broad range of computer architectures currently available is a formidable challenge. Even after limiting our attention to machines with only one processor, as we have done in this report, there are still several interesting issues to consider. For dense matrices, it is well known that block factorization algorithms are the best means of achieving this goal. We take this approach for sparse factorization as well. This paper has two primary goals. First, we examine two sparse Cholesky factorization algorithms, the multifrontal method and a blocked left-looking sparse Cholesky method, in a systematic and consistent fashion, both to illustrate the strengths of the blocking techniques in general and to obtain a fair evaluation of the two approaches. Second, we assess the impact of various implementation techniques on time and storage efficiency, paying particularly close attention to the work-storage requirement of the two methods and their variants.
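
    The recurrence that the multifrontal and blocked left-looking methods organize differently is the basic column-by-column Cholesky factorization, sketched here unblocked in pure Python on a classic 3x3 symmetric positive definite example; the blocked variants apply the same updates to dense sub-blocks to exploit cache and skip zero blocks:

```python
import math

# Unblocked left-looking Cholesky: compute L with A = L * L^T,
# one column at a time, each column updated by the columns to its
# left. This is the scalar form of the blocked sparse algorithms
# discussed above; the 3x3 matrix is a standard textbook example.

def cholesky(A):
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for j in range(n):                      # compute column j of L
        s = A[j][j] - sum(L[j][k] ** 2 for k in range(j))
        L[j][j] = math.sqrt(s)              # diagonal entry
        for i in range(j + 1, n):           # entries below the diagonal
            L[i][j] = (A[i][j] -
                       sum(L[i][k] * L[j][k] for k in range(j))) / L[j][j]
    return L

A = [[4.0, 12.0, -16.0],
     [12.0, 37.0, -43.0],
     [-16.0, -43.0, 98.0]]
L = cholesky(A)
# Known factor: [[2, 0, 0], [6, 1, 0], [-8, 5, 3]]
```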

  3. Advances in computed radiography systems and their physical imaging characteristics.

    PubMed

    Cowen, A R; Davies, A G; Kengyelics, S M

    2007-12-01

    Radiological imaging is progressing towards an all-digital future, across the spectrum of medical imaging techniques. Computed radiography (CR) has provided a ready pathway from screen film to digital radiography and a convenient entry point to PACS. This review briefly revisits the principles of modern CR systems and their physical imaging characteristics. Wide dynamic range and digital image enhancement are well-established benefits of CR, which lend themselves to improved image presentation and reduced rates of repeat exposures. However, in its original form CR offered limited scope for reducing the radiation dose per radiographic exposure, compared with screen film. Recent innovations in CR, including the use of dual-sided image readout and channelled storage phosphor have eased these concerns. For example, introduction of these technologies has improved detective quantum efficiency (DQE) by approximately 50 and 100%, respectively, compared with standard CR. As a result CR currently affords greater scope for reducing patient dose, and provides a more substantive challenge to the new solid-state, flat-panel, digital radiography detectors.

  4. INFORMATION SECURITY: Computer Attacks at Department of Defense Pose Increasing Risks

    DTIC Science & Technology

    1996-05-01

    difficult in the face of the growth in Internet use, the increasing skill levels of attackers themselves, and technological advances in their tools...over the Internet, and millions more use it for entertainment. Internet use has been more than doubling annually for the last several years to an...year. DISA information also shows that attacks are successful 65 percent of the time, and that the number of attacks is doubling each year, as Internet

  5. WAATS: A computer program for Weights Analysis of Advanced Transportation Systems

    NASA Technical Reports Server (NTRS)

    Glatt, C. R.

    1974-01-01

A historical weight estimating technique for advanced transportation systems is presented. The classical approach to weight estimation is discussed, and sufficient data are presented to estimate weights for a large spectrum of flight vehicles, including horizontal and vertical takeoff aircraft, boosters, and reentry vehicles. A computer program, WAATS (Weights Analysis for Advanced Transportation Systems), embracing the techniques discussed, has been written, and user instructions are presented. The program was developed for use in the ODIN (Optimal Design Integration) system.
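Classical weight-estimating relationships of the kind this abstract describes are typically power laws fitted to historical vehicle data by regression in log space. A minimal sketch of that idea (the coefficients, drivers, and function names here are hypothetical, not WAATS's actual relationships):

```python
import math

def fit_power_law(xs, ws):
    """Fit W = a * X**b by least squares in log space, the classical
    approach behind historical weight-estimating relationships.
    xs: design-driver values (e.g. wing area), ws: observed weights."""
    n = len(xs)
    lx = [math.log(x) for x in xs]
    lw = [math.log(w) for w in ws]
    mx, mw = sum(lx) / n, sum(lw) / n
    b = sum((u - mx) * (v - mw) for u, v in zip(lx, lw)) / \
        sum((u - mx) ** 2 for u in lx)
    a = math.exp(mw - b * mx)
    return a, b

def estimate_weight(x, a, b):
    """Apply the fitted relationship to a new design point."""
    return a * x ** b
```

Once `a` and `b` are fitted from historical vehicles, `estimate_weight` can be applied to a candidate design driver value to predict component weight.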

  6. Updating United States Advanced Battery Consortium and Department of Energy battery technology targets for battery electric vehicles

    NASA Astrophysics Data System (ADS)

    Neubauer, Jeremy; Pesaran, Ahmad; Bae, Chulheung; Elder, Ron; Cunningham, Brian

    2014-12-01

    Battery electric vehicles (BEVs) offer significant potential to reduce the nation's consumption of petroleum based products and the production of greenhouse gases however, their widespread adoption is limited largely by the cost and performance limitations of modern batteries. With recent growth in efforts to accelerate BEV adoption (e.g. the Department of Energy's (DOE) EV Everywhere Grand Challenge) and the age of existing BEV battery technology targets, there is sufficient motivation to re-evaluate the industry's technology targets for battery performance and cost. Herein we document the analysis process that supported the selection of the United States Advanced Battery Consortium's (USABC) updated BEV battery technology targets. Our technology agnostic approach identifies the necessary battery performance characteristics that will enable the vehicle level performance required for a commercially successful, mass market full BEV, as guided by the workgroup's OEM members. The result is an aggressive target, implying that batteries need to advance considerably before BEVs can be both cost and performance competitive with existing petroleum powered vehicles.

  7. ADVANCED COMPUTATIONAL MODEL FOR THREE-PHASE SLURRY REACTORS

    SciTech Connect

    Goodarz Ahmadi

    2004-10-01

In this project, an Eulerian-Lagrangian formulation for analyzing three-phase slurry flows in a bubble column was developed. The approach used an Eulerian analysis of liquid flows in the bubble column and a Lagrangian trajectory analysis for the bubble and particle motions. Bubble-bubble and particle-particle collisions are included in the model. The model predictions were compared with the experimental data and good agreement was found. An experimental setup for studying two-dimensional bubble columns was developed. The multiphase flow conditions in the bubble column were measured using optical image processing and Particle Image Velocimetry (PIV) techniques. A simple shear flow device for studying bubble motion in a constant shear flow field was also developed. The flow conditions in the simple shear flow device were studied using the PIV method. Concentration and velocity of particles of different sizes near a wall in a duct flow were also measured, using the technique of phase-Doppler anemometry. An Eulerian volume-of-fluid (VOF) computational model for the flow condition in the two-dimensional bubble column was also developed. The liquid and bubble motions were analyzed and the results were compared with observed flow patterns in the experimental setup. Solid-fluid mixture flows in ducts and passages at different angles of orientation were also analyzed; the model predictions were compared with the experimental data and good agreement was found. Gravity chute flows of solid-liquid mixtures were also studied, and the simulation results were compared with the experimental data and discussed. A thermodynamically consistent model for multiphase slurry flows, with and without chemical reaction, in a state of turbulent motion was developed. The balance laws were obtained and the constitutive laws established.
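As a loose illustration of the Lagrangian trajectory idea (not the project's actual model, which adds collisions, PIV-validated liquid fields, and two-way coupling), a single bubble can be marched in time toward the local liquid velocity plus a terminal rise velocity:

```python
def bubble_trajectory(u_liquid, v_terminal, tau, dt, steps, y0=0.0, v0=0.0):
    """One-way-coupled Lagrangian sketch: the bubble's vertical velocity
    relaxes toward the local liquid velocity plus a terminal rise velocity
    with response time tau. All names and the simplified force balance are
    illustrative assumptions, not the project's formulation."""
    y, v = y0, v0
    path = [(0.0, y)]
    for k in range(1, steps + 1):
        u = u_liquid(y)                       # local vertical liquid velocity
        v += dt * (u + v_terminal - v) / tau  # explicit Euler velocity update
        y += dt * v                           # advance bubble position
        path.append((k * dt, y))
    return path
```

In a quiescent liquid (`u_liquid = lambda y: 0.0`) the bubble accelerates to its terminal velocity within a few response times and thereafter rises at nearly `v_terminal`.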

  8. An Overview of the Advanced CompuTational Software (ACTS) Collection

    SciTech Connect

    Drummond, Leroy A.; Marques, Osni A.

    2005-02-02

    The ACTS Collection brings together a number of general-purpose computational tools that were developed by independent research projects mostly funded and supported by the U.S. Department of Energy. These tools tackle a number of common computational issues found in many applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. In this article, we introduce the numerical tools in the collection and their functionalities, present a model for developing more complex computational applications on top of ACTS tools, and summarize applications that use these tools. Lastly, we present a vision of the ACTS project for deployment of the ACTS Collection by the computational sciences community.

  9. Department of Defense Science, Technology, Engineering, and Mathematics (STEM) Education Workshop on Computing Education

    DTIC Science & Technology

    2010-10-18

Programs and initiatives mentioned in the workshop materials include: Cybercampus, ChicTech, Tech Ambassadors, a competition using Arduino, Project Lead the Way, PicoCrickets workshops, the MIT Media Lab, STEM Rays (UMASS), the Computer Science Equity Alliance, "Kids Ahead" (SMU Caruth, August 2010), and the US FIRST Robotics Competition.

  10. Advances and perspectives in lung cancer imaging using multidetector row computed tomography.

    PubMed

    Coche, Emmanuel

    2012-10-01

    The introduction of multidetector row computed tomography (CT) into clinical practice has revolutionized many aspects of the clinical work-up. Lung cancer imaging has benefited from various breakthroughs in computing technology, with advances in the field of lung cancer detection, tissue characterization, lung cancer staging and response to therapy. Our paper discusses the problems of radiation, image visualization and CT examination comparison. It also reviews the most significant advances in lung cancer imaging and highlights the emerging clinical applications that use state of the art CT technology in the field of lung cancer diagnosis and follow-up.

  11. Advanced entry guidance algorithm with landing footprint computation

    NASA Astrophysics Data System (ADS)

    Leavitt, James Aaron

    -determined angle of attack profile. The method is also capable of producing orbital footprints using an automatically-generated set of angle of attack profiles of varying range, with the lowest profile designed for near-maximum range in the absence of an active heat load constraint. The accuracy of the footprint method is demonstrated by direct comparison with footprints computed independently by an optimization program.

  12. The Nuclear Energy Advanced Modeling and Simulation Enabling Computational Technologies FY09 Report

    SciTech Connect

    Diachin, L F; Garaizar, F X; Henson, V E; Pope, G

    2009-10-12

In this document we report on the status of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Enabling Computational Technologies (ECT) effort. In particular, we provide the context for ECT in the broader NEAMS program and describe the three pillars of the ECT effort, namely, (1) tools and libraries, (2) software quality assurance, and (3) computational facility (computers, storage, etc.) needs. We report on our FY09 deliverables to determine the needs of the integrated performance and safety codes (IPSCs) in these three areas and lay out the general plan for software quality assurance to meet the requirements of DOE and the DOE Advanced Fuel Cycle Initiative (AFCI). We conclude with a brief description of our interactions with the Idaho National Laboratory computer center to determine what is needed to expand their role as a NEAMS user facility.

  13. Computational Modeling and High Performance Computing in Advanced Materials Processing, Synthesis, and Design

    DTIC Science & Technology

    2014-12-07

Research efforts in this project focused on the synergistic coupling of computational material science and mechanics of hybrid and lightweight polymeric composite structures, including atomistic modeling of polymer nanocomposite systems.

  14. Department of Defense High Performance Computing Modernization Program. 2006 Annual Report

    DTIC Science & Technology

    2007-03-01

Report excerpts mention: a project using 3-D Navier-Stokes simulation (Timothy Madden, Air Force Research Laboratory, Kirtland AFB, NM); applications of time-accurate computational fluid dynamics (CFD), which provides accurate numerical solution of the equations describing fluid and gas motion for shock, penetration, and blast; a project by Dick K. P. Yue, Massachusetts Institute of Technology, Cambridge, MA (Office of Naval Research); and Molecular Rotors for Nanotechnology (Josef Michl).

  15. Projected role of advanced computational aerodynamic methods at the Lockheed-Georgia company

    NASA Technical Reports Server (NTRS)

    Lores, M. E.

    1978-01-01

Experience with advanced computational methods being used at the Lockheed-Georgia Company to aid in the evaluation and design of new and modified aircraft indicates that large and specialized computers will be needed to make advanced three-dimensional viscous aerodynamic computations practical. The Numerical Aerodynamic Simulation Facility should be used to provide a tool for designing better aerospace vehicles while at the same time reducing development costs, by performing computations using Navier-Stokes solution algorithms and by permitting less sophisticated but nevertheless complex calculations to be made efficiently. Configuration definition procedures and data output formats can probably best be defined in cooperation with industry; therefore, the computer should handle many remote terminals efficiently. The capability of transferring data to and from other computers needs to be provided. Because of the significant amount of input and output associated with 3-D viscous flow calculations, and because of the exceedingly fast computation speed envisioned for the computer, special attention should be paid to providing rapid, diversified, and efficient input and output.

  16. United States of America Department of Health and Human Services support for advancing influenza vaccine manufacturing in the developing world.

    PubMed

    Perdue, Michael L; Bright, Rick A

    2011-07-01

    five years of age. In addition to achievements described in this issue of Vaccine, the programme has been successful from the US perspective because the working relationships established between the US Department of Health and Human Services' (HHS) Assistant Secretary for Preparedness and Response Biomedical Advanced Research and Development Authority (BARDA) and its partners have assisted in advancing influenza vaccine development at many different levels. A few examples of BARDA's support include: establishment of egg-based influenza vaccine production from "scratch", enhancement of live attenuated influenza vaccine (LAIV) production techniques and infrastructure, completion of fill/finish operations for imported bulk vaccine, and training in advanced bio-manufacturing techniques. These HHS-supported programmes have been well-received internationally, and we and our partners hope the successes will stimulate even more interest within the international community in maximizing global production levels for influenza vaccines.

  17. Volumes to learn: advancing therapeutics with innovative computed tomography image data analysis.

    PubMed

    Maitland, Michael L

    2010-09-15

    Semi-automated methods for calculating tumor volumes from computed tomography images are a new tool for advancing the development of cancer therapeutics. Volumetric measurements, relying on already widely available standard clinical imaging techniques, could shorten the observation intervals needed to identify cohorts of patients sensitive or resistant to treatment.
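At their core, the volumetric measurements described reduce to counting segmented voxels and scaling by the voxel size from the CT acquisition. A generic sketch of that final step, not the paper's semi-automated segmentation method:

```python
def tumor_volume_ml(mask, spacing_mm):
    """Volume of a segmented lesion from a binary voxel mask, given as
    nested lists with mask[z][y][x] in {0, 1}. spacing_mm = (dz, dy, dx)
    voxel dimensions in millimetres; 1000 mm^3 = 1 mL. Function name and
    data layout are illustrative assumptions."""
    dz, dy, dx = spacing_mm
    voxel_mm3 = dz * dy * dx
    n_voxels = sum(row.count(1) for plane in mask for row in plane)
    return n_voxels * voxel_mm3 / 1000.0
```

Serial volumes computed this way can then be compared across imaging timepoints to track growth or shrinkage under treatment.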

  18. Next Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational, and Systems Biology (Final Report)

    EPA Science Inventory

EPA announced the release of the final report, Next Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational, and Systems Biology. This report describes new approaches that are faster, less resource intensive, and more robust, and that can help ...

  19. COMPUTATIONAL TOXICOLOGY ADVANCES: EMERGING CAPABILITIES FOR DATA EXPLORATION AND SAR MODEL DEVELOPMENT

    EPA Science Inventory

    Computational Toxicology Advances: Emerging capabilities for data exploration and SAR model development
    Ann M. Richard and ClarLynda R. Williams, National Health & Environmental Effects Research Laboratory, US EPA, Research Triangle Park, NC, USA; email: richard.ann@epa.gov

  20. Computers-for-edu: An Advanced Business Application Programming (ABAP) Teaching Case

    ERIC Educational Resources Information Center

    Boyle, Todd A.

    2007-01-01

    The "Computers-for-edu" case is designed to provide students with hands-on exposure to creating Advanced Business Application Programming (ABAP) reports and dialogue programs, as well as navigating various mySAP Enterprise Resource Planning (ERP) transactions needed by ABAP developers. The case requires students to apply a wide variety…

  1. Scientific Discovery through Advanced Computing (SciDAC-3) Partnership Project Annual Report

    SciTech Connect

    Hoffman, Forest M.; Bochev, Pavel B.; Cameron-Smith, Philip J..; Easter, Richard C; Elliott, Scott M.; Ghan, Steven J.; Liu, Xiaohong; Lowrie, Robert B.; Lucas, Donald D.; Ma, Po-lun; Sacks, William J.; Shrivastava, Manish; Singh, Balwinder; Tautges, Timothy J.; Taylor, Mark A.; Vertenstein, Mariana; Worley, Patrick H.

    2014-01-15

The Applying Computationally Efficient Schemes for BioGeochemical Cycles (ACES4BGC) project is advancing the predictive capabilities of Earth System Models (ESMs) by reducing two of the largest sources of uncertainty, aerosols and biospheric feedbacks, with a highly efficient computational approach. In particular, this project is implementing and optimizing new computationally efficient tracer advection algorithms for large numbers of tracer species; adding important biogeochemical interactions between the atmosphere, land, and ocean models; and applying uncertainty quantification (UQ) techniques to constrain process parameters and evaluate uncertainties in feedbacks between biogeochemical cycles and the climate system.
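For orientation, the simplest possible tracer advection scheme, first-order upwind on a periodic 1-D grid, can be sketched as follows; the project's algorithms are far more sophisticated, and this only illustrates the kind of per-tracer transport operation being optimized:

```python
def upwind_advect(c, u, dx, dt):
    """One explicit step of first-order upwind advection of tracer
    concentrations c on a periodic 1-D grid, for constant wind u > 0.
    Stable for CFL = u*dt/dx <= 1. Illustrative sketch only."""
    n = len(c)
    cfl = u * dt / dx
    # Upwind difference: information comes from the cell behind the flow.
    return [c[i] - cfl * (c[i] - c[i - 1]) for i in range(n)]
```

With CFL exactly 1 the update is a pure shift of the field by one cell, and for any CFL the periodic scheme conserves the total tracer mass, both easy properties to check.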

  2. Research in Computational Aeroscience Applications Implemented on Advanced Parallel Computing Systems

    NASA Technical Reports Server (NTRS)

    Wigton, Larry

    1996-01-01

    Improving the numerical linear algebra routines for use in new Navier-Stokes codes, specifically Tim Barth's unstructured grid code, with spin-offs to TRANAIR is reported. A fast distance calculation routine for Navier-Stokes codes using the new one-equation turbulence models is written. The primary focus of this work was devoted to improving matrix-iterative methods. New algorithms have been developed which activate the full potential of classical Cray-class computers as well as distributed-memory parallel computers.

  3. Impact of computer advances on future finite elements computations. [for aircraft and spacecraft design

    NASA Technical Reports Server (NTRS)

    Fulton, Robert E.

    1985-01-01

    Research performed over the past 10 years in engineering data base management and parallel computing is discussed, and certain opportunities for research toward the next generation of structural analysis capability are proposed. Particular attention is given to data base management associated with the IPAD project and parallel processing associated with the Finite Element Machine project, both sponsored by NASA, and a near term strategy for a distributed structural analysis capability based on relational data base management software and parallel computers for a future structural analysis system.

  4. A first attempt to bring computational biology into advanced high school biology classrooms.

    PubMed

    Gallagher, Suzanne Renick; Coon, William; Donley, Kristin; Scott, Abby; Goldberg, Debra S

    2011-10-01

Computer science has become ubiquitous in many areas of biological research, yet most high school and even college students are unaware of this. As a result, many college biology majors graduate without adequate computational skills for contemporary fields of biology. The absence of a computational element in secondary school biology classrooms is of growing concern to the computational biology community and biology teachers who would like to acquaint their students with updated approaches in the discipline. We present a first attempt to correct this absence by introducing a computational biology element, teaching genetic evolution, into advanced biology classes in two local high schools. Our primary goal was to show students how computation is used in biology and why a basic understanding of computation is necessary for research in many fields of biology. This curriculum is intended to be taught by a computational biologist who has worked with a high school advanced biology teacher to adapt the unit for his/her classroom, but a motivated high school teacher comfortable with mathematics and computing may be able to teach this alone. In this paper, we present our curriculum, which takes into consideration the constraints of the required curriculum, and discuss our experiences teaching it. We describe the successes and challenges we encountered while bringing this unit to high school students, discuss how we addressed these challenges, and make suggestions for future versions of this curriculum. We believe that our curriculum can be a valuable seed for further development of computational activities aimed at high school biology students. Further, our experiences may be of value to others teaching computational biology at this level. Our curriculum can be obtained at http://ecsite.cs.colorado.edu/?page_id=149#biology or by contacting the authors.

  5. Building an advanced climate model: Program plan for the CHAMMP (Computer Hardware, Advanced Mathematics, and Model Physics) Climate Modeling Program

    SciTech Connect

    Not Available

    1990-12-01

    The issue of global warming and related climatic changes from increasing concentrations of greenhouse gases in the atmosphere has received prominent attention during the past few years. The Computer Hardware, Advanced Mathematics, and Model Physics (CHAMMP) Climate Modeling Program is designed to contribute directly to this rapid improvement. The goal of the CHAMMP Climate Modeling Program is to develop, verify, and apply a new generation of climate models within a coordinated framework that incorporates the best available scientific and numerical approaches to represent physical, biogeochemical, and ecological processes, that fully utilizes the hardware and software capabilities of new computer architectures, that probes the limits of climate predictability, and finally that can be used to address the challenging problem of understanding the greenhouse climate issue through the ability of the models to simulate time-dependent climatic changes over extended times and with regional resolution.

  6. FY05-FY06 Advanced Simulation and Computing Implementation Plan, Volume 2

    SciTech Connect

    Baron, A L

    2004-07-19

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program will require the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapon design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile life extension programs and the resolution of significant finding investigations (SFIs). This requires a balanced system of technical staff, hardware, simulation software, and computer science solutions.

  7. Willingness and Ability of Older Adults in the Emergency Department to Provide Clinical Information Using a Tablet Computer

    PubMed Central

    Brahmandam, Sruti; Holland, Wesley C.; Mangipudi, Sowmya A.; Braz, Valerie A.; Medlin, Richard P.; Hunold, Katherine M.; Jones, Christopher W.; Platts-Mills, Timothy F.

    2017-01-01

    OBJECTIVES To estimate the proportion of older adults in the emergency department (ED) who are willing and able to use a tablet computer to answer questions. DESIGN Prospective, ED-based cross-sectional study. SETTING Two U.S. academic EDs. PARTICIPANTS Individuals aged 65 and older. MEASUREMENTS As part of screening for another study, potential study participants were asked whether they would be willing to use a tablet computer to answer eight questions instead of answering questions orally. A custom user interface optimized for older adults was used. Trained research assistants observed study participants as they used the tablets. Ability to use the tablet was assessed based on need for assistance and number of questions answered correctly. RESULTS Of 365 individuals approached, 248 (68%) were willing to answer screening questions, 121 of these (49%) were willing to use a tablet computer; of these, 91 (75%) were able to answer at least six questions correctly, and 35 (29%) did not require assistance. Only 14 (12%) were able to answer all eight questions correctly without assistance. Individuals aged 65 to 74 and those reporting use of a touchscreen device at least weekly were more likely to be willing and able to use the tablet computer. Of individuals with no or mild cognitive impairment, the percentage willing to use the tablet was 45%, and the percentage answering all questions correctly was 32%. CONCLUSION Approximately half of this sample of older adults in the ED was willing to provide information using a tablet computer, but only a small minority of these were able to enter all information correctly without assistance. Tablet computers may provide an efficient means of collecting clinical information from some older adults in the ED, but at present, it will be ineffective for a significant portion of this population. PMID:27804126
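The percentages reported in this abstract follow directly from the stated counts, as a quick arithmetic check shows (variable names are ours, for readability):

```python
def pct(part, whole):
    """Percentage rounded to the nearest whole number, as reported."""
    return round(100 * part / whole)

# Counts taken from the abstract
approached, screened, tablet_willing = 365, 248, 121
answered_six_plus, no_assist, all_correct = 91, 35, 14

assert pct(screened, approached) == 68       # willing to answer screening
assert pct(tablet_willing, screened) == 49   # willing to use the tablet
assert pct(answered_six_plus, tablet_willing) == 75
assert pct(no_assist, tablet_willing) == 29
assert pct(all_correct, tablet_willing) == 12
```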

  8. Computer Simulation in Mass Emergency and Disaster Response: An Evaluation of Its Effectiveness as a Tool for Demonstrating Strategic Competency in Emergency Department Medical Responders

    ERIC Educational Resources Information Center

    O'Reilly, Daniel J.

    2011-01-01

    This study examined the capability of computer simulation as a tool for assessing the strategic competency of emergency department nurses as they responded to authentically computer simulated biohazard-exposed patient case studies. Thirty registered nurses from a large, urban hospital completed a series of computer-simulated case studies of…

  9. Report on the status of linear drive coolers for the Department of Defense Standard Advanced Dewar Assembly (SADA)

    NASA Astrophysics Data System (ADS)

    Salazar, William

    2003-01-01

    The Standard Advanced Dewar Assembly (SADA) is the critical module in the Department of Defense (DoD) standardization effort of scanning second-generation thermal imaging systems. DoD has established a family of SADA's to address requirements for high performance (SADA I), mid-to-high performance (SADA II), and compact class (SADA III) systems. SADA's consist of the Infrared Focal Plane Array (IRFPA), Dewar, Command and Control Electronics (C&CE), and the cryogenic cooler. SADA's are used in weapons systems such as Comanche and Apache helicopters, the M1 Abrams Tank, the M2 Bradley Fighting Vehicle, the Line of Sight Antitank (LOSAT) system, the Improved Target Acquisition System (ITAS), and Javelin's Command Launch Unit (CLU). DOD has defined a family of tactical linear drive coolers in support of the family of SADA's. The Stirling linear drive cryo-coolers are utilized to cool the SADA's Infrared Focal Plane Arrays (IRFPAs) to their operating cryogenic temperatures. These linear drive coolers are required to meet strict cool-down time requirements along with lower vibration output, lower audible noise, and higher reliability than currently fielded rotary coolers. This paper will (1) outline the characteristics of each cooler, (2) present the status and results of qualification tests, and (3) present the status and test results of efforts to increase linear drive cooler reliability.

  10. Computation of Loads on the McDonnell Douglas Advanced Bearingless Rotor

    NASA Technical Reports Server (NTRS)

    Nguyen, Khanh; Lauzon, Dan; Anand, Vaidyanathan

    1994-01-01

Computed results from UMARC and DART analyses are compared with the blade bending moments and vibratory hub loads data obtained from a full-scale wind tunnel test of the McDonnell Douglas five-bladed advanced bearingless rotor. The 5-per-rev vibratory hub loads data are corrected using results from a dynamic calibration of the rotor balance. The comparisons between UMARC-computed blade bending moments and the measured data at different flight conditions are poor to fair, while DART results are fair to good. Using the free-wake module, UMARC adequately computes the 5P vibratory hub loads for this rotor, capturing both their magnitude and their variation with forward speed. DART employs a uniform-inflow wake model and does not adequately compute the 5P vibratory hub loads for this rotor.
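Extracting an n-per-rev component such as the 5P hub loads from one revolution of periodic load data amounts to a discrete Fourier sum at the n-th harmonic. A generic sketch of that operation (not the rotor balance's actual data processing):

```python
import math

def harmonic_amplitude(signal, n):
    """Amplitude of the n-th harmonic of one revolution of uniformly
    sampled periodic data, via a direct discrete Fourier sum. Valid for
    0 < n < len(signal)/2. Illustrative sketch only."""
    N = len(signal)
    a = sum(s * math.cos(2 * math.pi * n * k / N) for k, s in enumerate(signal))
    b = sum(s * math.sin(2 * math.pi * n * k / N) for k, s in enumerate(signal))
    return 2.0 / N * math.hypot(a, b)
```

Applied to a hub-load time history sampled over one rotor revolution, `harmonic_amplitude(signal, 5)` returns the 5-per-rev amplitude while ignoring the steady component and other harmonics.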

  11. Advanced Simulation & Computing FY15 Implementation Plan Volume 2, Rev. 0.5

    SciTech Connect

    McCoy, Michel; Archer, Bill; Matzen, M. Keith

    2014-09-16

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resource, including technical staff, hardware, simulation software, and computer science solutions. As the program approaches the end of its second decade, ASC is intently focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), quantify critical margins and uncertainties, and resolve increasingly difficult analyses needed for the SSP. Where possible, the program also enables the use of high-performance simulation and computing tools to address broader national security needs, such as foreign nuclear weapon assessments and counternuclear terrorism.

  12. PREFACE: 16th International workshop on Advanced Computing and Analysis Techniques in physics research (ACAT2014)

    NASA Astrophysics Data System (ADS)

    Fiala, L.; Lokajicek, M.; Tumova, N.

    2015-05-01

This volume of the IOP Conference Series is dedicated to scientific contributions presented at the 16th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2014); this year's motto was "bridging disciplines". The conference took place on September 1-5, 2014, at the Faculty of Civil Engineering, Czech Technical University in Prague, Czech Republic. The 16th edition of ACAT explored the boundaries of computing system architectures, data analysis algorithmics, automatic calculations, and theoretical calculation technologies. It provided a forum for confronting and exchanging ideas among these fields, where new approaches in computing technologies for scientific research were explored and promoted. This year's edition of the workshop brought together over 140 participants from all over the world. The workshop's 16 invited speakers presented key topics on advanced computing and analysis techniques in physics. During the workshop, 60 talks and 40 posters were presented in three tracks: Computing Technology for Physics Research; Data Analysis - Algorithms and Tools; and Computations in Theoretical Physics: Techniques and Methods. The round table enabled discussions on expanding software, knowledge sharing and scientific collaboration in the respective areas. ACAT 2014 was generously sponsored by Western Digital, Brookhaven National Laboratory, Hewlett Packard, DataDirect Networks, M Computers, Bright Computing, Huawei and PDV-Systemhaus. Special appreciations go to the track liaisons Lorenzo Moneta, Axel Naumann and Grigory Rubtsov for their work on the scientific program and the publication preparation. ACAT's IACC would also like to express its gratitude to all referees for their work on making sure the contributions are published in the proceedings. Our thanks extend to the conference liaisons Andrei Kataev and Jerome Lauret who worked with the local contacts and made this conference possible as well as to the program

  13. Advances in computer-aided design and computer-aided manufacture technology.

    PubMed

    Calamia, J R

    1994-01-01

    Although the development of computer-aided design (CAD) and computer-aided manufacture (CAM) technology and the benefits of increased productivity became obvious in the automobile and aerospace industries in the 1970s, investigations of this technology's application in the field of dentistry did not begin until the 1980s. Only now are we beginning to see the fruits of this work with the commercial availability of some systems; the potential for this technology seems boundless. This article reviews the recent literature with emphasis on the period from June 1992 to May 1993. This review should familiarize the reader with some of the latest developments in this technology, including a brief description of some systems currently available and the clinical and economical rationale for their acceptance into the dental mainstream. This article concentrates on a particular system, the Cerec (Siemens/Pelton and Crane, Charlotte, NC) system, for three reasons: first, this system has been available since 1985 and, as a result, has a track record of almost 7 years of data. Most of the data have just recently been released and consequently, much of this year's literature on CAD-CAM is monopolized by studies using this system. Second, this system was developed as a mobile, affordable, direct chairside CAD-CAM restorative method. As such, it is of special interest to the dentist who will offer this new technology directly to the patient, providing a one-visit restoration. Third, the author is currently engaged in research using this particular system and has a working knowledge of this system's capabilities.

  14. Recovery Act: Advanced Interaction, Computation, and Visualization Tools for Sustainable Building Design

    SciTech Connect

    Greenberg, Donald P.; Hencey, Brandon M.

    2013-08-20

    Current building energy simulation technology requires excessive labor, time and expertise to create building energy models, excessive computational time for accurate simulations, and poses difficulties for the interpretation of the results. These deficiencies can be ameliorated using modern graphical user interfaces and algorithms which take advantage of modern computer architectures and display capabilities. To prove this hypothesis, we developed an experimental test bed for building energy simulation. This novel test bed environment offers an easy-to-use interactive graphical interface, provides access to innovative simulation modules that run at accelerated computational speeds, and presents new graphics visualization methods to interpret simulation results. Our system offers the promise of dramatic ease of use in comparison with currently available building energy simulation tools. Its modular structure makes it suitable for early-stage building design, as a research platform for the investigation of new simulation methods, and as a tool for teaching concepts of sustainable design. Improvements in the accuracy and execution speed of many of the simulation modules are based on the modification of advanced computer graphics rendering algorithms. Significant performance improvements are demonstrated in several computationally expensive energy simulation modules. The incorporation of these modern graphical techniques should advance the state of the art in the domain of whole-building energy analysis and building performance simulation, particularly at the conceptual design stage when decisions have the greatest impact. More importantly, these better simulation tools will enable the transition from prescriptive to performance-based energy codes, resulting in better, more efficient designs for our future built environment.

  15. Recent Advances in Cardiac Computed Tomography: Dual Energy, Spectral and Molecular CT Imaging

    PubMed Central

    Danad, Ibrahim; Fayad, Zahi A.; Willemink, Martin J.; Min, James K.

    2015-01-01

    Computed tomography (CT) has evolved into a powerful diagnostic tool, and it is impossible to imagine current clinical practice without CT imaging. Owing to its widespread availability, ease of clinical application, superb sensitivity for the detection of coronary artery disease (CAD), and non-invasive nature, CT has become a valuable tool within the armamentarium of the cardiologist. In the last few years, numerous technological advances in CT have occurred, including dual-energy CT (DECT), spectral CT and CT-based molecular imaging. By harnessing these advances in technology, cardiac CT has progressed beyond the mere evaluation of coronary stenosis to an imaging modality that permits accurate plaque characterization, assessment of myocardial perfusion and even probing of the molecular processes involved in coronary atherosclerosis. Novel innovations in CT contrast agents and pre-clinical spectral CT devices have paved the way for CT-based molecular imaging. PMID:26068288

  16. Advances in Single-Photon Emission Computed Tomography Hardware and Software.

    PubMed

    Piccinelli, Marina; Garcia, Ernest V

    2016-02-01

    Nuclear imaging techniques remain today's most reliable modality for the assessment and quantification of myocardial perfusion. In recent years, the field has experienced tremendous progress both in terms of dedicated cameras for cardiac applications and software techniques for image reconstruction. The most recent advances in single-photon emission computed tomography hardware and software are reviewed, focusing on how these improvements have resulted in an even more powerful diagnostic tool with reduced injected radiation dose and acquisition time.

  17. Condition monitoring through advanced sensor and computational technology : final report (January 2002 to May 2005).

    SciTech Connect

    Kim, Jung-Taek; Luk, Vincent K.

    2005-05-01

    The overall goal of this joint research project was to develop and demonstrate advanced sensors and computational technology for continuous monitoring of the condition of components, structures, and systems in advanced and next-generation nuclear power plants (NPPs). This project included investigating and adapting several advanced sensor technologies from the Korean and US national laboratory research communities, some of which were developed and applied in non-nuclear industries. The project team investigated and developed sophisticated signal processing, noise reduction, and pattern recognition techniques and algorithms. The researchers installed sensors and conducted condition monitoring tests on two test loops, one containing a check valve (an active component) and the other a piping elbow (a passive component), to demonstrate the feasibility of using advanced sensors and computational technology to achieve the project goal. Acoustic emission (AE) devices, optical fiber sensors, accelerometers, and ultrasonic transducers (UTs) were used to detect the mechanical vibratory response of the check valve and piping elbow in normal and degraded configurations. Chemical sensors were also installed to monitor the water chemistry in the piping elbow test loop. Analysis results of processed sensor data indicate that it is feasible to differentiate between the normal and degraded (with selected degradation mechanisms) configurations of these two components from the acquired sensor signals, but it is questionable whether these methods can reliably identify the level and type of degradation. Additional research and development efforts are needed to refine the differentiation techniques and to reduce the level of uncertainties.
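    The differentiation step described in this record amounts to extracting features from vibration signals and checking whether a degraded configuration separates from the normal one. The sketch below is purely illustrative (the signal frequencies, noise level, feature choice and threshold are assumptions, not values from the report): a degraded component is modeled as adding a high-frequency resonance, and a simple band-energy ratio separates the two cases.

    ```python
    import numpy as np

    def band_energy(signal, fs, lo, hi):
        """Energy of a vibration signal inside a frequency band [lo, hi) in Hz."""
        spectrum = np.abs(np.fft.rfft(signal)) ** 2
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
        return spectrum[(freqs >= lo) & (freqs < hi)].sum()

    fs = 1000.0                             # sampling rate, Hz (assumed)
    t = np.arange(0, 1.0, 1.0 / fs)         # one second of samples
    rng = np.random.default_rng(0)

    # Illustrative signals: the degraded component adds a 320 Hz resonance
    # on top of the normal 50 Hz vibration plus sensor noise.
    normal = np.sin(2 * np.pi * 50 * t) + 0.1 * rng.standard_normal(t.size)
    degraded = normal + 0.5 * np.sin(2 * np.pi * 320 * t)

    def hf_ratio(sig):
        """Feature: ratio of high-band to low-band vibration energy."""
        return band_energy(sig, fs, 250, 400) / band_energy(sig, fs, 20, 100)

    print(hf_ratio(normal) < 0.1 < hf_ratio(degraded))  # -> True
    ```

    In practice the report's pattern-recognition algorithms would use many such features across sensors; as the abstract notes, a single feature can flag degradation but cannot by itself identify its level or type.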

  18. Computer-assisted virtual planning and surgical template fabrication for frontoorbital advancement.

    PubMed

    Soleman, Jehuda; Thieringer, Florian; Beinemann, Joerg; Kunz, Christoph; Guzman, Raphael

    2015-05-01

    OBJECT The authors describe a novel technique using computer-assisted design (CAD) and computer-assisted manufacturing (CAM) for the fabrication of individualized 3D printed surgical templates for frontoorbital advancement surgery. METHODS Two patients underwent frontoorbital advancement surgery for unilateral coronal synostosis. Virtual surgical planning (SurgiCase-CMF, version 5.0, Materialise) was done by virtual mirroring techniques and superposition of an age-matched normative 3D pediatric skull model. Based on these measurements, surgical templates were fabricated using a 3D printer. Bifrontal craniotomy and the osteotomies for the orbital bandeau were performed based on the sterilized 3D templates. The remodeling was then done by placing the bone plates within the negative 3D templates and fixing them using absorbable poly-dl-lactic acid plates and screws. RESULTS Both patients exhibited a satisfying head shape postoperatively and at follow-up. No surgery-related complications occurred. The cutting and positioning of the 3D surgical templates proved to be very accurate and easy to use as well as reproducible and efficient. CONCLUSIONS Computer-assisted virtual planning and 3D template fabrication for frontoorbital advancement surgery leads to reconstructions based on standardized measurements, precludes subjective remodeling, and seems to be overall safe and feasible. A larger series of patients with long-term follow-up is needed for further evaluation of this novel technique.

  19. Recent advances in 3D computed tomography techniques for simulation and navigation in hepatobiliary pancreatic surgery.

    PubMed

    Uchida, Masafumi

    2014-04-01

    A few years ago it could take several hours to complete a 3D image using a 3D workstation. Thanks to advances in computer science, obtaining results of interest now requires only a few minutes. Many recent 3D workstations or multimedia computers are equipped with onboard 3D virtual patient modeling software, which enables patient-specific preoperative assessment and virtual planning, navigation, and tool positioning. Although medical 3D imaging can now be conducted using various modalities, including computed tomography (CT), magnetic resonance imaging (MRI), positron emission tomography (PET), and ultrasonography (US) among others, the highest quality images are obtained using CT data, and CT images are now the most commonly used source of data for 3D simulation and navigation images. If the 2D source image is bad, no amount of 3D image manipulation in software will provide a quality 3D image. In this exhibition, recent advances in CT imaging techniques and in 3D visualization of hepatobiliary and pancreatic abnormalities are featured, including scan and image reconstruction techniques, contrast-enhanced techniques, new applications of advanced CT scan techniques, and new virtual reality simulation and navigation imaging.

  20. Research Institute for Advanced Computer Science: Annual Report October 1998 through September 1999

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.; Gross, Anthony R. (Technical Monitor)

    1999-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center (ARC). It currently operates under a multiple-year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. ARC has been designated NASA's Center of Excellence in Information Technology. In this capacity, ARC is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA ARC and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities. (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution.
In addition, RIACS collaborates with NASA scientists to apply information technology research to

  1. Computer-Assisted Instruction in the Context of the Advanced Instructional System: Authoring Support Software. Final Report.

    ERIC Educational Resources Information Center

    Montgomery, Ann D.; Judd, Wilson A.

    This report details the design, development, and implementation of computer software to support the cost-effective production of computer assisted instruction (CAI) within the context of the Advanced Instructional System (AIS) located at Lowry Air Force Base. The report supplements the computer managed Air Force technical training that is…

  2. Automated Computer-Assisted Diagnosis of Obstructive Coronary Artery Disease in Emergency Department Patients Undergoing 256-Slice Coronary Computed Tomography Angiography for Acute Chest Pain.

    PubMed

    Hashoul, Sharbell; Gaspar, Tamar; Halon, David A; Lewis, Basil S; Shenkar, Yuval; Jaffe, Ronen; Peled, Nathan; Rubinshtein, Ronen

    2015-10-01

    A 256-slice coronary computed tomography angiography (CCTA) is an accurate method for the detection and exclusion of obstructive coronary artery disease (OBS-CAD). However, accurate image interpretation requires expertise and may not be available at all hours. The purpose of this study was to evaluate the usefulness of a fully automated computer-assisted diagnosis (COMP-DIAG) tool for excluding OBS-CAD in emergency department (ED) patients presenting with chest pain. Three hundred sixty-nine ED patients without known coronary disease underwent 256-slice CCTA as part of the assessment of chest pain of uncertain origin. COMP-DIAG (CorAnalyzer II) automatically reported the presence or exclusion of OBS-CAD (>50% stenosis, ≥1 vessel). Performance characteristics of COMP-DIAG for the exclusion and detection of OBS-CAD were determined using expert reading as the reference standard. Seventeen (5%) studies were unassessable by the COMP-DIAG software, leaving 352 patients (1,056 vessels) available for analysis. COMP-DIAG identified 33% of assessable studies as having OBS-CAD, but the prevalence of OBS-CAD on CCTA was only 18% (66 of 352 patients) by standard expert reading. However, COMP-DIAG correctly identified 61 of the 66 patients (93%) with OBS-CAD, while 21 vessels (2%) with OBS-CAD were misclassified as negative. In conclusion, compared to expert reading, automated computer-assisted diagnosis using the CorAnalyzer showed high sensitivity but only moderate specificity for the detection of obstructive coronary disease in ED patients who underwent 256-slice CCTA. The high negative predictive value of this computer-assisted algorithm may be useful in the ED setting.
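    The sensitivity/specificity/negative-predictive-value conclusions above follow from standard 2x2 contingency-table arithmetic. A minimal sketch using the patient-level counts the abstract reports; the false-positive count is not stated directly, so it is inferred here from the 33% positive-call rate and is an assumption, not a reported figure:

    ```python
    def diagnostic_metrics(tp, fn, fp, tn):
        """Standard 2x2 contingency-table metrics for a binary diagnostic test."""
        sensitivity = tp / (tp + fn)   # true-positive rate
        specificity = tn / (tn + fp)   # true-negative rate
        npv = tn / (tn + fn)           # negative predictive value
        ppv = tp / (tp + fp)           # positive predictive value
        return sensitivity, specificity, npv, ppv

    # Reported patient-level counts: 352 assessable patients, 66 with OBS-CAD
    # by expert reading, 61 of whom COMP-DIAG flagged correctly.
    tp, fn = 61, 5
    # COMP-DIAG called 33% of the 352 studies positive (~116); the excess over
    # the 61 true positives is taken as false positives -- an inference.
    fp = round(0.33 * 352) - tp        # ~55 inferred false positives
    tn = 352 - tp - fn - fp
    sens, spec, npv, ppv = diagnostic_metrics(tp, fn, fp, tn)
    print(f"sensitivity={sens:.0%} specificity={spec:.0%} NPV={npv:.1%}")
    ```

    Under these assumed counts the numbers reproduce the abstract's pattern: high sensitivity, moderate specificity (around 80%), and an NPV near 98%, which is what makes the tool attractive for ruling out disease in the ED.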

  3. ADVANCING THE FUNDAMENTAL UNDERSTANDING AND SCALE-UP OF TRISO FUEL COATERS VIA ADVANCED MEASUREMENT AND COMPUTATIONAL TECHNIQUES

    SciTech Connect

    Biswas, Pratim; Al-Dahhan, Muthanna

    2012-11-01

    The objectives of this project are to advance the fundamental understanding of the hydrodynamics by systematically investigating the effect of design and operating variables, to evaluate the reported dimensionless groups as scaling factors, and to establish a reliable scale-up methodology for TRISO fuel particle spouted bed coaters based on hydrodynamic similarity via advanced measurement and computational techniques. An additional objective is to develop an on-line, non-invasive measurement technique based on gamma-ray densitometry (i.e., nuclear gauge densitometry) that can be installed and used for coater process monitoring to ensure proper performance and operation and to facilitate the developed scale-up methodology. To achieve the objectives set for the project, the work will use optical probes and gamma-ray computed tomography (CT) (for measurements of the cross-sectional distribution and radial profiles of solids/voidage holdup along the bed height, the spout diameter, and the fountain height) and radioactive particle tracking (RPT) (for measurements of the 3D solids flow field, velocity, turbulence parameters, circulation time, solids Lagrangian trajectories, and many other spouted-bed hydrodynamic parameters). In addition, gas dynamic measurement techniques and pressure transducers will be utilized to complement the obtained information. The measurements obtained by these techniques will be used as benchmark data to evaluate and validate computational fluid dynamics (CFD) models (two-fluid model or discrete particle model) and their closures. The validated CFD models and closures will be used to facilitate the developed methodology for scale-up, design and hydrodynamic similarity. Successful execution of this work and the proposed tasks will advance the fundamental understanding of the coater flow field and quantify it for proper and safe design, scale-up, and performance.
Such achievements will overcome the barriers to AGR applications and will help assure that the US maintains

  4. PREFACE: 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT2013)

    NASA Astrophysics Data System (ADS)

    Wang, Jianxiong

    2014-06-01

    This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2013), which took place on 16-21 May 2013 at the Institute of High Energy Physics, Chinese Academy of Sciences, Beijing, China. The workshop series brings together computer science researchers and practitioners, and researchers from particle physics and related fields, to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. This year's edition of the workshop brought together over 120 participants from all over the world. 18 invited speakers presented key topics on the universe in the computer, Computing in Earth Sciences, multivariate data analysis, automated computation in Quantum Field Theory, as well as computing and data analysis challenges in many fields. Over 70 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. The round table discussions on open source, knowledge sharing and scientific collaboration stimulated reflection on these issues in the respective areas. ACAT 2013 was generously sponsored by the Chinese Academy of Sciences (CAS), the National Natural Science Foundation of China (NSFC), Brookhaven National Laboratory in the USA (BNL), Peking University (PKU), the Theoretical Physics Center for Science Facilities of CAS (TPCSF-CAS) and Sugon. We would like to thank all the participants for their scientific contributions and for their enthusiastic participation in all the activities of the workshop. Further information on ACAT 2013 can be found at http://acat2013.ihep.ac.cn. Professor Jianxiong Wang Institute of High Energy Physics Chinese Academy of Science Details of committees and sponsors are available in the PDF

  5. PREFACE: 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011)

    NASA Astrophysics Data System (ADS)

    Teodorescu, Liliana; Britton, David; Glover, Nigel; Heinrich, Gudrun; Lauret, Jérôme; Naumann, Axel; Speer, Thomas; Teixeira-Dias, Pedro

    2012-06-01

    ACAT2011 This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011) which took place on 5-7 September 2011 at Brunel University, UK. The workshop series, which began in 1990 in Lyon, France, brings together computer science researchers and practitioners, and researchers from particle physics and related fields, in order to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. It is a forum for the exchange of ideas among the fields, exploring and promoting cutting-edge computing, data analysis and theoretical calculation techniques in fundamental physics research. This year's edition of the workshop brought together over 100 participants from all over the world. 14 invited speakers presented key topics on computing ecosystems, cloud computing, multivariate data analysis, symbolic and automatic theoretical calculations as well as computing and data analysis challenges in astrophysics, bioinformatics and musicology. Over 80 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. Panel and round table discussions on data management and multivariate data analysis uncovered new ideas and collaboration opportunities in the respective areas. This edition of ACAT was generously sponsored by the Science and Technology Facilities Council (STFC), the Institute for Particle Physics Phenomenology (IPPP) at Durham University, Brookhaven National Laboratory in the USA and Dell. We would like to thank all the participants of the workshop for the high level of their scientific contributions and for their enthusiastic participation in all its activities, which were, ultimately, the key factors in the

  6. Advances and trends in the development of computational models for tires

    NASA Technical Reports Server (NTRS)

    Noor, A. K.; Tanner, J. A.

    1985-01-01

    Status and some recent developments of computational models for tires are summarized. Discussion focuses on a number of aspects of tire modeling and analysis including: tire materials and their characterization; evolution of tire models; characteristics of effective finite element models for analyzing tires; analysis needs for tires; and impact of the advances made in finite element technology, computational algorithms, and new computing systems on tire modeling and analysis. An initial set of benchmark problems has been proposed in concert with the U.S. tire industry. Extensive sets of experimental data will be collected for these problems and used for evaluating and validating different tire models. Also, the new Aircraft Landing Dynamics Facility (ALDF) at NASA Langley Research Center is described.

  7. Optical disk archiving using a personal computer: a solution to image storage problems in diagnostic imaging departments.

    PubMed

    Parkin, A; Norwood, H; Erdentug, A; Hall, A J

    1990-01-01

    The paper describes an approach to solving the problem of providing a large-capacity image archive for diagnostic imaging departments at reasonable cost. Optical disk stores, when fitted retrospectively to scanners, are very expensive and may not be compatible with existing computer hardware. We describe the use of an industry-standard personal computer (PC) linked to a standard 5 1/4-in. optical disk drive as a 'stand-alone' image store. Image data are transferred from the scanner using 8-in. floppy disks, which are read into the PC using an attached 8-in. floppy disk drive and then transferred to the optical disk. The patient details (patient name, ID, date, etc.) are entered into a database program held on the PC and used to generate a reference pointer to the optical disk file through which the data can be retrieved. Data retrieval involves entering the patient details into the database and inserting a blank 8-in. floppy disk into the drive attached to the PC. A sector copy is then made from the optical disk to the 8-in. floppy disk, which can then be used at the viewing station at the scanner. The system is flexible since it can accept data from a variety of sources in any format; it is also low cost and operates independently of the scanner. The hardware is industry standard, ensuring low maintenance costs.
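    The core of the scheme described in this record is an index database that maps patient details to a reference pointer (which optical disk, which file) from which the image can be retrieved. A minimal modern sketch of that index; the table layout, field names and sample values are assumptions for illustration, not details from the paper:

    ```python
    import sqlite3

    # In-memory index database standing in for the PC database program;
    # the image data itself would live on the 5 1/4-in. optical disk.
    db = sqlite3.connect(":memory:")
    db.execute("""CREATE TABLE archive (
        patient_id TEXT, name TEXT, study_date TEXT,
        disk_label TEXT, disk_file TEXT)""")

    def archive_study(patient_id, name, study_date, disk_label, disk_file):
        """Record a reference pointer from patient details to an optical-disk file."""
        db.execute("INSERT INTO archive VALUES (?, ?, ?, ?, ?)",
                   (patient_id, name, study_date, disk_label, disk_file))

    def retrieve_study(patient_id, study_date):
        """Look up which disk and file hold a patient's study (None if absent)."""
        return db.execute(
            "SELECT disk_label, disk_file FROM archive "
            "WHERE patient_id = ? AND study_date = ?",
            (patient_id, study_date)).fetchone()

    archive_study("P001", "Doe, J", "1990-01-15", "OD-07", "IMG00042.DAT")
    print(retrieve_study("P001", "1990-01-15"))  # -> ('OD-07', 'IMG00042.DAT')
    ```

    In the paper's workflow the retrieved pointer then drives a sector copy from the named optical disk back onto an 8-in. floppy for viewing at the scanner; keeping the index separate from the bulk image store is what lets the system stay scanner-independent and low cost.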

  8. Advanced Simulation and Computing Fiscal Year 2011-2012 Implementation Plan, Revision 0

    SciTech Connect

    McCoy, M; Phillips, J; Hopson, J; Meisner, R

    2010-04-22

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering (D&E) programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model

  9. Advanced Simulation and Computing FY08-09 Implementation Plan Volume 2 Revision 0

    SciTech Connect

    McCoy, M; Kusnezov, D; Bikkel, T; Hopson, J

    2007-04-25

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future nonnuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. 
Moreover, ASC has restructured its business model from one

  10. Advanced Simulation and Computing FY07-08 Implementation Plan Volume 2

    SciTech Connect

    Kusnezov, D; Hale, A; McCoy, M; Hopson, J

    2006-06-22

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future nonnuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program will require the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. 
Moreover, ASC has restructured its business model from

  11. Advanced Simulation & Computing FY09-FY10 Implementation Plan Volume 2, Rev. 0

    SciTech Connect

    Meisner, R; Perry, J; McCoy, M; Hopson, J

    2008-04-30

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future nonnuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC)1 is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. 
Moreover, ASC has restructured its business model from one

  12. Advanced Simulation and Computing FY09-FY10 Implementation Plan, Volume 2, Revision 0.5

    SciTech Connect

    Meisner, R; Hopson, J; Peery, J; McCoy, M

    2008-10-07

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC)1 is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one

  13. Advanced Simulation and Computing FY08-09 Implementation Plan, Volume 2, Revision 0.5

    SciTech Connect

    Kusnezov, D; Bickel, T; McCoy, M; Hopson, J

    2007-09-13

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC)1 is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from

  14. Advanced Simulation and Computing FY09-FY10 Implementation Plan Volume 2, Rev. 1

    SciTech Connect

    Kissel, L

    2009-04-01

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one that

  15. Advanced Simulation and Computing FY10-FY11 Implementation Plan Volume 2, Rev. 0.5

    SciTech Connect

    Meisner, R; Peery, J; McCoy, M; Hopson, J

    2009-09-08

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering (D&E) programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model

  16. Computer vision-based technologies and commercial best practices for the advancement of the motion imagery tradecraft

    NASA Astrophysics Data System (ADS)

    Phipps, Marja; Capel, David; Srinivasan, James

    2014-06-01

    Motion imagery capabilities within the Department of Defense/Intelligence Community (DoD/IC) have advanced significantly over the last decade, attempting to meet continuously growing data collection, video processing and analytical demands in operationally challenging environments. The motion imagery tradecraft has evolved accordingly, enabling teams of analysts to effectively exploit data and generate intelligence reports across multiple phases in structured Full Motion Video (FMV) Processing, Exploitation and Dissemination (PED) cells. Yet the operational requirements are now changing drastically. The exponential growth in motion imagery data continues, but to this the community adds multi-INT data, interoperability with existing and emerging systems, expanded data access, nontraditional users, collaboration, automation, and support for ad hoc configurations beyond the current FMV PED cells. To break from the legacy system lifecycle, we look toward a course of technology application and commercial adoption that will meet these future Intelligence, Surveillance and Reconnaissance (ISR) challenges. In this paper, we explore the application of cutting-edge computer vision technology to meet existing FMV PED shortfalls and address future capability gaps. For example, real-time georegistration services developed from computer-vision-based feature tracking, multiple-view geometry, and statistical methods allow the fusion of motion imagery with other georeferenced information sources, providing unparalleled situational awareness. We then describe how these motion imagery capabilities may be readily deployed in a dynamically integrated analytical environment, employing an extensible framework, leveraging scalable enterprise-wide infrastructure and following commercial best practices.

  17. An expanded framework for the advanced computational testing and simulation toolkit

    SciTech Connect

    Marques, Osni A.; Drummond, Leroy A.

    2003-11-09

    The Advanced Computational Testing and Simulation (ACTS) Toolkit is a set of computational tools developed primarily at DOE laboratories and aimed at simplifying the solution of common and important computational problems. The use of the tools reduces development time for new codes, and the tools provide functionality that might not otherwise be available. This document outlines an agenda for expanding the scope of the ACTS Project based on lessons learned from current activities. Highlights of this agenda include peer-reviewed certification of new tools; finding tools to solve problems that are not currently addressed by the Toolkit; working in collaboration with other software initiatives and DOE computer facilities; expanding outreach efforts; promoting interoperability; further developing the tools; and improving the functionality of the ACTS Information Center, among other tasks. The ultimate goal is to make the ACTS tools more widely used and more effective in solving DOE's and the nation's scientific problems through the creation of a reliable software infrastructure for scientific computing.

  18. Computational fluid dynamic study on obstructive sleep apnea syndrome treated with maxillomandibular advancement.

    PubMed

    Yu, Chung-Chih; Hsiao, Hung-Da; Lee, Lung-Cheng; Yao, Chih-Min; Chen, Ning-Hung; Wang, Chau-Jan; Chen, Yu-Ray

    2009-03-01

    Maxillomandibular advancement is one of the treatments available for obstructive sleep apnea (OSA). The influence of this surgery on the upper airway and its mechanism are not fully understood. The present research simulates the flow fields of the narrowed upper airways of 2 patients with OSA treated with maxillomandibular advancement. The geometry of the upper airway was reconstructed from computed tomographic images taken before and after surgery. The resulting three-dimensional surface model was rendered for measurement and computational fluid dynamics simulation. Patients showed clinical improvement 6 months after surgery. The cross-sectional area of the narrowest part of the upper airway was increased in all dimensions. The simulated results showed a less constricted upper airway, with less velocity change and a decreased pressure gradient across the whole conduit during passage of air. Less breathing effort is therefore expected to achieve equivalent ventilation with the postoperative airway. This study demonstrates the potential of computational fluid dynamics to provide information for understanding the pathogenesis of OSA and the effects of its treatment.
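
The abstract's qualitative finding, that a wider airway yields smaller velocity changes and a lower pressure gradient, already follows from one-dimensional continuity and Bernoulli arguments. The sketch below is a deliberately simplified illustration with hypothetical pre- and post-operative areas and flow; it is not the study's 3D CFD model.

```python
# Illustrative only: a 1D continuity/Bernoulli estimate of how widening a
# constriction lowers peak velocity and the dynamic pressure drop across it.
# Areas and flow rate below are hypothetical, not patient data.

RHO = 1.2  # air density, kg/m^3


def constriction_effects(flow_lps, area_mm2):
    """Return peak velocity (m/s) and dynamic pressure drop (Pa) for a
    given volumetric flow and narrowest cross-sectional area."""
    q = flow_lps * 1e-3   # L/s -> m^3/s
    a = area_mm2 * 1e-6   # mm^2 -> m^2
    v = q / a             # continuity: v = Q / A
    dp = 0.5 * RHO * v * v  # Bernoulli dynamic pressure
    return v, dp


# Hypothetical narrowest areas at the same inspiratory flow:
v_pre, dp_pre = constriction_effects(0.5, 50.0)     # 50 mm^2 before surgery
v_post, dp_post = constriction_effects(0.5, 150.0)  # 150 mm^2 after surgery
print(f"pre:  v = {v_pre:.1f} m/s, dp = {dp_pre:.1f} Pa")
print(f"post: v = {v_post:.1f} m/s, dp = {dp_post:.1f} Pa")
```

Tripling the narrowest area cuts peak velocity by a factor of three and the dynamic pressure drop by roughly a factor of nine, consistent with the reduced breathing effort the simulations predict.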

  19. Advance care planning for older people in Australia presenting to the emergency department from the community or residential aged care facilities.

    PubMed

    Street, Maryann; Ottmann, Goetz; Johnstone, Megan-Jane; Considine, Julie; Livingston, Patricia M

    2015-09-01

    The purpose of this retrospective, cross-sectional study was to determine the prevalence of advance care planning (ACP) among older people presenting to an Emergency Department (ED) from the community or a residential aged care facility. The study sample comprised 300 older people (aged 65+ years) presenting to three Victorian EDs in 2011. A total of 150 patients transferred from residential aged care to ED were randomly selected and then matched to 150 people who lived in the community and attended the ED by age, gender, reason for ED attendance and triage category on arrival. Overall prevalence of ACP was 13.3% (n = 40/300); over one-quarter (26.6%, n = 40/150) of those presenting to the ED from residential aged care had a documented Advance Care Plan, compared to none (0%, n = 0/150) of the people from the community. There were no significant differences in the median ED length of stay, number of investigations and interventions undertaken in ED, time seen by a doctor or rate of hospital admission for those with an Advance Care Plan compared to those without. Those with a comorbidity of cerebrovascular disease or dementia and those assessed with impaired brain function were more likely to have a documented Advance Care Plan on arrival at ED. Length of hospital stay was shorter for those with an Advance Care Plan [median (IQR) = 3 days (2-6) vs. 6 days (2-10), P = 0.027] and readmission lower (0% vs. 13.7%). In conclusion, older people from the community transferred to ED were unlikely to have a documented Advance Care Plan. Those from residential aged care who were cognitively impaired more frequently had an Advance Care Plan. In the ED, decisions of care did not appear to be influenced by the presence or absence of Advance Care Plans, but length of hospital admission was shorter for those with an Advance Care Plan.
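
The headline figures can be recomputed directly from the stated counts (note that 40/150 = 26.67%, which the abstract truncates to 26.6%):

```python
# Recomputing the prevalence figures reported in the study from the counts
# given in the abstract.
def prevalence_pct(events, n):
    """Point prevalence as a percentage."""
    return 100.0 * events / n


overall = prevalence_pct(40, 300)      # documented ACP, whole sample
residential = prevalence_pct(40, 150)  # residential aged care subgroup
community = prevalence_pct(0, 150)     # community-dwelling subgroup

print(f"overall: {overall:.1f}%, residential care: {residential:.1f}%, "
      f"community: {community:.1f}%")
```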

  20. Advanced computer techniques for inverse modeling of electric current in cardiac tissue

    SciTech Connect

    Hutchinson, S.A.; Romero, L.A.; Diegert, C.F.

    1996-08-01

    For many years, ECGs and vector cardiograms have been the tools of choice for non-invasive diagnosis of cardiac conduction problems, such as found in reentrant tachycardia or Wolff-Parkinson-White (WPW) syndrome. Through skillful analysis of these skin-surface measurements of cardiac-generated electric currents, a physician can deduce the general location of heart conduction irregularities. Using a combination of high-fidelity geometry modeling, advanced mathematical algorithms and massively parallel computing, Sandia's approach would provide much more accurate information and thus allow the physician to pinpoint the source of an arrhythmia or abnormal conduction pathway.

  1. NDE of advanced turbine engine components and materials by computed tomography

    NASA Technical Reports Server (NTRS)

    Yancey, R. N.; Baaklini, George Y.; Klima, Stanley J.

    1991-01-01

    Computed tomography (CT) is an X-ray technique that provides quantitative 3D density information of materials and components and can accurately detail spatial distributions of cracks, voids, and density variations. CT scans of ceramic materials, composites, and engine components were taken and the resulting images will be discussed. Scans were taken with two CT systems with different spatial resolution capabilities. The scans showed internal damage, density variations, and geometrical arrangement of various features in the materials and components. It was concluded that CT can play an important role in the characterization of advanced turbine engine materials and components. Future applications of this technology will be outlined.

  2. Advanced Imaging of Athletes: Added Value of Coronary Computed Tomography and Cardiac Magnetic Resonance Imaging.

    PubMed

    Martinez, Matthew W

    2015-07-01

    Cardiac magnetic resonance imaging and cardiac computed tomographic angiography have become important parts of the armamentarium for noninvasive diagnosis of cardiovascular disease. Emerging technologies have produced faster imaging, lower radiation dose, improved spatial and temporal resolution, as well as a wealth of prognostic data to support their use. Investigating true pathologic disease, as well as distinguishing normal findings from potentially dangerous ones, is now increasingly routine for the practicing cardiologist. This article investigates how advanced imaging technologies can assist the clinician when evaluating athletes for pathologic disease that may put them at risk.

  3. Cardiovascular proteomics in the era of big data: experimental and computational advances.

    PubMed

    Lam, Maggie P Y; Lau, Edward; Ng, Dominic C M; Wang, Ding; Ping, Peipei

    2016-01-01

    Proteomics plays an increasingly important role in our quest to understand cardiovascular biology. Fueled by analytical and computational advances in the past decade, proteomics applications can now go beyond merely inventorying protein species, and address sophisticated questions on cardiac physiology. The advent of massive mass spectrometry datasets has in turn led to increasing intersection between proteomics and big data science. Here we review new frontiers in technological developments and their applications to cardiovascular medicine. The impact of big data science on cardiovascular proteomics investigations and translation to medicine is highlighted.

  4. Computational Models of Exercise on the Advanced Resistance Exercise Device (ARED)

    NASA Technical Reports Server (NTRS)

    Newby, Nate; Caldwell, Erin; Scott-Pandorf, Melissa; Peters, Brian; Fincke, Renita; DeWitt, John; Poutz-Snyder, Lori

    2011-01-01

    Muscle and bone loss remain a concern for crew returning from space flight. The advanced resistance exercise device (ARED) is used for on-orbit resistance exercise to help mitigate these losses. However, characterization of how the ARED loads the body in microgravity has yet to be determined. Computational models allow us to analyze ARED exercise in both 1G and 0G environments. To this end, biomechanical models of the squat, single-leg squat, and deadlift exercise on the ARED have been developed to further investigate bone and muscle forces resulting from the exercises.

  5. Integrated Computational Materials Engineering (ICME) for Third Generation Advanced High-Strength Steel Development

    SciTech Connect

    Savic, Vesna; Hector, Louis G.; Ezzat, Hesham; Sachdev, Anil K.; Quinn, James; Krupitzer, Ronald; Sun, Xin

    2015-06-01

    This paper presents an overview of a four-year project focused on development of an integrated computational materials engineering (ICME) toolset for third generation advanced high-strength steels (3GAHSS). Following a brief look at ICME as an emerging discipline within the Materials Genome Initiative, technical tasks in the ICME project will be discussed. Specific aims of the individual tasks are multi-scale, microstructure-based material model development using state-of-the-art computational and experimental techniques, forming, toolset assembly, design optimization, integration and technical cost modeling. The integrated approach is initially illustrated using a 980 grade transformation induced plasticity (TRIP) steel, subject to a two-step quenching and partitioning (Q&P) heat treatment, as an example.

  6. Advanced Simulation and Computing Fiscal Year 2016 Implementation Plan, Version 0

    SciTech Connect

    McCoy, M.; Archer, B.; Hendrickson, B.

    2015-08-27

    The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The purpose of this IP is to outline key work requirements to be performed and to control individual work activities within the scope of work. Contractors may not deviate from this plan without a revised WA or subsequent IP.

  7. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan part 2 mappings for the ASC software quality engineering practices, version 2.0.

    SciTech Connect

    Heaphy, Robert; Sturtevant, Judith E.; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Minana, Molly A.; Hackney, Patricia; Forsythe, Christi A.; Schofield, Joseph Richard, Jr.; Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2006-09-01

    The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR001.3.2 and CPR001.3.6 and to a Department of Energy document, "ASCI Software Quality Engineering: Goals, Principles, and Guidelines". This document also identifies ASC management and software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals.

  8. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan. Part 1: ASC software quality engineering practices, Version 2.0.

    SciTech Connect

    Sturtevant, Judith E.; Heaphy, Robert; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Minana, Molly A.; Hackney, Patricia; Forsythe, Christi A.; Schofield, Joseph Richard, Jr.; Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2006-09-01

    The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR 1.3.2 and 1.3.6 and to a Department of Energy document, "ASCI Software Quality Engineering: Goals, Principles, and Guidelines". This document also identifies ASC management and software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals.

  9. A fission matrix based validation protocol for computed power distributions in the advanced test reactor

    SciTech Connect

    Nielsen, J. W.; Nigg, D. W.; LaPorta, A. W.

    2013-07-01

    The Idaho National Laboratory (INL) has been engaged in a significant multi-year effort to modernize the computational reactor physics tools and validation procedures used to support operations of the Advanced Test Reactor (ATR) and its companion critical facility (ATRC). Several new protocols for validation of computed neutron flux distributions and spectra, as well as for validation of computed fission power distributions, based on new experiments and well-recognized least-squares statistical analysis techniques, have been under development. In the case of power distributions, estimates of the a priori ATR-specific fuel element-to-element fission power correlation and covariance matrices are required for validation analysis. A practical method for generating these matrices using the element-to-element fission matrix is presented, along with a high-order scheme for estimating the underlying fission matrix itself. The proposed methodology is illustrated using the MCNP5 neutron transport code for the required neutronics calculations. The general approach is readily adaptable for implementation using any multidimensional stochastic or deterministic transport code that offers the required level of spatial, angular, and energy resolution in the computed solution for the neutron flux and fission source.
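
The fission matrix underpinning such a protocol maps a fission source vector to the next-generation source; its dominant eigenpair yields an estimate of k-effective and of the relative element fission powers. A minimal power-iteration sketch on a small hypothetical matrix (not ATR data) illustrates the idea:

```python
# Sketch of the fission-matrix idea: the dominant eigenpair of the
# element-to-element fission matrix F estimates k-effective and the
# relative element fission source. The 3x3 matrix below is hypothetical.

def dominant_eigenpair(F, iters=500):
    """Power iteration: returns (k, source) with source normalized to sum 1."""
    n = len(F)
    s = [1.0 / n] * n
    k = 1.0
    for _ in range(iters):
        # Next-generation source: y = F @ s
        y = [sum(F[i][j] * s[j] for j in range(n)) for i in range(n)]
        k = sum(y)  # eigenvalue estimate for a sum-normalized source
        s = [yi / k for yi in y]
    return k, s


# Hypothetical fission matrix: F[i][j] = fission neutrons born in element i
# per fission neutron started in element j.
F = [[0.50, 0.20, 0.05],
     [0.20, 0.60, 0.20],
     [0.05, 0.20, 0.50]]

k_eff, source = dominant_eigenpair(F)
print(f"k-eff ~ {k_eff:.4f}")
print("relative element powers (%):", [round(100.0 * si, 1) for si in source])
```

In the validation protocol itself the fission matrix would come from a transport calculation (e.g., MCNP5 tallies), and the source vector feeds the construction of the element-to-element power correlation and covariance matrices.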

  10. Recent advances in computational methodology for simulation of mechanical circulatory assist devices

    PubMed Central

    Marsden, Alison L.; Bazilevs, Yuri; Long, Christopher C.; Behr, Marek

    2014-01-01

    Ventricular assist devices (VADs) provide mechanical circulatory support to offload the work of one or both ventricles during heart failure. They are used in the clinical setting as destination therapy, as bridge to transplant, or more recently as bridge to recovery to allow for myocardial remodeling. Recent developments in computational simulation allow for detailed assessment of VAD hemodynamics for device design and optimization for both children and adults. Here, we provide a focused review of the recent literature on finite element methods and optimization for VAD simulations. As VAD designs typically fall into two categories, pulsatile and continuous flow devices, we separately address the computational challenges of both types of designs and their interaction with the circulatory system, using three representative case studies. In particular, we focus on recent advancements in finite element methodology that have increased the fidelity of VAD simulations. We outline key challenges, which extend to the incorporation of biological responses such as thrombosis and hemolysis, as well as shape optimization methods and challenges in computational methodology. PMID:24449607

  11. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan : ASC software quality engineering practices Version 3.0.

    SciTech Connect

    Turgeon, Jennifer L.; Minana, Molly A.; Hackney, Patricia; Pilch, Martin M.

    2009-01-01

    The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in the US Department of Energy/National Nuclear Security Administration (DOE/NNSA) Quality Criteria, Revision 10 (QC-1) as 'conformance to customer requirements and expectations'. This quality plan defines the SNL ASC Program software quality engineering (SQE) practices and provides a mapping of these practices to the SNL Corporate Process Requirement (CPR) 001.3.6, 'Corporate Software Engineering Excellence'. This plan also identifies ASC management's and the software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals. This SNL ASC Software Quality Plan establishes the signatories' commitments to improving software products by applying cost-effective SQE practices. This plan enumerates the SQE practices that comprise the development of SNL ASC's software products and explains the project teams' opportunities for tailoring and implementing the practices.

  12. Experimental and Computational Study on the Cusp-DEC and TWDEC for Advanced Fueled Fusion

    SciTech Connect

    Tomita, Y.; Yasaka, Y.; Takeno, H.; Ishikawa, M.; Nemoto, T.

    2005-01-15

    Experimental and computational results of direct energy converters (DECs) for advanced fueled fusion such as D-3He are presented. Kinetic energy of the thermal component of the end-loss plasma is converted to electricity by using the Cusp DEC. Proof-of-principle experiments on a single slanted cusp have been carried out and verified the feasibility of the configuration. To improve the separation of electrons from ions, numerical simulation shows that a Helmholtz magnetic configuration with a uniform magnetic field is more effective than the Cusp DEC. The fusion-produced high-energy ions, such as the 15 MeV protons in D-3He fueled fusion, can pass through the Cusp DEC without disturbing their orbits and enter a traveling-wave direct energy converter (TWDEC). Small-scale experiments have shown the effectiveness of the TWDEC, and numerical simulation on optimization of the electrode spacing in the decelerator gives high conversion efficiency, up to 60%.

  13. Development of an Advanced Computational Model for OMCVD of Indium Nitride

    NASA Technical Reports Server (NTRS)

    Cardelino, Carlos A.; Moore, Craig E.; Cardelino, Beatriz H.; Zhou, Ning; Lowry, Sam; Krishnan, Anantha; Frazier, Donald O.; Bachmann, Klaus J.

    1999-01-01

    An advanced computational model is being developed to predict the formation of indium nitride (InN) film from the reaction of trimethylindium (In(CH3)3) with ammonia (NH3). The components are introduced into the reactor in the gas phase within a background of molecular nitrogen (N2). Organometallic chemical vapor deposition occurs on a heated sapphire surface. The model simulates heat and mass transport with gas and surface chemistry under steady state and pulsed conditions. The development and validation of an accurate model for the interactions between the diffusion of gas phase species and surface kinetics is essential to enable the regulation of the process in order to produce a low defect material. The validation of the model will be performed in concert with a NASA-North Carolina State University project.
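
Surface kinetics in CVD models of this kind are typically expressed in Arrhenius form. The following is a generic, self-contained illustration of how strongly such a rate depends on growth temperature; the pre-exponential factor and activation energy are hypothetical, not the model's actual InN rate data:

```python
# Generic Arrhenius rate-constant evaluation, as used in CVD surface-kinetics
# models. Parameters A and Ea below are hypothetical placeholders.
import math

R = 8.314  # gas constant, J/(mol K)


def arrhenius(A, Ea, T):
    """Rate constant k = A * exp(-Ea / (R * T))."""
    return A * math.exp(-Ea / (R * T))


# Hypothetical surface-reaction rate constant at two growth temperatures:
k_900 = arrhenius(A=1.0e6, Ea=150e3, T=900.0)
k_1000 = arrhenius(A=1.0e6, Ea=150e3, T=1000.0)
print(f"k(900 K) / k(1000 K) = {k_900 / k_1000:.3f}")
```

With a 150 kJ/mol barrier, a 100 K drop in temperature reduces the rate by nearly an order of magnitude, which is why coupling such kinetics to heat and mass transport matters for predicting film growth.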

  14. Advanced computational tools for optimization and uncertainty quantification of carbon capture processes

    SciTech Connect

    Miller, David C.; Ng, Brenda; Eslick, John

    2014-01-01

    Advanced multi-scale modeling and simulation has the potential to dramatically reduce development time, resulting in considerable cost savings. The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and universities that is developing, demonstrating, and deploying a suite of multi-scale modeling and simulation tools. One significant computational tool is FOQUS, a Framework for Optimization and Quantification of Uncertainty and Sensitivity, which enables basic data submodels, including thermodynamics and kinetics, to be used within detailed process models to rapidly synthesize and optimize a process and determine the level of uncertainty associated with the resulting process. The overall approach of CCSI is described with a more detailed discussion of FOQUS and its application to carbon capture systems.
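
The core idea behind uncertainty quantification in a framework like FOQUS, forward propagation of submodel parameter uncertainty through a process model, can be sketched generically. The cost model and parameter distributions below are hypothetical; this is not FOQUS code or its API:

```python
# Generic Monte Carlo forward uncertainty propagation, in the spirit of what
# a tool like FOQUS automates. The capture-cost model is hypothetical.
import random

random.seed(0)


def capture_cost(k_rate, dH):
    """Hypothetical process model: CO2 capture cost ($/tonne) as a simple
    function of a kinetic rate constant and a reaction enthalpy (kJ/mol)."""
    return 40.0 + 5.0 / k_rate + 0.2 * abs(dH)


# Uncertain submodel parameters, sampled from assumed distributions.
samples = [capture_cost(random.gauss(1.0, 0.1), random.gauss(-60.0, 5.0))
           for _ in range(10_000)]

mean = sum(samples) / len(samples)
var = sum((x - mean) ** 2 for x in samples) / (len(samples) - 1)
print(f"mean cost: {mean:.1f} $/tonne, std: {var ** 0.5:.1f}")
```

The output statistics characterize how uncertainty in the basic-data submodels propagates to the process-level figure of merit; a production framework adds sampling designs, surrogate models, and sensitivity analysis on top of this loop.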

  15. Advanced Computational Modeling of Vapor Deposition in a High-pressure Reactor

    NASA Technical Reports Server (NTRS)

    Cardelino, Beatriz H.; Moore, Craig E.; McCall, Sonya D.; Cardelino, Carlos A.; Dietz, Nikolaus; Bachmann, Klaus

    2004-01-01

    In search of novel approaches to produce new materials for electro-optic technologies, advances have been achieved in the development of computer models for vapor deposition reactors in space. Numerical simulations are invaluable tools for costly and difficult processes, such as experiments designed for high pressures and microgravity conditions. Indium nitride is a candidate compound for high-speed laser diodes and photodiodes for optical communication systems, as well as for semiconductor lasers operating into the blue and ultraviolet regions. However, InN and other nitride compounds exhibit significant thermal decomposition at their optimum growth temperatures. In addition, epitaxy at lower temperatures and subatmospheric pressures incorporates indium droplets into the InN films. Nevertheless, surface stabilization data indicate that InN could be grown at 900 K at high nitrogen pressures, and microgravity could provide laminar flow conditions. Numerical models for chemical vapor deposition have been developed, coupling complex chemical kinetics with fluid dynamic properties.

  17. The Necessity and Applicability Levels of the Courses That Are Offered in the Departments of Computer Education and Instructional Technologies (CEIT)

    ERIC Educational Resources Information Center

    Acat, M. Bahaddin; Kiliç, Abdurrahman; Girmen, Pinar; Anagün, Sengül S.

    2007-01-01

    The main purpose of this study is to identify the levels of necessity and applicability of the courses offered in the Departments of Computer Education and Instructional Technologies, based on the views of fourth-grade and graduated students. A descriptive research model was used in the study. The population of the study were final-year and…

  19. Current Advances in the Computational Simulation of the Formation of Low-Mass Stars

    SciTech Connect

    Klein, R I; Inutsuka, S; Padoan, P; Tomisaka, K

    2005-10-24

    Developing a theory of low-mass star formation (≈ 0.1 to 3 M⊙) remains one of the most elusive and important goals of theoretical astrophysics. The star-formation process is the outcome of the complex dynamics of interstellar gas involving non-linear interactions of turbulence, gravity, magnetic fields and radiation. The evolution of protostellar condensations, from the moment they are assembled by turbulent flows to the time they reach stellar densities, spans an enormous range of scales, resulting in a major computational challenge for simulations. Since the previous Protostars and Planets conference, dramatic advances in the development of new numerical algorithmic techniques have been successfully implemented on large-scale parallel supercomputers. Among such techniques, Adaptive Mesh Refinement and Smoothed Particle Hydrodynamics have provided frameworks to simulate the process of low-mass star formation with a very large dynamic range. It is now feasible to explore the turbulent fragmentation of molecular clouds and the gravitational collapse of cores into stars self-consistently within the same calculation. The increased sophistication of these powerful methods comes with substantial caveats associated with the use of the techniques and the interpretation of the numerical results. In this review, we examine what has been accomplished in the field and present a critique of both numerical methods and scientific results. We stress that computational simulations should obey the available observational constraints and demonstrate numerical convergence. Failing this, results of large-scale simulations do not advance our understanding of low-mass star formation.

  20. Identifying human disease genes: advances in molecular genetics and computational approaches.

    PubMed

    Bakhtiar, S M; Ali, A; Baig, S M; Barh, D; Miyoshi, A; Azevedo, V

    2014-07-04

    The Human Genome Project is one of the significant achievements that have provided detailed insight into our genetic legacy. During the last two decades, biomedical investigations have gathered a considerable body of evidence by detecting more than 2000 disease genes. Despite these imperative advances in the genetic understanding of various diseases, the pathogenesis of many others remains obscure. With recent advances, the laborious methodologies used to identify DNA variations have been replaced by direct sequencing of genomic DNA to detect genetic changes. The ability to perform such studies depends equally on the development of high-throughput and economical genotyping methods. Currently, for essentially every disease whose origin is still unknown, genetic approaches are available, which can be pedigree-dependent or -independent, with the capacity to elucidate fundamental disease mechanisms. Computer algorithms and programs for linkage analysis have formed the foundation for many disease gene detection projects; similarly, databases of clinical findings have been widely used to support diagnostic decisions in dysmorphology and general human disease. For every disease type, genome sequence variations, particularly single nucleotide polymorphisms (SNPs), are mapped by comparing the genetic makeup of case and control groups. Methods that predict the effects of polymorphisms on protein stability are useful for the identification of possible disease associations, and structural effects can be assessed using methods that predict stability changes in proteins from sequence and/or structural information.
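
The case-control comparison mentioned above reduces, for a single SNP, to a 2x2 contingency table of allele counts; a sketch with invented counts:

```python
# Allele counts for one hypothetical SNP (invented numbers, for illustration).
cases    = {"A": 210, "a": 190}
controls = {"A": 160, "a": 240}

def odds_ratio(cases: dict, controls: dict) -> float:
    """OR > 1 suggests allele 'A' is enriched in cases relative to controls."""
    return (cases["A"] * controls["a"]) / (cases["a"] * controls["A"])

print(round(odds_ratio(cases, controls), 2))  # 1.66
```

Real association studies apply this per-SNP test genome-wide, with multiple-testing correction.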

  1. Advanced Computed Tomography Inspection System (ACTIS): An overview of the technology and its application

    NASA Technical Reports Server (NTRS)

    Hediger, Lisa H.

    1991-01-01

    The Advanced Computed Tomography Inspection System (ACTIS) was developed by NASA Marshall to support solid propulsion test programs. ACTIS represents a significant advance in state-of-the-art inspection systems. Its flexibility and superior technical performance have made ACTIS very popular, both within and outside the aerospace community. Through technology utilization efforts, ACTIS has been applied to inspection problems in commercial aerospace, lumber, automotive, and nuclear waste disposal industries. ACTIS has been used to inspect items of historical interest. ACTIS has consistently produced valuable results, providing information which was unattainable through conventional inspection methods. Although many successes have already been shown, the full potential of ACTIS has not yet been realized. It is currently being applied in the commercial aerospace industry by Boeing. Smaller systems, based on ACTIS technology, are becoming increasingly available. This technology has much to offer the small business and industry, especially in identifying design and process problems early in the product development cycle to prevent defects. Several options are available to businesses interested in this technology.

  2. Advanced computed tomography inspection system (ACTIS): an overview of the technology and its applications

    NASA Astrophysics Data System (ADS)

    Beshears, Ronald D.; Hediger, Lisa H.

    1994-10-01

    The Advanced Computed Tomography Inspection System (ACTIS) was developed by the Marshall Space Flight Center to support in-house solid propulsion test programs. ACTIS represents a significant advance in state-of-the-art inspection systems. Its flexibility and superior technical performance have made ACTIS very popular, both within and outside the aerospace community. Through Technology Utilization efforts, ACTIS has been applied to inspection problems in commercial aerospace, lumber, automotive, and nuclear waste disposal industries. ACTIS has even been used to inspect items of historical interest. ACTIS has consistently produced valuable results, providing information which was unattainable through conventional inspection methods. Although many successes have already been demonstrated, the full potential of ACTIS has not yet been realized. It is currently being applied in the commercial aerospace industry by Boeing Aerospace Company. Smaller systems, based on ACTIS technology, are becoming increasingly available. This technology has much to offer small businesses and industry, especially in identifying design and process problems early in the product development cycle to prevent defects. Several options are available to businesses interested in pursuing this technology.

  3. Computational studies of horizontal axis wind turbines in high wind speed condition using advanced turbulence models

    NASA Astrophysics Data System (ADS)

    Benjanirat, Sarun

    Next-generation horizontal-axis wind turbines (HAWTs) will operate at very high wind speeds. Existing engineering approaches for modeling the flow phenomena are based on blade element theory and cannot adequately account for 3-D separated, unsteady flow effects. Therefore, researchers around the world are beginning to model these flows using first-principles-based computational fluid dynamics (CFD) approaches. In this study, an existing first-principles-based Navier-Stokes approach is enhanced to model HAWTs at high wind speeds. The enhancements include improved grid topology, implicit time-marching algorithms, and advanced turbulence models. The advanced turbulence models include the Spalart-Allmaras one-equation model and the k-ε, k-ω, and Shear Stress Transport (k-ω SST) models. These models are also integrated with detached eddy simulation (DES) models. Results are presented for a range of wind speeds for the National Renewable Energy Laboratory Phase VI rotor, tested at NASA Ames Research Center. Grid sensitivity studies are also presented. Additionally, the effects of existing transition models on the predictions are assessed. Data presented include power/torque production, radial distribution of normal and tangential pressure forces, root bending moments, and surface pressure fields. Good agreement was obtained between the predictions and experiments for most of the conditions, particularly with the Spalart-Allmaras DES model.
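
Quantities such as the torque reported above come from integrating the radial distribution of tangential force over the blade; a post-processing sketch with made-up station data (not the NREL Phase VI values):

```python
# Illustrative blade stations and tangential force per unit span.
radii = [1.0, 2.0, 3.0, 4.0, 5.0]        # m
f_tan = [50.0, 80.0, 90.0, 70.0, 30.0]   # N/m

def torque_per_blade(radii, f_tan):
    """Trapezoidal integration of r * f_tan(r) dr over the blade span."""
    q = 0.0
    for i in range(len(radii) - 1):
        dr = radii[i + 1] - radii[i]
        q += 0.5 * (radii[i] * f_tan[i] + radii[i + 1] * f_tan[i + 1]) * dr
    return q

print(torque_per_blade(radii, f_tan))  # 810.0 N*m for these invented data
```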

  4. Spectral computed tomography in advanced gastric cancer: Can iodine concentration non-invasively assess angiogenesis?

    PubMed Central

    Chen, Xiao-Hua; Ren, Ke; Liang, Pan; Chai, Ya-Ru; Chen, Kui-Sheng; Gao, Jian-Bo

    2017-01-01

    AIM To investigate the correlation of iodine concentration (IC) generated by spectral computed tomography (CT) with micro-vessel density (MVD) and vascular endothelial growth factor (VEGF) expression in patients with advanced gastric carcinoma (GC). METHODS Thirty-four advanced GC patients underwent abdominal enhanced CT in the gemstone spectral imaging mode. The ICs of the primary lesion in the arterial phase (AP) and venous phase (VP) were measured and then normalized against that of the aorta to provide the normalized IC (nIC). MVD and VEGF were detected by immunohistochemical assays using CD34 and VEGF-A antibodies, respectively. Correlations of nIC with MVD, VEGF, and clinical-pathological features were analyzed. RESULTS Both nICs correlated linearly with MVD and were higher in the primary lesion site than in the normal control site, but neither correlated with VEGF expression. After stratification by clinical-pathological subtype, nIC-AP showed a statistically significant correlation with MVD, particularly in the group with tumors at stage T4, without nodular involvement, of a mixed Lauren type, located at the antrum site, and occurring in female individuals. nIC-VP showed a positive correlation with MVD in the group with tumors at stage T4 and above, with nodular involvement, poorly differentiated, located at the pylorus site, of a mixed or diffuse Lauren subtype, and occurring in male individuals. nIC-AP and nIC-VP showed significant differences in terms of histological differentiation and Lauren subtype. CONCLUSION The IC detected by spectral CT correlated with the MVD. nIC-AP and nIC-VP can reflect angiogenesis in different pathological subgroups of advanced GC. PMID:28321168
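
The normalization step described in the methods is a simple ratio against the aortic measurement in the same phase; a sketch (the example values are hypothetical):

```python
def normalized_ic(ic_lesion: float, ic_aorta: float) -> float:
    """nIC: lesion iodine concentration divided by the aortic iodine
    concentration measured in the same phase; units (e.g. mg/mL) cancel."""
    return ic_lesion / ic_aorta

# Hypothetical arterial-phase measurements:
print(normalized_ic(ic_lesion=2.4, ic_aorta=12.0))  # 0.2
```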

  5. Influence of setback and advancement osseous genioplasty on facial outcome: A computer-simulated study.

    PubMed

    Möhlhenrich, Stephan Christian; Heussen, Nicole; Kamal, Mohammad; Peters, Florian; Fritz, Ulrike; Hölzle, Frank; Modabber, Ali

    2015-12-01

    The aim of this virtual study was to investigate the influence of angular deviation and displacement distance on the overlying soft tissue during chin genioplasty. Computed tomography data from 21 patients were read using ProPlan CMF software. Twelve simulated genioplasties were performed per patient, with variable osteotomy angles and displacement distances. Soft-tissue deformations and cephalometric analyses were compared. Changes in the anterior and inferior soft tissue of the chin, along with the resultant lower facial third area, were determined. The maximum average soft-tissue changes after a 10-mm advancement were about 4.19 SD 0.84 mm anteriorly and about -1.55 SD 0.96 mm inferiorly. After a 10-mm setback, deviations of -4.63 SD 0.56 mm anteriorly and 0.75 SD 1.16 mm inferiorly were found. The anterior soft tissue showed a statistically significant change with bony displacement in both directions, independent of osteotomy angle (p < 0.001); only after a 10-mm advancement with an angle of -5° were significant differences in the inferior soft tissue noted (p = 0.0055). The average area of the total lower third of the face was 24,807.80 SD 4,091.72 mm(2), and up to 62.75% of it was influenced. Advancement genioplasty leads to greater changes in the overlying soft tissue, whereas the affected area is larger after setback displacement. The ratio between soft- and hard-tissue movements largely depends on the displacement distance.

  6. Dual vs. single computer monitor in a Canadian hospital Archiving Department: a study of efficiency and satisfaction.

    PubMed

    Poder, Thomas G; Godbout, Sylvie T; Bellemare, Christian

    2011-01-01

    This paper describes a comparative study of clinical coding by Archivists (also known as Clinical Coders in some other countries) using single and dual computer monitors. In the present context, processing a record corresponds to checking the available information; searching for the missing physician information; and finally, performing clinical coding. We collected data for each Archivist during her use of the single monitor for 40 hours and during her use of the dual monitor for 20 hours. During the experimental periods, Archivists did not perform other related duties, so we were able to measure the real-time processing of records. To control for the type of records and their impact on the process time required, we categorised the cases as major or minor, based on whether acute care or day surgery was involved. Overall results show that 1,234 records were processed using a single monitor and 647 records using a dual monitor. The time required to process a record was significantly higher (p = .071) with a single monitor compared to a dual monitor (19.83 vs. 18.73 minutes). However, the percentage of major cases was significantly higher (p = .000) in the single monitor group compared to the dual monitor group (78% vs. 69%). As a consequence, we adjusted our results, which reduced the difference in time required to process a record between the two systems from 1.1 to 0.61 minutes. Thus, the net real-time difference was only 37 seconds in favour of the dual monitor system. Extrapolated over a 5-year period, this would represent a time savings of 3.1% and generate a net cost savings of $7,729 CAD (Canadian dollars) for each workstation that devoted 35 hours per week to the processing of records. Finally, satisfaction questionnaire responses indicated a high level of satisfaction and support for the dual-monitor system. The implementation of a dual-monitor system in a hospital archiving department is an efficient option in the context of scarce human resources and has the…
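
The abstract's adjusted savings figures can be checked with simple arithmetic (the case-mix adjustment factor itself, 0.61 minutes, is taken from the study, not re-derived here):

```python
single_min = 19.83   # average minutes per record, single monitor
dual_min   = 18.73   # average minutes per record, dual monitor

raw_saving      = single_min - dual_min   # 1.10 min, before case-mix adjustment
adjusted_saving = 0.61                    # min, after adjustment (from the study)

seconds_saved = adjusted_saving * 60.0                 # ~37 s per record
relative_pct  = 100.0 * adjusted_saving / single_min   # ~3.1 %
print(round(seconds_saved), round(relative_pct, 1))    # 37 3.1
```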

  7. Year 2000 Computing Crisis: Significant Risks Remain to Department of Education's Student Financial Aid Systems. Testimony before the Subcommittee on Oversight and Investigations, Committee on Education and the Workforce, House of Representatives.

    ERIC Educational Resources Information Center

    Willemssen, Joel C.

    This testimony discusses the risks faced by the U.S. Department of Education due to the year 2000 (Y2K) computing crisis, focusing on student financial aid systems, the actions the Department has taken in recent months to address these risks, and the key issues the Department must deal with if its computer systems are to be ready for the century…

  8. Advanced display object selection methods for enhancing user-computer productivity

    NASA Technical Reports Server (NTRS)

    Osga, Glenn A.

    1993-01-01

    The User-Interface Technology Branch at NCCOSC RDT&E Division has been conducting a series of studies to address the suitability of commercial off-the-shelf (COTS) graphical user interface (GUI) methods for efficiency and performance in critical naval combat systems. This paper presents an advanced selection algorithm and method developed to increase user performance when making selections on tactical displays. The method has also been applied with considerable success to a variety of cursor and pointing tasks. Typical GUIs allow user selection by (1) moving a cursor with a pointing device such as a mouse, trackball, joystick, or touchscreen, and (2) placing the cursor on the object. Examples of GUI objects are the buttons, icons, folders, scroll bars, etc. used in many personal computer and workstation applications. This paper presents an improved method of selection and the theoretical basis for the significant performance gains achieved with the various input devices tested. The method is applicable to all GUI styles and display sizes, and is particularly useful for selections on small screens such as notebook computers. Considering the number of work-hours spent pointing and clicking across all styles of available graphical user interfaces, the cost/benefit of applying this method is substantial, with the potential for increasing productivity across thousands of users and applications.
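
The paper's specific selection algorithm is not reproduced in the abstract; a common variant of the same idea, selecting the nearest target within a capture radius rather than requiring the cursor to land exactly on the object, can be sketched as:

```python
import math

def select_target(cursor, targets, capture_radius):
    """Return the target whose centre is nearest the cursor, if it lies
    within capture_radius; None otherwise. A generic nearest-target scheme,
    not the specific algorithm developed in the paper."""
    best, best_d = None, float("inf")
    for name, (x, y) in targets.items():
        d = math.hypot(cursor[0] - x, cursor[1] - y)
        if d < best_d:
            best, best_d = name, d
    return best if best_d <= capture_radius else None

targets = {"ok": (100, 100), "cancel": (200, 100)}
print(select_target((110, 95), targets, capture_radius=30))   # ok
print(select_target((400, 400), targets, capture_radius=30))  # None
```

Schemes like this effectively enlarge small targets, which is why they help most on small screens.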

  9. Building highly available control system applications with Advanced Telecom Computing Architecture and open standards

    NASA Astrophysics Data System (ADS)

    Kazakov, Artem; Furukawa, Kazuro

    2010-11-01

    Requirements for modern and future control systems for large projects like the International Linear Collider demand high availability for control system components. Recently, the telecom industry introduced an open hardware specification, the Advanced Telecom Computing Architecture (ATCA), aimed at improved reliability, availability and serviceability. Since its first market appearance in 2004, the ATCA platform has shown tremendous growth and proved to be stable and well represented by a number of vendors. ATCA is an industry standard for highly available systems. Complementing it, the Service Availability Forum (SAF), a consortium of leading communications and computing companies, describes the interaction between hardware and software. SAF defines a set of specifications such as the Hardware Platform Interface and the Application Interface Specification, which provide an extensive description of highly available systems, services and their interfaces. Originally intended for telecom applications, these specifications can be used for accelerator controls software as well. This study describes the benefits of using these specifications and their possible adoption for accelerator control systems. It is demonstrated how the EPICS Redundant IOC was extended using the Hardware Platform Interface specification, which made it possible to utilize the benefits of the ATCA platform.
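
The failover behavior at the heart of a redundant IOC pair can be sketched with a heartbeat timeout; this illustrates the concept only and is not the actual Redundant IOC or SAF/HPI code:

```python
import time

class Standby:
    """Standby partner that promotes itself when the active node's
    heartbeat goes stale (invented sketch of active/standby failover)."""

    def __init__(self, timeout_s: float = 3.0):
        self.timeout = timeout_s
        self.last_beat = time.monotonic()
        self.active = False

    def heartbeat(self):
        # Called whenever a heartbeat arrives from the active partner.
        self.last_beat = time.monotonic()

    def poll(self, now=None) -> bool:
        # Periodic check: take over if the heartbeat is older than timeout.
        now = time.monotonic() if now is None else now
        if not self.active and now - self.last_beat > self.timeout:
            self.active = True  # partner presumed dead: promote to active
        return self.active

s = Standby(timeout_s=3.0)
s.heartbeat()
print(s.poll(now=s.last_beat + 1.0))  # False: partner still alive
print(s.poll(now=s.last_beat + 5.0))  # True: heartbeat stale, promote
```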

  10. Development of Experimental and Computational Aeroacoustic Tools for Advanced Liner Evaluation

    NASA Technical Reports Server (NTRS)

    Jones, Michael G.; Watson, Willie R.; Nark, Douglas N.; Parrott, Tony L.; Gerhold, Carl H.; Brown, Martha C.

    2006-01-01

    Acoustic liners in aircraft engine nacelles suppress radiated noise. Therefore, as air travel increases, increasingly sophisticated tools are needed to maximize noise suppression. During the last 30 years, NASA has invested significant effort in development of experimental and computational acoustic liner evaluation tools. The Curved Duct Test Rig is a 152-mm by 381-mm curved duct that supports liner evaluation at Mach numbers up to 0.3 and source SPLs up to 140 dB, in the presence of user-selected modes. The Grazing Flow Impedance Tube is a 51-mm by 63-mm duct currently being fabricated to operate at Mach numbers up to 0.6 with source SPLs up to at least 140 dB, and will replace the existing 51-mm by 51-mm duct. Together, these test rigs allow evaluation of advanced acoustic liners over a range of conditions representative of those observed in aircraft engine nacelles. Data acquired with these test ducts are processed using three aeroacoustic propagation codes. Two are based on finite element solutions to convected Helmholtz and linearized Euler equations. The third is based on a parabolic approximation to the convected Helmholtz equation. The current status of these computational tools and their associated usage with the Langley test rigs is provided.
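
Duct dimensions like those above determine which higher-order modes can propagate; the standard hard-wall cut-on estimate (not taken from the paper; a speed of sound of 343 m/s is assumed) is:

```python
SPEED_OF_SOUND = 343.0  # m/s, assumed ambient value

def first_cuton_hz(width_m: float) -> float:
    """Cut-on frequency of the first higher-order mode across one
    hard-walled rectangular duct dimension: f = c / (2 * width)."""
    return SPEED_OF_SOUND / (2.0 * width_m)

print(round(first_cuton_hz(0.051)))  # 3363 Hz for a 51-mm dimension
```

Below this frequency, only the plane wave propagates across that dimension.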

  11. Computational fluid dynamics in the design and analysis of thermal processes: a review of recent advances.

    PubMed

    Norton, Tomás; Tiwari, Brijesh; Sun, Da Wen

    2013-01-01

    The design of thermal processes in the food industry has undergone great developments in the last two decades due to the availability of cheap computer power alongside advanced modelling techniques such as computational fluid dynamics (CFD). CFD uses numerical algorithms to solve the non-linear partial differential equations of fluid mechanics and heat transfer so that the complex mechanisms that govern many food-processing systems can be resolved. In thermal processing applications, CFD can be used to build three-dimensional models that are both spatially and temporally representative of a physical system, producing solutions with high levels of physical realism without the heavy costs associated with experimental analyses. Therefore, CFD is playing an ever-growing role in the optimization of conventional thermal processes as well as in the development of new ones in the food industry. This paper discusses the fundamental aspects involved in developing CFD solutions and forms a state-of-the-art review of various CFD applications in conventional as well as novel thermal processes. The challenges facing CFD modellers of thermal processes are also discussed. From this review it is evident that present-day CFD software, with its rich tapestry of mathematical physics, numerical methods and visualization techniques, is currently recognized as a formidable and pervasive technology which can permit comprehensive analyses of thermal processing.
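
The kind of heat-transfer calculation CFD automates at scale can be illustrated with a minimal 1-D explicit finite-difference conduction solver; all values are illustrative, not taken from the review:

```python
alpha = 1.4e-7          # thermal diffusivity (m^2/s), illustrative food product
L, n = 0.02, 21         # 2-cm slab, 21 nodes
dx = L / (n - 1)
dt = 0.4 * dx ** 2 / alpha   # explicit scheme is stable for r <= 0.5
r = alpha * dt / dx ** 2     # = 0.4 by construction

T = [20.0] * n          # uniform initial temperature (deg C)
T[0] = T[-1] = 90.0     # heated boundaries

for _ in range(500):
    # Update interior nodes from the discrete heat equation.
    interior = [T[i] + r * (T[i + 1] - 2 * T[i] + T[i - 1])
                for i in range(1, n - 1)]
    T = [T[0]] + interior + [T[-1]]

print(f"centre temperature after 500 steps: {T[n // 2]:.1f} C")
```

Production CFD codes solve the same conservation laws in 3-D with convection, turbulence, and complex geometry.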

  12. Current advances in molecular, biochemical, and computational modeling analysis of microalgal triacylglycerol biosynthesis.

    PubMed

    Lenka, Sangram K; Carbonaro, Nicole; Park, Rudolph; Miller, Stephen M; Thorpe, Ian; Li, Yantao

    2016-01-01

    Triacylglycerols (TAGs) are highly reduced energy storage molecules ideal for biodiesel production. Microalgal TAG biosynthesis has been studied extensively in recent years, both at the molecular level and systems level through experimental studies and computational modeling. However, discussions of the strategies and products of the experimental and modeling approaches are rarely integrated and summarized together in a way that promotes collaboration among modelers and biologists in this field. In this review, we outline advances toward understanding the cellular and molecular factors regulating TAG biosynthesis in unicellular microalgae with an emphasis on recent studies on rate-limiting steps in fatty acid and TAG synthesis, while also highlighting new insights obtained from the integration of multi-omics datasets with mathematical models. Computational methodologies such as kinetic modeling, metabolic flux analysis, and new variants of flux balance analysis are explained in detail. We discuss how these methods have been used to simulate algae growth and lipid metabolism in response to changing culture conditions and how they have been used in conjunction with experimental validations. Since emerging evidence indicates that TAG synthesis in microalgae operates through coordinated crosstalk between multiple pathways in diverse subcellular destinations including the endoplasmic reticulum and plastids, we discuss new experimental studies and models that incorporate these findings for discovering key regulatory checkpoints. Finally, we describe tools for genetic manipulation of microalgae and their potential for future rational algal strain design. This comprehensive review explores the potential synergistic impact of pathway analysis, computational approaches, and molecular genetic manipulation strategies on improving TAG production in microalgae.
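
Flux balance analysis, one of the methods named above, optimizes fluxes subject to steady-state mass balance; a toy one-metabolite version (invented network, not a real algal model; genome-scale FBA solves the same structure as a linear program):

```python
# Toy flux balance: one internal metabolite M at steady state.
#   v_uptake : -> M          (bounded by uptake_max)
#   v_growth : M -> biomass  (must meet a minimum maintenance level)
#   v_tag    : M -> TAG      (objective to maximize)
# Steady state requires v_uptake = v_growth + v_tag, so the optimum
# saturates uptake and gives growth only its minimum.
uptake_max = 10.0   # mmol/gDW/h, assumed bound
growth_min = 2.0    # assumed maintenance requirement

v_uptake = uptake_max
v_growth = growth_min
v_tag = v_uptake - v_growth
print(v_tag)  # 8.0
```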

  13. Implementation of a clinical innovation: the case of advanced clinic access in the Department of Veterans Affairs.

    PubMed

    Lukas, Carol VanDeusen; Meterko, Mark M; Mohr, David; Seibert, Marjorie Nealon; Parlier, Renee; Levesque, Odette; Petzel, Robert A

    2008-01-01

    Healthcare organizations seeking to improve clinical practices often have disappointing results because the planned innovations are not successfully implemented. To increase the understanding of implementation, we analyzed the national spread of an ambulatory innovation in the Department of Veterans Affairs. This study provides support for a conceptual model that posits that the extent to which a clinical innovation is implemented will be affected by factors in 3 domains: (1) intentional activities to introduce, spread, and support the innovation; (2) the attitudes and capabilities of clinic staff responsible for implementing the innovation; and (3) the context of the facility in which the innovation is being introduced. Among the strongest predictors of successful implementation, management support for the innovation and clinic team knowledge and skills to make changes successfully were significant across both primary care and specialty clinics.

  14. Reducing access times for an endoscopy department by an iterative combination of computer simulation and linear programming.

    PubMed

    Joustra, P E; de Wit, J; Struben, V M D; Overbeek, B J H; Fockens, P; Elkhuizen, S G

    2010-03-01

    To reduce the access times of an endoscopy department, we developed an iterative combination of discrete-event simulation and integer linear programming. We developed the method in the Endoscopy Department of the Academic Medical Center in Amsterdam and compared different scenarios for reducing the department's access times. The results show that, through a more effective allocation of the current capacity, all procedure types will meet their corresponding performance targets, in contrast to the current situation. This improvement can be accomplished without requiring additional equipment and staff. Our recommendations are currently being implemented.
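
The integer-programming step in such a method assigns session capacity to procedure types; a toy version of its covering constraints, with invented numbers (the paper's actual data differ):

```python
import math

demand      = {"gastroscopy": 18, "colonoscopy": 24, "ERCP": 6}  # procedures/week
per_session = {"gastroscopy": 6,  "colonoscopy": 4,  "ERCP": 2}  # procedures/session
total_sessions = 13

# Minimum sessions per type needed to cover its demand.
need = {p: math.ceil(demand[p] / per_session[p]) for p in demand}
spare = total_sessions - sum(need.values())
assert spare >= 0, "current capacity cannot cover demand"
print(need, "spare sessions:", spare)
```

A full ILP would additionally optimize where the spare sessions go, and the simulation step then checks the resulting access times.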

  15. Advancing the Use of Evidence-Based Decision-Making in Local Health Departments With Systems Science Methodologies

    PubMed Central

    Li, Yan; Kong, Nan; Lawley, Mark; Weiss, Linda

    2015-01-01

    Objectives. We assessed how systems science methodologies might be used to bridge resource gaps at local health departments (LHDs) so that they might better implement evidence-based decision-making (EBDM) to address population health challenges. Methods. We used the New York Academy of Medicine Cardiovascular Health Simulation Model to evaluate the results of a hypothetical program that would reduce the proportion of people smoking, eating fewer than 5 fruits and vegetables per day, being physically active less than 150 minutes per week, and who had a body mass index (BMI) of 25 kg/m2 or greater. We used survey data from the Behavioral Risk Factor Surveillance System to evaluate health outcomes and validate simulation results. Results. Smoking rates and the proportion of the population with a BMI of 25 kg/m2 or greater would have decreased significantly with implementation of the hypothetical program (P < .001). Two areas would have experienced a statistically significant reduction in the local population with diabetes between 2007 and 2027 (P < .05). Conclusions. The use of systems science methodologies might be a novel and efficient way to systematically address a number of EBDM adoption barriers at LHDs. PMID:25689181
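
The BMI threshold used to define one of the simulated risk-factor groups is the standard weight-to-height ratio; a one-line check (the example individual is hypothetical):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

# Hypothetical individual: 80 kg, 1.75 m -> BMI about 26.1, i.e. >= 25 kg/m2.
print(bmi(80.0, 1.75) >= 25.0)  # True
```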

  16. Determination of Perceptions of the Teacher Candidates Studying in the Computer and Instructional Technology Department towards Human-Computer Interaction and Related Basic Concepts

    ERIC Educational Resources Information Center

    Kiyici, Mubin

    2011-01-01

    HCI is a field whose popularity is increasing with the spread of computers and the internet, and it contributes to the production of user-friendlier software and hardware through the work of scientists from different disciplines. Teacher candidates studying at the computer and instructional technologies department…

  17. Optimisation of the usage of LHC and local computing resources in a multidisciplinary physics department hosting a WLCG Tier-2 centre

    NASA Astrophysics Data System (ADS)

    Barberis, Stefano; Carminati, Leonardo; Leveraro, Franco; Mazza, Simone Michele; Perini, Laura; Perlz, Francesco; Rebatto, David; Tura, Ruggero; Vaccarossa, Luca; Villaplana, Miguel

    2015-12-01

    We present the approach of the University of Milan Physics Department and the local unit of INFN to allow and encourage the sharing among different research areas of computing, storage and networking resources (the largest ones being those composing the Milan WLCG Tier-2 centre and tailored to the needs of the ATLAS experiment). Computing resources are organised as independent HTCondor pools, with a global master in charge of monitoring them and optimising their usage. The configuration has to provide satisfactory throughput for both serial and parallel (multicore, MPI) jobs. A combination of local, remote and cloud storage options is available. The experience of users from different research areas operating on this shared infrastructure is discussed. The promising direction of improving scientific computing throughput by federating access to distributed computing and storage also seems to fit very well with the objectives listed in the European Horizon 2020 framework for research and development.

  18. Connectionist Models for Intelligent Computation.

    DTIC Science & Technology

    1988-08-31

    Department of Physics and Astronomy and Institute for Advanced Computer Studies, University of Maryland, College Park, MD 20742. ABSTRACT: ...distributed in the network. II. TRAINING OF THE NETWORK: Stereo vision is achieved by detecting the binocular disparity of the two images observed by... SUN, Y.C. LEE and H.H. CHEN, Department of Physics and Astronomy and Institute for Advanced Computer Studies

  19. [Implementation of the program of "Collaborative Development of Advanced Practical Education to Train Pharmacists in Leadership" under the joint operation of the pharmaceutical departments in fourteen national universities].

    PubMed

    Hirata, Kazumasa; Tamura, Satoru; Kobayashi, Motomasa

    2012-01-01

    "Collaborative Development of Advanced Practical Education Program to Train Pharmacists with Leadership", applied for jointly by the pharmaceutical departments of fourteen national universities, was selected to receive special expenditure support from the Ministry of Education, Culture, Sports, Science and Technology for fiscal year 2010 under "the Training of Highly Skillful Professionals and Improvement of the Quality of the Function of Professional Education". The project promotes the collaborative development of an educational program that will further advance and substantiate the education of pharmacists in the six-year pharmaceutical course. Its ultimate purpose is to produce pharmacists who can take leadership roles across a wide range of responsibilities in a society that increasingly expects pharmacy to act against health hazards caused by infections, foods, and environmental pollution, and to meet the diversification of healthcare. Specifically, the project repeatedly trials and evaluates the following programs based on the plan-do-check-act (PDCA) cycle: 1) a practical medical and pharmaceutical education program; 2) a program of research on long-term themes and advanced education; 3) a program for training and educating SPs (standardized patients or simulated patients) and for PBL (problem-based learning) tutorial education; and 4) a program on methods of educational evaluation. Through this repeated trial and evaluation, the project ultimately seeks to construct a highly effective practical educational program that integrates each university's achievements and original educational initiatives.

  20. Computer Security Awareness Guide for Department of Energy Laboratories, Government Agencies, and others for use with Lawrence Livermore National Laboratory's (LLNL): Computer security short subjects videos

    SciTech Connect

    Not Available

    1993-12-31

    Lonnie Moore, the Computer Security Manager, CSSM/CPPM at Lawrence Livermore National Laboratory (LLNL) and Gale Warshawsky, the Coordinator for Computer Security Education & Awareness at LLNL, wanted to share topics such as computer ethics, software piracy, privacy issues, and protecting information in a format that would capture and hold an audience's attention. Four Computer Security Short Subject videos were produced which ranged from 1-3 minutes each. These videos are very effective education and awareness tools that can be used to generate discussions about computer security concerns and good computing practices. Leaders may incorporate the Short Subjects into presentations. After talking about a subject area, one of the Short Subjects may be shown to highlight that subject matter. Another method for sharing them could be to show a Short Subject first and then lead a discussion about its topic. The cast of characters and a bit of information about their personalities in the LLNL Computer Security Short Subjects is included in this report.

  1. The NASA/Baltimore Applications Project (BAP). Computer aided dispatch and communications system for the Baltimore Fire Department: A case study of urban technology application

    NASA Technical Reports Server (NTRS)

    Levine, A. L.

    1981-01-01

    An engineer and a computer expert from Goddard Space Flight Center were assigned to provide technical assistance in the design and installation of a computer assisted system for dispatching and communicating with fire department personnel and equipment in Baltimore City. Primary contributions were in decision making and management processes. The project is analyzed from four perspectives: (1) fire service; (2) technology transfer; (3) public administration; and (4) innovation. The city benefited substantially from the approach and competence of the NASA personnel. Given the proper conditions, there are distinct advantages in having a nearby Federal laboratory provide assistance to a city on a continuing basis, as is done in the Baltimore Applications Project.

  2. TRANS_MU computer code for computation of transmutant formation kinetics in advanced structural materials for fusion reactors

    NASA Astrophysics Data System (ADS)

    Markina, Natalya V.; Shimansky, Gregory A.

    A method of controlling systematic error in transmutation computations is described for a class of problems in which each nuclear transformation channel has exactly one parent and one residual nucleus. A discrete-logical algorithm is stated that reduces the matrix of the differential equation system to block-triangular form. A computing procedure is developed that determines a strict bound on the computational error of each computed value for this class of transmutation problems, subject to some additional restrictions on the complexity of the nuclide transformation scheme. The computer code implementing this procedure, TRANS_MU, has a number of advantages over analogous approaches. Besides the quantitative control of systematic and computational errors, an important feature of TRANS_MU is the calculation of the contribution of each considered reaction to transmutant accumulation and gas production. The application of the TRANS_MU computer code is illustrated with copper alloys, both in planning irradiation experiments on fusion reactor material specimens in fission reactors and in processing the experimental results.
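
    For the restricted class the abstract describes (one parent and one residual nucleus per channel), the block-triangular system reduces to a linear transformation chain whose classical closed form is the Bateman solution, solvable nuclide by nuclide. The sketch below illustrates that triangular structure in plain Python; it is an illustration only, not the TRANS_MU algorithm, and it assumes all transformation constants are distinct (repeated values would make the denominators vanish).

```python
import math

def bateman_chain(n0, lambdas, t):
    """Atoms of each nuclide in a linear chain N1 -> N2 -> ... -> Nk
    (Bateman solution), starting from n0 atoms of N1 at t = 0.
    The triangular structure of one-parent/one-daughter chains means
    each nuclide depends only on those earlier in the chain."""
    k = len(lambdas)
    result = []
    for i in range(k):                      # 0-based nuclide index
        lam = lambdas[: i + 1]
        total = 0.0
        for j in range(i + 1):
            denom = 1.0                     # prod over p != j of (lam_p - lam_j)
            for p in range(i + 1):
                if p != j:
                    denom *= lam[p] - lam[j]
            total += math.exp(-lam[j] * t) / denom
        coeff = n0
        for p in range(i):                  # product of upstream rates
            coeff *= lam[p]
        result.append(coeff * total)
    return result
```

    For a two-member chain this reproduces the familiar result N2(t) = n0 * lam1 * (exp(-lam1 t) - exp(-lam2 t)) / (lam2 - lam1).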

  3. Utilizing Computer and Multimedia Technology in Generating Choreography for the Advanced Dance Student at the High School Level.

    ERIC Educational Resources Information Center

    Griffin, Irma Amado

    This study describes a pilot program utilizing various multimedia computer programs on a MacQuadra 840 AV. The target group consisted of six advanced dance students who participated in the pilot program within the dance curriculum by creating a database of dance movement using video and still photography. The students combined desktop publishing,…

  4. Using an Advanced Computational Laboratory Experiment to Extend and Deepen Physical Chemistry Students' Understanding of Atomic Structure

    ERIC Educational Resources Information Center

    Hoffman, Gary G.

    2015-01-01

    A computational laboratory experiment is described, which involves the advanced study of an atomic system. The students use concepts and techniques typically covered in a physical chemistry course but extend those concepts and techniques to more complex situations. The students get a chance to explore the study of atomic states and perform…

  5. Computer experiments on periodic systems identification using rotor blade transient flapping-torsion responses at high advance ratio

    NASA Technical Reports Server (NTRS)

    Hohenemser, K. H.; Prelewicz, D. A.

    1974-01-01

    Systems identification methods have recently been applied to rotorcraft to estimate stability derivatives from transient flight control response data. While these applications assumed a linear constant-coefficient representation of the rotorcraft, the computer experiments described in this paper used transient responses in flap-bending and torsion of a rotor blade at high advance ratio, a condition in which the blade constitutes a rapidly time-varying periodic system.

  6. Evaluation of Computed Tomography of Mock Uranium Fuel Rods at the Advanced Photon Source

    DOE PAGES

    Hunter, James F.; Brown, Donald William; Okuniewski, Maria

    2015-06-01

    This study discusses a multi-year effort to evaluate the utility of computed tomography at the Advanced Photon Source (APS) as a tool for non-destructive evaluation of uranium-based fuel rods. The majority of the data presented is on mock material made with depleted uranium, which mimics the x-ray attenuation characteristics of fuel rods while allowing for simpler handling. A range of data is presented, including full-thickness (5 mm diameter) fuel rodlets, reduced-thickness (1.8 mm) sintering test samples, and pre/post-irradiation samples (< 1 mm thick). These data were taken on both a white-beam (bending magnet) beamline and a high-energy, monochromatic beamline. The data show the utility of a synchrotron-type source in the evaluation of manufacturing defects (pre-irradiation) and lay out the case for in situ CT of fuel pellet sintering. Finally, data are shown from small post-irradiation samples and a case is made for post-irradiation CT of larger samples.

  7. Effect of surgical mandibular advancement on pharyngeal airway dimensions: a three-dimensional computed tomography study.

    PubMed

    Kochar, G D; Chakranarayan, A; Kohli, S; Kohli, V S; Khanna, V; Jayan, B; Chopra, S S; Verma, M

    2016-05-01

    The aim of this study was to quantify the changes in pharyngeal airway space (PAS) in patients with a skeletal class II malocclusion managed by bilateral sagittal split ramus osteotomy for mandibular advancement, using three-dimensional (3D) registration. The sample comprised 16 patients (mean age 21.69±2.80 years). Preoperative (T0) and postoperative (T1) computed tomography scans were recorded. Linear, cross-sectional area (CSA), and volumetric parameters of the velopharynx, oropharynx, and hypopharynx were evaluated. Parameters were compared with paired samples t-tests. Highly significant changes in dimension were measured in both sagittal and transverse planes (P<0.001). CSA measurements increased significantly between T0 and T1 (P<0.001). A significant increase in PAS volume was found at T1 compared with T0 (P<0.001). The changes in PAS were quantified using 3D reconstruction. Along the sagittal and transverse planes, the greatest increase was seen in the oropharynx (12.16% and 11.50%, respectively), followed by hypopharynx (11.00% and 9.07%) and velopharynx (8.97% and 6.73%). CSA increased by 41.69%, 34.56%, and 28.81% in the oropharynx, hypopharynx, and velopharynx, respectively. The volumetric increase was greatest in the oropharynx (49.79%) and least in the velopharynx (38.92%). These established quantifications may act as a useful guide for clinicians in the field of dental sleep medicine.

  8. Recent advances in computational fluid dynamics relevant to the modelling of pesticide flow on leaf surfaces.

    PubMed

    Glass, C Richard; Walters, Keith F A; Gaskell, Philip H; Lee, Yeaw C; Thompson, Harvey M; Emerson, David R; Gu, Xiao-Jun

    2010-01-01

    Increasing societal and governmental concern about the worldwide use of chemical pesticides is now providing strong drivers towards maximising the efficiency of pesticide utilisation and the development of alternative control techniques. There is growing recognition that the ultimate goal of achieving efficient and sustainable pesticide usage will require greater understanding of the fluid mechanical mechanisms governing the delivery to, and spreading of, pesticide droplets on target surfaces such as leaves. This has led to increasing use of computational fluid dynamics (CFD) as an important component of efficient process design with regard to pesticide delivery to the leaf surface. This perspective highlights recent advances in CFD methods for droplet spreading and film flows, which have the potential to provide accurate, predictive models for pesticide flow on leaf surfaces, and which can take account of each of the key influences of surface topography and chemistry, initial spray deposition conditions, evaporation and multiple droplet spreading interactions. The mathematical framework of these CFD methods is described briefly, and a series of new flow simulation results relevant to pesticide flows over foliage is provided. The potential benefits of employing CFD for practical process design are also discussed briefly.

  9. Evaluation of Computed Tomography of Mock Uranium Fuel Rods at the Advanced Photon Source

    SciTech Connect

    Hunter, James F.; Brown, Donald William; Okuniewski, Maria

    2015-06-01

    This study discusses a multi-year effort to evaluate the utility of computed tomography at the Advanced Photon Source (APS) as a tool for non-destructive evaluation of uranium-based fuel rods. The majority of the data presented is on mock material made with depleted uranium, which mimics the x-ray attenuation characteristics of fuel rods while allowing for simpler handling. A range of data is presented, including full-thickness (5 mm diameter) fuel rodlets, reduced-thickness (1.8 mm) sintering test samples, and pre/post-irradiation samples (< 1 mm thick). These data were taken on both a white-beam (bending magnet) beamline and a high-energy, monochromatic beamline. The data show the utility of a synchrotron-type source in the evaluation of manufacturing defects (pre-irradiation) and lay out the case for in situ CT of fuel pellet sintering. Finally, data are shown from small post-irradiation samples and a case is made for post-irradiation CT of larger samples.

  10. Advanced flight computing technologies for validation by NASA's new millennium program

    NASA Astrophysics Data System (ADS)

    Alkalai, Leon

    1996-11-01

    The New Millennium Program (NMP) consists of a series of Deep-Space and Earth Orbiting missions that are technology-driven, in contrast to the more traditional science-driven space exploration missions of the past. These flights are designed to validate technologies that will enable a new era of low-cost, highly miniaturized, and highly capable spaceborne applications in the new millennium. In addition to the series of flight projects managed by separate flight teams, the NMP technology initiatives are managed by the following six focused technology programs: Microelectronics Systems, Autonomy, Telecommunications, Instrument Technologies and Architectures, In-Situ Instruments and Micro-electromechanical Systems, and Modular and Multifunctional Systems. Each technology program is managed as an Integrated Product Development Team (IPDT) of government, academic, and industry partners. In this paper, we will describe elements of the technology roadmap proposed by the NMP Microelectronics IPDT. Moreover, we will relate the proposed technology roadmap to existing NASA technology development programs, such as the Advanced Flight Computing (AFC) program and the Remote Exploration and Experimentation (REE) program, which constitute part of the on-going NASA technology development pipeline. We will also describe the Microelectronics Systems technologies that have been accepted as part of the first New Millennium Deep-Space One spacecraft, an asteroid fly-by mission scheduled for launch in July 1998.

  11. IMPROVED COMPUTATIONAL NEUTRONICS METHODS AND VALIDATION PROTOCOLS FOR THE ADVANCED TEST REACTOR

    SciTech Connect

    David W. Nigg; Joseph W. Nielsen; Benjamin M. Chase; Ronnie K. Murray; Kevin A. Steuhm

    2012-04-01

    The Idaho National Laboratory (INL) is in the process of modernizing the various reactor physics modeling and simulation tools used to support operation and safety assurance of the Advanced Test Reactor (ATR). Key accomplishments so far have encompassed both computational as well as experimental work. A new suite of stochastic and deterministic transport theory based reactor physics codes and their supporting nuclear data libraries (HELIOS, KENO6/SCALE, NEWT/SCALE, ATTILA, and an extended implementation of MCNP5) has been installed at the INL. Corresponding models of the ATR and ATRC are now operational with all five codes, demonstrating the basic feasibility of the new code packages for their intended purpose. Of particular importance, a set of as-run core depletion HELIOS calculations for all ATR cycles since August 2009 was successfully completed during 2011. This demonstration supported a decision late in the year to proceed with the phased incorporation of the HELIOS methodology into the ATR fuel cycle management process beginning in 2012. On the experimental side of the project, new hardware was fabricated, measurement protocols were finalized, and the first four of six planned physics code validation experiments based on neutron activation spectrometry were conducted at the ATRC facility. Data analysis for the first three experiments, focused on characterization of the neutron spectrum in one of the ATR flux traps, has been completed. The six experiments will ultimately form the basis for a flexible, easily-repeatable ATR physics code validation protocol that is consistent with applicable ASTM standards.

  12. Advanced imaging findings and computer-assisted surgery of suspected synovial chondromatosis in the temporomandibular joint.

    PubMed

    Hohlweg-Majert, Bettina; Metzger, Marc C; Böhm, Joachim; Muecke, Thomas; Schulze, Dirk

    2008-11-01

    Synovial chondromatosis of the joint occurs mainly in teenagers and young adults. Only 3% of these neoplasms are located in the head and neck region; synovial chondromatosis of the temporomandibular joint is therefore a very rare disorder, and histological confirmation is required for the differential diagnosis. In this case series, the outcomes of histological investigation and imaging techniques are compared. Based on clinical symptoms, five cases of suspected synovial chondromatosis of the temporomandibular joint are presented. In each of the subjects, the diagnosis was confirmed by histology, and the specific imaging features for each case are described. The tomography images were compared with the histological findings. All patients demonstrated preauricular swelling, dental midline deviation, and limited mouth opening. Computer-assisted surgery was performed. Histology disclosed synovial chondromatosis of the temporomandibular joint in four cases; the other case was found to be a developmental disorder of the tympanic bone. The diagnosis of synovial chondromatosis of the temporomandibular joint can only be based on histology: clinical symptoms are too general, and the available imaging techniques show only nonspecific tumorous destruction, infiltration, and/or residual calcified bodies, and then only in advanced cases. A rare developmental disorder of the tympanic bone, persistence of the foramen of Huschke, has to be differentiated.

  13. Advances in automated deception detection in text-based computer-mediated communication

    NASA Astrophysics Data System (ADS)

    Adkins, Mark; Twitchell, Douglas P.; Burgoon, Judee K.; Nunamaker, Jay F., Jr.

    2004-08-01

    The Internet has provided criminals, terrorists, spies, and other threats to national security a means of communication. At the same time it also provides for the possibility of detecting and tracking their deceptive communication. Recent advances in natural language processing, machine learning and deception research have created an environment where automated and semi-automated deception detection of text-based computer-mediated communication (CMC, e.g. email, chat, instant messaging) is a reachable goal. This paper reviews two methods for discriminating between deceptive and non-deceptive messages in CMC. First, Document Feature Mining uses document features or cues in CMC messages combined with machine learning techniques to classify messages according to their deceptive potential. The method, which is most useful in asynchronous applications, also allows for the visualization of potential deception cues in CMC messages. Second, Speech Act Profiling, a method for quantifying and visualizing synchronous CMC, has shown promise in aiding deception detection. The methods may be combined and are intended to be a part of a suite of tools for automating deception detection.
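
    As a toy illustration of the Document Feature Mining idea, per-message linguistic cues can be extracted and then fed to a machine-learning classifier trained on labeled messages. The cue lexicons and feature names below are hypothetical placeholders, not the cue sets validated by the authors; real systems mine many more features than these.

```python
import re

# Hypothetical cue lexicons for illustration only.
FIRST_PERSON = {"i", "me", "my", "mine", "myself"}
NEGATIONS = {"no", "not", "never", "nothing", "none"}

def cue_features(message):
    """Mine a few simple document features (cues) from a CMC message.
    Deception research suggests, for example, that deceptive senders
    tend to use fewer self-references; a classifier would be trained
    on many such features rather than on any single one."""
    words = re.findall(r"[a-z']+", message.lower())
    n = max(len(words), 1)                  # guard against empty messages
    return {
        "word_count": len(words),
        "self_reference_rate": sum(w in FIRST_PERSON for w in words) / n,
        "negation_rate": sum(w in NEGATIONS for w in words) / n,
    }
```

    Each message thus becomes a fixed-length numeric vector, which is the form machine-learning classifiers and cue-visualization tools operate on.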

  14. Advanced practice registered nurse usability testing of a tailored computer-mediated health communication program.

    PubMed

    Lin, Carolyn A; Neafsey, Patricia J; Anderson, Elizabeth

    2010-01-01

    This study tested the usability of a touch-screen-enabled Personal Education Program with advanced practice RNs. The Personal Education Program is designed to enhance medication adherence and reduce adverse self-medication behaviors in older adults with hypertension. An iterative research process was used, which involved the use of (1) pretrial focus groups to guide the design of system information architecture, (2) two different cycles of think-aloud trials to test the software interface, and (3) post-trial focus groups to gather feedback on the think-aloud studies. Results from this iterative usability-testing process were used to systematically modify and improve the three Personal Education Program prototype versions-the pilot, prototype 1, and prototype 2. Findings contrasting the two separate think-aloud trials showed that APRN users rated the Personal Education Program system usability, system information, and system-use satisfaction at a moderately high level between trials. In addition, errors using the interface were reduced by 76%, and the interface time was reduced by 18.5% between the two trials. The usability-testing processes used in this study ensured an interface design adapted to APRNs' needs and preferences to allow them to effectively use the computer-mediated health-communication technology in a clinical setting.

  15. Study of flutter related computational procedures for minimum weight structural sizing of advanced aircraft

    NASA Technical Reports Server (NTRS)

    Oconnell, R. F.; Hassig, H. J.; Radovcich, N. A.

    1976-01-01

    Results of a study of the development of flutter modules applicable to automated structural design of advanced aircraft configurations, such as a supersonic transport, are presented. Automated structural design is restricted to automated sizing of the elements of a given structural model. It includes a flutter optimization procedure; i.e., a procedure for arriving at a structure with minimum mass for satisfying flutter constraints. Methods of solving the flutter equation and computing the generalized aerodynamic force coefficients in the repetitive analysis environment of a flutter optimization procedure are studied, and recommended approaches are presented. Five approaches to flutter optimization are explained in detail and compared. An approach to flutter optimization incorporating some of the methods discussed is presented. Problems related to flutter optimization in a realistic design environment are discussed and an integrated approach to the entire flutter task is presented. Recommendations for further investigations are made. Results of numerical evaluations, applying the five methods of flutter optimization to the same design task, are presented.

  16. Observations on computational methodologies for use in large-scale, gradient-based, multidisciplinary design incorporating advanced CFD codes

    NASA Technical Reports Server (NTRS)

    Newman, P. A.; Hou, G. J.-W.; Jones, H. E.; Taylor, A. C., III; Korivi, V. M.

    1992-01-01

    How a combination of various computational methodologies could reduce the enormous computational costs envisioned in using advanced CFD codes in gradient-based optimized multidisciplinary design (MdD) procedures is briefly outlined. Implications of these MdD requirements upon advanced CFD codes are somewhat different from those imposed by a single-discipline design. A means for satisfying these MdD requirements for gradient information is presented which appears to permit: (1) some leeway in the CFD solution algorithms which can be used; (2) an extension to 3-D problems; and (3) straightforward use of other computational methodologies. Many of these observations have previously been discussed as possibilities for doing parts of the problem more efficiently; the contribution here is observing how they fit together in a mutually beneficial way.
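
    One widely used route to gradient information at a cost independent of the number of design variables is the adjoint method. The sketch below is an illustration of that idea only, not the paper's procedure: for a toy 2x2 linear "flow" model A u = b(p) with output J = c^T u, a single adjoint solve of A^T lam = c yields the sensitivity dJ/dp_i = lam^T (db/dp_i) for every parameter, replacing one flow solve per parameter.

```python
def solve2(A, b):
    """Solve a 2x2 linear system A x = b by Cramer's rule."""
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [(b[0] * A[1][1] - b[1] * A[0][1]) / det,
            (A[0][0] * b[1] - A[1][0] * b[0]) / det]

def adjoint_gradient(A, dbdp_list, c):
    """Gradient of J = c^T u, where A u = b(p), via the adjoint method:
    one solve of A^T lam = c gives dJ/dp_i = lam^T (db/dp_i) for every
    parameter p_i, however many there are."""
    At = [[A[0][0], A[1][0]], [A[0][1], A[1][1]]]   # transpose of A
    lam = solve2(At, c)
    return [lam[0] * db[0] + lam[1] * db[1] for db in dbdp_list]
```

    In a realistic MdD setting the 2x2 solve is replaced by the CFD code's own (costly) linear solver, which is exactly why reusing one adjoint solve across all design variables matters.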

  17. Modeling, Simulation and Analysis of Complex Networked Systems: A Program Plan for DOE Office of Advanced Scientific Computing Research

    SciTech Connect

    Brown, D L

    2009-05-01

    Many complex systems of importance to the U.S. Department of Energy consist of networks of discrete components. Examples are cyber networks, such as the internet and local area networks over which nearly all DOE scientific, technical and administrative data must travel, the electric power grid, social networks whose behavior can drive energy demand, and biological networks such as genetic regulatory networks and metabolic networks. In spite of the importance of these complex networked systems to all aspects of DOE's operations, the scientific basis for understanding these systems lags seriously behind the strong foundations that exist for the 'physically-based' systems usually associated with DOE research programs that focus on such areas as climate modeling, fusion energy, high-energy and nuclear physics, nano-science, combustion, and astrophysics. DOE has a clear opportunity to develop a similarly strong scientific basis for understanding the structure and dynamics of networked systems by supporting a strong basic research program in this area. Such knowledge will provide a broad basis for, e.g., understanding and quantifying the efficacy of new security approaches for computer networks, improving the design of computer or communication networks to be more robust against failures or attacks, detecting potential catastrophic failure on the power grid and preventing or mitigating its effects, understanding how populations will respond to the availability of new energy sources or changes in energy policy, and detecting subtle vulnerabilities in large software systems to intentional attack. This white paper outlines plans for an aggressive new research program designed to accelerate the advancement of the scientific basis for complex networked systems of importance to the DOE. It will focus principally on four research areas: (1) understanding network structure, (2) understanding network dynamics, (3) predictive modeling and simulation for complex networked systems
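
    As a minimal sketch of one quantity such a research program would study (an illustration only, not part of the program plan), the size of the largest connected component remaining after node failures is a standard proxy for a network's robustness to random failures or targeted attacks.

```python
from collections import deque

def largest_component(adj, removed=frozenset()):
    """Size of the largest connected component once the nodes in
    `removed` are deleted from the undirected graph given as an
    adjacency dict {node: [neighbors]}. Uses breadth-first search."""
    seen = set(removed)
    best = 0
    for start in adj:
        if start in seen:
            continue
        queue, size = deque([start]), 0
        seen.add(start)
        while queue:
            u = queue.popleft()
            size += 1
            for v in adj[u]:
                if v not in seen:
                    seen.add(v)
                    queue.append(v)
        best = max(best, size)
    return best
```

    On a star network, for example, removing the hub shatters the graph into isolated nodes while removing any leaf barely matters, the kind of structural asymmetry such robustness metrics are designed to expose.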

  18. The North Carolina Field Test: Field Performance of the Preliminary Version of an Advanced Weatherization Audit for the Department of Energy's Weatherization Assistance Program

    SciTech Connect

    Sharp, T.R.

    1994-01-01

    The field performance of weatherizations based on a newly-developed advanced technique for selecting residential energy conservation measures was tested alongside current Retro-Tech-based weatherizations in North Carolina. The new technique is computer-based and determines measures based on the needs of an individual house. In addition, it recommends only those measures that it determines will have a benefit-to-cost ratio greater than 1 for the house being evaluated. The new technique also considers the interaction of measures in computing the benefit-to-cost ratio of each measure. The two weatherization approaches were compared based on implementation ease, measures installed, labor and cost requirements, and both heating and cooling energy savings achieved. One-hundred and twenty houses with the following characteristics participated: the occupants were low-income, eligible for North Carolina's current weatherization program, and responsible for their own fuel and electric bills. Houses were detached single-family dwellings, not mobile homes; were heated by kerosene, fuel oil, natural gas, or propane; and had one or two operating window air conditioners. Houses were divided equally into one control group and two weatherization groups. Weekly space heating and cooling energy use, and hourly indoor and outdoor temperatures were monitored between November 1989 and September 1990 (pre-period) and between December 1990 and August 1991 (post-period). House consumption models were used to normalize for annual weather differences and a 68 F indoor temperature. Control group savings were used to adjust the savings determined for the weatherization groups. The two weatherization approaches involved installing attic and floor insulations in near equivalent quantities, and installing storm windows and wall insulation in drastically different quantities. Substantial differences also were found in average air leakage reductions for the two weatherization groups. Average

  19. Recent Advances and Issues in Computers. Oryx Frontiers of Science Series.

    ERIC Educational Resources Information Center

    Gay, Martin K.

    Discussing recent issues in computer science, this book contains 11 chapters covering: (1) developments that have the potential for changing the way computers operate, including microprocessors, mass storage systems, and computing environments; (2) the national computational grid for high-bandwidth, high-speed collaboration among scientists, and…

  20. ‘I'll be in a safe place’: a qualitative study of the decisions taken by people with advanced cancer to seek emergency department care

    PubMed Central

    Henson, Lesley A; Higginson, Irene J; Daveson, Barbara A; Ellis-Smith, Clare; Koffman, Jonathan; Morgan, Myfanwy; Gao, Wei

    2016-01-01

    Objective To explore the decisions of people with advanced cancer and their caregivers to seek emergency department (ED) care, and understand the issues that influence the decision-making process. Design Cross-sectional qualitative study incorporating semistructured patient and caregiver interviews. Methods Between December 2014 and July 2015, semistructured interviews were conducted with 18 people with advanced cancer, all of whom had recently attended the ED of a large university teaching hospital located in south-east London; and six of their caregivers. Interviews were audio recorded, transcribed verbatim and analysed using a constant comparative approach. Padgett and Brodsky's modified version of the ‘Behavioral Model of Health Services Use’ was used as a framework to guide the study. Results Issues influencing the decision-making process included: (1) disease-related anxiety—those with greater anxiety related to their cancer diagnosis interpreted their symptoms as more severe and/or requiring immediate attention; (2) prior patterns of health-seeking behaviour—at times of crisis participants defaulted to previously used services; (3) feelings of safety and familiarity with the hospital setting—many felt reassured by the presence of healthcare professionals and monitoring of their condition; and, (4) difficulties accessing community healthcare services—especially urgently and/or out-of-hours. Conclusions These data provide healthcare professionals and policymakers with a greater understanding of how systems of care may be developed to help reduce ED visits by people with advanced cancer. In particular, our findings suggest that the number of ED visits could be reduced with greater end-of-life symptom support and education, earlier collaboration between oncology and palliative care, and with increased access to community healthcare services. PMID:27807085

  1. ENHANCING THE ATOMIC-LEVEL UNDERSTANDING OF CO2 MINERAL SEQUESTRATION MECHANISMS VIA ADVANCED COMPUTATIONAL MODELING

    SciTech Connect

    A.V.G. Chizmeshya; M.J. McKelvy; G.H. Wolf; R.W. Carpenter; D.A. Gormley; J.R. Diefenbacher; R. Marzke

    2006-03-01

    significantly improved our understanding of mineral carbonation. Group members at the Albany Research Center have recently shown that carbonation of olivine and serpentine, which naturally occurs over geological time (i.e., 100,000s of years), can be accelerated to near completion in hours. Further process refinement will require a synergetic science/engineering approach that emphasizes simultaneous investigation of both thermodynamic processes and the detailed microscopic, atomic-level mechanisms that govern carbonation kinetics. Our previously funded Phase I Innovative Concepts project demonstrated the value of advanced quantum-mechanical modeling as a complementary tool in bridging important gaps in our understanding of the atomic/molecular structure and reaction mechanisms that govern CO2 mineral sequestration reaction processes for the model Mg-rich lamellar hydroxide feedstock material Mg(OH)2. In the present simulation project, improved techniques and more efficient computational schemes have allowed us to expand and augment these capabilities and explore more complex Mg-rich, lamellar hydroxide-based feedstock materials, including the serpentine-based minerals. These feedstock materials are being actively investigated due to their wide availability and low-cost CO2 mineral sequestration potential. The cutting-edge first-principles quantum chemical, computational solid-state and materials simulation methodology studies proposed herein have been strategically integrated with our new DOE-supported (ASU-Argonne National Laboratory) project to investigate the mechanisms that govern mineral feedstock heat-treatment and aqueous/fluid-phase serpentine mineral carbonation in situ. This unified, synergetic theoretical and experimental approach has provided a deeper understanding of the key reaction mechanisms than either individual approach can alone. We used ab initio techniques to significantly advance our understanding of atomic-level processes at the solid/solution interface by

  2. ENHANCING THE ATOMIC-LEVEL UNDERSTANDING OF CO2 MINERAL SEQUESTRATION MECHANISMS VIA ADVANCED COMPUTATIONAL MODELING

    SciTech Connect

    A.V.G. Chizmeshya

    2003-12-19

    /NETL managed National Mineral Sequestration Working Group we have already significantly improved our understanding of mineral carbonation. Group members at the Albany Research Center have recently shown that carbonation of olivine and serpentine, which naturally occurs over geological time (i.e., 100,000s of years), can be accelerated to near completion in hours. Further process refinement will require a synergetic science/engineering approach that emphasizes simultaneous investigation of both thermodynamic processes and the detailed microscopic, atomic-level mechanisms that govern carbonation kinetics. Our previously funded Phase I Innovative Concepts project demonstrated the value of advanced quantum-mechanical modeling as a complementary tool in bridging important gaps in our understanding of the atomic/molecular structure and reaction mechanisms that govern CO2 mineral sequestration reaction processes for the model Mg-rich lamellar hydroxide feedstock material Mg(OH)2. In the present simulation project, improved techniques and more efficient computational schemes have allowed us to expand and augment these capabilities and explore more complex Mg-rich, lamellar hydroxide-based feedstock materials, including the serpentine-based minerals. These feedstock materials are being actively investigated due to their wide availability and low-cost CO2 mineral sequestration potential. The cutting-edge first-principles quantum chemical, computational solid-state and materials simulation methodology studies proposed herein have been strategically integrated with our new DOE-supported (ASU-Argonne National Laboratory) project to investigate the mechanisms that govern mineral feedstock heat-treatment and aqueous/fluid-phase serpentine mineral carbonation in situ. This unified, synergetic theoretical and experimental approach will provide a deeper understanding of the key reaction mechanisms than either individual approach can alone. Ab initio techniques will also

  3. ENHANCING THE ATOMIC-LEVEL UNDERSTANDING OF CO2 MINERAL SEQUESTRATION MECHANISMS VIA ADVANCED COMPUTATIONAL MODELING

    SciTech Connect

    A.V.G. Chizmeshya

    2002-12-19

    /NETL managed National Mineral Sequestration Working Group we have already significantly improved our understanding of mineral carbonation. Group members at the Albany Research Center have recently shown that carbonation of olivine and serpentine, which naturally occurs over geological time (i.e., 100,000s of years), can be accelerated to near completion in hours. Further process refinement will require a synergetic science/engineering approach that emphasizes simultaneous investigation of both thermodynamic processes and the detailed microscopic, atomic-level mechanisms that govern carbonation kinetics. Our previously funded Phase I Innovative Concepts project demonstrated the value of advanced quantum-mechanical modeling as a complementary tool in bridging important gaps in our understanding of the atomic/molecular structure and reaction mechanisms that govern CO2 mineral sequestration reaction processes for the model Mg-rich lamellar hydroxide feedstock material Mg(OH)2. In the present simulation project, improved techniques and more efficient computational schemes have allowed us to expand and augment these capabilities and explore more complex Mg-rich, lamellar hydroxide-based feedstock materials, including the serpentine-based minerals. These feedstock materials are being actively investigated due to their wide availability and low-cost CO2 mineral sequestration potential. The cutting-edge first-principles quantum chemical, computational solid-state and materials simulation methodology studies proposed herein have been strategically integrated with our new DOE-supported (ASU-Argonne National Laboratory) project to investigate the mechanisms that govern mineral feedstock heat-treatment and aqueous/fluid-phase serpentine mineral carbonation in situ. This unified, synergetic theoretical and experimental approach will provide a deeper understanding of the key reaction mechanisms than either individual approach can alone. Ab initio techniques will also

  4. Integrated Graphics Operations and Analysis Lab Development of Advanced Computer Graphics Algorithms

    NASA Technical Reports Server (NTRS)

    Wheaton, Ira M.

    2011-01-01

    The focus of this project is to aid the IGOAL in researching and implementing algorithms for advanced computer graphics. First, this project focused on porting the current International Space Station (ISS) Xbox experience to the web. Previously, the ISS interior fly-around education and outreach experience ran only on an Xbox 360. One of the desires was to take this experience and make it into something that can be put on NASA's educational site for anyone to access. The current code works in the Unity game engine, which has cross-platform capability but is not 100% compatible. The tasks for an intern to complete this portion consisted of gaining familiarity with Unity and the current ISS Xbox code, porting the Xbox code to the web as is, and modifying the code to work well as a web application. In addition, a procedurally generated cloud algorithm will be developed. Currently, the clouds used in AGEA animations and the Xbox experiences are a texture map. The desire is to create a procedurally generated cloud algorithm to provide dynamically generated clouds for both AGEA animations and the Xbox experiences. This task consists of gaining familiarity with AGEA and the plug-in interface, developing the algorithm, creating an AGEA plug-in to implement the algorithm inside AGEA, and creating a Unity script to implement the algorithm for the Xbox. This portion of the project could not be completed in the time frame of the internship; however, the IGOAL will continue to work on it in the future.

  5. Development and Validation of a Fast, Accurate and Cost-Effective Aeroservoelastic Method on Advanced Parallel Computing Systems

    NASA Technical Reports Server (NTRS)

    Goodwin, Sabine A.; Raj, P.

    1999-01-01

    Progress to date towards the development and validation of a fast, accurate and cost-effective aeroelastic method for advanced parallel computing platforms such as the IBM SP2 and the SGI Origin 2000 is presented in this paper. The ENSAERO code, developed at the NASA Ames Research Center, has been selected for this effort. The code allows for the computation of aeroelastic responses by simultaneously integrating the Euler or Navier-Stokes equations and the modal structural equations of motion. To assess the computational performance and accuracy of the ENSAERO code, this paper reports the results of the Navier-Stokes simulations of the transonic flow over a flexible aeroelastic wing-body configuration. In addition, a forced harmonic oscillation analysis in the frequency domain and an analysis in the time domain are performed on a wing undergoing a rigid pitch and plunge motion. Finally, to demonstrate the ENSAERO flutter-analysis capability, aeroelastic Euler and Navier-Stokes computations on an L-1011 wind tunnel model including pylon, nacelle and empennage are underway. All computational solutions are compared with experimental data to assess the level of accuracy of ENSAERO. As the computations described above are performed, a meticulous log of computational performance in terms of wall clock time, execution speed, memory and disk storage is kept. Code scalability is also demonstrated by studying the impact of varying the number of processors on computational performance on the IBM SP2 and the Origin 2000 systems.

  6. Development of 3D multimedia with advanced computer animation tools for outreach activities related to Meteor Science and Meteoritics

    NASA Astrophysics Data System (ADS)

    Madiedo, J. M.

    2012-09-01

    Documentaries related to Astronomy and Planetary Sciences are a common and very attractive way to promote public interest in these areas. These educational tools can benefit from new advanced computer animation software and 3D technologies, as these allow making such documentaries even more attractive. However, special care must be taken to guarantee that the information they contain is serious and objective. In this sense, additional value is gained when the footage is produced by the researchers themselves. With this aim, a new documentary produced and directed by Prof. Madiedo has been developed. The documentary, which has been entirely developed by means of advanced computer animation tools, is dedicated to several aspects of Meteor Science and Meteoritics. The main features of this outreach and education initiative are presented here.

  7. Prediction of helicopter rotor discrete frequency noise: A computer program incorporating realistic blade motions and advanced acoustic formulation

    NASA Technical Reports Server (NTRS)

    Brentner, K. S.

    1986-01-01

    A computer program has been developed at the Langley Research Center to predict the discrete frequency noise of conventional and advanced helicopter rotors. The program, called WOPWOP, uses the most advanced subsonic formulation of Farassat that is less sensitive to errors and is valid for nearly all helicopter rotor geometries and flight conditions. A brief derivation of the acoustic formulation is presented along with a discussion of the numerical implementation of the formulation. The computer program uses realistic helicopter blade motion and aerodynamic loadings, input by the user, for noise calculation in the time domain. A detailed definition of all the input variables, default values, and output data is included. A comparison with experimental data shows good agreement between prediction and experiment; however, accurate aerodynamic loading is needed.

  8. Proceedings of the topical meeting on advances in human factors research on man/computer interactions

    SciTech Connect

    Not Available

    1990-01-01

    This book discusses the following topics: expert systems and knowledge engineering-I; verification and validation of software; methods for modeling human/computer performance; man/computer interaction problems in producing procedures-1-2; progress and problems with automation-1-2; experience with electronic presentation of procedures-2; intelligent displays and monitors; modeling the user/computer interface; and computer-based human decision-making aids.

  9. How will you need me, how will you read me, when I'm 64 (or more!)?: volume computed tomographic scanning and information overload in the emergency department.

    PubMed

    Chason, David P; Anderson, Jon A; Stephens, Jason S; Suss, Richard A; Guild, Jeffrey B; Blackburn, Timothy J; Champine, Julie G; Lane, Thomas J

    2010-01-01

    Computed tomographic (CT) scanning technology now employs up to 320 detector rows of 0.5-mm width and allows rapid acquisition of isotropic volume datasets over the entire body. Data from a single CT acquisition can be reconstructed into image series that would formerly have required multiple acquisitions. Small isotropic voxels permit scan parameters to be general while reconstruction algorithms remain specific to anatomy. While this results in more efficient operation in the Emergency Department, it necessitates new ways of displaying, interpreting, and archiving the information. Critical decisions include how much of the patient to scan and how to time contrast injections when imaging multiple organs. These choices must be made in light of dose considerations to the patient and the general population of patients. The technical basis of high-density CT scanning is discussed, including detector configurations and reconstruction techniques. Volumetric scanning in the Emergency Department can improve patient care but requires a change of technical habits.
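
The "information overload" the title alludes to is easy to quantify. As a rough back-of-the-envelope illustration (the acquisition parameters below are assumptions chosen for the example, not figures from the article), an isotropic 0.5 mm volume acquisition over a long scan range produces thousands of thin slices and on the order of a gigabyte of raw image data per study:

```python
# Assumed acquisition parameters for illustration (not from the article).
slice_thickness_mm = 0.5      # isotropic 0.5 mm voxels
scan_range_mm = 1000          # chest-abdomen-pelvis coverage, assumed
matrix = 512                  # standard CT reconstruction matrix
bytes_per_voxel = 2           # 12-bit CT numbers stored as 16-bit integers

n_slices = int(scan_range_mm / slice_thickness_mm)
dataset_bytes = n_slices * matrix * matrix * bytes_per_voxel
print(n_slices, round(dataset_bytes / 2**30, 2))  # 2000 slices, ~0.98 GiB
```

Every additional reconstruction series (different kernels, planes, or slab thicknesses) multiplies this further, which is why display, interpretation, and archiving strategies have to change.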

  10. A computer program for estimating the power-density spectrum of advanced continuous simulation language generated time histories

    NASA Technical Reports Server (NTRS)

    Dunn, H. J.

    1981-01-01

    A computer program for performing frequency analysis of time history data is presented. The program uses circular convolution and the fast Fourier transform to calculate the power density spectrum (PDS) of time history data. The program interfaces with the Advanced Continuous Simulation Language (ACSL) so that a frequency analysis may be performed on ACSL-generated simulation variables. An example of the calculation of the PDS of a Van der Pol oscillator is presented.
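
The core computation can be sketched in a few lines. This is only a minimal modern illustration of an FFT-based PDS estimate, not the original program (which used circular convolution and read ACSL output); the scaling here is chosen so the spectrum integrates to the signal's mean-square value:

```python
import numpy as np

def power_density_spectrum(x, dt):
    """One-sided power density spectrum of a uniformly sampled time history."""
    n = len(x)
    X = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(n, d=dt)
    pds = (np.abs(X) ** 2) * dt / n   # scale so sum(pds) * df = mean(x**2)
    pds[1:-1] *= 2.0                  # fold negative frequencies into one side
    return freqs, pds

# A 5 Hz sinusoid sampled at 100 Hz should produce a spectral peak at 5 Hz.
dt = 0.01
t = np.arange(0.0, 10.0, dt)
x = np.sin(2.0 * np.pi * 5.0 * t)
freqs, pds = power_density_spectrum(x, dt)
print(freqs[np.argmax(pds)])  # 5.0
```

In practice a window function and segment averaging would be added to reduce leakage and variance; the sketch keeps only the FFT-and-scale step.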

  11. Harnessing the Department of Energy’s High-Performance Computing Expertise to Strengthen the U.S. Chemical Enterprise

    SciTech Connect

    Dixon, David A.; Dupuis, Michel; Garrett, Bruce C.; Neaton, Jeffrey B.; Plata, Charity; Tarr, Matthew A.; Tomb, Jean-Francois; Golab, Joseph T.

    2012-01-17

    High-performance computing (HPC) is one area where the DOE has developed extensive expertise and capability. However, this expertise currently is not properly shared with or used by the private sector to speed product development, enable industry to move rapidly into new areas, and improve product quality. Such use would lead to substantial competitive advantages in global markets and yield important economic returns for the United States. To stimulate the dissemination of DOE's HPC expertise, the Council for Chemical Research (CCR) and the DOE jointly held a workshop on this topic. Four important energy topic areas were chosen as the focus of the meeting: Biomass/Bioenergy, Catalytic Materials, Energy Storage, and Photovoltaics. Academic, industrial, and government experts in these topic areas participated in the workshop to identify industry needs, evaluate the current state of expertise, offer proposed actions and strategies, and forecast the expected benefits of implementing those strategies.

  12. Encouraging Advanced Second Language Speakers to Recognise Their Language Difficulties: A Personalised Computer-Based Approach

    ERIC Educational Resources Information Center

    Xu, Jing; Bull, Susan

    2010-01-01

    Despite holding advanced language qualifications, many overseas students studying at English-speaking universities still have difficulties in formulating grammatically correct sentences. This article introduces an "independent open learner model" for advanced second language speakers of English, which confronts students with the state of their…

  13. Overview of the NASA/RECON educational, research, and development activities of the Computer Science Departments of the University of Southwestern Louisiana and Southern University

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor)

    1984-01-01

    This document presents a brief overview of the scope of activities undertaken by the Computer Science Departments of the University of Southwestern Louisiana (USL) and Southern University (SU) pursuant to a contract with NASA. Presented are only basic identification data concerning the contract activities, since subsequent entries within the Working Paper Series will be oriented specifically toward a detailed development and presentation of plans, methodologies, and results of each contract activity. Also included is a table of contents of the entire USL/DBMS NASA/RECON Working Paper Series.

  14. Application of advanced computational procedures for modeling solar-wind interactions with Venus: Theory and computer code

    NASA Technical Reports Server (NTRS)

    Stahara, S. S.; Klenke, D.; Trudinger, B. C.; Spreiter, J. R.

    1980-01-01

    Computational procedures are developed and applied to the prediction of solar wind interaction with nonmagnetic terrestrial planet atmospheres, with particular emphasis to Venus. The theoretical method is based on a single fluid, steady, dissipationless, magnetohydrodynamic continuum model, and is appropriate for the calculation of axisymmetric, supersonic, super-Alfvenic solar wind flow past terrestrial planets. The procedures, which consist of finite difference codes to determine the gasdynamic properties and a variety of special purpose codes to determine the frozen magnetic field, streamlines, contours, plots, etc. of the flow, are organized into one computational program. Theoretical results based upon these procedures are reported for a wide variety of solar wind conditions and ionopause obstacle shapes. Plasma and magnetic field comparisons in the ionosheath are also provided with actual spacecraft data obtained by the Pioneer Venus Orbiter.

  15. Advances in Navy pharmacy information technology: accessing Micromedex via the Composite Healthcare Computer System and local area networks.

    PubMed

    Koerner, S D; Becker, F

    1999-07-01

    The pharmacy profession has long used technology to more effectively bring health care to the patient. Navy pharmacy has embraced technology advances in its daily operations, from computers to dispensing robots. Evolving from the traditional role of compounding and dispensing specialists, pharmacists are establishing themselves as vital team members in direct patient care: on the ward, in ambulatory clinics, in specialty clinics, and in other specialty patient care programs (e.g., smoking cessation). An important part of the evolution is the timely access to the most up-to-date information available. Micromedex, Inc. (Denver, Colorado), has developed a number of computer CD-ROM-based full-text pharmacy, toxicology, emergency medicine, and patient education products. Micromedex is a recognized leader with regard to total pharmaceutical information availability. This article discusses the implementation of Micromedex products within the established Composite Healthcare Computer System and the subsequent use by and effect on the international Navy pharmacy community.

  16. Recent Advances in Photonic Devices for Optical Computing and the Role of Nonlinear Optics-Part II

    NASA Technical Reports Server (NTRS)

    Abdeldayem, Hossin; Frazier, Donald O.; Witherow, William K.; Banks, Curtis E.; Paley, Mark S.

    2007-01-01

    The twentieth century was the era of semiconductor materials and electronic technology, while this millennium is expected to be the age of photonic materials and all-optical technology. Optical technology has led to countless optical devices that have become indispensable in our daily lives: storage area networks, parallel processing, optical switches, all-optical data networks, holographic storage devices, and biometric devices at airports. This chapter intends to bring some awareness of the state of the art of optical technologies that have potential for optical computing, and to demonstrate the role of nonlinear optics in many of these components. Our intent in this chapter is to present an overview of the current status of optical computing and a brief evaluation of the recent advances and performance of the following key components necessary to build an optical computing system: all-optical logic gates, adders, optical processors, optical storage, holographic storage, optical interconnects, spatial light modulators and optical materials.

  17. Parallel-META 2.0: enhanced metagenomic data analysis with functional annotation, high performance computing and advanced visualization.

    PubMed

    Su, Xiaoquan; Pan, Weihua; Song, Baoxing; Xu, Jian; Ning, Kang

    2014-01-01

    The metagenomic method directly sequences and analyses genome information from microbial communities. The main computational tasks for metagenomic analyses include taxonomical and functional structure analysis for all genomes in a microbial community (also referred to as a metagenomic sample). With the advancement of Next Generation Sequencing (NGS) techniques, the number of metagenomic samples and the data size for each sample are increasing rapidly. Current metagenomic analysis is both data- and computation-intensive, especially when there are many species in a metagenomic sample and each has a large number of sequences. As such, metagenomic analyses require extensive computational power. The increasing analytical requirements further augment the challenges for computational analysis. In this work, we have proposed Parallel-META 2.0, a metagenomic analysis software package, to cope with such needs for efficient and fast analyses of taxonomical and functional structures for microbial communities. Parallel-META 2.0 is an extended and improved version of Parallel-META 1.0 that enhances taxonomical analysis using multiple databases, improves computational efficiency through optimized parallel computing, and supports interactive visualization of results in multiple views. Furthermore, it enables functional analysis of metagenomic samples, including short-read assembly, gene prediction and functional annotation. It can therefore provide accurate taxonomical and functional analyses of metagenomic samples in a high-throughput manner and on a large scale.

  18. Surface Computer System Architecture for the Advanced Unmanned Search System (AUSS)

    DTIC Science & Technology

    1992-12-01

    called the data docker, is planned, which is basically a single-board, 80286-based machine from Ampro Computers, Inc., intended to receive the vehicle on...the data docker computer, which is networked into the control van system. The drive is packaged in a carrier that makes the drive act like a plug-in...and data docker computers. The FS and data docker machines require only infrequent use of a keyboard and monitor, and therefore, the switchbox is an

  19. Computer order entry systems in the emergency department significantly reduce the time to medication delivery for high acuity patients

    PubMed Central

    2013-01-01

    and radiology results may reflect system issues outside of the emergency department and as a result of potential confounding may not be a reflection of CPOE impact. PMID:23830095

  20. U.S. Department of Energy FreedomCAR & Vehicle Technologies Program Advanced Vehicle Testing Activity, Hydrogen/CNG Blended Fuels Performance Testing in a Ford F-150

    SciTech Connect

    James E. Francfort

    2003-11-01

    Federal regulation requires energy companies and government entities to utilize alternative fuels in their vehicle fleets. To meet this need, several automobile manufacturers are producing compressed natural gas (CNG)-fueled vehicles. In addition, several converters are modifying gasoline-fueled vehicles to operate on both gasoline and CNG (Bifuel). Because of the availability of CNG vehicles, many energy company and government fleets have adopted CNG as their principal alternative fuel for transportation. Meanwhile, recent research has shown that blending hydrogen with CNG (HCNG) can reduce emissions from CNG vehicles. However, blending hydrogen with CNG (and performing no other vehicle modifications) reduces engine power output, due to the lower volumetric energy density of hydrogen in relation to CNG. Arizona Public Service (APS) and the U.S. Department of Energy’s Advanced Vehicle Testing Activity (DOE AVTA) identified the need to determine the magnitude of these effects and their impact on the viability of using HCNG in existing CNG vehicles. To quantify the effects of using various blended fuels, a work plan was designed to test the acceleration, range, and exhaust emissions of a Ford F-150 pickup truck operating on 100% CNG and blends of 15 and 30% HCNG. This report presents the results of this testing conducted during May and June 2003 by Electric Transportation Applications (Task 4.10, DOE AVTA Cooperative Agreement DEFC36-00ID-13859).
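
The power-loss effect the report set out to quantify follows directly from volumetric energy density. The sketch below is a back-of-the-envelope estimate, not a result from the report: the lower heating values are approximate textbook figures, and natural gas is treated as pure methane for simplicity:

```python
# Approximate lower heating values per normal cubic metre (assumed figures;
# natural gas is idealized as pure methane for this illustration).
LHV_CH4 = 35.8   # MJ/Nm^3
LHV_H2 = 10.8    # MJ/Nm^3

def blend_energy_density(h2_volume_fraction):
    """Volumetric energy density of a hydrogen/CNG blend (fraction by volume)."""
    return (1.0 - h2_volume_fraction) * LHV_CH4 + h2_volume_fraction * LHV_H2

for frac in (0.15, 0.30):
    drop = 1.0 - blend_energy_density(frac) / LHV_CH4
    print(f"{frac:.0%} HCNG holds {drop:.0%} less energy per unit volume")
```

Under these assumptions a 30% blend carries roughly a fifth less energy per cylinder-fill, which is the mechanism behind the reduced engine power output noted above.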

  1. An evaluation of the use of an advanced oxidation process to remove chlorinated hydrocarbons from groundwater at the US Department of Energy Kansas City Plant

    SciTech Connect

    Garland, S.B. II; Peyton, G.R.

    1990-10-01

    The Allied-Signal Aerospace Company currently operates a production facility in Kansas City, Missouri, under contract with the US Department of Energy (DOE). Over the years, the operation of the DOE Kansas City Plant has resulted in the contamination of groundwater with chlorinated hydrocarbons, including trichloroethene (TCE). One of the plumes of contaminated groundwater, the underground tank farm (UTF) plume, was selected for remediation with an advanced oxidation process (AOP) consisting of simultaneous treatment by ozone (O3), ultraviolet (UV) radiation, and hydrogen peroxide (H2O2). Since the use of AOPs for the removal of organics from groundwater is relatively new, information on design criteria, costs, performance, and operating experience is not well documented in the literature. Therefore, the Oak Ridge National Laboratory (ORNL) was requested to evaluate the treatment process. This report documents the work performed through FY 1989. The results of the initial year of the evaluations, FY 1988, have been published previously, and the evaluation will continue at least through FY 1990. This report first briefly describes the treatment plant and the mechanisms of the treatment process. Next, the methodology and the results of the evaluation are discussed. Finally, conclusions and recommendations are presented. 8 refs., 14 figs., 16 tabs.

  2. The North Carolina Field Test: Field performance of the preliminary version of an advanced weatherization audit for the Department of Energy's Weatherization Assistance Program

    SciTech Connect

    Sharp, T.R.

    1994-06-01

    The field performance of weatherizations based on a newly developed advanced technique for selecting residential energy conservation measures was tested alongside current Retro-Tech-based weatherizations in North Carolina. The new technique is computer-based and determines measures based on the needs of an individual house. In addition, it recommends only those measures that it determines will have a benefit-to-cost ratio greater than 1 for the house being evaluated. The new technique also considers the interaction of measures in computing the benefit-to-cost ratio of each measure. The two weatherization approaches were compared based on implementation ease, measures installed, labor and cost requirements, and both heating and cooling energy savings achieved. One hundred and twenty houses with the following characteristics participated: the occupants were low-income, eligible for North Carolina's current weatherization program, and responsible for their own fuel and electric bills. Houses were detached single-family dwellings, not mobile homes; were heated by kerosene, fuel oil, natural gas, or propane; and had one or two operating window air conditioners. Houses were divided equally into one control group and two weatherization groups. Weekly space heating and cooling energy use, and hourly indoor and outdoor temperatures, were monitored between November 1989 and September 1990 (pre-period) and between December 1990 and August 1991 (post-period). House consumption models were used to normalize for annual weather differences and a 68°F indoor temperature. Control group savings were used to adjust the savings determined for the weatherization groups. The two weatherization approaches involved installing attic and floor insulation in near-equivalent quantities, and installing storm windows and wall insulation in drastically different quantities. Substantial differences were also found in average air leakage reductions for the two weatherization groups.
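
The selection rule described above, recommending only measures whose benefit-to-cost ratio exceeds 1 while accounting for measure interactions, can be sketched as a greedy loop that re-evaluates each remaining measure against the partially weatherized house. The savings model, measure names, and dollar figures below are illustrative assumptions, not the audit's actual algorithm or data:

```python
def lifetime_savings(installed):
    """Toy interacting-savings model (illustrative): each measure saves a
    fraction of the *remaining* heating load, so later measures save less."""
    total, load = 0.0, 1000.0            # $1000/year baseline fuel cost, assumed
    for frac, _cost in installed:
        saved = load * frac
        total += saved
        load -= saved
    return total * 10.0                  # assume a 10-year measure lifetime

def select_measures(measures):
    """Greedy audit loop: pick the best benefit-to-cost measure, re-evaluate
    the rest against the improved house, and stop once no ratio exceeds 1."""
    selected, remaining = [], list(measures)
    while remaining:
        base = lifetime_savings(selected)
        ratio_of = lambda m: (lifetime_savings(selected + [m]) - base) / m[1]
        best = max(remaining, key=ratio_of)
        if ratio_of(best) <= 1.0:
            break
        selected.append(best)
        remaining.remove(best)
    return selected

# measures: (fraction of remaining load saved, installed cost in $)
attic, walls, storms = (0.30, 800), (0.25, 2500), (0.15, 1400)
chosen = select_measures([attic, walls, storms])
print(chosen)  # only the attic insulation clears the ratio-1 threshold
```

Note how interaction matters: storm windows clear the threshold against the unimproved house, but once attic insulation has cut the load, their recomputed ratio falls below 1 and they are dropped.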

  3. 78 FR 59927 - Next Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-30

    ... Systems Biology [External Review Draft] AGENCY: Environmental Protection Agency (EPA). ACTION: Notice of... Molecular, Computational, and Systems Biology" (EPA/600/R-13/214A). EPA is also announcing that Eastern..., Computational, and Systems Biology" is available primarily via the Internet on the NCEA home page under...

  4. BOOK REVIEW: Advanced Topics in Computational Partial Differential Equations: Numerical Methods and Diffpack Programming

    NASA Astrophysics Data System (ADS)

    Katsaounis, T. D.

    2005-02-01

    The scope of this book is to present well-known simple and advanced numerical methods for solving partial differential equations (PDEs) and how to implement these methods using the programming environment of the software package Diffpack. A basic background in PDEs and numerical methods is required of the potential reader. Further, a basic knowledge of the finite element method and its implementation in one and two space dimensions is required. The authors claim that no prior knowledge of the package Diffpack is required, which is true, but the reader should be at least familiar with an object-oriented programming language like C++ in order to better comprehend the programming environment of Diffpack. Certainly, prior knowledge or usage of Diffpack would be a great advantage to the reader. The book consists of 15 chapters, each one written by one or more authors. Each chapter is basically divided into two parts: the first part is about mathematical models described by PDEs and numerical methods to solve these models, and the second part describes how to implement the numerical methods using the programming environment of Diffpack. Each chapter closes with a list of references on its subject. The first nine chapters cover well-known numerical methods for solving the basic types of PDEs. Further, programming techniques for the serial as well as the parallel implementation of numerical methods are also included in these chapters. The last five chapters are dedicated to applications, modelled by PDEs, in a variety of fields. The first chapter is an introduction to parallel processing. It covers fundamentals of parallel processing in a simple and concrete way, and no prior knowledge of the subject is required. Examples of parallel implementation of basic linear algebra operations are presented using the Message Passing Interface (MPI) programming environment. Here, some knowledge of MPI routines is required of the reader. Examples solving simple PDEs in parallel using

  5. Early, Computer-Aided Design/Computer-Aided Modeling Planned, Le Fort I Advancement With Internal Distractors to Treat Severe Maxillary Hypoplasia in Cleft Lip and Palate.

    PubMed

    Chang, Catherine S; Swanson, Jordan; Yu, Jason; Taylor, Jesse A

    2017-04-11

    Traditionally, maxillary hypoplasia in the setting of cleft lip and palate is treated via orthognathic surgery at skeletal maturity, which condemns these patients to abnormal facial proportions during adolescence. The authors sought to determine the safety profile of computer-aided design/computer-aided modeling (CAD/CAM) planned, Le Fort I distraction osteogenesis with internal distractors in select patients presenting at a young age with severe maxillary retrusion. The authors retrospectively reviewed their "early" Le Fort I distraction osteogenesis experience, comprising procedures performed for severe maxillary retrusion (≥12 mm underjet) after canine eruption but prior to skeletal maturity, at a single institution. Patient demographics, cleft characteristics, CAD/CAM operative plans, surgical complications, postoperative imaging, and outcomes were analyzed. Four patients were reviewed, with a median age of 12.8 years at surgery (range 8.6-16.1 years). Overall mean advancement was 17.95 ± 2.9 mm (range 13.7-19.9 mm), with mean SNA improved 18.4° to 87.4 ± 5.7°. Similarly, ANB improved 17.7° to a postoperative mean of 2.4 ± 3.1°. Mean follow-up was 100.7 weeks, with 3 of 4 patients in Class I occlusion at moderate-term follow-up; 1 of 4 will need an additional maxillary advancement due to pseudo-relapse. In conclusion, Le Fort I distraction osteogenesis with internal distractors is a safe procedure to treat severe maxillary hypoplasia after canine eruption but before skeletal maturity. Short-term follow-up demonstrates safety of the procedure and relative stability of the advancement. Pseudo-relapse is a risk of the procedure that must be discussed at length with patients and families.

  6. Computer architecture for efficient algorithmic executions in real-time systems: New technology for avionics systems and advanced space vehicles

    NASA Technical Reports Server (NTRS)

    Carroll, Chester C.; Youngblood, John N.; Saha, Aindam

    1987-01-01

    Improvements and advances in the development of computer architecture now provide innovative technology for the recasting of traditional sequential solutions into high-performance, low-cost parallel systems to increase system performance. Research conducted in the development of a specialized computer architecture for the real-time algorithmic execution of an avionics guidance and control problem is described. A comprehensive treatment of both the hardware and software structures of a customized computer which performs real-time computation of guidance commands with updated estimates of target motion and time-to-go is presented. An optimal, real-time allocation algorithm was developed which maps the algorithmic tasks onto the processing elements. This allocation is based on critical path analysis. The final stage is the design and development of the hardware structures suitable for the efficient execution of the allocated task graph. The processing element is designed for rapid execution of the allocated tasks. Fault tolerance is a key feature of the overall architecture. Parallel numerical integration techniques, task definitions, and allocation algorithms are discussed. The parallel implementation is analytically verified and the experimental results are presented. The design of the data-driven computer architecture, customized for the execution of the particular algorithm, is discussed.

  7. A Computational Future for Preventing HIV in Minority Communities: How Advanced Technology Can Improve Implementation of Effective Programs

    PubMed Central

    Brown, C Hendricks; Mohr, David C.; Gallo, Carlos G.; Mader, Christopher; Palinkas, Lawrence; Wingood, Gina; Prado, Guillermo; Kellam, Sheppard G.; Pantin, Hilda; Poduska, Jeanne; Gibbons, Robert; McManus, John; Ogihara, Mitsunori; Valente, Thomas; Wulczyn, Fred; Czaja, Sara; Sutcliffe, Geoff; Villamar, Juan; Jacobs, Christopher

    2013-01-01

    African Americans and Hispanics in the U.S. have much higher rates of HIV than non-minorities. There is now strong evidence that a range of behavioral interventions are efficacious in reducing sexual risk behavior in these populations. While a handful of these programs are just beginning to be disseminated widely, we still have not implemented effective programs to a level that would reduce the population incidence of HIV for minorities. We propose that innovative approaches involving computational technologies be explored for their use in both developing new interventions as well as in supporting wide-scale implementation of effective behavioral interventions. Mobile technologies have a place in both of these activities. First, mobile technologies can be used in sensing contexts and responding to the unique preferences and needs of individuals at times where intervention to reduce risk would be most impactful. Secondly, mobile technologies can be used to improve the delivery of interventions by facilitators and their agencies. Systems science methods, including social network analysis, agent based models, computational linguistics, intelligent data analysis, and systems and software engineering all have strategic roles that can bring about advances in HIV prevention in minority communities. Using an existing mobile technology for depression and three effective HIV prevention programs, we illustrate how eight areas in the intervention/implementation process can use innovative computational approaches to advance intervention adoption, fidelity, and sustainability. PMID:23673892

  8. A computational future for preventing HIV in minority communities: how advanced technology can improve implementation of effective programs.

    PubMed

    Brown, C Hendricks; Mohr, David C; Gallo, Carlos G; Mader, Christopher; Palinkas, Lawrence; Wingood, Gina; Prado, Guillermo; Kellam, Sheppard G; Pantin, Hilda; Poduska, Jeanne; Gibbons, Robert; McManus, John; Ogihara, Mitsunori; Valente, Thomas; Wulczyn, Fred; Czaja, Sara; Sutcliffe, Geoff; Villamar, Juan; Jacobs, Christopher

    2013-06-01

    African Americans and Hispanics in the United States have much higher rates of HIV than non-minorities. There is now strong evidence that a range of behavioral interventions are efficacious in reducing sexual risk behavior in these populations. Although a handful of these programs are just beginning to be disseminated widely, we still have not implemented effective programs to a level that would reduce the population incidence of HIV for minorities. We proposed that innovative approaches involving computational technologies be explored for their use in both developing new interventions and in supporting wide-scale implementation of effective behavioral interventions. Mobile technologies have a place in both of these activities. First, mobile technologies can be used in sensing contexts and responding to the unique preferences and needs of individuals at times where intervention to reduce risk would be most impactful. Second, mobile technologies can be used to improve the delivery of interventions by facilitators and their agencies. Systems science methods including social network analysis, agent-based models, computational linguistics, intelligent data analysis, and systems and software engineering all have strategic roles that can bring about advances in HIV prevention in minority communities. Using an existing mobile technology for depression and 3 effective HIV prevention programs, we illustrated how 8 areas in the intervention/implementation process can use innovative computational approaches to advance intervention adoption, fidelity, and sustainability.

  9. Verification, Validation and Credibility Assessment of a Computational Model of the Advanced Resistive Exercise Device (ARED)

    NASA Technical Reports Server (NTRS)

    Werner, C. R.; Humphreys, B. T.; Mulugeta, L.

    2014-01-01

    The Advanced Resistive Exercise Device (ARED) is the resistive exercise device used by astronauts on the International Space Station (ISS) to mitigate bone loss and muscle atrophy due to extended exposure to microgravity (micro g). The Digital Astronaut Project (DAP) has developed a multi-body dynamics biomechanics model of the ARED for use in spaceflight exercise physiology research and operations. In an effort to advance the maturity and credibility of the ARED model, the DAP performed a verification, validation and credibility (VV&C) assessment of the model and its analyses in accordance with NASA-STD-7009, 'Standards for Models and Simulations'.

  10. Advanced computational methods for nodal diffusion, Monte Carlo, and S{sub n} problems. Final Report

    SciTech Connect

    1994-12-31

    The work addresses basic computational difficulties that arise in the numerical simulation of neutral particle radiation transport: discretized radiation transport problems, iterative methods, selection of parameters, and extension of current algorithms.
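    As a toy illustration of the iterative methods this report addresses (not its production codes), the sketch below applies Jacobi iteration to the tridiagonal system arising from a 1D diffusion-reaction problem; the grid size and iteration count are arbitrary.

```python
# Toy sketch: Jacobi iteration on the tridiagonal system from discretizing
# -u'' + u = 1 on [0, 1] with u(0) = u(1) = 0.  Illustration only.
def jacobi_solve(n=20, iters=2000):
    h = 1.0 / n
    diag = 2.0 / h**2 + 1.0        # main diagonal of the discrete operator
    off = -1.0 / h**2              # off-diagonal entries
    u = [0.0] * (n + 1)            # boundary entries stay fixed at zero
    for _ in range(iters):
        v = u[:]
        for i in range(1, n):
            v[i] = (1.0 - off * (u[i - 1] + u[i + 1])) / diag
        u = v
    return u

u = jacobi_solve()                 # converges: the system is diagonally dominant
```

    Selecting iteration parameters so that such sweeps converge quickly on realistic transport discretizations is exactly the kind of difficulty the report studies.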

  11. Study of flutter related computational procedures for minimum weight structural sizing of advanced aircraft, supplemental data

    NASA Technical Reports Server (NTRS)

    Oconnell, R. F.; Hassig, H. J.; Radovcich, N. A.

    1975-01-01

    Computational aspects of (1) flutter optimization (minimization of structural mass subject to specified flutter requirements), (2) methods for solving the flutter equation, and (3) efficient methods for computing generalized aerodynamic force coefficients in the repetitive analysis environment of computer-aided structural design are discussed. Specific areas included: a two-dimensional Regula Falsi approach to solving the generalized flutter equation; method of incremented flutter analysis and its applications; the use of velocity potential influence coefficients in a five-matrix product formulation of the generalized aerodynamic force coefficients; options for computational operations required to generate generalized aerodynamic force coefficients; theoretical considerations related to optimization with one or more flutter constraints; and expressions for derivatives of flutter-related quantities with respect to design variables.
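    The scalar false-position idea underlying the report's two-dimensional Regula Falsi flutter solver can be sketched as follows; the "flutter condition" here is a made-up linear damping curve, not an aeroelastic model, and the two-dimensional generalization in the report is more involved.

```python
# Scalar false-position (Regula Falsi) root finder on a toy damping curve g(V).
def regula_falsi(f, a, b, tol=1e-10, max_iter=100):
    """Find x with f(x) = 0 given a bracketing interval [a, b]."""
    fa, fb = f(a), f(b)
    assert fa * fb < 0, "root must be bracketed"
    c = a
    for _ in range(max_iter):
        c = (a * fb - b * fa) / (fb - fa)  # x-intercept of the secant line
        fc = f(c)
        if abs(fc) < tol:
            break
        if fa * fc < 0:                    # root lies in [a, c]
            b, fb = c, fc
        else:                              # root lies in [c, b]
            a, fa = c, fc
    return c

# Damping crosses zero at V = 150 in these made-up units.
flutter_speed = regula_falsi(lambda v: 0.02 * v - 3.0, 100.0, 200.0)
```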

  12. Geophysical outlook. Part IV. New vector super computers promote seismic advancements

    SciTech Connect

    Nelson, H.R. Jr.

    1982-01-01

    Some major oil companies are beginning to test the use of vector computers to process the huge volumes of seismic data acquired by modern prospecting techniques. To take advantage of the parallel-processing techniques offered by the vector mode of analysis, users must completely restructure the seismic-data processing packages. The most important application of vector computers, to date, has been in numerical reservoir modeling.

  13. Advanced Methods for the Computer-Aided Diagnosis of Lesions in Digital Mammograms

    DTIC Science & Technology

    2000-07-01

    classification of mammographic mass lesions. Radiology 213: 200, 1999. Nishikawa R, Giger ML, Yarusso L, Kupinski M, Baehr A, Venta L: Computer-aided ... detection of mass lesions in digital mammography using radial gradient index filtering. Radiology 213: 229, 1999. Maloney M, Huo Z, Giger ML, Venta L ... Nishikawa R, Huo Z, Jiang Y, Venta L, Doi K: Computer-aided diagnosis (CAD) in breast imaging. Radiology 213: 507, 1999. Final Report DAMD 17-96-1-6058

  14. Development of Advanced Computational Aeroelasticity Tools at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Bartels, R. E.

    2008-01-01

    NASA Langley Research Center has continued to develop its long standing computational tools to address new challenges in aircraft and launch vehicle design. This paper discusses the application and development of those computational aeroelastic tools. Four topic areas will be discussed: 1) Modeling structural and flow field nonlinearities; 2) Integrated and modular approaches to nonlinear multidisciplinary analysis; 3) Simulating flight dynamics of flexible vehicles; and 4) Applications that support both aeronautics and space exploration.

  15. Next Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational, and Systems Biology (Final Report)

    EPA Science Inventory

    Cover of the Next Generation of Risk Assessment Final report This final report, "Next Generation Risk Assessment: Recent Advances in Molec...

  16. Uncertainty Analyses of Advanced Fuel Cycles

    SciTech Connect

    Laurence F. Miller; J. Preston; G. Sweder; T. Anderson; S. Janson; M. Humberstone; J. MConn; J. Clark

    2008-12-12

    The Department of Energy is developing technology, experimental protocols, computational methods, systems analysis software, and many other capabilities in order to advance the nuclear power infrastructure through the Advanced Fuel Cycle Initiative (AFCI). Our project is intended to facilitate well-informed decision making for the selection of fuel cycle options and facilities for development.

  17. Advanced Computational Approaches for Characterizing Stochastic Cellular Responses to Low Dose, Low Dose Rate Exposures

    SciTech Connect

    Scott, Bobby, R., Ph.D.

    2003-06-27

    OAK - B135 This project final report summarizes modeling research conducted in the U.S. Department of Energy (DOE), Low Dose Radiation Research Program at the Lovelace Respiratory Research Institute from October 1998 through June 2003. The modeling research described involves critically evaluating the validity of the linear nonthreshold (LNT) risk model as it relates to stochastic effects induced in cells by low doses of ionizing radiation and genotoxic chemicals. The LNT model plays a central role in low-dose risk assessment for humans. With the LNT model, any radiation (or genotoxic chemical) exposure is assumed to increase one's risk of cancer. Based on the LNT model, others have predicted tens of thousands of cancer deaths related to environmental exposure to radioactive material from nuclear accidents (e.g., Chernobyl) and fallout from nuclear weapons testing. Our research has focused on developing biologically based models that explain the shape of dose-response curves for low-dose radiation and genotoxic chemical-induced stochastic effects in cells. Understanding the shape of the dose-response curve for radiation and genotoxic chemical-induced stochastic effects in cells helps to better understand the shape of the dose-response curve for cancer induction in humans. We have used a modeling approach that facilitated model revisions over time, allowing for timely incorporation of new knowledge gained related to the biological basis for low-dose-induced stochastic effects in cells. Both deleterious (e.g., genomic instability, mutations, and neoplastic transformation) and protective (e.g., DNA repair and apoptosis) effects have been included in our modeling. Our most advanced model, NEOTRANS2, involves differing levels of genomic instability. Persistent genomic instability is presumed to be associated with nonspecific, nonlethal mutations and to increase both the risk for neoplastic transformation and for cancer occurrence. Our research results, based on

  18. Web based provider education for competency of scope of practice (Best Practice): Medicine Department Safe training is a computer based review program (de' medri).

    PubMed

    Tabriziani, Hossein; Hatcher, Myron; Heetebry, Irene

    2005-12-01

    Medicine Department Safe training is a computer-based employee review program named de'medici. This annual review packet serves as a generic training tool. All health-care providers with direct patient care are required by state law to complete a group of 11 modules and pass a mandatory training test to assess proficiency in these areas. The modules cover emergency preparedness, life and fire safety, electrical safety, working safely with hazardous materials, back safety, violence in the workplace, latex allergy prevention, preventing TB in the workplace, preventing AIDS and hepatitis B and C in the workplace, radiation safety, and age-related care for health-care workers.

  19. Investigation of advanced counterrotation blade configuration concepts for high speed turboprop systems. Task 4: Advanced fan section aerodynamic analysis computer program user's manual

    NASA Technical Reports Server (NTRS)

    Crook, Andrew J.; Delaney, Robert A.

    1992-01-01

    The computer program user's manual for the ADPACAPES (Advanced Ducted Propfan Analysis Code-Average Passage Engine Simulation) program is included. The objective of the computer program is development of a three-dimensional Euler/Navier-Stokes flow analysis for fan section/engine geometries containing multiple blade rows and multiple spanwise flow splitters. An existing procedure developed by Dr. J. J. Adamczyk and associates at the NASA Lewis Research Center was modified to accept multiple spanwise splitter geometries and simulate engine core conditions. The numerical solution is based upon a finite volume technique with a four stage Runge-Kutta time marching procedure. Multiple blade row solutions are based upon the average-passage system of equations. The numerical solutions are performed on an H-type grid system, with meshes meeting the requirement of maintaining a common axisymmetric mesh for each blade row grid. The analysis was run on several geometry configurations ranging from one to five blade rows and from one to four radial flow splitters. The efficiency of the solution procedure was shown to be the same as the original analysis.
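    The "four stage Runge-Kutta time marching procedure" named above can be illustrated with the textbook scalar version; the actual ADPACAPES scheme is a finite-volume multistage variant tuned for that solver, so this is only the generic step applied to a model problem.

```python
# Textbook four-stage Runge-Kutta (RK4) step for du/dt = f(t, u).
def rk4_step(f, t, u, dt):
    k1 = f(t, u)
    k2 = f(t + dt / 2, u + dt / 2 * k1)
    k3 = f(t + dt / 2, u + dt / 2 * k2)
    k4 = f(t + dt, u + dt * k3)
    return u + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

# March the model problem du/dt = -u from u(0) = 1 to t = 1.
u, t, dt = 1.0, 0.0, 0.01
for _ in range(100):
    u = rk4_step(lambda s, y: -y, t, u, dt)
    t += dt
# u should now be very close to e^{-1}
```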

  20. Advanced computational simulation for design and manufacturing of lightweight material components for automotive applications

    SciTech Connect

    Simunovic, S.; Aramayo, G.A.; Zacharia, T.; Toridis, T.G.; Bandak, F.; Ragland, C.L.

    1997-04-01

    Computational vehicle models for the analysis of lightweight material performance in automobiles have been developed through collaboration between Oak Ridge National Laboratory, the National Highway Transportation Safety Administration, and George Washington University. The vehicle models have been verified against experimental data obtained from vehicle collisions. The crashed vehicles were analyzed, and the main impact energy dissipation mechanisms were identified and characterized. Important structural parts were extracted and digitized and directly compared with simulation results. High-performance computing played a key role in the model development because it allowed for rapid computational simulations and model modifications. The deformation of the computational model shows a very good agreement with the experiments. This report documents the modifications made to the computational model and relates them to the observations and findings on the test vehicle. Procedural guidelines are also provided that the authors believe need to be followed to create realistic models of passenger vehicles that could be used to evaluate the performance of lightweight materials in automotive structural components.

  1. Using Computer-Assisted Argumentation Mapping to develop effective argumentation skills in high school advanced placement physics

    NASA Astrophysics Data System (ADS)

    Heglund, Brian

    Educators recognize the importance of reasoning ability for development of critical thinking skills, conceptual change, metacognition, and participation in 21st century society. There is a recognized need for students to improve their skills of argumentation; however, argumentation is not explicitly taught outside logic and philosophy, subjects that are not part of the K-12 curriculum. One potential way of supporting the development of argumentation skills in the K-12 context is through incorporating Computer-Assisted Argument Mapping to evaluate arguments. This quasi-experimental study tested the effects of such argument mapping software and was informed by the following two research questions: 1. To what extent does the collaborative use of Computer-Assisted Argumentation Mapping to evaluate competing theories influence the critical thinking skill of argument evaluation, metacognitive awareness, and conceptual knowledge acquisition in high school Advanced Placement physics, compared to the more traditional method of text tables that does not employ Computer-Assisted Argumentation Mapping? 2. What are the student perceptions of the pros and cons of argument evaluation in the high school Advanced Placement physics environment? This study examined changes in critical thinking skills, including argumentation evaluation skills, as well as metacognitive awareness and conceptual knowledge, in two groups: a treatment group using Computer-Assisted Argumentation Mapping to evaluate physics arguments, and a comparison group using text tables to evaluate physics arguments. Quantitative and qualitative methods for collecting and analyzing data were used to answer the research questions. Quantitative data indicated no significant difference between the experimental groups, and qualitative data suggested students perceived pros and cons of argument evaluation in the high school Advanced Placement physics environment, such as self-reported sense of improvement in argument

  2. Decreased length of stay after addition of healthcare provider in emergency department triage: a comparison between computer-simulated and real-world interventions

    PubMed Central

    Al-Roubaie, Abdul Rahim; Goldlust, Eric Jonathan

    2013-01-01

    Objective (1) To determine the effects of adding a provider in triage on average length of stay (LOS) and proportion of patients with >6 h LOS. (2) To assess the accuracy of computer simulation in predicting the magnitude of such effects on these metrics. Methods A group-level quasi-experimental trial comparing the St. Louis Veterans Affairs Medical Center emergency department (1) before intervention, (2) after institution of provider in triage, and discrete event simulation (DES) models of similar (3) ‘before’ and (4) ‘after’ conditions. The outcome measures were daily mean LOS and percentage of patients with LOS >6 h. Results The DES-modelled intervention predicted a decrease in the proportion of patients with LOS >6 h from 19.0% to 13.1%, and a drop in the daily mean LOS from 249 to 200 min (p<0.0001). Following (actual) intervention, the number of patients with LOS >6 h decreased from 19.9% to 14.3% (p<0.0001), with the daily mean LOS decreasing from 247 to 210 min (p<0.0001). Conclusion Physician and mid-level provider coverage at triage significantly reduced emergency department LOS in this setting. DES accurately predicted the magnitude of this effect. These results suggest further work in the generalisability of triage providers and in the utility of DES for predicting quantitative effects of process changes. PMID:22398851
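    The DES approach can be sketched in a few lines. The toy model below uses deterministic arrivals and service with a FIFO discipline and hypothetical numbers, nothing like the calibrated St. Louis VA model, but it shows the qualitative effect the study measured: adding a second provider shortens stays when one provider cannot keep up.

```python
# Toy discrete-event sketch of an ED queue with `servers` identical providers.
import heapq

def simulate_ed(arrivals, service_time, servers=1):
    """Return each patient's length of stay (wait + service), FIFO discipline."""
    free_at = [0.0] * servers                    # times at which providers free up
    heapq.heapify(free_at)
    los = []
    for t in sorted(arrivals):
        start = max(t, heapq.heappop(free_at))   # wait for the earliest-free provider
        finish = start + service_time
        heapq.heappush(free_at, finish)
        los.append(finish - t)
    return los

# Arrivals every 10 min with 15 min of service: one provider falls steadily
# behind, while two providers see every patient on arrival.
arrivals = [10.0 * k for k in range(20)]
one_provider = simulate_ed(arrivals, 15.0, servers=1)
two_providers = simulate_ed(arrivals, 15.0, servers=2)
```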

  3. Advances in I/O, Speedup, and Universality on Colossus, an Unconventional Computer

    NASA Astrophysics Data System (ADS)

    Wells, Benjamin

    Colossus, the first electronic digital (and very unconventional) computer, was not a stored-program general purpose computer in the modern sense, although there are printed claims to the contrary. At least one of these asserts Colossus was a Turing machine. Certainly, an appropriate Turing machine can simulate the operation of Colossus. That is hardly an argument for generality of computation. But this is: a universal Turing machine could have been implemented on a clustering of the ten Colossus machines installed at Bletchley Park, England, by the end of WWII in 1945. Along with the presentation of this result, several improvements in input, output, and speed, within the hardware capability and specification of Colossus are discussed.

  4. Inspection of advanced computational lithography logic reticles using a 193-nm inspection system

    NASA Astrophysics Data System (ADS)

    Yu, Ching-Fang; Lin, Mei-Chun; Lai, Mei-Tsu; Hsu, Luke T. H.; Chin, Angus; Lee, S. C.; Yen, Anthony; Wang, Jim; Chen, Ellison; Wu, David; Broadbent, William H.; Huang, William; Zhu, Zinggang

    2010-09-01

    We report inspection results of early 22-nm logic reticles designed with both conventional and computational lithography methods. Inspection is performed using a state-of-the-art 193-nm reticle inspection system in the reticle-plane inspection mode (RPI), where both rule-based sensitivity control (RSC) and a newer model-based sensitivity control (MSC) method are tested. The evaluation includes defect detection performance using several special test reticles designed with both conventional and computational lithography methods; the reticles contain a variety of programmed critical defects which are measured based on wafer print impact. Also included are inspection results from several full-field product reticles designed with both conventional and computational lithography methods to determine if low nuisance-defect counts can be achieved. These early reticles are largely single-die and all inspections are performed in the die-to-database inspection mode only.

  5. Designing a single board computer for space using the most advanced processor and mitigation technologies

    NASA Astrophysics Data System (ADS)

    Longden, L.; Thibodeau, C.; Hiliman, R.; Layton, P.; Dowd, M.

    2002-12-01

    As high-end computing becomes more of a necessity in space, there currently exists a large gap between what is available to satellite manufacturers and the state of the commercial processor industry. As a result, Maxwell Technologies has developed a Super Computer for Space that utilizes the latest commercial Silicon-on-Insulator PowerPC processors and state-of-the-art memory modules to achieve space-qualified performance that is from 10 to 1000 times that of current technology. In addition, Maxwell's Super Computer for Space (SCS750) SBC is capable of executing up to 1800+ million instructions per second (MIPS), while guaranteeing upset rates for the entire board of less than 1 every 1000 years. Presented is a brief synopsis of the design approach, radiation mitigation techniques, and radiation test results employed on Maxwell's next-generation SBC.

  6. Effects of a radiation dose reduction strategy for computed tomography in severely injured trauma patients in the emergency department: an observational study

    PubMed Central

    2011-01-01

    Background Severely injured trauma patients are exposed to clinically significant radiation doses from computed tomography (CT) imaging in the emergency department. Moreover, this radiation exposure is associated with an increased risk of cancer. The purpose of this study was to determine some effects of a radiation dose reduction strategy for CT in severely injured trauma patients in the emergency department. Methods We implemented the radiation dose reduction strategy in May 2009. A prospective observational study design was used to collect data from patients who met the inclusion criteria during this one year study (intervention group) from May 2009 to April 2010. The prospective data were compared with data collected retrospectively for one year prior to the implementation of the radiation dose reduction strategy (control group). By comparison of the cumulative effective dose and the number of CT examinations in the two groups, we evaluated the effects of the radiation dose reduction strategy. All the patients met the institutional adult trauma team activation criteria. The radiation doses calculated by the CT scanner were converted to effective doses by multiplication by a conversion coefficient. Results A total of 118 patients were included in this study. Among them, 33 were admitted before May 2009 (control group), and 85 were admitted after May 2009 (intervention group). There were no significant differences between the two groups regarding baseline characteristics, such as injury severity and mortality. Additionally, there was no difference between the two groups in the mean number of total CT examinations per patient (4.8 vs. 4.5, respectively; p = 0.227). However, the mean effective dose of the total CT examinations per patient significantly decreased from 78.71 mSv to 29.50 mSv (p < 0.001). Conclusions The radiation dose reduction strategy for CT in severely injured trauma patients effectively decreased the cumulative effective dose of the total CT examinations.
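    The conversion step the abstract describes, a scanner-reported dose multiplied by a conversion coefficient to obtain an effective dose, can be sketched as follows. The region names, coefficients, and dose-length products below are illustrative placeholders, not the study's data.

```python
# Sketch of the dose bookkeeping: effective dose (mSv) = DLP (mGy*cm) times a
# region-specific conversion coefficient k (mSv per mGy*cm).  The k values and
# DLPs here are illustrative placeholders only.
K_REGION = {"head": 0.0021, "chest": 0.014, "abdomen_pelvis": 0.015}

def cumulative_effective_dose(scans):
    """scans: iterable of (region, dlp) pairs for one patient's CT exams."""
    return sum(K_REGION[region] * dlp for region, dlp in scans)

patient_scans = [("head", 900.0), ("chest", 400.0), ("abdomen_pelvis", 800.0)]
dose_msv = cumulative_effective_dose(patient_scans)   # about 19.5 mSv
```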

  7. High performance computing and communications: Advancing the frontiers of information technology

    SciTech Connect

    1997-12-31

    This report, which supplements the President's Fiscal Year 1997 Budget, describes the interagency High Performance Computing and Communications (HPCC) Program. The HPCC Program will celebrate its fifth anniversary in October 1996 with an impressive array of accomplishments to its credit. Over its five-year history, the HPCC Program has focused on developing high performance computing and communications technologies that can be applied to computation-intensive applications. Major highlights for FY 1996: (1) High performance computing systems enable practical solutions to complex problems with accuracies not possible five years ago; (2) HPCC-funded research in very large scale networking techniques has been instrumental in the evolution of the Internet, which continues exponential growth in size, speed, and availability of information; (3) The combination of hardware capability measured in gigaflop/s, networking technology measured in gigabit/s, and new computational science techniques for modeling phenomena has demonstrated that very large scale accurate scientific calculations can be executed across heterogeneous parallel processing systems located thousands of miles apart; (4) Federal investments in HPCC software R and D support researchers who pioneered the development of parallel languages and compilers, high performance mathematical, engineering, and scientific libraries, and software tools--technologies that allow scientists to use powerful parallel systems to focus on Federal agency mission applications; and (5) HPCC support for virtual environments has enabled the development of immersive technologies, where researchers can explore and manipulate multi-dimensional scientific and engineering problems. Educational programs fostered by the HPCC Program have brought into classrooms new science and engineering curricula designed to teach computational science. This document contains a small sample of the significant HPCC Program accomplishments in FY 1996.

  8. Development of Computational Approaches for Simulation and Advanced Controls for Hybrid Combustion-Gasification Chemical Looping

    SciTech Connect

    Joshi, Abhinaya; Lou, Xinsheng; Neuschaefer, Carl; Chaudry, Majid; Quinn, Joseph

    2012-07-31

    This document provides the results of the project through September 2009. The Phase I project has recently been extended from September 2009 to March 2011. The project extension will begin work on Chemical Looping (CL) Prototype modeling and advanced control design exploration in preparation for a scale-up phase. The results to date include: successful development of dual loop chemical looping process models and dynamic simulation software tools, development and test of several advanced control concepts and applications for Chemical Looping transport control and investigation of several sensor concepts and establishment of two feasible sensor candidates recommended for further prototype development and controls integration. There are three sections in this summary and conclusions. Section 1 presents the project scope and objectives. Section 2 highlights the detailed accomplishments by project task area. Section 3 provides conclusions to date and recommendations for future work.

  9. Detecting Nano-Scale Vibrations in Rotating Devices by Using Advanced Computational Methods

    PubMed Central

    del Toro, Raúl M.; Haber, Rodolfo E.; Schmittdiel, Michael C.

    2010-01-01

    This paper presents a computational method for detecting vibrations related to eccentricity in ultra precision rotation devices used for nano-scale manufacturing. The vibration is indirectly measured via a frequency domain analysis of the signal from a piezoelectric sensor attached to the stationary component of the rotating device. The algorithm searches for particular harmonic sequences associated with the eccentricity of the device rotation axis. The detected sequence is quantified and serves as input to a regression model that estimates the eccentricity. A case study presents the application of the computational algorithm during precision manufacturing processes. PMID:22399918
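The harmonic-sequence search described above can be sketched with single-bin discrete Fourier transform correlations at integer multiples of the rotation frequency. This is a minimal stdlib sketch; the sampling rate, rotation frequency, and harmonic amplitudes below are illustrative assumptions, not values from the paper.

```python
import cmath
import math

def harmonic_amplitudes(signal, fs, f_rot, n_harmonics=5):
    """Amplitude of the signal at integer multiples of the rotation
    frequency f_rot, via single-bin DFT correlations."""
    n = len(signal)
    amps = []
    for k in range(1, n_harmonics + 1):
        f = k * f_rot
        s = sum(x * cmath.exp(-2j * math.pi * f * i / fs)
                for i, x in enumerate(signal))
        amps.append(2.0 * abs(s) / n)  # rescale to sine amplitude
    return amps

# Synthetic sensor trace: a 50 Hz rotation whose eccentricity shows up
# as a fundamental plus a weaker second harmonic (assumed values).
fs, f_rot = 10_000, 50.0
sig = [1.0 * math.sin(2 * math.pi * f_rot * i / fs)
       + 0.3 * math.sin(2 * math.pi * 2 * f_rot * i / fs)
       for i in range(2_000)]
amps = harmonic_amplitudes(sig, fs, f_rot)
```

In the paper's workflow, the quantified harmonic sequence would then feed a regression model that maps harmonic content to an eccentricity estimate.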

  10. Advanced reliability modeling of fault-tolerant computer-based systems

    NASA Technical Reports Server (NTRS)

    Bavuso, S. J.

    1982-01-01

    Two methodologies for the reliability assessment of fault-tolerant digital computer-based systems are discussed. The computer-aided reliability estimation 3 (CARE 3) and gate logic software simulation (GLOSS) are assessment technologies that were developed to mitigate a serious weakness in the design and evaluation process of ultrareliable digital systems. The weak link is the unavailability of a modeling technique powerful enough to compare the stochastic attributes of one system against those of others. Some of the more interesting attributes are reliability, system survival, safety, and mission success.

  11. Structural analysis of advanced polymeric foams by means of high resolution X-ray computed tomography

    NASA Astrophysics Data System (ADS)

    Nacucchi, M.; De Pascalis, F.; Scatto, M.; Capodieci, L.; Albertoni, R.

    2016-06-01

    Advanced polymeric foams with enhanced thermal insulation and mechanical properties are used in a wide range of industrial applications. The properties of a foam strongly depend upon its cell structure. Traditionally, their microstructure has been studied using 2D imaging systems based on optical or electron microscopy, with the obvious disadvantage that only the surface of the sample can be analysed. To overcome this shortcoming, the adoption of X-ray micro-tomography imaging is here suggested to allow for a complete 3D, non-destructive analysis of advanced polymeric foams. Unlike in metallic foams, the resolution of the reconstructed structural features is hampered by the low contrast in the images due to weak X-ray absorption in the polymer. In this work an advanced methodology based on high-resolution and low-contrast techniques is used to perform quantitative analyses on both closed- and open-cell foams. Local structural features of individual cells such as equivalent diameter, sphericity, anisotropy and orientation are statistically evaluated. In addition, thickness and length of the struts are determined, underlining the key role played by the achieved resolution. In perspective, the quantitative description of these structural features will be used to evaluate the results of in situ mechanical and thermal tests on foam samples.
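Two of the per-cell metrics named above, equivalent diameter and sphericity, have standard closed forms. A minimal sketch follows; the voxel-derived volume and surface-area inputs are assumptions for illustration, not values from the paper.

```python
import math

def equivalent_diameter(volume):
    """Diameter of a sphere with the same volume as the cell."""
    return (6.0 * volume / math.pi) ** (1.0 / 3.0)

def sphericity(volume, surface_area):
    """Wadell sphericity: surface area of the volume-equivalent
    sphere divided by the cell's actual surface area."""
    return math.pi ** (1.0 / 3.0) * (6.0 * volume) ** (2.0 / 3.0) / surface_area

# Sanity check on a perfect sphere of radius 1 (volume 4pi/3,
# area 4pi): sphericity is exactly 1, equivalent diameter 2.
v = 4.0 / 3.0 * math.pi
a = 4.0 * math.pi
```

A real pipeline would take `volume` and `surface_area` per cell from the segmented tomography data; irregular cells give sphericity values below 1.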

  12. Methodological Advances in Political Gaming: The One-Person Computer Interactive, Quasi-Rigid Rule Game.

    ERIC Educational Resources Information Center

    Shubik, Martin

    The main problem in computer gaming research is the initial decision of choosing the type of gaming method to be used. Free-form games lead to exciting open-ended confrontations that generate much information. However, they do not easily lend themselves to analysis because they generate far too much information and their results are seldom…

  13. CNC Turning Center Advanced Operations. Computer Numerical Control Operator/Programmer. 444-332.

    ERIC Educational Resources Information Center

    Skowronski, Steven D.; Tatum, Kenneth

    This student guide provides materials for a course designed to introduce the student to the operations and functions of a two-axis computer numerical control (CNC) turning center. The course consists of seven units. Unit 1 presents course expectations and syllabus, covers safety precautions, and describes the CNC turning center components, CNC…

  14. Collaborative Learning: Cognitive and Computational Approaches. Advances in Learning and Instruction Series.

    ERIC Educational Resources Information Center

    Dillenbourg, Pierre, Ed.

    Intended to illustrate the benefits of collaboration between scientists from psychology and computer science, namely machine learning, this book contains the following chapters, most of which are co-authored by scholars from both sides: (1) "Introduction: What Do You Mean by 'Collaborative Learning'?" (Pierre Dillenbourg); (2)…

  15. 78 FR 68058 - Next Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-13

    ... Systems Biology [External Review Draft] AGENCY: Environmental Protection Agency (EPA). ACTION: Notice of... Systems Biology '' (EPA/600/R-13/214A). The original Federal Register notice announcing the public comment..., computational, and systems biology data can better inform risk assessment. This draft document is available...

  16. Exploring Interactive and Dynamic Simulations Using a Computer Algebra System in an Advanced Placement Chemistry Course

    ERIC Educational Resources Information Center

    Matsumoto, Paul S.

    2014-01-01

    The article describes the use of Mathematica, a computer algebra system (CAS), in a high school chemistry course. Mathematica was used to generate a graph, where a slider controls the value of parameter(s) in the equation; thus, students can visualize the effect of the parameter(s) on the behavior of the system. Also, Mathematica can show the…

  17. Advanced manned space flight simulation and training: An investigation of simulation host computer system concepts

    NASA Technical Reports Server (NTRS)

    Montag, Bruce C.; Bishop, Alfred M.; Redfield, Joe B.

    1989-01-01

    The findings of a preliminary investigation by Southwest Research Institute (SwRI) into simulation host computer concepts are presented. The investigation is designed to aid NASA in evaluating simulation technologies for use in spaceflight training. Its focus is on the next generation of space simulation systems that will be utilized in training personnel for Space Station Freedom operations. SwRI concludes that NASA should pursue a distributed simulation host computer system architecture for the Space Station Training Facility (SSTF) rather than a centralized mainframe-based arrangement. A distributed system offers many advantages and is seen by SwRI as the only architecture that will allow NASA to achieve established functional goals and operational objectives over the life of the Space Station Freedom program. Several distributed, parallel computing systems are available today that offer real-time capabilities for time-critical, man-in-the-loop simulation. These systems are flexible in terms of connectivity and configurability, and are easily scaled to meet increasing demands for more computing power.

  18. The Efficacy of Computer-Assisted Instruction for Advancing Literacy Skills in Kindergarten Children

    ERIC Educational Resources Information Center

    Macaruso, Paul; Walker, Adelaide

    2008-01-01

    We examined the benefits of computer-assisted instruction (CAI) as a supplement to a phonics-based reading curriculum for kindergartners in an urban public school system. The CAI program provides systematic exercises in phonological awareness and letter-sound correspondences. Comparisons were made between children in classes receiving a sufficient…

  19. The coupling of fluids, dynamics, and controls on advanced architecture computers

    NASA Technical Reports Server (NTRS)

    Atwood, Christopher

    1995-01-01

    This grant provided for the demonstration of coupled controls, body dynamics, and fluids computations in a workstation cluster environment; and an investigation of the impact of peer-peer communication on flow solver performance and robustness. The findings of these investigations were documented in the conference articles. The attached publication, 'Towards Distributed Fluids/Controls Simulations', documents the solution and scaling of the coupled Navier-Stokes, Euler rigid-body dynamics, and state feedback control equations for a two-dimensional canard-wing. The poor scaling shown was due to serialized grid connectivity computation and Ethernet bandwidth limits. The scaling of a peer-to-peer communication flow code on an IBM SP-2 was also shown. The scaling of the code on the switched fabric-linked nodes was good, with a 2.4 percent loss due to communication of intergrid boundary point information. The code performance on 30 worker nodes was 1.7 μs/point/iteration, or a factor of three over a Cray C-90 head. The attached paper, 'Nonlinear Fluid Computations in a Distributed Environment', documents the effect of several computational rate enhancing methods on convergence. For the cases shown, the highest throughput was achieved using boundary updates at each step, with the manager process performing communication tasks only. Constrained domain decomposition of the implicit fluid equations did not degrade the convergence rate or final solution. The scaling of a coupled body/fluid dynamics problem on an Ethernet-linked cluster was also shown.

  20. Dynamic Docking Test System (DDTS) active table computer program NASA Advanced Docking System (NADS)

    NASA Technical Reports Server (NTRS)

    Gates, R. M.; Jantz, R. E.

    1974-01-01

    A computer program was developed to describe the three-dimensional motion of the Dynamic Docking Test System active table. The input consists of inertia and geometry data, actuator structural data, forcing function data, hydraulics data, servo electronics data, and integration control data. The output consists of table responses, actuator bending responses, and actuator responses.

  1. 13th International Workshop on Advanced Computing and Analysis Techniques in Physics Research

    NASA Astrophysics Data System (ADS)

    Speer, T.; Boudjema, F.; Lauret, J.; Naumann, A.; Teodorescu, L.; Uwer, P.

    "Beyond the Cutting Edge in Computing" Fundamental research deals, by definition, with the two extremes: the extremely small and the extremely large. The LHC and Astroparticle physics experiments will soon offer new glimpses beyond the current frontiers, and the computing infrastructure to support such physics research needs to look beyond the cutting edge. Once more it seems that we are on the edge of a computing revolution. But perhaps what we are seeing now is an even more epochal change, where not only the pace of the revolution is changing but also its very nature. Change is no longer an "event" meant to open new possibilities that have to be understood first and then exploited to prepare the ground for a new leap. Change is becoming the very essence of the computing reality, sustained by a continuous flow of technical and paradigmatic innovation. The hardware is definitely moving toward more massive parallelism, in a breathtaking synthesis of all the past techniques of concurrent computation. New many-core machines offer opportunities for all sorts of Single/Multiple Instruction, Single/Multiple Data and vector computations that in the past required specialised hardware. At the same time, all levels of virtualisation imagined till now seem to be possible via Clouds, and possibly many more. Information Technology has been the working backbone of the Global Village, and now, in more than one sense, it is becoming itself the Global Village. Between these two, the gap between the need to adapt applications to exploit the new hardware possibilities and the push toward virtualisation of resources is widening, creating more challenges as technical and intellectual progress continues. ACAT 2010 proposes to explore and confront the different boundaries of the evolution of computing, and its possible consequences on our scientific activity. What do these new technologies entail for physics research? How will physics research benefit from this revolution in

  2. Advances in Computational Radiation Biophysics for Cancer Therapy: Simulating Nano-Scale Damage by Low-Energy Electrons

    NASA Astrophysics Data System (ADS)

    Kuncic, Zdenka

    2015-10-01

    Computational radiation biophysics is a rapidly growing area that is contributing, alongside new hardware technologies, to ongoing developments in cancer imaging and therapy. Recent advances in theoretical and computational modeling have enabled the simulation of discrete, event-by-event interactions of very low energy (≪ 100 eV) electrons with water in its liquid thermodynamic phase. This represents a significant advance in our ability to investigate the initial stages of radiation induced biological damage at the molecular level. Such studies are important for the development of novel cancer treatment strategies, an example of which is given by microbeam radiation therapy (MRT). Here, new results are shown demonstrating that when excitations and ionizations are resolved down to nano-scales, their distribution extends well outside the primary microbeam path, into regions that are not directly irradiated. This suggests that radiation dose alone is insufficient to fully quantify biological damage. These results also suggest that the radiation cross-fire may be an important clue to understanding the different observed responses of healthy cells and tumor cells to MRT.

  4. Novel genotype-phenotype associations in human cancers enabled by advanced molecular platforms and computational analysis of whole slide images.

    PubMed

    Cooper, Lee A D; Kong, Jun; Gutman, David A; Dunn, William D; Nalisnik, Michael; Brat, Daniel J

    2015-04-01

    Technological advances in computing, imaging, and genomics have created new opportunities for exploring relationships between histology, molecular events, and clinical outcomes using quantitative methods. Slide scanning devices are now capable of rapidly producing massive digital image archives that capture histological details in high resolution. Commensurate advances in computing and image analysis algorithms enable mining of archives to extract descriptions of histology, ranging from basic human annotations to automatic and precisely quantitative morphometric characterization of hundreds of millions of cells. These imaging capabilities represent a new dimension in tissue-based studies, and when combined with genomic and clinical endpoints, can be used to explore biologic characteristics of the tumor microenvironment and to discover new morphologic biomarkers of genetic alterations and patient outcomes. In this paper, we review developments in quantitative imaging technology and illustrate how image features can be integrated with clinical and genomic data to investigate fundamental problems in cancer. Using motivating examples from the study of glioblastomas (GBMs), we demonstrate how public data from The Cancer Genome Atlas (TCGA) can serve as an open platform to conduct in silico tissue-based studies that integrate existing data resources. We show how these approaches can be used to explore the relation of the tumor microenvironment to genomic alterations and gene expression patterns and to define nuclear morphometric features that are predictive of genetic alterations and clinical outcomes. Challenges, limitations, and emerging opportunities in the area of quantitative imaging and integrative analyses are also discussed.

  5. Functional imaging using computational fluid dynamics to predict treatment success of mandibular advancement devices in sleep-disordered breathing.

    PubMed

    De Backer, J W; Vanderveken, O M; Vos, W G; Devolder, A; Verhulst, S L; Verbraecken, J A; Parizel, P M; Braem, M J; Van de Heyning, P H; De Backer, W A

    2007-01-01

    Mandibular advancement devices (MADs) have emerged as a popular alternative for the treatment of sleep-disordered breathing. These devices bring the mandible forward in order to increase upper airway (UA) volume and prevent total UA collapse during sleep. However, the precise mechanism of action appears to be quite complex and is not yet completely understood; this might explain interindividual variation in treatment success. We examined whether a UA model that combines imaging techniques and computational fluid dynamics (CFD) allows for a prediction of the treatment outcome with MADs. Ten patients who were treated with a custom-made mandibular advancement device (MAD) underwent split-night polysomnography. The morning after the sleep study, a low radiation dose CT scan was scheduled with and without the MAD. The CT examinations allowed for a comparison between the change in UA volume and the anatomical characteristics through the conversion to three-dimensional computer models. Furthermore, the change in UA resistance could be calculated through flow simulations with CFD. Boundary conditions for the model, such as mass flow rate and pressure distributions, were obtained during the split-night polysomnography. Therefore, the flow modeling was based on a patient-specific geometry and patient-specific boundary conditions. The results indicated that a decrease in UA resistance and an increase in UA volume correlate with both a clinical and an objective improvement. The results of this pilot study suggest that the outcome of MAD treatment can be predicted using the described UA model.
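As a hedged illustration of the resistance comparison described above: upper-airway resistance is commonly summarized as pressure drop over flow rate, so a CFD-predicted change with and without the device reduces to a simple ratio. The pressure and flow values below are hypothetical, not the study's data.

```python
def airway_resistance(delta_p_pa, flow_l_per_s):
    """Lumped upper-airway resistance: pressure drop over flow rate."""
    return delta_p_pa / flow_l_per_s

# Hypothetical CFD results at the same inspiratory flow,
# without and with the mandibular advancement device.
r_baseline = airway_resistance(220.0, 0.40)  # Pa per (L/s)
r_with_mad = airway_resistance(120.0, 0.40)
reduction = 1.0 - r_with_mad / r_baseline    # fractional decrease
```

In the study, the pressure fields come from patient-specific CFD simulations and the flow rates from the split-night polysomnography boundary conditions.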

  6. The Practical Impact of Recent Computer Advances on the Analysis and Design of Large Scale Networks

    DTIC Science & Technology

    1974-06-01

    problem was developed and tested. - An extensive study of flow, delay and throughput in packet radio networks was completed. Department of Defense...to construct pathological examples in which chains with predominantly internal traffic are declared collapsible by the test), the criterion has been...deletion) and one link upgrading (or insertion) are performed simultaneously) if REMIN<RE<REMAX. 3. Acceptance test. If the new solution is dominated

  7. Initiatives to Advance Computer-Assisted Instruction. Report of the Joint Subcommittee to the Governor and the General Assembly of Virginia. House Document No. 34.

    ERIC Educational Resources Information Center

    Lambert, Benjamin J., III; And Others

    A study authorized by the General Assembly of Virginia during the 1983 session examined the specific education and training strategies that the state should pursue to meet the demands of technological advancements. Information was gathered concerning instructional computing, computer science programs, the needs of local school divisions,…

  8. Use of Respiratory-Correlated Four-Dimensional Computed Tomography to Determine Acceptable Treatment Margins for Locally Advanced Pancreatic Adenocarcinoma

    SciTech Connect

    Goldstein, Seth D.; Ford, Eric C.; Duhon, Mario; McNutt, Todd; Wong, John; Herman, Joseph M.

    2010-02-01

    Purpose: Respiratory-induced excursions of locally advanced pancreatic adenocarcinoma could affect dose delivery. This study quantified tumor motion and evaluated standard treatment margins. Methods and Materials: Respiratory-correlated four-dimensional computed tomography images were obtained on 30 patients with locally advanced pancreatic adenocarcinoma, 15 of whom underwent repeat scanning before cone-down treatment. Treatment planning software was used to contour the gross tumor volume (GTV), bilateral kidneys, and biliary stent. Excursions were calculated according to the centroid of the contoured volumes. Results: The mean ± standard deviation GTV excursion in the superoinferior (SI) direction was 0.55 ± 0.23 cm; an expansion of 1.0 cm adequately accounted for the GTV motion in 97% of locally advanced pancreatic adenocarcinoma patients. Motion GTVs were generated and resulted in a 25% average volume increase compared with the static GTV. Of the 30 patients, 17 had biliary stents. The mean SI stent excursion was 0.84 ± 0.32 cm, significantly greater than the GTV motion. The xiphoid process moved an average of 0.35 ± 0.12 cm, significantly less than the GTV. The mean SI motion of the left and right kidneys was 0.65 ± 0.27 cm and 0.77 ± 0.30 cm, respectively. At repeat scanning, no significant changes were seen in the mean GTV size (p = .8) or excursion (p = .3). Conclusion: These data suggest that an asymmetric expansion of 1.0, 0.7, and 0.6 cm along the respective SI, anteroposterior, and medial-lateral directions is recommended if a respiratory-correlated four-dimensional computed tomography scan is not available to evaluate the tumor motion during treatment planning. Surrogates of tumor motion, such as biliary stents or external markers, should be used with caution.
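Under a Gaussian model of the excursion distribution (my simplification for illustration, not the authors' method), the reported mean ± SD of 0.55 ± 0.23 cm can be checked against the 97% coverage figure with the standard library:

```python
from statistics import NormalDist

# Reported SI excursion of the GTV centroid: 0.55 +/- 0.23 cm,
# modeled here as a normal distribution across patients.
gtv_si = NormalDist(mu=0.55, sigma=0.23)

# Fraction of patients whose excursion stays within a 1.0 cm margin.
coverage = gtv_si.cdf(1.0)  # ≈ 0.97, consistent with the reported 97%
```

The agreement is reassuring but approximate; the paper's 97% is an empirical count over the 30-patient cohort, not a parametric estimate.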

  9. U.S. Department of Energy FreedomCAR and Vehicle Technologies Program Advanced Vehicle Testing Activity Federal Fleet Use of Electric Vehicles

    SciTech Connect

    Mindy Kirpatrick; J. E. Francfort

    2003-11-01

    Per Executive Order 13031, “Federal Alternative Fueled Vehicle Leadership,” the U.S. Department of Energy’s (DOE’s) Advanced Vehicle Testing Activity provided $998,300 in incremental funding to support the deployment of 220 electric vehicles in 36 Federal fleets. The 145 electric Ford Ranger pickups and 75 electric Chrysler EPIC (Electric Powered Interurban Commuter) minivans were operated in 14 states and the District of Columbia. The 220 vehicles were driven an estimated average of 700,000 miles annually. The annual estimated use of the 220 electric vehicles contributed to 39,000 fewer gallons of petroleum being used by Federal fleets and the reduction in emissions of 1,450 pounds of smog-forming pollution. Numerous attempts were made to obtain information from all 36 fleets. Information responses were received from 25 fleets (69% response rate), as some Federal fleet personnel who were originally involved with the Incremental Funding Project were transferred, retired, or simply could not be found. In addition, many of the Department of Defense fleets indicated that they were supporting operations in Iraq and unable to provide information for the foreseeable future. It should be noted that the opinions of the 25 fleets are based on operating 179 of the 220 electric vehicles (81% response rate). The data from the 25 fleets is summarized in this report. Twenty-two of the 25 fleets reported numerous problems with the vehicles, including mechanical, traction battery, and charging problems. Some of these problems, however, may have resulted from attempting to operate the vehicles beyond their capabilities. The majority of fleets reported that most of the vehicles were driven by numerous drivers each week, with most vehicles used for numerous trips per day. The vehicles were driven on average from 4 to 50 miles per day on a single charge. However, the majority of the fleets reported needing gasoline vehicles for missions beyond the capabilities of the electric

  10. Recent advances in renal hemodynamics: insights from bench experiments and computer simulations

    PubMed Central

    2015-01-01

    It has been long known that the kidney plays an essential role in the control of body fluids and blood pressure and that impairment of renal function may lead to the development of diseases such as hypertension (Guyton AC, Coleman TG, Granger Annu Rev Physiol 34: 13–46, 1972). In this review, we highlight recent advances in our understanding of renal hemodynamics, obtained from experimental and theoretical studies. Some of these studies were published in response to a recent Call for Papers of this journal: Renal Hemodynamics: Integrating with the Nephron and Beyond. PMID:25715984

  11. Computer-Controlled Detonation Spraying: From Process Fundamentals Toward Advanced Applications

    NASA Astrophysics Data System (ADS)

    Ulianitsky, V.; Shtertser, A.; Zlobin, S.; Smurov, I.

    2011-06-01

    Detonation spraying is a well-known technology which is applied for deposition of diverse powders, in particular cermets, to form various protective coatings. Current progress is related to the recently developed technique of computer-controlled detonation spraying and its application in non-traditional domains such as the development of composite and graded coatings or the metallization of plastics. The gas detonation parameters are analyzed to estimate the efficiency of different fuels in varying particle-in-flight velocity and temperature over a broad range, thus providing conditions to spray diverse powders. A particle of a given nature and fixed size can be sprayed in a solid state or strongly overheated above the melting point by varying the quantity of the explosive gas mixture, which is computer-controlled. Particle-in-flight velocity and temperature are calculated and compared with jet monitoring by a CCD-camera-based diagnostic tool and experimental data on splat formation.

  12. Artificial intelligence in medicine and cardiac imaging: harnessing big data and advanced computing to provide personalized medical diagnosis and treatment.

    PubMed

    Dilsizian, Steven E; Siegel, Eliot L

    2014-01-01

    Although advances in information technology in the past decade have come in quantum leaps in nearly every aspect of our lives, they seem to be coming at a slower pace in the field of medicine. However, the implementation of electronic health records (EHR) in hospitals is increasing rapidly, accelerated by the meaningful use initiatives associated with the Center for Medicare & Medicaid Services EHR Incentive Programs. The transition to electronic medical records and availability of patient data has been associated with increases in the volume and complexity of patient information, as well as an increase in medical alerts, with resulting "alert fatigue" and increased expectations for rapid and accurate diagnosis and treatment. Unfortunately, these increased demands on health care providers create greater risk for diagnostic and therapeutic errors. In the near future, artificial intelligence (AI)/machine learning will likely assist physicians with differential diagnosis of disease, suggestions and recommendations for treatment options, and, in the case of medical imaging, cues in image interpretation. Mining and advanced analysis of "big data" in health care provide the potential not only to perform "in silico" research but also to provide "real time" diagnostic and (potentially) therapeutic recommendations based on empirical data. "On demand" access to high-performance computing and large health care databases will support and sustain our ability to achieve personalized medicine. The IBM Jeopardy! Challenge, which pitted the best all-time human players against the Watson computer, captured the imagination of millions of people across the world and demonstrated the potential to apply AI approaches to a wide variety of subject matter, including medicine. The combination of AI, big data, and massively parallel computing offers the potential to create a revolutionary way of practicing evidence-based, personalized medicine.

  13. Digital Mammography: Development of an Advanced Computer-Aided Diagnosis System for Breast Cancer Detection

    DTIC Science & Technology

    2006-05-01

    aided diagnosis of masses with full-field digital mammography,” Acad. Radiol. 9, 4–12, 2002. 34. D. Gur, J. S. Stalder, L. A. Hardesty, B. Zheng, J. H...Pickett RM, D’Orsi CJ. Stereoscopic digital mammography: improving detection and diagnosis of breast cancer. Berlin, Germany: International Congress...other is the root-mean-square (RMS) distance between the computer and manually identified pectoral boundary. For 118 MLO view mammograms, 99.2% (117

  14. Research in advanced formal theorem-proving techniques. [design and implementation of computer languages

    NASA Technical Reports Server (NTRS)

    Raphael, B.; Fikes, R.; Waldinger, R.

    1973-01-01

    The results of a project aimed at the design and implementation of computer languages to aid in expressing problem solving procedures in several areas of artificial intelligence, including automatic programming, theorem proving, and robot planning, are summarised. The principal results of the project were the design and implementation of two complete systems, QA4 and QLISP, and their preliminary experimental use. The various applications of both QA4 and QLISP are given.

  15. Advanced earth observation spacecraft computer-aided design software: Technical, user and programmer guide

    NASA Technical Reports Server (NTRS)

    Farrell, C. E.; Krauze, L. D.

    1983-01-01

    NASA's IDEAS computer program is a tool for interactive preliminary design and analysis of LSS (Large Space Systems). Nine analysis modules were either modified or created. These modules include the capabilities of automatic model generation, model mass properties calculation, model area calculation, nonkinematic deployment modeling, rigid-body controls analysis, RF performance prediction, subsystem properties definition, and EOS science sensor selection. For each module, a section is provided that contains technical information, user instructions, and programmer documentation.

  16. Towards Precision Medicine: Advances in Computational Approaches for the Analysis of Human Variants

    PubMed Central

    Peterson, Thomas A; Doughty, Emily; Kann, Maricel G

    2013-01-01

    Variations and similarities in our individual genomes are part of our history, our heritage, and our identity. Some human genomic variants are associated with common traits such as hair and eye color, while others are associated with susceptibility to disease or response to drug treatment. Identifying the human variations producing clinically relevant phenotypic changes is critical for providing accurate and personalized diagnosis, prognosis, and treatment for diseases. Furthermore, a better understanding of the molecular underpinning of disease can lead to development of new drug targets for precision medicine. Several resources have been designed for collecting and storing human genomic variations in highly structured, easily accessible databases. Unfortunately, a vast amount of information about these genetic variants and their functional and phenotypic associations is currently buried in the literature, only accessible by manual curation or sophisticated text mining technology to extract the relevant information. In addition, the low cost of sequencing technologies coupled with increasing computational power has enabled the development of numerous computational methodologies to predict the pathogenicity of human variants. This review provides a detailed comparison of current human variant resources, including HGMD, OMIM, ClinVar, and UniProt/Swiss-Prot, followed by an overview of the computational methods and techniques used to leverage the available data to predict novel deleterious variants. We expect these resources and tools to become the foundation for understanding the molecular details of genomic variants leading to disease, which in turn will enable the promise of precision medicine. PMID:23962656

  17. Application of advanced grid generation techniques for flow field computations about complex configurations

    NASA Technical Reports Server (NTRS)

    Kathong, Monchai; Tiwari, Surendra N.

    1988-01-01

    In the computation of flowfields about complex configurations, it is very difficult to construct a single boundary-fitted coordinate system. An alternative approach is to use several grids at once, each generated independently; this is called the multiple grids or zonal grids approach, and its applications are investigated here. The method is conservative, providing conservation of fluxes at grid interfaces. The Euler equations are solved numerically on such grids for various configurations. The numerical scheme used is the finite-volume technique with a three-stage Runge-Kutta time integration. The code is vectorized and programmed to run on the CDC VPS-32 computer. Steady state solutions of the Euler equations are presented and discussed. The solutions include: low speed flow over a sphere, high speed flow over a slender body, supersonic flow through a duct, and supersonic internal/external flow interaction for an aircraft configuration at various angles of attack. The results demonstrate that the multiple grids approach, along with conservative interfacing, is capable of computing flows about complex configurations where the use of a single grid system is not possible.

  18. Development of advanced computational fluid dynamics tools and their application to simulation of internal turbulent flows

    NASA Astrophysics Data System (ADS)

    Emelyanov, V. N.; Karpenko, A. G.; Volkov, K. N.

    2015-06-01

    Modern graphics processing units (GPUs) provide architectures and new programming models that make it possible to harness their large processing power and to run computational fluid dynamics (CFD) simulations at both high performance and low cost. Possibilities for the use of GPUs in the simulation of internal fluid flows are discussed. The finite volume method is applied to solve the three-dimensional (3D) unsteady compressible Euler and Navier-Stokes equations on unstructured meshes. Compute Unified Device Architecture (CUDA) technology is used for the programming implementation of the parallel computational algorithms. Solutions of some fluid dynamics problems on GPUs are presented, and approaches to optimization of the CFD code related to the use of different types of memory are discussed. Speedup of the GPU solution relative to the solution on a central processing unit (CPU) is compared for different meshes and different methods of distributing the input data into blocks. Performance measurements show that the numerical schemes developed achieve a 20- to 50-fold speedup on GPU hardware compared to the CPU reference implementation. The results obtained provide a promising perspective for designing a GPU-based software framework for applications in CFD.
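
    The per-cell independence that makes explicit finite-volume updates GPU-friendly can be illustrated with a minimal one-dimensional sketch (a NumPy stand-in for a CUDA kernel; the linear-advection flux and first-order scheme here are illustrative choices, not the paper's solver):

    ```python
    import numpy as np

    def upwind_step(u, c, dt, dx):
        """One explicit finite-volume step for linear advection with c > 0.
        Every interior cell update reads only its own and its left
        neighbour's flux, so the vectorised line below maps one-to-one
        onto independent GPU threads in a CUDA kernel."""
        flux = c * u                            # upwind flux leaving each cell
        un = u.copy()
        un[1:] -= (dt / dx) * (flux[1:] - flux[:-1])
        return un
    ```

    In a CUDA implementation, the memory-layout optimizations the abstract alludes to would correspond to choices such as coalesced global-memory access and staging neighbour values in shared memory.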

  19. Advancing the boundaries of high-connectivity network simulation with distributed computing.

    PubMed

    Morrison, Abigail; Mehring, Carsten; Geisel, Theo; Aertsen, A D; Diesmann, Markus

    2005-08-01

    The availability of efficient and reliable simulation tools is one of the mission-critical technologies in the fast-moving field of computational neuroscience. Research indicates that higher brain functions emerge from large and complex cortical networks and their interactions. The large number of elements (neurons) combined with the high connectivity (synapses) of the biological network and the specific type of interactions impose severe constraints on the explorable system size that previously have been hard to overcome. Here we present a collection of new techniques, combined into a coherent simulation tool, that removes the fundamental obstacle in the computational study of biological neural networks: the enormous number of synaptic contacts per neuron. Distributing an individual simulation over multiple computers enables the investigation of networks orders of magnitude larger than previously possible. The software scales excellently on a wide range of tested hardware, so it can be used in an interactive and iterative fashion for the development of ideas, and results can be produced quickly even for very large networks. In contrast to earlier approaches, a wide class of neuron models and synaptic dynamics can be represented.
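
    A hedged sketch of the kind of distribution the abstract describes: round-robin neuron placement is one common convention in distributed simulators (e.g. NEST); the exact scheme below is an assumption, not the paper's algorithm.

    ```python
    from collections import defaultdict

    def distribute(neuron_ids, n_procs):
        """Assign each neuron to a process round-robin. Synapses are then
        stored on the process that owns the *target* neuron, so only
        spike events (not synaptic state) cross machine boundaries."""
        ranks = defaultdict(list)
        for gid in neuron_ids:
            ranks[gid % n_procs].append(gid)
        return dict(ranks)
    ```

    Storing synapses postsynaptically is what keeps inter-process traffic proportional to spike counts rather than to the enormous number of synaptic contacts per neuron.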

  20. Exploring the influence of gender, seniority and specialty on paper and computer-based feedback provision during mini-CEX assessments in a busy emergency department.

    PubMed

    Chang, Yu-Che; Lee, Ching-Hsing; Chen, Chien-Kuang; Liao, Chien-Hung; Ng, Chip-Jin; Chen, Jih-Chang; Chaou, Chung-Hsien

    2017-03-01

    The mini-clinical evaluation exercise (mini-CEX) is a well-established method of assessing trainees' clinical competence in the workplace. In order to improve the quality of clinical learning, factors that influence the provision of feedback are worthy of further investigation. A retrospective data analysis of documented feedback provided by assessors using the mini-CEX in a busy emergency department (ED) was conducted. The assessors comprised emergency physicians (EPs) and trauma surgeons. The trainees were all postgraduate year one (PGY1) residents. The completion rate and word count for each of three feedback components (positive feedback, suggestions for development, and an agreed action plan) were recorded. Other variables included observation time, feedback time, the format used (paper versus computer-based), the seniority of the assessor, the gender of the assessor and the specialty of the assessor. The components of feedback provided by the assessors and the influence of these contextual and demographic factors were also analyzed. During a 26-month study period, 1101 mini-CEX assessments (from 273 PGY1 residents and 67 assessors) were collected. The overall completion rate for the feedback components was 85.3 % (positive feedback), 54.8 % (suggestions for development), and 29.5 % (agreed action plan). In only 22.9 % of the total mini-CEX assessments were all three aspects of feedback completed, and 7.4 % contained no feedback. In the univariate analysis, the mini-CEX format, the seniority of the assessor and the specialty of the assessor were identified as influencing the completion of all three components of feedback. In the multivariate analysis, only the mini-CEX format and the seniority of the assessor were statistically significant. In a subgroup analysis, the feedback-facilitating effect of the computer-based format was uneven across junior and senior EPs. In addition, feedback provision showed a primacy effect: assessors tended to provide only

  1. Computational Fluid Dynamic Analysis of the Posterior Airway Space After Maxillomandibular Advancement For Obstructive Sleep Apnea Syndrome

    PubMed Central

    Sittitavornwong, Somsak; Waite, Peter D.; Shih, Alan M.; Cheng, Gary C.; Koomullil, Roy; Ito, Yasushi; Cure, Joel K; Harding, Susan M.; Litaker, Mark

    2013-01-01

    Purpose Evaluate the soft tissue change of the upper airway after maxillomandibular advancement (MMA) by computational fluid dynamics (CFD). Materials and Methods Eight OSAS patients who required MMA were recruited into this study. All participants had pre- and post-operative computed tomography (CT) and underwent MMA by a single oral and maxillofacial surgeon. Upper airway CT data sets for these 8 participants were used to create high-fidelity 3-D numerical models for CFD. The 3-D models were simulated and analyzed to study how changes in airway anatomy affect the pressure effort required for normal breathing. Airway dimensions, skeletal changes, Apnea-Hypopnea Index (AHI), and pressure efforts of the pre- and post-operative 3-D models were compared and correlations interpreted. Results After MMA, laminar and turbulent air flow was significantly decreased at every level of the airway. The cross-sectional areas at the soft palate and tongue base were significantly increased. Conclusions This study shows that MMA increases airway dimensions by increasing the occipital base (Base)-pogonion (Pg) distance. An increase in the Base-Pg distance showed a significant correlation with AHI improvement and a decreased pressure effort of the upper airway. Decreasing the pressure effort will decrease the breathing workload, improving the condition of OSAS. PMID:23642544

  2. Advanced adaptive computational methods for Navier-Stokes simulations in rotorcraft aerodynamics

    NASA Technical Reports Server (NTRS)

    Stowers, S. T.; Bass, J. M.; Oden, J. T.

    1993-01-01

    A phase 2 research and development effort was conducted in the area of transonic, compressible, inviscid flows, with the ultimate goal of numerically modeling the complex flows inherent in advanced helicopter blade designs. The algorithms and methodologies developed are classified as adaptive methods: they use error estimation techniques to approximate the local numerical error and automatically refine or unrefine the mesh so as to deliver a given level of accuracy. The result is a scheme that attempts to produce the best possible results with the least number of grid points, degrees of freedom, and operations. These types of schemes automatically locate and resolve shocks, shear layers, and other flow details to an accuracy level specified by the user of the code. The phase 1 work involved a feasibility study of h-adaptive methods for steady viscous flows, with emphasis on accurate simulation of vortex initiation, migration, and interaction. The phase 2 effort focused on extending these algorithms and methodologies to a three-dimensional topology.

  3. Development of Computational Capabilities to Predict the Corrosion Wastage of Boiler Tubes in Advanced Combustion Systems

    SciTech Connect

    Kung, Steven; Rapp, Robert

    2014-08-31

    A comprehensive corrosion research project consisting of pilot-scale combustion testing and long-term laboratory corrosion study has been successfully performed. A pilot-scale combustion facility available at Brigham Young University was selected and modified to enable burning of pulverized coals under the operating conditions typical for advanced coal-fired utility boilers. Eight United States (U.S.) coals were selected for this investigation, with the test conditions for all coals set to have the same heat input to the combustor. In addition, the air/fuel stoichiometric ratio was controlled so that staged combustion was established, with the stoichiometric ratio maintained at 0.85 in the burner zone and 1.15 in the burnout zone. The burner zone represented the lower furnace of utility boilers, while the burnout zone mimicked the upper furnace areas adjacent to the superheaters and reheaters. From this staged combustion, approximately 3% excess oxygen was attained in the combustion gas at the furnace outlet. During each of the pilot-scale combustion tests, extensive online measurements of the flue gas compositions were performed. In addition, deposit samples were collected at the same location for chemical analyses. Such extensive gas and deposit analyses enabled detailed characterization of the actual combustion environments existing at the lower furnace walls under reducing conditions and those adjacent to the superheaters and reheaters under oxidizing conditions in advanced U.S. coal-fired utility boilers. The gas and deposit compositions were then carefully simulated in a series of 1000-hour laboratory corrosion tests, in which the corrosion performances of different commercial candidate alloys and weld overlays were evaluated at various temperatures for advanced boiler systems. Results of this laboratory study led to significant improvement in understanding of the corrosion mechanisms operating on the furnace walls as well as superheaters and reheaters in

  4. TRAC-PF1: an advanced best-estimate computer program for pressurized water reactor analysis

    SciTech Connect

    Liles, D.R.; Mahaffy, J.H.

    1984-02-01

    The Transient Reactor Analysis Code (TRAC) is being developed at the Los Alamos National Laboratory to provide advanced best-estimate predictions of postulated accidents in light water reactors. The TRAC-PF1 program provides this capability for pressurized water reactors and for many thermal-hydraulic experimental facilities. The code features either a one-dimensional or a three-dimensional treatment of the pressure vessel and its associated internals; a two-phase, two-fluid nonequilibrium hydrodynamics model with a noncondensable gas field; flow-regime-dependent constitutive equation treatment; optional reflood tracking capability for both bottom flood and falling-film quench fronts; and consistent treatment of entire accident sequences including the generation of consistent initial conditions. This report describes the thermal-hydraulic models and the numerical solution methods used in the code. Detailed programming and user information also are provided.

  5. Advanced Computational Thermal Fluid Physics (CTFP) and Its Assessment for Light Water Reactors and Supercritical Reactors

    SciTech Connect

    D.M. McEligot; K. G. Condie; G. E. McCreery; H. M. McIlroy; R. J. Pink; L.E. Hochreiter; J.D. Jackson; R.H. Pletcher; B.L. Smith; P. Vukoslavcevic; J.M. Wallace; J.Y. Yoo; J.S. Lee; S.T. Ro; S.O. Park

    2005-10-01

    Background: The ultimate goal of the study is the improvement of predictive methods for safety analyses and design of Generation IV reactor systems such as supercritical water reactors (SCWR) for higher efficiency, improved performance and operation, design simplification, enhanced safety and reduced waste and cost. The objective of this Korean / US / laboratory / university collaboration of coupled fundamental computational and experimental studies is to develop the supporting knowledge needed for improved predictive techniques for use in the technology development of Generation IV reactor concepts and their passive safety systems. The present study emphasizes SCWR concepts in the Generation IV program.

  6. Computer Aided Design of Advanced Turbine Airfoil Alloys for Industrial Gas Turbines in Coal Fired Environments

    SciTech Connect

    G.E. Fuchs

    2007-12-31

    Recent initiatives for fuel flexibility, increased efficiency and decreased emissions in power generating industrial gas turbines (IGTs) have highlighted the need to develop techniques for producing large single crystal or columnar grained, directionally solidified Ni-base superalloy turbine blades and vanes. In order to address the technical difficulties of producing large single crystal components, a program has been initiated to use computational materials science to better understand how alloy composition in potential IGT alloys and solidification conditions during processing affect castability, defect formation and environmental resistance. This program will help to identify potential routes for the development of high strength, corrosion resistant airfoil/vane alloys, which would benefit all IGTs, including small IGTs and even aerospace gas turbines. During the first year, collaboration with Siemens Power Corporation (SPC), Rolls-Royce, Howmet and Solar Turbines identified and evaluated about 50 alloy compositions that are of interest for this potential application. In addition, modifications to an existing alloy (CMSX-4) were also evaluated. Collaboration with SPC, using computational software at SPC to evaluate the approximately 50 alloy compositions, identified 5 candidate alloys for experimental evaluation. In many cases the experimentally determined phase transformation temperatures did not compare well to the calculated values. The effects of small additions of boundary strengtheners (i.e., C, B and N) to CMSX-4 were also examined. The calculated phase transformation temperatures were somewhat closer to the experimentally determined values than for the 5 candidate alloys discussed above. The calculated partitioning coefficients were similar for all of the CMSX-4 alloys, consistent with the experimentally determined segregation behavior.
In general, it appears that computational materials science has become a

  7. Advances in Human-Computer Interaction: Graphics and Animation Components for Interface Design

    NASA Astrophysics Data System (ADS)

    Cipolla Ficarra, Francisco V.; Nicol, Emma; Cipolla-Ficarra, Miguel; Richardson, Lucy

    We present an analysis of a communicability methodology for graphics and animation components in interface design, called CAN (Communicability, Acceptability and Novelty). This methodology was developed between 2005 and 2010, obtaining excellent results in cultural heritage, education and microcomputing contexts, in studies where there is a bi-directional interrelation between ergonomics, usability, user-centered design, software quality and human-computer interaction. We also present heuristic results on iconography and layout design in blogs and websites from the following countries: Spain, Italy, Portugal and France.

  8. Recent advances in first principles computational research of cathode materials for lithium-ion batteries.

    PubMed

    Meng, Ying Shirley; Arroyo-de Dompablo, M Elena

    2013-05-21

    To meet the increasing demands of energy storage, particularly for transportation applications such as plug-in hybrid electric vehicles, researchers will need to develop improved lithium-ion battery electrode materials that exhibit high energy density, high power, better safety, and longer cycle life. The acceleration of materials discovery, synthesis, and optimization will benefit from the combination of both experimental and computational methods. First principles (ab initio) computational methods have been widely used in materials science and can play an important role in accelerating the development and optimization of new energy storage materials. These methods can prescreen previously unknown compounds and can explain complex phenomena observed with these compounds. Intercalation compounds, where Li(+) ions insert into the host structure without causing significant rearrangement of the original structure, have served as the workhorse for lithium ion rechargeable battery electrodes. Intercalation compounds will also facilitate the development of new battery chemistries such as sodium-ion batteries. During the electrochemical discharge reaction process, the intercalating species travel from the negative to the positive electrode, driving the transition metal ion in the positive electrode to a lower oxidation state, which delivers useful current. Many materials properties change as a function of the intercalating species concentration (at different states of charge). Therefore, researchers will need to understand and control these dynamic changes to optimize the electrochemical performance of the cell. In this Account, we focus on first-principles computational investigations toward understanding, controlling, and improving the intrinsic properties of five well known high energy density Li intercalation electrode materials: layered oxides (LiMO2), spinel oxides (LiM2O4), olivine phosphates (LiMPO4), silicates-Li2MSiO4, and the tavorite-LiM(XO4)F (M = 3d

  9. Computer Aided Diagnosis for Confocal Laser Endomicroscopy in Advanced Colorectal Adenocarcinoma

    PubMed Central

    Ştefănescu, Daniela; Streba, Costin; Cârţână, Elena Tatiana; Săftoiu, Adrian; Gruionu, Gabriel; Gruionu, Lucian Gheorghe

    2016-01-01

    Introduction Confocal laser endomicroscopy (CLE) is becoming a popular method for optical biopsy of digestive mucosa for both diagnostic and therapeutic procedures. Computer aided diagnosis of CLE images, using image processing and fractal analysis, can be used to quantify the histological structures in the CLE generated images. The aim of this study is to develop an automatic diagnosis algorithm for colorectal cancer (CRC), based on fractal analysis and neural network modeling of the CLE-generated colon mucosa images. Materials and Methods We retrospectively analyzed a series of 1035 artifact-free endomicroscopy images, obtained during CLE examinations from normal mucosa (356 images) and tumor regions (679 images). The images were processed using a computer aided diagnosis (CAD) medical imaging system in order to obtain an automatic diagnosis. The CAD application includes image reading and processing functions, a module for fractal analysis, a grey-level co-occurrence matrix (GLCM) computation module, and a feature identification module based on the Marching Squares and linear interpolation methods. A two-layer neural network was trained to automatically interpret the imaging data and diagnose the pathological samples based on the fractal dimension and the characteristic features of the biological tissues. Results Normal colon mucosa is characterized by regular polyhedral crypt structures, whereas malignant colon mucosa is characterized by irregular and interrupted crypts, which can be diagnosed by CAD. For this purpose, seven geometric parameters were defined for each image: fractal dimension, lacunarity, contrast, correlation, energy, homogeneity, and feature number. Of the seven parameters, only contrast, homogeneity and feature number were significantly different between normal and cancer samples. Next, a two-layer feed-forward neural network was used to train and automatically diagnose the malignant samples, based on the seven parameters tested. 
The neural network
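
    Two of the texture features named above, contrast and homogeneity, come straight from the grey-level co-occurrence matrix; a minimal NumPy sketch of that computation (the pixel offset and grey-level count are illustrative choices, not the paper's settings):

    ```python
    import numpy as np

    def glcm(image, dx=1, dy=0, levels=8):
        """Normalised grey-level co-occurrence matrix: m[i, j] is the
        probability that a pixel of level i has a neighbour of level j
        at offset (dx, dy)."""
        h, w = image.shape
        m = np.zeros((levels, levels))
        for y in range(h - dy):
            for x in range(w - dx):
                m[image[y, x], image[y + dy, x + dx]] += 1
        return m / m.sum()

    def contrast(m):
        """Weighted by squared level difference: high for abrupt texture."""
        i, j = np.indices(m.shape)
        return float(((i - j) ** 2 * m).sum())

    def homogeneity(m):
        """Inverse-difference weighting: high for smooth texture."""
        i, j = np.indices(m.shape)
        return float((m / (1.0 + np.abs(i - j))).sum())
    ```

    Production code would typically use a library routine such as scikit-image's `graycomatrix`/`graycoprops` rather than the explicit loops above.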

  10. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan. Part 1 : ASC software quality engineering practices version 1.0.

    SciTech Connect

    Minana, Molly A.; Sturtevant, Judith E.; Heaphy, Robert; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Forsythe, Christi A.; Schofield, Joseph Richard, Jr.; Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2005-01-01

    The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in DOE/AL Quality Criteria (QC-1) as conformance to customer requirements and expectations. This quality plan defines the ASC program software quality practices and provides mappings of these practices to the SNL Corporate Process Requirements (CPR 1.3.2 and CPR 1.3.6) and the Department of Energy (DOE) document, ASCI Software Quality Engineering: Goals, Principles, and Guidelines (GP&G). This quality plan identifies ASC management and software project teams' responsibilities for cost-effective software engineering quality practices. The SNL ASC Software Quality Plan establishes the signatories' commitment to improving software products by applying cost-effective software engineering quality practices. This document explains the project teams' opportunities for tailoring and implementing the practices; enumerates the practices that compose the development of SNL ASC's software products; and includes a sample assessment checklist that was developed based upon the practices in this document.

  11. Advanced computational workflow for the multi-scale modeling of the bone metabolic processes.

    PubMed

    Dao, Tien Tuan

    2016-09-16

    Multi-scale modeling of the musculoskeletal system plays an essential role in the deep understanding of complex mechanisms underlying biological phenomena and processes such as bone metabolic processes. Current multi-scale models suffer from the isolation of sub-models at each anatomical scale. The objective of the present work was to develop a new, fully integrated computational workflow for simulating bone metabolic processes at multiple scales. The organ-level model employs multi-body dynamics to estimate body boundary and loading conditions from body kinematics. The tissue-level model uses the finite element method to estimate tissue deformation and mechanical loading under body loading conditions. Finally, the cell-level model includes the bone remodeling mechanism through an agent-based simulation under tissue loading. A case study on the bone remodeling process in the human jaw was performed and presented. The developed multi-scale model of the human jaw was validated using literature-based data at each anatomical level. Simulation outcomes fall within literature-based ranges of values for estimated muscle force, tissue loading and cell dynamics during the bone remodeling process. This study opens perspectives for accurately simulating bone metabolic processes using a fully integrated computational workflow, leading to a better understanding of musculoskeletal system function across multiple length scales, as well as providing new informative data for clinical decision support and industrial applications.
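
    The organ-to-tissue-to-cell coupling described above is, structurally, a pipeline in which each stage consumes the previous stage's output; a hedged sketch with hypothetical function names (the actual models are multi-body dynamics, finite elements, and an agent-based remodeling simulation):

    ```python
    def multiscale_step(kinematics, organ_model, tissue_model, cell_model):
        """One pass through a coupled organ -> tissue -> cell workflow.
        Passing each stage's output to the next is what distinguishes an
        integrated workflow from isolated sub-models at each scale."""
        loads = organ_model(kinematics)   # organ level: boundary/loading conditions
        strain = tissue_model(loads)      # tissue level: deformation under load
        return cell_model(strain)         # cell level: remodeling response
    ```

    In the integrated workflow this step would be iterated, with the remodeled tissue properties fed back into the next tissue-level solve.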

  12. Advanced computations of multi-physics, multi-scale effects in beam dynamics

    SciTech Connect

    Amundson, J.F.; Macridin, A.; Spentzouris, P.; Stern, E.G.; /Fermilab

    2009-01-01

    Current state-of-the-art beam dynamics simulations include multiple physical effects and multiple physical length and/or time scales. We present recent developments in Synergia2, an accelerator modeling framework designed for multi-physics, multi-scale simulations. We summarize several recent results in multi-physics beam dynamics, including simulations of three Fermilab accelerators: the Tevatron, the Main Injector and the Debuncher. Early accelerator simulations focused on single-particle dynamics. To a first approximation, the forces on the particles in an accelerator beam are dominated by the external fields due to magnets, RF cavities, etc., so the single-particle dynamics are the leading physical effects. Detailed simulations of accelerators must include collective effects such as the space-charge repulsion of the beam particles, the effects of wake fields in the beam pipe walls and beam-beam interactions in colliders. These simulations require the sort of massively parallel computers that have only become available in recent times. We give an overview of the accelerator framework Synergia2, which was designed to take advantage of the capabilities of modern computational resources and enable simulations of multiple physical effects. We also summarize some recent results utilizing Synergia2 and BeamBeam3d, a tool specialized for beam-beam simulations.

  13. Recent advances in the application of computational mechanics to the diagnosis and treatment of cardiovascular disease.

    PubMed

    Del Alamo, Juan C; Marsden, Alison L; Lasheras, Juan C

    2009-07-01

    During the last 30 years, research into the pathogenesis and progression of cardiovascular disease has had to employ a multidisciplinary approach involving a wide range of subject areas, from molecular and cell biology to computational mechanics and experimental solid and fluid mechanics. In general, research was driven by the need to provide answers to questions of critical importance for disease management. Ongoing improvements in the spatial resolution of medical imaging equipment coupled to an exponential growth in the capacity, flexibility and speed of computational techniques have provided a valuable opportunity for numerical simulations and complex experimental techniques to make a contribution to improving the diagnosis and clinical management of many forms of cardiovascular disease. This paper contains a review of recent progress in the numerical simulation of cardiovascular mechanics, focusing on three particular areas: patient-specific modeling and the optimization of surgery in pediatric cardiology, evaluating the risk of rupture in aortic aneurysms, and noninvasive characterization of intraventricular flow in the management of heart failure.

  14. Advancement in Understanding Volcanic Processes by 4D Synchrotron X-ray Computed Microtomography Imaging of Rock Textures

    NASA Astrophysics Data System (ADS)

    Polacci, M.; Arzilli, F.; La Spina, G.

    2015-12-01

    X-ray computed microtomography (μCT) is the only high-resolution, non-destructive technique that allows visualization and processing of geomaterials directly in three dimensions. This, together with the development of ever more sophisticated imaging techniques, has in the last ten years generated widespread application of the methodology across the Earth sciences, from structural geology to palaeontology to igneous petrology to volcanology. Here, I will describe how X-ray μCT has contributed to advancing our knowledge of volcanic processes and eruption dynamics, and illustrate the first, preliminary results from 4D (space + time) X-ray microtomographic experiments on magma kinetics in basaltic systems.

  15. Battlefield Medical Information System-Tactical (BMIST): the application of mobile computing technologies to support health surveillance in the Department of Defense.

    PubMed

    Morris, Tommy J; Pajak, John; Havlik, Frank; Kenyon, Jessica; Calcagni, Dean

    2006-08-01

    This paper discusses the innovation process of the Battlefield Medical Information System- Tactical (BMIST), a point-of-care mobile computing solution for reducing medical errors and improving the quality of care provided to our military personnel in the field. In such remote environments, medical providers have traditionally had limited access to medical information, a situation quite analogous to that in remote areas of underdeveloped or developing countries. BMIST provides an all-in-one suite of mobile applications that empowers providers via access to critical medical information and powerful clinical decision support tools to accurately create an electronic health record (EHR). This record is synchronized with Department of Defense (DOD) joint health surveillance and medical information systems from the earliest echelons of care through chronic care provided by the Veterans Administration. Specific goals met in the initial phase were: integration of the PDA and wireless interface; development of the local application and user interface; development of a communications infrastructure and development of a data storage and retrieval system. The system had been used extensively in the field to create an EHR far forward that supports a longitudinal medical record across time and across all elements of the Military Healthcare System.

  16. Advances in Software Tools for Pre-processing and Post-processing of Overset Grid Computations

    NASA Technical Reports Server (NTRS)

    Chan, William M.

    2004-01-01

    Recent developments in three pieces of software for performing pre-processing and post-processing work on numerical computations using overset grids are presented. The first is the OVERGRID graphical interface which provides a unified environment for the visualization, manipulation, generation and diagnostics of geometry and grids. Modules are also available for automatic boundary conditions detection, flow solver input preparation, multiple component dynamics input preparation and dynamics animation, simple solution viewing for moving components, and debris trajectory analysis input preparation. The second is a grid generation script library that enables rapid creation of grid generation scripts. A sample of recent applications will be described. The third is the OVERPLOT graphical interface for displaying and analyzing history files generated by the flow solver. Data displayed include residuals, component forces and moments, number of supersonic and reverse flow points, and various dynamics parameters.

  17. Advanced Undergraduate Computer Based Astronomy Lab. The Astrometric Binary Kruger 60.

    NASA Astrophysics Data System (ADS)

    Slovak, M. H.

    2002-12-01

    A challenging computer-based lab for undergraduate astronomy students has been developed to determine the masses of the components of the visual binary system Kruger 60 = HD 239960 = BD+56 2783 using archival astrometric observations. The data consist of separations and position angles from 1898 to 1949 (Lippincott 1953; Van de Kamp 1967) of Kruger 60B relative to Kruger 60A, covering a complete orbit. After reviewing Kepler's third (Harmonic) Law and Newton's revision of it, students analyze the data using Microsoft Excel to calculate a best-fit ellipse for the relative orbit of Kruger 60B. The importance of deriving stellar masses from such binaries is emphasized by discussing the role of mass in stellar evolution. This lab is one in a series being designed to provide astronomy majors practical experience in mathematically modeling astronomical data. This research was supported in part by NASA LaSPACE LURA Grant LSU 3115-30-5199.
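    The mass determination at the heart of the lab reduces to Newton's form of Kepler's third law: with the relative semi-major axis a in AU and the period P in years, M_A + M_B = a³/P² in solar masses. A minimal sketch follows; the orbital elements and mass ratio used here are illustrative values for a Kruger 60-like system, not the lab's archival data.

```python
def total_mass_solar(a_au, period_yr):
    """Total mass (solar masses) of a binary from its relative orbit:
    Newton's form of Kepler's third law, M1 + M2 = a^3 / P^2
    with a in AU and P in years."""
    return a_au**3 / period_yr**2

def component_masses(total, ratio_b_over_a):
    """Split the total mass using the mass ratio M_B/M_A obtained from
    the absolute (photocentric) orbits, since M_A * a_A = M_B * a_B."""
    m_a = total / (1.0 + ratio_b_over_a)
    return m_a, total - m_a

# Illustrative elements for a Kruger 60-like binary (not archival data)
m_tot = total_mass_solar(a_au=9.5, period_yr=44.6)
m_a, m_b = component_masses(m_tot, ratio_b_over_a=0.8)
print(f"Total: {m_tot:.2f} Msun, A: {m_a:.2f} Msun, B: {m_b:.2f} Msun")
```

    With these assumed elements the total comes out near 0.4 solar masses, the right order for a pair of M dwarfs, which is the kind of sanity check students can apply to their Excel fit.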

  18. Large Advanced Space Systems (LASS) computer-aided design program additions

    NASA Technical Reports Server (NTRS)

    Farrell, C. E.

    1982-01-01

    The LSS preliminary and conceptual design requires extensive iterative analysis because of the effects of structural, thermal, and control intercoupling. A computer-aided design program that will permit integrating and interfacing of required large space system (LSS) analyses is discussed. The primary objective of this program is the implementation of modeling techniques and analysis algorithms that permit interactive design and tradeoff studies of LSS concepts. Eight software modules were added to the program. The existing rigid body controls module was modified to include solar pressure effects. The new model generator modules and appendage synthesizer module are integrated (interfaced) to permit interactive definition and generation of LSS concepts. The mass properties module permits interactive specification of discrete masses and their locations. The other modules permit interactive analysis of orbital transfer requirements, antenna primary beam n, and attitude control requirements.

  19. Advances in the design of a thermomechanical analyzer for fibers. II. Computer facilities and software

    NASA Astrophysics Data System (ADS)

    Noui, L.; Hearle, J. W. S.

    1995-06-01

    PC-based software for the full control of the flexible thermomechanical analyzer (FTMA) for yarns and fibers is described. The software permits a flexible procedure to control three essential parameters of the FTMA, namely tension, twist, and temperature. The computer program allows data acquisition at a programmable rate of up to 62.5 ksamples/s, on-line data display, and on-line data storage. Up to eight channels can be monitored. A circular buffer is used to store an unlimited amount of data. For FTMA applications, data are calibrated in newtons for the tension, degrees Celsius for the temperature, and newton-meters for the torque, and can be saved in three different formats: ASCII, LOTUS, or binary. The software is user-friendly, as it makes use of a graphical user interface for motor control and data display. The software is also capable of controlling thermomechanical tests at constant force.
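    The circular (ring) buffer that lets acquisition run indefinitely in fixed memory can be sketched as follows. The class name and interface are hypothetical illustrations of the technique, not the FTMA code, which was written for a PC data-acquisition environment.

```python
class RingBuffer:
    """Fixed-size circular buffer: writes wrap around, so acquisition
    can run indefinitely while a consumer drains the newest samples.
    A minimal sketch of ring buffering, not the original FTMA code."""

    def __init__(self, capacity):
        self.buf = [0.0] * capacity
        self.capacity = capacity
        self.head = 0      # next write position
        self.count = 0     # number of valid samples (<= capacity)

    def write(self, sample):
        self.buf[self.head] = sample
        self.head = (self.head + 1) % self.capacity   # wrap around
        self.count = min(self.count + 1, self.capacity)

    def latest(self, n):
        """Return the most recent n samples, oldest first."""
        n = min(n, self.count)
        start = (self.head - n) % self.capacity
        return [self.buf[(start + i) % self.capacity] for i in range(n)]
```

    Once the buffer is full, each new sample silently overwrites the oldest one, which is exactly the property that makes the storage appear "unlimited" to the acquisition loop.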

  20. Advanced multi-dimensional deterministic transport computational capability for safety analysis of pebble-bed reactors

    NASA Astrophysics Data System (ADS)

    Tyobeka, Bismark Mzubanzi

    A coupled neutron transport thermal-hydraulics code system with both diffusion and transport theory capabilities is presented. At the heart of the coupled code is a powerful neutronics solver based on a neutron transport theory approach, powered by the time-dependent extension of the well-known DORT code, DORT-TD. DORT-TD uses a fully implicit time integration scheme and is coupled via a general interface to the thermal-hydraulics code THERMIX-DIREKT, an HTR-specific two-dimensional core thermal-hydraulics code. Feedback is accounted for by interpolating multigroup cross sections from pre-generated libraries which are structured for user-specified discrete sets of thermal-hydraulic parameters (e.g., fuel and moderator temperatures). The coupled code system is applied to two HTGR designs, the PBMR 400MW and the PBMR 268MW. Steady-state conditions and several design basis transients are modeled in an effort to assess the adequacy of neutron diffusion theory against the more accurate but computationally expensive neutron transport theory. Small but significant differences are found between the results of the two theories. It is concluded that diffusion theory can be used with a higher degree of confidence in the PBMR as long as more than two energy groups are used, and that the results must be checked against a lower-order transport solution, especially for safety analysis purposes. The end product of this thesis is a high-fidelity, state-of-the-art computer code system with multiple capabilities to analyze all PBMR safety-related transients in an accurate and efficient manner.
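    The "cheap" side of the diffusion-versus-transport comparison can be illustrated with a one-group, one-dimensional slab diffusion eigenvalue problem solved by power iteration. The cross sections below are made-up illustrative numbers, not PBMR data, and this sketch bears no relation to the DORT-TD implementation beyond the underlying physics.

```python
import numpy as np

# One-group, 1-D slab diffusion eigenvalue problem:
#   -D phi'' + Sa phi = (1/k) nuSf phi,  phi = 0 at both edges.
# Solved by power iteration on the finite-difference operator.
D, Sa, nuSf, a = 1.0, 0.07, 0.10, 100.0   # cm and cm^-1, illustrative
N = 200                                   # interior mesh points
h = a / (N + 1)

# Tridiagonal FD matrix for -D d2/dx2 + Sa with zero-flux boundaries
main = (2.0 * D / h**2 + Sa) * np.ones(N)
off = (-D / h**2) * np.ones(N - 1)
A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

phi = np.ones(N)
k = 1.0
for _ in range(200):
    src = nuSf * phi / k                  # fission source at current k
    phi_new = np.linalg.solve(A, src)
    k *= phi_new.sum() / phi.sum()        # eigenvalue from flux growth
    phi = phi_new / np.linalg.norm(phi_new)

# Fundamental-mode analytic result for a bare slab: k = nuSf/(Sa + D B^2)
k_analytic = nuSf / (Sa + D * (np.pi / a) ** 2)
print(f"k_eff = {k:.4f} (analytic {k_analytic:.4f})")
```

    A transport solver replaces the single diffusion unknown per cell with an angular flux over a discrete-ordinates quadrature, which is where the extra cost the abstract refers to comes from.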

  1. Improved NASA-ANOPP Noise Prediction Computer Code for Advanced Subsonic Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Kontos, K. B.; Janardan, B. A.; Gliebe, P. R.

    1996-01-01

    Recent experience using ANOPP to predict turbofan engine flyover noise suggests that it over-predicts overall EPNL by a significant amount. An improvement in this prediction method is desired for system optimization and assessment studies of advanced UHB engines. An assessment of the ANOPP fan inlet, fan exhaust, jet, combustor, and turbine noise prediction methods is made using static engine component noise data from the CF6-80C2, E(3), and QCSEE turbofan engines. It is shown that the ANOPP prediction results are generally higher than the measured GE data, and that the inlet noise prediction method (the Heidmann method) is the most significant source of this overprediction. Fan noise spectral comparisons show that improvements to the fan tone, broadband, and combination tone noise models are required to yield results that more closely simulate the GE data. Suggested changes that yield improved fan noise predictions but preserve the Heidmann model structure are identified and described. These changes are based on the sets of engine data mentioned, as well as some CFM56 engine data that were used to expand the combination tone noise database. It should be noted that the recommended changes are based on an analysis of engines limited to single-stage fans with design tip relative Mach numbers greater than one.

  2. Computational design and fabrication of a novel bioresorbable cage for tibial tuberosity advancement application.

    PubMed

    Castilho, Miguel; Rodrigues, Jorge; Vorndran, Elke; Gbureck, Uwe; Quental, Carlos; Folgado, João; Fernandes, Paulo R

    2017-01-01

    Tibial tuberosity advancement (TTA) is a promising method for the treatment of cruciate ligament rupture in dogs that usually implies the implantation of a titanium cage as a bone implant. This cage is non-biodegradable and fails to provide adequate implant-bone tissue integration. The objective of this work is to propose a new process chain for designing and manufacturing an alternative biodegradable cage that can fulfill specific patient requirements. A three-dimensional finite element model (3D FEM) of the TTA system was first created to evaluate the mechanical environment in the cage domain during different stages of the dog's walk. The cage microstructure was then optimized using a topology optimization tool, which addresses the assessed local mechanical requirements and, at the same time, ensures the maximum permeability to allow nutrient and oxygen supply to the implant core. The designed cage was then biofabricated by 3D powder printing of tricalcium phosphate cement. This work demonstrates that the combination of a 3D FEM with a topology optimization approach enabled the design of a novel cage for TTA application, with tailored permeability and mechanical properties, that can be successfully 3D printed in a biodegradable bioceramic material. These results support the potential of the design optimization strategy and fabrication method for the development of customized and bioresorbable implants for bone repair.

  3. Advances in physiologic lung assessment via electron beam computed tomography (EBCT)

    NASA Astrophysics Data System (ADS)

    Hoffman, Eric A.

    1999-09-01

    Lung function has been evaluated in both health and disease states by techniques, such as pulmonary function tests, which generally study aggregate function. These decades-old modalities have yielded a valuable understanding of global physiologic and pathophysiologic structure-to-function relationships. However, such approaches have reached their limits. They cannot meet the current and anticipated needs of new surgical and pharmaceutical treatments. 4-D CT can provide insights into regional lung function (ventilation and blood flow) and thus can provide information at an early stage of disease when intervention will have the greatest impact. Lung CT over the last decade has helped with further defining anatomic features in disease, but has lagged behind advances on the cellular and molecular front largely because of the failure to account for functional correlates to structural pathology. Commercially available CT scanners are now capable of volumetric data acquisition in a breath-hold and capable of multi-level slice acquisitions of the heart and lungs with a per-slice scan aperture of 50-300 msec, allowing for regional blood flow measurements. Static, volumetric imaging of the lung is inadequate in that much of lung pathology is a dynamic phenomenon and, thus, is only detectable if the lung is imaged as air and blood are flowing. This paper reviews the methodologies and early physiologic findings associated with our measures of lung tissue properties coupled with regional ventilation and perfusion.

  4. Recent advances in computational predictions of NMR parameters for the structure elucidation of carbohydrates: methods and limitations.

    PubMed

    Toukach, Filip V; Ananikov, Valentine P

    2013-11-07

    All living systems are composed of four fundamental classes of macromolecules: nucleic acids, proteins, lipids, and carbohydrates (glycans). Glycans play a unique role of joining three principal hierarchical levels of the living world: (1) the molecular level (pathogenic agents and vaccine recognition by the immune system, metabolic pathways involving saccharides that provide cells with energy, and energy accumulation via photosynthesis); (2) the nanoscale level (cell membrane mechanics, structural support of biomolecules, and the glycosylation of macromolecules); (3) the microscale and macroscale levels (polymeric materials, such as cellulose, starch, glycogen, and biomass). NMR spectroscopy is the most powerful research approach for getting insight into the solution structure and function of carbohydrates at all hierarchical levels, from monosaccharides to oligo- and polysaccharides. Recent progress in computational procedures has opened up novel opportunities to reveal the structural information available in the NMR spectra of saccharides and to advance our understanding of the corresponding biochemical processes. The ability to predict the molecular geometry and NMR parameters is crucial for the elucidation of carbohydrate structures. In the present paper, we review the major NMR spectrum simulation techniques with regard to chemical shifts, coupling constants, relaxation rates and nuclear Overhauser effect prediction applied to the three levels of glycomics. Outstanding development in the related fields of genomics and proteomics has clearly shown that it is the advancement of research tools (automated spectrum analysis, structure elucidation, synthesis, sequencing and amplification) that drives the large challenges in modern science. Combining NMR spectroscopy and the computational analysis of structural information encoded in the NMR spectra reveals a way to the automated elucidation of the structure of carbohydrates.

  5. Application of advanced computational codes in the design of an experiment for a supersonic throughflow fan rotor

    NASA Technical Reports Server (NTRS)

    Wood, Jerry R.; Schmidt, James F.; Steinke, Ronald J.; Chima, Rodrick V.; Kunik, William G.

    1987-01-01

    Increased emphasis on sustained supersonic or hypersonic cruise has revived interest in the supersonic throughflow fan as a possible component in advanced propulsion systems. Use of a fan that can operate with a supersonic inlet axial Mach number is attractive from the standpoint of reducing the inlet losses incurred in diffusing the flow from a supersonic flight Mach number to a subsonic one at the fan face. The design of the experiment using advanced computational codes to calculate the components required is described. The rotor was designed using existing turbomachinery design and analysis codes modified to handle fully supersonic axial flow through the rotor. A two-dimensional axisymmetric throughflow design code plus a blade element code were used to generate fan rotor velocity diagrams and blade shapes. A quasi-three-dimensional, thin shear layer Navier-Stokes code was used to assess the performance of the fan rotor blade shapes. The final design was stacked and checked for three-dimensional effects using a three-dimensional Euler code interactively coupled with a two-dimensional boundary layer code. The nozzle design in the expansion region was analyzed with a three-dimensional parabolized viscous code which corroborated the results from the Euler code. A translating supersonic diffuser was designed using these same codes.
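    The inlet-loss motivation can be made quantitative with the textbook stagnation-pressure ratio across a normal shock, the loss a supersonic throughflow fan is intended to avoid. This is a standard gas-dynamics relation sketched here for illustration; it is not taken from the design codes the abstract describes.

```python
def shock_total_pressure_ratio(mach, gamma=1.4):
    """Stagnation-pressure ratio p02/p01 across a normal shock at the
    given upstream Mach number (standard Rankine-Hugoniot result).
    Quantifies the inlet loss of decelerating supersonic flow through
    a shock rather than passing it supersonically through the fan."""
    g = gamma
    t1 = ((g + 1) * mach**2 / ((g - 1) * mach**2 + 2)) ** (g / (g - 1))
    t2 = ((g + 1) / (2 * g * mach**2 - (g - 1))) ** (1 / (g - 1))
    return t1 * t2

for m in (1.0, 1.5, 2.0, 2.5):
    print(f"M = {m}: p02/p01 = {shock_total_pressure_ratio(m):.3f}")
```

    At Mach 2 a single normal shock already destroys roughly a quarter of the stagnation pressure, which is the penalty that makes a fan able to swallow supersonic axial flow attractive.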

  6. Recent Advances in Computational Studies of Charge Exchange X-ray Emission

    NASA Astrophysics Data System (ADS)

    Cumbee, Renata

    2016-06-01

    Interest in astrophysical sources of charge exchange (CX) has grown since X-ray emission from comet Hyakutake was first observed, the origin of which is primarily due to CX processes between neutral species in the comet’s atmosphere and highly charged ions from the solar wind. More recent observations have shown that CX may have a significant contribution to the X-ray emission spectra of a wide variety of environments within our solar system, including solar wind charge exchange (SWCX) with neutral gases in the heliosphere and in planetary atmospheres, as well as beyond the solar system in galaxy clusters, supernova remnants, and star forming galaxies. While the basic process of CX has been studied for many decades, the reliability of the existing data is not uniform, and the coverage of the astrophysically important projectile and target combinations and collisional velocities is insufficient. The need for reliable and robust CX X-ray emission models will only be amplified with the high resolution X-ray spectra expected from the soft X-ray imaging calorimeter spectrometer (SXS) onboard the Hitomi X-ray observatory. In this talk, I will discuss recent advances in theoretical CX cross sections and X-ray modeling with a focus on CX diagnostics. The need for experimental X-ray spectra and cross sections for benchmarking current theory will also be highlighted. This work was performed in collaboration with David Lyons, Patrick Mullen, David Schultz, Phillip Stancil, and Robin Shelton. Work at UGA was partially supported by NASA grant NNX09AC46G.

  7. The Spin Torque Lego - from spin torque nano-devices to advanced computing architectures

    NASA Astrophysics Data System (ADS)

    Grollier, Julie

    2013-03-01

    Spin transfer torque (STT), predicted in 1996, and first observed around 2000, brought spintronic devices to the realm of active elements. A whole class of new devices, based on the combined effects of STT for writing and Giant Magneto-Resistance or Tunnel Magneto-Resistance for reading has emerged. The second generation of MRAMs, based on spin torque writing : the STT-RAM, is under industrial development and should be out on the market in three years. But spin torque devices are not limited to binary memories. We will rapidly present how the spin torque effect also allows to implement non-linear nano-oscillators, spin-wave emitters, controlled stochastic devices and microwave nano-detectors. What is extremely interesting is that all these functionalities can be obtained using the same materials, the exact same stack, simply by changing the device geometry and its bias conditions. So these different devices can be seen as Lego bricks, each brick with its own functionality. During this talk, I will show how spin torque can be engineered to build new bricks, such as the Spintronic Memristor, an artificial magnetic nano-synapse. I will then give hints on how to assemble these bricks in order to build novel types of computing architectures, with a special focus on neuromorphic circuits. Financial support by the European Research Council Starting Grant NanoBrain (ERC 2010 Stg 259068) is acknowledged.

  8. Advances of multidetector computed tomography in the characterization and staging of renal cell carcinoma

    PubMed Central

    Tsili, Athina C; Argyropoulou, Maria I

    2015-01-01

    Renal cell carcinoma (RCC) accounts for approximately 90%-95% of kidney tumors. With the widespread use of cross-sectional imaging modalities, more than half of RCCs are detected incidentally, often diagnosed at an early stage. This may allow the planning of more conservative treatment strategies. Computed tomography (CT) is considered the examination of choice for the detection and staging of RCC. Multidetector CT (MDCT), with its improved spatial resolution and the ability to obtain multiphase imaging and multiplanar and three-dimensional reconstructions in any desired plane, brought about further improvement in the evaluation of RCC. Differentiation of RCC from benign renal tumors based on MDCT features is improved. Tumor enhancement characteristics on MDCT have been found to correlate closely with the histologic subtype of RCC, the nuclear grade and the cytogenetic characteristics of clear cell RCC. Important information, including tumor size, localization, and organ involvement, presence and extent of venous thrombus, possible invasion of adjacent organs or lymph nodes, and presence of distant metastases are provided by MDCT examination. The preoperative evaluation of patients with RCC was improved by depicting the presence or absence of renal pseudocapsule and by assessing the possible neoplastic infiltration of the perirenal fat tissue and/or renal sinus fat compartment. PMID:26120380

  9. Advanced computational tools for PEM fuel cell design. Part 2. Detailed experimental validation and parametric study

    NASA Astrophysics Data System (ADS)

    Sui, P. C.; Kumar, S.; Djilali, N.

    This paper reports on the systematic experimental validation of a comprehensive 3D CFD-based computational model presented and documented in Part 1. Simulations for unit cells with straight channels, similar to the Ballard Mk902 hardware, are performed and analyzed in conjunction with detailed current mapping measurements and water mass distributions in the membrane-electrode assembly. The experiments were designed to display sensitivity of the cell over a range of operating parameters including current density, humidification, and coolant temperature, making the data particularly well suited for systematic validation. Based on the validation and analysis of the predictions, values of model parameters, including the electro-osmotic drag coefficient, capillary diffusion coefficient, and catalyst specific surface area, are adjusted to fit experimental data of current density and MEA water content. The predicted net water flux out of the anode (normalized by the total water generated) increases as anode humidification water flow rate is increased, in agreement with experimental results. A modification of the constitutive equation for the capillary diffusivity of water in the porous electrodes that attempts to incorporate the experimentally observed immobile (or irreducible) saturation yields a better fit of the predicted MEA water mass with experimental data. The specific surface area parameter used in the catalyst layer model is found to be effective in tuning the simulations to predict the correct cell voltage over a range of stoichiometries.
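    One common way to build immobile saturation into a capillary diffusivity is to make transport vanish below an irreducible saturation and scale with a power of the reduced saturation above it. The functional form and every constant below are illustrative assumptions, not the constitutive equation actually used in the paper.

```python
def capillary_diffusivity(s, s_im=0.1, d0=1e-6, n=2.0):
    """Illustrative capillary diffusivity D_c(s) [m^2/s] for liquid
    water in a porous electrode. Below the immobile saturation s_im
    the liquid phase is disconnected and capillary transport shuts
    off; above it, D_c scales with a power n of the reduced
    saturation. Form and constants are assumptions for illustration,
    not the paper's constitutive law."""
    if s <= s_im:
        return 0.0
    s_reduced = (s - s_im) / (1.0 - s_im)
    return d0 * s_reduced**n
```

    The key qualitative effect the abstract describes, water retained in the MEA because it cannot diffuse away below the irreducible saturation, follows directly from the zero branch of this function.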

  10. Advancing global marine biogeography research with open-source GIS software and cloud-computing

    USGS Publications Warehouse

    Fujioka, Ei; Vanden Berghe, Edward; Donnelly, Ben; Castillo, Julio; Cleary, Jesse; Holmes, Chris; McKnight, Sean; Halpin, Patrick

    2012-01-01

    Across many scientific domains, the ability to aggregate disparate datasets enables more meaningful global analyses. Within marine biology, the Census of Marine Life served as the catalyst for such a global data aggregation effort. Under the Census framework, the Ocean Biogeographic Information System (OBIS) was established to coordinate an unprecedented aggregation of global marine biogeography data. The OBIS data system now contains 31.3 million observations, freely accessible through a geospatial portal. The challenges of storing, querying, disseminating, and mapping a global data collection of this complexity and magnitude are significant. In the face of declining performance and expanding feature requests, a redevelopment of the OBIS data system was undertaken. Following an Open Source philosophy, the OBIS technology stack was rebuilt using PostgreSQL, PostGIS, GeoServer and OpenLayers. This approach has markedly improved the performance and online user experience while maintaining a standards-compliant and interoperable framework. Due to the distributed nature of the project and increasing needs for storage, scalability and deployment flexibility, the entire hardware and software stack was built on a Cloud Computing environment. The flexibility of the platform, combined with the power of the application stack, enabled rapid re-development of the OBIS infrastructure, and ensured complete standards-compliance.

  11. Computational consideration on advanced oxidation degradation of phenolic preservative, methylparaben, in water: mechanisms, kinetics, and toxicity assessments.

    PubMed

    Gao, Yanpeng; An, Taicheng; Fang, Hansun; Ji, Yuemeng; Li, Guiying

    2014-08-15

    Hydroxyl radicals (•OH) are strong oxidants that can degrade organic pollutants in advanced oxidation processes (AOPs). The mechanisms, kinetics, and toxicity assessment of the •OH-initiated oxidative degradation of the phenolic preservative methylparaben (MPB) were systematically investigated using a computational approach, as supplementary information to the experimental data. Results showed that MPB can be initially attacked by •OH via OH-addition and H-abstraction routes. Among these routes, •OH addition to the C atom at the ortho-position of the phenolic hydroxyl group was the most significant; however, the methyl-H-abstraction route also cannot be neglected. Further, the formed transient intermediates, the OH-adduct (•MPB-OH1) and the dehydrogenated radical (•MPB(-H)α), could be easily transformed to several stable degradation products in the presence of O2 and •OH. To better understand the potential toxicity of MPB and its products to aquatic organisms, both acute and chronic toxicities were assessed computationally at three trophic levels. Both MPB and its products, particularly the OH-addition products, are harmful to aquatic organisms. Therefore, the application of AOPs to remove MPB should be carefully performed for safe water treatment.
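    When the radical concentration is held roughly steady by the AOP, •OH oxidation of a pollutant is commonly modeled as pseudo-first-order decay. The sketch below uses that generic textbook model; the rate constant and steady-state •OH concentration are assumed order-of-magnitude values for illustration, not the MPB kinetics computed in the paper.

```python
import math

def pollutant_fraction(t, k_oh, c_oh):
    """Fraction of pollutant remaining after t seconds under a steady
    hydroxyl-radical concentration c_oh (M), using the pseudo-first-
    order model C/C0 = exp(-k_oh * c_oh * t), where k_oh is the
    second-order rate constant (M^-1 s^-1)."""
    return math.exp(-k_oh * c_oh * t)

# Assumed illustrative values: k_oh ~ 1e9 M^-1 s^-1 (a typical order
# of magnitude for aromatics + •OH) and steady-state [•OH] ~ 1e-12 M.
k_oh, c_oh = 1e9, 1e-12
half_life = math.log(2) / (k_oh * c_oh)   # seconds
print(f"Half-life: {half_life:.0f} s")
```

    Under these assumptions the effective first-order constant is k_oh·[•OH] ≈ 1e-3 s⁻¹, giving a half-life of roughly ten minutes, which is the kind of scoping estimate pseudo-first-order treatment is used for.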

  12. Tuning the cache memory usage in tomographic reconstruction on standard computers with Advanced Vector eXtensions (AVX).

    PubMed

    Agulleiro, Jose-Ignacio; Fernandez, Jose-Jesus

    2015-06-01

    Cache blocking is a technique widely used in scientific computing to minimize the exchange of information with main memory by reusing the data kept in cache memory. In tomographic reconstruction on standard computers using vector instructions, cache blocking turns out to be central to optimize performance. To this end, sinograms of the tilt-series and slices of the volumes to be reconstructed have to be divided into small blocks that fit into the different levels of cache memory. The code is then reorganized so as to operate with a block as much as possible before proceeding with another one. This data article is related to the research article titled Tomo3D 2.0 - Exploitation of Advanced Vector eXtensions (AVX) for 3D reconstruction (Agulleiro and Fernandez, 2015) [1]. Here we present data of a thorough study of the performance of tomographic reconstruction by varying cache block sizes, which allows derivation of expressions for their automatic quasi-optimal tuning.
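    The loop restructuring behind cache blocking, finishing all the work on one block before moving to the next, can be sketched with a blocked matrix multiply. This NumPy version only illustrates the access pattern; it is not the Tomo3D/AVX implementation, and real cache-level speedups require a compiled language.

```python
import numpy as np

def blocked_matmul(a, b, block=64):
    """Blocked matrix multiply: operate on block x block tiles so each
    tile is fully reused while it is (conceptually) resident in cache.
    Demonstrates the loop structure of cache blocking; the numerical
    result is identical to an unblocked multiply."""
    n, m = a.shape
    m2, p = b.shape
    assert m == m2, "inner dimensions must match"
    c = np.zeros((n, p))
    for i in range(0, n, block):
        for k in range(0, m, block):
            a_tile = a[i:i + block, k:k + block]   # keep this tile "hot"
            for j in range(0, p, block):
                c[i:i + block, j:j + block] += a_tile @ b[k:k + block, j:j + block]
    return c
```

    Tuning amounts to choosing `block` so the working set of the three tiles fits in a given cache level, which is exactly the parameter the data article sweeps.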

  13. Community Petascale Project for Accelerator Science and Simulation: Advancing Computational Science for Future Accelerators and Accelerator Technologies

    SciTech Connect

    Spentzouris, P.; Cary, J.; McInnes, L.C.; Mori, W.; Ng, C.; Ng, E.; Ryne, R.; /LBL, Berkeley

    2011-11-14

    for software development and applications accounts for the natural domain areas (beam dynamics, electromagnetics, and advanced acceleration), and all areas depend on the enabling technologies activities, such as solvers and component technology, to deliver the desired performance and integrated simulation environment. The ComPASS applications focus on computationally challenging problems important for design or performance optimization to all major HEP, NP, and BES accelerator facilities. With the cost and complexity of particle accelerators rising, the use of computation to optimize their designs and find improved operating regimes becomes essential, potentially leading to significant cost savings with modest investment.

  14. Advancing the detection of steady-state visual evoked potentials in brain-computer interfaces

    NASA Astrophysics Data System (ADS)

    Abu-Alqumsan, Mohammad; Peer, Angelika

    2016-06-01

    Objective. Spatial filtering has proved to be a powerful pre-processing step in detection of steady-state visual evoked potentials and boosted typical detection rates both in offline analysis and online SSVEP-based brain-computer interface applications. State-of-the-art detection methods and the spatial filters used thereby share many common foundations, as they all build upon the second-order statistics of the acquired electroencephalographic (EEG) data, that is, its spatial autocovariance and cross-covariance with what is assumed to be a pure SSVEP response. The present study aims at highlighting the similarities and differences between these methods. Approach. We consider the canonical correlation analysis (CCA) method as a basis for the theoretical and empirical (with real EEG data) analysis of the state-of-the-art detection methods and the spatial filters used thereby. We build upon the findings of this analysis and prior research and propose a new detection method (CVARS) that combines the power of the canonical variates and that of the autoregressive spectral analysis in estimating the signal and noise power levels. Main results. We found that the multivariate synchronization index method and the maximum contrast combination method are variations of the CCA method. All three methods were found to provide relatively unreliable detections in low signal-to-noise ratio (SNR) regimes. CVARS and the minimum energy combination methods were found to provide better estimates for different SNR levels. Significance. Our theoretical and empirical results demonstrate that the proposed CVARS method outperforms other state-of-the-art detection methods when used in an unsupervised fashion. Furthermore, when used in a supervised fashion, a linear classifier learned from a short training session is able to estimate the hidden user intention, including the idle state (when the user is not attending to any stimulus), rapidly, accurately and reliably.
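    The baseline CCA detector the study builds on correlates multichannel EEG with sinusoidal reference sets at each candidate stimulus frequency and picks the frequency with the largest canonical correlation. A NumPy-only sketch on synthetic data follows (the canonical correlations are computed as singular values of the product of orthonormal bases); this illustrates the standard method, not the authors' CVARS implementation.

```python
import numpy as np

def canonical_corr(x, y):
    """Largest canonical correlation between the column spaces of X
    and Y: the top singular value of Qx^T Qy, where Qx and Qy are
    orthonormal bases (QR) of the centered data matrices."""
    qx, _ = np.linalg.qr(x - x.mean(axis=0))
    qy, _ = np.linalg.qr(y - y.mean(axis=0))
    return np.linalg.svd(qx.T @ qy, compute_uv=False)[0]

def detect_ssvep(eeg, fs, candidates, n_harm=2):
    """Standard CCA detector: score each candidate frequency by the
    canonical correlation between the EEG and a sin/cos reference set
    (fundamental plus harmonics); return the best-scoring frequency."""
    t = np.arange(eeg.shape[0]) / fs
    scores = []
    for f in candidates:
        refs = np.column_stack([fn(2 * np.pi * h * f * t)
                                for h in range(1, n_harm + 1)
                                for fn in (np.sin, np.cos)])
        scores.append(canonical_corr(eeg, refs))
    return candidates[int(np.argmax(scores))]

# Synthetic 4-channel recording with a 10 Hz SSVEP buried in noise
rng = np.random.default_rng(1)
fs, dur = 250, 2.0
t = np.arange(int(fs * dur)) / fs
source = np.sin(2 * np.pi * 10 * t)
eeg = (source[:, None] * rng.uniform(0.5, 1.0, 4)
       + 0.7 * rng.standard_normal((len(t), 4)))
```

    The methods the abstract compares (MSI, maximum contrast combination, minimum energy combination) differ mainly in how they post-process the same second-order statistics that `canonical_corr` consumes.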

  15. Developing advanced x-ray scattering methods combined with crystallography and computation

    PubMed Central

    Perry, J. Jefferson P.; Tainer, John A.

    2013-01-01

    The extensive use of small angle x-ray scattering (SAXS) over the last few years is rapidly providing new insights into protein interactions, complex formation and conformational states in solution. This SAXS methodology allows for detailed biophysical quantification of samples of interest. Initial analyses provide a judgment of sample quality, revealing the potential presence of aggregation, the overall extent of folding or disorder, the radius of gyration, maximum particle dimensions and oligomerization state. Structural characterizations include ab initio approaches from SAXS data alone, and when combined with previously determined crystal/NMR structures, atomistic modeling can further enhance structural solutions and assess validity. This combination can provide definitions of architectures, spatial organizations of protein domains within a complex, including those not determined by crystallography or NMR, as well as defining key conformational states of a protein interaction. SAXS is not generally constrained by macromolecule size, and the rapid collection of data in a 96-well plate format provides methods to screen sample conditions. This includes screening for co-factors, substrates, differing protein or nucleotide partners or small molecule inhibitors, to more fully characterize the variations within assembly states and key conformational changes. Such analyses may be useful for screening constructs and conditions to determine those most likely to promote crystal growth of a complex under study. Moreover, these high throughput structural determinations can be leveraged to define how polymorphisms affect assembly formations and activities. This is in addition to potentially providing architectural characterizations of complexes and interactions for systems biology-based research, and distinctions in assemblies and interactions in comparative genomics. Thus, SAXS combined with crystallography/NMR and computation provides a unique set of tools that should be considered
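    The radius of gyration mentioned among the initial SAXS analyses is usually extracted with the Guinier approximation, ln I(q) ≈ ln I0 − (Rg²/3)q², valid at small q. A short sketch on synthetic data follows; the particle size and q range are illustrative, not from the paper.

```python
import numpy as np

def guinier_rg(q, intensity):
    """Radius of gyration from the Guinier approximation:
    ln I(q) = ln I0 - (Rg^2 / 3) q^2, valid at small q (q*Rg < ~1.3).
    Fits a line to ln I versus q^2 and converts the slope to Rg."""
    slope, _ = np.polyfit(q**2, np.log(intensity), 1)
    return np.sqrt(-3.0 * slope)

# Synthetic Guinier-regime curve for a particle with Rg = 20 Angstrom
rg_true = 20.0
q = np.linspace(0.005, 0.05, 40)          # inverse Angstrom, q*Rg <= 1
i_q = 100.0 * np.exp(-(q * rg_true)**2 / 3.0)
print(f"Recovered Rg = {guinier_rg(q, i_q):.1f} Angstrom")
```

    On real data the fit is restricted to the low-q window where q·Rg stays below about 1.3 and upturns from aggregation, one of the sample-quality flags the abstract lists, show up as curvature in the Guinier plot.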

  16. Advances in the Thermodynamics of Ideal Gases by Means of Computer Simulations

    NASA Astrophysics Data System (ADS)

    Sands, David; Dunning-Davies, Jeremy

    2010-12-01

    Irreversible thermodynamic processes in ideal gases are investigated by computer simulations of the compound piston. A hard-sphere model of the gas on either side of a compound piston shows that damping occurs naturally without invoking extraneous mechanisms such as friction. Inter-particle collisions are identified as being responsible, as these redistribute the particle energies by altering all the components of momentum. In collisions with the piston, on the other hand, only the component of particle momentum in the direction of the piston motion is affected. Thus inter-particle collisions effectively dissipate the energy of the piston. These ideas are then incorporated into a simpler, one dimensional model based on kinetic theory in which all the particles have the same initial energy and inter-particle collisions are simulated by randomly adjusting the energy distribution. Varying the rate of energy redistribution alters the rate of decay of the piston motion. In addition, this simple model allows thermal interactions with the walls of the vessel to be simulated easily, and we observe a second mechanism of damping due to delayed heating and cooling. These ideas lead directly to a macroscopic formulation of thermodynamics in terms of rate equations. The models give an insight into the micro-dynamical origins of irreversibility in ideal gases and allow the thermodynamics of these irreversible processes to be investigated. We find surprisingly simple relationships between the volume changes and characteristic pressures in the system. Finally, we apply these ideas to the Carnot cycle and show that a dynamic cycle is executed if the piston is allowed to move under alternately ideal isothermal and adiabatic conditions. In this dynamic Carnot cycle not only is work done but power is developed through the motion of the piston. The implications for classical thermodynamics are discussed briefly.
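    Two mechanical ingredients underlie the one-dimensional model the abstract describes: elastic particle-piston collisions (which conserve momentum and kinetic energy) and a random redistribution of energy between particles that stands in for inter-particle collisions. Both can be sketched in a few lines; the function names and the particular redistribution rule are illustrative assumptions, not the authors' code.

```python
import random

def elastic_1d(m1, v1, m2, v2):
    """Post-collision velocities for a 1-D elastic collision between
    masses m1 and m2; both momentum and kinetic energy are conserved.
    Used for particle-piston impacts, where only the velocity
    component along the piston axis changes."""
    u1 = ((m1 - m2) * v1 + 2 * m2 * v2) / (m1 + m2)
    u2 = ((m2 - m1) * v2 + 2 * m1 * v1) / (m1 + m2)
    return u1, u2

def redistribute(speeds, rng):
    """Mimic an inter-particle collision: pick two unit-mass particles
    and share their combined kinetic energy in a random ratio, leaving
    the total energy unchanged. An illustrative stand-in for the
    randomized energy redistribution described in the abstract."""
    i, j = rng.sample(range(len(speeds)), 2)
    e = speeds[i] ** 2 + speeds[j] ** 2     # proportional to total KE
    frac = rng.random()
    speeds[i] = (frac * e) ** 0.5
    speeds[j] = ((1 - frac) * e) ** 0.5
    return speeds
```

    In a full simulation the redistribution step is what drains energy out of the piston's coherent motion into the thermal spread of particle speeds, producing the friction-free damping the paper reports.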

  17. 2003 U.S. Department of Energy Strategic Plan: Protecting National, Energy, and Economic Security with Advanced Science and Technology and Ensuring Environmental Cleanup

    SciTech Connect

    None,

    2003-09-30

    The Department of Energy contributes to the future of the Nation by ensuring energy security, maintaining the safety, security and reliability of the nuclear weapons stockpile, cleaning up the environment from the legacy of the Cold War, and developing innovations in science and technology. After 25 years in existence, the Department now operates 24 preeminent research laboratories and facilities and four power marketing administrations, and manages the environmental cleanup from 50 years of nuclear defense activities that impacted two million acres in communities across the country. The Department has an annual budget of about $23 billion and employs about 14,500 Federal and 100,000 contractor employees. The Department of Energy is principally a national security agency and all of its missions flow from this core mission to support national security. That is true not just today, but throughout the history of the agency. The origins of the Department can be traced to the Manhattan Project and the race to develop the atomic bomb during World War II. Following the war, Congress engaged in a vigorous and contentious debate over civilian versus military control of the atom. The Atomic Energy Act of 1946 settled the debate by creating the Atomic Energy Commission, which took over the Manhattan Project’s sprawling scientific and industrial complex.

  18. Computed tomography for non-traumatic headache in the emergency department and the impact of follow-up testing on altering the initial diagnosis.

    PubMed

    Quon, Jeffrey S; Glikstein, Rafael; Lim, Christopher S; Schwarz, Betty Anne

    2015-10-01

    The purpose of this study was twofold: (1) to determine the incidence of positive computed tomography (CT) findings in patients presenting to the emergency department (ED) with non-traumatic headache at our institution and (2) to examine follow-up exams, including lumbar puncture, non-enhanced CT, CT angiogram, CT venogram, and magnetic resonance imaging (MRI), to see how often the use of further testing changes the diagnosis. With IRB approval, 865 patients were identified through ED requisitions for CT head with the indication of headache during the calendar year 2011. Exclusion criteria included head trauma, prior intracranial surgery, focal neurologic symptoms, and known intracranial mass. CT results were divided into three categories: P0, P1, and P2. Negative studies were graded as P0. Positive studies were subdivided into clinically insignificant or P1 and clinically significant or P2. Clinically significant was defined as requiring medical treatment. Subsequently, the electronic medical records and picture archiving and communication system (PACS) were reviewed to determine the incidence of follow-up exams, including lumbar puncture or imaging. The secondary tests were divided into the same P0, P1, and P2 categories. There were 254 positive studies: P1 clinically insignificant (27.1 %, 235/865) and P2 clinically significant (2.2 %, 19/865). Of 257 follow-up exams performed, the majority were lumbar punctures (36.0 %) or CT angiograms (29.5 %). In 19/257 exams or 7.4 %, the additional testing changed the clinically insignificant (P0/P1) diagnosis to a significant (P2) result. At our institution, there was a 2.2 % incidence of significant positive CT findings in patients presenting to the ED with non-traumatic headache. Follow-up testing was variable and resulted in a 7.4 % increase in the severity of diagnosis compared to the initial negative CT scan.

  19. United States Department of Energy Thermally Activated Heat Pump Program

    SciTech Connect

    Fiskum, R.J.; Adcock, P.W.; DeVault, R.C.

    1996-06-01

    The US Department of Energy (DOE) is working with partners from the gas heating and cooling industry to improve energy efficiency using advanced absorption technologies, to eliminate chlorofluorocarbons (CFCs) and hydrochlorofluorocarbons (HCFCs), to reduce global warming through more efficient combustion of natural gas, and to reduce the electric peak demand from air conditioning. To assist industry in developing these gas heating and cooling absorption technologies, the US DOE sponsors the Thermally Activated Heat Pump Program. It is divided into five key activities, addressing residential gas absorption heat pumps, large commercial chillers, advanced absorption fluids, computer-aided design, and advanced "Hi-Cool" heat pumps.

  20. Local Labor Management Relationships as a Vehicle to Advance Reform: Findings from the U.S. Department of Education's Labor Management Conference

    ERIC Educational Resources Information Center

    Eckert, Jonathan; Houtchens, Bobbi Ciriza; Garcia, Antero; Greer, Nicholas; Khachatryan, Edit; Liou, James; Owens, Steve; Raphael, Leah; Romero, Elaine; Taylor, Katie; Ulmer, Jasmine; VanDusen, Tracey; Yaron, Linda

    2011-01-01

    In February 2011, the U.S. Department of Education (ED)--along with co-sponsors from the American Association of School Administrators, the American Federation of Teachers, the Council of the Great City Schools, the Federal Mediation and Conciliation Service, the National Education Association, and the National School Boards Association--brought…

  1. Sensing with Advanced Computing Technology: Fin Field-Effect Transistors with High-k Gate Stack on Bulk Silicon.

    PubMed

    Rigante, Sara; Scarbolo, Paolo; Wipf, Mathias; Stoop, Ralph L; Bedner, Kristine; Buitrago, Elizabeth; Bazigos, Antonios; Bouvet, Didier; Calame, Michel; Schönenberger, Christian; Ionescu, Adrian M

    2015-05-26

    Field-effect transistors (FETs) form an established technology for sensing applications. However, recent advancements and use of high-performance multigate metal-oxide semiconductor FETs (double-gate, FinFET, trigate, gate-all-around) in computing technology, instead of bulk MOSFETs, raise new opportunities and questions about the most suitable device architectures for sensing integrated circuits. In this work, we propose pH and ion sensors exploiting FinFETs fabricated on bulk silicon by a fully CMOS compatible approach, as an alternative to the widely investigated silicon nanowires on silicon-on-insulator substrates. We also provide analytical insight into the concept of sensitivity for the electronic integration of sensors. N-channel fully depleted FinFETs with critical dimensions on the order of 20 nm and HfO2 as a high-k gate insulator have been developed and characterized, showing excellent electrical properties, subthreshold swing, SS ∼ 70 mV/dec, and on-to-off current ratio, Ion/Ioff ∼ 10^6, at room temperature. The same FinFET architecture is validated as a highly sensitive, stable, and reproducible pH sensor. An intrinsic sensitivity close to the Nernst limit, S = 57 mV/pH, is achieved. The pH response in terms of output current reaches Sout = 60%. Long-term measurements have been performed over 4.5 days with a resulting drift in time δVth/δt = 0.10 mV/h. Finally, we show the capability to reproduce experimental data with an extended three-dimensional commercial finite element analysis simulator, in both dry and wet environments, which is useful for future advanced sensor design and optimization.
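    As a quick sanity check on the sensitivity quoted above, the Nernst limit follows from S = (k_B·T/q)·ln 10. The short computation below uses standard CODATA physical constants and is independent of the paper; the function name is ours.

    ```python
    import math

    K_B = 1.380649e-23     # Boltzmann constant, J/K (CODATA)
    Q_E = 1.602176634e-19  # elementary charge, C (CODATA)

    def nernst_sensitivity_mV_per_pH(temperature_K):
        """Ideal (Nernstian) surface-potential response per pH unit, in mV."""
        return (K_B * temperature_K / Q_E) * math.log(10.0) * 1e3

    print(round(nernst_sensitivity_mV_per_pH(298.15), 1))  # ~59.2 mV/pH at 25 degrees C
    ```

    The reported S = 57 mV/pH is thus roughly 96% of the ideal limit at room temperature, consistent with the authors' description of operation "close to the Nernst limit".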

  2. Computational Science and Innovation

    SciTech Connect

    Dean, David Jarvis

    2011-01-01

    Simulations - utilizing computers to solve complicated science and engineering problems - are a key ingredient of modern science. The U.S. Department of Energy (DOE) is a world leader in the development of high-performance computing (HPC), the development of applied math and algorithms that utilize the full potential of HPC platforms, and the application of computing to science and engineering problems. An interesting general question is whether the DOE can strategically utilize its capability in simulations to advance innovation more broadly. In this article, I will argue that this is certainly possible.

  3. Basic Energy Sciences Exascale Requirements Review. An Office of Science review sponsored jointly by Advanced Scientific Computing Research and Basic Energy Sciences, November 3-5, 2015, Rockville, Maryland

    SciTech Connect

    Clark, Aurora; Millis, Andy; Gagliardi, Laura; Panagiotopoulos, Thanos; Siepmann, Ilja; Wolverton, Chris; Vashishta, Priya; Stevens, Mark; Gordon, Mark; Kent, Paul; van Dam, Kerstin Kleese; Proffen, Thomas; Tull, Craig; Diachin, Lori; Sethian, Jamie; Benali, Anouar; Chen, Jackie; Antypas, Katie; Gerber, Richard; Riley, Katherine; Straatsma, Tjerk

    2015-12-31

    Computers have revolutionized every aspect of our lives. Yet in science, the most tantalizing applications of computing lie just beyond our reach. The current quest to build an exascale computer with one thousand times the capability of today’s fastest machines (and more than a million times that of a laptop) will take researchers over the next horizon. The field of materials, chemical reactions, and compounds is inherently complex. Imagine millions of new materials with new functionalities waiting to be discovered — while researchers also seek to extend those materials that are known to a dizzying number of new forms. We could translate massive amounts of data from high precision experiments into new understanding through data mining and analysis. We could have at our disposal the ability to predict the properties of these materials, to follow their transformations during reactions on an atom-by-atom basis, and to discover completely new chemical pathways or physical states of matter. Extending these predictions from the nanoscale to the mesoscale, from the ultrafast world of reactions to long-time simulations to predict the lifetime performance of materials, and to the discovery of new materials and processes will have a profound impact on energy technology. In addition, discovery of new materials is vital to move computing beyond Moore’s law. To realize this vision, more than hardware is needed. New algorithms to take advantage of the increase in computing power, new programming paradigms, and new ways of mining massive data sets are needed as well. This report summarizes the opportunities and the requisite computing ecosystem needed to realize the potential before us. In addition to pursuing new and more complete physical models and theoretical frameworks, this review found that the following broadly grouped areas relevant to the U.S. Department of Energy (DOE) Office of Advanced Scientific Computing Research (ASCR) would directly affect the Basic Energy

  4. Fluid/Structure Interaction Computational Investigation of Blast-Wave Mitigation Efficacy of the Advanced Combat Helmet

    NASA Astrophysics Data System (ADS)

    Grujicic, M.; Bell, W. C.; Pandurangan, B.; Glomski, P. S.

    2011-08-01

    To combat the problem of traumatic brain injury (TBI), a signature injury of the current military conflicts, there is an urgent need to design head protection systems with superior blast/ballistic impact mitigation capabilities. Toward that end, the blast impact mitigation performance of an advanced combat helmet (ACH) head protection system equipped with polyurea suspension pads and subjected to two different blast peak pressure loadings has been investigated computationally. A fairly detailed (Lagrangian) finite-element model of a helmet/skull/brain assembly is first constructed and placed into an Eulerian air domain through which a single planar blast wave propagates. A combined Eulerian/Lagrangian transient nonlinear dynamics computational fluid/solid interaction analysis is next conducted in order to assess the extent of reduction in intra-cranial shock-wave ingress (responsible for TBI). This was done by comparing temporal evolutions of intra-cranial normal and shear stresses for the cases of an unprotected head and the helmet-protected head and by correlating these quantities with the three most common types of mild traumatic brain injury (mTBI), i.e., axonal damage, contusion, and subdural hemorrhage. The results obtained show that the ACH provides some level of protection against all investigated types of mTBI and that the level of protection increases somewhat with an increase in blast peak pressure. In order to rationalize the aforementioned findings, a shockwave propagation/reflection analysis is carried out for the unprotected head and helmet-protected head cases. The analysis qualitatively corroborated the results pertaining to the blast-mitigation efficacy of an ACH, but also suggested that there are additional shockwave energy dissipation phenomena which play an important role in the mechanical response of the unprotected/protected head to blast impact.

  5. Advanced Artificial Science. The development of an artificial science and engineering research infrastructure to facilitate innovative computational modeling, analysis, and application to interdisciplinary areas of scientific investigation.

    SciTech Connect

    Saffer, Shelley I.

    2014-12-01

    This is a final report of the DOE award DE-SC0001132, Advanced Artificial Science. The development of an artificial science and engineering research infrastructure to facilitate innovative computational modeling, analysis, and application to interdisciplinary areas of scientific investigation. This document describes the achievements of the goals, and resulting research made possible by this award.

  6. TAIGA: Twente Advanced Interactive Graphic Authoring System. A New Concept in Computer Assisted Learning (CAL) and Educational Research. Doc 88-18.

    ERIC Educational Resources Information Center

    Pilot, A.

    TAIGA (Twente Advanced Interactive Graphic Authoring system) is a system which can be used to develop instructional software. It is written in MS-PASCAL, and runs on computers that support MS-DOS. Designed to support the production of structured software, TAIGA has a hierarchical structure of three layers, each with a specific function, and each…

  7. Real geometry gyrokinetic PIC computations of ion turbulence in advanced tokamak discharges with SUMMIT/PG3EQ_/NC

    NASA Astrophysics Data System (ADS)

    Leboeuf, Jean-Noel; Decyk, Viktor; Rhodes, Terry; Dimits, Andris; Shumaker, Dan

    2006-04-01

    The PG3EQ_/NC module within the SUMMIT Gyrokinetic PIC FORTRAN90 Framework makes possible 3D nonlinear toroidal computations of ion turbulence in the real geometry of DIII-D discharges. This is accomplished with the use of local, field line following, quasi-ballooning coordinates and through a direct interface with DIII-D equilibrium data via the EFIT and ONETWO codes, as well as Holger Saint John's PLOTEQ code for the (R, Z) position of each flux surface. The effect of real geometry is being elucidated with CYCLONE shot 81499 by comparing results from PG3EQ_/NC to those of its circular counterpart. The PG3EQ_/NC module is also being used to model ion channel turbulence in advanced tokamak discharges 118561 and 120327. Linear results will be compared to growth rate calculations with the GKS code. Nonlinear results will also be compared with scattering measurements of turbulence, as well as with accessible measurements of fluctuation amplitudes and spectra from other diagnostics.

  8. Review of Evaluative Mechanisms in the Departments of Advanced Education and Labour and Human Resources Development--New Brunswick = Examen des mecanismes d'evaluation au ministere de l'Enseignement superieur et du Travail et au ministere du Developpement des Ressources humaines du Nouveau-Brunswick.

    ERIC Educational Resources Information Center

    New Brunswick Labour Force Development Board, Fredericton.

    The evaluative mechanisms in the Department of Advanced Education and Labour and Department of Human Resources Development in the Canadian province of New Brunswick were reviewed. Data were gathered from the following: meetings with key staff in each department, briefing session for all key informants, 19 personal interviews, brief review of the…

  9. Advanced Computing for Manufacturing.

    ERIC Educational Resources Information Center

    Erisman, Albert M.; Neves, Kenneth W.

    1987-01-01

    Discusses ways that supercomputers are being used in the manufacturing industry, including the design and production of airplanes and automobiles. Describes problems that need to be solved in the next few years for supercomputers to assume a major role in industry. (TW)

  10. NIH support of Centers for AIDS Research and Department of Health Collaborative Public Health Research: advancing CDC's Enhanced Comprehensive HIV Prevention Planning project.

    PubMed

    Greenberg, Alan E; Purcell, David W; Gordon, Christopher M; Flores, Stephen; Grossman, Cynthia; Fisher, Holly H; Barasky, Rebecca J

    2013-11-01

    The contributions reported in this supplemental issue highlight the relevance of NIH-funded CEWG research to health department–supported HIV prevention and care activities in the 9 US cities with the highest numbers of AIDS cases. The project findings have the potential to enhance ongoing HIV treatment and care services and to advance the wider scientific agenda. The HIV testing to care continuum, while providing a framework to help track progress on national goals, also can reflect the heterogeneities of local epidemics. The collaborative research that is highlighted in this issue not only reflects a locally driven research agenda but also demonstrates research methods, data collection tools, and collaborative processes that could be encouraged across jurisdictions. Projects such as these, capitalizing on the integrated efforts of NIH, CDC, DOH, and academic institutions, have the potential to contribute to improvements in the HIV care continuum in these communities, bringing us closer to realizing the HIV prevention and treatment goals of the NHAS.

  11. Advanced Hydrogen Turbine Development

    SciTech Connect

    Marra, John

    2015-09-30

    Under the sponsorship of the U.S. Department of Energy (DOE) National Energy Technology Laboratories, Siemens has completed the Advanced Hydrogen Turbine Development Program to develop an advanced gas turbine for incorporation into future coal-based Integrated Gasification Combined Cycle (IGCC) plants. All the scheduled DOE milestones were completed and significant technical progress was made in the development of new technologies and concepts. Advanced computer simulations and modeling, as well as subscale and full-scale laboratory, rig, and engine testing were utilized to evaluate and select concepts for further development. All program requirements were met: a 3 to 5 percentage point improvement in overall plant combined-cycle efficiency compared to the reference baseline plant; a 20 to 30 percent reduction in overall plant capital cost compared to the reference baseline plant; and NOx emissions of 2 ppm at the stack. The program was completed on schedule and within the allotted budget.

  12. Integrating Remote Sensing Data, Hybrid-Cloud Computing, and Event Notifications for Advanced Rapid Imaging & Analysis (Invited)

    NASA Astrophysics Data System (ADS)

    Hua, H.; Owen, S. E.; Yun, S.; Lundgren, P.; Fielding, E. J.; Agram, P.; Manipon, G.; Stough, T. M.; Simons, M.; Rosen, P. A.; Wilson, B. D.; Poland, M. P.; Cervelli, P. F.; Cruz, J.

    2013-12-01

    Space-based geodetic measurement techniques such as Interferometric Synthetic Aperture Radar (InSAR) and Continuous Global Positioning System (CGPS) are now important elements in our toolset for monitoring earthquake-generating faults, volcanic eruptions, hurricane damage, landslides, reservoir subsidence, and other natural and man-made hazards. Geodetic imaging's unique ability to capture surface deformation with high spatial and temporal resolution has revolutionized both earthquake science and volcanology. Continuous monitoring of surface deformation and surface change before, during, and after natural hazards improves decision-making through better forecasts, increased situational awareness, and more informed recovery. However, analyses of InSAR and GPS data sets are currently handcrafted following events and are not generated rapidly and reliably enough for use in operational response to natural disasters. Additionally, the sheer data volumes needed to handle a continuous stream of InSAR data sets present a bottleneck: it has been estimated that continuous processing of InSAR coverage of California alone over three years would reach PB-scale data volumes. Our Advanced Rapid Imaging and Analysis for Monitoring Hazards (ARIA-MH) science data system enables both science and decision-making communities to monitor areas of interest with derived geodetic data products via seamless data preparation, processing, discovery, and access. We will present our findings on the use of hybrid-cloud computing to improve the timely processing and delivery of geodetic data products, on integrating event notifications from the USGS to speed processing for response, and on providing browse results for quick looks alongside other tools for integrative analysis.

  13. A comparison between ten advanced and soft computing models for groundwater qanat potential assessment in Iran using R and GIS

    NASA Astrophysics Data System (ADS)

    Naghibi, Seyed Amir; Pourghasemi, Hamid Reza; Abbaspour, Karim

    2017-01-01

    Given the precarious state of water resources in Iran and many other countries in arid and semi-arid regions, groundwater studies are very important. The aim of this study is therefore to model groundwater potential, using qanat locations as indicators and applying ten advanced and soft computing models to the Beheshtabad Watershed, Iran. A qanat is a man-made underground channel that collects groundwater at higher altitudes and conveys it to lowland areas, where it can be used for various purposes. First, the locations of the qanats were identified through extensive field surveys. These locations were divided into training (70%) and validation (30%) datasets. Then, 14 influence factors describing the region's physical, morphological, lithological, and hydrological features were selected to model groundwater potential. Linear discriminant analysis (LDA), quadratic discriminant analysis (QDA), flexible discriminant analysis (FDA), penalized discriminant analysis (PDA), boosted regression tree (BRT), random forest (RF), artificial neural network (ANN), K-nearest neighbor (KNN), multivariate adaptive regression splines (MARS), and support vector machine (SVM) models were applied in R scripts to produce groundwater potential maps. To evaluate the performance of the developed models, ROC curves and the kappa index were used. According to the results, RF performed best, followed by the SVM and BRT models. Our results showed that qanat locations can serve as a good indicator of groundwater potential. Furthermore, altitude, slope, plan curvature, and profile curvature were found to be the most important influence factors, while lithology, land use, and slope aspect were the least significant. The methodology of this study could be used by land-use planners and water resource managers to reduce the costs of groundwater resource discovery.
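    One of the evaluation metrics mentioned above, the kappa index (Cohen's kappa), measures how far observed classification accuracy exceeds the agreement expected by chance. The sketch below illustrates the standard computation for a binary (potential / no-potential) confusion matrix; the counts are hypothetical, not figures from the study.

    ```python
    def cohen_kappa(tp, fp, fn, tn):
        """Cohen's kappa: observed agreement corrected for chance agreement."""
        n = tp + fp + fn + tn
        p_obs = (tp + tn) / n  # observed accuracy
        # chance agreement from the marginal totals of the 2x2 matrix
        p_exp = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
        return (p_obs - p_exp) / (1.0 - p_exp)

    # Hypothetical validation counts for a groundwater-potential classifier
    print(round(cohen_kappa(tp=45, fp=5, fn=5, tn=45), 2))  # → 0.8
    ```

    A kappa of 1 indicates perfect agreement with the validation data, while 0 indicates agreement no better than chance, which is why it complements raw accuracy and ROC-based measures when class proportions are unbalanced.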

  14. Building Partnerships Between Research Institutions, University Academic Departments, Local School Districts, and Private Enterprise to Advance K-12 Science Education in Texas

    NASA Astrophysics Data System (ADS)

    Ellins, K. K.; Ganey-Curry, P.; Fennell, T.

    2003-12-01

    The University of Texas at Austin Institute for Geophysics (UTIG) is engaged in six K-12 education and outreach programs, including two NSF-sponsored projects--GK-12: Linking Graduate Fellows with K-12 Students and Teachers and Cataclysms and Catastrophes--Texas Teachers in the Field, Adopt-a-School, Geoscience in the Classroom, and UT's Science and Engineering Apprenticeship Program. The GK-12 Program is central to UTIG's effort and links the six education projects together. While the specific objectives of each project differ, the broad goals of UTIG's education and outreach are to provide high-quality professional development for teachers, develop curriculum resources aligned with state and national education standards, and promote interaction between teachers, scientists, graduate students, and science educators. To achieve these goals, UTIG has forged funded partnerships with scientific colleagues at UT's Bureau of Economic Geology, Marine Science Institute and Department of Geological Sciences; science educators at UT's Charles A. Dana Center and in the Department of Curriculum and Instruction in the College of Education; teachers in six Texas independent school districts; and 4empowerment.com, a private education company that established the "Cyberways and Waterways" Web site to integrate technology and education through an environmentally-based curriculum. These partnerships have allowed UTIG to achieve far more than would have been possible through individual projects alone. Examples include the development of more than 30 inquiry-based activities, hosting workshops and a summer institute, and participation in local science fairs. UTIG has expanded the impact of its education and outreach and achieved broader dissemination of learning activities through 4empowerment's web-based programs, which reach ethnically diverse students in schools across Texas. 
These partnerships have also helped UTIG and 4empowerment to secure additional funding for other education

  15. Advanced information processing system for advanced launch system: Avionics architecture synthesis

    NASA Technical Reports Server (NTRS)

    Lala, Jaynarayan H.; Harper, Richard E.; Jaskowiak, Kenneth R.; Rosch, Gene; Alger, Linda S.; Schor, Andrei L.

    1991-01-01

    The Advanced Information Processing System (AIPS) is a fault-tolerant distributed computer system architecture that was developed to meet the real time computational needs of advanced aerospace vehicles. One such vehicle is the Advanced Launch System (ALS) being developed jointly by NASA and the Department of Defense to launch heavy payloads into low earth orbit at one tenth the cost (per pound of payload) of the current launch vehicles. An avionics architecture that utilizes the AIPS hardware and software building blocks was synthesized for ALS. The AIPS for ALS architecture synthesis process starting with the ALS mission requirements and ending with an analysis of the candidate ALS avionics architecture is described.

  16. Development and validation of burnup dependent computational schemes for the analysis of assemblies with advanced lattice codes

    NASA Astrophysics Data System (ADS)

    Ramamoorthy, Karthikeyan

    The main aim of this research is the development and validation of computational schemes for advanced lattice codes. The advanced lattice code which forms the primary part of this research is "DRAGON Version4". The code has unique features like self shielding calculation with capabilities to represent distributed and mutual resonance shielding effects, leakage models with space-dependent isotropic or anisotropic streaming effect, availability of the method of characteristics (MOC), burnup calculation with reaction-detailed energy production etc. Qualified reactor physics codes are essential for the study of all existing and envisaged designs of nuclear reactors. Any new design would require a thorough analysis of all the safety parameters and burnup dependent behaviour. Any reactor physics calculation requires the estimation of neutron fluxes in various regions of the problem domain. The calculation goes through several levels before the desired solution is obtained. Each level of the lattice calculation has its own significance and any compromise at any step will lead to poor final result. The various levels include choice of nuclear data library and energy group boundaries into which the multigroup library is cast; self shielding of nuclear data depending on the heterogeneous geometry and composition; tracking of geometry, keeping error in volume and surface to an acceptable minimum; generation of regionwise and groupwise collision probabilities or MOC-related information and their subsequent normalization thereof, solution of transport equation using the previously generated groupwise information and obtaining the fluxes and reaction rates in various regions of the lattice; depletion of fuel and of other materials based on normalization with constant power or constant flux. Of the above mentioned levels, the present research will mainly focus on two aspects, namely self shielding and depletion. 
The behaviour of the system is determined by composition of resonant

  17. Modern meteorological computing resources - The Maryland experience

    NASA Technical Reports Server (NTRS)

    Huffman, George J.

    1988-01-01

    The Department of Meteorology at the University of Maryland is developing one of the first computer systems in meteorology to take advantage of the new networked computer architecture that has been made possible by recent advances in computer and communication technology. Elements of the department's system include scientific workstations, local mainframe computers, remote mainframe computers, local-area networks, 'long-haul' computer-to-computer communications, and 'receive-only' communications. Some background is provided, together with highlights of some lessons that were learned in carrying out the design. In agreement with work in the Unidata Project, this work shows that the networked computer architecture discussed here presents a new style of resources for solving problems that arise in meteorological research and education.

  18. Department of Energy Project ER25739 Final Report QoS-Enabled, High-performance Storage Systems for Data-Intensive Scientific Computing

    SciTech Connect

    Rangaswami, Raju

    2009-05-31

    This project's work resulted in the following research projects: (1) BORG - Block-reORGanization for Self-optimizing Storage Systems; (2) ABLE - Active Block Layer Extensions; (3) EXCES - EXternal Caching in Energy-Saving Storage Systems; (4) GRIO - Guaranteed-Rate I/O Scheduler. These projects together help in substantially advancing the over-arching project goal of developing 'QoS-Enabled, High-Performance Storage Systems'.

  19. Improving Departments of Psychology.

    PubMed

    Diener, Ed

    2016-11-01

    Our procedures for creating excellent departments of psychology are based largely on selection: hiring and promoting the best people. I argue that these procedures have been successful, but I suggest the implementation of policies that I believe will further improve departments in the behavioral and brain sciences. I recommend that we institute more faculty development programs attached to incentives to guarantee continuing education and scholarly activities after the Ph.D. degree. I also argue that we would do a much better job if we more strongly stream our faculty into research, education, or service and not expect all faculty members to carry equal responsibility for each of these. Finally, I argue that more hiring should occur at advanced levels, where scholars have a proven track record of independent scholarship. Although these practices will be a challenge to implement, institutions do ossify over time and thus searching for ways to improve our departments should be a key element of faculty governance.

  20. Computational Aerodynamic Simulations of an 840 ft/sec Tip Speed Advanced Ducted Propulsor Fan System Model for Acoustic Methods Assessment and Development

    NASA Technical Reports Server (NTRS)

    Tweedt, Daniel L.

    2014-01-01

    Computational aerodynamic simulations of an 840 ft/sec tip speed, Advanced Ducted Propulsor fan system were performed at five different operating points on the fan operating line, in order to provide detailed internal flow field information for use with fan acoustic prediction methods presently being developed, assessed and validated. The fan system is a sub-scale, low-noise research fan/nacelle model that has undergone extensive experimental testing in the 9- by 15-Foot Low-Speed Wind Tunnel at the NASA Glenn Research Center, resulting in high-quality, detailed aerodynamic and acoustic measurement data. Details of the fan geometry, the computational fluid dynamics methods, the computational grids, and various computational parameters relevant to the numerical simulations are discussed. Flow field results for three of the five operating conditions simulated are presented in order to provide a representative look at the computed solutions. Each of the five fan aerodynamic simulations involved the entire fan system, excluding a long core duct section downstream of the core inlet guide vane. As a result, only fan rotational speed and system bypass ratio, set by specifying static pressure downstream of the core inlet guide vane row, were adjusted in order to set the fan operating point, leading to operating points that lie on a fan operating line and making mass flow rate a fully dependent parameter. The resulting mass flow rates are in good agreement with measurement values. The computed blade row flow fields for all five fan operating points are, in general, aerodynamically healthy. Rotor blade and fan exit guide vane flow characteristics are good, including incidence and deviation angles, chordwise static pressure distributions, blade surface boundary layers, secondary flow structures, and blade wakes. Examination of the computed flow fields reveals no excessive boundary layer separations or related secondary-flow problems. A few spanwise comparisons between

  1. Application of Robust Design and Advanced Computer Aided Engineering Technologies: Cooperative Research and Development Final Report, CRADA Number CRD-04-143

    SciTech Connect

    Thornton, M.

    2013-06-01

    Oshkosh Corporation (OSK) is taking an aggressive approach to implementing advanced technologies, including hybrid electric vehicle (HEV) technology, throughout their commercial and military product lines. These technologies have important implications for OSK's commercial and military customers, including fleet fuel efficiency, quiet operational modes, additional on-board electric capabilities, and lower thermal signature operation. However, technical challenges exist with selecting the optimal HEV components and design to work within the performance and packaging constraints of specific vehicle applications. OSK desires to use unique expertise developed at the Department of Energy's (DOE) National Renewable Energy Laboratory (NREL), including HEV modeling and simulation. These tools will be used to overcome technical hurdles to implementing advanced heavy-vehicle technology that meets performance requirements while improving fuel efficiency.

  2. Investigation of Advanced Counterrotation Blade Configuration Concepts for High Speed Turboprop Systems. Task 3: Advanced Fan Section Grid Generator Final Report and Computer Program User's Manual

    NASA Technical Reports Server (NTRS)

    Crook, Andrew J.; Delaney, Robert A.

    1991-01-01

    A procedure is studied for generating three-dimensional grids for advanced turbofan engine fan section geometries. The procedure constructs a discrete mesh about engine sections containing the fan stage, an arbitrary number of axisymmetric radial flow splitters, a booster stage, and a bifurcated core/bypass flow duct with guide vanes. The mesh is an H-type grid system whose points are distributed by a transfinite interpolation scheme, with axial and radial spacing specified by the user. Elliptic smoothing of the grid in the meridional plane is available as a post-processing option. The grid generation scheme is consistent with aerodynamic analyses utilizing the average-passage equation system developed by Dr. John Adamczyk of NASA Lewis. This flow solution scheme requires a series of blade-specific grids, each having a common axisymmetric mesh but varying in the circumferential direction according to the geometry of the specific blade row.
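The transfinite-interpolation step the abstract names can be sketched in a few lines. This is a generic, hedged illustration of a 2-D Coons-patch interpolation with made-up boundary curves, not the grid generator described in the report:

```python
# Sketch of 2-D transfinite interpolation (a Coons patch): interior mesh points
# are blended from four user-supplied boundary point lists.
def transfinite_grid(bottom, top, left, right):
    """Fill an ni x nj grid of (x, y) points from four boundary point lists.

    bottom/top hold ni points each; left/right hold nj points each.
    The four curves must agree at the shared corners.
    """
    ni, nj = len(bottom), len(left)
    grid = [[None] * nj for _ in range(ni)]
    for i in range(ni):
        u = i / (ni - 1)
        for j in range(nj):
            v = j / (nj - 1)
            point = []
            for k in (0, 1):  # x coordinate, then y coordinate
                val = ((1 - v) * bottom[i][k] + v * top[i][k]
                       + (1 - u) * left[j][k] + u * right[j][k]
                       # subtract the bilinear corner terms counted twice above
                       - ((1 - u) * (1 - v) * bottom[0][k]
                          + u * (1 - v) * bottom[-1][k]
                          + (1 - u) * v * top[0][k]
                          + u * v * top[-1][k]))
                point.append(val)
            grid[i][j] = tuple(point)
    return grid

# Example: 3x3 grid on the unit square with straight boundary edges.
bottom = [(0.0, 0.0), (0.5, 0.0), (1.0, 0.0)]
top = [(0.0, 1.0), (0.5, 1.0), (1.0, 1.0)]
left = [(0.0, 0.0), (0.0, 0.5), (0.0, 1.0)]
right = [(1.0, 0.0), (1.0, 0.5), (1.0, 1.0)]
print(transfinite_grid(bottom, top, left, right)[1][1])  # → (0.5, 0.5)
```

Non-uniform axial and radial spacing, as in the report, would enter through the point distributions along the boundary curves rather than through the blending formula itself.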

  3. 76 FR 56744 - Privacy Act of 1974; Notice of a Computer Matching Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-14

    ... of the Secretary Privacy Act of 1974; Notice of a Computer Matching Program AGENCY: Defense Manpower Data Center, Department of Defense (DoD). ACTION: Notice of a computer matching program. SUMMARY... advance notice of any proposed or revised computer matching program by the matching agency for...

  4. 76 FR 77811 - Privacy Act of 1974; Notice of a Computer Matching Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-14

    ... of the Secretary Privacy Act of 1974; Notice of a Computer Matching Program AGENCY: Defense Manpower Data Center, Department of Defense (DoD). ACTION: Notice of a Computer Matching Program. SUMMARY... advance notice of any proposed or revised computer matching program by the matching agency for...

  5. 76 FR 50460 - Privacy Act of 1974; Notice of a Computer Matching Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-15

    ... of the Secretary Privacy Act of 1974; Notice of a Computer Matching Program AGENCY: Defense Manpower Data Center, Department of Defense (DoD). ACTION: Notice of a Computer Matching Program. SUMMARY... advance notice of any proposed or revised computer matching program by the matching agency for...

  6. Computational modeling of the amphibian thyroid axis supported by targeted in vivo testing to advance quantitative adverse outcome pathway development

    EPA Science Inventory

    In vitro screening of chemicals for bioactivity together with computational modeling are beginning to replace animal toxicity testing in support of chemical risk assessment. To facilitate this transition, an amphibian thyroid axis model has been developed to describe thyroid home...

  7. Advances in time-domain electromagnetic simulation capabilities through the use of overset grids and massively parallel computing

    NASA Astrophysics Data System (ADS)

    Blake, Douglas Clifton

    A new methodology is presented for conducting numerical simulations of electromagnetic scattering and wave-propagation phenomena on massively parallel computing platforms. A process is constructed which is rooted in the Finite-Volume Time-Domain (FVTD) technique to create a simulation capability that is both versatile and practical. In terms of versatility, the method is platform independent, is easily modifiable, and is capable of solving a large number of problems with no alterations. In terms of practicality, the method is sophisticated enough to solve problems of engineering significance and is not limited to mere academic exercises. In order to achieve this capability, techniques are integrated from several scientific disciplines including computational fluid dynamics, computational electromagnetics, and parallel computing. The end result is the first FVTD solver capable of utilizing the highly flexible overset-gridding process in a distributed-memory computing environment. In the process of creating this capability, work is accomplished to conduct the first study designed to quantify the effects of domain-decomposition dimensionality on the parallel performance of hyperbolic partial differential equation solvers; to develop a new method of partitioning a computational domain composed of overset grids; and to provide the first detailed assessment of the applicability of overset grids to the field of computational electromagnetics. Using these new methods and capabilities, results from a large number of wave propagation and scattering simulations are presented. The overset-grid FVTD algorithm is demonstrated to produce results of comparable accuracy to single-grid simulations while simultaneously shortening the grid-generation process and increasing the flexibility and utility of the FVTD technique. Furthermore, the new domain-decomposition approaches developed for overset grids are shown to be capable of producing partitions that are better load balanced and

  8. Patient understanding of radiation risk from medical computed tomography—A comparison of Hispanic vs. non-Hispanic emergency department populations

    PubMed Central

    McNierney-Moore, Afton; Smith, Cynthia; Guardiola, Jose; Xu, K. Tom

    2015-01-01

    Background. Cultural differences and language barriers may adversely impact patients with respect to understanding the risks/benefits of medical testing. Objective. We hypothesized that there would be no difference in Hispanic vs. non-Hispanic patients’ knowledge of radiation risk that results from CT of the abdomen/pelvis (CTAP). Methods. We enrolled a convenience sample of adults at an inner-city emergency department (ED). Patients provided written answers to rate agreement on a 10-point scale for two correct statements comparing radiation exposure equality between: CTAP and 5 years of background radiation (question 1); CTAP and 200 chest x-rays (question 3). Patients also rated their agreement that multiple CT scans increase the lifetime cancer risk (question 2). Scores of >8 were considered good knowledge. Multivariate logistic regression analyses were performed to estimate the independent effect of the Hispanic variable. Results. There were 600 patients in the study group; 63% were Hispanic, and the mean age was 39.2 ± 13.9 years. Hispanics and non-Hispanic whites were similar with respect to good knowledge-level answers to question 1 (17.3 vs. 15.1%; OR = 1.2; 95% CI [0.74–2.0]), question 2 (31.2 vs. 39.3%; OR = 0.76; 95% CI [0.54–1.1]), and question 3 (15.2 vs. 16.5%; OR = 1.1; 95% CI [0.66–1.8]). Compared to patients who earned <$20,000, patients with income >$40,000 were more likely to answer question 2 with good knowledge (OR = 1.96; 95% CI [1.2–3.1]). Conclusion. The study group’s overall knowledge of radiation risk was poor, but we did not find significant differences between Hispanic and non-Hispanic patients. PMID:26019999

  9. Computer sciences

    NASA Technical Reports Server (NTRS)

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high-performance parallel processors and special-purpose architectures. Research is being conducted in the fundamentals of database logic and improvement techniques for producing reliable computing systems.

  10. Volumetric three-dimensional computed tomographic evaluation of the upper airway in patients with obstructive sleep apnoea syndrome treated by maxillomandibular advancement.

    PubMed

    Bianchi, Alberto; Betti, Enrico; Tarsitano, Achille; Morselli-Labate, Antonio Maria; Lancellotti, Lorenzo; Marchetti, Claudio

    2014-11-01

    Obstructive sleep apnoea syndrome is the periodic reduction or cessation of airflow during sleep together with daytime sleepiness. Its diagnosis requires polysomnographic evidence of 5 or more episodes of apnoea or hypopnoea/hour of sleep (apnoea/hypopnoea index, AHI). Volumetric 3-dimensional computed tomographic (CT) reconstruction enables the accurate measurement of the volume of the airway. Nasal continuous positive airway pressure (CPAP) is the conventional non-surgical treatment for patients with severe disease. Operations on the soft tissues that are currently available give success rates of only 40%-60%. Maxillomandibular advancement is currently the most effective craniofacial surgical technique for the treatment of obstructive sleep apnoea in adults. However, the appropriate distance for advancement has not been established. Expansion of the air-flow column volume did not result in an additional reduction in AHI, which raises the important issue of how much the maxillomandibular complex should be advanced to obtain an adequate reduction in AHI while avoiding the risks of overexpansion or underexpansion. We have shown that there is a significant linear relation between increased absolute upper airway volume after advancement and improvement in the AHI (p=0.013). However, increases in upper airway volume of 70% or more achieved no further reduction in the AHI, which suggests that the clinical improvement in AHI reaches a plateau, and renders further expansion unnecessary. This gives a new perspective to treatment based on the prediction of changes in volume, so the amount of sagittal advancement can be tailored in each case, which replaces the current standard of 1 cm.

  11. A site oriented supercomputer for theoretical physics: The Fermilab Advanced Computer Program Multi Array Processor System (ACPMAPS)

    SciTech Connect

    Nash, T.; Atac, R.; Cook, A.; Deppe, J.; Fischler, M.; Gaines, I.; Husby, D.; Pham, T.; Zmuda, T.; Eichten, E.

    1989-03-06

    The ACPMAPS multiprocessor is a highly cost-effective, local-memory parallel computer with a hypercube or compound hypercube architecture. Communication requires the attention of only the two communicating nodes. The design is aimed at floating-point-intensive, grid-like problems, particularly those with extreme computing requirements. The processing nodes of the system are single-board array processors, each with a peak power of 20 Mflops, supported by 8 Mbytes of data and 2 Mbytes of instruction memory. The system currently being assembled has a peak power of 5 Gflops. The nodes are based on the Weitek XL chip set. The system delivers performance at approximately $300/Mflop. 8 refs., 4 figs.
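The performance and price figures quoted in the abstract can be cross-checked with back-of-envelope arithmetic. This sketch uses only numbers stated above; the node count is an inference from those numbers, not a figure given in the abstract:

```python
# Figures quoted in the abstract.
peak_node_mflops = 20            # peak power per single-board array processor
system_peak_mflops = 5 * 1000    # 5 Gflops system peak, in Mflops
cost_per_mflop_usd = 300         # quoted price/performance

# Implied node count and total system price.
nodes = system_peak_mflops // peak_node_mflops
system_cost_usd = system_peak_mflops * cost_per_mflop_usd
print(nodes, system_cost_usd)  # → 250 1500000
```

At 20 Mflops per node, the 5 Gflops system implies roughly 250 nodes and, at $300/Mflop, a total price on the order of $1.5M.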

  12. Advanced Technology Airfoil Research, volume 1, part 1. [conference on development of computational codes and test facilities

    NASA Technical Reports Server (NTRS)

    1979-01-01

    A comprehensive review of all NASA airfoil research, conducted both in-house and under grant and contract, as well as a broad spectrum of airfoil research outside of NASA is presented. Emphasis is placed on the development of computational aerodynamic codes for airfoil analysis and design, the development of experimental facilities and test techniques, and all types of airfoil applications.

  13. Employing a Structured Interface to Advance Primary Students' Communicative Competence in a Text-Based Computer Mediated Environment

    ERIC Educational Resources Information Center

    Chiu, Chiung-Hui; Wu, Chiu-Yi; Hsieh, Sheng-Jieh; Cheng, Hsiao-Wei; Huang, Chung-Kai

    2013-01-01

    This study investigated whether a structured communication interface fosters primary students' communicative competence in a synchronous typewritten computer-mediated collaborative learning environment. The structured interface provided a set of predetermined utterance patterns for elementary students to use or imitate to develop communicative…

  14. Writing Teachers Writing Software: Creating Our Place in the Electronic Age. Advances in Computers and Composition on Studies Series.

    ERIC Educational Resources Information Center

    LeBlanc, Paul J.

    Presenting a comprehensive look at (and critical history of) computer-aided composition (CAC), this book focuses on faculty development of software for composition studies. The book describes who is building these writing tools, how they are doing so, how their work is being received, and what is likely to affect their efforts in the future.…

  15. Development and validation of a computational finite element model of the rabbit upper airway: simulations of mandibular advancement and tracheal displacement.

    PubMed

    Amatoury, Jason; Cheng, Shaokoon; Kairaitis, Kristina; Wheatley, John R; Amis, Terence C; Bilston, Lynne E

    2016-04-01

    The mechanisms leading to upper airway (UA) collapse during sleep are complex and poorly understood. We previously developed an anesthetized rabbit model for studying UA physiology. On the basis of this body of physiological data, we aimed to develop and validate a two-dimensional (2D) computational finite element model (FEM) of the passive rabbit UA and peripharyngeal tissues. Model geometry was reconstructed from a midsagittal computed tomographic image of a representative New Zealand White rabbit, which included major soft (tongue, soft palate, constrictor muscles), cartilaginous (epiglottis, thyroid cartilage), and bony pharyngeal tissues (mandible, hard palate, hyoid bone). Other UA muscles were modeled as linear elastic connections. Initial boundary and contact definitions were defined from anatomy and material properties derived from the literature. Model parameters were optimized to physiological data sets associated with mandibular advancement (MA) and caudal tracheal displacement (TD), including hyoid displacement, which featured with both applied loads. The model was then validated against independent data sets involving combined MA and TD. Model outputs included UA lumen geometry, peripharyngeal tissue displacement, and stress and strain distributions. Simulated MA and TD resulted in UA enlargement and nonuniform increases in tissue displacement, and stress and strain. Model predictions closely agreed with experimental data for individually applied MA, TD, and their combination. We have developed and validated an FEM of the rabbit UA that predicts UA geometry and peripharyngeal tissue mechanical changes associated with interventions known to improve UA patency. The model has the potential to advance our understanding of UA physiology and peripharyngeal tissue mechanics.

  16. Computer-based written emotional disclosure: the effects of advance or real-time guidance and moderation by Big 5 personality traits.

    PubMed

    Beyer, Jonathan A; Lumley, Mark A; Latsch, Deborah V; Oberleitner, Lindsay M S; Carty, Jennifer N; Radcliffe, Alison M

    2014-01-01

    Standard written emotional disclosure (WED) about stress, which is private and unguided, yields small health benefits. Providing individualized guidance to writers may enhance WED, but this has not been tested. This trial of computer-based WED compared two novel therapist-guided forms of WED - advance guidance (before sessions) and real-time guidance (during sessions, through instant messaging) - to both standard WED and control writing; it also tested Big 5 personality traits as moderators of guided WED. Young adult participants (n = 163) with unresolved stressful experiences were randomized to conditions, had three 30-min computer-based writing sessions, and were reassessed six weeks later. Contrary to hypotheses, real-time guidance WED had poorer outcomes than the other conditions on several measures, and advance guidance WED also showed some poorer outcomes. Moderator analyses revealed that participants with low baseline agreeableness, low extraversion, or high conscientiousness had relatively poor responses to guidance. We conclude that providing guidance for WED, especially in real-time, may interfere with emotional processing of unresolved stress, particularly for people whose personalities have poor fit with this interactive form of WED.

  17. Surgical orthodontic treatment for a patient with advanced periodontal disease: evaluation with electromyography and 3-dimensional cone-beam computed tomography.

    PubMed

    Nakajima, Kan; Yamaguchi, Tetsutaro; Maki, Koutaro

    2009-09-01

    We report here the case of a woman with Class III malocclusion and advanced periodontal disease who was treated with surgical orthodontic correction. Functional recovery after orthodontic treatment is often monitored by serial electromyography of the masticatory muscles, whereas 3-dimensional cone-beam computed tomography can provide detailed structural information about, for example, periodontal bone defects. However, it is unclear whether the information obtained via these methods is sufficient to determine the treatment goal. It might be useful to address this issue for patients with advanced periodontal disease because of much variability between patients in the determination of treatment goals. We used detailed information obtained by 3-dimensional cone-beam computed tomography to identify periodontal bone defects and set appropriate treatment goals for inclination of the incisors and mandibular surgery. Results for this patient included stable occlusion and improved facial esthetics. This case report illustrates the benefits of establishing treatment goals acceptable to the patient, based on precise 3-dimensional assessment of dentoalveolar bone, and by using masticatory muscle activity to monitor the stability of occlusion.

  18. Research Advances: DNA Computing Targets West Nile Virus, Other Deadly Diseases, and Tic-Tac-Toe; Marijuana Component May Offer Hope for Alzheimer's Disease Treatment; New Wound Dressing May Lead to Maggot Therapy--Without the Maggots

    ERIC Educational Resources Information Center

    King, Angela G.

    2007-01-01

    This article presents three reports of research advances. The first report describes a deoxyribonucleic acid (DNA)-based computer that could lead to faster, more accurate tests for diagnosing West Nile Virus and bird flu. Representing the first "medium-scale integrated molecular circuit," it is the most powerful computing device of its type to…

  19. Enhancing the Effectiveness of Department Chairs

    ERIC Educational Resources Information Center

    Lumpkin, Angela

    2004-01-01

    The department chair is one of the most challenging positions in higher education. Advancing one's department can occur by attending to the parameters that highly successful organizations have implemented. In addition to outlining the challenges of serving as a department chair, this article describes four requirements for dealing with, and…

  20. FINAL REPORT DE-FG02-04ER41317 Advanced Computation and Chaotic Dynamics for Beams and Accelerators

    SciTech Connect

    Cary, John R

    2014-09-08

    During the year ending in August 2013, we continued to investigate the potential of photonic crystal (PhC) materials for acceleration purposes. We worked to characterize the acceleration ability of simple PhC accelerator structures, as well as to characterize PhC materials to determine whether current fabrication techniques can meet the needs of future accelerating structures. We have also continued to design and optimize PhC accelerator structures, with the ultimate goal of finding a new kind of accelerator structure that could offer significant advantages over current RF acceleration technology. The design and optimization of these structures require high-performance computation, and we continue to work on methods to make such computation faster and more efficient.